Fully autonomous vehicles have the potential to unlock exciting new opportunities for the in-cabin experience. Before we get to full autonomy, however, carmakers will need to find creative ways for manual and autonomous driving modes to coexist. As voice interfaces become mainstream entertainment and productivity tools, they will help bridge the interaction between vehicle and driver across manual and autonomous experiences. Unlike other means of communication, voice can convey a sense of urgency and makes it easy to accomplish tasks quickly, both when the driver needs to keep their hands on the wheel and eyes on the road, and when the driver becomes the passenger.
To showcase the possibilities of in-vehicle voice control, Amazon’s Alexa Auto team is working with Elektrobit, an automotive software developer with a long history of building the underlying human-machine interface (HMI) architecture for the car. Elektrobit has been a trusted speech-integration partner for top automotive brands including Audi, BMW, General Motors, and Volkswagen. Elektrobit and Alexa Auto first demonstrated their interactive full-screen display at CES 2019; it is now available to automakers and suppliers interested in a first-hand look.
The demo is built on the Amazon Alexa Auto SDK and uses Elektrobit software to let visitors experience, through an interactive full-screen display, how Alexa can guide drivers through the handover between manual and automated driving.
According to recent research from J.D. Power, consumers want the same voice service they have at home to be available in the car. To support this, Alexa helps drivers navigate to a destination, make calls, play music, find local businesses, connect with their smart homes, and much more, drawing on the tens of thousands of Alexa skills available at home. Many of these skills are being adapted to the in-cabin experience.
The futuristic demo invites you to imagine a trip with an autonomous route ahead. Alexa informs you that the route is coming up, and you simply push the buttons on either side of the steering wheel to engage autonomous driving. Alexa confirms that you are now in autonomous mode, and you can enjoy the ride and engage with Alexa in ways that were not available while driving manually. For instance, you could ask Alexa to play your favorite streamed video content on the screen in front of you (a capability available only in autonomous mode, not in manual mode).
Now suppose a car accident has happened up ahead. Alexa can either ask you to take back control as the car shifts into manual mode, or, if slowing down is enough to handle the traffic conditions, announce that the vehicle is slowing down but will remain in autonomous mode.
Either way, when the autonomous route is coming to an end, Alexa speaks up to let you know that a countdown to manual mode has started. If the driver doesn’t re-engage in time, the car autonomously pulls over to the side and parks safely until the driver takes control. Of course, even back in manual mode, Alexa remains available for entertainment and productivity while the driver keeps their hands on the wheel and eyes on the road.
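The handover flow described above can be sketched as a small state machine. This is only an illustrative model of the demo's behavior — the class and method names are hypothetical and are not part of the Alexa Auto SDK or Elektrobit's software:

```python
from enum import Enum, auto

class Mode(Enum):
    MANUAL = auto()
    AUTONOMOUS = auto()
    COUNTDOWN = auto()    # autonomous route ending; driver must re-engage
    PULLED_OVER = auto()  # driver missed the countdown; car parked safely

class HandoverController:
    """Toy model of the manual/autonomous handover flow (hypothetical names)."""

    def __init__(self):
        self.mode = Mode.MANUAL
        self.announcements = []

    def _say(self, text):
        # Stand-in for a voice prompt from the assistant.
        self.announcements.append(text)

    def driver_engages(self):
        # Steering-wheel buttons pressed by the driver.
        if self.mode == Mode.MANUAL:
            self.mode = Mode.AUTONOMOUS
            self._say("You are now in autonomous mode.")
        elif self.mode in (Mode.COUNTDOWN, Mode.PULLED_OVER):
            # Driver takes (back) control of the vehicle.
            self.mode = Mode.MANUAL
            self._say("You are back in manual mode.")

    def hazard_ahead(self, requires_takeover):
        # Accident up ahead: either hand control back, or just slow down.
        if self.mode != Mode.AUTONOMOUS:
            return
        if requires_takeover:
            self.mode = Mode.MANUAL
            self._say("Please take back control of the vehicle.")
        else:
            self._say("Slowing down for traffic; staying in autonomous mode.")

    def route_ending(self):
        # Autonomous route is ending: start the countdown to manual mode.
        if self.mode == Mode.AUTONOMOUS:
            self.mode = Mode.COUNTDOWN
            self._say("Countdown to manual mode has started.")

    def countdown_expired(self):
        # Driver did not re-engage in time: pull over and park safely.
        if self.mode == Mode.COUNTDOWN:
            self.mode = Mode.PULLED_OVER
            self._say("Pulling over safely until you take control.")
```

Walking through the demo scenario: `driver_engages()` enters autonomous mode, `hazard_ahead(requires_takeover=False)` slows the car but keeps autonomy, `route_ending()` starts the countdown, and `countdown_expired()` parks the car until the driver takes over.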
While the demo is a great experience, the bottom line is that Elektrobit’s deep industry expertise and collaboration with the Alexa team show how automakers can bring new voice-forward experiences to customers faster with Alexa.
Source: Alexa Developer Blog