BMW “Envisioning” Project
Bot Framework Virtual Assistant, Sales and Marketing, 2018
The Project:
Microsoft had been working with BMW for a while to create a “branded assistant” experience in their automobiles, partnering with the MCS team along with dedicated designers, voice teams, and Cortana. What they managed to build was a hybrid version of Cortana in a car.
The Bot Framework has a new product called the Virtual Assistant. It comes with a set of core skills that can be used to create a similar experience, one the team hopes to share with other businesses.
A voice-enabled personal assistant integrated into the car gives end users the ability to perform traditional car operations (e.g. navigation, radio) along with productivity-focused scenarios such as moving meetings when you're running late or adding items to your task list, plus proactive experiences where the car can suggest tasks to complete based on events such as starting the engine, traveling home, or enabling cruise control. Adaptive Cards are rendered within the Head Unit, and speech integration is performed through Push-to-Talk or Wake Word interactions.
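For a sense of how one of these turns is wired up, here is a minimal sketch using the Bot Framework JavaScript SDK. The card layout, temperature value, and spoken string are illustrative placeholders rather than the production BMW skill; the point is only the pattern of pairing an Adaptive Card for the head unit display with a `speak` string for the voice channel.

```typescript
import { ActivityHandler, CardFactory, MessageFactory } from "botbuilder";

// Minimal sketch (not the production skill): answer a cabin-temperature request
// with an Adaptive Card for the head unit plus a `speak` string for TTS.
export class HeadUnitBot extends ActivityHandler {
  constructor() {
    super();
    this.onMessage(async (context, next) => {
      // Hypothetical card content; a real skill would build this from the
      // recognized intent and vehicle state.
      const card = CardFactory.adaptiveCard({
        type: "AdaptiveCard",
        version: "1.2",
        body: [
          { type: "TextBlock", text: "Cabin temperature", weight: "Bolder" },
          { type: "TextBlock", text: "Set to 72°F", size: "Large" }
        ]
      });
      // Text renders on the head unit; `speak` drives the spoken response.
      await context.sendActivity(
        MessageFactory.attachment(
          card,
          "Cabin temperature set to 72°F.",
          "Setting the cabin temperature to seventy-two degrees."
        )
      );
      await next();
    });
  }
}
```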
My Role:
An Android tablet version of the VA was developed for sales teams to show clients some of the potential use cases: talk to the bot and have it answer some questions, book a meeting, set up service with BMW, find a place for dinner. Envisioning is essentially imagining what something could be like for people who have a hard time coming up with that picture on their own.
Version 1: skin the existing tablet to work like a car interface, building the UI components to make it look like something that could exist on the dash of a car.
Version 2: a nicer and more elaborate pass, styled to look more like the displays in electric cars.
“This was not meant to be a literal depiction that could be used in a car; VUI design is a different animal and requires a huge team… but I was able to make a reasonable proof of concept.”
Research and Sketches
Based on the scenarios the team had set up before starting the project, I already had a pretty good idea of the flows that would need to be demonstrated:
1) Raise the interior temperature while driving.
2) Find a restaurant for dinner and book a reservation for a group.
3) Respond to a warning light for a low tire or an oil change, book an appointment with BMW, and navigate to the closest dealer for service.
4) Check the weather at the destination I'm flying to, combined with directions to the airport.
This was 2018, so car navigation had been around for a bit, but early Teslas were just starting to bring in the larger iPad-style displays… a lot more work would need to go into determining what was enabled while driving, type sizes legible while driving, and hands-free function. For hands-free use we relied on a “Wake Phrase,” kept as generic as possible for demos to other businesses: “Hey Assistant.”
First Working Prototype
A lot of the features we showed in this prototype were just part of a demo and not everything works, such as the trip timers and audio. It was a very effective demo though, and could easily be used by sales teams on a Surface to demonstrate the voice and connectivity features. Latency continues to be a bit of an issue, as every utterance has to pass through a complex flow that is still a little slow: Voice → Text → Language Understanding → Text → Voice.
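To make the latency point concrete, here is a rough sketch of why the round trip adds up. The stage functions and millisecond values are placeholder stubs standing in for the real services (speech-to-text, language understanding, the assistant's dialog, text-to-speech); only the shape of the sequential flow reflects the demo.

```typescript
// Placeholder stub: simulates a service call that resolves after `ms` milliseconds.
const simulate = (label: string, ms: number) =>
  new Promise<string>(resolve => setTimeout(() => resolve(label), ms));

async function assistantRoundTrip(utterance: string): Promise<void> {
  const start = Date.now();

  // Stages run one after another, so their latencies add up end to end.
  const text = await simulate("raise the temperature", 300);        // Voice -> Text
  const intent = await simulate("ChangeVehicleSetting", 150);       // Text -> Language Understanding
  const reply = await simulate("Setting temperature to 72.", 200);  // Dialog -> Text
  await simulate("<synthesized audio>", 400);                       // Text -> Voice

  console.log(`"${utterance}" -> "${text}" -> ${intent} -> "${reply}"`);
  console.log(`End-to-end latency: ${Date.now() - start} ms`);
}

assistantRoundTrip("hey assistant, I'm a little cold");
```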
Second Prototype
The first prototype was enough to get the team interested. This version was a little more complicated, but it presented how the tablet would simulate the interaction between the user and the Virtual Assistant. This comp was presented to leadership, sales, and engineering.
Working In-Car Prototype
After many long months of concepting, engineering, and demos, we were able to get the BMW team to implement our tech into the head unit of a prototype X7 for the 2019 Microsoft Build Conference, with a mixture of concepts that we envisioned and plenty of redesign from the BMW Design Team. This was an amazing project; the video below shows Dewain Robinson, Principal Program Manager on our team, working with the prototype in the car.