Human-Machine Interaction · Motion Design · Prototyping · Usability Testing
Tsahao Yu – Interaction Design Intern · Shuai Deng – Senior Interaction Designer · Jia Jian – Interaction Design Lead · Yingjie Ding – Visual Designer
May 2019 · 5 Months
Traditionally, consumers value a car's road performance above all, but intelligent capability is now becoming an important selling point for electric cars. Equipped with sensors and a powerful processor, Xpeng's new product, the P7 sedan, is positioned as a next-gen smart car, featuring personalized recommendations, location-based services, and, of course, Level-3 autopilot. At the same time, driving is an eyes-busy and hands-busy task, so interaction with the in-car entertainment system inevitably adds cognitive load for the driver and raises safety concerns.
Elements that are frequently used while driving are placed within an easy-to-reach zone.
The Info Flow feature provides personalized recommendations based on the car's status and the user's habits. Only one tap is needed to perform a task.
Personal Assistant provides reliable voice interaction. Users can also combine more than one input mode at a time to interact with the system.
I conducted research by informally interviewing friends who drive cars with both traditional and modern infotainment systems, asking employees from other teams about their goals, and gathering information from HMI studies and reports.
Efficiency · Organized · Reliability · Clear Feedback · Driving Attentiveness · Safety · Rich Functionality · Gratification (Money Well Spent)
Brand Reputation · Intelligence · Futuristic Characteristic · User Satisfaction
User Satisfaction · Effective Notification · Problem Solving · Current User Number
The vehicle cockpit is evolving as mechanical components give way to electrification. With this evolution, car manufacturers are incorporating iPad-like touchscreens to provide better entertainment and assisted-driving services. When designing for this new medium, we cannot simply reuse mobile or desktop design principles without adapting them to the new context.
Before digging into our infotainment system interface design, I looked into other familiar computing platforms and analyzed their design principles for UI placement.
Most users are right-handed and thus feel more comfortable leaving the cursor in the top-right part of the screen. In addition, the outer edges of the user interface can be reached faster because the screen edge pins the cursor in place.
Fitts’ law states that the time a person needs to move a pointer to a target area is a logarithmic function of the ratio between the distance to the target and the target’s width. Therefore, we should arrange UI components in a way that minimizes movement.
To do that, in a desktop context, designers should understand where the user’s cursor is when entering an application or a web page. When designing for mobile, designers need to understand how users hold the device: with one hand or two, in landscape or portrait mode. When driving, drivers need to keep at least one hand on the steering wheel. In countries where people drive on the right, drivers interact with the in-car system with their right hand, so we should make sure interactive content is easy to reach from the right-hand position on the steering wheel.
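To make the trade-off concrete, here is a small sketch of Fitts’ law in its Shannon formulation, MT = a + b · log2(D/W + 1). The constants a and b and the distances below are hypothetical illustrations, not measurements from our study; they only show how a far, small target costs more movement time than a near, large one.

```python
import math

def fitts_time(distance_mm, width_mm, a=0.2, b=0.1):
    """Predicted movement time in seconds using the Shannon
    formulation of Fitts' law: MT = a + b * log2(D/W + 1).
    a and b are hypothetical device-specific constants."""
    index_of_difficulty = math.log2(distance_mm / width_mm + 1)
    return a + b * index_of_difficulty

# A large dock icon near the driver's hand position versus a
# small status-bar icon at the far edge of the screen
# (distances and sizes are made up for illustration):
near = fitts_time(distance_mm=150, width_mm=80)
far = fitts_time(distance_mm=600, width_mm=20)
print(f"near target: {near:.2f}s, far target: {far:.2f}s")
```

Whatever the exact constants, the far, small target always yields a higher index of difficulty, which is why the most-used controls belong in the easy-to-reach zone.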
Tesla Model 3
A common problem of current in-car touchscreen UIs is that important command menus are placed too far from the driver’s hand position. For instance, the red lines in the images above indicate the distance between the normal right-hand position and the control area. Greater distance means longer interaction times and less attention left for unexpected stimuli on the road. A far-away UI also forces noticeable body movement, giving the driver less control of the vehicle.
We ran a usability test on the in-car touchscreen and divided the screen into several zones based on their reachability. Generally, the nearer a zone is to the right-hand position, the easier it is to reach. It is worth mentioning that a flat surface intersects the bottom of the screen vertically, which makes the bottom 100 px hard to access, so we should avoid placing interactive content there.
We rated the UI components based on their usage frequency and relevance to driving. The dock provides entry points to several driving-related features, including climate control and defogging. Info Flow is a shortcut list that predicts the tasks the driver may want to perform. These two features are highly related to driving, so we should ensure they are easy to reach. The App/Map area does not require frequent interaction while driving. The status bar is the least important; it shows information such as network status, battery status, and the air quality index.
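The rating logic above can be sketched as a simple priority score. All scores here are hypothetical illustrations of the frequency-times-relevance idea, not the team’s actual ratings:

```python
# Hypothetical 1-5 scores for usage frequency and relevance to
# driving; their product gives a rough placement priority.
components = {
    "Dock":       {"frequency": 5, "driving_relevance": 5},
    "Info Flow":  {"frequency": 4, "driving_relevance": 5},
    "App/Map":    {"frequency": 2, "driving_relevance": 3},
    "Status Bar": {"frequency": 1, "driving_relevance": 1},
}

# Components sorted from highest to lowest priority; the top
# entries belong in the easy-to-reach zone.
ranked = sorted(
    components,
    key=lambda c: components[c]["frequency"]
                  * components[c]["driving_relevance"],
    reverse=True,
)
print(ranked)
```

Under these made-up scores, the dock and Info Flow come out on top, matching the placement decision described above.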
I made several wireframes with the above UI components. To maximize reachability, we chose the first option: the most important components, the dock and Info Flow, sit in the easy-to-reach zone, while the other components are arranged in farther areas.
* visual design was created by Yingjie Ding.
An analysis of drivers’ inattention by the National Highway Traffic Safety Administration established links between distraction behaviors, crashes, and near-crashes. In real-life observation, we also found that interacting with the infotainment system places a visual demand on drivers, which is met at the expense of glances toward the rear-view mirror and the forward roadway. Needless to say, complicated in-car interaction can place drivers at increased crash risk.
Traditional car owners usually use their smartphones to navigate, answer calls, and listen to music. However, lacking integration with the car, a smartphone cannot access or control the car’s hardware, such as the seat heater, battery status, and driving mode.
Besides, text on a mobile phone is not glanceable at arm’s length, so interacting with it demands more of the driver’s attention.
To demand less of the driver’s visual attention, some modern cars incorporate a personal assistant that lets users interact by voice. However, users have to speak specific keywords or press a physical button to activate the assistant. From time to time, the so-called smart assistant fails to recognize what the user has said and gets mocked as “artificial stupidity.” Its unreliability and passive nature result in a poor user experience.
Intelligent Recommendation Card
To drive safely, drivers need to pay enough attention to the road and keep at least one hand on the wheel. Hearing and speech are the two remaining channels not heavily engaged in operating the car. Therefore, we provide voice interaction throughout the system, which can be activated by the voice button or by speaking a keyword. Additionally, drivers can interact with pop-up windows using the steering-wheel buttons.
To ensure every pop-up notification can be handled by voice, touchscreen, or buttons alike, I helped write the input and output scripts for voice control and compiled a complete list of the interactions available in pop-up windows.
However, using only one method at a time to finish a complicated task can be challenging and increases cognitive load, whatever the method. In contrast, a single voice query can bring users to the interface they need, or surface several options they can then select by glancing at the screen and touching. This is less distracting and far more efficient than relying on one method alone.
The interaction does not end at the moment of input: users need clear feedback to confirm that their commands were received and properly handled by the system. Traditionally, this feedback is easily communicated through the tactile sense of physical buttons. A touchscreen, however, requires additional visual attention for confirmation. Therefore, I use clear motion effects to communicate a valid interaction.
*I made early animations. Final design was finished by motion designers.
Throughout the design process, I put myself in users’ shoes to understand their physical and cognitive constraints, and balanced usability against user experience to create the best fit for them. However, one limitation of our work is that our testing method could not produce a quantifiable measure of usability. With more resources, I would conduct eye-tracking tests to understand which interfaces or interactions demand more attention and raise safety concerns.
In addition, regarding the touchscreen’s lack of tactile feedback: even though we communicate feedback through responsive, clear motion, drivers still have to take their eyes off the road for a split second to confirm. In a future product, we want to add a haptic engine to deliver non-visual feedback and make these touch surfaces even less distracting. Info Flow could also better understand drivers’ needs by tracking eye movement and providing relevant real-time responses.