Xmart OS-P7 In-Car System

An intelligent infotainment system with safer driver interaction.
Skills

Human-Machine Interaction
Motion Design
Prototyping
Usability Testing

Members

Tsahao Yu – Interaction Design Intern
Shuai Deng – Senior Interaction Designer
Jia Jian – Interaction Design Lead
Yingjie Ding – Visual Designer

Duration

May 2019
5 Months

Introduction

Problem Brief

Consumers have traditionally judged a car mainly by its road performance, but intelligent capability is now becoming an important selling point for electric cars. Equipped with an array of sensors and a powerful processor, Xpeng’s new product, the P7 sedan, is positioned as a next-gen smart car featuring personalized recommendations, location-based services, and of course, Level-3 autopilot. At the same time, driving is an eyes-busy and hands-busy task; interaction with the in-car entertainment system undeniably adds cognitive load for the driver and raises safety concerns.

Design Statement

We need to create an infotainment system that is intelligent and interactive, yet not distracting.

Overview

Interface with High Reachability

Elements that need to be interacted with frequently in the driving context are placed within the easy-reach zone.

Interaction that Avoids Distraction

The Info Flow feature provides personalized recommendations based on the car’s status and the user’s habits. Only one tap is needed to perform a task.

Multimodal Interaction

The personal assistant provides reliable voice interaction. In addition, users can use more than one mode at a time to interact with the system.

Research

I conducted research by informally interviewing friends who drive cars with both traditional and modern infotainment systems, asking employees from other teams about their goals, and gathering information from HMI studies and reports.

Research Goals
  • Understand users’ in-car interaction habits
  • Discover pain points while using in-car system
  • Pinpoint specific issues within the broad problem
  • Identify design principles by understanding the context
Research Insights
  • Users are not accustomed to in-car touchscreen interaction, mostly because they cannot perform tasks in one step as they could with physical buttons, and the feedback is not clear.
  • More than 50% of Xpeng G3 owners use the personal assistant every week, especially while driving, and most of them think the assistant still needs improvement.
  • The interface is not user-friendly; drivers sometimes need to move their body to interact with it.
  • More than half of the users are concerned that a complicated system would interfere with driving safety.
  • Several near-crash events in the dataset involved infotainment system use, which suggests insufficient driver attention to the driving environment.
  • The computing power of the P7’s processor makes it possible to provide more intelligent features.
  • The P7 is a premium-level product that represents Xpeng’s vision for automobiles.
Stakeholder Value Analysis
Car Owner

Efficiency
Organized
Reliability
Clear Feedback
Driving Attentiveness
Safety
Rich Functionality
Gratification (Money Well Spent)

Passengers

Entertainment
Safety
Functionality

Internal Team

Brand Reputation
Intelligence
Futuristic Characteristic
User Satisfaction

In-Car App Developer

User Satisfaction
Effective Notification
Problem Solving
Current User Number

Design Goals
  • Usability
  • Intelligence
  • Safety

Ideate & Design

The vehicle cockpit is evolving as mechanical components give way to electrification. With this shift, car manufacturers are incorporating iPad-like touchscreens to provide better entertainment and assisted-driving services. When designing for this new medium, we cannot simply reuse mobile or desktop design principles without adapting them to the new context.

1. Interface Arrangement

Before digging into our infotainment system interface design, I looked into other familiar computing platforms and analyzed their design principles for UI placement.

Desktop

Most users are right-handed and tend to leave the cursor on the upper-right part of the screen. In addition, the outer edges of the user interface can be reached with greater speed because the screen edge pins the cursor.

Mobile

For mobile devices, we should understand how users actually hold their devices. The pattern varies across screen sizes, but in general the bottom area is within the easy-reach zone.

Insights

Fitts’ law states that the time required for a person to move a pointer to a target area is a function of the ratio of the distance to the target to the size of the target. Therefore, we should arrange UI components in a way that minimizes pointing movement.
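For reference, the Shannon formulation commonly cited in HMI literature expresses this as

$$ T = a + b \log_2\!\left(1 + \frac{D}{W}\right) $$

where T is the movement time, D is the distance to the target, W is the target width along the axis of motion, and a and b are empirically fitted constants. A nearer or larger target lowers the index of difficulty and therefore the time needed to acquire it.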

To do that, in the desktop context, designers should understand where users’ cursors are when they enter an application or a web page. When designing for mobile, designers need to understand how users hold the device: with one hand or two, in landscape or portrait. When driving, drivers need to keep at least one hand on the steering wheel. In countries where people drive on the right, drivers have to interact with the in-car system with their right hand, so we should make sure interactive content is easy to reach from the right-hand position on the steering wheel.

Current Problem

Xpeng G3

Tesla Model 3

A common problem of current in-car touchscreen UIs is that important command menus are placed too far from the driver’s hand position. For instance, the red lines in the images above indicate the distance between the normal right-hand position and the control area. This leads to longer interaction times and leaves less time to notice unexpected stimuli on the road. A farther UI position also causes noticeable body movement, giving the driver less control of the vehicle.

Ideate

We ran a usability test on the in-car touchscreen and divided the screen into several zones based on reachability. Generally, the nearer a zone is to the right-hand position, the easier it is to reach. It is worth mentioning that a flat surface intersects the bottom of the screen vertically, which makes the bottom 100 px hard to access, so we should avoid placing interactive content there.

We rated the UI components by usage frequency and relevance to driving. The Dock provides entry to several driving-related features, including climate control and defogging. Info Flow is a shortcut list that predicts the tasks the driver may want to perform. These two components are highly related to driving, so we need to make sure they are easy to reach. The App/Map area is used less frequently while driving. The status bar is the least important; it shows information such as network status, battery status, and the air quality index.

I made several wireframes with the above UI components. To maximize reachability, we chose the first option: the most important components, the Dock and Info Flow, are placed in the easy-reach zone, while the other components are arranged in the farther areas.

Result

*Visual design was created by Yingjie Ding.

2. Key Feature: Info Flow

Current Problem

An analysis of driver inattention by the National Highway Traffic Safety Administration established links between distraction behaviors, crashes, and near-crashes. In real-life observation, we also found that interaction with the infotainment system places a visual demand on drivers, which is met at the expense of glances toward the rear-view mirror and the forward roadway. Needless to say, complicated in-car interaction could place drivers at an increased crash risk.

Owners of traditional cars usually use their smartphones to navigate, answer calls, and listen to music. However, without integration with the car, a smartphone cannot access or control the vehicle’s hardware, such as the seat heater, battery status, and driving mode.

Besides, text on a mobile phone is not glanceable at arm’s length, so it requires more of the driver’s attention to interact with.

To demand less of the driver’s visual attention, some modern cars incorporate a personal assistant so users can interact by voice. However, users have to speak specific keywords or press a physical button to activate the assistant. From time to time the so-called smart assistant fails to recognize what the user said, earning it the nickname “Artificial Stupidity.” Its unreliability and passive nature result in a poor user experience.

Design Opportunities
Journey Map of a One-way Trip
  1. Users feel nervous when they have to perform other tasks while keeping the drive safe. We should help users finish those tasks faster and more conveniently to reduce that burden.
  2. The current personal assistant can only follow users’ orders, and users are not satisfied with its intelligence. The assistant would be more helpful if it could learn users’ habits and predict their needs.
Solution

Info Flow provides shortcuts for users to quickly perform tasks.

Intelligent Recommendation Card

App Card

We found that drivers interact with no more than five apps on average in a single trip, so adding recent apps to Info Flow helps drivers access their frequent apps much faster and more easily.
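As a rough illustration only (this is not the production recommendation logic, and every name below is hypothetical), the App Card could be filled by ranking apps with a simple blend of in-trip frequency and recency:

```typescript
// Hypothetical sketch: choose which apps to show on the Info Flow App Card by
// combining how often and how recently each app was used in the current trip.
// Field names and weights are illustrative, not the shipped logic.

interface AppUsage {
  appId: string;
  launchCount: number;        // launches during the current trip
  lastUsedMinutesAgo: number; // minutes since the app was last used
}

function rankAppCard(usage: AppUsage[], slots = 3): string[] {
  return usage
    .map((u) => ({
      appId: u.appId,
      // Higher score = used more often and more recently.
      score: u.launchCount + 1 / (1 + u.lastUsedMinutesAgo),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, slots)
    .map((u) => u.appId);
}

// Example trip: navigation and music dominate, so they fill the card.
console.log(
  rankAppCard([
    { appId: "maps", launchCount: 3, lastUsedMinutesAgo: 2 },
    { appId: "music", launchCount: 2, lastUsedMinutesAgo: 10 },
    { appId: "podcasts", launchCount: 1, lastUsedMinutesAgo: 40 },
    { appId: "phone", launchCount: 1, lastUsedMinutesAgo: 55 },
  ])
); // ["maps", "music", "podcasts"]
```

The real Info Flow also blends in the car’s status and learned habits, but even a simple heuristic like this keeps the handful of apps a driver actually uses one tap away.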

3. Interaction Methods

Multimodal Interaction

To drive the car safely, drivers need to pay enough attention to the road and keep at least one hand on the wheel. Hearing and speech are the two channels that are not heavily engaged in operating the car. Therefore, we provide voice interaction throughout the system, which can be activated by the voice button or by speaking a keyword. Additionally, drivers can interact with pop-up windows using the steering-wheel buttons.

To ensure that every pop-up notification can be handled by voice, touchscreen, and steering-wheel buttons alike, I helped write the input and output scripts for voice control and completed the list of all pop-up window interactions.

However, using only one method to finish a complicated task can be challenging and increases cognitive load, no matter which method it is. Combining them works better: a voice query can bring users to the interface they need, or surface several options they can then select with a glance and a tap. This is less distracting and far more efficient than using a single method.
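To make the idea concrete, here is a minimal, hypothetical sketch of how one pop-up could be declared once and then answered through any of the three input methods. It is illustrative only; every name and structure in it is an assumption, not the production code.

```typescript
// Hypothetical sketch: a pop-up action is declared once and exposed to voice,
// touchscreen, and steering-wheel buttons, so whichever input arrives first
// resolves the notification.

type Modality = "voice" | "touch" | "wheelButton";

interface PopupAction {
  id: string;
  label: string;                        // text shown on the touch target
  voicePhrases: string[];               // utterances mapped to this action
  wheelButton?: "confirm" | "dismiss";  // optional steering-wheel shortcut
}

interface Popup {
  title: string;
  actions: PopupAction[];
}

// Example: an incoming-call pop-up with two possible actions.
const incomingCall: Popup = {
  title: "Incoming call: Mom",
  actions: [
    { id: "answer", label: "Answer", voicePhrases: ["answer", "pick up"], wheelButton: "confirm" },
    { id: "decline", label: "Decline", voicePhrases: ["decline", "hang up"], wheelButton: "dismiss" },
  ],
};

// Resolve the pop-up with whichever modality responds first.
function handleInput(popup: Popup, modality: Modality, input: string): PopupAction | undefined {
  switch (modality) {
    case "touch":
      return popup.actions.find((a) => a.id === input);
    case "voice":
      return popup.actions.find((a) => a.voicePhrases.includes(input.toLowerCase()));
    case "wheelButton":
      return popup.actions.find((a) => a.wheelButton === input);
  }
}

console.log(handleInput(incomingCall, "voice", "Pick up")?.id); // "answer"
```

The point is that all three channels map onto the same action, so a driver can mix a spoken query with a glance and a tap without caring which channel the system ultimately received.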

Interaction Feedback

Interaction does not stop at the moment of a user’s input; users need clear feedback to confirm that their commands have taken effect and are being handled by the system. Traditionally, this feedback was communicated by the tactile feel of physical buttons. A touchscreen, however, requires additional visual attention to confirm. Therefore, I used clear motion effects to communicate a valid interaction.

*I made the early animations. The final design was finished by the motion designers.

Reflection

Throughout the design process, I put myself in the users’ shoes to understand their physical and cognitive constraints, and balanced usability and user experience to create the best fit for them. However, one limitation of our work is that our testing method could not produce a quantifiable measure of usability. Given more resources, I would run eye-tracking tests to understand which interfaces or interactions demand more attention and raise safety concerns.

In addition, regarding the touchscreen’s lack of tactile feedback: even though we communicate feedback through responsive, clear motion, it still requires drivers to take their eyes off the road for a split second to confirm an action. In a future product, we want to add a haptic engine that delivers non-visual feedback and makes these touch surfaces even less distracting. Info Flow could also better understand drivers’ needs by tracking eye movement and providing relevant real-time responses.