As we saw in the introduction to this series, in 2015 Acura’s dashboards and interior controls were poorly regarded by the automotive press. “Busy,” “redundant,” “illogical,” and “confusing” were popular criticisms; this wasn’t a good look for a company trying to revive its original mantra of “precision crafted performance.”
At the 2016 L.A. Auto Show, the company took the unusual step of unveiling not a concept car–they’d already done that with the Precision Concept earlier that year–but a standalone concept interior, replete with a futuristic, next-generation HMI (Human-Machine Interface).
While it borrowed some styling cues from the NSX, the Precision Cockpit’s controls were boiled down to a minimum; the center console contained a single knob and, below that, what the company would call their True Touchpad Interface, a position-mapped way of interacting with the dashboard screen. This screen, by the way, had been placed atop the dashboard–the better to keep drivers’ eyes on the road–and intentionally placed far back and out of physical reach.
“The system we conceived has been created with the driver in mind,” Acura Executive Creative Director Dave Marek explains. “A traditional touchscreen approach is intuitive, what you see is what you press. But it also forces a compromised placement of the screen – close to the driver and out of the driver’s natural line-of-sight. A traditional remote interface – found in many luxury cars – solves these problems, but creates a new one: the interaction between the remote and the display is indirect and clumsy.
“The Acura Precision Cockpit’s touchpad overcomes these issues by using absolute position mapping for the first time in the driving environment, combining the flexibility and usability of a touchscreen with the comfort and reduced driver distraction of a remote.”
The Precision Cockpit also featured Acura’s “Digital Meter,” a screen located behind the steering wheel that rendered difficult-to-see obstacles ahead. This system would also, using AI, predict and illustrate potential paths of moving obstacles, communicating this back to the driver–or an autonomous system, in a driver-less future.
These new sorts of interfaces and technologies didn’t design themselves, of course. Nor are they the purview of the stylist, clay modeler or digital modeler we looked at previously, or of the interior designer we’ll look at next. Instead they are the domain of a dedicated HMI Designer, which in this case is an antiques-loving, design and media arts graduate named Shaun Westbrook. Here we chat with Westbrook about his unusual profession and how it’s executed.
Core77: Can you describe your position?
Shaun Westbrook: I’m a Principal HMI Lead. HMI is Human-Machine Interface; it deals with the interaction between the human and the car, both hardware and software. And in the future, with AI. I’ve been at Acura for about three years, so I’m fairly new blood.
We’ll go out on a limb and guess that few kids say “I want to be an HMI designer when I grow up.” What led you towards this aspect of design?
I grew up building computers with my father as a hobby. I had Tandy computers and 486s all over my bedroom and played games on floppy disks. Also, my father worked in aerospace and brought home a lot of blueprints; I’d go over them and memorize bombers and jets, and started drawing them first in CorelDRAW, then AutoCAD.
I had a PlayStation too, but that [old-school] era of computer graphics really got my attention. Gaming was really PC-oriented back then, so I experimented with a variety of joysticks and steering wheels. I found that some worked well, others were awkward and didn’t.
I had been engaged with this stuff at an early age, but yeah, I never said “I want to be an HMI person when I grow up.” I think people today don’t even know about HMI; it’s an emerging field.
Are there even HMI degree programs? What did you study?
I graduated from UCLA’s Design Media Arts program, which I’d merged with computer science. I took advantage of the school’s research department, so I was hanging around with neuroscience majors and taking classes all around the campus. I was very research-oriented.
I knew how to code from an early age and pursued that path early in my career. Merged with design, that was really the early days of computer graphics. When it comes to interaction in HMI, I’d say a lot of that background helped put me in the right direction.
How did that bring you into car design?
My job progression started with what I consider the first smart apps, which were smart ads. This was before HTML5. A lot of the Internet ads were focused around building interactions, so I got a lot of experience developing things for agencies and large movie brands. Ironically, a lot of you probably have come across a lot of my ads in previous years. That was my fault.
Then the Smart TV era hit. I was very excited about that, helping to build and develop user experience for the living room. Doing that, I learned a lot about how people engage with entertainment content, how people want to make purchases and how people define what a luxury experience is for infotainment and entertainment.
Then smartphones, the iPhone, iOS and Android all kicked in, so I applied my background to some major brands for interaction with those apps and user experiences.
All of this playing around with gadgets and exploring different things triggered something in my mind. I realized that there is a career path out there, and I didn’t know about it. People look at, say, Tony Stark’s character in the Marvel movies and go “Hey, that’s cool, but it’s so make believe. It doesn’t exist.” The reality is, it does. It’s HMI and it gives you the ability to look into a heads-up display. Not to mention Tony Stark drives an NSX in the Avengers movie–that helped me say “I want to work for Acura.”
What does your work here consist of?

Our priorities are the user experience and how to minimize driver distraction. I’m trying to bring a lot of the experiences from consumer electronics to help create easy-to-use, intuitive interfaces. My goal is to apply my interactive background and my love for computers and graphic design with something that can help society.
Automotive technology is much better today, yet if you look at [car accident] stats over the years, you’ll see there are actually more crashes. With the current youth culture, we’re too busy looking at our phones, being distracted. There are a lot of things we ought to look out for, especially when it comes to hitting pedestrians. That’s a big theme that I think we projected and put forward in the Precision Cockpit.
In a way I feel like I’m redeeming myself–my first career out of college with the smart ads, my job was to create distraction, and I did feel guilty for that. Now, with the Precision Cockpit we showed how you can reduce driver distraction. That’s Acura’s new direction and something that felt very rewarding to be able to work on.
How does the Precision Cockpit reduce driver distraction?
For one thing, we have an interface that uses what we call absolute-position mapping. It’s this touchpad in the lower middle, which corresponds with the screen higher up in the center of the dashboard. You’ll notice the position of the center screen is really high up, keeping the driver’s eyes in the direction of the road, and it’s also out of reach. How it works is, you [control items on the screen] by placing your finger on the touchpad; but it’s not like a laptop–the actual coordinates are mapped. Meaning if you touch the top left or the bottom right of the touchpad, that registers on the top left or bottom right of the screen.
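The absolute-position mapping Westbrook describes can be sketched in a few lines. This is a minimal illustration, not Acura's implementation; the touchpad and screen dimensions are assumed values for the example.

```python
def map_touch_to_screen(touch_x, touch_y, pad_w, pad_h, screen_w, screen_h):
    """Absolute-position mapping: each point on the touchpad corresponds
    to a fixed point on the screen. Unlike a laptop trackpad, which moves
    a cursor relative to its last position, touching the top-left of the
    pad always registers at the top-left of the screen."""
    # Normalize the touch coordinates to the 0..1 range,
    # then scale them to the screen's resolution.
    return (touch_x / pad_w * screen_w,
            touch_y / pad_h * screen_h)

# Example with assumed dimensions: a 100x60 mm pad, a 1280x720 px screen.
# A touch at the pad's bottom-right corner lands at the screen's
# bottom-right corner, regardless of where the finger was before.
print(map_touch_to_screen(100, 60, 100, 60, 1280, 720))  # (1280.0, 720.0)
```

The key property is statelessness: the screen position depends only on where the finger is now, so the driver can aim by muscle memory without watching a cursor travel.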
A lot of the different crashes that are occurring these days are due to bad peripheral vision. If we can keep the driver’s [eyes aimed] so that the road ahead stays in their peripheral vision, we really feel that’s the best user experience. You saw our RDX today, and it has the touchpad and the high center screen. With the RDX we’ve implemented a lot of the features that we showed in the Precision Cockpit itself.
Every design discipline has to ask themselves different sorts of questions in order to achieve their goals. What are the things that HMI designers ask themselves?
“How do you create a visual language around autonomous?” “How do you sketch out these concepts of looking for pedestrians?” “How do you maintain performance?” And how to put all of that together, while creating both a simplicity and a luxury feel. These are very exciting areas that I’m happy to work on.
What does an HMI designer use in terms of research?
We’re constantly working with new tools: things related to detecting cognitive load, looking at biometrics, doing user testing, finding new approaches that technology offers to make sure our customers really have the best experience. We have engineering and studies done in our Ohio R&D center. We’ve partnered with The Ohio State University, which has a driver-distraction simulator; for longer than the RDX itself was in development, we were developing the Human-Machine Interfaces to support the car. We simulate all kinds of things. We’ve put actual people in the driver-simulation lab, monitored their behavior in simulated scenarios and tried to measure things like distraction.
Speaking of Honda’s R&D center, Acura is a relatively small company, but parent company Honda is obviously massive. What kinds of resources does that put at your disposal?
Part of my process in the research phase is trying to look at existing solutions that might not be obvious to most people. And one example of how being at this company helped: We not only manufacture cars, motorcycles and all sorts of other power equipment, but we also make a jet plane. So I traveled to Honda’s jet factory and R&D studio in North Carolina to learn from the aviation industry’s HMI. I asked them “What’s the most important thing you’ve designed for pilots, who are already flying autonomously?”
The answer was what’s called Synthetic Vision Mode. That’s a display the pilots monitor on screen, almost like a video game graphic, showing that, say, there’s a hill in front of you–because it could be cloudy that day and you wouldn’t see it otherwise. So the pilots monitor the Synthetic Vision Mode while the plane is essentially driving itself, you could say, in the air.
I took that inspiration and applied it to the Precision Cockpit’s Digital Meter concept [which renders hard-to-see obstacles on a screen]. That’s an advanced vision mode meant to show the importance of preparing for autonomous.
Things like the Synthetic Vision Mode and preparing for autonomous are pretty hi-tech or futuristic, but in your presentation you also showed slides of antique objects for inspiration. Can you talk about that?
When it comes to inspiration for HMI, most people might think it’s just the far future that can inspire you. But looking back at the Age of Antiquity, I feel there’s actually a lot you can learn for HMI. While there weren’t any computers, laptops or anything like that, they had technology. Any kind of human experience or user experience was there.
If you look at the astrolabe that I have, it’s an antique. That’s kind of the original Google Maps or the first smartphone. You put this on a ship, and it took a skilled person to decipher it, but you could use it to navigate by the stars. That was a luxury of sorts, and it also provided performance. How can we emulate that today? How can we evolve it into simplicity?
Luckily here in Torrance, where the Design Studio is, we have a lot of antique shops. I spend a lot of my lunch breaks going through them, trying to discover different treasure troves that I find very fascinating. Studying the craftsmanship and figuring out how users interacted with them back then. Because my theory is that all users have the same problems, the same user needs, and we’re just fulfilling them in different eras.
Up Next: The HMI cannot exist in a vacuum, but must be integrated into the car by the interior designer, who also needs to consider passenger comfort, ingress/egress, driving position and more. Stay tuned for our chat with the interior designer.