The future of the car is driverless. Everyone knows it, and it was on full display at CES 2018, with a host of companies showing why their particular flavor of autonomous driving was better than the company's just a few feet away in the Las Vegas Convention Center parking lot.
The fact is that true self-driving cars – those that can drive on any street and take you door to door – are probably still several years away. The less revolutionary but more immediate trend in automotive technology, the one that will change the way we drive in the next few years, is connected cars.
The next generation of cars will have always-on mobile connections and internal sensors, which will let automakers re-imagine the in-car experience with touchscreens, voice interaction, and even gesture control. It sounds awesome and looks elegant, but after what I tried at CES, the dashboards of the future may not be what we want, or even a very good experience overall.
Owners of upgraded cars will feel it earliest in a technology that is already commonplace on dashboards: touch. The touchscreens of tomorrow, however, will be a leap beyond the resistive screens that are the norm in today's cars (Tesla excepted): they will be much closer to the capacitive multitouch experiences we are used to on iPhones and iPads.
The dashboard iPad
The poster child here is the 2018 Mercedes A-Class. Shipping very soon, the new car has a 10-inch touchscreen – big, but not too big – that is as responsive as a smartphone's. The user interface has clean iconography designed not to distract, and it is very customizable: you can even configure it to display nothing while driving (which would be my preference).
As big-screen dashboards go, the most outlandish I saw was the Byton SUV concept. The 49-inch display (!) is recessed, and it extends from one side of the windshield to the other. It is not a touchscreen; instead, the driver controls everything with buttons and a touchscreen on the steering wheel.
When the car is stopped, the Byton activates a new type of interaction: gestures. You can point at and select items on the screen with your fingers, much as you would with a Microsoft Kinect. It's a UI experience I've never found satisfying, and if it were used to control the car's functions it would be downright dangerous, but fortunately Byton was wise enough to disable gesture control while the car is moving.
Look whose car is talking
Forget touching and pointing at screens, though; in the end, what we really want is to simply talk to our cars and have them do things for us. Basically, we want Alexa in the car.
A number of automakers had already announced they were working on this, but CES saw the list grow, including Toyota, the world's largest automaker. Panasonic, meanwhile, announced it would work to bring Alexa and Google Assistant to the infotainment systems it supplies to automakers.
At least some automakers, however, think they can build a better voice experience for their own cars, and one of their demos, the Mercedes A-Class I mentioned above, almost convinced me. The car's assistant was excellent at parsing natural language and inferring intent. Instead of issuing a clinical instruction like, "Mercedes, lower the cabin temperature by 2 degrees," you can just say, "Mercedes, it's hot," and it will perform the same action. Queuing up specific songs, genres, and radio stations was just as simple and natural.
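To make the idea concrete, here is a toy sketch of the kind of intent inference described above. This is purely my own illustration under invented names (the `INTENT_RULES` table and `infer_intent` function are hypothetical), not how Mercedes', Amazon's, or Voicebox's systems actually work; real assistants use statistical language models rather than keyword rules.

```python
# Toy illustration of intent inference: map a loose, conversational
# utterance to the same intent as an explicit command.
# All names here are invented for this sketch.

INTENT_RULES = {
    "climate.decrease": ["it's hot", "too warm", "lower the temperature"],
    "climate.increase": ["it's cold", "too chilly", "raise the temperature"],
    "media.play":       ["play", "put on", "tune to"],
}

def infer_intent(utterance: str) -> str:
    """Return the first intent whose trigger phrase appears in the utterance."""
    text = utterance.lower()
    for intent, triggers in INTENT_RULES.items():
        if any(trigger in text for trigger in triggers):
            return intent
    return "unknown"
```

With rules like these, "Mercedes, it's hot" and "lower the temperature" both resolve to the same climate intent, which is the trick that made the demo feel natural.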
I also had the chance to try the voice experience in the latest Toyota Camry XLE, which uses voice recognition technology from a company called Voicebox. Again, it could interpret natural commands like, "I really need to call Sam."