The future of the car is driverless. Everyone knows it, and it was on full display at CES 2018, with a host of companies showing why their particular flavor of autonomous driving was better than that of the company just a few feet away at the Las Vegas Convention Center.

The fact is that true self-driving cars – ones that can drive on any street and take you door to door – are probably still several years away. The less revolutionary but more immediate trend in automotive technology, the one that will change the way we drive in the next few years, is connected cars.

The next generation of cars will have always-on mobile connections and internal sensors, which will allow automakers to re-imagine the in-car experience with touchscreens, voice interaction and even gesture control. It sounds awesome and looks elegant but, after what I tried at CES, the dashboards of the future may not be what we want, or even a very good experience overall.

Owners of upgraded cars will feel it earliest in a technology that is already commonplace on dashboards: touch. However, the touchscreens of tomorrow will be a leap forward from the resistive touchscreens that are the norm in today's cars (Tesla excepted): they will be much closer to the capacitive multitouch experiences we are used to on iPhones and iPads.

The dashboard iPad

The poster child here is the 2018 Mercedes A-Class. Arriving very soon, the new car has a 10-inch touchscreen – big, but not too big – that is as responsive as a smartphone's. The user interface has clean iconography designed not to distract, and it is very customizable: you can even configure it to display nothing while driving (which would be my preference).

The dashboard of the new Mercedes A-Class makes voice and touch the main ways to interact.

Image: Pete Pachal / Mashable
Dashboards like the A-Class's make me uncomfortable, however, since they completely give up hard physical buttons. That is not a trend I'm a fan of, especially because I find it easier to keep your eyes on the road if you can find your way around a dashboard by feel. Of course, that's why the A-Class also offers conversational voice interaction, allowing you to simply talk to your car to do things like play music or turn on the AC. More on that in a minute.

As for big-screen dashboards, the most bizarre I’ve seen was the Byton SUV concept. The 40-inch display (!) is recessed, and it extends from one side of the windshield to the other. It is not a touchscreen; instead, the driver controls everything with buttons and a touchscreen on the steering wheel.

When the car stops, the Byton activates a new type of interaction: gestures. You can point at and select items on the screen with your fingers, as you would with a Microsoft Kinect. It’s a UI experience I’ve never found satisfying, and if it were used to control the car’s functions while driving, it would be downright dangerous – but fortunately Byton was wise enough to disable gesture control while the car is moving.

Look whose car is talking

Forget touching and pointing at screens, though; in the end, what we really want is to simply talk to our cars and have them do things for us. Basically, we want Alexa in the car.

A number of automakers had already announced they were working on this, but at CES their ranks grew, including Toyota, the world’s largest automaker. Panasonic announced that it would work to bring Alexa and Google Assistant to the infotainment systems it supplies to automakers.

At least some automakers, however, think they can create a better voice experience of their own, and one of their demos – the Mercedes A-Class I mentioned above – almost convinced me. The car's assistant was excellent at parsing natural language and inferring intent. Instead of issuing a clinical instruction like, “Mercedes, lower the cabin temperature by 2 degrees,” you can just say, “Mercedes, it’s hot,” and it will perform the same action. Calling up specific songs, genres and radio stations was just as simple and natural.
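To make the idea concrete – and this is purely a toy illustration, not Mercedes' actual system – intent inference boils down to mapping different phrasings onto the same canonical action. A minimal Python sketch, with every function and action name hypothetical:

```python
# Toy sketch of voice intent inference: a clinical command and a
# casual remark both resolve to the same canonical climate action.
# Illustrative only; no production assistant works this simply.

def infer_intent(utterance: str) -> dict:
    """Map a spoken phrase to a hypothetical canonical action."""
    text = utterance.lower()
    # Explicit command: "lower the cabin temperature by 2 degrees"
    if "lower" in text and "temperature" in text:
        return {"action": "adjust_climate", "delta_celsius": -2}
    # Casual remark implying the same intent: "it's hot"
    if "hot" in text:
        return {"action": "adjust_climate", "delta_celsius": -2}
    # Media requests: "play some jazz"
    if "play" in text:
        return {"action": "play_media", "query": text}
    return {"action": "unknown"}

# Both phrasings produce the same action:
print(infer_intent("Mercedes, lower the cabin temperature by 2 degrees"))
print(infer_intent("Mercedes, it's hot"))
```

Real assistants replace the keyword checks with trained language models, but the contract is the same: many utterances, one intent.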

I also had the opportunity to try the voice experience in the latest Toyota Camry XLE, which uses voice recognition technology from a company called Voicebox. Again, it could interpret natural commands like, “I really need to call Sam.”

The Toyota Camry XLE dashboard, with voice interactions powered by Voicebox.

Image: Pete Pachal / Mashable
The promise of all these systems is that, because the cars are connected, the experience can be updated at any time. Tesla has long offered over-the-air software updates, and other automakers have followed suit. When 5G becomes a reality, the trend will only accelerate.
Many of these interfaces are still rough. When I was in the Mercedes, it had trouble understanding the word “jazz.” Byton's gesture control looks like a disaster. And there are still questions about security and privacy to be addressed. But the in-car experience is changing, and driving a car in the 2020s will probably feel a bit empty without Alexa, Google, or one of their talkative cousins as a passenger. I hope you weren't too attached to those physical buttons.


