Newer cars already incorporate some artificial intelligence, but soon it will transform every aspect of driving, from navigation and voice commands to autonomy.

The goal is to create a seamless experience for everyone in the vehicle, whether that means making it simpler to use voice commands to find an Italian restaurant with a good wine list, or enabling a self-driving car to know when it is acceptable to leave the road to avoid an obstacle.

There is nothing to fear from this rapid expansion of machine intelligence, but a few spectacular failures outside the automotive world do show humans need to play a supervisory role over AI programs for the foreseeable future.

These are key takeaways from a diverse panel of experts at the recent WardsAuto User Experience Conference discussing how AI and cognitive learning can recast our time behind the wheel.

AI and neural networks have already become part of driving for those who use smartphone personal assistants through Apple CarPlay and Android Auto. Adding machine intelligence to vehicles also promises to be big business for companies such as IBM, speech-recognition supplier Nuance, and software and chip supplier NVIDIA.

AI is driving the fourth industrial revolution and will transform many industries like steam and electricity did in earlier times, says Sanford Russell, head of autonomous driving at NVIDIA. Neural networks and what NVIDIA calls deep learning emerged in 2012 and “blew away 25 years of programming expertise,” he says.

IBM is making a big play in the medical field using AI and now is looking to make a splash in automotive. Watson, the company’s cloud-based computer system, uses what IBM calls cognitive computing to comb through vast amounts of published research and data, analyze information and diagnose problems.

IBM is rebranding itself as a cognitive computing company and has exited most of the older hardware businesses with which it long was associated.

At the WardsAuto UX Conference, Dan Ricci, global automotive leader at IBM Cognitive Solutions, tells attendees IBM is an established automotive supplier with about $5 billion in annual auto-related revenue.

Now it is looking to expand further by making Watson a Siri-like assistant inside the vehicle that provides more personalized services than smartphones by incorporating vehicle information and the owner’s manual to perfect the driving experience.

Ricci outlines a scenario where Watson enhances the user experience by monitoring miles traveled and letting drivers know when preventive maintenance is required; explains the function of components such as the timing belt and why it needs to be replaced; and even schedules service appointments and arranges loaner vehicles.
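The mileage-monitoring piece of that scenario boils down to comparing the odometer against service intervals. The sketch below is purely illustrative: the interval values and function names are assumptions for the example, not IBM Watson’s actual API.

```python
# Illustrative sketch of a mileage-based maintenance reminder.
# Intervals below are hypothetical examples, not manufacturer figures.
SERVICE_INTERVALS_MILES = {
    "oil_change": 7_500,
    "tire_rotation": 10_000,
    "timing_belt": 90_000,
}

def due_services(odometer_miles, last_service_miles):
    """Return the services whose interval has elapsed since they were last performed."""
    due = []
    for service, interval in SERVICE_INTERVALS_MILES.items():
        miles_since = odometer_miles - last_service_miles.get(service, 0)
        if miles_since >= interval:
            due.append(service)
    return due

# Example: a car at 95,000 miles with this (hypothetical) service history
history = {"oil_change": 55_000, "tire_rotation": 52_000, "timing_belt": 0}
print(due_services(95_000, history))
```

An in-vehicle assistant would layer conversation on top of a check like this, for example explaining why the timing belt entry matters before offering to book the appointment.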

Watson also can act as a co-pilot, advising on the best routes and on when and where adaptive cruise control should be activated to maximize fuel economy.

Tayler Blake, a machine learning expert at Pillar Technology, says interfaces that use algorithms to converse in natural language are becoming important because they simplify the user interface and make it more personal at the same time. “No matter how cool your user interface is, it would be even cooler if there is less of it,” she says.

“In 2016, we’re living faster than ever. I want to perform tasks with as little interaction as possible, but I want it to seem relevant and personal. Algorithms simplify bots like Siri. How to integrate algorithms with current (user experience) design practices, that’s the big question,” she says.
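At its simplest, the integration Blake describes means mapping a free-form utterance to an action the interface can perform. The toy sketch below uses keyword overlap to illustrate the idea; production assistants such as Siri rely on trained language models, and every name here is an assumption for the example.

```python
# Toy intent-matching sketch for a conversational car interface.
# Real assistants use statistical language models; this keyword-overlap
# approach only illustrates mapping free text to an action.
INTENTS = {
    "find_restaurant": {"restaurant", "eat", "food", "dinner"},
    "schedule_service": {"service", "maintenance", "appointment"},
    "navigate": {"navigate", "directions", "route"},
}

def classify(utterance):
    """Return the intent whose keyword set overlaps most with the utterance."""
    words = set(utterance.lower().split())
    best, best_score = "unknown", 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best, best_score = intent, score
    return best

print(classify("Find an Italian restaurant with a good wine list"))
```

The appeal of this pattern, in Blake’s terms, is that the interface itself shrinks: the user states a goal in plain language instead of navigating menus.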