Smart cities will depend on the Internet of Things to monitor and react to changes in transit, utility use, and their residents’ behaviors. Tier 1 automotive supplier Xevo Inc. is already using machine learning and cloud-based analytics to help drivers in connected cars.
In December, UIEvolution Inc. acquired Surround.io Corp. and renamed itself Xevo. The company has developed technology to improve the driver experience.
Xevo is combining big data gathered from millions of mobile sensors on the roads today with artificial intelligence to gather insights about driving habits, vehicle status, and traffic patterns that can be analyzed and fed back to drivers in real time.
- Through a combination of in-car processors, cloud services, and machine learning, Xevo is bringing IoT to roadways now.
- Xevo’s AI systems take head-unit data and provide recommendations directly to drivers.
- Major automakers are working to integrate nationwide training models, engineering discipline, and dynamic updates to improve the driving experience, even before self-driving cars become available.
Crew has decades of experience
Executives at both Kirkland, Wash.-based UIEvolution and Seattle-based Surround previously worked at Microsoft Corp.
“We knew Satoshi Nakajima, and it was obvious right away that there was an opportunity,” said John Ludwig, CEO of Surround and now president of the AI group at Xevo.
Nakajima worked on Windows 95 and Internet Explorer 3 and is now chief scientist at Xevo. Ludwig was a vice president at Microsoft and founded Surround in 2013.
“We started in the PC business and saw the Internet, mobile, and cloud waves,” Ludwig told Robotics Business Review. “Connected cars are the next major device — this wave will be bigger than those before it.”
Connected cars get smarter
Even before self-driving cars reach the market, AI can improve daily commutes. Connected cars can not only provide information, entertainment, and marketing to drivers and passengers, but they can also convey data on component wear, movement, and traffic patterns.
“A huge transformational shift is taking place in the automotive industry; the very nature of the car experience is changing dramatically,” Ludwig said. “We saw a chance to massively improve the driver experience through the user interface and machine learning.”
As its name implied, UIEvolution focused on user interfaces, developing the UIE CloudConnect in-vehicle “infotainment” platform. With Surround.io, Xevo has broadened its focus from middleware to machine vision, deep learning, and cloud services.
“Ten million cars on the road have some form of Xevo software in their head unit,” said Brian Woods, chief marketing officer at Xevo. “Our biggest customers include Honda, Toyota, its Lexus division, and others. As you can imagine, that’s lots of edge or endpoint devices.”
BI Intelligence predicts that 94 million connected cars will ship in 2021, representing 82% of the market, compared with 21 million vehicles today. It also expects that 381 million connected cars will be operating by 2020.
Big data behind the wheel
The challenge is processing data from those millions of vehicles, combining it with GPS to see shifting traffic patterns, and providing useful recommendations to individual drivers.
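The aggregation step described above can be sketched in a few lines: snap each car's GPS speed report to a coarse grid cell, then flag cells whose average speed drops below a threshold. This is a minimal illustration of the idea, not Xevo's implementation; the cell size and speed limit are assumed values.

```python
# Assumed sketch: turn per-vehicle GPS speed reports into a traffic picture
# by bucketing reports into grid cells and flagging slow-moving cells.
from collections import defaultdict

def cell(lat, lon, size=0.01):
    """Snap a coordinate to a grid cell roughly 1 km on a side."""
    return (round(lat / size), round(lon / size))

def congested_cells(reports, limit_kph=20.0):
    """Return the set of grid cells whose mean reported speed is low."""
    speeds = defaultdict(list)
    for lat, lon, kph in reports:
        speeds[cell(lat, lon)].append(kph)
    return {c for c, v in speeds.items() if sum(v) / len(v) < limit_kph}

reports = [
    (36.1699, -115.1398, 12.0),  # two slow reports in one Las Vegas cell
    (36.1701, -115.1396, 9.0),
    (36.2100, -115.2000, 55.0),  # free-flowing traffic elsewhere
]
print(congested_cells(reports))  # one congested cell is flagged
```

A production system would of course use map-matched road segments rather than a raw grid, but the same group-then-threshold pattern applies.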
“The development of AI and machine learning in the past four to five years has an obvious application to cars, with the amount of data they’re creating,” Ludwig said. “With experience from Amazon Web Services, Azure, and Surround.io, we can categorize video and audio data on a massive scale.”
Xevo Cloud integrates with existing in-car technologies and allows for updates as software and vehicles change. Xevo is a member of the SmartDeviceLink Consortium for in-car apps, along with Ford, Toyota, and others.
“The connected car has a camera or two today, but it will have even more in the future,” Ludwig added. “The key is to have a machine learning engine in the car processing data, extracting high-value bits, and presenting them to a user in a way that’s valuable.”
“One way to do that is compressing the data. At the same time, the car needs a trained model to understand that data,” he said. “We need many vehicles collecting data in the street, the cloud-side component, and reliable delivery back to the cars. We need both pieces to work well in high-volume scenarios.”
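The edge-side half of what Ludwig describes — an in-car model scoring incoming data, keeping only the high-value pieces, and shipping compact summaries upstream — might look like the following sketch. The scoring threshold and record format are assumptions for illustration, not Xevo's actual pipeline.

```python
# Hypothetical edge filter: a trained on-board model scores each frame,
# and only high-value events are summarized for upload to the cloud.

def is_high_value(frame_score: float, threshold: float = 0.8) -> bool:
    """Keep a frame only if the on-board model scores it highly."""
    return frame_score >= threshold

def summarize(frames):
    """Compress raw frames into compact event records before any upload."""
    return [
        {"t": t, "score": round(s, 2)}
        for t, s in frames
        if is_high_value(s)
    ]

# Simulated (timestamp, model score) pairs from an in-car camera feed
raw = [(0, 0.10), (1, 0.92), (2, 0.30), (3, 0.85)]
events = summarize(raw)
print(events)  # only the two high-scoring frames survive the filter
```

The cloud side would then aggregate these sparse events across many vehicles to retrain the model, closing the loop Ludwig describes.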
“We have the tools to create models of the whole U.S.,” said Woods. “We can make a dumb road smart.”
Watching wear and tear
“There’s lots of fascinating things we can do even without camera data,” said Woods. “We’ve already got tons of data and seen unexpected results.”
Xevo can let partner automakers know which parts are wearing faster than others. “There’s lots of OBD [on-board diagnostics] data already available to Toyota technicians,” Ludwig said.
“Since we’re gathering data from multiple vehicles, most OEMs will want data for their customers, so we ensure that the data remains theirs,” he added. “Most manufacturers won’t want to pool their data.”
Safer and smoother rides
With a suite of internal and external cameras and aggregate data, Xevo’s AI could ultimately detect when drivers are distracted or tired.
In addition, Xevo can conduct big data analysis to determine vehicle speed and location. Cars from the same automaker can communicate with one another rather than rely on environmental tags or just their own observations.
“Connected cars can provide V2V [vehicle-to-vehicle] situational awareness for safety and mechanical reasons,” said Woods. “We can do cross-sectional stuff and predictive analysis of things like the availability of parking. As more users opt in, they can learn about the weather, road conditions, etc.”
Xevo’s systems can detect problems such as an accident or traffic jam and alert drivers to alternate routes. In addition, Xevo Sync connects smartphones to in-car systems.
“End users could benefit from a scorecard,” said Ludwig. “How am I doing? ‘By the way, based on the usage of your car, you might need new tires. Your driving patterns suggest that rainy weather is more challenging for you,’ and you might be able to look at your route history and glean information from other drivers to find a more efficient route to work.”
“Eventually, end users could provide data for insurance discovery, but that’s not our first goal, and it would be opt-in,” he said. “Once they have that feedback on navigation, safety, and the state of the vehicle, end users might want to opt in for better insurance or to get a better deal on tires.”
“We must be super respectful of privacy and security,” Ludwig noted. “Our core vision is pushing enough compute power out to the edge and doing an analysis of the full range of car data plus video data to provide advice to drivers at the point of capture.”
“Raw data that’s potentially sensitive or uniquely identifying never has to flow up to the cloud,” Ludwig explained. “It’s low-level information: CAN data from the internal bus of the car. This includes data about brakes, performance, etc., and entertainment system interaction data.”
“Another example of what we can do with existing data, based on a University of Washington paper, is generating a ‘fingerprint’ of the driver from CAN bus data: how they turn the wheel and depress the pedals,” he said. “Since that’s unique, the car can know if it’s the approved driver.”
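In the spirit of the University of Washington result Ludwig cites, a driver fingerprint can be sketched as simple statistics over steering and pedal traces, with new trips compared against the stored profile. The feature names and tolerance below are assumptions for illustration only.

```python
# Hypothetical driver fingerprint from CAN-bus-style features:
# profile a driver by averaging per-trip features, then accept a new
# trip only if every feature falls within a relative tolerance.
from statistics import mean

def profile(trips):
    """Average each feature across a driver's enrollment trips."""
    return [mean(trip[i] for trip in trips) for i in range(len(trips[0]))]

def matches(stored, observed, tol=0.15):
    """Accept if every observed feature is within tol of the profile."""
    return all(abs(o - s) <= tol * abs(s) for s, o in zip(stored, observed))

# Assumed features: [mean steering rate (deg/s), mean brake pressure (bar)]
owner_trips = [[12.0, 4.1], [11.5, 4.3], [12.4, 4.0]]
stored = profile(owner_trips)

print(matches(stored, [12.1, 4.2]))  # owner-like trip -> True
print(matches(stored, [20.0, 7.5]))  # very different style -> False
```

A real system would use far richer features and a learned classifier, but the enroll-then-compare structure is the same.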
A peek into the future
Ludwig and Woods showed an impressive real-time map of traffic patterns and other mechanical data from 4,000 cars in Las Vegas. Xevo is able to combine this with weather, real estate, and other information.
“We’ve shown a preview of what the car experience will look like in the future,” Ludwig said.
Toyota is working with Xevo Cloud’s Hyperscale AI, which Xevo says “lets automakers run massive data analytics and training models in the largest AI server farm in the world.”
“Instead of weeks of computation and manual analysis, we make it easy to test millions of machine-learning models simultaneously. This means unique applications for individual drivers and vehicles,” according to Xevo. “This technology is already live in our Xevo test vehicles, which are on the roads today gathering data and generating results.”
Xevo is also working with several partners, including two delivery fleet companies, which Ludwig declined to identify. “It’s a great way to kickstart this and collect a lot of data in a high-usability case,” he said.
“They’re interested in a lot of information, and there are fewer privacy concerns,” Ludwig said. “They can validate drivers, detect variance from their own fingerprints, and notice how different shifts work. Efficiency from rerouting or dealing with weather makes every mile driven more valuable.”
“We’re working on two things,” he said. “For existing connected cars that already have a lot of CAN bus data, there’s an opportunity to do highly valuable analytics based on that data alone, such as tire wear and recommendations, without any video data.”
“Beyond that, we’ll be placing cameras inside and outside cars to see driver behavior and the environment and build models based on that,” Ludwig said. “I’m a huge believer in giving tons of info to the end user. People will want it and take advantage of it in different ways.”
Xevo’s multi-tenanted architecture could also help with smart cities as the U.S. investigates the required infrastructure.
Self-driving on the horizon
With all the major automakers working on self-driving cars, how would Xevo’s offerings fit in?
“The exact boundary will play out over time, but my guess is we’ll have a mixed environment,” said Ludwig. “In some cases, there will be an active driver who’ll need more information.”
“In other cases, in autonomous mode, our systems will play great together,” he predicted. “We’re not focused on solving the autonomous driving challenges; we’re focused on making driving better for the driver.”
“We’ll take advantage of any underlying sensor data, such as from lidar, to help drivers,” Ludwig said. “We want to give people more info about different situations to help them make better decisions.”
“What if you knew that you were approaching an area of the city that has more accidents at night or a section of the road that tends to get slippery?” he asked. “Maybe you’d drive a different way.”
“We could provide the platform for a mountain of data,” said Woods. “We could enhance autonomy while at the same time helping drivers.”
“By focusing on delivering valuable information to the end user rather than on automation, we’re on a quicker path to commercialization and can avoid regulatory challenges,” Ludwig said.