Hand-Tracking Tech Watches Riders in Self-Driving Cars to See If They’re Ready to Take the Wheel

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/cars-that-think/transportation/self-driving/handtracking-tech-for-the-next-generation-of-autonomous-cars

While tracking someone’s hands may sound simple, it can be hard to do in the cramped confines of a car, where there are only a few good spots to place a camera. A driver’s hands can also become occluded by one another or by objects, and cameras may be hindered by, for example, harsh sunlight falling on the driver’s arm.

In their new approach, Yuen and Trivedi took an existing program for tracking the full-body movements of people and adapted it to track the wrists and elbows of a driver, and of a passenger if present. It distinguishes between the left and right joints of both front-seat riders. The researchers then developed and applied machine learning algorithms, training the system on 8,500 annotated images to support Level 3 autonomous technology.
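The authors haven’t published their code, but the core idea of repurposing a full-body pose estimator for in-cabin joint tracking can be sketched roughly as follows. The joint names, the data layout, and the seat-assignment heuristic (splitting by horizontal image position, assuming a left-hand-drive car and a centered cabin camera) are illustrative assumptions, not the researchers’ implementation:

```python
# Illustrative sketch (not the authors' code): given 2D keypoints from any
# full-body pose estimator, keep only the upper-limb joints relevant to
# takeover readiness and assign each detected person to a front seat.

UPPER_LIMB = ("left_elbow", "right_elbow", "left_wrist", "right_wrist")

def extract_upper_limbs(people, image_width):
    """people: list of dicts mapping joint name -> (x, y) pixel coordinates,
    one dict per detected person. Returns a dict keyed by seat
    ("driver"/"passenger") holding only the wrist and elbow joints.
    Assumes the driver appears in the left half of the frame."""
    seats = {}
    for joints in people:
        limbs = {name: joints[name] for name in UPPER_LIMB if name in joints}
        if not limbs:
            continue  # person with no visible upper-limb joints: skip
        # Seat-assignment heuristic: mean x-position of the visible joints.
        mean_x = sum(x for x, _ in limbs.values()) / len(limbs)
        seat = "driver" if mean_x < image_width / 2 else "passenger"
        seats[seat] = limbs
    return seats

# Example: two people detected in a 640-pixel-wide cabin image.
people = [
    {"left_elbow": (100, 200), "left_wrist": (120, 260),
     "right_elbow": (160, 200), "right_wrist": (180, 260)},
    {"left_elbow": (500, 200), "right_wrist": (540, 260)},
]
tracked = extract_upper_limbs(people, image_width=640)
```

In a real pipeline the `people` list would come from the adapted full-body tracker frame by frame; the point of the sketch is only that the upper-limb subset and the driver/passenger split are cheap post-processing steps on top of an existing estimator.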

“The approach is capable of highly accurate, and very efficient hand detection, localization, and activity analysis in a very wide range of real-world driving situations, involving multiple humans and multiple vehicles,” says Trivedi.

Their analysis shows that the system identified the locations of all eight joints (the left and right elbows and wrists of both driver and passenger) with 95 percent accuracy. However, its localization error averaged about 10 percent of a person’s arm length.
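Accuracy figures like this are typically computed with a keypoint-correctness metric: a predicted joint counts as correct if it lands within some fraction of a reference length (here, arm length) of the ground-truth position. A minimal sketch of that style of evaluation, using made-up coordinates and a 10 percent tolerance to echo the numbers above (the function and data are illustrative, not the paper’s evaluation code):

```python
import math

def keypoint_accuracy(predicted, ground_truth, ref_length, frac=0.10):
    """Fraction of predicted joints within frac * ref_length of the truth.
    ref_length plays the role of the subject's arm length; frac=0.10
    mirrors the 10-percent tolerance mentioned in the article."""
    hits = 0
    for (px, py), (gx, gy) in zip(predicted, ground_truth):
        if math.hypot(px - gx, py - gy) <= frac * ref_length:
            hits += 1
    return hits / len(predicted)

# Toy check: one joint within 6 pixels (10% of a 60-pixel arm), one not.
predicted = [(0, 0), (10, 0)]
truth = [(0, 3), (10, 20)]
score = keypoint_accuracy(predicted, truth, ref_length=60)
```

Here the first prediction is 3 pixels off (within the 6-pixel threshold) and the second is 20 pixels off, so the score is 0.5.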

The tracking system failed in some cases, such as when the driver wore distinctive clothing with heavy artistic texturing that was not represented in the training set, or when one of the driver’s arms blocked the camera’s view of the other.

The researchers say some of the problems encountered during their tests can be addressed by placing the camera in a better location to avoid occlusions, using multiple camera views, and increasing the training dataset to include more variety in clothing.

“This project is part of our larger research effort on the development of safe autonomous vehicles,” says Trivedi. He adds that the team is talking with at least one potential client about using this technology in a commercial setting, but said he couldn’t divulge which company has expressed interest.