Prof. Henry Liu and Michigan Traffic Lab Featured in Forbes
University Of Michigan deploys augmented reality system to aid testing of automated vehicles.
View the Forbes article here: University of Michigan Deploys Augmented Reality System to Aid Testing of Automated Vehicles
All things considered, there have been remarkably few accidents so far as a result of real-world testing of automated vehicles. However, those vehicles have all had safety drivers and engineers on board to monitor the situation and take over when there is any doubt about the vehicle's ability to handle things. Unfortunately, this sort of testing isn't particularly suitable for evaluating many dangerous edge cases, such as a vehicle running a red light or making a dangerous left turn. That's where the University of Michigan has applied augmented reality at the Mcity test facility in Ann Arbor, Mich.
Mcity opened in June 2015 as a public-private partnership between the university and a group of automakers, suppliers and other companies, including insurers. The 37-acre test track on the university's research campus on the northeast side of the city has a variety of road types and replicas of real-world environments for testing connected and automated vehicles.
A research team in the Michigan Traffic Lab led by Prof. Henry Liu has developed the augmented reality system that enables engineers and researchers to evaluate challenging and dangerous scenarios in a safe environment. The traffic lab controls all of the signaling and tracks the vehicles on the facility. It collects and archives data from those vehicles for researchers to use.
In addition to taking in data, the lab can feed data out to the vehicles via the vehicle-to-infrastructure (V2I) communications system. Using the same dedicated short-range communications (DSRC) technology that Cadillac uses for the vehicle-to-vehicle (V2V) system in the recently updated CTS, the traffic lab can place virtual vehicles around the physical test vehicles. The system uses a powerful simulation environment to calculate the trajectories of multiple other vehicles in the area, and their position, speed and heading are broadcast over the V2I system. The virtual vehicles can be tracked relative to the test vehicle on a screen in the lab while the test vehicle reacts in real time to what it thinks it is "seeing."
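As a rough illustration (not Mcity's actual implementation, which the article does not detail), the broadcast step might look something like the Python sketch below. A hypothetical UDP endpoint stands in for the DSRC radio link, and the 10 Hz update rate mirrors the cadence commonly used for basic safety messages; the vehicle ID, coordinates and endpoint address are all made-up example values.

```python
import json
import socket
import time
from dataclasses import dataclass, asdict

# Hypothetical roadside-unit endpoint standing in for the DSRC/V2I radio;
# the real Mcity broadcast mechanism is not described in the article.
RSU_ADDRESS = ("192.0.2.10", 4200)

@dataclass
class VirtualVehicleState:
    """State the simulation would broadcast for each virtual vehicle."""
    vehicle_id: str
    latitude: float      # degrees
    longitude: float     # degrees
    speed_mps: float     # meters per second
    heading_deg: float   # degrees clockwise from north

def broadcast_states(states, sock):
    """Serialize each virtual vehicle's state and send it toward the RSU."""
    for state in states:
        payload = json.dumps(asdict(state)).encode("utf-8")
        sock.sendto(payload, RSU_ADDRESS)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # One illustrative virtual vehicle approaching an intersection.
    virtual_car = VirtualVehicleState("ghost-01", 42.2998, -83.6995, 12.5, 90.0)
    for _ in range(10):          # roughly one second of updates at 10 Hz
        broadcast_states([virtual_car], sock)
        time.sleep(0.1)
```

In practice the simulation would update each trajectory every cycle before broadcasting, and the test vehicle's onboard unit would receive these messages exactly as it would receive reports about real surrounding traffic.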
In a demonstration this week, we saw an actual car run a red light as an autonomous test car recognized its presence and stopped to yield the right of way at the intersection before making a left turn. The same test was then repeated with a virtual car running the red light, with the same response. While similar tests of software could be run entirely in simulation, there is also value in doing this with physical prototypes to evaluate other hardware and software elements of the system. The benefit of this augmented or mixed reality environment is that it is much safer, with no risk of real crashes, and much less expensive, since it doesn't require a fleet of vehicles and drivers to simulate the road environment.
The augmented reality system has only been online for about one month, but it will be available going forward to all of the researchers and companies using Mcity.