[youtube https://www.youtube.com/watch?v=hhUNgfUxDL4&w=560&h=349]
Researchers at École polytechnique fédérale de Lausanne (EPFL) have combined data from two intelligent test cars, giving each a wider field of view, extended situational awareness, and greater safety.
Autonomous vehicles get their intelligence from cameras, radar, light detection and ranging (LIDAR) sensors, and navigation and mapping systems. But there are ways to make them even smarter. Researchers at EPFL are working to improve the reliability and fault tolerance of these systems by sharing data between vehicles; for example, a car driving behind another can extend its field of view with data from the car ahead.
Using simulators and road tests, the team has developed a flexible software framework for networking intelligent vehicles so that they can interact.
Cooperative perception
“Today, intelligent vehicle development is focused on two main issues: the level of autonomy and the level of cooperation,” says Alcherio Martinoli, who heads EPFL’s Distributed Intelligent Systems and Algorithms Laboratory (DISAL). As part of his PhD thesis, Milos Vasic has developed cooperative perception algorithms, which extend an intelligent vehicle’s situational awareness by fusing data from onboard sensors with data provided by cooperative vehicles nearby.
The researchers used cooperative perception algorithms as the basis for the software framework. Cooperative perception means that an intelligent vehicle can combine its own data with that of another vehicle to help make driving decisions.
They developed an assistance system that assesses the risk of passing, for example. The risk assessment factors in the probability of an oncoming car in the opposite lane as well as kinematic conditions such as driving speeds, the distance required to overtake, and the distance to the oncoming car.
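The article does not spell out the risk model, but a minimal sketch of how such an assessment might combine the probability of an oncoming car with the kinematic quantities listed above could look like the following Python. Every name, unit, and threshold here is an illustrative assumption, not the researchers' actual implementation.

```python
from dataclasses import dataclass

@dataclass
class OvertakeState:
    """Kinematic quantities mentioned in the article (illustrative units: m, m/s)."""
    ego_speed: float            # speed of the overtaking car
    lead_speed: float           # speed of the car being overtaken
    oncoming_speed: float       # assumed speed of a possible oncoming car
    overtake_distance: float    # distance needed to complete the manoeuvre
    oncoming_distance: float    # distance to the (possible) oncoming car
    p_oncoming: float           # probability that an oncoming car is present
                                # (e.g. fused from both vehicles' perception)

def overtaking_risk(s: OvertakeState) -> float:
    """Return a risk score in [0, 1]: probability of an oncoming car times
    the likelihood that the manoeuvre cannot be completed in time.
    This is a sketch, not the authors' actual model."""
    relative_speed = s.ego_speed - s.lead_speed
    if relative_speed <= 0:
        return 1.0  # cannot complete the overtake at all
    time_to_overtake = s.overtake_distance / relative_speed
    # Time until the gap to the oncoming car closes (closing speed = ego + oncoming)
    time_to_conflict = s.oncoming_distance / (s.ego_speed + s.oncoming_speed)
    margin = time_to_conflict - time_to_overtake
    # 5-second safety margin is an assumed tuning value
    conflict_risk = 1.0 if margin <= 0 else max(0.0, 1.0 - margin / 5.0)
    return s.p_oncoming * conflict_risk

# Example: ~80 km/h ego car, ~60 km/h lead car, possible oncoming car 300 m away
state = OvertakeState(ego_speed=22.2, lead_speed=16.7, oncoming_speed=22.2,
                      overtake_distance=120.0, oncoming_distance=300.0,
                      p_oncoming=0.4)
print(f"overtaking risk: {overtaking_risk(state):.2f}")
```

In this toy case the overtake would take longer than the gap to a possible oncoming car allows, so the score collapses to the fused probability that such a car is actually there.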
Difficulties in fusing data
The team retrofitted two Citroën C-Zero electric cars with a Mobileye camera, an accurate localization system, a router for Wi-Fi communication, a computer to run the software, and an external battery to power everything. “These were not autonomous vehicles,” says Martinoli, “but we made them intelligent using off-the-shelf equipment.”
One of the difficulties in fusing data from the two vehicles involved relative localization. The cars needed to know precisely where they were in relation to each other as well as to objects in the vicinity.
For example, if a single pedestrian does not appear to both cars to be in the same exact spot, there is a risk that, together, they will see two figures instead of one. By using other signals, particularly those provided by the LIDAR sensors and cameras, the researchers were able to correct flaws in the navigation system and adjust their algorithms accordingly. This exercise was even more challenging because the data had to be processed in real time while the vehicles were in motion.
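To illustrate the duplicate-detection problem described above, here is a minimal Python sketch that merges pedestrian detections reported by two cars in a common frame, treating detections closer than a gating distance as a single object. The 1.0 m threshold, the simple averaging, and all variable names are assumptions for illustration, not the DISAL fusion algorithm.

```python
import math

def merge_detections(det_a, det_b, gate=1.0):
    """det_a, det_b: lists of (x, y) pedestrian positions in a common frame (metres).
    Returns a fused list in which detections closer than `gate` are merged,
    so one pedestrian seen by both cars is not counted twice."""
    fused = list(det_a)
    for xb, yb in det_b:
        match = None
        for i, (xa, ya) in enumerate(fused):
            if math.hypot(xb - xa, yb - ya) < gate:
                match = i
                break
        if match is None:
            fused.append((xb, yb))                          # object seen only by car B
        else:
            xa, ya = fused[match]
            fused[match] = ((xa + xb) / 2, (ya + yb) / 2)   # same object: average positions
    return fused

# A pedestrian at roughly (10, 2) seen by both cars with slightly different
# localization errors should be reported once, not twice.
car_a = [(10.1, 2.0)]
car_b = [(9.8, 2.2), (25.0, -1.0)]   # second detection visible only to car B
print(merge_detections(car_a, car_b))  # -> one merged pedestrian plus the extra object
```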
Although the tests involved only two vehicles, the longer-term goal is to create a network among multiple vehicles as well as with the roadway infrastructure.
In addition to improving driving safety and comfort, cooperative networks of this sort could eventually be used to optimize a vehicle’s trajectory, save energy, and improve traffic flow.
Of course, determining liability in case of an accident becomes more complicated when vehicles cooperate. “The answers to these issues will play a key role in determining whether autonomous vehicles are accepted,” says Martinoli.
École polytechnique fédérale de Lausanne (EPFL) | Networked intelligent vehicles