Chris Middleton explains why the future is bright for autonomous transport, as long as it doesn’t rain, you aren’t an ethnic minority, and you don’t live around a corner.
First the good news: autonomous, connected, and electric vehicles finished 2020 on a strong note, according to research from analyst firm CB Insights.
After a Covid-related weak start to 2020, the end of the year saw companies in the zero emissions, vehicle electrification, and autonomous car spaces attract significant investor interest.
Overall, venture funding to these sectors hit $27.6 billion – five percent down year on year – but with a record number of exits: 107.
If investment is a guide, the US's dominance of the space is slipping, according to the research. North America's deal share fell for the fourth consecutive year, to 32 percent, perhaps reflecting the Trump administration's lack of support for green initiatives and new technologies.
By contrast, Europe's share of global investment rose from 22 percent to 25 percent, while Asia's tally stands at 39 percent – down two percentage points on 2019.
With autonomous drone deliveries also poised to become a reality over the next decade, as aviation authorities cautiously permit beyond visual line of sight (BVLOS) flights in test areas, it might seem that a new era of driver- and pilot-free transport is upon us. But there are challenges.
While autonomous cars or trucks might find it easy to navigate grid-based US cities or the long, straight roads between them, network and satellite connectivity can be a severe problem in so-called ‘urban canyons’ of skyscrapers.
Meanwhile, the more complex road networks of most European cities present further obstacles, as does patchy or non-existent Wi-Fi, 5G, and edge connectivity in some areas – including towns and cities.
In these circumstances, vehicles would have to rely on accurate onboard sensors, intelligence, and navigation systems. But can they do so today?
The existence of autonomous rovers on Mars suggests they can, but there is a world of difference between slowly navigating a dead planet and moving at speed through crowded, messy, unpredictable cities populated by millions of people.
There is another risk on a living planet: the weather. Research published this month by the Intelligent Vehicles Group at WMG, a specialist engineering department at Warwick University, finds that heavy rain affects the ability of LiDAR sensors to detect objects accurately at a distance.
Researchers used WMG’s 3xD simulator to test LiDAR sensors in all weather conditions – in this case using a virtual model of Coventry. Simulation is a vital stage in the development of autonomous vehicles, as driving millions of miles virtually is much safer than driving them in the real world.
LiDAR sensors work by emitting beams of near-infrared light, which reflect off objects. Researchers found that if a beam hits a raindrop a short distance from the transmitter, the water can reflect enough light back to the receiver for the sensor to register the droplet as an object.
With cars moving at speed, this would present a constant problem in bad weather.
Droplets also absorb some of the light, degrading sensor performance – issues that are inevitably more severe in heavy rain. The research, Realistic LiDAR with Noise Model for Real-Time Testing of Automated Vehicles in a Virtual Environment, has been published in the IEEE Sensors Journal.
Dr Valentina Donzella, from WMG, University of Warwick, said: "Ultimately we have confirmed that the detection of objects by LiDAR sensors is hindered the heavier the rain is and the further away the objects are.
“This means that future research will have to investigate how to ensure LiDAR sensors can still detect objects sufficiently in a noisy environment.
“The developed real-time sensor and noise models will help to further investigate these aspects, and may also inform autonomous vehicles manufacturers’ design choices, as more than one type of sensor will be needed to ensure the vehicle can detect objects in heavy rain.”
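The two rain effects described above – attenuation along the beam path and a detection threshold that distant targets eventually fall below – can be illustrated with a toy model. This is a minimal sketch, not WMG's published noise model: the extinction coefficients, detection threshold, and power formula are all illustrative assumptions.

```python
# Toy LiDAR rain model (illustrative only, not WMG's actual noise model).
# A return is "detected" if the relative received power clears a threshold.
# Rain attenuates the beam in both directions along the path; range adds
# ordinary 1/r^2 geometric spreading.

def received_power(target_range_m, rain_rate_mm_h):
    """Relative received power from a target at a given range in rain.

    Uses the common empirical extinction form alpha = a * R^b (dB/km);
    the coefficients below are assumed for illustration, not measured.
    """
    a, b = 1.076, 0.67                              # assumed coefficients
    alpha_db_per_km = a * rain_rate_mm_h ** b       # one-way extinction
    path_db = 2 * alpha_db_per_km * target_range_m / 1000.0  # two-way loss
    geometric = 1.0 / max(target_range_m, 1.0) ** 2          # 1/r^2 spreading
    return geometric * 10 ** (-path_db / 10.0)

def detected(target_range_m, rain_rate_mm_h, threshold=1e-5):
    """True if the return is strong enough for the receiver to register."""
    return received_power(target_range_m, rain_rate_mm_h) >= threshold

# Heavier rain shrinks the maximum range at which a target is detected.
for rain in (0.0, 10.0, 50.0):
    max_range = max((r for r in range(1, 301) if detected(r, rain)), default=0)
    print(f"rain {rain:5.1f} mm/h -> max detection range ~{max_range} m")
```

Running the sketch shows the qualitative behaviour the researchers describe: as the simulated rain rate rises, the furthest range at which the threshold is cleared falls.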
There are other challenges for the developers of autonomous vehicles to overcome.
One is the risk of accidental or systemic bias in the imaging or data-based systems that vehicles rely on to make decisions. This is not a hypothetical problem. In 2019, researchers from Georgia Tech published a research paper called Predictive Inequity in Object Detection which explored whether detection systems in autonomous vehicles performed equally well with light- and dark-skinned pedestrians.
The results were alarming: researchers found that systems were consistently better at identifying pedestrians with light skin tones than those with darker skin.
They found that the problem was partly rooted in the imaging systems, and partly in the training data, which included 3.5 times more light-skinned people than dark-skinned people. More data generally means more accuracy, but these systems need to identify minority groups equally well, which means weighting the training data to compensate.
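One standard way to do that weighting is to make each sample's contribution to the training loss inversely proportional to its group's frequency. The sketch below is a minimal illustration of that idea, using a hypothetical nine-sample dataset that mirrors the 3.5:1 imbalance described above; it is not the Georgia Tech team's actual pipeline.

```python
# Toy illustration of inverse-frequency sample weighting (not the
# Georgia Tech pipeline). Each group ends up contributing the same
# total weight to the training loss, regardless of how many samples it has.

from collections import Counter

def inverse_frequency_weights(group_labels):
    """Return one weight per sample; every group's weights sum to the same total."""
    counts = Counter(group_labels)
    n_groups = len(counts)
    total = len(group_labels)
    return [total / (n_groups * counts[g]) for g in group_labels]

# Hypothetical dataset reflecting the 3.5:1 ratio: 7 light-skin samples
# for every 2 dark-skin samples.
labels = ["light"] * 7 + ["dark"] * 2
weights = inverse_frequency_weights(labels)

light_total = sum(w for w, g in zip(weights, labels) if g == "light")
dark_total = sum(w for w, g in zip(weights, labels) if g == "dark")
print(light_total, dark_total)  # both groups now carry equal total weight
```

In practice these per-sample weights would be passed into the loss function during training, so that the under-represented group is not simply outvoted by the majority.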
So it may be a while before you can hail that driverless taxi with confidence, even as limited services make their way onto city streets in the US – largely following simple, preset routes.