Researchers at Heriot-Watt University, Edinburgh, have developed sensors which provide data that could help autonomous vehicles see and operate safely in adverse weather.
According to the Radiate project, almost all of the labelled data available until now has been captured on sunny, clear days, leaving no public data with which to develop autonomous vehicles that can operate safely in rain, snow and fog.
The team, based at Heriot-Watt’s Institute of Sensors, Signals and Systems, has published a new dataset comprising three hours of radar images and 200,000 tagged road actors, including other vehicles and pedestrians.
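As an illustration of how such labelled data is typically consumed, the sketch below loads per-frame annotations from a JSON file and counts road actors by class. The file layout and field names here are assumptions chosen for illustration, not the published Radiate schema.

```python
import json
from collections import Counter
from pathlib import Path

# Hypothetical annotation layout: one JSON file per driving sequence, each entry
# describing a labelled road actor in a radar frame. This structure is an
# illustrative assumption, not the actual Radiate format.
def count_actors(annotation_file: Path) -> Counter:
    with annotation_file.open() as f:
        annotations = json.load(f)

    counts = Counter()
    for obj in annotations:
        # Each object is assumed to carry a class label such as
        # "car", "pedestrian" or "bicycle".
        counts[obj["class_name"]] += 1
    return counts

if __name__ == "__main__":
    totals = count_actors(Path("city_1_0/annotations.json"))
    for class_name, n in totals.most_common():
        print(f"{class_name}: {n}")
```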
Professor Andrew Wallace and Dr Sen Wang have been collecting data since 2019, when they kitted out a van with lidar, radar, stereo cameras and geopositioning devices.
They drove the van around Edinburgh and the Scottish Highlands to capture urban and rural roads at all times of day and night, deliberately chasing bad weather.
Wallace said: “Datasets are essential for developing and benchmarking perception systems for autonomous vehicles. We’re many years from driverless cars being on the streets, but autonomous vehicles are already being used in controlled circumstances or piloting areas.
“We’ve shown that radar can help autonomous vehicles to navigate, map and interpret their environment in bad weather, when vision and lidar can fail.”
The team believe that by labelling all the objects the system identified on the roads, they have provided another step forward for researchers and manufacturers.
Wang added: “We labelled over 200,000 road objects in our dataset – bicycles, cars, pedestrians, traffic signs and other road actors. We could use this data to help autonomous vehicles predict the future and navigate safely.
“When a car pulls out in front of you, you try to predict what it will do – will it swerve, will it take off? That’s what autonomous vehicles will have to do, and now we have a database that can put them on that path, even in bad weather.”
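Wang’s example of anticipating another vehicle’s next move can be illustrated with the simplest possible motion model: given two recent labelled positions of a tracked actor, extrapolate forward under a constant-velocity assumption. The sketch below is a minimal illustration of that idea, not the project’s own prediction method.

```python
from typing import Tuple

Point = Tuple[float, float]

def predict_constant_velocity(prev: Point, curr: Point, dt: float, horizon: float) -> Point:
    """Extrapolate an actor's position 'horizon' seconds ahead, assuming it
    keeps the velocity observed between two frames recorded dt seconds apart."""
    vx = (curr[0] - prev[0]) / dt
    vy = (curr[1] - prev[1]) / dt
    return (curr[0] + vx * horizon, curr[1] + vy * horizon)

# Example: a car observed at (10.0, 2.0) m then (12.0, 2.1) m, frames 0.25 s apart,
# predicted one second into the future.
print(predict_constant_velocity((10.0, 2.0), (12.0, 2.1), dt=0.25, horizon=1.0))
# -> (20.0, 2.5)
```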
The researchers’ ultimate goal is to improve perception capability. “We need to improve the resolution of the radar, which is naturally fuzzy,” said Wallace.
“If we can combine hi-res optical images with the weather-penetrating capability of enhanced radar, that takes us closer to autonomous vehicles being able to see and map better, and ultimately navigate more safely.”
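The kind of fusion Wallace describes is often prototyped by projecting radar returns into the camera image so the two modalities can be compared pixel by pixel. The sketch below shows that projection under assumed calibration values; the extrinsic transform T_cam_radar and intrinsic matrix K are placeholders, not the project’s calibration.

```python
import numpy as np

def project_radar_to_image(points_radar: np.ndarray,
                           T_cam_radar: np.ndarray,
                           K: np.ndarray) -> np.ndarray:
    """Project 3D radar returns (N x 3, metres, radar frame) into camera pixels.

    T_cam_radar: 4x4 extrinsic transform from the radar frame to the camera frame.
    K:           3x3 camera intrinsic matrix.
    Both would come from the vehicle's calibration; the values used below are assumptions.
    """
    # Homogeneous coordinates, then move the points into the camera frame.
    homo = np.hstack([points_radar, np.ones((points_radar.shape[0], 1))])
    cam = (T_cam_radar @ homo.T).T[:, :3]

    # Keep only points in front of the camera before the pinhole projection.
    cam = cam[cam[:, 2] > 0]
    pixels = (K @ cam.T).T
    return pixels[:, :2] / pixels[:, 2:3]

# Example with an identity extrinsic and a simple intrinsic matrix (assumed values).
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
points = np.array([[1.0, 0.5, 10.0], [-2.0, 0.0, 15.0]])
print(project_radar_to_image(points, np.eye(4), K))
```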