US car giant Ford will share a full year of driving data from two of its self-driving cars with the wider academic research community to help fast-track autonomous driving technology.
In particular, the extra data will help researchers gain important insights into how the vehicles identify objects and hazards in a wide variety of driving conditions.
Ford's information will be shared through the Amazon open data program to provide “the data [the community] needs to create effective self-driving vehicle algorithms.”
Fully autonomous driving relies on a combination of sensors to register the car’s environment – cameras (to identify objects), LiDAR (to measure the distance to objects), and GPS and trajectory telemetry (to calculate speed and forces) – together with the self-driving software’s ability to ‘learn’.
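The sensor combination described above can be illustrated with a minimal sketch. Everything here is hypothetical and not drawn from Ford's actual software: a camera label and a LiDAR range reading are paired when they point in roughly the same direction, and the car's own telemetry speed converts distance into a time-to-contact estimate.

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str           # what the camera classifier thinks the object is
    bearing_deg: float   # direction of the object relative to the car

@dataclass
class LidarReturn:
    bearing_deg: float
    distance_m: float    # how far away the LiDAR reflection was

def fuse(camera: CameraDetection, lidar: LidarReturn,
         ego_speed_mps: float, max_bearing_gap_deg: float = 2.0):
    """Pair a camera label with a LiDAR range when both point the same way,
    then use the car's own telemetry speed to estimate time-to-contact."""
    if abs(camera.bearing_deg - lidar.bearing_deg) > max_bearing_gap_deg:
        return None  # the two readings don't describe the same object
    ttc = lidar.distance_m / ego_speed_mps if ego_speed_mps > 0 else float("inf")
    return {"label": camera.label,
            "distance_m": lidar.distance_m,
            "time_to_contact_s": ttc}

# Example: a pedestrian seen 15 m ahead while travelling at 5 m/s
obstacle = fuse(CameraDetection("pedestrian", 0.5),
                LidarReturn(0.4, 15.0),
                ego_speed_mps=5.0)
print(obstacle)  # {'label': 'pedestrian', 'distance_m': 15.0, 'time_to_contact_s': 3.0}
```

Real perception stacks fuse far richer data (point clouds, full camera frames, inertial measurements), but the principle is the same: no single sensor answers both “what is it?” and “how far away is it?” on its own.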
“Every second a self-driving vehicle is operating, it’s gathering information about the world around it,” Ford said in a statement.
“Without all this data, self-driving cars wouldn’t even be able to leave a parking lot.”
However, merely telling self-driving software what to do is not sufficient for autonomous driving – the system must also be able to learn and update itself.
“High-quality data is needed to help engineers and researchers create software that can properly teach self-driving vehicles how to analyse their environments,” the Ford statement said.
Ford’s data is especially valuable for development of autonomous driving technology because it was compiled by two separate self-driving cars operating at the same time over an entire year.
By operating two separate self-driving cars simultaneously – across freeways, tunnels, residential neighbourhoods, airports and densely populated urban areas – Ford also captured how each self-driving car reacted to the other in varying environments.
In a future where autonomous vehicles make up a growing proportion of vehicles on the road, self-driving cars will need to know how to negotiate manoeuvres with each other.
Ford has acknowledged that objects – including hazards – can often be hidden from one vehicle’s sensors while remaining in plain sight of another car close by.
“One vehicle has limited 'vision' in terms of what it can see … which is because the vehicle’s sensors could not penetrate those areas,” the Ford statement said.
“But with multiple vehicles in the same general area, it’s feasible one would detect things the others simply cannot, potentially opening up new routes for multi-vehicle communication, localization, perception and path planning.”
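The multi-vehicle perception idea Ford describes can be sketched simply. This is an illustrative toy, not Ford's method: two cars report object positions in a shared map frame, and near-duplicate sightings are merged so each car ends up with objects its own sensors could not reach.

```python
def merge_detections(own, shared, min_separation_m=1.0):
    """Union two cars' object lists (x, y positions in metres, shared map
    frame), dropping near-duplicates so each physical object appears once."""
    merged = list(own)
    for obj in shared:
        # keep the shared object only if it isn't already in the merged list
        if all(((obj[0] - m[0]) ** 2 + (obj[1] - m[1]) ** 2) ** 0.5
               >= min_separation_m for m in merged):
            merged.append(obj)
    return merged

car_a = [(10.0, 2.0), (30.0, -1.0)]   # objects car A can see
car_b = [(10.2, 2.1), (55.0, 0.0)]    # car B sees one of the same objects,
                                      # plus one hidden from car A
combined = merge_detections(car_a, car_b)
# car A gains the object at (55.0, 0.0) its own sensors could not detect
```

In practice, vehicle-to-vehicle sharing also has to handle localisation error, timestamps and trust in the other car's reports; the sketch only shows the basic union-of-views step.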
This technology would be similar to Volkswagen’s ‘Car2X’ – recently announced for the new Golf performance line – that enables cars to communicate with nearby infrastructure, allowing suitably-equipped roads to share information and warn of emergency conditions such as breakdowns or sudden braking manoeuvres.
A large hurdle that automotive engineers have experienced in the development of autonomous vehicles has been writing software that recognises objects in varying traffic conditions, weather and environments.
The hurdle is so great that Tesla has recently begun rewriting its Autopilot system around self-learning software based on augmented imagery, aiming for software that can learn to predict how imagery will change across environments. Ford, for its part, points to its newly released dataset as a resource for the same problem.
“This dataset spans an entire year; it includes seasonal variations and varied environments throughout Metro Detroit,” Ford said. “It features data from sunny, cloudy and snowy days.”
This means a hazard – a tree, for example – may look very different to a self-driving car’s software on a clear autumn morning than on a rainy summer’s night, or when covered in snow.
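Why the same object looks different across conditions can be shown with a toy transform. The function below is purely illustrative – a crude stand-in for the real lighting and weather variation captured in a dataset like Ford's – dimming a tiny grayscale “image” for a night scene or blending its pixels toward white for snow cover.

```python
def simulate_conditions(image, brightness=1.0, snow_level=0.0):
    """Apply crude condition shifts to a grayscale image (pixel values
    0.0-1.0): scale brightness down for night, blend toward white for snow."""
    out = []
    for row in image:
        out.append([min(1.0, p * brightness * (1 - snow_level) + snow_level)
                    for p in row])
    return out

tree = [[0.2, 0.3], [0.4, 0.5]]                      # toy 2x2 'image' of a tree
night = simulate_conditions(tree, brightness=0.3)    # same tree, much darker
snowy = simulate_conditions(tree, snow_level=0.8)    # same tree, mostly white
```

The pixel values a classifier receives for “the same tree” differ wildly between the two outputs – which is exactly why training data spanning seasons and weather, rather than synthetic tweaks alone, is valuable.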
Ford’s first dataset is available for download here. The blue oval brand says it will continue updating the website until all of its logs have been uploaded.