It’s a scary thought, but software testing for self-driving cars could be carried out entirely in the virtual world. The news comes after a breakthrough in simulating vehicle sensors.
Software specialist rFpro has developed a virtual model of the real world that allows accurate testing of a vehicle’s ‘perception’.
It does so by recreating the most unpredictable details of real-world driving. Known as the Ground Truth, the model can simulate real-world phenomena that could corrupt a sensor’s perception of the environment.
Previously, virtual testing couldn’t reproduce issues such as difficult lighting conditions and reflections, which can ‘confuse’ autonomous systems in the real world.
“Most system modelling begins with ideal sensor models in order to validate the algorithms and control systems of the vehicle, but this bypasses any limitations in the sensors themselves,” said Chris Hoyle, rFpro’s technical director.
“Difficult lighting conditions, or the reflections in a shop window can corrupt a sensor’s perception of the vehicle’s surroundings, leading to potentially catastrophic errors. Thorough validation of a CAV or ADAS-equipped vehicle must include the sensors’ ability to recognise and characterise the features of its environment.”
Future legislation will likely demand that virtual testing be more comprehensive than at present, especially before autonomous vehicles are allowed on public roads.
“Physical modelling means simulating the materials and properties of every object encountered by the vehicle and its onboard sensors, rather than just an abstract representation of it as used by most systems,” continues Hoyle.
“With several years’ experience of creating digital twins of city streets, rural roads, proving grounds and test tracks, we understand the complexities of modelling features like changing weather conditions or road surfaces.
“Our engineers are constantly being challenged by our customers to bridge the gap between simulated and real-world testing. Whether that means eight stereo 4K cameras with live exposure control and real motion blur modelled, or a radar model picking up the micro-Doppler from a pedestrian moving their arm, all of this must be possible for successful simulation and can all be done using rFpro.”