Self-Driving Cars Struggle To Recognise Dark-Skinned Pedestrians

A new report has found sensors and cameras used in autonomous cars are better at detecting people with lighter skin tones.

Researchers at the Georgia Institute of Technology used eight image-recognition systems to analyse different images of pedestrians.

The people in the photos were split into two groups -- lighter and darker skin tones -- and each system then tried to detect the pedestrians.

The results were worrying: no matter what time of day the photos were taken, the systems were on average about five per cent less accurate at detecting people with darker skin.

Cameras on an Nvidia self-driving car, Las Vegas. IMAGE: Justin Sullivan via Getty

While the report, Predictive Inequity in Object Detection, is concerning, it should be read with some caution.

It hasn't been peer-reviewed, and it didn't test the exact systems used in current self-driving cars.

Kate Crawford, who researches the social implications of AI, said there's a good reason for that.

But the issue is one experts should take note of as more autonomous vehicles are tested on roads -- in Australia and around the world -- to ensure they are safe.


Concerns about the safety of the vehicles spiked in 2018 after a woman was hit and killed by an Uber-operated self-driving vehicle in the United States.

Just this week, Uber was found not criminally liable for the fatal crash in Arizona, but the back-up driver -- who was watching a TV show at the time of the crash -- could face criminal charges.