
Lip-Reading Drones, Emotion-Detecting Cameras: How AI Is Changing The World

Artificial Intelligence is moving at such a rapid pace that governments simply can't keep up, and it's having a huge impact on each and every one of us.

AI can now flag people based on their clothing, behaviour or race, log an individual's emotions, understand their actions and predict their next move.

It can detect when luggage is left unattended, or if someone is loitering; it can even recognise when an individual is acting 'unusual' based on others around them.

AI is everywhere and getting more advanced every day.   

Facial recognition technology, in particular, has made leaps and bounds, thanks partly to tagged photographs on Facebook and Instagram as well as government-collected images such as driver's licences and ID cards.

The quality of cameras has also drastically improved, so much so that they no longer just record; they can 'see' in real time.

Facial recognition is becoming more common than you would think. Photo: Getty

There are facial recognition towers, cameras hidden in street lights and, among the most impressive advances, lip-reading drones.

These drones are so clever they can identify a missing person (even from an image 20 years old), spot potential weapons and act on criminal 'buzz words', according to intelligence firm Skylark Labs.

Specific lip-reading programs can decipher what people are saying from a distance while gait-analysis software can identify an individual just by the way they walk.

"Even if the drone is at 300ft, it can still operate effectively,” Dronestream CEO Harry Howe said.

While these particular drones are still in the testing phase, plenty of other intrusive technologies are already in use around the world.

Take China, for example. Its Skynet system claims it can scan all 1.3 billion citizens within seconds. There are 200 million cameras scattered around the country which can track identity thieves, find fugitives, catch sleeping students and spot jaywalkers. This particular surveillance system led to 2000 arrests between 2016 and 2018.

Countries like Malaysia, Jamaica, Germany and Poland are considering installing similar systems, while a number of facial recognition trials have been conducted right here on Australian soil.

Photo: Global Defence Technology

The issue with automated surveillance is that it is not 100 percent accurate.

When the person in the photo is a white man standing still, the software can identify him correctly 99 percent of the time, but for people with darker skin tones the error rate climbs to as high as 35 percent, according to The New York Times.

READ MORE: Teen Sues Apple For $1 Billion Over Facial-Recognition Arrest

But even without cameras, we can be identified through the data constantly being recorded by smartphones, laptops and Bluetooth technology.

Phone records, GPS history, emails and social media insights can all be collected, stored and analysed, then on-sold to companies and authorities to provide an insight into people's lives and behaviours.

The good news is, people around the world are beginning to notice and are starting to take a stand. Just this week, protesters in Hong Kong tore down facial recognition towers as pro-democracy demonstrations turned violent.

The Swedish Data Protection Authority handed out a AU$30,000 fine after a high school trialled facial recognition technology on its students to keep track of attendance.

According to the DPA ruling, although the school secured parents' consent to the monitoring, detecting attendance was not a legally adequate reason to collect such sensitive personal data.

San Francisco has pulled the plug on facial recognition entirely. In May, legislators ruled that the technology as it exists is unreliable and represents an unnecessary infringement on people's privacy.

READ MORE: Concerns Raised Over CCTV Facial Recognition Trial

READ MORE: Taylor Swift Used Facial Recognition Tech To Scan For Stalkers

While these large-scale privacy and personal data-sharing issues may seem distant and irrelevant to many Australians, experts assure us they are not.

"AI is already here," Executive Director of the Jean Monnett Centre of Excellence at the University of South Australia, Anthony Elliott, told 10 daily.

"Anytime anyone orders an Uber, receives an Amazon recommendation, talks to Siri, contacts their bank, you're at the receiving end, you're engaging with AI whether you know it or not," he said.

Photo: Getty

Australia doesn't have a real AI strategy, according to Elliott. 

He has suggested that everyday Aussies be educated on AI through a course similar to one being run out of the University of Helsinki in Finland.

"This fabulous course called Elements of AI has been taken by tens-of-thousands of Finnish people and many of the top 200 companies automatically enrol their employees into the course so they become more literate," he said.

He has also called for more governance and regulation that allows individuals greater personal control over their data and how it is actually being used.

While Australia's privacy laws have merit and serve their purpose in many ways, Geoff Holland, a lecturer at the University of Technology Sydney's Faculty of Law, told 10 daily they don't go nearly far enough.

"Under the privacy principles, there are limits on the collection of personal information, what it can be collected for, the storage of it and also on the use of that personal information," Holland said.

We don't ever really know where our data is going. Photo: Getty

"The laws as they currently stand, apply to those circumstances where there is clearly identifiable information, but it doesn't take data matching into account," he continued.

Holland explained that data matching makes it possible to re-identify an individual by combining information from multiple sources.

"Government departments, they have data sharing and data matching and that is permitted under the privacy principles because of the very specific legislation that regulates it," he explained.

While there are federal privacy laws, each state has its own rules about the collection and use of personal data which can be difficult to navigate.

In June, it was revealed that Stadiums Queensland had quietly switched on facial recognition software, scanning sports fans and concertgoers. Patrons were monitored in real time, with data potentially stored and shared with other agencies.

While there were signs warning about the use of CCTV, there was nothing to suggest facial recognition software was in operation.

Under Queensland law, the rollout, even without an individual's consent, was completely legal.

The Gabba is one of a number of venues owned by Stadiums Queensland. Photo: Google Maps

Victoria, New South Wales and Western Australia have also trialled similar technology.

Holland explained that the law, whether it be developed through the courts or through legislation, is taking time to catch up.

"In my view, it doesn't specifically cover the circumstances where there can be an abuse of personal information," Holland said. "We need to look at how we can better have the law adapted to meet situations where there is artificial intelligence in the fold".

He explained that the age-old "if you've got nothing to hide, why be concerned?" argument is outdated and that people need to be aware of what is going on.

"We have a reasonable expectation that if we are going about our ordinary lives in public, that we should be able to do so with our privacy respected -- particularly when it comes to being able to have conversations," he said before adding that it can hinder democracy and also change the way we behave.

"Rather than behaving normally when you're walking down the street, you tend to become wary of the possibility of being under surveillance and you, therefore, modify your behaviour," he said. "The behavioural changes can have an impact on society and social relations generally".