
Concerns Raised Over CCTV Facial Recognition Trial

Public CCTV cameras in Perth will be fitted with facial recognition technology, following similar trials overseas that have been decried as "hopeless".

The trial will use artificial intelligence technology to recognise faces, count people and track vehicle movements.

While the initiative will be launched across cameras in East Perth, the city's council insists only three of the cameras can be fitted with the facial recognition software at any one time.

The trial has been met with scepticism from critics, with The Guardian reporting allegations the council failed to adequately consult with the community about the trial.


But in a statement to 10 daily, the City of Perth defended its process as "open and transparent", noting the trial was first announced in December 2018.

According to a council spokesperson, the initiative would not only target unlawful activity but could also help identify other persons of interest, such as missing people.



The spokesperson insisted the facial recognition technology would only be activated at the request of Western Australia police, who would need to provide the City with images of persons of interest.

"There are many additional capabilities of the camera analytics including providing data around pedestrian numbers, vehicle types and counts, cyclist numbers which will assist planning and urban decision making such as the placement of infrastructure or enhance the development of transport solutions," the spokesperson said.

But while the council said the year-long trial would be subject to ongoing evaluation and any data collected would automatically be deleted after 31 days, the trial has raised concerns about surveillance being carried out nationwide on Australians.

How Does Facial Recognition Technology Work?

Carsten Rudolph, Associate Professor at Monash University and Director of Oceania Cyber Security Centre, said the way facial recognition technologies operate depends on the goal of the program.

Rudolph told 10 daily there were two main options currently available: one uses AI on a small scale to find a particular person, while a much wider application draws on a large database of people or images.

"One option is you search for one particular person, then you basically would compare all faces that you automatically detect on an image with that particular person and you ... can indicate that that person was in that spot at that time," Rudolph said.

He said a much wider application would occur where cameras instead surveil a large number of people to be matched against the system.


Rudolph explained this would essentially mean that, by using a large database of photos, the AI system would be 'trained' to classify faces detected in an image.

"In terms of going from identifying single people to having an identification of everybody moving in that space, that would be more of a virtual identity system... if you roll it out on a large scale," he said.
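The two modes Rudolph describes can be sketched in code. In this minimal illustration, faces are assumed to have already been converted to numerical "embedding" vectors by a face-recognition model; the random vectors, names and similarity threshold below are placeholders for illustration, not details of any real system:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_for_person(target, detected, threshold=0.8):
    """Mode 1: compare every face detected in a frame against one
    target person. Returns indices of detections above the threshold."""
    return [i for i, face in enumerate(detected)
            if cosine_similarity(target, face) >= threshold]

def classify_face(face, database, threshold=0.8):
    """Mode 2: match a detected face against a whole enrolled database,
    returning the best-matching identity, or None if nothing clears
    the threshold."""
    best_name, best_score = None, threshold
    for name, enrolled in database.items():
        score = cosine_similarity(face, enrolled)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy demonstration with synthetic "embeddings" (random vectors).
rng = np.random.default_rng(0)
alice = rng.normal(size=128)
bob = rng.normal(size=128)
# A camera frame containing a noisy view of Alice plus a stranger.
frame = [alice + 0.1 * rng.normal(size=128), rng.normal(size=128)]

print(search_for_person(alice, frame))  # only the noisy Alice detection matches
print(classify_face(frame[0], {"alice": alice, "bob": bob}))
```

The second mode illustrates why the database matters: the system can only ever answer with an identity that has been enrolled, which is why the source of the enrolment photos (discussed below in relation to bias) shapes who the system can flag.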

But Rudolph questioned why local government would want to implement this technology in the first place, considering the failure of similar trials overseas.

Should You Be Worried About AI Technology?

While concerns around privacy have been raised about AI and facial recognition technology, some experts say a lack of privacy laws in Australia could mean there's little that individuals can do to protect themselves.

Barrister and Senior Lecturer at the University of Technology Sydney's Faculty of Law, Pip Ryan, said any discussion around facial recognition CCTV must consider there is no general right to privacy in Australia.

"Privacy in Australia is really only about the way your personal data is stored, collected and managed," Ryan said.

She explained that an image of an individual's face which is de-identified in the AI system doesn't constitute a breach of privacy under Australian laws.


She said privacy laws would only come into play where the cameras became a nuisance in a private place. While there are some council rules about where CCTV cameras can be placed, she added, this would be a "very tricky" area to test, considering the local council is the body running the initiative in this case.

"If they're going to match it with our personal data, then suddenly they owe a stack of obligations to make sure it's stored in a certain way," Ryan said, adding the agency would then also be obligated to tell people their data was being used.

She said police and councils could justify the technology by promising to target certain crimes and to promote personal safety, public health and national security.

"This is really tricky stuff, which is why they probably feel emboldened to try it out," Ryan said.

But both Ryan and Rudolph said failed international trials raised real questions about how effective the technology could be, and issues of racial profiling.


A similar UK trial last year was branded a failure by several reports, with claims the technology identified the wrong person nine out of ten times.

Ryan said the trial proved to be "very unsuccessful, inaccurate and hopeless."

Rudolph said it raised more serious concerns around bias, with the UK trial finding black people and those from ethnic minorities were potentially unfairly exposed to false identification.

He said this could be a real concern in Australia as well, if facial recognition technology was used to generically scan an area to see which groups were gathering in which locations, and where crime hotspots were.

He said Indigenous Australians could face unfair discrimination under the technology, if the system were to use a photo-based database from police or corrective services, as the Indigenous population is vastly over-represented in prisons compared to in the general community.


Ryan said the way that artificial intelligence systems aggregate certain data is a big concern.

"What you're doing is aggregating certain data and creating negative inferences with no positive evidence," she said.

She said that could lead to law enforcement implementing a range of new measures including crime prevention, curfews and bans on certain activities in places deemed "hotspots".

"At the moment we are in a world where we are not there yet," she said, but warned Australia was on its way to getting there soon.

Contact the author: vgerova@networkten.com.au