
Australia's privacy regulator has just closed its case against “problematic” facial recognition company Clearview AI. What now?

The Office of the Australian Privacy Commissioner announced this week that it will take no further action against facial recognition company Clearview AI. This is a major victory for one of the most controversial technology companies in the world.

In 2021, Australia's privacy regulator ruled that Clearview AI had breached privacy law because the company collected hundreds of thousands of photos from social media sites like Facebook and used them to train its facial recognition tool. It ordered the company to stop collecting images and delete those it already had.

However, there was no evidence that Clearview AI complied with this order. And earlier this year, media reports indicated the company was continuing business as usual in Australia and was collecting more images of residents.

Given this, why did the privacy regulator suddenly stop pursuing Clearview AI? What does this mean for the broader fight to protect people's privacy in the age of big tech companies? And how could the law be changed to give the regulator a better chance of keeping companies like Clearview AI in check?

A long-drawn-out battle

Clearview AI is a facial recognition tool that has been trained on over 50 billion photos taken from social media sites like Facebook and Twitter, as well as the broader web.

The company behind it was founded in 2017 by an Australian citizen, Hoan Ton-That, who now lives in the United States. The company's website claims the tool can identify a person in any photo with 99% accuracy.

Earlier this month, Ton-That told Inc. Australia he expects the company's growth in the US to accelerate rapidly:

There will be more of these larger corporate deals, particularly with the federal government. And then there are 18,000 state and local agencies in law enforcement and government alone. This could be a company that's doing over a billion or two billion dollars a year.

The tool was initially offered to police authorities in countries such as the US, the UK and Australia for testing. War-torn Ukraine has also used Clearview AI to identify Russian soldiers involved in the invasion.

But the technology quickly sparked controversy – and legal resistance.

In 2022, the UK's data protection authority fined Clearview AI A$14.5 million for violating data protection law. However, the decision was later overturned because the British regulator did not have the power to impose fines on a foreign company.

France, Italy, Greece and other European Union countries have also each imposed fines of $33 million or more on Clearview AI, with further penalties to follow if the company does not comply with legal requirements.

In the US, the company faced a class action lawsuit, which was settled in June. The settlement allows the company to continue selling the tool to US law enforcement agencies, but not to the private sector.

In Australia, the privacy regulator ruled in 2021 that Clearview AI had violated the country's privacy laws by collecting images of Australians without consent. It ordered the company to stop collecting the images and to delete those already collected within 90 days, but did not impose a fine.

So far, there is no evidence that Clearview AI has complied with the Australian Privacy Commissioner's order. The company reportedly still collects images of Australians.

A lack of enforcement power and resources

Yesterday, Privacy Commissioner Carly Kind described Clearview AI's practices as “disturbing”. However, she also said:

Taking into account all relevant factors, I am not convinced that further action is warranted at this time in the specific case of Clearview AI.

This is a disappointing decision.

If an organisation does not comply with a decision, the regulator can take legal action under privacy law, but in this case it has decided not to do so.

The fact that no further action has been taken against Clearview AI confirms the weakness of Australia's current privacy laws. Unlike in other countries, significant penalties for breaches of privacy law are very rare in Australia.

The decision also underlines that the regulator does not have the necessary enforcement powers under current privacy law.

To make matters worse, the regulator does not have enough resources to investigate several large cases at once. Its investigation into Bunnings and Kmart over their use of facial recognition technology has been ongoing for more than two years.

What can be done?

It is hoped that upcoming reforms will both strengthen Australia's privacy law and give the Privacy Commissioner greater enforcement powers.

However, it is questionable whether general privacy law alone is sufficient to adequately regulate facial recognition technologies.

Australian experts are instead calling for dedicated rules for high-risk technologies. Former Australian Human Rights Commissioner Ed Santow has proposed a model law to regulate facial recognition technologies.

Other countries have already begun to develop specific rules for facial recognition tools. The Artificial Intelligence Act recently passed in the European Union prohibits certain uses of this technology and sets strict rules for its development.

However, studies show many countries around the world are still struggling to create appropriate regulations for facial recognition systems.

The Australian government should seriously consider concrete measures to prevent companies like Clearview AI from using Australians' personal data to develop such technologies, and should introduce clear rules about when facial recognition can and cannot be used.
