Human Rights Commission wants moratorium on expanding facial recognition

Until laws are passed to regulate automated decision making.

Australia’s human rights watchdog has taken a bite out of the rapidly expanding facial recognition and biometric matching market, proposing the government institute an interim ban on using the controversial technology for decision making because it is error-prone and performs worse for people of colour.

In a sharply critical assessment of unregulated technology allowed to run amok, the Australian Human Rights Commission says a “legal moratorium on the use of facial recognition technology in decision making that has a legal, or similarly significant, effect for individuals” is needed “until an appropriate legal framework has been put in place.”

The push for a ban is contained in the AHRC’s latest discussion paper (pdf) on the interplay between human rights obligations and recent and emerging technologies like artificial intelligence, automation, and machines and sensors that can see and hear, all of which could soon have serious consequences for individuals.

It comes as other regulators, including the Australian Competition and Consumer Commission, intervene to rein in a raft of market and competition distortions created by so-called Big Tech platforms like Apple, Google and Facebook, which have built data oligopolies.

While technology vendors and platforms routinely dismiss or pay mere lip service to human rights and privacy laws and obligations, previous Australian and international reviews of laws governing the use of technology and data have produced tough regulation like data breach disclosure laws and GDPR.

At the time they were first floated in a review of the Privacy Act, mandatory data breach notifications were dismissed by data-hoarding businesses and much of the technology industry as an unworkable mass of red tape that would drive up compliance costs for little or no return.

Despite substantial opposition, the data breach scheme became a reality. The AHRC’s latest paper, dubbed “Human Rights and Technology”, now follows a broadly similar path, seeking to demolish the ‘all-care, no-responsibility’ free-for-all with strict new standards, obligations and laws.

In terms of outright bias and discrimination against already marginalised groups, facial recognition probably takes the cake in the discussion paper thanks to its many inherent biases and high error rates for people whose ethnicity doesn’t fit easily into an algorithm.

“The Commission is concerned about the risks of using facial recognition technology in decision making that has a legal, or similarly significant, effect for individuals. In addition to its privacy impact, the current underlying technology is prone to serious error, and this error is often more likely for people of colour, women and people with disability, among others,” the AHRC report says.

“The Commission’s preliminary view is that legislation is needed for the use of facial recognition technology, and that this should include robust safeguards for human rights.”

The watchdog has also labelled existing legislation and safeguards as “insufficient” and says a new legal framework is needed.

“This framework should be developed in consultation with experts, public authorities, civil society, and, importantly, the broader public who will be the subjects of this type of surveillance,” the AHRC said.

Artificial intelligence also comes in for a big serve, with a series of proposed new disclosure measures and liability regimes floated to compel better machine behaviour.

In an example that will no doubt irritate the government, the AHRC points to botched robodebt assessments as the wrong way to use data feeds and algorithms, noting that “the majority of respondents to a national survey were uncomfortable with the Australian Government using AI to make automated decisions that affect them.”

“In Australia, community concern associated with practices such as Centrelink’s automated debt recovery program is emblematic of broader concerns about how new technologies are used by the public and private sectors,” the report notes.

At a broader level, the AHRC wants a regime of “Accountable AI-informed decision making” introduced, under which the owners of algorithms are essentially held liable when laws are broken and their machines breach obligations.

At its simplest, there is a proposal that any accountability regime disclose to humans when a material decision is being made by a machine, but the paper notes there are many existing laws and principles that could be applied.

The AHRC’s five major areas for making AI-informed decision making accountable are as follows:

• ensuring that AI-informed decision making complies with all relevant legal requirements, and is expressly authorised by law when carried out by government

• promoting transparency, so that individuals are informed where AI has been a material factor in a decision that affects them

• ensuring that AI-informed decisions are explainable, in the sense that a reasonable and meaningful explanation can be generated, that is communicated to any affected individual on demand

• clarifying who is legally liable for using an AI-informed decision-making system, with strong incentives for acting responsibly in developing and using such systems

• identifying appropriate mechanisms for human oversight and intervention.

The AHRC also proposed that the government “should engage the Australian Law Reform Commission to conduct an inquiry into the accountability of AI-informed decision making.”

In other words, the people who gave the Privacy Act its first set of teeth could soon be back on the job.
