

Australia's Border Force uses SmartGate facial recognition technology to screen for threats at airports around the country.




Arguments against Australia's Identity-matching Services Bill and the Passports Amendment Bill

1. The Bills have an excessively wide scope
It has been claimed that the Bills expanding Australia's use of facial recognition technology do not make it sufficiently clear who can access the data drawn on and the matches obtained. This, it is claimed, allows for misuse of the data and an ongoing expansion of its use in ways not stipulated by the proposed legislation.
Microsoft president Brad Smith warned in December 2018, 'The facial recognition genie, so to speak, is just emerging from the bottle. A government ... could follow anyone anywhere, or for that matter, everyone everywhere. It could do this at any time or even all the time. This use of facial recognition technology could unleash mass surveillance on an unprecedented scale.'
In an opinion piece published in The Conversation on October 25, 2019, Sarah Moulds, Lecturer of Law at the University of South Australia, stated, 'Much of the detail about precisely who can access the system and what limits apply is not set out in the bills. This will be determined through government regulation or subsequent intergovernmental agreements.' https://theconversation.com/why-the-governments-proposed-facial-recognition-database-is-causing-such-alarm-125811
Dr Moulds explained further, 'Legal bodies have argued that amendments are needed to tighten the boundaries of who can access the identity-matching services and for what purposes. They note that as currently drafted, the proposed laws give too much discretionary power to government officials and actually create opportunities for identity theft.' https://theconversation.com/why-the-governments-proposed-facial-recognition-database-is-causing-such-alarm-125811
The Law Council of Australia has stated there is a need to 'ensure that identification information produced in response to a request for an identity-matching service is not used for any purpose other than establishing or verifying the identity.' https://www.lawcouncil.asn.au/docs/3ceb93a3-3de6-e911-9400-005056be13b5/3692%20-%20Review%20of%20the%20IMS%20Bill%20and%20Passports%20Bill.pdf
The Australian Privacy Foundation has further argued that the proposal is highly invasive, as the system could be integrated into a number of other systems that collect facial data, including closed-circuit television.
The Foundation has stated, 'We are on our way to automated and real-time surveillance of public spaces.' https://www.theguardian.com/technology/2019/sep/29/plan-for-massive-facial-recognition-database-sparks-privacy-concerns
Dr Moulds has pointed to two additional related concerns, stating, '[Unregulated expansion of access to the surveillance] is particularly problematic when coupled with the potential for the rapid spread of facial recognition technology in Australian streets, parks and transport hubs...Another concern is that it could be used by a wide range of agencies to confirm the identity of any Australian with government-approved documentation (such as a passport or driver's license), regardless of whether they are suspected of a crime.' https://theconversation.com/why-the-governments-proposed-facial-recognition-database-is-causing-such-alarm-125811  
This last apprehension relates to the difference between a system that searches for a particular individual suspected of serious wrongdoing and one that conducts searches with far less obvious justification. The concern Dr Moulds raises is that because such a system can be used, it will be used, with the potential for ever-greater intrusions into people's lives.
The Law Council of Australia has stated that there is a need to 'limit the use of [facial identification technology] to the detection, investigation or prosecution of offences that carry a maximum penalty of not less than three years' imprisonment.' https://www.lawcouncil.asn.au/docs/3ceb93a3-3de6-e911-9400-005056be13b5/3692%20-%20Review%20of%20the%20IMS%20Bill%20and%20Passports%20Bill.pdf
The Law Council of Australia has also stated, 'To ensure that there are more clearly defined limits on the legitimate and proportionate use of identity-matching services proposed in the IMS Bill, as well as greater oversight and transparency, the Law Council recommends that the [Bills] be amended to introduce...safeguards for when [facial recognition] is accessed by local government and non-government organisations [and that] notice...be given to individuals about the collection and use of their identifying information.'
The Council has also noted that there is currently no provision for 'mandatory training of empowered individuals within local government or non-government organisation about permitted uses.' https://www.lawcouncil.asn.au/docs/3ceb93a3-3de6-e911-9400-005056be13b5/3692%20-%20Review%20of%20the%20IMS%20Bill%20and%20Passports%20Bill.pdf

2. The Bills undermine civil liberties, especially the right to privacy
There has been concern expressed that these Bills and the systems they would allow represent a substantial risk to the liberties of Australian citizens.
The Australian Human Rights Commission stated in its most recent submission to the federal Parliament, 'The Commission continues to hold serious concerns that the Bill would impinge on a number of human rights...Rights that are particularly likely to be limited are the right to privacy, freedom of movement, the right to non-discrimination, and the right to a fair trial, though this is not an exhaustive list.' https://which-50.com/human-rights-groups-sound-alarm-on-governments-facial-recognition-laws/
Arthur Moses SC, president of the Law Council of Australia, has stated, 'Misuse of this technology would undermine the rights of individuals, as well as the community's trust in the system and its operation.' https://www.afr.com/politics/federal/facial-recognition-bill-knocked-back-20191024-p533s6
Angus Murray, junior vice-president of the Queensland Council of Civil Liberties (QCCL), has stated, 'The question for society is whether this is technology we can or should be using. It's a slippery slope to a place that's probably irretrievable if we end up with technology like this around all the cities of Australia. We may decide this isn't the way we want to go - the concept of privacy and freedom of association disappears fairly rapidly.' https://www.afr.com/technology/your-face-is-about-to-end-your-privacy-how-do-you-feel-about-that-20190130-h1anyy
Human Rights Commissioner Edward Santow has claimed that the two proposed bills are 'unprecedented' in their impact on Australians' privacy. He notes that the Department of Foreign Affairs and Trade anticipates processing thousands of identity-matching requests a day if the bills are passed - compared to a few hundred per year currently. https://www.afr.com/technology/your-face-is-about-to-end-your-privacy-how-do-you-feel-about-that-20190130-h1anyy
The Australian Privacy Foundation has argued the proposal is highly invasive, because the system could be integrated into a number of other systems that collect facial data, including closed-circuit television. A spokesperson for the Foundation has stated, 'We are on our way to automated and real-time surveillance of public spaces.' https://www.theguardian.com/technology/2019/sep/29/plan-for-massive-facial-recognition-database-sparks-privacy-concerns
The position put by the Australian Privacy Foundation has been elaborated by Professor Toby Walsh, a Fellow of the Australian Academy of Science and an Artificial Intelligence expert, who has observed, 'The widespread use of facial recognition is going to change the nature of our society. It changes the world we're in. It's this idea that you can be watched anytime. Even if no one is watching you, even if you never come to any harm, that [still] changes what you do because you don't have the privacy to question.' https://which-50.com/cover-story-australias-dangerous-foray-into-facial-recognition/
The apprehension appears to be that when people know they are potentially under constant observation, their freedom of action is diminished. They are no longer able to behave spontaneously or unselfconsciously because they consider themselves the object of ongoing judgement, even for innocent or innocuous behaviour.
Professor Walsh has further stated, 'We're used to the idea that there are loads of CCTV cameras around. But that was before we had face recognition. In the past we knew no one was looking, there were too many cameras for people to be looking at ... [CCTV] actually wasn't invading our privacy. And now we can just upgrade those cameras with software that will be invading our privacy. It will be able to identify people in real-time. It will be able to track you in real-time.' https://which-50.com/cover-story-australias-dangerous-foray-into-facial-recognition/
Similar concerns about the negative potential of facial recognition technology for individuals' privacy have been expressed by Christie Hill, Deputy Advocacy Director of the American Civil Liberties Union, San Diego. Hill has stated, 'We're living in an age when machines can collect information about nearly everything we do - from the places we go to the emotions we feel to the people we hang out with - and have the capability to transmit this data to each other and to our government.
When nearly any device can be turned into a hyper-powerful surveillance tool, it's up to us to ensure technology makes us more, not less, safe.' https://www.sandiegouniontribune.com/opinion/story/2019-09-06/facial-recognition-tool-civil-liberties
Privacy apprehensions regarding the use of facial recognition are widespread in numerous jurisdictions, including in the United States. An October 2018 survey conducted in the United States by the Brookings Institution found 42 per cent of people thought facial recognition was an invasion of privacy, against 28 per cent who disagreed - and 30 per cent who were unsure. https://www.afr.com/technology/your-face-is-about-to-end-your-privacy-how-do-you-feel-about-that-20190130-h1anyy

3. The technology is not failsafe
Opponents of the widespread use of facial recognition technology in Australia and overseas argue that it is not foolproof and that when errors are made the consequences can be dire for the individuals involved.
On August 29, 2019, Forbes published an article by Naveen Joshi, the founder and chief executive officer of Allerin, an engineering and technology solutions company. Joshi stated, 'No technology is 100 percent accurate and efficient; we all know that. And facial recognition tech is no different. There could be chances of this technology making false claims, which can then lead to undesirable consequences.' https://www.forbes.com/sites/cognitiveworld/2019/08/29/the-implementation-of-facial-recognition-can-be-risky-heres-why/#1b5b17d77863
Joshi cites the recent case of Ousmane Bah, who sued Apple for $1 billion after it wrongly accused him, on the basis of a mistaken facial recognition identification, of the theft of $1,200 worth of merchandise from an Apple store in Boston. The potential for more damaging misidentifications has been stressed. Jay Stanley, a policy analyst at the American Civil Liberties Union, has stated, 'One false match can lead to missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests or worse.' https://www.chicagotribune.com/consumer-reviews/sns-facial-recognition-bias-20191226-cldfnnmqbzf6lp5w622jnw7oga-story.html
Critics are concerned that misidentification via facial recognition technology could result in the death of an individual who has been inaccurately matched by the technology and whom law enforcers therefore mistakenly believed to be a dangerous criminal. It has been suggested that this is a particular risk for racial minorities, who already attract a disproportionate amount of police attention and whose features current systems have proved less able to delineate algorithmically with sufficient precision. Several studies have shown that artificial intelligence (AI) systems such as facial recognition tools rely on machine-learning algorithms that are 'trained' on sample datasets. If darker-skinned groups within a given population are underrepresented in these training and benchmark datasets, then the facial recognition system will be less successful in identifying black faces. https://www.abc.net.au/news/2020-02-04/fact-check-facial-recognition-darker-skin/11781192
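The mechanism the studies describe, in which a group contributing fewer images to the training data suffers higher error rates, can be illustrated with a toy simulation. The sketch below is a deliberately simplified illustration, not a model of any real facial recognition system: it matches random synthetic 'embeddings' against averaged templates, and all names and parameters are invented for the example.

```python
import random

random.seed(42)

DIM = 16               # dimensionality of the toy "face embedding"
CAPTURE_NOISE = 1.0    # noise added to every captured image
PEOPLE_PER_GROUP = 20
PROBES_PER_PERSON = 30

def noisy(vec, sigma):
    """Simulate one captured image of a face: the true embedding plus noise."""
    return [x + random.gauss(0, sigma) for x in vec]

def centroid(samples):
    """Average several images into a single enrolled template."""
    return [sum(xs) / len(xs) for xs in zip(*samples)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Two demographic groups; the second contributes far fewer images
# per person to the enrolment dataset (40 vs 2).
groups = {"well_represented": 40, "under_represented": 2}

true_faces, group_of = {}, {}
for group, n_enrol in groups.items():
    for i in range(PEOPLE_PER_GROUP):
        pid = f"{group}_{i}"
        true_faces[pid] = [random.gauss(0, 1) for _ in range(DIM)]
        group_of[pid] = group

# "Training": each person's template is averaged from however many
# images their group contributes, so sparse groups get noisier templates.
templates = {
    pid: centroid([noisy(face, CAPTURE_NOISE)
                   for _ in range(groups[group_of[pid]])])
    for pid, face in true_faces.items()
}

def identify(probe):
    """Nearest-template identification, a stand-in for a trained matcher."""
    return min(templates, key=lambda pid: dist2(probe, templates[pid]))

# Evaluate per-group misidentification rates.
errors = {g: 0 for g in groups}
trials = {g: 0 for g in groups}
for pid, face in true_faces.items():
    g = group_of[pid]
    for _ in range(PROBES_PER_PERSON):
        trials[g] += 1
        if identify(noisy(face, CAPTURE_NOISE)) != pid:
            errors[g] += 1

rates = {g: errors[g] / trials[g] for g in groups}
print(rates)
```

Because the under-represented group's templates are averaged from far fewer noisy images, those templates are less accurate and the group's misidentification rate is correspondingly higher, mirroring the dataset-imbalance effect the studies describe.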
On December 29, 2019, The Chicago Tribune published an article explaining the potential for error when using facial recognition technology. The article cited a recently released report from the United States National Institute of Standards and Technology, which observed that 'the majority of commercial facial-recognition systems exhibit bias.' Among a database of photographs used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans. The Institute observed that the identification systems falsely identified African American and Asian faces 10 to 100 times more often than Caucasian faces. https://www.chicagotribune.com/consumer-reviews/sns-facial-recognition-bias-20191226-cldfnnmqbzf6lp5w622jnw7oga-story.html
Accuracy concerns have been raised regarding the technology to be expanded under the Morrison government's proposed legislation. Australian Human Rights Commissioner Edward Santow has stated, 'Errors are not evenly distributed across the community. So, in particular, if you happen to have darker skin, that facial recognition technology is much, much less accurate. When you use that technology in an area where the stakes are high, like in policing or law enforcement, the risks are very significant.' https://www.abc.net.au/news/2020-02-04/fact-check-facial-recognition-darker-skin/11781192
An ABC fact check published on February 4, 2020, noted 'Three leading software systems correctly identified white men 99 per cent of the time, but the darker the skin, the more often the technology failed.
Darker-skinned women were the most misidentified group, with an error rate of nearly 35 per cent in one test, according to...research...conducted by Joy Buolamwini, a computer scientist at the Massachusetts Institute of Technology's (MIT) Media Lab.' https://www.abc.net.au/news/2020-02-04/fact-check-facial-recognition-darker-skin/11781192

4. The Bills allow for electronic surveillance without enough oversight of the process and those who request it
Critics of the two bills seeking expanded use of facial recognition technology in Australia are concerned that there are insufficient provisions to oversee how the technology is being used, allowing for misapplications that might never be detected. They object that access to databases can be obtained without an adequate review process and that there is no mechanism for checking that the surveillance and data access powers the legislation would give government and some corporate bodies are being appropriately applied.
Liberal MP Andrew Hastie, the chair of the bipartisan parliamentary joint committee on intelligence and security which scrutinised the two bills, warned that the legislation lacked 'robust safeguards' and 'appropriate oversight mechanisms.' In making this statement, the committee chair was echoing the views of a wide range of legal authorities and civil rights bodies. https://www.thesaturdaypaper.com.au/news/law-crime/2019/10/26/duttons-plan-surveillance-state/15720084008972?cb=1585819622
The Law Council of Australia has stated, 'To assure that uses are reliably and verifiably legitimate and proportionate, controls and safeguards are reasonably required...it is critical to ensure that the legislation which enables the use of this type of technology does not permit a creep toward broad social surveillance in Australia.' https://www.lawcouncil.asn.au/docs/3ceb93a3-3de6-e911-9400-005056be13b5/3692%20-%20Review%20of%20the%20IMS%20Bill%20and%20Passports%20Bill.pdf
The Law Council of Australia has further noted, 'The proposed limits on the number of images presented for matching to a participating authority does not in practice limit the number of images requested to those numbers because multiple requests may be made by a participating authority around the putatively matched image.' https://www.lawcouncil.asn.au/docs/3ceb93a3-3de6-e911-9400-005056be13b5/3692%20-%20Review%20of%20the%20IMS%20Bill%20and%20Passports%20Bill.pdf
As proposed under the Passports Amendment Bill, 'computer programs could automate the sharing of passport-related information without human oversight with the potential to negatively affect the individuals concerned.' https://www.afr.com/politics/federal/mass-surveillance-the-facial-recognition-bill-explained-20191029-p5358t
Ben Seo, writing for The Australian Financial Review in an article published on October 31, 2019, stated, 'Sceptics saw plenty of reasons for concern in the proposed laws because they...did not require warrants, and contained automated decision-making.' https://www.afr.com/politics/federal/mass-surveillance-the-facial-recognition-bill-explained-20191029-p5358t
The bipartisan parliamentary joint committee on intelligence and security which scrutinised the two bills in October 2019 noted this lack of oversight and accountability. The committee argued that enforcement agencies should be required to obtain a warrant before accessing certain facial recognition services. The committee was also concerned that where decision-making was automated, as outlined in some provisions of the legislation, there is no scope for within-agency oversight, even at the time a decision is being taken. The committee recommended that the bill be amended to ensure that automated decision-making can only be used for decisions that 'produce favourable or neutral outcomes for the subject'. This provision is intended to ensure that individuals cannot be harmed by surveillance and data-matching operations undertaken without human oversight.
https://www.afr.com/politics/federal/mass-surveillance-the-facial-recognition-bill-explained-20191029-p5358t
The committee further claimed that the manner in which the bills have been drafted is neither sufficiently explicit nor transparent for citizens to be properly informed of the impact that the legislation, if passed in its current form, would have on their lives. The committee's report concluded, 'A citizen should be able to read a piece of legislation and know what that legislation authorises and what rights and responsibilities the citizen has in relation to that legislation. This is especially important in the case of the IMS bill which has the potential to affect the majority of the Australian population...
It is clear that the Identity-matching Service Bill does not inform the citizen reader in this way.' https://www.afr.com/politics/federal/mass-surveillance-the-facial-recognition-bill-explained-20191029-p5358t

5. An expansion of facial recognition technologies is not proportionate
Critics claim that the proposed legislation is an example of government and administrative over-reach seeking powers that are not necessary to address the problems offered as their justification. Critics further argue that most of these problems are already being tackled by other laws and enforcement practices.
The Law Council of Australia has expressed these concerns arguing that aspects of the Identity-Matching Services (IMS) Bill involve infringements on the right to privacy that are not justified by the supposed benefits to be gained. The Council has stated, 'As currently drafted, the IMS Bill will allow state and territory agencies to share and seek to match facial images and other biographical information for persons suspected of involvement in minor offences. The Law Council considers that this may not be a necessary or proportionate response.' https://www.lawcouncil.asn.au/docs/3ceb93a3-3de6-e911-9400-005056be13b5/3692%20-%20Review%20of%20the%20IMS%20Bill%20and%20Passports%20Bill.pdf
The Allens Hub for Technology, Law and Innovation, publicly launched on 14 March 2018, is an independent community of scholars based at the University of New South Wales, Sydney. This group of academics has also argued that the proposed legislation is not proportionate; that is, it goes beyond what is required to address the problems it has been put forward to solve.
The group has stated, 'The proposed system and legislation should be proportionate. This requires a demonstration that the legislation is reasonably necessary in pursuit of a legitimate objective and that its impact on fundamental individual rights are proportionate to this objective... [The] indiscriminate retention of biometric data of all people, with the broad ability to access and match images for any offences could be considered equally disproportionate.' https://www.allenshub.unsw.edu.au/sites/default/files/inline-files/Allens%20Hub%20Review%20Identity%20Matching%20Services%20Submission%202019%20for%20KB%20web%20upload.pdf
Looking particularly at the issue of protections against Australian nationals suspected to be terrorists who seek to return to Australia, critics of the proposed new laws argue that they are disproportionate as the country already has ample legislation in place for this purpose. In an opinion piece published in the University of New South Wales Newsroom, Sangeetha Pillai, Senior Research Associate at the University's Law School, argued, 'The government hasn't explained why Australia's extensive suite of existing anti-terrorism mechanisms doesn't already adequately protect against threats posed by Australians returning from conflict zones.' https://newsroom.unsw.edu.au/news/business-law/there%E2%80%99s-no-clear-need-peter-dutton%E2%80%99s-new-bill-excluding-citizens-australia
Pillai went on to explain, 'Australia's 75 pieces of legislation provide for criminal penalties, civil alternatives to prosecution, expanded police and intelligence powers, and citizenship revocation. And they protect Australia from the risks posed by returning foreign fighters in a variety of ways.
For example, a person who returns to Australia as a known member of a terrorist organisation can be charged with an offence punishable by up to 10 years' imprisonment. Where the person has done more - such as fight, resource or train with the organisation - penalties of up to 25 years each apply.' https://newsroom.unsw.edu.au/news/business-law/there%E2%80%99s-no-clear-need-peter-dutton%E2%80%99s-new-bill-excluding-citizens-australia
Pillai also argued that Australia already has a more than adequate capacity to protect citizens from suspected terrorists against whom there is insufficient evidence to bring charges. She explains, 'A control order may be imposed on a person in cases where they are deemed a risk but there is not enough evidence to prosecute. This restricts the person's actions through measures such as curfews and monitoring requirements.
Evidence shows the existing measures work effectively. Police and intelligence agencies have successfully disrupted a significant number of terror plots using existing laws...' https://newsroom.unsw.edu.au/news/business-law/there%E2%80%99s-no-clear-need-peter-dutton%E2%80%99s-new-bill-excluding-citizens-australia