Cameras are insanely cheap now. What used to be a technology that cost thousands of dollars, generating data that was expensive to store, now costs single-digit dollars. Because they're cheap, cameras are installed in so many areas and placed on so many things, just because we can. Next time you're outside, look around – if you're in an urban area, chances are a camera is pointing at you.
The technology to identify a face is also incredibly cheap. Microsoft's Azure cloud computing platform can identify a face for as little as $0.0004 (that's 1,000 "faces" for 40 US cents), and the results can be stored indefinitely for around $0.00001 per month. What used to be science fiction stuff is now a commodity that anyone with a basic knowledge of computer programming can leverage. It's no surprise, then, that facial recognition technology is abused.
At its core, facial recognition technology does what it says on the tin – recognise faces. It can be as basic as "yes, this is a human face", or it can go all the way to "Jane Citizen, 123 Any Street, Big City, USA". There are dozens of mathematical algorithms that disassemble an image of a face down to the position and size of a person's eyes, nose, cheeks, jaw, and other distinguishing features, then compare that to a reference image and give a determination of how likely it is those features match. As cameras capture more advanced data beyond plain photos (3D images, thermal images, depth/radar information, etc.), the algorithms get more advanced at trying to find a match in a database.
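To make that concrete, here's a minimal sketch in Python. It assumes some model has already boiled each face down to a fixed-length list of numbers (an "embedding") and simply compares them; the vectors, dimensions, and threshold below are made up purely for illustration.

```python
import numpy as np

# Hypothetical 5-dimensional "face embeddings". Real systems use
# hundreds of dimensions produced by a neural network; these numbers
# are invented for illustration only.
reference_face = np.array([0.12, 0.80, 0.31, 0.55, 0.07])
candidate_face = np.array([0.10, 0.78, 0.35, 0.50, 0.09])
stranger_face  = np.array([0.90, 0.05, 0.60, 0.11, 0.72])

def match_score(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: 1.0 means identical, near 0 means unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

THRESHOLD = 0.95  # tuned per system; higher means stricter matching

for name, face in [("candidate", candidate_face), ("stranger", stranger_face)]:
    score = match_score(reference_face, face)
    verdict = "likely the same person" if score >= THRESHOLD else "no match"
    print(f"{name}: score={score:.3f} -> {verdict}")
```

The whole "is this the same person?" question boils down to a similarity score and a threshold – which matters later, because where that threshold is set decides how many innocent people get flagged.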
You know the scene in old movies where a cop trawls through big folders of old mug shots to see if they can find someone that matches a photo or a CCTV image taken at a crime scene? It's like that, but done by a computer, really quickly. And because it's a computer, it can do it over and over on billions of records without getting bored or tired, unlike a human.
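Here's what that computerised trawl might look like, again as a rough sketch: the "database" is random numbers standing in for stored face embeddings, and the "CCTV still" is a noisy copy of one record, so the search has something to find.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# A pretend database of 100,000 stored "faces". A real system would
# hold embeddings computed from mugshots, licence photos, scraped
# social media images, and so on; these are random unit vectors.
DATABASE = rng.normal(size=(100_000, 128))
DATABASE /= np.linalg.norm(DATABASE, axis=1, keepdims=True)

# Stand-in for a CCTV still: a noisy copy of record 12,345, so the
# search has a genuine match to discover.
query = DATABASE[12_345] + rng.normal(scale=0.05, size=128)
query /= np.linalg.norm(query)

# One matrix multiplication scores the query against every record at
# once. No boredom, no fatigue; it finishes in a fraction of a second.
scores = DATABASE @ query
best = int(np.argmax(scores))
print(f"best match: record {best:,} with score {scores[best]:.3f}")
```

Scale that same multiplication up across data centres and "billions of records" stops being an exaggeration.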
Barely a week passes without news informing us that facial recognition was used to enable some horrendous injustice somewhere in the world. From 2024 alone:
- Israel using data from Google Photos to identify Palestinians without their consent or knowledge; the system was often incorrect and wrongly accused people of being linked to Hamas.
- Harvard students taking a pair of Meta's Ray-Ban smart glasses and connecting them to PimEyes (a face search engine) to "dox" people in real time as they walked around Boston.
- Bunnings, one of Australia's largest retail chains, being found to have breached privacy law by capturing the faces of everyone entering its stores, without consent, to check them against a database of "risky" customers.
- Police in the USA using DNA from decades-old crimes to generate faces of potential suspects, then running those generated faces through facial recognition software to try and crack cold cases, despite experts saying this is utterly bonkers.
- Students supporting Palestine at the University of Melbourne being identified and punished via a combination of wi-fi login data and CCTV footage run through facial recognition software.
The common thread in all these scenarios is that the collections of faces these facial recognition systems use to identify people were never intended for this purpose. Nobody gave explicit permission for their face to be in a database, yet in a database it is – because it's all just 1s and 0s that cost practically nothing to copy and store, so why not use them for whatever purpose we can think of? What's the worst that can happen, right? YOLO and technology meet again.
What makes matters worse is that facial recognition software is often incorrect. In perfect conditions (well-lit, high-resolution photos, with the person looking at the camera), facial recognition software in 2024 is incredibly accurate, but not all photos fed into these systems are good quality. CCTV footage is often low resolution, most people won't be looking directly at the camera, and false positive rates climb sharply for anyone who isn't a white male.
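To get a feel for why that matters, here's a toy simulation. Every number in it is invented; the point is simply that as image quality drops, the scores of innocent strangers creep up towards the match threshold, and the flagged-by-mistake rate balloons.

```python
import numpy as np

rng = np.random.default_rng(seed=7)
N = 100_000          # simulated comparisons against innocent strangers
THRESHOLD = 0.5      # "flag as a match" above this score

def false_positive_rate(impostor_mean: float, spread: float) -> float:
    """Fraction of innocent strangers whose score crosses the threshold."""
    impostor_scores = rng.normal(impostor_mean, spread, N)
    return float((impostor_scores > THRESHOLD).mean())

# Invented score distributions: degrading image quality (low resolution,
# off-angle faces) pushes stranger scores up and widens their spread.
for label, mean, spread in [
    ("studio photo", 0.10, 0.10),
    ("decent CCTV", 0.25, 0.15),
    ("grainy CCTV", 0.40, 0.20),
]:
    fpr = false_positive_rate(mean, spread)
    print(f"{label:>12}: ~{fpr:.2%} of innocent people flagged")
```

In this made-up model the studio photo flags almost nobody by mistake, while grainy CCTV flags roughly a third of everyone it's compared against – and a system run over millions of faces turns even a small error rate into a lot of wrongly accused people.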
Sex workers are at a higher risk of running afoul of facial recognition systems due to society’s strong desire to track and monitor them. Here's two examples of that in action:
- A sex worker in London had police show up at their home for a "welfare check" because they might be a "victim of trafficking". It's not explained what prompted the police to do this check, beyond typical police bullying, but it's not a stretch to assume police saw images of this sex worker online, ran them through a government database (drivers' licences, passports, etc.) to obtain their details, and paid them a visit just to remind them they're being watched.
- A "face-in" Canadian sex worker visiting the USA was detained in 2014 at Boston airport and received a 5-year ban on entering the USA after they were identified as a "duo" colleague of a fellow Canadian sex worker who was "face-out" and received a 10-year ban a year earlier. US authorities identified the face-out Canadian sex worker via facial recognition, then were able to link the face-in sex worker to the face-out worker by association.
We live in a world that contains a vast network of cameras watching us all the time without our knowledge or consent, backed by computer systems that cost fractions of a cent to analyse our faces, in the hands of ghouls and tech bros with few checks and balances – now what? Short of wearing a balaclava everywhere and keeping your internet presence as locked down and anonymous as possible (a huge challenge!), there's not much we can practically do. The vast majority of us are already in the facial recognition databases set up by governments and the companies we interact with during our everyday lives.
The strongest action we can take is to keep resisting the systems that inflict facial recognition upon us, and to support organisations that are fighting valiantly to expose where facial recognition systems are used and how they work, detail their flaws, and lobby governments for laws that require consent when these systems are developed and impose penalties for their misuse.
Organisations to support include:
- Digital Rights Watch (Australia)
- Electronic Frontier Foundation (USA)
- American Civil Liberties Union (USA)
- European Digital Rights (Europe)
- noyb (Europe)
- Big Brother Watch (UK)
- Open Rights Group (UK)
Got a tech question for Ada? She wants to hear from you!
Ada answers all your questions about tech, the online world, and staying safe in it. No question is too silly, no hypothetical is too far-fetched! Learn to leverage devices, systems, and platforms to your benefit.