Recent news of Amazon’s engagement with law enforcement to provide facial recognition surveillance (branded ‘Rekognition’), along with the almost unbelievable news of China’s use of the technology, means that the technology industry needs to confront the darker, more offensive side of some of its most spectacular advancements.
Facial recognition technologies, used in the identification of suspects, negatively affect people of color — to deny this fact would be a lie.
And clearly, facial recognition-powered government surveillance is an extraordinary invasion of the privacy of all citizens, and a slippery slope to losing control of our identities altogether.
There’s really no ‘nice’ way to acknowledge these things.
I’ve been pretty clear about the potential dangers associated with current racial biases in face recognition, and open in my opposition to the use of the technology in law enforcement.
As the Black chief executive of a software company developing facial recognition services, I have a personal connection to the technology, both culturally and socially.
Having the privilege of a comprehensive understanding of how the software works gives me a unique perspective that has shaped my positions on its uses. As a result, I (and my company) have come to believe that the use of commercial facial recognition in law enforcement or in government surveillance of any kind is wrong — and that it opens the door for gross misconduct by the morally corrupt.
To be truly effective, the algorithms powering facial recognition software require a massive amount of information. The more images of people of color it sees, the more likely it is to properly identify them. The problem is, existing software has not been exposed to enough images of people of color to be confidently relied upon to identify them.
And misidentification could lead to wrongful conviction, or far worse.
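To make that concrete, here is a minimal, hypothetical sketch of the kind of audit that exposes this disparity: it measures a face matcher’s false match and false non-match rates separately for each demographic group in a labeled evaluation set. The records, threshold, and group names below are illustrative placeholders, not data from any real product.

```python
from collections import defaultdict

# Hypothetical evaluation records: (demographic_group, true_match, matcher_score),
# produced by running a face matcher over labeled verification pairs.
results = [
    ("group_a", True, 0.91), ("group_a", False, 0.12), ("group_a", False, 0.55),
    ("group_b", True, 0.48), ("group_b", False, 0.62), ("group_b", True, 0.71),
]

THRESHOLD = 0.6  # illustrative decision threshold

def error_rates_by_group(records, threshold):
    """Compute false match and false non-match rates per demographic group."""
    stats = defaultdict(lambda: {"fm": 0, "nonmatch": 0, "fnm": 0, "match": 0})
    for group, is_match, score in records:
        predicted_match = score >= threshold
        if is_match:
            stats[group]["match"] += 1
            if not predicted_match:
                stats[group]["fnm"] += 1  # false non-match: a real match missed
        else:
            stats[group]["nonmatch"] += 1
            if predicted_match:
                stats[group]["fm"] += 1   # false match: the wrong person "identified"
    return {
        group: {
            "false_match_rate": s["fm"] / s["nonmatch"] if s["nonmatch"] else 0.0,
            "false_non_match_rate": s["fnm"] / s["match"] if s["match"] else 0.0,
        }
        for group, s in stats.items()
    }

if __name__ == "__main__":
    for group, rates in error_rates_by_group(results, THRESHOLD).items():
        print(group, rates)
```

When one group’s false match rate is several times another’s, that is not an abstract statistic; it is the difference between walking free and being pulled into an investigation.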
Let’s say the wrong person is held in a murder investigation. Let’s say you’re taking someone’s liberty and freedoms away based on what the system thinks, and the system isn’t fairly viewing different races and different genders. That’s a real problem, and it needs to be answered for.
There is no place in America for facial recognition that supports false arrests and murder.
In a social climate wracked with protests and angst over disproportionate prison populations and police misconduct, deploying software that is clearly not ready for civil use in law enforcement does not serve citizens — it will only lead to further unrest.
Whether or not you believe government surveillance is okay, using commercial facial recognition in law enforcement is irresponsible and dangerous.
While the rest of the world speculates about the reasons we are being monitored, the Chinese government has been transparent about why it is watching all 1.4 billion of its citizens — and it’s not for their safety.
China’s use of face recognition software for surveillance is a stark example of why we have never engaged, and will never engage, with government agencies, and why it’s an ethical nightmare to even consider doing so.
China is currently setting up a vast public surveillance network that uses face recognition to construct “social credit” systems, which rank citizens based on their behavior and queue up rewards and punishments depending on their scores. They’ve already proven, in the case of one man arrested after being spotted by their CCTV network in a crowd of 60,000 people, exactly how badly this could go.
The exact protocol is closely guarded, but examples of ‘punishment-worthy’ infractions include jaywalking, smoking in non-smoking areas, and even buying too many video games. Punishments for poor scores include travel restrictions, among other penalties.
Yes. Citizens will be denied access to flights, trains — transportation — all based on the ‘social behavior’ equivalent of a credit score. If all of this constant surveillance sounds insane, consider this: right now the system is piecemeal, and it’s in effect in select Chinese provinces and cities.
Imagine if America decided to start classifying its citizens based on a social score.
Imagine if American police, with their already terrifying record of racial disparity in the use of force, had the added power and justification of labeling someone “socially incorrect”.
Recently, we read about Amazon’s Rekognition being used by law enforcement in Oregon. They claimed it won’t be a situation where there’s a “camera on every corner,” as if to suggest that face recognition software requires constant, synchronized surveillance footage.
In truth, Rekognition and other software simply require you to point them at whatever footage you have — social media, CCTV feeds, or even police bodycams. And that software is only as smart as the information it’s fed; if that’s predominantly images of, for example, African Americans labeled as “suspects,” it could quickly learn to classify Black men as a categorical threat.
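To show how little is needed, here is a minimal sketch using the AWS SDK for Python (boto3) and Rekognition’s search_faces_by_image call; the region, collection ID, and file name are placeholders, and it assumes a face collection has already been indexed from some existing set of images.

```python
import boto3

# Hypothetical collection an agency might have indexed from mugshots or bodycam stills.
COLLECTION_ID = "example-face-collection"

# Placeholder region; credentials are taken from the environment as usual.
rekognition = boto3.client("rekognition", region_name="us-west-2")

# Any single still image will do: a social media photo, a CCTV frame, a bodycam capture.
with open("frame_from_any_footage.jpg", "rb") as image_file:
    response = rekognition.search_faces_by_image(
        CollectionId=COLLECTION_ID,
        Image={"Bytes": image_file.read()},
        MaxFaces=5,
        FaceMatchThreshold=80,  # similarity cutoff; lower values return more (and riskier) matches
    )

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"Matched face {face['FaceId']} with similarity {match['Similarity']:.1f}%")
```

A single still frame, with no live camera network required, is enough to run a search like this against an entire indexed collection.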
Facial recognition is a dynamic tool that helps humanize our interactions with machines. Yet in China, a government desperate for more data is giving us a preview of how face recognition, when used for government surveillance, truly dehumanizes entire populations.
It’s a case of an amazing technology — capable of personalizing experiences, improving interactions and creating positive feelings — being used for the purpose of controlling citizens. And that, for me, is absolutely unacceptable. It’s not simply an issue for people of color, either: eventually, scanning software of any kind could measure the gait (the way you walk), the gestures and the emotions of anyone considered “different” by the government.
It is said that any tool, in the wrong hands, can be dangerous.
In the hands of government surveillance programs and law enforcement agencies, there’s simply no way that face recognition software will not be used to harm citizens. My company and I believe this to our core, to the point that we have missed out on very, very lucrative government contracts. I’d rather be able to sleep at night knowing that I’m not helping make drone strikes more “effective.”
We deserve a world where we’re not empowering governments to categorize, track and control citizens. Any company in this space that willingly hands this software over to a government, be it America’s or another nation’s, is willfully endangering people’s lives. And letters to Jeff Bezos aren’t enough. We need movement from the top of every single company in this space to put a stop to these kinds of sales.