Amazon makes facial recognition software. To make money from the venture, it has been selling the software to law enforcement. This raised a stink in some circles, and now Amazon employees are saying it's wrong to sell the software to the police. Their letter got one thing right and one thing wrong. First, what it got right: the employees state that they "refuse to contribute to tools that violate human rights." Good for them. If they feel that's what the software does, then they have a moral or ethical obligation to stand up for that belief.
Where they get it wrong is when they say, "…we demand a choice in what we build, and a say in how it is used." No, you don't get to demand that. You can ask. You can strike in hopes of persuading the company, but it's not your company, nor your product. Workers demanding this sort of thing reminds me of Marx. The company owns the product and hires the worker. The worker has his compensation and no ownership over the product. Marx was wrong.
Is their protest really valid?
Now, I want to set all that aside for a moment. While I said they were right to protest against something they feel is wrong, I did not say I agree that it is wrong. It's good for people who feel something is wrong to speak their minds about it, but that doesn't mean they are right or that others must agree. Part of protesting in this way is to bring attention to the issue and try to bring people over to your side.
We are compelled to ask the question: is facial recognition software immoral in the hands of the police? I give the answer I often find myself giving in such situations: it depends. Most certainly it can be. There is also the matter of ethics versus morality; I will argue that for police there is no ethical dilemma, but there might be a moral one. If the software is used to scan a database of criminal records to find a particular suspect's history, I find no moral or ethical problem with its use. That is simply using it as a search algorithm over already existing data. It's nothing more than a convenience.
Facial recognition software is also used on crowds of people to locate a suspect. That use begins to stray across the line for me, but since it happens in a public space, I can understand arguments supporting it: there is a known suspect whom the police are already looking for, at a location where they believe that person might be. Still, this use makes me uncomfortable.
The software can also be used on that same crowd to gather data indiscriminately. That crosses the line, because it creates a database of faces of people who are not suspected of any crime. There is no reason for the police to be gathering their data. I am against random fingerprint, DNA, or facial recognition databases.
Do we really know what the police will use it for?
The letter from the employees to Jeff Bezos reads, in part, "We don't have to wait to find out how these technologies will be used. We already know that in the midst of historic militarization of police, renewed targeting of Black activists, and the growth of a federal deportation force currently engaged in human rights abuses — this will be another powerful tool for the surveillance state, and ultimately serve to harm the most marginalized."
What they've written is a mixed bag of ideas. Perhaps it was carefully worded on purpose, or perhaps they didn't think through what they wrote very well. The result is that ideas many people will agree with are wrapped up with fallacies or unprovable notions. For example, I'm against the militarization of the police and against a surveillance state, and I bet most people on the left and the right will tell you they are against those things if you ask them in those terms. Yet the employees are wrong when they write that we don't have to wait to see how these technologies will be used, and that we know they will be used to oppress Black activists and immigrants. It's also false to say that ICE is currently engaged in human rights abuses; there is no evidence of that at all.
A company is in the business of making money, and police departments are a major customer for facial recognition software. If Amazon wishes to make a profit on it, then it must target police departments as customers. If Amazon decides that doing so is immoral, it would likely be best to sell the software off to another company that has no moral qualms about selling it to police.
In my thinking, the software isn't the issue, nor is selling it to the police; Amazon should continue to do so. The issue is how the police use the software. To restrict that, Amazon workers should instead write letters to their lawmakers and encourage them to restrict how police (federal and local) use the software and to forbid them from collecting data on non-suspects.
I get so frustrated that lawmakers always seem to get a pass. It's really quite amazing to me. They are the only people who can actually do anything about all this. They are the ones who make the laws, and if you don't like a law, or feel a right isn't being properly protected, they, along with the courts, are the ones who need your attention. Writing a letter to Bezos won't really accomplish anything, even if Amazon stops selling the software to police; it just pushes the problem down the line without making it go away. Pushing your lawmakers to restrict police use of such software actually addresses your fears and provides a remedy if those restrictions are violated. Go to the lawmakers, people! Stop giving them a pass.