
Bravo to Microsoft: Pro-White Bias in Facial Recognition is a Problem

The hip city of Brookhaven, Georgia, just inside the 285 perimeter, is watching you, with the help of citizens. “Partnering with Ring using the Neighbors app will give officers a technological advantage when investigating crimes,” Brookhaven Police Chief Gary Yandura said in a 2019 press release. The neighboring metro Atlanta communities of Dunwoody and Chamblee also use cameras–their own and those of citizens who willingly partner–to track people.

Chamblee has cameras capable of reading license plates at every major entry and exit road in the city. This helps stem the epidemic of car thefts by gang members, drug addicts and others who use stolen cars to commit other crimes, since those vehicles can't be traced back to their drivers. Last year, at my workplace, a criminal stole an employee’s pickup truck from the parking lot in broad daylight, in sight of other employees. One of them reported it to the truck’s owner. (“Hey Jimmy*, whoever you let borrow your truck left in a hurry, and nearly smashed my car!” “What!?” *Not really named Jimmy.)

Personal doorbell cloud service “Ring” has partnered with law enforcement, letting users team up with police and giving officers eyes on the street, well, everywhere. But now, with police reform the watchword in woke culture, not to mention libertarian circles, both Amazon (which owns Ring) and Microsoft have halted sales of facial recognition surveillance tech to police until there is federal legislation governing its use.

This is a smart take from Microsoft president Brad Smith.

“We will not sell facial-recognition technology to police departments in the United States until we have a national law in place, grounded in human rights, that will govern this technology,” Smith said.

Microsoft won’t sell police its facial-recognition technology, following similar moves by Amazon and IBM, The Washington Post, June 11, 2020

We do need reform here. In a chilling remark from November 2019, Brian Huseman, a public policy vice president at Ring’s parent company, Amazon, told Massachusetts Sen. Ed Markey the company had placed “few restrictions” on how the data could be used by police.

Asked about Ring’s plans regarding adding facial-recognition capabilities to its cameras, Huseman wrote that it was a “contemplated but unreleased feature” that would be made available to the public only with “thoughtful design including privacy, security and user control.” Huseman also listed other security cameras that offer facial-recognition features and wrote, “We do frequently innovate based on customer demand.”

Police can keep Ring camera video forever and share with whomever they’d like, Amazon tells senator, The Washington Post, November 19, 2019

If you own a Ring doorbell, you can’t stop it from storing your video on the Internet (the “cloud” is an essential part of the service), but you can turn off the law enforcement sharing feature. Even with sharing on, you still have to approve each request sent to your device when police want video. But the police don’t have to tell you what they’re investigating, or why they want your footage.

Federal agencies have long used cellphone data to track people of interest, usually under a warrant. But now the Trump administration, through the Department of Homeland Security, has bought access to a commercial database of location data collected from ordinary smartphone apps with location services turned on. So when you’re using Waze to navigate traffic, or checking a weather app, your location could be sold without your knowledge to some data aggregator.

This data can be harvested by the DHS, or any other police agency that pays for it, and coupled with government-owned cameras, Internet-of-Things devices, doorbell video and other sources to find whoever/whatever they want. So why is Microsoft opposing the use of facial recognition software by police until Congress regulates it?

The problem here is that facial recognition software has a race issue. The algorithms don’t do a good job recognizing non-white faces. Wired noted this in 2019: “US government tests find even top-performing facial recognition systems misidentify blacks at rates five to 10 times higher than they do whites.”

The Department of Homeland Security has also found that darker skin challenges commercial facial recognition. In February, DHS staff published results from testing 11 commercial systems designed to check a person’s identity, as at an airport security checkpoint. Test subjects had their skin pigment measured. The systems that were tested generally took longer to process people with darker skin and were less accurate at identifying them—although some vendors performed better than others. The agency’s internal privacy watchdog has said DHS should publicly report the performance of its deployed facial recognition systems, like those in trials at airports, on different racial and ethnic groups.
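The disparity quoted above is easy to miss in the abstract, so here is a back-of-the-envelope sketch. The rates and crowd size below are made-up assumptions for the arithmetic, not figures from the NIST or DHS tests; only the 5-to-10-times multiplier comes from the reporting above.

```python
def expected_false_matches(people_scanned: int, false_match_rate: float) -> float:
    """Expected number of innocent people wrongly flagged by a watchlist scan."""
    return people_scanned * false_match_rate

# Assumed baseline false-match rate for the best-served group (illustrative).
baseline_fmr = 0.001

# The 5-10x disparity cited above, taken at its upper end.
higher_fmr = baseline_fmr * 10

crowd = 100_000  # e.g., faces scanned over a week of city-wide surveillance

print(expected_false_matches(crowd, baseline_fmr))  # 100.0 people wrongly flagged
print(expected_false_matches(crowd, higher_fmr))    # 1000.0 people wrongly flagged
```

The same software, pointed at the same crowd, produces ten times as many false hits for the group it serves worst. That gap, not the existence of the technology, is the problem Smith is pointing at.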

When you’ve got bias in the system, based on skin tone, that’s not a well-developed technology. We think it’s bad that Trayvon Martin was surveilled by George Zimmerman because he was a black teen in a hoodie. That surveillance led to Martin confronting Zimmerman (I am not arguing that Martin was an angel, only that Zimmerman made the first move), and to Martin’s death.

It’s also bad that self-proclaimed vigilantes in coastal Georgia tracked down and killed a black man, Ahmaud Arbery, because he was running through a neighborhood where video had allegedly captured other young black men committing crimes.

Imagine police cameras misidentifying people based on their race. With police reform moving toward always-on body cameras, and police cruisers bristling with lenses, this is a present-day issue, not some dystopian fiction.

In places like London, nearly every square foot of public space is covered by video and facial recognition, looking for terrorist activity or other violent crime. Before major cities in America turn into London, we need to be especially cognizant of the racial implications in using facial recognition in a widespread policing application.

The federal government needs to study this, and Congress needs to come up with some protections for our rights. Once the surveillance state is in full operation, we could be headed toward a much worse replay of George Floyd-type events, in which a violent perpetrator is misidentified and an innocent person is killed by police.

Plus, I just don’t like the idea of being watched all the time. A society based on liberty should not be founded on a surveillance state. Good for Microsoft and Amazon for forcing the issue. Now it’s up to Congress and the Trump administration to come up with an answer.

