
A Major Police Body Cam Company Just Banned Facial Recognition

Its ethics board says the technology is not reliable enough to justify using.

Credit: Claire Merchlinsky

Mr. Warzel is an Opinion writer at large.

Axon, the company that supplies 47 out of the 69 largest police agencies in the United States with body cameras and software, announced Thursday that it would ban the use of facial recognition systems on its devices.

“Face recognition technology is not currently reliable enough to ethically justify its use,” the company’s independent ethics board concluded.

Even as facial recognition systems are rolled out by private companies — from airlines to smartphone makers — institutions nationwide are balking at government use of algorithmically powered surveillance tools.

In May, San Francisco’s Board of Supervisors voted to ban the use of facial recognition technology by the city’s police and other agencies. Other cities, including Berkeley and Oakland, Calif., and Somerville, Mass., are mulling or close to enacting similar bans. Earlier this month, California lawmakers announced they’re considering a statewide ban on facial recognition in police body cams.

In a 28-page report, Axon’s ethics board, which was handpicked by members of the Policing Project at New York University School of Law, argued that the technology “does not perform as well on people of color compared to whites, on women compared to men, or young people compared to older people.”

The report also cautioned that facial recognition is especially prone to inaccuracy when used with police body cameras, which frequently operate in low-light conditions and produce shaky footage.

“The tech is just not accurate enough,” Barry Friedman, founding director of N.Y.U.’s Policing Project and a member of the ethics board, told me. “Until that’s fixed we don’t need to say another word. And that could be years.”

Axon’s move is a rare departure from the “move fast and break things” style of innovation traditionally associated with new technologies. And it may very well indicate that, when it comes to facial tracking and privacy, policing is where we draw the line.

It’s a crucial moment for facial recognition. Though most police departments have yet to deploy it, some uses by law enforcement have been troubling.

This year researchers found that Detroit had signed a $1 million deal with a vendor to continuously screen hundreds of public cameras throughout the city without citizen approval. In May, Clare Garvie, a facial recognition researcher at Georgetown Law, revealed sketchy tactics used by the New York Police Department to match security camera footage with potential suspects who looked like celebrities.


The technical limitations and biases of facial recognition technology are not well understood even by the companies that market the systems, which makes oversight of its use in the real world particularly problematic. Critics, meanwhile, worry that widespread deployment of the technology risks laying the foundation of a comprehensive surveillance state (just look at China).

“There’s a race to the bottom right now with this technology, and the challenge is to stop that elevator before it goes through the ground floor,” Mr. Friedman said. It’s a mind-set the ethics board fought to change. According to the board’s report, in early conversations about facial recognition, Axon initially argued that it “could not dictate to customers how products were used, nor its customers’ policies, and that it could not feasibly patrol misuse of its product.” That’s Big Tech’s version of “guns don’t kill people, people kill people.” And it’s a view that’s widely held across the industry.

Mr. Friedman hopes that Axon’s pledge will force other vendors to think about where the new technology might be headed and how it could impact the most vulnerable. “We want them to remember that just because you can build it, doesn’t mean you should.”

The ultimate goal of the ethics board goes a step further: forcing the company to see that the customer for Axon products is not law enforcement but “the community that those law enforcement and public safety organizations serve.”

Axon’s ban isn’t necessarily foolproof. An Axon representative confirmed that law enforcement officials could potentially download Axon body cam footage and then transfer it to a third-party service, like Amazon’s Rekognition. However, Eric Piza, an associate professor at the John Jay College of Criminal Justice, said that the process is time-consuming and requires spending the money for yet another tech service. “If six officers respond to a scene, that’s hours of manpower and extra expense, which might reduce the likelihood they use the technology,” he said.

Still, Mr. Piza sees Axon’s moratorium as an important step. “Everyone’s concerned about big data policing, and that they put privacy above short-term financials is not something that we see enough.”

Axon’s decision won’t completely stop law enforcement from using the technology — police departments could still use it on surveillance videos, for instance. And true progress will have to come from regulation at the city, state or federal level.

But the move demonstrates the potential for independent ethics boards to help guide technology companies whose products could drastically alter public life. If the stewards of our biggest technology companies don’t operate with an internal conscience, the least they could do is outsource one.




Charlie Warzel, a New York Times Opinion writer at large, covers technology, media, politics and online extremism. He welcomes your tips and feedback: charlie.warzel@nytimes.com | @cwarzel

A version of this article appears in print in Section A, Page 22 of the New York edition with the headline: A Major Police Body Cam Maker Just Banned Facial Recognition.
