A diagram of the facial recognition doorbell from Amazon's patent application. Courtesy ACLU

The tech company’s proposed facial-recognition camera system could be a civil libertarian’s nightmare.

Remember Jane Jacobs’ famous dictum that to create safe and healthy neighborhoods, “there must be eyes on the street”? Amazon has come up with a rather dystopian twist on that concept.

In a new, publicly available patent application, the tech giant presented a product that would incorporate facial scanning technology (like Amazon’s “Rekognition,” which can capture and identify a large number of people’s faces in real time) into Ring, the video-doorbell company that Amazon acquired earlier this year. The goal is to identify not just those who ring the doorbell, as “smart” video doorbells like Google’s Nest currently do. A Rekognition-powered model would allow users to receive detailed information about who is approaching the house in real time, “enabling users to make more educated decisions on whether the person is suspicious or dangerous, and also whether or not to notify law enforcement, family members, neighbors or the like,” the patent reads.

With this doorbell, what users are really opening the door to is bias, the American Civil Liberties Union (ACLU) fears.

“It’s rare for patent applications to lay out, in such nightmarish detail, the world a company wants to bring about,” Jacob Snow, a technology and civil liberties attorney at the ACLU of Northern California, wrote in a blog post. “Amazon is dreaming of a dangerous future, with its technology at the center of a massive decentralized surveillance network, running real-time facial recognition on members of the public using cameras installed in people’s doorbells.”

As users of the social-media service Nextdoor and many a neighborhood listserv know, online community chat boards can be notorious platforms of paranoia, broadcasting accounts of “suspicious activity” that are often rooted in racial bias. Adding a doorbell facial recognition system to this chatter could take the consequences of that fear to new and troubling levels.

Imagine, for a second, that a group of volunteers approaches a neighborhood as part of a voter registration drive. If Amazon’s technology is installed in that neighborhood, the faces of these individuals could be scanned from multiple vantage points and potentially shared with the government. If any of them match the “database of suspicious persons”—likely a criminal database kept by the government—the system could ping police or other neighbors. Or, in another iteration, if a caller’s face doesn’t match a list of “authorized people” created by a user, the system could add that image to the user’s own list of suspicious persons and raise the alarm accordingly.

Privacy experts and civil liberties advocates have long warned about facial recognition technology. First, it has been found to be less reliable for people of color and women, and it is known to generate false matches. Second, the technology is largely unregulated: there are few rules about who can collect, use, and share this data—and how.

Georgetown Law’s Center on Privacy and Technology compiled information showing that the FBI and police departments all over the country are using facial recognition technology, and found that these agencies subject around 117 million American adults to a “virtual line-up” without their consent. “[T]he FBI has built a biometric network that primarily includes law-abiding Americans,” the authors write. “This is unprecedented and highly problematic.” Imagine if that unregulated technology for categorizing people as potential criminals were installed in every household. (In fact, Thomas Brewster at Forbes built a similar doorbell face scanner using Rekognition for under $10 earlier this year—though of course, it didn’t ping the police.)
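The Forbes experiment gives a sense of how little engineering such a scanner requires. As a rough illustration only, the sketch below shows the kind of single call to Amazon’s publicly documented Rekognition CompareFaces API (via the boto3 Python SDK) that a hobbyist build like Brewster’s might rest on. The file names and the 90 percent similarity threshold are assumptions made for the example, not details from his build or from Amazon’s patent.

```python
# Illustrative sketch only: compare a face captured at the door against a
# stored photo using Amazon Rekognition's CompareFaces API.
# File names and the similarity threshold are assumptions for this example.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("doorbell_frame.jpg", "rb") as doorbell_img, \
     open("stored_photo.jpg", "rb") as stored_img:
    response = rekognition.compare_faces(
        SourceImage={"Bytes": doorbell_img.read()},   # frame from the doorbell camera
        TargetImage={"Bytes": stored_img.read()},     # photo from a stored list
        SimilarityThreshold=90,                       # only return matches above 90%
    )

# Each match carries a similarity score; a hobbyist script could simply treat
# anything above the threshold as a "recognized" visitor.
matches = response.get("FaceMatches", [])
for match in matches:
    print(f"Possible match, similarity: {match['Similarity']:.1f}%")
if not matches:
    print("No match above the threshold.")
```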

Amazon has already faced other criticism over Rekognition. The tech company has piloted the technology with police departments without proper safeguards in place, and pitched it to Immigration and Customs Enforcement (ICE), the agency that rounds up and deports undocumented immigrants in the interior of the country. Democrats in Congress have sent several letters to Amazon CEO Jeff Bezos demanding more transparency. Civil rights groups have objected. Shareholders have expressed concern. And Amazon employees have staged protests, asking the company to stop selling the technology to law enforcement and ICE. Per the employee letter, made available by Gizmodo:

We don’t have to wait to find out how these technologies will be used. We already know that in the midst of historic militarization of police, renewed targeting of Black activists, and the growth of a federal deportation force currently engaged in human rights abuses—this will be another powerful tool for the surveillance state, and ultimately serve to harm the most marginalized.

In a blog post from June, Amazon noted that “there has been no reported law enforcement abuse of Amazon Rekognition” and that its use policy forbids “any kind of illegal discrimination or violation of due process or privacy right.”

The doorbell facial scanner is not the only surveillance tool Amazon has envisioned recently: The company has also been granted two patents for a wristband that would allow the wearer’s actions to be monitored and tracked. Such devices could be used to send feedback to warehouse employees, nudging them to get back to tasks such as packaging orders. “They want to turn people into machines,” Max Crawford, a former Amazon warehouse worker, told the New York Times regarding the wristband. “The robotic technology isn’t up to scratch yet, so until it is, they will use human robots.”

It’s not clear that Amazon will actually manufacture these products once it has secured the patents. And of course, it is hardly the only tech company collecting and sharing sensitive data without consent. But what these applications represent, as the ACLU’s Snow points out, is a vision for the future, one that integrates increasingly powerful surveillance technology into homes, workplaces, and everywhere in between.

It is also arguably the opposite of what Jacobs was talking about in The Death and Life of Great American Cities. Her idea of “eyes on the street” was that neighbors would “see” each other, get to know each other, and collectively reduce the fear of urban spaces. The technologies that today’s “smart cities” are being encouraged to integrate—bugged LED street lights, license plate readers, invasive drones, and cellphone trackers—seem to embrace a very different worldview: these eyes scan everyone, but their gaze may harm specific groups disproportionately, making cities less equitable and less welcoming places.
