Your AI Doorbell Knows Who’s at the Door Before You Do

We have become accustomed to digital doorbell systems that double as security monitors, letting us see who is at our front door. Advances in AI, however, have enabled companies to integrate facial recognition into their systems so that your doorbell knows who's knocking before you do. Google's Nest system has already incorporated facial recognition technology. Google's Nest Cam IQ, Nest Hello video doorbell, and the Nest camera within the Nest Hub Max use "familiar face detection," which lets users "teach" the camera to "tell the difference between faces you know and don't know." Google's Nest devices progressively learn to recognize people's faces and alert the user when a familiar or unfamiliar face is at the door. An individual's "face data" is saved in the device's video history so that the face can be recognized in the future. Presently, Google's Nest devices save only the facial information that the user designates as belonging to someone familiar; the Nest app deletes facial information that the user marks as unknown. Google encourages users to get an individual's permission before saving that person's facial data. Further, Google tells users that local laws may require such permission, yet it has no system in place to actually require it.
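To make the "familiar face detection" behavior described above concrete, a minimal sketch of how such a classifier might behave follows. It is purely illustrative and is not Google's implementation: the FamiliarFaceStore class, the embedding comparison, and the 0.8 similarity threshold are all assumptions.

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class FamiliarFaceStore:
    """Embeddings of faces the user has explicitly labeled 'familiar'."""
    embeddings: dict = field(default_factory=dict)

    def teach(self, name, embedding):
        # Only faces the user marks as familiar are retained.
        self.embeddings[name] = embedding / np.linalg.norm(embedding)


def classify_visitor(store, embedding, threshold=0.8):
    """Return an alert label: a known name or 'unfamiliar face'."""
    probe = embedding / np.linalg.norm(embedding)
    for name, known in store.embeddings.items():
        if float(np.dot(probe, known)) >= threshold:  # cosine similarity
            return f"familiar face: {name}"
    # Unmatched faces are not added to the store, mirroring the deletion
    # of data the user marks as unknown.
    return "unfamiliar face"


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    store = FamiliarFaceStore()
    friend = rng.normal(size=128)
    store.teach("frequent visitor", friend)
    print(classify_visitor(store, friend + rng.normal(scale=0.05, size=128)))
    print(classify_visitor(store, rng.normal(size=128)))
```

Even in this toy version, the key design fact is visible: anyone who approaches the door is scanned and compared, whether or not they ever consented to being enrolled.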

“Adding facial recognition to a digital doorbell system moves away from providing household security and into a new realm of ever-present, always-watching intrusions of privacy.”

Google’s facial recognition technology in its Nest doorbells raises privacy concerns. Although Google encourages users of its doorbell security system to get someone’s permission before saving his or her facial information, it has no measures in place to ensure that only consenting individuals’ information is saved. Moreover, it is impractical to expect a user to obtain permission from every person who regularly comes to the door. Would individuals even understand the extent of the facial tracking and the indefinite nature of the permission they grant? Additionally, the possibilities for misuse are endless. All too frequently, racial profiling drives digital doorbell users’ reports of “suspicious” activity. In one instance, police stopped an African American real estate agent because digital doorbell users believed it was “suspicious” for him to ring their doorbell. The added ability to save and track individuals’ faces increases the potential for privacy invasions. The distinction between familiar and unfamiliar faces heightens these concerns. A new postal worker could needlessly put a digital doorbell user on high alert simply because his or her face is not the usual “familiar” one. Adding facial recognition to a digital doorbell system moves away from providing household security and into a new realm of ever-present, always-watching intrusions of privacy.

Amazon plans to take facial recognition beyond recognizing the faces of your family and friends. In 2018, it filed two facial recognition patent applications for using Ring devices to automatically alert the police when they recognize a “suspicious” individual. The facial recognition technology would determine whether the Ring audio or video contained either a known criminal, which includes convicted felons, sex offenders, and individuals on a “most wanted” list, or an apparently “suspicious” person; the device could then automatically send the video to law enforcement. The American Civil Liberties Union has condemned use of facial recognition technology in this manner, stating that “Amazon is dreaming of a dangerous future.”
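Reduced to its essentials, the decision flow described in the patent applications could be as simple as the sketch below. The database contents, identifiers, and the evaluate_visitor function are hypothetical; the filings do not disclose an actual implementation.

```python
from enum import Enum, auto
from typing import Optional


class Disposition(Enum):
    IGNORE = auto()
    FORWARD_TO_POLICE = auto()


# Hypothetical stand-in for the databases the patent applications describe
# (convicted felons, sex offenders, "most wanted" lists).
KNOWN_CRIMINAL_IDS = {"felon-123", "wanted-456"}


def evaluate_visitor(matched_id: Optional[str], flagged_suspicious: bool) -> Disposition:
    """Decide whether footage is sent to law enforcement, with no human review."""
    if matched_id in KNOWN_CRIMINAL_IDS or flagged_suspicious:
        # The device itself reports the visitor; the homeowner is never asked.
        return Disposition.FORWARD_TO_POLICE
    return Disposition.IGNORE


if __name__ == "__main__":
    print(evaluate_visitor("felon-123", flagged_suspicious=False))  # FORWARD_TO_POLICE
    print(evaluate_visitor(None, flagged_suspicious=True))          # FORWARD_TO_POLICE
    print(evaluate_visitor(None, flagged_suspicious=False))         # IGNORE
```

Even this toy version makes the core concern visible: the forwarding decision involves no human judgment at any point.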

If Amazon’s plans for its Ring doorbells come to fruition, your doorbell could recognize what Amazon has vaguely described as a “suspicious” individual and alert the police, all before you even know who is at the door. Internet of Things (IoT) technologies, interconnected devices that communicate with one another and require no human intervention, already have the ability to make decisions and take action without your knowledge. For example, IoT-enabled cars and washing machines can order parts they recognize as necessary without your permission. But when IoT automatic action moves beyond self-ordering a car part and into automatically calling the police, there is significant cause for concern. Individuals would be automatically profiled based on criminal status without any report from an actual person. Such a system would eviscerate whatever privacy one has in approaching a doorstep. Furthermore, it is unclear whom the Ring doorbell will recognize as “criminal” or “suspicious.” Will the Amazon Ring doorbell automatically alert the police when your friend, who has outstanding parking tickets, knocks on your door? Such an absurd circumstance could become reality with round-the-clock video doorbell surveillance coupled with automatic police reporting. Privacy rights are already eroding with the constant facial recognition capabilities of devices like Google’s Nest doorbells, but civil liberties would come under serious attack if Amazon were to pursue this as the future of doorbell security.

Stephanie Long