Artificial Intelligence and Fourth Amendment Searches

October 2, 2018

In Kyllo v. United States, the Supreme Court held that the use of “sense-enhancing” technology not in general public use to examine a criminal suspect’s house constituted a Fourth Amendment search. With the increasing use of artificial intelligence in surveillance technology, however, this standard will likely be challenged in the near future. The Kyllo standard focuses on the officer’s agency in utilizing the sense-enhancing technology and therefore does not address situations in which the surveillance technology itself chooses who and what to look at.

Today, police have access to a variety of new AI-enhanced surveillance technologies. For example, police in cities like Denver and Milwaukee have found success with ShotSpotter, which combines artificial intelligence with long-range acoustic detection to pinpoint the location of gunshots for fast police response. Another company, Hikvision, has released surveillance cameras that use neural networks to perform high-accuracy facial recognition on everyone in potential high-crime areas. Police in both South Africa and China have already begun to use these cameras, and there is concern about potential future implementation in the United States. Finally, police use of advanced technology is not limited to stationary detectors. Police departments in the United States have been using camera-equipped drones to enhance their optical capabilities, and many expect that the use of these drones will only increase in the years to come. These drones, too, will benefit from the increasing sophistication of AI systems. Drones have already been developed that use deep learning to navigate complex hiking trails, and it does not take much imagination to see the same sort of learning used to let drones autonomously navigate the streets of a city, looking for potential crime.

The integration of autonomous drones with advanced optical technology poses an important question in light of Kyllo: when is an officer “using” sense-enhancing technology? In Kyllo, the officer had infrared vision technology in his direct control and chose exactly where to examine the suspect’s house. However, the very nature of AI is such that officers will, at some point, no longer need to make decisions about where the technology is directed. As a hypothetical, imagine a situation in which a drone that has learned to navigate the streets of a large city determines certain areas to have a higher incidence of crime. The drone then tends to spend a larger percentage of its time in those areas. Such a drone could also learn which types of properties have been the most useful to monitor in the past and at what angle its camera should be pointed to catch the largest amount of criminal activity. If that drone were to, without a warrant or direct instruction from any officer, point its camera at a person’s home and find evidence of criminal activity, would that really constitute the officer “using” sense-enhancing technology to uncover evidence of the crime? In this situation, it is difficult to ascribe any sort of intentional action to the officer at all.
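To make the hypothetical concrete, the patrol behavior described above could be as simple as dividing flight time in proportion to previously observed incidents. The following sketch is purely illustrative; the neighborhood names, incident counts, and allocation rule are invented for the example and do not describe any deployed system.

```python
# Invented incident counts the drone has logged per neighborhood
observed_incidents = {"riverside": 4, "downtown": 10, "hillcrest": 2}

def allocate_patrol_minutes(incidents, total_minutes=480):
    """Split a shift's flight time in proportion to past incidents."""
    total = sum(incidents.values())
    return {
        area: round(total_minutes * count / total)
        for area, count in incidents.items()
    }

schedule = allocate_patrol_minutes(observed_incidents)
# downtown, with the most logged incidents, receives the largest
# share of the eight-hour (480-minute) shift
```

Even this trivial rule already removes the officer from the loop: no one decides where the drone flies on a given day; the allocation falls out of the data.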
For a more direct comparison to Kyllo, consider the following example. One of the drones employed by a city’s police has been navigating the streets and, using an equipped infrared camera, has been collecting large amounts of information regarding the size of physical structures on each property in the city and the heat distributions of those structures. After six months, the police feed the drone information regarding which structures contained confirmed drug operations during those six months. Without any further guidance from the police, the drone correlates those past confirmed drug operation locations with the size and heat distributions of the corresponding properties. Going forward, the AI creates risk metrics for each property based on that property’s size and heat distribution. It seems highly unlikely that the Kyllo court could apply the same logic to deem the scans performed by this drone a search.
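The risk-metric step in this example can be sketched in a few lines. Everything below is a hypothetical illustration: the parcel names, feature values, and similarity-based scoring rule are assumptions made for the sake of the example, not a description of any real system (a real system would also standardize the features before comparing them).

```python
import math

# Six months of drone observations, invented for illustration:
# (structure size in sq ft, peak roof temperature in deg F)
observations = {
    "parcel_a": (1800.0, 68.0),
    "parcel_b": (2100.0, 71.0),
    "parcel_c": (1950.0, 115.0),  # unusually hot roof
    "parcel_d": (1960.0, 112.0),  # similar size and heat signature
}

# Police later label which parcels held confirmed drug operations
confirmed = {"parcel_c"}

def risk_score(features, observations, confirmed):
    """Score a property by similarity to confirmed operation sites.

    Similarity decays with Euclidean distance in feature space; a
    property identical to a confirmed site scores 1.0.
    """
    scores = [
        1.0 / (1.0 + math.dist(features, observations[name]) / 100.0)
        for name in confirmed
    ]
    return max(scores) if scores else 0.0

ranked = sorted(
    observations,
    key=lambda p: risk_score(observations[p], observations, confirmed),
    reverse=True,
)
# parcel_d, whose size and heat distribution resemble the confirmed
# site, ranks just behind parcel_c itself
```

The point of the sketch is that once the labels are handed over, every subsequent scan-and-score decision is the model’s, not an officer’s.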
To resolve this issue, the court could implicitly extend the Kyllo ruling to artificial intelligence and hold that any visual information acquired about a person’s home without a warrant must be in the visible part of the electromagnetic spectrum and capped at some fixed resolution. This formulation of the rule would remove concern about the agency of the actor and focus instead on the type of information being acquired. When such a case concerning AI-acquired information inevitably reaches the Supreme Court, the court will be forced to weigh allowing police to make optimal use of technology to stop crime against the concern of ever-dwindling privacy in the face of new technology.