The Due Process Clause of the Constitution guarantees the accused the right to a fair trial, but does a fair trial mean that the accused should have access to the latest technology to prove their innocence? Specifically, does an indigent defendant, one who cannot afford legal representation, have the right to AI facial recognition software?
In 2019, Andrew Conlyn was charged with vehicular homicide stemming from an accident that had left his friend dead three years earlier. The issue: Conlyn had not been the one driving. In the immediate aftermath of the wreck, a good Samaritan pulled Conlyn from the car. When police arrived at the scene, they spoke with the Samaritan, who told them what had happened. Unfortunately, the police failed to ask the man for his name or contact information, and once their conversation concluded, he vanished.
The right to a fair trial does not mean the best trial.
Conlyn and his lawyers searched tirelessly for the unknown Samaritan, whose testimony could clear Conlyn's name, but to no avail. Conlyn's lawyers, aware of law enforcement's use of facial recognition software, reached out to Clearview AI. Clearview's chief executive, Hoan Ton-That, granted the request, and within seconds of using the tool, Clearview's facial recognition software found a match using footage of the Samaritan captured on body cameras worn by the police.
If AI facial recognition software can provide exculpatory evidence for the defense in a matter of seconds, why is it so rarely in the hands of the accused?
While the Constitution guarantees a fair trial, it is silent on requiring the government to provide the accused with the best trial that technology can offer. The Due Process Clauses of the Fifth and Fourteenth Amendments provide that no one shall be "deprived of life, liberty, or property, without due process of law." The courts have subsequently found that the right of a defendant to a fair trial is essential to due process of law.
However, a fair trial does not necessarily mean the best trial, as evidenced by the Supreme Court's ruling in Arizona v. Youngblood. There, potentially exculpatory evidence had been accidentally destroyed by the government, and the defense therefore could not conduct its own tests. The Court found in Youngblood that, absent a showing of bad faith on the part of the state, the "failure to preserve potentially useful evidence does not constitute a denial of due process of law." Thus, it is unlikely, as of yet, that the courts would find an indigent defendant has a constitutional right to AI facial recognition software.
Facial recognition software is a double-edged sword.
Not all is lost, however. If the prosecution has obtained exculpatory evidence using facial recognition software, it is obligated to turn that evidence over to the defense under Brady v. Maryland. It is not, however, required to turn over the software itself.
Generally, discovery in a criminal case is far more limited than in a civil proceeding. And while the prosecution has an obligation to turn over exculpatory evidence to the defense, it is not required to do the defense's job for it. Thus, the state may have tools that could uncover potentially exculpatory evidence yet have no obligation to use them.
Of course, an issue arises when the state either lacks access to AI facial recognition software or believes the software would not be useful in proving an element of the charged crime.
In such cases, an accused appears to have two options. First, an indigent accused can hope that the state or federal government provides adequate funds to purchase AI facial recognition software. Second, as with Conlyn, the accused can petition the company directly for the use of its software. Clearview has indicated that it wants to make its software available to public defenders, with Mr. Ton-That stating it would be "free slash very inexpensive." Thus, there may soon be an opportunity for many more of the falsely accused to have their names cleared with the help of facial recognition software.
This is an admittedly optimistic view of facial recognition software, one that ignores the potential Fourth Amendment violations that may occur when companies like Clearview collect information without consent. It appears, then, that facial recognition software is a double-edged sword: while it may free some from false accusations, it makes all of us a little less free when it collects information without our consent.
Jonathan is a double Tar Heel and native of North Carolina. In law school, Jonathan is a member of the Moot Court Appellate Advocacy Team and a staff member on the North Carolina Journal of Law and Technology. Jonathan's favorite book series is The Wheel of Time by Robert Jordan.