PREDICT AND SUSPECT: THE EMERGENCE OF ARTIFICIAL LEGAL MEANING


Recent theoretical writings on the possibility that algorithms might someday create law have deferred the question of algorithmic lawmaking, and the need to decide on its legitimacy, to some future time in which algorithms will be able to replace human lawmakers. This Article argues that such discussions risk essentializing an anthropomorphic image of the algorithmic lawmaker as a unified decision-maker and divert attention away from algorithmic systems that already perform functions that, together, have a profound effect on legal implementation, interpretation, and development. Adding to the rich scholarship on the distortive effects of algorithmic systems, this Article suggests that state-of-the-art algorithms capable of limited legal analysis can have the effect of preventing legal development. Such algorithm-induced ossification, this Article argues, raises questions of legitimacy no less consequential than those raised by futuristic algorithms that could actively create norms.

To demonstrate this point, this Article puts forward a hypothetical example of algorithms performing limited legal analysis to assist healthcare professionals in reporting suspected child maltreatment. Systems performing risk analysis to aid child protective services in screening maltreatment reports are already in use. Drawing on the example of algorithms increasingly used today in social media content moderation, this Article suggests that similar systems could be used to flag cases that show signs of suspected abuse. Such assistive systems, this Article argues, would likely cement the prevailing legal meaning of maltreatment. As mandated child-abuse reporters increasingly rely on such systems, legal evolution would stall, inhibiting changes to contentious elements in the legal definition of “reportable suspicion,” including, for example, the scope of acceptable physical discipline. Together with the familiar effects of existing systems, this hypothetical algorithmic system could have a profound impact on the path of the law regarding child maltreatment, equivalent in its significance to the effect that autonomous algorithmic adjudication would have.

Author: Daniel Maggen

PDF: http://ncjolt.org/wp-content/uploads/sites/4/2021/10/NCJOLT-Vol.23.1_67-122_Maggen.pdf

Volume 23, Issue 1