The use of Artificial Intelligence (AI) and Machine Learning (ML) in criminal justice has been understandably controversial. The recent application of these technologies in the form of risk-needs assessment tools and their potential future application as AI judges has raised a myriad of concerns. While some worry that these algorithmic tools serve to perpetuate pre-existing biases, others worry that they raise serious Equal Protection or Due Process concerns.
Still others raise a more esoteric concern: that AI/ML tools are inherently incapable of moral judgment, which they consider necessary for judicial decision-making, especially in the criminal context. Accordingly, they fear that these tools offer an inadequate means of deciding the fate of criminal defendants and are especially ill-suited to replace judges altogether. It is this critique that this Comment seeks to challenge.
This Comment does not presume to argue that these tools are in fact capable of such moral judgment. Instead, it challenges the premise, arguing that the degree to which a capacity for moral judgment is central to a judge’s role depends largely on the presiding theory of punishment. Under a retributive framing, proponents of the moral judgment concern may well be right. After all, retributivism turns on a judgment about the moral culpability that attaches to a criminal defendant for his or her past acts. But this Comment contends that, under a more utilitarian framing, where concerns over moral culpability largely yield to forward-looking aims, the purported moral incompetence of these tools is less problematic.
Moreover, this Comment argues that our criminal justice system is (and, indeed, should be) trending away from a retributive framing in favor of a more utilitarian approach. Accordingly, at least with respect to their alleged moral incompetence, the use of AI and ML tools in criminal justice may be not only unproblematic but affirmatively desirable if our preferred theory of punishment is utilitarianism.