Nicholas Anderson
CST300 Writing Lab
September 29, 2022
Ethical Analysis of COMPAS Algorithm

With the rise in use of artificial intelligence (AI) algorithms, ethical questions and

dilemmas are bound to arise. One such dilemma involves the use of an AI algorithm developed by the company Northpointe for use in criminal sentencing and parole decisions. The COMPAS algorithm is used to predict the risk that a criminal will reoffend. With human lives involved, how data is used to create this rating comes into focus. Does the data used in the COMPAS algorithm produce predictions accurate and fair enough to inform the sentencing of human beings?

At their core, AI algorithms are statistical. An algorithm uses large data sets to produce a statistical prediction. AI makes predictions that seem like intelligence, but they are really only as intelligent as the data the algorithm has been trained on. Understanding how the statistical data in an algorithm has been collected is important because, if the data is weighted improperly, the accuracy of the algorithm can become skewed.
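To illustrate this dependence on training data, the brief sketch below trains the same simple model on two differently sampled, fully synthetic data sets and shows that it gives the same individual two different predicted risks. The feature, base rates, and numbers are invented for illustration only and do not reflect any real system.

```python
# Minimal sketch (hypothetical data): a classifier's "intelligence" is just
# the statistics of the data it was trained on. Training the same model on
# two differently sampled data sets yields different predictions for the
# same individual.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_data(reoffend_rate):
    """Synthetic training set: one feature (prior arrests) and a label."""
    priors = rng.poisson(2, size=1000)
    # Label probability depends on priors plus the sampled base rate.
    labels = rng.random(1000) < (reoffend_rate + 0.05 * priors)
    return priors.reshape(-1, 1), labels.astype(int)

# Same model class, two training samples with different base rates.
model_a = LogisticRegression().fit(*make_data(reoffend_rate=0.2))
model_b = LogisticRegression().fit(*make_data(reoffend_rate=0.5))

same_person = np.array([[3]])  # an individual with 3 prior arrests
print(model_a.predict_proba(same_person)[0, 1])  # roughly 0.35
print(model_b.predict_proba(same_person)[0, 1])  # roughly 0.65
```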

In recent years, the US justice system has decided to use AI algorithms to assist in the

decision making process of incarcerating criminals. The company Northpointe has developed a

proprietary algorithm called the Correctional Offender Management Profiling for Alternative

Sanctions (COMPAS), which uses human data to produce a recidivism risk rating, recidivism being the tendency of a criminal to reoffend. The goal of this algorithm is to assist judges and parole boards in making unbiased incarceration decisions. The COMPAS algorithm gives a rating of how likely an individual is to reoffend. Northpointe faces opposition to its algorithm from critics who argue that it makes biased predictions against certain ethnic groups.

Issue

Bias in the classification model trained by the algorithm becomes a problem when one group of people is more heavily represented in the data than other groups. A disproportionate rating can be a reflection of systemic societal factors such as racism. If more of the data provided to the algorithm reflects one ethnic group, then the algorithm will classify that group as more likely to reoffend. The data can be skewed by a social system in which some neighborhoods are primarily populated by one ethnic group and are over-policed (Corbett-Davies et al., 2021). Over-policing can cause individuals to be arrested for trivial crimes, escalating a minor offense into a more serious one. Systemic racism within the policing system can cause the data to become biased and potentially cause the algorithm to stereotype.
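The sketch below illustrates this mechanism with fully synthetic data: the two groups behave identically, but one group's offenses are recorded more often, so a simple model trained on those records assigns that group a higher predicted risk. The detection rates, group sizes, and model are assumptions made only for illustration.

```python
# Minimal sketch (fully synthetic data): if one group's offenses are recorded
# more often because that group is policed more heavily, a model trained on
# those records learns a higher risk rate for that group even when true
# behavior is identical across groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 4000
group = rng.integers(0, 2, size=n)       # group 0 or group 1
true_reoffend = rng.random(n) < 0.30     # same 30% true rate in both groups

# Group 1 is "over-policed": its reoffenses are recorded 90% of the time,
# group 0's only 50% of the time. The training label is the *recorded* event.
detection_rate = np.where(group == 1, 0.9, 0.5)
recorded = (true_reoffend & (rng.random(n) < detection_rate)).astype(int)

model = LogisticRegression().fit(group.reshape(-1, 1), recorded)
print(model.predict_proba([[0]])[0, 1])  # roughly 0.15 predicted risk, group 0
print(model.predict_proba([[1]])[0, 1])  # roughly 0.27 predicted risk, group 1
```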

Stakeholders

America has one of the largest prison populations in the world (Washington, n.d.).

Inmate sentencing decisions affect the overcrowded prison system, as well as the safety of

America’s streets. With so many inmates coming into the system, the ability to assess every case

and give a fair evaluation to each can be difficult. One set of stakeholders in the COMPAS dilemma consists of the judges and parole boards the COMPAS algorithm was designed to assist, along with the company Northpointe, which profits from the algorithm. Other stakeholders are opposed to the Northpointe COMPAS algorithm. The opposing stakeholders include ethnic groups and their allies who feel the data used to create the COMPAS algorithm was biased from the start. These stakeholders have provided studies showing the algorithm is weighted heavily against people of color, and they believe the algorithm produces biased decisions based on the biased statistical data used to create it, therefore making the algorithm unethical.

Stakeholder 1: Against the Algorithm

Values: In America, people have been fighting for civil rights for centuries, and the fight

continues in the age of technology. Stakeholders against the algorithm value equity and fairness

in society, and fight against systemic racism. Stakeholder(s) 1 believes the COMPAS algorithm

is threatening equity for defendants looking for parole or facing incarceration.

Position: Stakeholders, such as civil rights activists, hold the position that the COMPAS

algorithm should not be used due to the bias in the predictions made against people of color. The

fundamental stance is doubt about the accuracy of the algorithm due to systemic racism and racial profiling. The overall feeling of Stakeholder(s) 1 is skepticism.

Claims: Stakeholders against the COMPAS algorithm have accrued claims of fact to support their position. ProPublica, a nonprofit newsroom, conducted a study to determine whether the COMPAS algorithm is biased. The study looked at more than 7,000 criminal defendants in Broward County, Florida, and followed the defendants over a two-year period. According to the study, populations of color in the county were rated at a higher risk of recidivism than white populations. The data from the study show that black defendants who did not reoffend were misclassified as high risk more often than white defendants, and conversely, white defendants who did reoffend were misclassified as low risk more often than black defendants (Larson et al., 2016). The ProPublica study demonstrates the inaccuracy of the COMPAS algorithm.
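The kind of comparison ProPublica performed can be expressed as a pair of error rates per group. The sketch below uses invented tallies, not ProPublica's actual counts, to show how a gap in false positive and false negative rates between two groups would be computed.

```python
# Illustrative sketch (hypothetical counts, not ProPublica's actual data):
# the kind of error-rate comparison the study performed. "False positive"
# here means a defendant rated high risk who did not reoffend within two
# years; "false negative" means one rated low risk who did reoffend.
def error_rates(high_risk_no_reoffend, total_no_reoffend,
                low_risk_reoffend, total_reoffend):
    false_positive_rate = high_risk_no_reoffend / total_no_reoffend
    false_negative_rate = low_risk_reoffend / total_reoffend
    return false_positive_rate, false_negative_rate

# Hypothetical tallies for two groups of defendants.
group_a = error_rates(high_risk_no_reoffend=450, total_no_reoffend=1000,
                      low_risk_reoffend=280, total_reoffend=1000)
group_b = error_rates(high_risk_no_reoffend=230, total_no_reoffend=1000,
                      low_risk_reoffend=480, total_reoffend=1000)

print(f"Group A: FPR={group_a[0]:.0%}, FNR={group_a[1]:.0%}")  # FPR=45%, FNR=28%
print(f"Group B: FPR={group_b[0]:.0%}, FNR={group_b[1]:.0%}")  # FPR=23%, FNR=48%
```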

ProPublica is not the only group that has conducted research on the accuracy of the COMPAS algorithm. Julia Dressel and Hany Farid, a student-professor team from Dartmouth College, conducted a study comparing the judgments of the algorithm with those of untrained people. Four hundred volunteers were recruited for the study from a crowdsourcing website. The volunteers were given information about the defendants and then predicted whether each defendant would reoffend. The study found the volunteers were 63% accurate in their predictions, while the algorithm was 65% accurate (Yong, 2018). The results demonstrate that the COMPAS algorithm is hardly more accurate than untrained people when predicting recidivism.

The company Northpointe has kept the algorithm's details secret for proprietary reasons. How does the computer weigh different variables to make its decisions, and could that weighting produce bias or unreliable results? Northpointe should make clear how the data it collects is used to produce recidivism ratings.

Stakeholder 2: For the Algorithm

Values: The stakeholders who favor the use of the COMPAS algorithm fall into several categories, including potential clients who would use the algorithm for incarceration decision-making as well as the makers of the product itself. People who believe in the COMPAS algorithm are confident in the data science behind it because of the power of statistics. Statistically, the algorithm should be able to compute more accurate and, theoretically, less biased results than a human. The institutions using the algorithm value the ability to expedite decision-making without perceived bias. The proprietors of the algorithm value the revenue the software provides.

Position: Fatigued judges and parole boards appreciate the algorithm for assisting with, and giving mathematical reassurance in, their decision-making. Stakeholders for the algorithm hold the position that there is enough data proving the algorithm is accurate and unbiased. Stakeholder(s) 2 believe the COMPAS algorithm helps keep the streets safe by making mathematically sound predictions that aid sentencing decisions.



Claims: The proponents of the COMPAS algorithm believe the algorithm is a less biased

way to make sentencing decisions. This is a claim of value because it takes a position on what is right or wrong. Traditionally, a judge or parole board making a sentencing decision could carry a pre-learned bias from societal conditioning. Stakeholder(s) 2 believe the algorithm is fair because every inmate answers the same questionnaire the algorithm uses to assess recidivism risk, eliminating human bias. The questionnaire includes questions about "criminal involvement, relationships, lifestyle, personality, familial background, and education level" (Taylor, 2020). The COMPAS algorithm then produces a risk assessment of low, medium, or high.
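Because Northpointe's actual scoring method is proprietary and not public, the following sketch is purely hypothetical; it only illustrates the general shape of the process described above, in which identical questionnaire answers are reduced to a numeric score and binned into a low, medium, or high risk band. The questions, weights, and thresholds are invented.

```python
# Hypothetical illustration only: COMPAS's actual scoring is proprietary.
# This sketch shows the general shape of the process Stakeholder 2 describes:
# every defendant answers the same questionnaire, the answers are reduced to
# a numeric score, and the score is binned into a risk band.
def risk_band(answers, weights):
    """Combine questionnaire answers into a score and bin it."""
    score = sum(weights[q] * value for q, value in answers.items())
    if score < 4:
        return "low"
    if score < 8:
        return "medium"
    return "high"

# Invented questions and weights, for illustration.
weights = {"prior_arrests": 1.0, "age_under_25": 2.0, "employed": -1.5}
answers = {"prior_arrests": 3, "age_under_25": 1, "employed": 1}
print(risk_band(answers, weights))  # "low" (score = 3.0 + 2.0 - 1.5 = 3.5)
```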

Another important claim is that the algorithm works and that the collected data support this claim, according to the stakeholders for the algorithm. The article by Jackson and Mendoza provides facts in the form of tables that support the algorithm's effectiveness. The article goes on to say that multiple independent entities have validated the effectiveness of the algorithm (Jackson & Mendoza, 2020). This is a claim of fact.

Argument Question

Does the data for the recidivism algorithm produce biased predictions, and, if so, should the algorithm be allowed to assist in incarceration decisions?

Stakeholder 1 Argument:

Framework History: The stakeholders who are against the use of the COMPAS

algorithm hold the Virtue Ethics framework. The Virtue Ethics framework is one of the oldest

frameworks in recorded Western history. Plato and Aristotle of ancient Greece are said to have been its original founders. The framework focuses on the ethical virtues, such as wisdom and compassion, that an individual possesses. Having the right virtues allows one to discern right from wrong.

Framework Argument: Judges and parole boards should be able to use the virtues of

wisdom, compassion, and insight when dealing with each individual's case. Allowing machines

to make decisions leaves no room for these virtues to inform reflection on the nuances of each individual's circumstances and case. The COMPAS algorithm has no ability to exercise any virtuous qualities when producing its determinations.

Action: Judges and parole boards should make decisions based on their experience in the

field of criminal justice and the evidence provided from the individual case. The value of one's

life should not be determined by the discretion of a computer. The experience a judge has acquired in discerning fair punishment for a crime cannot be replaced by seemingly biased data, and the nuances of human behavior cannot be captured by data alone.

Outcome: Choosing not to use the algorithm would remove the possibility of a computer producing biased results. Individuals who find themselves in the hands of the law would receive their sentences from a judge rather than a computer.

The potential negative of removing the algorithm from service rests on the possibility that the algorithm makes more accurate and less biased decisions than a human. If the algorithm is in fact accurate and unbiased, then using it would potentially allow more criminals to go free sooner.

Stakeholder 2 Argument:

Framework History: The framework that applies to Stakeholder(s) 2, those in favor of the algorithm, is Kantian ethics. Immanuel Kant lived during the Enlightenment and related the natural world to the supernatural world. The core principle of the framework holds that all rational beings should be held to the same moral standards; in other words, if an action is rational, then it is rational for all.

Framework Argument: The argument is that the algorithm is rational and everyone is treated equally because the algorithm applies the same logic to everyone it assesses. The advocates for the algorithm believe in its accuracy and fairness and would therefore even accept the algorithm being used on themselves.

Action: The COMPAS algorithm is justified and fair due to years of academic research in the areas of criminology, sociology, statistics, and computer science (Jackson & Mendoza, 2020). In this view, the data have proven that the algorithm is accurate and fair; therefore, the algorithm should continue to be used.

Outcome: The positives of using the COMPAS algorithm fall into several categories. Judges and parole boards have statistical support in their decision-making, making their jobs easier. The company Northpointe continues to profit from its product. And criminals receive sentences proportional to the statistical data.

The negative that comes from using the algorithm is that people of color are assigned higher risk ratings by the algorithm than are truly accurate. This is compounded by the inability of the COMPAS algorithm to factor in circumstances from each individual case other than the answers to the questionnaire.

My Position

In my opinion, an algorithm should not be used to make a decision that can only be

analyzed correctly with human intuition, experience, and insight. In many court cases, mental health and other issues are deciding factors. A machine cannot truly analyze and understand a case from statistics alone. As a computer scientist, I understand the power of AI algorithms. However, when it comes to something as complex and irrational as human behavior, I find it difficult to rely on the predictive power of an algorithm.

My beliefs align me with Stakeholder(s) 1, those who stand against using the algorithm.

While I admit both stakeholders have data which supports their position, my core beliefs align

me with those against the COMPAS algorithm. On a topic where both sides have conflicting

data, one has to rely on one's core beliefs to help find a solution. As a computer scientist, I am wary of algorithms that use data collected from the complex experience of being a human being.

My recommendation in this dilemma is to not use the COMPAS algorithm. Each

individual case should be carefully considered with case-specific information. Isolating unbiased data is very difficult, if not impossible, because of systemic racism. The Northpointe company should also make the algorithm more transparent to dispel the perception of bias. This can be done by revealing how the data is weighted in the risk assessment.
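As one illustration of what such transparency could look like, the sketch below fits a simple, openly inspectable logistic model to invented stand-in data and prints its learned weights. COMPAS's real internal form is not public, so the model, feature names, and data here are assumptions for illustration only.

```python
# A minimal sketch of what "revealing how the data is weighted" could look
# like for a transparent model. COMPAS itself is proprietary and its internal
# form is not public; a simple logistic model stands in here, and the feature
# names and data are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["prior_arrests", "age", "employment_status"]

rng = np.random.default_rng(1)
X = rng.normal(size=(500, len(feature_names)))               # stand-in defendant data
y = (X[:, 0] + 0.5 * rng.normal(size=500) > 0).astype(int)   # stand-in outcomes

model = LogisticRegression().fit(X, y)

# Publishing the learned weights lets outside auditors see exactly how each
# input moves the risk score, instead of trusting a black box.
for name, weight in zip(feature_names, model.coef_[0]):
    print(f"{name}: {weight:+.2f}")
```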

Conclusion

Each side in the COMPAS algorithm debate has data to support its position, so one's own beliefs decide how one interprets the data. Criminals would still be sentenced either way, and the potential for biased decisions exists with or without the algorithm. This is an example of the age-old ethical problem of discrimination and bias surfacing again in society's AI technology.

References

Corbett-Davies, S., Pierson, E., Feller, A., & Goel, S. (2021, December 7). A computer program

used for bail and sentencing decisions was labeled biased against blacks. It's actually not

that clear. The Washington Post. Retrieved October 9, 2022, from

https://www.washingtonpost.com/news/monkey-cage/wp/2016/10/17/can-an-algorithm-b

e-racist-our-analysis-is-more-cautious-than-propublicas/

Jackson, E., & Mendoza, C. (2020, March 31). Setting the record straight: What the COMPAS core

risk and need assessment is and is not. Harvard Data Science Review. Retrieved

September 16, 2022, from https://hdsr.mitpress.mit.edu/pub/hzwo7ax4/release/7

Larson, J., Angwin, J., Kirchner, L., & Mattu, S. (2016, May 23). How we analyzed the COMPAS

recidivism algorithm. ProPublica. Retrieved September 16, 2022, from

https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm

Taylor, A. (2020, September 13). AI prediction tools claim to alleviate an overcrowded American

justice system... but should they be used? Stanford Politics. Retrieved September 24,

2022, from

https://stanfordpolitics.org/2020/09/13/ai-prediction-tools-claim-to-alleviate-an-overcrow

ded-american-justice-system-but-should-they-be-used/#:~:text=COMPAS%20takes%20i

nto%20account%20a,capability%2C%20and%20more%20are%20considered.

Washington, A. L. (n.d.). How to argue with an algorithm: Lessons from the COMPAS-ProPublica

debate. Colorado.edu. Retrieved September 24, 2022, from

http://ctlj.colorado.edu/wp-content/uploads/2021/02/17.1_4-Washington_3.18.19.pdf

Yong, E. (2018, January 29). A popular algorithm is no better at predicting crimes than random

people. The Atlantic. Retrieved September 24, 2022, from

https://www.theatlantic.com/technology/archive/2018/01/equivant-compas-algorithm/550

646/
