Algorithms and Ethics: Moral Considerations in AI

(Image: Prath/Shutterstock)

The city of Chicago is using an algorithm to predict which individuals are likely to be the victim or perpetrator of a crime. It sounds like the premise behind the TV show Person of Interest. But it's not TV, it's real. Chicago, a city that has seen its crime rate surge, is using this algorithm in an attempt to help get crime under control.

That's a good thing if it can help reduce crime. But there are ethical concerns about using data in this way. Do we want to predict the likelihood of someone committing a crime, like in the film Minority Report, and incarcerate them before the crime even happens?

What about other ethical concerns, like coding human biases into the algorithm? Such ethical questions will be raised more frequently as we humans rely on AI to help us make decisions and predict outcomes. The ethics and morality of AI will be the final topic in our current AllAnalytics Academy.

Chicago's Crime Algorithm

Chicago's algorithm has been used to create something called the Strategic Subject List or SSL, according to a report about the algorithm published by the New York Times this week. The algorithm is applied to arrested subjects to prioritize resources on those who pose the highest risk. Risk scores range from 0 to 500, and nearly 400,000 people have been scored.

Chicago is keeping the factors that go into its algorithm a secret, although the report notes that the algorithm does not use variables that could introduce bias, such as gender, race, or geography. It's truly a black box that spits out scores.

However, the New York Times reporters who wrote the story have taken their work a step further by using the publicly available data that Chicago has released and reverse-engineering the impact of each characteristic on the final risk scores. The writers say they used a linear regression model. Characteristics factored into the scoring include number of assault and battery incidents as a victim, number of shooting incidents as a victim, number of arrests for violent offenses, age per decade, gang affiliation, and more.
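The reverse-engineering approach the reporters describe can be sketched in a few lines: given released (characteristics, score) pairs, fit a linear regression and read each coefficient as that characteristic's estimated impact on the risk score. This is a minimal illustration only; the feature names and data below are invented for the example, not Chicago's actual SSL data or the Times' actual model.

```python
import numpy as np

# Synthetic stand-in for the released dataset: per-person counts of a few
# characteristics (names are hypothetical) and an SSL-style risk score.
rng = np.random.default_rng(0)
n = 1000
features = ["shooting_victim_count", "assault_victim_count",
            "violent_arrests", "age_decade", "gang_affiliation"]
X = rng.integers(0, 5, size=(n, len(features))).astype(float)

# Invented "true" weights, used only to generate example scores.
true_w = np.array([40.0, 25.0, 15.0, -30.0, 2.0])
scores = X @ true_w + 250 + rng.normal(0, 5, size=n)

# Ordinary least squares: append an intercept column and solve.
A = np.hstack([X, np.ones((n, 1))])
coef, *_ = np.linalg.lstsq(A, scores, rcond=None)

# Each fitted coefficient estimates how many points one unit of that
# characteristic adds to (or subtracts from) the final score.
for name, w in zip(features, coef[:-1]):
    print(f"{name:>24}: {w:+.1f} points per unit")
```

On data actually generated by a roughly linear scoring formula, the fitted coefficients recover the underlying weights closely, which is why this kind of probe can reveal, for example, that age dominates the score while gang affiliation barely moves it.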

According to the report, the most significant characteristic for SSL risk scores is the age of the potential victim or offender. The report notes that no one older than 30 falls within the highest risk category. Yet, arrests for domestic violence, weapons, or drugs were found to be less predictive for future crime involvement, and gang affiliation had barely any impact on the risk score.

The article is a really interesting look into how the scores work. It's unclear whether the algorithm and its use are making a dent in Chicago's crime problem.

But it does raise interesting issues about how we as a society use data. What is ethical? What is moral? How can we avoid bias?

The ethical and moral questions around artificial intelligence are the subject of our final AllAnalytics Academy session on Thursday, June 15 at 2 pm ET/11 am PT. We're excited to welcome Rumman Chowdhury, a senior manager at Accenture, to talk about the place where human ethics meet AI and machine learning, and how organizations should tackle these issues. We hope you can join us on Thursday, and you can register at any time at this link.

Jessica Davis, Senior Editor, Enterprise Apps, Informationweek

Jessica Davis has spent a career covering the intersection of business and technology at titles including IDG's Infoworld, Ziff Davis Enterprise's eWeek and Channel Insider, and Penton Technology's MSPmentor. She's passionate about the practical use of business intelligence, predictive analytics, and big data for smarter business and a better world. In her spare time she enjoys playing Minecraft and other video games with her sons. She's also a student and performer of improvisational comedy. Follow her on Twitter: @jessicadavis.


Re: predicting crime
  • 6/30/2017 5:01:14 PM

I for one am surprised that we can take this beyond scientific curiosity into considering real-life implementation. We are confined within the boundaries of a Constitution that goes to great pains to protect individual citizens from punishment without due process. I doubt this would come close to satisfying the burden of due process, especially since it's a black-box system.

Re: predicting crime
  • 6/28/2017 2:33:28 PM

I would like to see people more willing to try new ideas rather than being rigid in their thinking. People tend to think in frozen and polarizing ways. For example, instead of thinking "higher or lower taxes," we should think "what does the nation need right now," with the understanding that we can change things one way and, if it doesn't work, change them back.

Re: predicting crime
  • 6/28/2017 8:07:12 AM

"I think as a society we have to really think through what our real goals are, whether that be punishment and deterrence or social harmony and justice."

@Sethbreedlove you hit on the crux of the issue. I'm sure that has been debated in numerous ethics classes.  BTW if you are ever in the area of Philadelphia, you can visit the Eastern State Penitentiary. What's fascinating about the tour is the philosophy behind the original design of the jail. They did believe that keeping prisoners in solitary confinement would have beneficial effects on the individual, allowing him to reflect on his crime and resolve to do better. I should think that many people were driven mad by this treatment, despite the noble intentions.

Re: catch-22
  • 6/27/2017 10:57:28 PM

@kq4ym I couldn't agree more that this topic is going to be hotly debated for centuries to come. I don't believe this goal can be carried out in a purely objective way. When the issue involves winners and losers, those who come up short, or even worse, those profiled into the losing group, will always be upset, as they should be.

I can't even think of an example of pure objectivity at the moment which is probably a statement about Society best left for another time and place.

Re: predicting crime
  • 6/27/2017 10:41:55 PM

@Ariella, I've discussed your article "Predictive Analytics Head to Jail" with others several times to highlight both the positive and negative factors of predictive analytics.

I fear that as a society we create career criminals every day. Once someone gets a record, it becomes much more difficult to find employment. And when people are unemployed and in jail, they are not building any nest eggs or retirement. So desperation leads back to more crime.

I think as a society we have to really think through what our real goals are, whether that be punishment and deterrence or social harmony and justice. I hope we find a solution that really does have the betterment of society in mind.

Re: catch-22
  • 6/22/2017 11:49:01 AM

Such issues are going to be fraught with lots of controversy for a long time, I'm guessing. Calling into play all the moral issues of right and wrong, and the social issues of whether we should call out groups that may be shown by data to be victims or perpetrators, will surely lead to some angry debates between proponents and critics of such programs.

Re: catch-22
  • 6/16/2017 4:53:19 PM


Stacy writes:

Eliminating attributes for social reasons without at least vetting them introduces social bias into a mathematical algorithm that should value accuracy over political correctness, especially when the issue is crime.  Maybe such attributes are important, maybe not.  Let a mathematical selection algorithm make that decision. 

The main issue or problem in any kind of predictive model like this includes the selection of attributes, the data inputs, the weights applied in the components of the predictive algorithms, etc. In other words, humans design the mathematical algorithms, and their biases and assumptions, influenced by the context of societal experiences and personal reactions, can affect how the predictive model operates.

As I've described in previous discussions, I have personal experience of this in predictive modeling for urban and transportation planning, where assumptions and biases of the model designers did influence the type of models deployed and the "predictions" rendered by the models (which were interpreted as recommendations by decisionmakers). 


No Easy Answers When Both Views Have Merit
  • 6/16/2017 2:28:43 PM

From 6/20/14:


  • 6/14/2017 8:13:15 PM

The issue is also with the bias of ethics.  Eliminating attributes for social reasons without at least vetting them introduces social bias into a mathematical algorithm that should value accuracy over political correctness, especially when the issue is crime.  Maybe such attributes are important, maybe not.  Let a mathematical selection algorithm make that decision. 

Re: predicting crime
  • 6/14/2017 12:23:31 PM

Very interesting technology. Predicting crimes is important for adequate staffing, etc., but the ultimate goal would be to use the data to prevent crimes. It is the situation we face with monitoring possible terrorists: we know they might commit a crime, but we are not always able to stop them. How do we get to that point? The most recent London attacks highlighted that issue, where a suspected terrorist was able to rent a truck; the integration from prediction to prevention doesn't exist yet.
