AI built to predict future crime was racist

The company Northpointe built an AI system designed to predict the likelihood that an alleged offender will commit another crime. The algorithm, dubbed “Minority Report-esque” by Gawker (a reference to the Philip K. Dick short story and the dystopian film based on it), was accused of racial bias: black offenders were more likely than offenders of other races to be flagged as high risk for committing a future crime. Another media outlet, ProPublica, also found that Northpointe’s software wasn’t an “effective predictor in general, regardless of race.”
