UK's 'Murder Prediction' Tool Sparks Alarming Debate Over Ethics and Bias

The idea sounds like science fiction: a computer system that predicts who's most likely to kill. But it's not a movie plot. It's real — and it's happening in the United Kingdom.
What is the UK's 'murder prediction' project?
Commissioned under Prime Minister Rishi Sunak's government in 2023, the so-called "murder prediction tool" is a data science project developed by the UK Ministry of Justice (MoJ), according to the Guardian. Internally dubbed the "Homicide Prediction Project" — though later softened to "Sharing Data to Improve Risk Assessment" — the tool aims to use criminal justice data to identify individuals who may be at higher risk of committing serious violent crimes, including murder.
Though officials say it remains a research project, critics say the implications are anything but theoretical.
The tool analyzes vast amounts of personal data, including names, birthdates, ethnicity, and mental health records, drawn from multiple sources — including police forces in Greater Manchester and London, as well as the Probation Service. Documents obtained through Freedom of Information requests reveal that up to half a million people could be included in the analysis, some of whom may never have been convicted of a crime.
Why did the UK government create it?
The MoJ says the goal is simple: improve public safety. Officials argue that by studying past offenders and identifying patterns in the data, they can enhance the risk assessment systems already used in the UK's probation and prison services.
As reported by the Guardian, a Ministry spokesperson said, "This project is being conducted for research purposes only," emphasizing that it builds on existing models like the Offender Assessment System (OASys), which has been in use since 2001 to evaluate reoffending risks.
But critics warn that this new effort goes far beyond standard risk analysis — and may dangerously cross ethical lines.
Is it being tested — and is it working?
So far, the project remains in the research phase. According to internal MoJ timelines, the study was expected to conclude by December 2024, after which the data would be deleted and findings shared with stakeholders. There is no official word yet on whether the tool has shown any predictive success.
That hasn't stopped backlash.
Civil liberties group Statewatch, which first uncovered the project, calls it "chilling and dystopian," according to the Guardian. Researcher Sofia Lyall warned that the tool risks amplifying racial and socioeconomic biases already embedded in the criminal justice system.
"This latest model will reinforce and magnify the structural discrimination underpinning the system," Lyall said, adding that "using sensitive data on mental health, addiction, and disability is highly intrusive and alarming," according to The Register.
What data is being used — and who's affected?
While officials insist the data comes only from convicted offenders, documents obtained by Statewatch suggest otherwise. The types of data listed in agreements between the MoJ and police forces include:
- Age at first contact with police (even as a victim)
- Records of domestic abuse
- Markers of mental health struggles, suicide attempts, and substance use
- Disabilities and other indicators of vulnerability
These details, critics argue, create a profile that could unjustly flag people in crisis — not people at risk of committing homicide.
Is this the first system of its kind?
The UK is not the first country to experiment with predictive policing. The United States has deployed algorithmic tools for years to assess risk in criminal sentencing and policing — often with troubling results. AI systems have been shown to misidentify people of color, mislabel individuals as threats, and contribute to over-policing in marginalized communities.
What sets the UK project apart is its ambition and scale — and the fact that it may use deeply personal, non-criminal data to make predictions about future crimes.
References:
- UK creating 'murder prediction' tool to identify people most likely to kill
- UK officials insist 'murder prediction tool' algorithms purely abstract