When 81 Years of Detective Work Becomes 30 Hours

Both artificial intelligence and criminology are close to my heart, so when I came across the recent Sky News report about an AI tool that can apparently condense ‘81 years of detective work’ into just 30 hours, I felt both a surge of excitement and an instinctive need to investigate.

Avon and Somerset Police are trialling the system, which has been developed in Australia and is called Soze. Their hope is that it might unlock breakthroughs in long-forgotten cold cases.

It’s an impressive claim. By trawling through thousands of hours of footage, financial transactions, social media activity and archived documents, Soze can highlight patterns and connections that might otherwise lie buried for decades.

There is no doubt that the prospect is tantalising. Imagine cases that have languished in the archives for years suddenly given new life because an algorithm was able to connect a piece of video with an overlooked financial record. The families of victims might finally receive answers, and the wider public could regain faith in a justice system often accused of being too slow or too under-resourced to deal with historical crime. The scale and speed at which this tool operates have the potential to free up police officers to concentrate on the aspects of their work that only humans can do well: interpreting context, engaging communities, and weighing up the nuances of human behaviour.

Perhaps unsurprisingly to those who know me, I can’t view this news without also considering what lies beneath the surface of innovation.

Cold case files are not neutral data sets.

They are products of their time, shaped by the biases and oversights of earlier investigations. Communities that were over-policed in the past, or groups that were systematically ignored, are inevitably represented unevenly in the historical record. If an AI system processes this kind of skewed data uncritically, it risks reinforcing the very injustices we ought to be correcting.

In other words, the tool may highlight ‘leads’, but these could just as easily direct investigators down paths influenced by historic prejudice rather than fresh evidence.

There is also the question of quality. Evidence collected over the years is often messy and inconsistent. Old CCTV might be grainy, metadata corrupted, witness statements incomplete or contradictory. AI does not have the wisdom to distinguish between reliable and unreliable material; it will churn through whatever it is given – and even fill in gaps with hallucinated detail. This could generate dozens of supposed clues, most of which may turn out to be nothing more than noise. Chasing such leads would waste police time – the opposite of what Soze is aiming to achieve. In the worst case, it risks turning suspicion unfairly onto individuals who have nothing to do with the case.

Another issue is transparency. If Soze does indeed suggest a new line of inquiry, will investigators and, eventually, the courts be able to understand how it arrived at that suggestion? If the reasoning is buried inside a black box of algorithms, defence teams will rightly challenge the reliability of that evidence, and public trust may falter. In a justice system where accountability and fairness are non-negotiable, we cannot simply accept a machine’s conclusion without explanation.

And then there are the wider ethical dilemmas. Accessing and processing social media data, emails, or financial records on such a vast scale raises profound questions about privacy. Whose consent has been given? What safeguards protect those whose personal information is swept up in these trawls, and what limits exist to prevent misuse? Without clear boundaries, the rush to solve crimes could come at the cost of eroding civil liberties.

I find myself both hopeful and cautious. AI absolutely has a place in the future of policing and criminology, but its role must be one of augmentation, not replacement. The instinctive skills of seasoned detectives, the empathy shown when speaking with victims’ families, the contextual awareness of a community officer – these are not things a machine can replicate. We should see AI as an assistant, a magnifying glass that helps us notice what might otherwise be overlooked, but never as the detective itself.

Looking ahead, I suspect we will see systems like Soze becoming even more sophisticated. It is easy to imagine a future in which multiple streams of evidence – video, audio, geospatial data, digital communications – are fused together into probabilistic reconstructions of events. Investigators may be presented with scenarios that outline what could have happened, supported by probabilities and patterns that can then be tested in the field. This could transform not only cold-case work but also the way active investigations are pursued. At the same time, we will almost certainly witness legal battles over the admissibility of AI-generated leads, and pressure will mount for international guidelines or charters to regulate how such tools are used.

The real test of success will not be whether AI can close cases quickly or cheaply. It will be whether it can help deliver justice in a way that is transparent, fair and respectful of human rights. For me, that is where the promise truly lies. If we can harness AI to uncover new evidence while simultaneously keeping ethical integrity at the heart of its use, we may be on the brink of a new era in criminology – one that balances speed with fairness, and innovation with accountability.
