UK Police State Turns to Digital ID, Facial Recognition Tech, and 'Murder Prediction' Program
Inching closer to the China blueprint: total surveillance, zero tolerance for dissent, and an even darker path into uncharted tyrannical waters.
Let us kick off with a succinct summary of the UK government’s 10 “recommendations” from the Crime and Justice Commission, as captured by Kit Knightly from Off-Guardian:
How are we going to fix everything?
We’re going to replace jury trials.
We’re going to remove online anonymity and legally limit access to social media.
We’re going to introduce facial recognition technology.
And we’re going to implement mandatory digital identity.
Yes, to the surprise of absolutely nobody, the report’s much-anticipated list of “recommendations” amounts to “we should do everything we’ve been planning to do for years”.
Now, some people might argue “it doesn’t matter what this commission recommends, they aren’t officially in power”. And that might be a fair point if being “in power” really meant anything.
Besides, the Lord Chancellor has already said that some of these recommendations will likely be made law in the near future.
Shall we try and guess which ones?
The full list of recommendations from the Crime and Justice Commission:
1. Introduce a universal digital ID system to drive down fraud, tackle illegal immigration and reduce identity theft;
2. Target persistent offenders and crime hotspots using data to clamp down on shoplifting, robbery and antisocial behaviour;
3. Roll out live facial recognition and other artificial intelligence tools to drive the efficiency and effectiveness of the police;
4. Create a licence to practise for the police, with revalidation every five years, to improve culture and enhance professionalism;
6. Introduce a new intermediate court with a judge and two magistrates to speed up justice and reduce court delays;
7. Move to a “common sense” approach to sentencing with greater transparency about jail time, incentives for rehabilitation and expanded use of house arrest;
8. Give more autonomy and accountability to prison governors with a greater focus on rehabilitation and create a College of Prison and Probation Officers;
9. Restrict social media for under-16s to protect children from criminals and extreme violent or sexual content;
10. Raise the minimum age of criminal responsibility to 14 to take account of new developments in neuroscience.
The full report, published by The Times newspaper on 14th April 2025, can be viewed here as a PDF, or downloaded below:
There is some interesting imagery in the report, amidst the large-font statistics and quotes on knife crime and abuse - evocatively deployed to rally public support for accepting their digital panopticon enslavement, (un)naturally.
Here are photos, cover art, and quotes from the report.
The cover page instils a warm and fuzzy feeling in this native Brit… Or is it a sense of dread and impending doom?!
Alright, settle down now. Let’s meet the nice people who commissioned the report.
“If we want safer streets we’re going to have to start thinking about how technology can help us.”
“55 rape suspects arrested in London thanks to the use of live facial recognition technology.”

“This is precision policing. It’s allowing us to pinpoint people who are of interest to us. That can only improve trust and confidence in policing.”
This is reminiscent of the photo used by Greater Manchester police when they doxxed a man for “causing racially & religiously aggravated intentional harassment, alarm, distress” via a tweet…

The focus of the image above is to galvanise support and empathy for the police, pictured in the aftermath of the riots that followed the Southport attack - which is referred to merely as “a brutal knife attack”, as if the knife had a mind of its own. There is no mention of the horrific act itself: Axel Rudakubana murdered young children with a knife, and his name and photo were withheld by police for a long time.
Back to the report…

“It’s hard to see how you tackle illegal immigration without a digital identity system.”

Next stop:

Soviet era saying:
"Show me the man and I'll make him fit the crime"
There are variations of this saying, which is supposedly attributable to Lavrentiy Beria, head of the Soviet secret police, who boasted that he could fabricate a crime for any individual, regardless of their innocence.
THE CHINA MODEL
China’s panopticon grid has been live since at least 2020. Those with poor social credit scores - or, even more brazenly, those critical of the government - are named, shamed, and doxxed on large screens throughout the major cities.
They are also blocked, via their digital ID, from boarding trains and planes. I uploaded this clip in 2022 via Odysee - it shows the true reality of so-called ‘SMART cities’, if we allow them to come to fruition:
China was always the blueprint - the testing ground for every authoritarian measure enforced digitally and algorithmically.
PRE CRIME
How else might the UK police state hit higher quotas of imprisoning its citizens for thought crime? How about not bothering to wait around for a crime to even be committed? Arrests may soon be based on algorithmically determined predictive behaviour, genealogy, and whatever else they decide makes the man fit the pre-crime.
If you have seen something predictively programmed in tee-vee or film, then you can bet your bottom devaluing fiat dollar that it is coming to a city near you soon.
Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner, came across as giddy with excitement at the prospect of pre-crime in his article entitled ‘Can AI predict who will commit crime?’
Will AI be able to tell us who is going to commit crimes in the future? Once a purely fictional question, probabilistic policing is now getting factual attention.
[…]
Predictive AI isn’t clairvoyance; it takes existing verifiable information and makes inferences from it. Machine Learning computes probability from multiple data points and, in areas where there are millions of factors to be crunched - such as the weather - it’s delivering exciting advances. Intelligent forecasting offers huge potential benefits and sectors such as energy are already using Predictive AI to combine historical data and model complex simulations to inform resource allocation.
[…]
Predictive profiling relies on past behaviour being a good indicator of future conduct. Is this a fair assumption?
[…]
One mark of criminal effectiveness is the absence of any previous convictions, so we need some other data points to feed into the AI predictor. What should we use? Arrests, acquittals, acquaintances? Stop and search history? Maybe genealogy and appearance? Nineteenth century scientists believed they could predict criminal disposition based on facial features, a claim that some have also made about AI. Is that intelligent forecasting? It feels like Minority Report with the emphasis historically on the former.
Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner, is Professor of Governance and National Security at CENTRIC (Centre for Excellence in Terrorism, Resilience, Intelligence & Organised Crime Research) and a non-executive director at Facewatch.
The coordination of these trial-balloon articles - dropped to gauge public reaction at the same time as official government announcements - is always uncanny…
The idea is the literal plot line of Philip K. Dick’s dystopian story ‘The Minority Report’, where individuals are arrested before they can commit crimes, thanks to a ‘precrime’ predictive policing system.
It’s exactly the same thing, minus the mutant human precogs. In reality the program will be much more boring, using ‘algorithms’ to churn through data and spit out results.
Given that the data will presumably include spicy social media posts, which are now being punished with prison sentences in the UK, you can see where this is going.
[...]
The Guardian reports:
The UK government is developing a ‘murder prediction’ programme which it hopes can use personal data of those known to the authorities to identify the people most likely to become killers.
Researchers are alleged to be using algorithms to analyse the information of thousands of people, including victims of crime, as they try to identify those at greatest risk of committing serious violent offences.
The scheme was originally called the ‘homicide prediction project,’ but its name has been changed to ‘sharing data to improve risk assessment.’ The Ministry of Justice hopes the project will help boost public safety but campaigners have called it ‘chilling and dystopian.’
May you live in interesting times, dear reader.
Nicholas Creed is a Bangkok based writer. Any support is greatly appreciated. If you are in a position to donate a virtual coffee or crypto, it would mean the world of difference. Paid subscribers can comment on articles, videos, and podcasts, and also receive a monthly subscriber newsletter.
Email: nicholas.creed@protonmail.com with information and newsworthy stories for open source intelligence gathering to support this Substack, thank you.
Bitcoin address:
bc1p0eujhumczzeh06t40fn9lz6n6z72c5zrcy0are25dhwk7kew8hwq2tmyqj
Monero address:
86nUmkrzChrCS4v5j6g3dtWy6RZAAazfCPsC8QLt7cEndNhMpouzabBXFvhTVFH3u3UsA1yTCkDvwRyGQNnK74Q2AoJs6