Hi I'm a Concerned campaigner about climate & against the idiots running/destroying the world! I'm also a Wine & chilli fan been growing them in the UK (yes both!) since 1974, also a BEng, (Mech engineer) & Computing + IT systems eng, An early adopter & evangelist of the internet having one of the first email addresses in East London. Now I knit MSM lies with political vomit to show how they hate us all, including their own grannies!
The government want to use AI to criminalise your child
The UK government is rolling out an AI-empowered predictive policing system to predict the “likelihood” of criminal offences among adolescents. Open Rights Group is warning of the risks and limitations of using AI to identify the “most likely offenders.”
In conversation with the Canary, Mariano delli Santi of Open Rights Group explained that these systems are likely to single out children in care and flag them to authorities for targeted interventions. In other words, so-called predictive data models will be used to target vulnerable children.
AI is racist because it mimics society
Delli Santi said:
They say they want to help, that they will use this system to target children who are at risk of criminality with support, and therefore to prevent them from becoming criminals. However, the way artificial intelligence and predictive policing work tells us that this may not be the whole story.
These systems will inevitably reflect society’s prejudices and stereotypes, making them inherently racist and classist, as the outcomes of predictive policing in practice have repeatedly shown.
The system, delli Santi explains, risks reproducing:
bias and stereotypes at scale. Black people, migrant people, poor people, people from geographic areas which have been historically over-policed are more likely to be identified as at risk of committing a crime.
https://www.thecanary.co/uk/analysis/2026/03/17/predictive-policing-risks-criminalising-children/