News & Views

Who’s the killer? The rise of pre-crime AI and the quiet targeting of our minds

For the Afro/Indo-Caribbean Diaspora, this ain’t just tech; it’s Tyranny in Disguise

Photo Credit: Artem Podrez

BY SIMONE J. SMITH

Let me ask you a serious question: If someone told you the government had a secret list of people who “might” commit murder in the future… would you believe them? No trials. No crime committed. Just numbers. Just vibes. Just you. Flagged.

Well, believe it, because the U.K. government is currently investing in a chilling new research project that aims to predict who will commit murder before it even happens, using data from police files, health records, trauma histories, and social circumstances. It is being designed to “protect society,” but protect who? From what?

“When your past pain becomes the foundation of their future projections, you are no longer a citizen; you’re a suspect with no crime.”

This ain’t science fiction. This is a real-world algorithmic nightmare, one that echoes the infamous dystopia of Minority Report, only this time the actors are AI engineers, government agencies, and bureaucrats with a little too much faith in their flawed datasets.

Let me break it down through a psychological and human rights lens, especially for our people in the Afro/Indo-Caribbean diaspora, because this technology is racially coded, socially blind, and psychologically abusive.

The U.K.’s “Homicide Prevention Program” pulls from massive datasets that include personal information like mental health struggles, addiction, poverty, race, and trauma history. That means Black folks, immigrants, people from working-class or marginalized backgrounds: we get tagged as “high risk” because of how we live.

Think about that: your experience with violence, your pain from being assaulted, your attempt at therapy. All of that could now label you as a potential murderer in some bureaucratic prediction engine. That is profiling. That is systemic bias, and yes, that is psychological warfare.

In Louisiana, a similar system called TIGER is already being used to make parole decisions. A nearly blind 70-year-old man was denied parole because the algorithm said he might reoffend. This is not justice.

When people ask, “How did you come to this decision?” the answer is, “The math told us to.” That is what experts call the “accountability sink.” Nobody can be blamed, and nobody can appeal. Sound familiar? That’s what systems built for control (not freedom) look like.

Now let’s zoom out. How does this tie into Agenda 2030?

Agenda 2030, on the surface, is a global sustainability plan created by the United Nations, full of goals like ending poverty and building inclusive societies. Sounds good, but when governments start using it to justify mass surveillance, AI policing, and social control under the banner of “safety” and “resilience,” we have to ask: Who is setting the terms? Who benefits? Who gets left behind, or locked up?

One of the targets in Agenda 2030 talks about “reducing violence” and “enhancing the rule of law.” When you use AI to predict criminality based on race, income, or trauma, that is not the rule of law. That is the rule of pattern recognition, and pattern recognition is shaped by past injustice, especially in places where our communities were already overpoliced and underprotected.

So yes, we must talk about “globalist” control. There is a legitimate concern around unchecked tech, centralized data, and elite decision-making without community voice. What happens when machines start deciding who gets freedom, who gets punished, and who never even gets a fair shot? Two words: digital colonialism.

We, the diaspora, cannot afford to be passive. These tools are being tested now, quietly, in labs and policy circles. They will land on our doorsteps. They will shape our children’s futures. We need to understand how power cloaks itself: sometimes in code, sometimes in policy, always in silence.

So, let us not be silent. Let us ask questions. Let us resist, and most of all, let us defend our right to be fully human, not just data waiting to be judged.
