But here's the unsettling twist: this isn't really a story about stopping crime. It's about building a lucrative new product for a prison industry that profits from keeping people locked up. What happens when the goal of surveillance shifts from rehabilitation to revenue?
Quick Summary
- What: Securus is testing AI to predict crimes by analyzing inmate communications.
- Impact: This expands surveillance markets without reducing incarceration, prioritizing profit over safety.
- For You: You'll understand how prison tech often serves corporate interests, not justice.
The Predictive Panopticon Goes Live
In a move that reads like a dystopian tech thriller, Securus Technologies, the telecom giant that monopolizes communication for roughly 1.2 million incarcerated people in the U.S., has begun piloting an AI system designed to scan inmate calls, texts, and emails for signs of planned crimes. According to MIT Technology Review, the company's president, Kevin Elder, revealed they built the tool by training machine learning models on years of historical prison phone and video call data. The stated goal is noble: predict and prevent crimes. But the reality is far more cynical, and it reveals the true drivers behind prison tech innovation.
Why This Isn't About Public Safety
To understand why this development matters, you must first understand the economics of the prison-industrial complex. Securus, owned by the private equity behemoth Platinum Equity, generates hundreds of millions in revenue annually by charging inmates and their families exorbitant fees for basic communication. It operates in a captive market with virtually no consumer choice. The introduction of "predictive" AI surveillance isn't primarily a public safety play; it's a product expansion into a new revenue stream: selling "intelligence" and "risk mitigation" services back to the very correctional facilities that are its clients.
The system, as described, aims to flag conversations containing keywords, phrases, or tonal patterns associated with planning illegal activities, from violence inside facilities to potential crimes upon release. On its face, this targets a legitimate concern. However, the fundamental flaw is that the AI is being trained and deployed within a system rife with perverse incentives. More flagged incidents can justify increased monitoring budgets, longer sentences due to "disciplinary infractions," and extended parole supervision. All of this means more people in the system for longer, generating more communication fees for Securus and more bed-filling revenue for private prisons.
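To make the mechanics concrete, here is a minimal sketch of the kind of keyword-and-weight scoring described above. Every detail is hypothetical: the phrase list, weights, and threshold are invented for illustration, and Securus has not published how its pipeline actually works.

```python
import re

# Hypothetical watchlist: phrases and weights invented for illustration.
# A real system would be far larger and reportedly also uses tonal and
# pattern analysis, none of which is public.
WATCH_PHRASES = {
    "getting a package": 2.0,
    "on the outside": 1.0,
    "handle it": 1.5,
}
FLAG_THRESHOLD = 2.0  # invented cutoff for routing a call to review

def score_transcript(text: str) -> float:
    """Sum the weights of every watched phrase found in the transcript."""
    text = text.lower()
    return sum(
        weight * len(re.findall(re.escape(phrase), text))
        for phrase, weight in WATCH_PHRASES.items()
    )

def is_flagged(text: str) -> bool:
    """Route the conversation for scrutiny once the score crosses the cutoff."""
    return score_transcript(text) >= FLAG_THRESHOLD
```

Note what this design optimizes for: every tuning knob (more phrases, lower threshold) produces more flags, and in the incentive structure described above, more flags are a feature, not a bug.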
The Myth of Neutral Data
Proponents will argue the AI is simply analyzing patterns in neutral data. This is a profound misconception. The training data (years of inmate calls) is itself a product of a racially biased and economically skewed criminal justice system. As experts like Dr. Ruha Benjamin have long argued, AI trained on biased historical data simply automates and amplifies existing prejudices. An algorithm learning from calls in a system where Black and Brown people are disproportionately incarcerated may learn to associate cultural speech patterns or colloquialisms with "suspicious" activity, creating a high-tech feedback loop of discrimination.
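The feedback loop is easy to demonstrate with a toy simulation. In the sketch below, all data is synthetic and all numbers are invented: the only genuine risk signal is "planning language," but the historical flag labels were applied more often to speakers of a harmless colloquialism. A standard classifier trained on those labels dutifully learns to penalize the colloquialism itself.

```python
# Toy simulation of bias amplification; synthetic data, invented numbers.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=0)
n = 50_000

# Feature 0: genuine planning language (the only real risk signal here).
planning = rng.binomial(1, 0.05, n)
# Feature 1: a harmless colloquialism, common in an over-policed group.
colloquial = rng.binomial(1, 0.5, n)

# Historical "flagged" labels: driven by planning language, but also
# inflated for speakers of the colloquialism (biased enforcement).
p_flag = 0.02 + 0.60 * planning + 0.15 * colloquial
labels = rng.binomial(1, p_flag)

X = np.column_stack([planning, colloquial])
model = LogisticRegression().fit(X, labels)
print(model.coef_)  # both weights come out positive: the bias is learned
```

Nothing in the training procedure can separate genuine risk from biased enforcement, because the bias is baked into the labels themselves.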
Furthermore, the context is stripped away. A conversation about "getting a package" could refer to a planned drug drop or a care package from a grandmother. A mention of "seeing someone on the outside" could signal a threat or a plan for a job interview. Without human nuance and context, the risk of false positives is immense, with severe consequences for the person flagged.
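Running the hypothetical scorer from the earlier sketch makes this concrete: both readings contain the same phrase, so the context-free matcher flags them identically.

```python
# Continues the hypothetical scorer sketched earlier (is_flagged).
benign  = "Grandma says I'm getting a package with cookies next week."
ominous = "I'm getting a package Tuesday. Keep it off the phone."

print(is_flagged(benign), is_flagged(ominous))  # True True
```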
The Chilling Effect on Rehabilitation
The most immediate and damaging impact may be on rehabilitation itself. Meaningful rehabilitation requires honest communication with family, counselors, and lawyers. If every word is scanned by an opaque algorithm that could extend an inmate's stay or add restrictions, it creates a pervasive chilling effect. People will stop discussing their fears, their struggles, or their plans for life after prison, knowing any ambiguity could be misconstrued. This undermines the very social bonds and support networks proven to reduce recidivism.
Securus's Elder told MIT Technology Review the company began building these tools in response to client demand. This is the key insight: the customer is the prison, not the prisoner, and certainly not the public. The product is designed to meet the prison's stated need for "security" and "control," not society's need for successful reintegration. It optimizes for institutional efficiency, not human outcomes.
What's Next: The Normalization of Pre-Crime Surveillance
The pilot at Securus is not an endpoint; it's a beachhead. The technology and business model, if normalized here, will inevitably expand. The logic will be seductive: "If it works in prisons, why not for parolees on ankle monitors? Why not in high-crime neighborhoods? Why not scan public social media?" We are witnessing the operationalization of "pre-crime" surveillance in the space with the fewest legal protections and the most vulnerable population.
Legally, inmates have severely diminished Fourth Amendment rights. Courts have generally allowed broad monitoring of prison communications for security reasons. This makes the carceral system the perfect testing ground (a legal gray zone) for technologies too controversial for the general public. Once refined and legitimized there, the barrier to wider deployment crumbles.
The Real Innovation Isn't Technical
The takeaway is stark. The innovation here isn't in the AI, which uses established pattern-matching techniques. The real innovation is commercial: crafting a compelling sales pitch that turns the pervasive surveillance of a captive population into a value-added service. It's a case study in how technology follows money and power, not necessarily progress or justice.
If we care about building a safer society, we should be deeply skeptical of tools that profit from predicting failure and expanding control. We should invest instead in technologies and programs that foster connection, education, and opportunity: things that genuinely reduce crime but don't create quarterly revenue for private equity firms. The story of Securus's AI isn't a story about stopping crime. It's a story about monetizing the expectation of it.