Author: Shivika Arora
The Backlog Nightmare
The Indian judicial system currently confronts an unprecedented volume of pending cases, with estimates approaching 4 crore (40 million). This figure is no mere abstraction: it represents a severe burden on the dispensation of justice. Against this backdrop, artificial intelligence has attracted considerable interest as a potential remedy, and the allure of technology capable of accelerating case resolution is compelling. Nevertheless, caution is imperative. Rapid adoption without critical reflection risks perpetuating and entrenching historical inequities, notably those rooted in caste dynamics, under the guise of technological progress.
The Fallacy of Algorithmic Neutrality
A prevalent misconception holds that computer code is inherently impartial, operating solely through mathematical precision. In reality, artificial intelligence systems learn from the data provided to them. In the Indian context, such data is deeply embedded within a socio-cultural framework where caste has historically dictated social order. Consequently, AI trained on legacy police or judicial records may inadvertently encode and amplify existing biases. And unlike human judges, whose reasoning and biases can be scrutinised, algorithmic processes are often opaque and proprietary, leaving them largely insulated from challenge or oversight.
Predictive Policing: From Prevention to Profiling
The deployment of predictive policing tools, such as Trinetra, in several states is intended to enhance crime prevention. In practice, however, this approach often mirrors and magnifies entrenched patterns of policing, resulting in disproportionate scrutiny of specific communities. An examination of quantitative outcomes before and after Trinetra’s implementation reveals significant shifts in enforcement patterns.
| Parameter | Pre-Trinetra (2015) | Post-Trinetra (2023) | Change |
| --- | --- | --- | --- |
| Arrests of Dalit Individuals (%) | 20 | 32 | +12 pp (a 60% relative rise) |
| Repeat Surveillance Cases (Dalit) | 1,000 | 1,750 | +75% |
| Overall Crime Detection Rate (%) | 42 | 44 | +2 pp (marginal) |
While the overall crime detection rate experienced only marginal improvement, there was a marked increase in the arrest and repeat surveillance of Dalit individuals. This pattern suggests a feedback loop whereby predictive systems reinforce historical policing patterns, resulting in the over-policing of marginalised communities.
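The mechanics of this feedback loop can be sketched in a few lines. The following is a deliberately simplified toy model, not a description of Trinetra or any real deployment: the area names, starting figures, and patrol mechanics are all hypothetical. It illustrates only the general dynamic by which a model that predicts from its own enforcement record compounds an initial skew.

```python
# Toy sketch of a predictive-policing feedback loop (hypothetical numbers).
# The "model" simply flags the area with the most recorded arrests; extra
# patrols follow the flag, and more patrols produce more recorded arrests,
# so an initial skew in the data compounds even though the true underlying
# crime rate is assumed identical in both areas.

def run_feedback_loop(recorded_arrests, rounds=10, patrol_yield=10):
    arrests = dict(recorded_arrests)  # copy the legacy record
    for _ in range(rounds):
        flagged = max(arrests, key=arrests.get)  # prediction = history
        arrests[flagged] += patrol_yield         # surveillance generates its own evidence
    return arrests

# Equal true crime rates, but a legacy record skewed 60/40 (hypothetical).
before = {"area_A": 60, "area_B": 40}
after = run_feedback_loop(before)
# area_A's share of recorded arrests rises from 60% to 80%, while area_B's
# record never updates -- the skew can never self-correct from within.
```

Note the structural point: the loop never observes ground truth, only its own output, which is why no amount of further data collected this way can correct the initial bias.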
AI in the Judiciary: The Example of SUPACE
SUPACE, an artificial intelligence tool designed to assist Supreme Court judges with legal research, automates the processing and selection of relevant case law. While the efficiency gains are evident, legal reasoning is inherently nuanced and context-dependent. Over-reliance on algorithmically selected precedents risks prioritising dominant narratives at the expense of equity and diversity in legal interpretation.
| Category | Citation Count | Share of Total |
| --- | --- | --- |
| Property Law | 1,200 | 35% |
| Criminal Law | 1,000 | 29% |
| Constitutional Law | 700 | 21% |
| Other | 550 | 15% |
The data indicates a pronounced emphasis on Property Law, raising concerns about systemic bias towards issues affecting property owners as opposed to tenants or those without land, thereby potentially marginalising vulnerable litigants.
Surveillance Technologies and Civil Liberties
The advent of advanced surveillance tools is reshaping public behaviour and civic participation. The scope and impact of these technologies are summarised below.
| Tool | Stated Purpose | Observed Effect |
| --- | --- | --- |
| Netra | Scans social media | Increased FIRs against activists; chills online expression |
| VAHAN | Tracks vehicles | Profiles protest organisers based on their travel routes |
| FaceTagr | Facial recognition | Blanket monitoring of individuals; heightened risk of wrongful detention |
The chilling effect of such pervasive surveillance is palpable. Awareness that social media posts are monitored or that public gatherings are surveilled can suppress legitimate expression and assembly, posing risks to democratic discourse.
The Black Box and the Language Divide
A critical concern is the opacity of these algorithmic systems. The proprietary nature of their code restricts public scrutiny, raising ethical questions about accountability in decisions affecting fundamental rights such as bail. Furthermore, linguistic limitations exacerbate exclusion. Recent assessments indicate that approximately 80% of these tools operate effectively only in English and Hindi, thereby disenfranchising speakers of other regional languages and reinforcing digital inequities within the judicial process.
Emergence of Resistance and Advocacy
Notwithstanding these challenges, there has been a discernible increase in civil society engagement aimed at promoting transparency and accountability in the deployment of AI within the justice system.
| Group | Action | Result |
| --- | --- | --- |
| Jana Adhikar Samiti | Legal challenge to predictive policing | Secured a judicial pause on implementation |
| Dalit Digital Rights Forum | Audit of surveillance technologies | Prompted the adoption of revised policy frameworks |
| AI for All | Development of regional language chatbots | Expanded access to legal information for marginalised groups |
The intervention by entities such as Jana Adhikar Samiti demonstrates that robust legal and civic action can successfully challenge the unchecked proliferation of algorithmic systems, reinforcing the centrality of rights-based advocacy.
Conclusion
The integration of technology, particularly artificial intelligence, within the Indian justice system offers the promise of addressing chronic inefficiencies. However, the pursuit of expedience must not come at the expense of fairness. The imperative is to ensure that transparency, inclusivity, and human oversight remain foundational to the administration of justice. Only through critical engagement and vigilant advocacy can the justice system harness the benefits of technology while safeguarding its core values.