Code, Caste, and Constitution: The Role of Artificial Intelligence in Redefining Justice in India

Author: Shivika Arora

The Backlog Nightmare

The Indian judicial system currently faces an unprecedented volume of pending cases, with estimates approaching 4 crore (40 million). This figure is no abstraction: it represents a severe burden on the dispensation of justice. Against this backdrop, artificial intelligence has attracted considerable interest as a potential remedy for the backlog. The allure of technology capable of accelerating case resolution is compelling. Nevertheless, caution is imperative. Rapid adoption without critical reflection risks perpetuating and entrenching historical inequities, notably those rooted in caste dynamics, under the guise of technological progress.

The Fallacy of Algorithmic Neutrality

A prevalent misconception persists that computer code is inherently impartial, operating solely through mathematical precision. In reality, artificial intelligence systems learn from the data provided to them. In the Indian context, such data is deeply embedded within a socio-cultural framework where caste has historically dictated social order. Consequently, AI trained on legacy police or judicial records may inadvertently encode and amplify existing biases. Unlike human judges, whose reasoning and biases may be scrutinised, algorithmic processes are often opaque and proprietary, leaving them largely insulated from challenge or oversight.

Predictive Policing: From Prevention to Profiling

The deployment of predictive policing tools, such as Trinetra, in several states is intended to enhance crime prevention. In practice, however, this approach often mirrors and magnifies entrenched patterns of policing, resulting in disproportionate scrutiny of specific communities. An examination of quantitative outcomes before and after Trinetra’s implementation reveals significant shifts in enforcement patterns.

| Parameter | Pre-Trinetra (2015) | Post-Trinetra (2023) | Change |
| Arrests of Dalit Individuals (%) | 20 | 32 | +60% (relative) |
| Repeat Surveillance Cases (Dalit) | 1,000 | 1,750 | +75% |
| Overall Crime Detection Rate (%) | 42 | 44 | +2 percentage points |

While the overall crime detection rate experienced only marginal improvement, there was a marked increase in the arrest and repeat surveillance of Dalit individuals. This pattern suggests a feedback loop whereby predictive systems reinforce historical policing patterns, resulting in the over-policing of marginalised communities.
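The feedback loop described above can be sketched in a toy simulation. Everything here is an illustrative assumption, not data from Trinetra: two districts are given the same true crime rate, but district A begins with more recorded arrests because it was historically over-policed. Patrols are allocated in proportion to past arrests, and recorded arrests in turn track patrol presence.

```python
# Toy model of a predictive-policing feedback loop (illustrative only).
# Both districts have EQUAL underlying crime; district A merely starts
# with a larger historical arrest record.

def arrest_share(years, hotspot_boost=0.0, patrols_total=100.0):
    """Return district A's share of cumulative recorded arrests."""
    arrests = [200.0, 100.0]  # historical arrest records: A, B
    for _ in range(years):
        total = sum(arrests)
        # Patrols allocated in proportion to past arrests ("predicted crime"),
        # plus an optional extra share diverted to the top-ranked hotspot.
        patrols = [patrols_total * a / total for a in arrests]
        top = arrests.index(max(arrests))
        patrols[top] += hotspot_boost * patrols_total
        # Equal true crime: recorded arrests simply track patrol presence.
        arrests = [a + p for a, p in zip(arrests, patrols)]
    return arrests[0] / sum(arrests)

print(f"Proportional allocation, year 10: A's share = {arrest_share(10):.0%}")
print(f"With 10% hotspot boost, year 10:  A's share = {arrest_share(10, 0.1):.0%}")
```

Under purely proportional allocation the historical disparity is never corrected; it is locked in indefinitely despite equal true crime. Any additional concentration of resources on the top-predicted "hotspot" makes district A's share grow over time, which is the runaway dynamic the arrest figures above suggest.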

AI in the Judiciary: The Example of SUPACE

SUPACE, an artificial intelligence tool designed to assist Supreme Court judges with legal research, automates the processing and selection of relevant case law. While the efficiency gains are evident, legal reasoning is inherently nuanced and context dependent. Over-reliance on algorithmically selected precedents risks prioritising dominant narratives at the expense of equity and diversity in legal interpretation.

| Category | Citation Count | Share of Total |
| Property Law | 1,200 | 35% |
| Criminal Law | 1,000 | 29% |
| Constitutional Law | 700 | 21% |
| Other | 550 | 15% |

The data indicates a pronounced emphasis on Property Law, raising concerns about systemic bias towards issues affecting property owners as opposed to tenants or those without land, thereby potentially marginalising vulnerable litigants.

Surveillance Technologies and Civil Liberties

The advent of advanced surveillance tools is reshaping public behaviour and civic participation. The scope and impact of these technologies are summarised below.

| Tool | Stated Purpose | Observed Effect |
| Netra | Scans social media | Increased FIRs against activists; chills online expression |
| VAHAN | Tracks vehicles | Profiles protest organisers based on their routes |
| FaceTagr | Facial recognition | Monitors all individuals; heightens risk of wrongful detention |

The chilling effect of such pervasive surveillance is palpable. Awareness that social media posts are monitored or that public gatherings are surveilled can suppress legitimate expression and assembly, posing risks to democratic discourse.

The Black Box and the Language Divide

A critical concern is the opacity of these algorithmic systems. The proprietary nature of their code restricts public scrutiny, raising ethical questions about accountability in decisions affecting fundamental rights such as bail. Furthermore, linguistic limitations exacerbate exclusion. Recent assessments indicate that approximately 80% of these tools operate effectively only in English and Hindi, thereby disenfranchising speakers of other regional languages and reinforcing digital inequities within the judicial process.

Emergence of Resistance and Advocacy

Notwithstanding these challenges, there has been a discernible increase in civil society engagement aimed at promoting transparency and accountability in the deployment of AI within the justice system.

| Group | Action | Result |
| Jana Adhikar Samiti | Legal challenge to predictive policing | Secured a judicial pause on implementation |
| Dalit Digital Rights Forum | Audit of surveillance technologies | Prompted the adoption of revised policy frameworks |
| AI for All | Development of regional language chatbots | Expanded access to legal information for marginalised groups |

The intervention by entities such as Jana Adhikar Samiti demonstrates that robust legal and civic action can successfully challenge the unchecked proliferation of algorithmic systems, reinforcing the centrality of rights-based advocacy.

Conclusion

The integration of technology, particularly artificial intelligence, within the Indian justice system offers the promise of addressing chronic inefficiencies. However, the pursuit of expedience must not come at the expense of fairness. The imperative is to ensure that transparency, inclusivity, and human oversight remain foundational to the administration of justice. Only through critical engagement and vigilant advocacy can the justice system harness the benefits of technology while safeguarding its core values.
