As we prepare to integrate Artificial Intelligence (AI) into the criminal justice system, we must be cautious of systemic biases that may arise from the data used to train these systems, Chief Justice of India (CJI) DY Chandrachud warned.
He explained that the data forming the basis of AI algorithms often reflect existing biases and systemic inequalities within the criminal justice system, which could perpetuate those biases and lead to the unfair targeting of marginalized communities.
“If historical crime data used to train these algorithms reflect biases or systemic inequalities in the criminal justice system, the algorithms may perpetuate these biases by targeting the same neighborhoods as ‘high-risk’ areas for future crime. This can result in disproportionate surveillance and policing of already marginalized communities, exacerbating social inequalities and perpetuating cycles of discrimination,” said the CJI.
He also noted that predictive policing algorithms often operate as black boxes, meaning their internal workings are not transparent.
CJI Chandrachud delivered these remarks during the Keynote Address at the 11th Annual Conference of the Berkeley Centre for Comparative Equality and Antidiscrimination Law, held at the National Law School of India University, Bengaluru, on the topic “Is there Hope for Equality Law?”
The CJI emphasized that the principle of “contextualization” is crucial when addressing AI challenges in a diverse country like India.
“India’s rich demographic patterns, characterized by linguistic diversity, regional variations, and cultural nuances, present a unique set of challenges and opportunities for AI deployment. As responsible users, we must ask ourselves critical questions to ensure that our engagement with AI is ethical and equitable. We need to be vigilant about the origins of data and its potential biases, scrutinize the algorithms we employ for transparency and fairness, and actively seek to mitigate any unintended discriminatory effects,” said the CJI, who has been at the forefront of technological evolution in the judiciary.
The CJI also highlighted how climate change amplifies the inequities faced by marginalized and disadvantaged groups. He pointed out that women, children, disabled individuals, and indigenous people face heightened risks from climate change, including displacement, health inequities, and food scarcity.
“Inequality thus becomes both a cause and consequence of climate change,” he opined.
He noted that wealthier individuals often have the means to invest in protective infrastructure and cooling systems during extreme heat, whereas poorer communities lack such resources, making them more vulnerable to climate-related disasters.
“Ensuring climate justice requires recognizing these differential impacts and actively involving affected communities in decision-making processes,” the CJI said.