Karnataka’s Department of School Education and Literacy is set to roll out an AI-powered Facial Recognition System (FRS) named Nirantara across 52,686 government and aided schools starting in the 2025-26 academic year. This mobile-based system, integrated with the Students Achievement Tracking System (SATS), aims to streamline attendance tracking for over 52 lakh students, ensuring accurate records and transparency in government welfare programs like midday meals. While the initiative promises efficiency, it has sparked a heated debate over student privacy and data security.
Key Points:
- Scale of Implementation: Covers 46,460 government and 6,226 aided schools, impacting 40.7 lakh and 11.8 lakh students, respectively.
- Technology: Uses an Advanced Vector-Based Facial Recognition Engine to capture facial features like eyes and nose, encrypting them into unique IDs.
- Objective: Tracks absenteeism and monitors welfare scheme delivery, such as the distribution of eggs, shoes, and socks.
- Timeline: Announced in the 2025-26 state budget by CM Siddaramaiah, with a planned rollout by June 2025 after a pilot in select schools.
Privacy Concerns: A Ticking Time Bomb?
The adoption of AI facial recognition in schools has raised significant alarms among privacy advocates, parents, and educationists. A group of 31 experts, including members of the People’s Alliance for Fundamental Right to Education (PAFRE) and Critical EdTech India (CETI), has urged the government to reconsider, citing data misuse risks and global precedents where such systems have been banned.
Key Concerns:
- Data Vulnerability: Experts warn that biometric data of students could be leaked, sold, or stolen, potentially reaching child traffickers or other criminals.
- Image Morphing Risks: Photos of children, especially in protected spaces like classrooms, could be manipulated for malicious purposes, compromising safety.
- Lack of Legal Framework: India lacks comprehensive data protection laws to regulate FRT, increasing the risk of privacy violations.
- Global Bans: Many regions, including parts of the US and EU, have restricted or banned FRT in schools due to privacy and bias concerns.
- Digital Divide: Rural schools with poor internet and outdated infrastructure may struggle to implement the system, potentially excluding marginalized students.
A parent, Radha Narayan, emphasized, “Biometric data is incredibly sensitive. Once collected, there’s no going back. The government needs to ensure airtight security measures.”
Benefits of AI Facial Recognition: Efficiency vs. Ethics
Proponents argue that the Nirantara system modernizes education by reducing manual errors and ensuring accountability. The technology allows teachers to capture attendance for up to 50 students in seconds, freeing time for instruction and improving transparency in welfare programs.
Key Benefits:
- Accuracy: Minimizes errors and prevents proxy attendance, ensuring precise records.
- Efficiency: Teachers can mark attendance quickly using a mobile app, integrating data with SATS in real-time.
- Welfare Monitoring: Tracks student participation in schemes like midday meals, ensuring benefits reach intended recipients.
- Scalability: The system can potentially extend to colleges and other institutions.
A school teacher, Uma Kumar, noted, “This system ensures fairness in distributing benefits and reduces manual errors, making it easier to monitor absenteeism.”
Privacy Safeguards: Are They Enough?
The Karnataka government claims the system prioritizes data security. Data from the app is hosted at the Karnataka State Data Centre, and facial images are encrypted into unique IDs that, officials say, cannot be reverse-engineered. Only specific facial features (eyes, nose) are captured, not full images, to minimize data storage risks.
Key Safeguards:
- Encryption: Facial data is converted into secure IDs, reducing misuse potential.
- No Full Photos: The system avoids storing complete images, focusing on specific features.
- Centralized Storage: Data is managed at the Karnataka State Data Centre for oversight.
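The "encrypted into unique IDs" safeguard can be illustrated with a minimal one-way encoding sketch: a quantized feature vector is salted and hashed, so the stored token is stable for the same face but cannot be inverted back into facial features. The function name, salt, and feature values here are hypothetical, not the actual Nirantara scheme.

```python
import hashlib

def vector_to_id(features: list[float],
                 salt: bytes = b"illustrative-salt") -> str:
    """Quantize a normalized facial feature vector (e.g., eye/nose
    measurements) and hash it with a salt. SHA-256 is one-way, so the
    token cannot be reversed into the original features."""
    quantized = bytes(min(255, max(0, int(f * 255))) for f in features)
    return hashlib.sha256(salt + quantized).hexdigest()[:16]

features = [0.42, 0.77, 0.13, 0.58]  # illustrative normalized features
token = vector_to_id(features)
print(len(token), token == vector_to_id(features))  # 16 True
```

Note that one-way hashing protects the stored token, but critics' concerns remain about what happens before hashing: the raw capture, transmission, and any retained intermediate data.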
However, critics argue these measures fall short without a robust legal framework. The absence of comprehensive data protection laws in India, coupled with past incidents of data leaks (e.g., SSLC exam data shared with commercial entities), fuels skepticism.
Global Context: Lessons from Other Regions
Globally, facial recognition technology in schools has faced scrutiny. In Delhi, similar systems in public schools were criticized as an “overreach” by digital rights groups due to the lack of privacy policies and consent mechanisms. In the US, states like New York have banned FRT in K-12 classrooms, citing risks of racism, surveillance normalization, and inaccurate identification. The EU’s AI Act prohibits real-time remote biometric identification in publicly accessible spaces for law enforcement except in narrowly defined circumstances, highlighting the need for stringent oversight.
Key Global Insights:
- Bans and Restrictions: New York, California, and parts of the EU limit FRT use in sensitive settings.
- Bias Concerns: Studies show FRT is less accurate for non-white faces and women, risking discrimination.
- Legal Gaps: Unregulated FRT deployments often lead to privacy violations and data breaches.
Alternatives to AI Facial Recognition
Experts advocate for less invasive methods to achieve the same goals, emphasizing community-driven and open-source solutions. Strengthening School Development and Monitoring Committees (SDMCs) could enhance teacher accountability without compromising student privacy.
Key Alternatives:
- Community Oversight: Invest in SDMCs to monitor attendance and welfare programs locally.
- Free and Open-Source Software (FOSS): Use transparent, publicly owned software to ensure data control.
- Manual or Non-Biometric Systems: Simple digital check-ins or RFID-based attendance could reduce risks.
- Parental Consent: Implement systems requiring explicit guardian approval to address privacy concerns.
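The non-biometric option above is deliberately simple: a roll-number check-in collects no facial data at all, yet still supports absentee tracking and welfare monitoring. A minimal sketch (class and method names are illustrative, not a real SATS interface):

```python
from datetime import date

class AttendanceRegister:
    """Non-biometric attendance log: students check in by roll number,
    so no facial or other biometric data is ever collected or stored."""

    def __init__(self) -> None:
        self.records: dict[date, set[str]] = {}  # day -> roll numbers present

    def check_in(self, roll_number: str, on: date) -> None:
        self.records.setdefault(on, set()).add(roll_number)

    def absentees(self, roster: set[str], on: date) -> set[str]:
        """Roll numbers on the roster with no check-in for the day."""
        return roster - self.records.get(on, set())

reg = AttendanceRegister()
day = date(2025, 6, 2)
reg.check_in("R-101", on=day)
reg.check_in("R-102", on=day)
print(sorted(reg.absentees({"R-101", "R-102", "R-103"}, day)))  # ['R-103']
```

The trade-off is that roll-number check-ins cannot prevent proxy attendance on their own, which is why advocates pair them with community oversight such as SDMC spot checks.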
The Way Forward: Balancing Innovation and Safety
The Nirantara initiative reflects Karnataka’s ambition to modernize education but must address privacy concerns to gain public trust. A robust data protection framework, transparent implementation, and stakeholder consultation are critical. Engaging parents, teachers, and communities through forums like SDMCs can ensure accountability without risking student safety.
Recommendations:
- Develop Legal Safeguards: Enact laws to regulate FRT use, defining clear purposes and accountability measures.
- Enhance Transparency: Disclose how data is collected, stored, and protected, with regular audits.
- Pilot Expansion with Oversight: Continue pilots with independent ethical reviews to assess risks.
- Explore Alternatives: Prioritize non-biometric solutions to minimize privacy threats.