“Ensuring the safety of our children online is not just a parental duty—it is a collective responsibility. As digital platforms evolve, we must demand that safety, privacy, and education go hand in hand. Only then can we create a future where teens feel protected and empowered in the online world.”
— Dr. Ranjana Kumari, Director, Centre for Social Research (CSR)
Online Grooming Risks: A Case from Tamil Nadu
A case from Tamil Nadu (Link) painfully illustrates the dangers many young teens face online. A 13-year-old girl was contacted by a man on social media who gained her trust. When they met in person, she discovered troubling content on his phone and confronted him. In response, he abandoned her at a lodge, where the owner exploited her vulnerable situation. Thankfully, she managed to escape and contact her parents, who had already filed a missing person report. Heartbreaking as such stories are, they underscore the very real risks teens face when strangers manipulate them online, and the urgent need for stronger protections, an effort already underway.
Why This Blog Now
Real-world examples like this highlight the urgent need for social media platforms to strengthen protections for teen users. For many teens, social media is their first point of contact with the wider digital world, but it also exposes them to a range of online harms. This blog is not a critique of Meta or Instagram but an acknowledgment of important recent steps taken to enhance teen safety. It also emphasizes that safeguarding young users requires collective responsibility, shared among platforms, parents, educators, and communities.
Instagram’s New Teen Safety Measures
In September 2024, Meta began rolling out new safety features for teen accounts on Instagram, starting with the USA, UK, Canada, and Australia, with India following in February 2025. These updates set teen accounts to private by default, restrict messaging to approved contacts, and require parental approval for changes such as disabling content filters or going live. The measures aim to reduce exposure to harmful content, including explicit imagery, while giving parents tools to monitor and support their teens' online activity. At the same time, they seek to respect teens' need for autonomy and a safe environment in which to explore digital spaces. The same updates are planned for Facebook and Messenger, reflecting Meta's broader commitment to digital safety, with the global rollout expected to be complete by June 2025.
Global and Indian Perspectives
The rollout has received mixed reactions worldwide. Many parents welcome the new oversight tools offering greater control, while some teens worry the restrictions limit their freedom to connect and express themselves. This tension reflects the complex balance between safety and autonomy in adolescent digital life. In India, where teen social media use is growing rapidly, parents have broadly appreciated the default private settings and messaging restrictions that shield young users from unsolicited contact and harmful interactions. Challenges remain, however: many parents lack the digital literacy needed to make full use of supervision tools, and screen addiction continues to be a concern. The Centre for Social Research (CSR) advocates a holistic approach that combines platform improvements with education for parents and teens on responsible social media use.
CSR’s Digital Safety Initiatives
Beyond advocacy, CSR actively shapes safer digital spaces. Its work includes high-level discussions on online harms, addressing the mental health impacts of AI and technology, and conducting cyber safety workshops with parliamentarians. CSR leads initiatives such as the Women’s Engage Alliance, which counters online hate speech and harassment through collective action. It also runs the Digital Literacy Lab, helping women survivors of domestic violence build digital confidence. Additionally, CSR trains university students as Digital Literacy Ambassadors to promote online safety among peers and partners on projects like StopNCII.org, combating non-consensual image abuse. Notably, CSR co-launched ACTS—the Alliance for Cyber Trust and Safety—which brings together over 20 members in a whole-of-society effort to address digital trust and safety challenges. In October 2025, CSR will host India’s inaugural Trust and Safety Festival, convening stakeholders from technology, government, and civil society to tackle emerging online risks.
CSR’s Recommendations for Strengthening Safety
While these platform changes are positive, CSR stresses the need for further measures. Stronger age verification systems using advanced AI, such as facial recognition and behavioural analysis, are vital to prevent minors from accessing adult content. CSR and ACTS recommend exploring AI-powered voice recognition to detect and block harmful content accessed via voice assistants, a common way children interact online. Education remains key: digital literacy programs for parents and teens should help them understand risks, manage screen time, and use safety tools effectively, while promoting healthy online habits and digital etiquette. Content moderation must be culturally sensitive to India's diverse social fabric and address cyberbullying in locally relevant ways. Finally, greater transparency from platforms, including sharing data on the effectiveness of safety measures, is essential for building trust and guiding improvements.
Looking Ahead
Instagram’s recent teen safety updates mark an important milestone toward safer digital environments. The ongoing challenge remains balancing protection with empowering teens to confidently and independently navigate online spaces. Platforms like Meta play a pivotal role but cannot bear this responsibility alone. We must join forces—parents, educators, civil society, and young users—to shape safer, more supportive digital ecosystems. CSR encourages Meta to share transparent data on these initiatives’ impact and invest further in digital literacy so safety features translate into meaningful, lasting protection. Together, through collaboration and innovation, we can create a digital future where young users are safeguarded and empowered to thrive.