88% of parents want online platforms to identify and act on fake minor accounts: Survey
58% of parents are okay with platforms monitoring content usage to determine a user’s age
By Newsmeter Network Published on 1 March 2025 10:51 AM IST
Hyderabad: On the safety of children navigating online media spaces, Information Technology (IT) secretary S Krishnan said that while technological solutions to detect children online exist, platforms have been unwilling to implement them.
“Now we are telling in the law—the draft Digital Personal Data Protection Rules—that they will have to detect who is a child and who is an adult,” he told media.
Echoing this sentiment, a survey by LocalCircles found that 88% of parents surveyed were in favour of DPDP rules mandating that platforms identify minor accounts with misrepresented ages and proactively either seek parental consent or shut such accounts. In other words, social media, OTT and online gaming platforms must have a system that can tell whether a minor is trying to access adult content by falsifying parental consent or misrepresenting their age.
What are parents concerned about regarding children bypassing their consent?
According to some parents, many children falsely represent their age when signing up on these platforms, and with no checks and balances in place on most of them, the children are able to create accounts and use the platforms unhindered.
Most parents believe that for the DPDP regulation to be effective, it is critical that existing accounts on which children are using social media, online gaming and OTT platforms as adult users are identified, brought under parental consent, reconfigured for age-appropriate use, or shut down.
It is hoped that the Ministry of Electronics and Information Technology, which released the draft rules for the Digital Personal Data Protection (DPDP) Act after a 16-month wait and invited public inputs via the MyGov portal until February 18, will plug the loopholes that place minors at risk. The rules require social media, OTT and online gaming platforms to implement parental consent.
What can go wrong with leaving a child to self-declare their age?
When children self-declare a false age to gain access to social media or online games, that claimed age advances with them as they grow older.
This means they could be at greater risk of encountering age-inappropriate or harmful content online. For example, once a user reaches age 16 or 18, some platforms introduce features and functionalities not available to younger users, such as direct messaging and the ability to see adult content.
The internet can pose several other serious dangers to children. “It is hard for teenagers, in particular, to consider how a party picture or Snapchat message could cause problems ten years down the road when they interview for a new job, or how a prospective mate might respond to personal content that they post to their social media profiles or other websites,” advised kaspersky.com in a report on ‘Internet Safety for Kids’.
A study commissioned by Ofcom, the UK’s communications regulator, found that most children aged 8 to 17 (77%) who use social media now have their own profile on at least one of the large platforms.
To understand some of the solutions parents favour, LocalCircles conducted a national survey which received over 44,000 responses from parents of school children located in over 349 districts of the country. 61 per cent of respondents were men and 39 per cent were women; 44 per cent were from tier 1 districts, 27 per cent from tier 2, and 29 per cent from tier 3 and 4 districts.
88% want platforms to proactively identify and shut down fake minor accounts
Considering that children often cite the wrong age on internet platforms, the survey first asked parents, “In the event a child wrongly states their age as over 18, the platforms are likely to permit them to open an account without parental consent. Should the rules also make it mandatory for platforms to identify such misuse proactively and shut such accounts?”
Out of 21,760 who responded to the question, 88 per cent stated ‘yes, absolutely’ and 4 per cent stated ‘no, let the child usage continue’, while the rest did not give a clear answer or said the question was not applicable to them.
To sum up, 88% of parents surveyed are in favour of DPDP rules mandating that platforms (social media, OTT, online gaming, etc.) identify minor accounts with misrepresented ages and proactively either seek parental consent or shut such accounts.
58% of parents are okay with platforms monitoring content usage to determine a user’s age
LocalCircles also asked the parents about platforms monitoring content data of particular profiles to determine the age of the user.
58% of parents surveyed approve of the usage of profile information, user content consumption patterns, friends list, etc., by platforms to proactively identify accounts of minors with misrepresented age
To understand parents’ perspective on the problem of misrepresentation of age, the survey asked, “To identify whether the social media, gaming, OTT platform usage is by a child who has mis-stated his/her age in account creation, platforms will have to rely on content type consumed, uploaded, profile information, images etc. Do you approve of platforms using this information in determining age misrepresentation by the child?”
Out of 22,518 who responded to the question, 58 per cent stated ‘yes’ and 25 per cent stated ‘no’, while the rest did not give a clear answer or said the question was not applicable to them.
Usage of AI to identify accounts
Many parents believe that with Artificial Intelligence, it will be relatively easy for platforms to identify such accounts. They suggest platforms could look at declared age, activity, friends list and similar signals to classify accounts as RED, ORANGE and GREEN, with RED accounts needing urgent enforcement action.
A child’s friends list and activity can be indicators of age misrepresentation, and according to some parents, platforms must deploy Artificial Intelligence capabilities to identify non-compliant accounts for further action.
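As a purely illustrative example, the kind of rule-based “traffic light” screening parents describe might look something like the Python sketch below. The signal names, thresholds and the upstream age-estimation step are all assumptions made for illustration; they do not represent any platform’s actual system or anything mandated by the DPDP rules.

# Hypothetical sketch of a rule-based "traffic light" classifier for flagging
# accounts whose declared age may be misrepresented. Signal names and
# thresholds are illustrative assumptions, not any platform's actual method.

from dataclasses import dataclass

@dataclass
class AccountSignals:
    declared_age: int           # age entered at sign-up
    estimated_age: float        # age inferred from content and profile (assumed upstream model)
    minor_friend_ratio: float   # fraction of connections believed to be minors (0.0 to 1.0)
    adult_content_hours: float  # weekly hours of adult-rated content consumed

def classify_account(s: AccountSignals) -> str:
    """Return 'RED', 'ORANGE' or 'GREEN' depending on how strongly
    the signals contradict the declared age."""
    age_gap = s.declared_age - s.estimated_age
    # Strong contradiction: claims to be an adult, signals point to a minor.
    if s.declared_age >= 18 and age_gap >= 5 and s.minor_friend_ratio > 0.6:
        return "RED"     # urgent enforcement action
    # Moderate contradiction: some signals suggest a younger user.
    if age_gap >= 3 or (s.minor_friend_ratio > 0.5 and s.adult_content_hours > 2):
        return "ORANGE"  # seek re-verification or parental consent
    return "GREEN"       # no action needed

example = AccountSignals(declared_age=19, estimated_age=13.0,
                         minor_friend_ratio=0.8, adult_content_hours=4.0)
print(classify_account(example))  # prints "RED"

In practice, such a classifier would only be as reliable as the signals feeding it, which is why parents in the survey also raised the question of platform intent and regulatory oversight.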
The big question, according to parents, is the intent of the platforms, and that is where the Government must step in with its DPDP rules to ensure enforcement happens.
What is the Centre’s take on further modifications to DPDP?
The Ministry of Electronics and Information Technology (MeitY) has signalled its openness to introducing further clarifications in the Digital Personal Data Protection (DPDP) Rules, 2025, released for consultation on January 3. The willingness to make the rules more robust was indicated during a closed-door industry consultation, as per a media report.
“The DPDP Act and its Draft Rules reflect our commitment to creating a digital world where children’s privacy is prioritised, and their rights are respected. We stand at a pivotal moment in the digital era – where children’s safety and privacy are no longer optional but essential,” Vikash Chourasia, a scientist at MeitY, stated in a social media post.