Mandatory apps that can monitor access to pornography on all devices sold in India, permission to break end-to-end encryption to trace distributors of such content, and penal provisions for cyber grooming are among the 40 recommendations of a Rajya Sabha panel formed to study the issue.

The 14-member ad hoc panel led by Jairam Ramesh also suggested that streaming platforms like Netflix and social media websites like Twitter and Facebook should have "separate adult sections" that cannot be accessed by children.

In its 24-page report submitted to Rajya Sabha Chairman M Venkaiah Naidu on Saturday, the panel recommended a broader definition of child pornography, controlling children's access to such content and holding Internet Service Providers accountable for providing access to and disseminating such content, besides calling for an international alliance against child abuse.

Demanding a broader definition of child pornography in the Protection of Children From Sexual Offences (POCSO) Act, 2012, it said sexually explicit conduct does not require that an image depict a child engaging in sexual activity.

"A picture of nude or semi-nude child may constitute illegal child pornography if the posture is sufficiently sexually suggestive also called erotic posing," said the panel, which was constituted by Naidu on December 12 last year after MPs raised concern over children's access to pornography as well as the spread of child pornography.

It also said there was a need to explicitly define cyber grooming – the act of persuading, coercing, communicating or arranging a meeting with a child with the intent of sexually abusing the child – and to introduce penal provisions for it.

In the Information Technology Act, 2000, the panel wanted new provisions prescribing punitive measures for those who provide children access to pornography and those who access, produce or transmit Child Sexual Abuse Material (CSAM). It also wanted a designated authority empowered to block or prohibit all websites that carry such content.

Batting for mandatory apps on all devices sold in India to monitor children's access to pornographic content, the panel said the Ministry of Electronics and Information Technology (MeitY) must "mandate existing screen-monitoring apps and/or encourage industry partnerships to develop the same through hackathons etc. Google's Family Link App or similar solutions should be developed and made freely available to ISPs, companies, schools and parents."

It also wanted rules requiring Internet Service Providers (ISPs) to proactively monitor and take down CSAM and report it to the National Cyber Crime Portal. "All search engines must ensure that CSAM websites are blocked during the search and should be obligated to report any website along with gateway ISPs to the appropriate authority," the panel recommended.

The panel wanted social media platforms to employ PhotoDNA to target profile pictures of groups carrying CSAM, prevent such content from being uploaded at the source, and ban user accounts that are reported or flagged, instead of simply blocking the content from the users who report it.

It also recommended allowing NGOs and activists who want to investigate such websites to find abusers to do so with the approval of a nodal agency.