The Intergroup on Children’s Rights asks the Commission for protective measures for children online
- By Ntcadmin
4 March 2021
The Vice-Chair of the Intergroup on Children’s Rights and EPP Secretary General, Antonio López-Istúriz, has led an initiative addressing the European Commission on the need to establish protective measures for children online, following the death of a 10-year-old girl from Palermo (Italy) who suffocated after participating in a choking challenge on the social media platform TikTok.
The written question to the European Commission was co-signed by all Intergroup Co-Chairs, MEPs Caterina Chinnici, David Lega and Hilde Vautmans, along with 18 other MEPs.
In the text, the Members of the European Parliament ask the Commission whether it is aware of any breach of consumers’ rights as set out in EU legislation. MEPs also ask the Commission what initiatives it will propose to ensure that tech companies and other social media platforms put in place appropriate measures to improve child safety online, and to support and coordinate Member States’ targeted actions on awareness-raising campaigns about online risks for children and young people.
Following the death of the 10-year-old from Palermo, the Italian Data Protection Authority imposed a ban on TikTok for users whose age could not be verified. The European lawmakers from the Children’s Rights Intergroup are calling for coherence and consistency across all EU legislation to ensure that specific protective measures for children are implemented in all EU Member States.
The written question highlights that the time children spend online has dramatically increased due to the lockdown measures implemented to contain the COVID-19 pandemic, and with it the associated dangers.
MEP Antonio López-Istúriz White, Vice-Chair of the Intergroup on Children’s Rights and EPP Secretary General, stated:
“Whether it is with self-regulation through safety-by-design tools – or other parental control tools – or through more stringent regulations from the legislator, we need to ensure that tech companies are held responsible for their networks and how they are used by their user base. Children are by far the most vulnerable consumers, and any breach of their right to privacy, safety and processing of their personal data – including by aggressive advertising targeting them – needs to be properly addressed by the tech companies to ensure a safe and healthy experience for children online.”
The European Union has recently put forward the Digital Services Act and the Digital Markets Act, introducing a set of new rules applicable across the whole EU.
MEP Caterina Chinnici, Co-Chair of the Intergroup on Children’s Rights added:
“It is of the utmost importance that online companies take their share of responsibility and live up to the high standards of European values. We now have a unique opportunity with this new legislation, and we need to make sure that specific protective measures for children and young people – who spend a great deal of time online – are put in place. I am particularly worried about the type of online content children are exposed to – which can lead to self-harm, as was the case for the 10-year-old girl from my hometown of Palermo, Italy, who died as a result of a ‘TikTok challenge’ – and how easy it is for them to fall victim to sexual abuse and other forms of violence on the net.”
Over the last few months, consumer protection groups across Europe have filed a series of coordinated regulatory complaints against TikTok alleging multiple breaches of EU law. The complaints range from claims of unfair terms to the type of content children are exposed to on the platform.