The European Union has given Alphabet’s YouTube and TikTok a deadline of November 30 to reply to an information request on how they protect children from illegal and harmful content. The European Commission is seeking details on the measures the companies have taken to protect minors and comply with the bloc’s new rulebook for Big Tech, the Digital Services Act (DSA). Companies face fines of as much as 6% of their global turnover for DSA violations. The companies must provide information in the following areas:
- Risks to psychological health
- Risks to physical well-being
- Risks to minors using their services
- Assessment and mitigation of risks linked to the dissemination of illegal and harmful content
- Exercise of fundamental rights, including the rights of the child
- Protection of minors
The information request comes after EU industry chief Thierry Breton reminded Alphabet CEO Sundar Pichai of the company’s obligation to protect children and teenagers. Breton also told TikTok CEO Shou Zi Chew to spare no effort in countering disinformation on the platform, which is owned by China’s ByteDance.
TikTok and YouTube must provide the requested information to the Commission by November 30, 2023. Based on its assessment of the responses, the Commission will decide on next steps, which may include formally opening proceedings under Article 66 of the DSA.
The EU’s child protection probe
The European Commission has sent formal requests for information to YouTube and TikTok on the firms’ child protection obligations. The Commission is looking into what measures TikTok and YouTube have taken to protect children’s mental and physical health. The EU is concerned about the impact of social media on children’s well-being. This includes the spread of harmful content, cyberbullying, and addiction.
Under Article 74(2) of the DSA, the Commission can impose fines for incorrect, incomplete or misleading replies to an information request. If a company fails to respond, the Commission may demand the information by formal decision, and missing that deadline can trigger periodic penalty payments. Having been designated as very large online platforms, TikTok and YouTube must comply with the full set of obligations introduced by the DSA.
EU’s stand on child protection
The European Commission has been taking steps to uphold child rights and support children in need. This includes child protection on social media.
In March 2021, the European Commission proposed the European Child Guarantee, which aims to ensure that every child in Europe at risk of poverty, social exclusion, or discrimination has access to the most essential services, such as education, healthcare, and housing. The guarantee also includes measures to protect children from violence, abuse, and exploitation, including online.
The Digital Services Act (DSA) is a new rulebook for Big Tech companies operating in the EU. The DSA aims to create a safer online environment for users, including children, by holding tech companies accountable for the content on their platforms. It requires companies to take measures to prevent the spread of illegal content, such as hate speech and terrorist propaganda, and to protect users from harmful content, such as disinformation and cyberbullying. The DSA sets out fines of up to 6% of global turnover for platforms that fail to remove illegal content.
The EU’s child protection probe of YouTube and TikTok is part of the bloc’s broader effort to create a safer online environment, especially for children. The Digital Services Act holds Big Tech companies accountable for the content on their platforms, and companies that violate it face fines of up to 6% of their global turnover. YouTube and TikTok have until November 30 to reply to the EU’s information request on their child protection measures.