Meta, Snap Must Detail Child Protection Measures By December 1, Says EU
The EU added Meta and Snap on Friday to a growing list of tech companies it is probing to see how they are complying with a new law meant to stop illegal content online.
The European Commission said it has sent formal requests for information to Meta, parent of Facebook and Instagram, and Snap, which runs the Snapchat image and messaging app, on what measures they have taken to protect minors online.
They have until December 1 to reply.
The probes are only an initial stage under the EU's Digital Services Act, which came into force in August; they do not in themselves indicate legal violations or signal a move towards punishing the companies.
On Thursday, the commission opened similar probes into YouTube and TikTok, also to see what measures they had in place to protect minors from illegal and harmful content.
The DSA also bans targeted advertising aimed at minors, meaning users aged 17 and under.
Should any platform be found to be infringing the DSA, it risks fines of up to six percent of its global turnover.
The EU has also launched other probes into TikTok, X (formerly Twitter) and Meta over disinformation following the October 7 Hamas attack on Israel.
It is also investigating AliExpress, the online marketplace owned by China's Alibaba, seeking more information on what it is doing to protect consumers from the sale of illegal products, including fake medicines.
The DSA is part of the European Union's powerful armoury to bring big tech to heel.
TikTok and YouTube are also among 22 services listed by the EU in September that face stricter curbs on how they do business under the DSA's sister law, the Digital Markets Act (DMA). Companies must fully comply with the DMA by March 2024.
(Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)