Facebook, TikTok and others urged to reveal how many children use their platforms


Britain’s culture and education secretaries demand transparency in advance of online safety bill
Last modified on Thu 2 Dec 2021 12.12 GMT
Tech companies including Facebook and TikTok have been urged to reveal how many children have used their platforms after being warned to expect tougher regulation under a landmark online safety bill.
Leading tech firms were asked to produce details of underage use of their platforms at a meeting with the children’s commissioner and the culture and education secretaries on Wednesday. The companies were also told to expect the draft online safety bill, which includes provisions to protect children from harmful content, to emerge from pre-legislative scrutiny as a tougher piece of legislation.

In a statement following the meeting, the office of the children’s commissioner for England, Dame Rachel de Souza, said tech firms had pledged to “identify further information they can usefully share with the commissioner in a way which respects people’s privacy, regarding children on their platforms and the nature of harms children may face”.
It is understood the request for details of underage site use was made before the meeting and repeated during the gathering, where the subject of age-checking users was also brought up, along with parental controls in app stores.
The tech firms attending the meeting included Facebook and Instagram owner Meta, TikTok, Snap and Twitter – all of whom require a minimum age of 13 for their users. Google and Apple also attended.
The Facebook whistleblower, Frances Haugen, told MPs in October that the company could make a “huge dent” in the number of under-13s on its site if it wanted to. She said: “But they don’t want to because they know that young users are the future of the platform and the earlier they get them, the more likely they’ll get them hooked.”
In its most recent quarterly results statement, Meta said it had removed more than 2.6m accounts on Facebook and 850,000 accounts on Instagram because they were unable to meet the company’s minimum age requirement.
The culture secretary, Nadine Dorries, said it remained “far too easy” for children to access the worst corners of the internet. “We are creating new laws to compel the owners of tech sites to protect children from seeing horrific things online, and today I’ve made clear to them that now is the time for action to sort out their algorithms, enforce their own age limits and be a force for good in young people’s lives.”
The education secretary, Nadhim Zahawi, said he welcomed the commitment tech companies made in the meeting “to being more transparent and signposting important pieces of guidance for teachers and parents”.
The headline on this article was amended on 2 December 2021 to clarify that the request has been made of a number of major tech companies.
