Plans to roll out end-to-end encryption on Facebook and Instagram have been delayed amid a row over child safety.
Meta – as Facebook's parent company is now called – said messaging encryption on the apps would now come in 2023.
The process means only the sender and recipient can read messages; neither law enforcement nor Meta can access them.
However, child protection groups and politicians have warned that it could hamper police investigating child abuse.
The National Society for the Prevention of Cruelty to Children (NSPCC) has claimed that private messaging "is the front line of child sexual abuse".
UK Home Secretary Priti Patel has also criticised the technology, saying earlier this year that it could "severely hamper" law enforcement in pursuing criminal activity, including online child abuse.
End-to-end encryption works by "scrambling" or encrypting the data while it travels between phones and other devices.
The only way to read the message is usually to get physical access to an unlocked device that sent or received it.
The technology is the default for the popular messaging service WhatsApp, also owned by Meta – but not for the company's other apps.
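In concept, end-to-end encryption means the two endpoints share a key that the service relaying the messages never holds, so the relay sees only ciphertext. The toy sketch below illustrates that idea with a SHA-256-based keystream XOR; this is an illustration of the principle only, not real cryptography (production systems such as WhatsApp use the Signal protocol, with proper key exchange and authenticated encryption):

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the shared key (toy construction).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream; XOR is its own inverse,
    # so the same function also decrypts.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt

# Sender and recipient share a key; the relay server never sees it.
shared_key = secrets.token_bytes(32)
ciphertext = encrypt(shared_key, b"hello")
# The server relays only ciphertext; without shared_key the text is unreadable.
recovered = decrypt(shared_key, ciphertext)
```

The point of the sketch is the trust boundary: anyone holding only `ciphertext` – the relay, or an eavesdropper – cannot recover the message, which is exactly the property law enforcement agencies say hampers their investigations.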
The NSPCC sent Freedom of Information requests to 46 police forces across England, Wales, and Scotland asking them for a breakdown of the platforms used to commit sexual offences against children last year.
The responses have led to fears that Meta's plans to expand encryption to the widely used Facebook Messenger and Instagram direct messages could shield the majority of abusers from detection.
The NSPCC said that encrypting messages by default could lead to the easier spread of child abuse imagery or online grooming.
But advocates say that encryption protects users' privacy, and prevents prying by both governments and unscrupulous hackers. Meta chief executive Mark Zuckerberg made those arguments himself when he announced Facebook's encryption plans in 2019.
Antigone Davis, Meta's global head of safety, said that the delay in implementing encryption to 2023 was because the company was taking its time "to get this right".
The company had previously said the change would happen in 2022 at the earliest.
Ms Davis said: "As a company that connects billions of people around the world and has built industry-leading technology, we're determined to protect people's private communications and keep people safe online."
She also outlined a number of additional preventative measures the company had already put in place.
Andy Burrows, head of child safety online policy at the NSPCC, welcomed the delay by Meta.
He said: "They should only go ahead with these measures when they can demonstrate they have the technology in place that will ensure children will be at no greater risk of abuse.
"More than 18 months after an NSPCC-led global coalition of 130 child protection organisations raised the alarm over the danger of end-to-end encryption, Facebook must now show they are serious about the child safety risks and not just playing for time while they weather difficult headlines."
© 2021 BBC.