November 22, 2024

Messenger: Meta’s decision to end-to-end encrypt Facebook Messenger criticised, here’s why


Meta recently announced that it is rolling out end-to-end encryption for Messenger, but the decision has not gone down well with certain groups. According to a report, some child safety organisations and US prosecutors have criticised the move, saying the feature will cripple online child protection.
As per a report in NBC News, the National Center for Missing & Exploited Children (NCMEC) in the US has said that child sexual abuse reporting may plummet due to encryption. Social media companies are legally obligated to send any evidence of child sexual abuse material they detect to this group.
“Encryption on platforms without the ability to detect known child sexual abuse material and create actionable reports will immediately cripple online child protection as we know it,” said a spokesperson from NCMEC, which sends evidence shared by social media companies to relevant domestic and international law enforcement agencies.
“NCMEC anticipates the number of reports of suspected child sexual abuse from the larger reporting companies will plummet by close to 80%,” it added.
What is end-to-end encryption
End-to-end encryption is essentially a security tool that enables companies to keep user data private and secure. Meta said the feature is the result of years of investment and testing.
Default end-to-end encryption adds an extra layer of security to protect the content of users’ messages and calls with friends and family. With it in place, nobody, including Meta, will be able to see what’s sent or said, unless users choose to report a message to the company.
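In broad terms, end-to-end encryption means a message is encrypted on the sender’s device and can only be decrypted on the intended recipient’s device, so the service carrying it sees only scrambled data. The short Python sketch below illustrates that general idea using the PyNaCl library’s public-key Box; it is a simplified illustration only, not Meta’s actual Messenger protocol.

from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"See you at 7?")

# The relaying server only ever handles `ciphertext`, which it cannot read.
# Only Bob, holding his private key, can decrypt the message.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"See you at 7?"

Real messaging systems layer key exchange, key rotation and multi-device management on top of this basic idea, but the core property is the same: the platform relaying the messages cannot read them.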
What Meta has to say on criticism
A Meta spokesperson said that, alongside the rollout, the company has been strengthening its enforcement systems to root out potentially predatory accounts.
“We don’t think people want us reading their private messages so have spent the last five years developing robust safety measures to prevent and combat abuse while maintaining online security. These include defaulting anyone who is under the age of 16 to more private settings when they join Facebook and limiting adults from sending private messages to teens if they aren’t friends,” the spokesperson was quoted as saying.
Gail Kent, director of messaging policy at Meta, said that the company has been working on machine learning technology to detect publicly posted signals that will help it identify potential predators.


