Last December the state of New Mexico sued Meta for failing to protect children, claiming that Facebook and Instagram algorithms recommended sexual content to minors. Now, an unredacted internal Meta presentation has been revealed, with the company's own employees estimating that 100,000 child users were harassed daily, The Wall Street Journal reported.
According to a 2021 internal document, Facebook's "People You May Know" (PYMK) algorithm was singled out as a primary connector of children to predators. When employees reported those findings to Meta executives, the executives reportedly rejected recommendations to redesign the algorithm so it would stop suggesting adults to minors.
The feature was responsible for 75 percent of all inappropriate adult-minor contact, according to one employee. "How on earth have we not just turned off PYMK between adults and children?" another employee said. "It's really, really upsetting," added another.
The issues were particularly insidious on Instagram, according to an internal 2020 memo, with "sex talk" 38 times more prevalent on that platform than on Facebook Messenger in the US. In one case, an Apple executive reported that his 12-year-old child was solicited on Instagram. "This is the kind of thing that pisses Apple off to the extent of threat[en]ing to remove us from the App Store," said an employee charged with addressing the issue.
New Mexico claims that Meta has failed to address large-scale predation on its platform, particularly around recommendation algorithms. State investigators set up phony accounts posing as children but registered with adult dates of birth, since kids often misstate their ages to access online services they aren't supposed to use. The investigators then implied that the accounts were being used by children, with one posting about a lost baby tooth and starting seventh grade. The suit alleged that, among other things, the accounts were sent child sex images and offers to pay for sex.
The state further alleges that Meta leaders didn't take action to limit adult predation on minors until late 2022, and even then stopped short of the stricter measures recommended by safety staff. Instead, it only attempted to block suggestions to adults who had previously demonstrated suspicious behavior toward children. However, according to a Meta study, 99 percent of accounts disabled for grooming children had failed to state their age.
Meta recently introduced measures for teen users on Instagram and Facebook, including stopping non-followers from messaging them and blocking offensive comments. On top of the New Mexico complaint, Meta is facing lawsuits from 41 states alleging that it harms the mental health of its youngest users. Another recently unsealed complaint filed by 33 states alleges that Meta "coveted and pursued" users under the age of 13 and has been dishonest about how it handles underage users' accounts when they're discovered.
This article originally appeared on Engadget at https://ift.tt/HiwdvNb