Meta, YouTube verdict can ripple through social media markets worldwide


After days of deliberation, a jury in a Los Angeles court on Wednesday found Meta and YouTube negligent in a landmark trial brought by a 20-year-old woman who said her use of social media in her teens addicted her to the platforms and worsened her depression.

Meta and YouTube were ordered to pay $3 million in damages to the plaintiff, identified as K.G.M. There may be additional damages. Meta and YouTube said they will appeal. 

“The era of Big Tech invincibility is over – this ruling is an earthquake that shakes Big Tech’s predatory business model to its core,” Sacha Haworth, executive director of The Tech Oversight Project, said in a statement. “New evidence and testimony have pulled back the curtain and validated the harms young people and parents have been telling the world about for years. These products were purposefully designed to harm, addict millions of young people, and lead to lifelong mental health consequences.”

The complaint argued that Instagram, Facebook, YouTube, TikTok and Snapchat “rewired how our kids think, feel, and behave,” by knowingly designing addictive products that exposed children to harm. Snap and TikTok settled with the plaintiff before the trial began.  

The verdict could serve as a “bellwether” for similar cases in the future, in the U.S. and elsewhere, with many analysts likening it to the cases against Big Tobacco that forced companies to change their business practices.

Thousands of lawsuits have been filed in the U.S. alone, and days earlier, a court in New Mexico state ordered Meta to pay $375 million after the company was found to have concealed what it knew about child sexual exploitation on its platforms.

Outside the U.S., in countries that are some of the biggest markets for social media platforms, the focus has so far been on society-wide harms such as disinformation and hate speech. Countries including India and Indonesia have introduced laws for quick content removal, blaming the sites for failing to moderate hateful content. 

Facebook played a role in spreading hate speech in Myanmar, which led to the genocide against Rohingya Muslims in 2017, the United Nations and human rights groups have said. Just a year later, Facebook apologized for failing to curb inflammatory posts against Muslims in Sri Lanka, which led to deadly attacks. The platform also failed to curb the spread of posts inciting violence in Ethiopia during a civil war, human rights groups have said. 

Meta and Google will likely redirect additional time and resources to addressing their services in the U.S., which “could mean fewer resources devoted to trust and safety in countries outside the U.S., increasing the risks and the likelihood of safety failures beyond what users in the Majority World already experience,” Kate Ruane, a director at the Center for Democracy and Technology, told Rest of World.

Many of these countries are now acting to protect young users. Brazil recently introduced a child-safety law with mandatory age verification, restrictions on design practices that encourage compulsive screen use, and new obligations for platforms to combat digital crimes. Indonesia and Malaysia are also considering laws to keep young users off social media sites, as are several Indian states.

“The ruling is an important precedent as far as platform accountability goes—and in recognizing the role of algorithms in scaling harm,” Sabhanaz Rashid Diya, executive director at digital rights organization Tech Global Institute and a former public policy head for Meta in Bangladesh, told Rest of World.

But there is a risk that countries may rush to impose even more restrictions on children’s access to social media, Diya said. 

“Well before the verdict, we’ve been seeing age assurance and age gating picking up in several Global South countries,” she said. “There should absolutely be more discussions and guardrails on advertising and targeting children, parental controls, etc. [But] the verdict does set a dangerous precedent by potentially risking an end to end-to-end encryption … taking children off social media, or requiring backdoor access to user data for child safety.”
