“IG is a drug”: Internal messages may doom Meta at social media addiction trial

A federal court trial of Meta, the parent company of Facebook and Instagram, has begun in connection with hundreds of personal injury lawsuits filed against social media companies over allegations that they prioritized profits over child safety. The case centers on a 19-year-old woman who says she developed depression, anxiety, self-harm, and suicidal thoughts after spending hours a day on YouTube and Instagram.

The trial's significance lies not only in the individual case but also in its role as a potential bellwether for hundreds of similar lawsuits filed against social media companies. Plaintiffs allege that Meta prioritized profits over child safety by designing features that keep users engaged for longer periods, including infinite scroll and autoplay.

Experts testifying in the case contend that social media addiction is real and that platforms' design features contribute to its development. Among them is Kara Bagot, a licensed clinical psychologist, who stated that "social media overuse and addiction causes or plays a substantial role in causing or exacerbating psychopathological harms in children and youth."

The trial has gained attention due to the disclosure of internal Meta research that appears to show the company's design features were intended to keep users engaged for extended periods. An email from Mark Zuckerberg identified teen engagement as a top priority in 2017, while a 2020 document detailed the company's plan to keep kids engaged "for life."

The jury will have access to this internal research, as well as testimony from Arturo Bejar, a former Meta safety researcher who says he witnessed design defects on the company's platforms that can harm minors.

Meta has argued that because the plaintiffs never read its terms of service, they cannot claim to have relied on the warnings posted there. A judge nevertheless ruled that the experts' opinions are relevant and will be put before the jury.

The outcome of this trial could have significant implications for social media companies, potentially leading to billions of dollars in damages and changes in platform design. The case highlights the growing concern over the impact of social media on children's mental health and the need for greater regulation of these platforms.
 