A federal trial against Meta, the parent company of Facebook and Instagram, has begun in connection with hundreds of personal injury lawsuits alleging that social media companies prioritized profits over child safety. The case centers on a 19-year-old woman who claims she suffered depression, anxiety, self-harm, and suicidal thoughts after spending hours a day on YouTube and Instagram.
The trial's significance lies not only in the individual case but in its role as a potential bellwether for hundreds of similar lawsuits filed against social media companies. Plaintiffs allege that Meta prioritized profits over child safety by designing features, including infinite scroll and autoplay, that keep users engaged for longer periods.
Experts testifying in the case claim that social media addiction is real and that platforms' design features contribute to its development. This includes Kara Bagot, a licensed clinical psychologist, who stated that "social media overuse and addiction causes or plays a substantial role in causing or exacerbating psychopathological harms in children and youth."
The trial has drawn attention because of internal Meta research made public in the litigation, which appears to show that the company's design features were intended to keep users engaged for extended periods. An email from Mark Zuckerberg stated that teen engagement was a top priority in 2017, while a 2020 document detailed the company's plan to keep kids engaged "for life."
The jury will have access to this internal research, as well as testimony from Arturo Bejar, a former Meta safety researcher who claims to have witnessed design defects on platforms that can cause harm to minors.
Meta has argued that because the plaintiffs did not read its terms of service, they cannot claim they would have benefited from posted warnings. However, a judge ruled that the experts' opinions are relevant and may be considered by the jury.
The outcome of this trial could have significant implications for social media companies, potentially leading to billions of dollars in damages and changes in platform design. The case highlights the growing concern over the impact of social media on children's mental health and the need for greater regulation of these platforms.