Ireland's Regulators Launch Investigations into TikTok and LinkedIn over DSA Violations
A growing list of tech giants has found itself on the wrong side of Irish regulators as Coimisiún na Meán, the country's media watchdog, has announced investigations into two major platforms: TikTok and LinkedIn. The probes center on the companies' reporting tools for illegal content, with concerns raised over how those tools are implemented and presented.
Critics claim that both platforms' reporting features may be deceptive, leading users to believe they are reporting content as illegal when the report is, in fact, handled only as an alleged breach of terms and conditions. This raises fundamental questions about the effectiveness of these mechanisms, which are meant to combat online abuse while also protecting free speech.
According to John Evans, Coimisiún na Meán's DSA Commissioner, the Digital Services Act (DSA) relies on accurate reporting tools that enable users to identify and report illegal content. The regulator emphasizes that providers must design their interfaces in a way that avoids deceiving or manipulating users, ensuring they can make informed decisions.
The investigations follow a pattern of tech companies making significant changes to their reporting mechanisms after warnings from Irish regulators. Failure to comply with the DSA can result in substantial fines of up to six percent of global annual turnover, placing pressure on platforms to prioritize transparency and cooperation.
Meanwhile, a separate investigation is underway into the social media platform X, which allegedly trained its AI assistant on user posts in a manner that would breach the General Data Protection Regulation (GDPR). If X is found to have broken the rules, it could face a fine of up to four percent of its global revenue, highlighting the strict consequences for companies that disregard data protection laws.