TikTok users' fears about the platform's alleged shift toward censorship in favor of a "MAGA" agenda are justified, experts say. A recent spate of technical issues and content moderation errors has sparked concerns that TikTok is intentionally suppressing anti-Trump content, which some argue is a symptom of deeper problems with the app's algorithm.
The problems began after US owners took over the app from its Chinese parent company, following Trump's promise to make the platform more "MAGA" (Make America Great Again) aligned. While TikTok has denied any intentional censorship, pointing to a recent power outage at a US data center, experts say the inconsistencies in moderation and content curation remain suspicious.
Ioana Literat, an associate professor of technology, media, and learning, argues that even if the technical issues are genuine, they reveal underlying biases in the platform's design. "Users' fears are absolutely justified," she says. "When your 'bug' consistently affects anti-Trump content, Epstein references, and anti-ICE videos, you're looking at either spectacular coincidence or systems that have been designed—whether intentionally or through embedded biases—to flag and suppress specific political content."
Literat points out that TikTok users are savvy and have developed the digital literacy to recognize patterns of censorship. They've witnessed similar issues on other platforms like Instagram and Twitter, which has shaped their expectations.
However, not everyone agrees that the technical explanations should be dismissed. David Greene, senior counsel at the Electronic Frontier Foundation, cautions users against jumping to conclusions about TikTok's intentions. "For years, TikTok users were being told that they just needed to follow these assumptions the government was making about the dangers of TikTok," he says.
The situation is further complicated by Trump's comments about wanting to make TikTok more aligned with his agenda. This has raised concerns that the platform may be vulnerable to manipulation and propaganda. "I don't see how it'd be good for users or for democracy, for TikTok to have an editorial policy that would make Trump happy," Greene says.
Ultimately, TikTok's fate hangs in the balance, and experts predict a gradual erosion of trust rather than a mass exodus. "TikTok is where their communities are, where they've built audiences, where the conversations they care about are happening," Literat notes. As users adapt to the changes, they may develop workarounds, shift to other platforms for political content, or create coded languages and aesthetic strategies to evade detection.
The situation highlights the delicate balance between free speech and algorithmic moderation on social media platforms. While TikTok insists it remains neutral, users are clearly watching closely, waiting to see how the platform evolves under new ownership.