The University of Houston hired an artificial intelligence company, Dataminr, to monitor its students' social media activity and chat logs using an AI tool called First Alert. The move is part of a broader trend of US universities hiring private companies to gather open-source intelligence on student-led movements for Palestine. Dataminr markets First Alert as a situational-awareness tool for law enforcement, but it relies on an algorithm that ingests massive amounts of data and flags content without human oversight.
The university used the system to flag potential incidents of concern, such as pro-Palestine chants or social media posts, and forward the information directly to campus police. In one instance, a University of Houston communications official received a First Alert notification based on chat logs scraped from a semi-private Telegram channel called "Ghosts of Palestine." The alert flagged a potential incident because the chat mentioned students demanding an end to genocide.
The system was not used exclusively for student protests. Administrators also sought to monitor other forms of expression online, including students who had posted screenshots of Instagram posts. At the University of Connecticut, one administrator watched a group of protesters asleep in their tents and noted that they were "just beginning to wake up," with only a few police cars nearby.
Dataminr's services are used by newsrooms, corporate giants, and universities to gather intelligence and respond to perceived threats. The company has been implicated in several surveillance scandals, including the domestic monitoring of Black Lives Matter protesters in 2020 and abortion rights protesters in 2023.
In April 2024 alone, at least one University of Houston administrator received more than 900 First Alert emails. The volume illustrates how such systems can flood administrators with information and prompt them to act on alerts without fully understanding the context.
Critics counter that these systems undermine the very obligations universities claim to uphold. "Universities have a duty of care for their students and the local community," said Rory Mir, associate director of community organizing at the Electronic Frontier Foundation. "Surveillance systems are a direct affront to that duty. It creates an unsafe environment, chills speech, and destroys trust between students, faculty, and the administration."