Pause AI and Stop AI: Meet the anti-AI groups facing questions after the attack on Sam Altman
The attempted firebombing of OpenAI CEO Sam Altman’s San Francisco home last Friday, allegedly carried out by 20-year-old Daniel Moreno-Gama, has drawn attention to two anti-AI groups with similar names: Pause AI and Stop AI. Both have condemned the violence and said the suspect is not and never was a member of their organizations.
Still, the incident, in which Moreno-Gama also went to OpenAI’s headquarters, tried to shatter the building’s glass doors with a chair, and threatened to burn the facility, has surfaced his activity on Pause AI’s Discord server and renewed scrutiny of Stop AI’s direct actions targeting OpenAI last year.
A movement built on slowing AI
Pause AI, founded in Utrecht, Netherlands, in May 2023 by Joep Meindertsma, aims to halt what it calls “dangerous frontier AI” and staged its first protest outside Microsoft’s lobbying office in Brussels. The group, whose name was inspired by an open letter from the Future of Life Institute in March 2023 (which is also now its largest single funder), has since grown into a global grassroots movement with local chapters. That includes a separate organization called Pause AI US, led by Berkeley-based Holly Elmore, who has a PhD in evolutionary biology from Harvard and previously worked at a think tank focused on wildlife animal welfare.
Moreno-Gama was linked to comments on Pause AI’s Discord server, including one post, dated Dec. 3, 2025, that read: “We are close to midnight, it’s time to actually act.” Pause AI said the suspect joined its server two years ago and posted a total of 34 messages, none of which “contained explicit calls to violence.”
Elmore told Fortune that she had been on her way to Washington, D.C., last week to finish preparing for a peaceful demonstration on Capitol Hill and meetings with members of Congress when the attempted firebombing occurred. “When I landed, suddenly I was getting these questions about somebody who had attacked Sam Altman’s house,” she said. “It’s been back and forth between working on something that I feel really proud and positive about, and it’s just exactly the right kind of change to be making—democratic change through democratic means—and then having to comment on this horrible event and additionally being really smeared with a connection to this event.”
The group has “no reason to think that this person had much to do with us,” she added, pointing out that Pause AI’s stance on violence “has always been incredibly clear” and explicitly prohibits it. She also emphasized that the activity occurred on a public, global Discord server distinct from Pause AI US’s organizing channels, and said the suspect “didn’t get any further in onboarding or having any official role.”
Elmore added that Pause AI deliberately vets volunteers and keeps tight control over its messaging to avoid being associated with extreme views.
But Nirit Weiss-Blatt, an independent researcher who has long followed both groups and writes the newsletter AI Panic, pointed to a 2024 documentary, Near Midnight in Suicide City, in which For Humanity podcast host John Sherman interviews Elmore as she holds up a sign reading, “Humanity can’t survive smarter-than-human AI.”
Weiss-Blatt said the film shows Elmore urging activists to understand what she describes as an urgent timeline toward potential human extinction. “She’s never advocating violence, but is raising the stakes about doom,” Weiss-Blatt said.
“When prominent AI doomers like Eliezer Yudkowsky—author of If Anyone Builds It, Everyone Dies—keep insisting that human extinction is imminent, it should not be surprising when someone is driven to extreme action,” she added. “Young, anxious followers, looking for purpose, can be radicalized by apocalyptic AI rhetoric, even without explicit calls for violence.”
However, Mauro Lubrano, a lecturer at the University of Bath and author of Stop the Machines: The Rise of Anti-Technology Extremism, cautioned that there is a clear distinction between groups that seek to eradicate technology violently and those advocating for regulation or a pause. “I think it’s easy to conflate all of these groups and movements that are trying to raise awareness of some of the dangers of AI,” he said.
A break over tactics—and a turn to direct action
The incident at Altman’s home occurred about five months after OpenAI told employees at its headquarters to shelter in place because a 27-year-old man named Sam Kirchner threatened to go to several OpenAI offices in San Francisco to “murder people,” according to callers who notified police that day. Kirchner was a cofounder of Stop AI, a group he launched in 2024 with 45-year-old Guido Reichstadter, both of whom had previously been involved in Pause AI.
Guido Reichstadter, a cofounder of Stop AI, at a 2022 protest for abortion rights. Drew Angerer—Getty Images
“I kicked them out,” said Elmore, who added that the split stemmed from disagreements over tactics.