What It Will Take to Make AI Sustainable | WIRED
Building AI sustainably seems like a pipe dream as tech giants that previously made promises to cut emissions have been racing to build out massive data centers powered by fossil fuels.
The rush to build out AI at all costs has been reinforced by the Trump administration, which is also rolling back environmental protections.
Despite these headwinds, Sasha Luccioni, an AI sustainability researcher, thinks that customer demand for more transparency in AI, from both businesses and individuals, is higher than ever.
Luccioni has become a leader in trying to create more transparency about AI’s emissions and environmental impacts in her four years at Hugging Face, an AI company, including pioneering a leaderboard documenting the energy efficiency of open-source AI models. She has also been an outspoken critic of major AI companies that, she says, are deliberately withholding energy and sustainability information from the public.
Now, she’s starting Sustainable AI Group, a new venture with former Salesforce sustainability chief Boris Gamazaychikov. They’ll focus on helping companies answer, among other things, “what are the levers that we can play with in order to make agents slightly less bad?” Luccioni is also interested in sussing out the energy needs of different types of AI tools, such as speech-to-text translation, or photo-to-video—an area that, she says, has so far been understudied.
Luccioni sat down exclusively with WIRED to talk about the demand for sustainable AI and what exactly she wants to see from Big Tech.
This interview has been edited for length and clarity.
WIRED: I hear a lot from individual people who are worried about the environment and AI use, but I don't hear as much from companies thinking about this. What have you heard specifically from folks who are working with AI in their business, and what are they worried about?
Sasha Luccioni: First of all, they are getting a lot of employee pressure—and board pressure, director pressure, like, “You need to be quantifying this.” Their employees are like, "You're forcing us to use Copilot—how does it affect our ESG goals?”
For most companies, AI has become a core part of their business offering. In that case, they have to understand the risks. They have to understand where models are running. They can't continue to use models where they don’t even know the location of the data centers or the grid they're connected to. They have to know what the supply chain emissions are, transportation emissions, all these different things.
It’s not about not using AI. I think we’re past that. It’s choosing the right models, for example, or sending the signal that energy source matters, so customers are willing to pay a little bit more for data centers that are powered by renewable energy. There are ways of doing it, and it's a matter of finding the believers in the right places.
I'd also imagine that for global companies, the sustainability situation is very different than in the US, right? The US government might not give a shit about this, but other governments certainly do.
In Europe, they have the EU AI Act. Sustainability has been a pretty big part of that since the beginning. They put a bunch of clauses in there, and now the first reporting initiatives are coming out.
Even Asia is trying to be more transparent. The International Energy Agency has been doing these reports [on AI and energy use]. I was talking to them, and they were like, other countries realize that the IEA gets their numbers from the countries, and the countries don't have these numbers for data centers specifically. They can't make future-looking choices, because they need the numbers to know "OK, well that means we need X capacity, in the next five years" or whatever. [Some countries] have started pushing back on the data center builders.
If you could wave a magic wand tomorrow and make Sam Altman, or Dario Amodei, or whoever, give you a piece of information that you've been looking for, what would it be? Or would you want them to generally be more open about what they have?
I wish there was a little meter or info box on the ChatGPT or Claude UI that tells you at the end of each query or conversation how much energy was used. Ideally, greenhouse gas emissions and how that energy was generated.
I think that it would be a market competitive advantage if one of the big model providers decided to make a bet on sustainability. Right now, they're all infighting and trying to one-up each other. If one of them was like, "OK, we're going to stop trying to create these data centers that are powered by natural gas, and we're going to make renewable data centers," I think that that could actually give them an advantage. It's like when Anthropic said no to the US government for military use. It did give them a boost.