In the rapidly evolving world of technology, few topics are as pressing as the environmental impact of artificial intelligence. Today, we’re thrilled to sit down with Laurent Giraud, a renowned technologist with deep expertise in AI, particularly in machine learning, natural language processing, and the ethical implications of these technologies. Laurent has dedicated much of his career to exploring how AI can be harnessed responsibly, with a keen focus on mitigating its carbon footprint. In this conversation, we dive into the energy demands of generative AI, the hidden emissions tied to data centers, innovative strategies for reducing environmental impact, and the future of sustainable AI development.
How does generative AI contribute to climate change in ways that might surprise the average person?
Most people don’t realize just how energy-intensive generative AI is. When you interact with a chatbot or generate an image, behind the scenes, massive data centers packed with powerful processors are working overtime. These systems, especially the GPUs used for training and running AI models, consume enormous amounts of electricity—often sourced from fossil fuels. Beyond that, the sheer scale of building and maintaining these data centers, from the concrete and steel to the cooling systems, contributes a significant share of emissions known as embodied carbon. It’s not just about flipping a switch; it’s the entire lifecycle of the tech that’s driving up the carbon footprint.
What do you see as the biggest driver of AI’s environmental impact—energy consumption during operation or something else?
While operational energy use—powering the servers and cooling systems—is a huge factor, I’d argue that the embodied carbon from constructing data centers is just as critical, though often overlooked. Building these facilities requires vast amounts of resources, and as demand for AI grows, so does the need for more data centers. Forecasts suggest global electricity demand from data centers could double by 2030, and a big chunk of that will likely be met by fossil fuels. Both aspects—operation and construction—need urgent attention if we’re to tackle AI’s climate impact holistically.
How does the energy demand of AI data centers stack up against other major industries?
It’s staggering when you look at the numbers. Data centers supporting AI and other digital services are projected to consume around 945 terawatt-hours of electricity a year by 2030, which is more than the total annual electricity consumption of a large country like Japan. Compared to industries like manufacturing or transportation, data centers have a uniquely high energy density, sometimes 10 to 50 times that of a typical office building per unit of floor space. While transportation still accounts for a larger share of global emissions, the rapid growth of AI’s energy appetite makes it a rising concern, especially since much of that power isn’t yet coming from renewable sources.
Why is the energy use of data centers growing at such an alarming rate?
The explosion of generative AI applications is a major driver. Training large models, like those used for natural language processing or image generation, requires thousands of GPUs running simultaneously for days or even weeks. On top of that, as more businesses and consumers adopt AI tools, the demand for real-time processing and deployment skyrockets. We’re also seeing a cultural shift—everyone wants faster, more powerful AI, which pushes companies to scale up infrastructure quickly, often outpacing the transition to cleaner energy grids. It’s a perfect storm of technological advancement and energy demand.
Can you explain the difference between operational and embodied carbon in the context of AI infrastructure?
Absolutely. Operational carbon refers to the emissions produced from the day-to-day running of data centers—think of the electricity needed to power servers, GPUs, and cooling systems. Embodied carbon, on the other hand, comes from the upfront impact of building the data center itself. That includes everything from extracting and processing the raw materials for steel and concrete to manufacturing hardware and laying miles of cabling. While operational carbon gets more attention because it’s ongoing, embodied carbon can be just as significant, especially as we build more facilities to keep up with AI’s growth.
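To make the distinction concrete, here is a rough back-of-the-envelope sketch in Python. Every figure in it (server power draw, grid carbon intensity, embodied emissions, hardware lifetime) is an illustrative assumption, not a number from the interview.

```python
# Back-of-the-envelope comparison of operational vs. embodied carbon
# for a single GPU server. All figures are illustrative assumptions.

SERVER_POWER_KW = 6.5             # assumed average draw of an 8-GPU server
GRID_INTENSITY_KG_PER_KWH = 0.4   # assumed grid mix, kg CO2e per kWh
EMBODIED_KG_CO2E = 9_000.0        # assumed manufacturing/construction share
LIFETIME_YEARS = 5                # assumed service life

HOURS_PER_YEAR = 24 * 365

operational_per_year = SERVER_POWER_KW * HOURS_PER_YEAR * GRID_INTENSITY_KG_PER_KWH
embodied_per_year = EMBODIED_KG_CO2E / LIFETIME_YEARS  # amortized over lifetime

print(f"Operational: {operational_per_year:,.0f} kg CO2e per year")
print(f"Embodied (amortized): {embodied_per_year:,.0f} kg CO2e per year")
```

Note how the balance shifts with the grid: on a cleaner grid the operational term shrinks while the embodied term stays fixed, which is exactly why embodied carbon deserves more attention as grids decarbonize.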
What are some practical ways to cut down on operational carbon emissions in AI data centers?
There are several approaches that mirror energy-saving tactics we use at home. One simple idea is to “turn down” the GPUs—reducing their power consumption to about a third of their max capacity. Research has shown this barely impacts AI model performance while slashing energy use and making cooling easier. Another strategy is using less power-hungry hardware for specific tasks, or even stopping the training of models early when high accuracy isn’t critical—like for some e-commerce recommendation systems. Additionally, optimizing workloads to avoid wasted computing cycles can make a big difference in reducing unnecessary energy drain.
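As a concrete illustration of the “turn down the GPUs” idea, here is a minimal sketch that caps an NVIDIA card’s power draw before launching a training run. The 150-watt cap (roughly a third of a 450-watt card’s maximum) is an assumption for illustration; the right value depends on your hardware, and setting the limit typically requires administrator privileges.

```python
# Minimal sketch: cap a GPU's power draw with nvidia-smi before training.
# Assumes NVIDIA hardware; changing the limit usually requires root.
import subprocess

def cap_gpu_power(gpu_index: int, watts: int) -> None:
    """Set a power limit (in watts) on one GPU via nvidia-smi."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

# Example: cap a 450 W card to roughly a third of its maximum,
# trading a little speed for a large cut in energy and cooling load.
cap_gpu_power(gpu_index=0, watts=150)
```

The early-stopping idea Laurent mentions is similarly small in practice: most training frameworks expose a callback that halts a run once validation accuracy is good enough for the task.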
How are advancements in AI technology helping to make it more energy-efficient?
We’re seeing incredible progress on multiple fronts. Hardware innovations, like denser transistor arrays on chips, are boosting the amount of computation GPUs can handle per unit of energy—improving by 50 to 60 percent annually in some cases. Beyond that, new AI model architectures are being designed to solve complex problems faster with less power. I’ve also been excited about concepts like the “negaflop,” which refers to computations you don’t need to perform thanks to smarter algorithms or techniques like neural network pruning. These advancements mean that in a few years, we might do the same tasks with much smaller, less energy-intensive models.
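To show what a negaflop looks like in code, here is a minimal pruning sketch using PyTorch’s built-in utility: the smallest-magnitude weights are zeroed out, so the multiply-accumulates they would have required are simply never performed. Turning that sparsity into real energy savings also takes sparse-aware kernels or structured pruning, which this toy example leaves out.

```python
# Minimal sketch of magnitude pruning with PyTorch's pruning utility.
# Zeroed weights represent "negaflops": computations never performed.
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(1024, 1024)  # stand-in for one layer of a larger model

# Zero out the 30% of weights with the smallest absolute values.
prune.l1_unstructured(layer, name="weight", amount=0.3)

sparsity = (layer.weight == 0).float().mean().item()
print(f"Layer sparsity after pruning: {sparsity:.0%}")
```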
Why does the timing and location of AI workloads matter so much for reducing emissions?
Timing and location are game-changers because not all electricity is equal in terms of carbon impact. The carbon footprint of a kilowatt-hour can vary wildly depending on the time of day or the energy mix of a region. By scheduling AI tasks to run when renewable energy sources like solar or wind are more abundant on the grid, we can significantly cut emissions. Similarly, placing data centers in areas with cooler climates or access to clean energy—like northern regions with hydropower—reduces both cooling costs and reliance on fossil fuels. It’s about working smarter with the resources and conditions we have.
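Carbon-aware scheduling can be surprisingly simple at its core. Below is a minimal sketch that defers a batch job until the grid’s carbon intensity drops below a threshold; the intensity feed and the 200 g CO2e/kWh threshold are hypothetical placeholders, since real deployments would query a grid operator’s API or an internal forecast service.

```python
# Minimal sketch of carbon-aware scheduling: hold a deferrable batch job
# until grid carbon intensity falls below a threshold. The data source
# is a placeholder; wire it to a real intensity feed for your region.
import time
from typing import Callable

THRESHOLD_G_PER_KWH = 200  # assumed "clean enough" level, g CO2e per kWh
POLL_SECONDS = 15 * 60     # re-check the grid every 15 minutes

def current_grid_intensity() -> float:
    """Placeholder: return current grid carbon intensity in g CO2e/kWh."""
    raise NotImplementedError("connect this to your region's intensity feed")

def run_when_grid_is_clean(job: Callable[[], None]) -> None:
    """Block until the grid is clean, then run the deferrable job."""
    while current_grid_intensity() > THRESHOLD_G_PER_KWH:
        time.sleep(POLL_SECONDS)  # wait for more wind and solar on the grid
    job()
```

The same logic extends to location: given intensity feeds for several regions, a scheduler can route a job to whichever data center is greenest at that moment.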
What role do you think AI itself can play in addressing broader climate challenges?
AI has immense potential to be part of the solution. It can optimize renewable energy systems by predicting solar or wind output more accurately, or help identify the best spots for new clean energy facilities. It’s also being used for predictive maintenance of green infrastructure, like solar panels, to maximize their efficiency. On a larger scale, AI can analyze vast datasets to guide climate policy, ensuring efforts target the biggest emission sources. If we can harness these capabilities while curbing AI’s own footprint, we’re looking at a powerful tool for sustainability.
What’s your forecast for the future of sustainable AI development over the next decade?
I’m cautiously optimistic. Over the next ten years, I expect we’ll see a major push toward smaller, more efficient AI models that deliver the same results with a fraction of the energy. Innovations in hardware and algorithms will likely continue to outpace the raw growth in demand, and I think we’ll see more data centers powered by renewables as grids catch up. However, this hinges on collaboration—between tech companies, policymakers, and researchers—to prioritize sustainability over unchecked expansion. If we get it right, AI could become a net positive for the climate, but it’s going to take deliberate, collective action to steer us there.