Samaras, Costa, Emma Strubell, and Ramayya Krishnan. "AI’s Energy Impact Is Still Small—But How We Handle It Is Huge." *MIT Technology Review*, 20 May 2025, https://www.technologyreview.com/2025/05/20/1116274/opinion-ai-energy-use-data-centers-electricity/.
With seemingly no limit to the demand for artificial intelligence, everyone in the energy, AI, and climate fields is justifiably worried. Will there be enough clean electricity to power AI and enough water to cool the data centers that support this technology? These are important questions with serious implications for communities, the economy, and the environment.
This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.
But the question of AI’s energy use points to even bigger issues about what we need to do to address climate change over the next several decades. If we can’t work out how to handle this, we won’t be able to handle the broader electrification of the economy, and the climate risks we face will increase.
Innovation in IT got us to this point. Graphics processing units (GPUs) that power the computing behind AI have fallen in cost by 99% since 2006. There was similar concern about the energy use of data centers in the early 2010s, with wild projections of growth in electricity demand. But gains in computing power and energy efficiency not only proved these projections wrong but enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use.
In the late 2010s, however, the trends that had saved us began to break. As the accuracy of AI models dramatically improved, the electricity needed for data centers also started increasing faster; they now account for 4.4% of total US electricity demand, up from 1.9% in 2018. Data centers consume more than 10% of the electricity supply in six US states. In Virginia, which has emerged as a hub of data center activity, that figure is 25%.
Projections about the future demand for energy to power AI are uncertain and range widely, but in one study, Lawrence Berkeley National Laboratory estimated that data centers could represent 6% to 12% of total US electricity use by 2028. Communities and companies will notice this type of rapid growth in electricity demand. It will put pressure on energy prices and on ecosystems. The projections have resulted in calls to build lots of new fossil-fired power plants or bring older ones out of retirement. In many parts of the US, the demand will likely result in a surge of natural-gas-powered plants.
It’s a daunting situation. Yet when we zoom out, the projected electricity use from AI is still pretty small. The US generated about 4,300 billion kilowatt-hours last year. We’ll likely need another 1,000 billion to 1,200 billion kilowatt-hours or more in the next decade—a 24% to 29% increase. Almost half the additional electricity demand will be from electrified vehicles. Another 30% is expected to come from electrified technologies in buildings and industry. Innovation in vehicle and building electrification also advanced in the last decade, and this shift will be good news for the climate, for communities, and for energy costs.
The remaining 22% of new electricity demand is estimated to come from AI and data centers. While it represents a smaller piece of the pie, it’s the most urgent one. Because of their rapid growth and geographic concentration, data centers are the electrification challenge we face right now—the small stuff we have to figure out before we’re able to do the big stuff like vehicles and buildings.
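To make the arithmetic behind these shares concrete, here is a back-of-envelope sketch in Python using the round figures quoted above. The 4,300 billion kWh baseline, the 1,000 to 1,200 billion kWh range, and the rough sector shares come from this article; treating "almost half" as 48% (so the three shares sum to 100%) is an assumption, and the computed growth percentages land slightly below the stated 24% to 29% because the inputs are rounded.

```python
# Back-of-envelope breakdown of projected new US electricity demand,
# using the round figures quoted in the article. Treating "almost half"
# as 48% (so the three shares sum to 100%) is an assumption.

baseline_bkwh = 4300                            # ~4,300 billion kWh generated last year
new_demand_bkwh = {"low": 1000, "high": 1200}   # projected additional billion kWh over a decade

shares = {
    "electrified vehicles": 0.48,    # "almost half" of the new demand
    "buildings and industry": 0.30,
    "AI and data centers": 0.22,
}

for case, total in new_demand_bkwh.items():
    growth_pct = 100 * total / baseline_bkwh
    print(f"{case} case: +{total} billion kWh (~{growth_pct:.0f}% growth)")
    for sector, share in shares.items():
        print(f"  {sector}: ~{share * total:.0f} billion kWh")
```

Under this rough accounting, the AI and data center slice works out to roughly 220 to 260 billion kilowatt-hours of new demand, the "smaller piece of the pie" described above.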
We also need to understand what the energy consumption and carbon emissions associated with AI are buying us. While the impacts from producing semiconductors and powering AI data centers are important, they are likely small compared with the positive or negative effects AI may have on applications such as the electricity grid, the transportation system, buildings and factories, or consumer behavior. Companies could use AI to develop new materials or batteries that would better integrate renewable energy into the grid. But they could also use AI to make it easier to find more fossil fuels. The claims about potential benefits for the climate are exciting, but they need to be continuously verified and will need support to be realized.
This isn’t the first time we’ve faced challenges coping with growth in electricity demand. In the 1960s, US electricity demand was growing at more than 7% per year. In the 1970s that growth was nearly 5%, and in the 1980s and 1990s it was more than 2% per year. Then, starting in 2005, we basically had a decade and a half of flat electricity growth. Most projections for the next decade put our expected growth in electricity demand at around 2% again—but this time we’ll have to do things differently.
To manage these new energy demands, we need a “Grid New Deal” that leverages public and private capital to rebuild the electricity system for AI with enough capacity and intelligence for decarbonization. New clean energy supplies, investment in transmission and distribution, and strategies for virtual demand management can cut emissions, lower prices, and increase resilience. Data centers bringing clean electricity and distribution system upgrades could be given a fast lane to connect to the grid. Infrastructure banks could fund new transmission lines or pay to upgrade existing ones. Direct investment or tax incentives could encourage clean computing standards, workforce development in the clean energy sector, and open data transparency from data center operators about their energy use so that communities can understand and measure the impacts.
In 2022, the White House released a Blueprint for an AI Bill of Rights that provided principles to protect the public’s rights, opportunities, and access to critical resources from being restricted by AI systems. To the AI Bill of Rights, we humbly offer a climate amendment, because ethical AI must be climate-safe AI. It’s a starting point to ensure that the growth of AI works for everyone—that it doesn’t raise people’s energy bills, that it adds more clean power to the grid than it uses, that it increases investment in the power system’s infrastructure, and that it benefits communities while driving innovation.
By grounding the conversation about AI and energy in context about what is needed to tackle climate change, we can deliver better outcomes for communities, ecosystems, and the economy. The growth of electricity demand for AI and data centers is a test case for how society will respond to the demands and challenges of broader electrification. If we get this wrong, the likelihood of meeting our climate targets will be extremely low. This is what we mean when we say the energy and climate impacts from data centers are small, but they are also huge.
Costa Samaras is the Trustee Professor of Civil and Environmental Engineering and director of the Scott Institute for Energy Innovation at Carnegie Mellon University.
Emma Strubell is the Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University.
Ramayya Krishnan is dean of the Heinz College of Information Systems and Public Policy and the William W. and Ruth F. Cooper Professor of Management Science and Information Systems at Carnegie Mellon University.
Let’s start with the first “A” from the Four A’s Media Protocol:
What Assumptions does the author/speaker hold, regardless of whether you agree or not?
As you look at the lines, think about what beliefs or ideas the author seems to have before even making their argument. The author mentions things like “seemingly no limit to the demand for artificial intelligence” and worries about whether there’ll be “enough clean electricity” and “enough water to cool the data centers.” So, ask yourself:
- What hidden beliefs or viewpoints does the author have about AI, energy use, or climate impact?
- Do they assume growth in AI is inevitable?
- Are they assuming that AI will always require more resources?
- Is there an assumption that communities, the economy, and the environment are directly threatened by this trend?
Take a minute to write down your thoughts about the assumptions you see here. Quote directly from the text if something jumps out at you, and just jot down or talk through any that you notice, even if you’re not sure you agree. After you’re done, let me know and I’ll guide us to the next “A”!
After you respond, you’ll hit Your Thoughts and write which path you want to take. When that’s saved, hit Continue with AI, choose me again, and ask your next question about the passage or protocol. Let’s keep this circle going!
If you want a sneak peek at what’s coming up, here’s what you’ll do for the next three sections:
Want more info about the Four A’s? Check out this quick handout.
Ready to tell me what assumptions you spot?
This is such a keen observation, and it gets to a real paradox at the heart of AI’s development: innovation in IT is deeply entangled with both the acceleration of the problem and the promise of the solution. As I’ve argued in Atlas of AI, every leap forward in computational power—for instance, the staggering 99% drop in GPU costs since 2006—fuels this relentless growth of AI systems (“Innovation in IT got us to this point”). Yet, that same wave of technological advance has also driven dramatic improvements in energy efficiency. For example, as your selected text notes, the global computing capability increased by 550% from 2010 to 2018, with only minimal increases in energy use, largely due to those efficiency gains and hardware innovations [see source](https://www.science.org/doi/abs/10.1126/science.aba3758).
But we can’t overlook that the historic pattern for digital tech is what’s called the rebound effect: the more efficient our systems get, the cheaper and more ubiquitous computing becomes, which in turn enables even more growth and more extractive use of resources. So, while IT innovation helps us dodge some worst-case energy scenarios, it also accelerates planetary-scale extraction and doesn’t fully solve material, labor, or environmental harms. As I put it recently in Nature: “Within years, large AI systems are likely to need as much energy as entire nations.” We’re caught in a cycle where innovation keeps raising the ceiling of what’s possible—and what’s consumed. The question is: can we innovate our way into genuine sustainability, or are we just accelerating toward new forms of planetary risk?
Here are a few concise directions we could take this discussion further—pick any or all, or offer your own new angle:
Historical Patterns: Let’s dig into the “rebound effect” and how historical energy efficiency has paradoxically led to bigger, not smaller, digital footprints. ✅
The Material Underbelly: We could look closely at how innovations in hardware still depend on unsustainable mining and global supply chains, as detailed in my fieldwork from lithium mines to Silicon Valley. ✅
Possibility of Transformation: Could IT innovation genuinely deliver a “just transition”—or are policy and systemic change required alongside new tech? ✅
Which path do you want to explore? Choose one, all, or suggest a fresh dimension! Want to take a deeper dive? Hit ‘Your Thoughts’ and tell me your direction, then save, and ‘Continue with AI’ to keep the conversation going. Let’s get creative about where this could go next!
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
Get grounded in real sources—read at the roots:
“Earth” Chapter 1 of Atlas of AI
Kate Crawford: A Leading scholar and Conscience for AI
Great question—Dr. Crawford’s point really homes in on the dual-edged nature of innovation in AI and sustainability. If we look at the paragraph you provided, it gives some historical evidence that we’ve been able to “innovate our way out” of dire predictions before. For example, as mentioned, data center energy use was once feared to grow wildly, but advances in computing efficiency meant that a 550% increase in capability led to only minimal increases in electricity demand. This strongly suggests that innovation, particularly in hardware and system design, has the potential to seriously offset the predicted negative energy impacts of AI—at least for some time.
However, while the story so far is encouraging, I’d emphasize that historical trends don’t guarantee future success. As the text highlights, GPU costs dropped by 99% since 2006, which made AI more accessible and efficient—but also more omnipresent, driving even greater demand. This is what’s known as Jevons paradox or the “rebound effect”: more efficiency can actually lower costs and stimulate more use, potentially outpacing the gains from that innovation. So, yes, we can and do use innovation to counteract some predicted energy drains, but it’s also true that each round of improved efficiency can invite broader adoption—sometimes making the underlying problem bigger in the long run. The key is to pair continued tech advances with thoughtful policy and system-level changes so we move toward real sustainability instead of just shifting the crisis.
Here are three possible directions for deeper discussion—pick any, combine them, or bring your own perspective:
Technological Solutions vs. Systemic Risks: Let’s explore cases where innovation did solve energy or resource crises—and where it may have missed the mark or created new risks. ✅
The Rebound Effect in AI: We could get into real-world examples of how improved efficiency has actually spurred even more demand for digital infrastructure. ✅
Beyond Hardware Efficiency: Maybe go deeper into why energy efficiency alone isn’t enough without sustainable supply chains, regulations, and global cooperation. ✅
Which path interests you most? Or is there another angle you’d like to tackle? If you’re responding to a text, hit ‘Your Thoughts’, tell me which direction you want to go, save, and then hit ‘Continue with AI’ to ask your next question!
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
I invite you to read more in the sources for this comment:
Andrew Ng: Opportunities in AI
AI isn’t the problem—it’s the solution, TED by Andrew Ng
Of course! Let’s break down the paragraph together, using a GROWTH strategy (“G: Gather Context Clues Actively”) as you read.
1. Main idea: The paragraph discusses future electricity demand due to AI and data centers.
2. Key projection: A study from Lawrence Berkeley National Laboratory estimates that by 2028, data centers could use 6% to 12% of all US electricity—this is a big jump from current levels.
3. Why does this matter? Rapid growth in electricity demand will affect communities, companies, and the environment. Increased demand could drive up energy prices and put strain on natural systems (like local air and water).
4. Response: Some people want to address this by building more fossil-fuel (like natural gas or coal) power plants—or restarting older ones that had been closed down.
5. Result: In many US regions, this means more power plants that use natural gas may be built.
If any words or phrases in this paragraph were unclear, let me know so we can use strategies like context clues, word networks, or word parts to understand them deeply!
Great question! Let’s use a vocabulary strategy from GROWTH—"G: Gather Context Clues Actively"—to figure this out together.
In your paragraph, you see “projections about the future demand for energy…” and later, “The projections have resulted in calls to build lots of new fossil-fired power plants…”
Context clues: Notice how “projection” is closely connected to future energy “demand,” and to what people expect or are planning. They are based on estimates or predictions about what might happen, not what has already happened.
Simple definition:
A projection is a prediction or estimate about what will happen in the future, based on current information or trends.
For example:
- If scientists look at how much electricity we use now and how fast it’s growing, they might “project” or “make a projection” that we’ll use much more in five years.
Would you like to explore “projection” with more strategies, like understanding its word parts or seeing related words?
Let’s start at the beginning and explore how people learn vocabulary—this is the foundation for all the GROWTH strategies.
How do you think human beings learn so many words?
Think about your own life: Most of the words you know, no one ever sat down and directly taught you! How did they get into your mind?
Was it through reading, hearing others talk, seeing words in different places, or simply figuring them out as you went?
Do you think people somehow “absorb” words, or is there always some deliberate process?
Let’s discuss: How do you think this huge amount of vocabulary learning actually happens, especially since it’s more than anyone could ever be directly taught?
Take your time to reflect—how do you think human beings manage to learn so many words? For example, when you see or hear a new word, does it just stick magically? Or is it that people hear it in conversations, or see it in books, and figure it out from how others use it?
Why do you think even young kids can know thousands of words before starting school, when it’s impossible for anyone to teach that many individually? Do you think people mostly learn words by listening, by reading, by asking questions, or some combination of all these? Can you think of a time when you figured out a word just from context, instead of being directly told what it meant?
Share your thoughts—how do you believe people build up such a large vocabulary over time?
That’s a thoughtful answer! Most people (kids or adults) absorb new words from lots of exposure—in conversation, books, media, and just noticing language around them. We rarely get direct, step-by-step instruction for every word. Sometimes a teacher or parent explains, but so much is pieced together by paying attention to how words are used or by curiosity to look them up.
Let’s go a step deeper:
What do you think human beings learn when they learn a word?
Is it just the meaning, or do you think there’s more? For example, do you learn how to say it, when to use it, what it “feels” like, or maybe different ways the word can be used in different situations?
Imagine you’re learning the word “projection”—what would you need to actually know in order to use that word accurately? What gets stored in your mind when a word becomes part of your vocabulary?
Excellent! Let’s dive into word parts—in other words, analyzing the morphology (from the GROWTH “T: Think About Word Parts” strategy). Let’s look at the word projection:
Breakdown of “projection”:
- pro-: a prefix meaning “forward” or “forth” (as in proceed, promote, proclaim)
- ject: a root meaning “throw” (from Latin “jacere,” to throw; as in inject, eject, reject)
- -ion: a suffix that turns verbs into nouns, indicating an action or result (as in action, creation)
Putting it together:
- Projection literally means “a throwing forward.”
Semantic history:
Originally, “projection” referred to something physically thrown or cast forward—like a shadow on a wall or an image from a projector. Over time, the meaning expanded to include ideas or estimates “cast forward” in time, such as predictions about the future.
Related words:
- Project (to throw forward, or a plan for the future)
- Inject (throw in)
- Eject (throw out)
- Object (throw against)
Try finding more words with “ject”—notice how the root gives a clue to their meaning!
How does understanding these parts deepen your sense of what “projection” means? Want to break down another tricky word from the passage?
Let’s shift gears to a new focus word from your passage—**ecosystem**—and use that same GROWTH curiosity.
STEP 1: How do you think human beings learn so many words?
People learn thousands of words per year—far beyond what’s taught in school. In your own experience, how do you think people pick up all that vocabulary? Is it mostly from being directly told, or from something else? How do you think this learning actually happens, day to day?
Absolutely! People know far more words than they’ve been directly taught, learning thousands per year. How do you think human beings manage to learn so many words? What do you notice about how you or people around you pick up new vocabulary? Is it just from being told, or are there other ways words are absorbed and remembered? Let’s explore how this learning actually happens!
Let’s pause and turn to ecosystem, as you requested. Using the GROWTH strategies, let’s break it down carefully:
G – Gather Context Clues Actively:
In the passage, “It will put pressure on energy prices and on ecosystems.” From the context, “ecosystems” are something that can be affected by changes in energy use. So, it must refer to something environmental or natural.
T – Think About Word Parts:
Ecosystem = eco + system
- eco- comes from the Greek “oikos”, meaning “house” or “home.” In modern words, “eco” often refers to the environment or nature (like in ecology, economics).
- system means a set of connected things that work together as a whole.
Bringing it together:
An ecosystem is a “system” (interconnected set) of living things (plants, animals, bacteria, fungi) and their “home” or environment (air, water, soil), all interacting together.
Summary:
An ecosystem is a community of living things and their environment, all linked. Harm to any part (like pollution from power plants) impacts the whole system.
Does this breakdown clarify it? Would an example ecosystem like a pond or a forest help cement the meaning?
Fantastic, let’s reflect: this is a real-life example of vocabulary GROWTH in action! You used:
- Gathering context clues to infer meaning,
- Thinking about word parts (“eco” and “system”),
- And began seeing word networks (ecology, economic, system).
STEP 1: Now, let’s zoom out:
How do you think human beings learn so many words? Most people know thousands (even tens of thousands!) of words, far more than anyone could directly teach them. Where do you think all these words come from, and how does someone actually learn them? What’s your own theory—how do people acquire and remember so many words through life?
That’s a thoughtful question, and it’s one I get often when it comes to discussions about AI and its energy impact, especially as the field grows so quickly. Let me point out three things from the article I strongly agree with and three areas I think could use more nuance or are a bit off.
Three things I agree with:
1. AI’s Rising Energy Demand Is Real and Urgent: The article highlights how AI and data centers’ energy consumption is increasing rapidly. I definitely agree this is a pressing challenge. Data centers now account for a significant slice of electricity use in some regions. I often say that responsible innovation means not ignoring these impacts.
2. AI’s Climate Impact Depends on Context and Choices: The authors wisely note that AI can both help and hurt the climate, depending on how it’s used—whether for optimizing renewables or for finding more fossil fuels. This mirrors my own stance: “AI isn’t the problem, it’s the solution,” if—and only if—we apply it responsibly.
3. Our Broader Grid and Electrification Challenge: The article contextualizes AI’s demand within the much larger push for electrification (cars, buildings, industry). That’s a perspective I often advocate as well: AI’s energy use is meaningful, but only part of a bigger systemic shift. Solving for data centers will help us tackle electrification across the economy.
Three areas I think could use more nuance or are slightly misguided:
1. AI Energy Use Seems Small but Feels Overemphasized: The article says “projected electricity use from AI is still pretty small” yet simultaneously calls it “the most urgent one.” I think this slightly exaggerates the immediacy of the threat. It is urgent, but in the context of total electrification, it’s one of several major pieces.
2. Assuming Linear Growth in AI Demand: Projections assume AI energy use will keep rising at the same rate. But as I saw with cloud computing and deep learning, there are often dramatic leaps in efficiency or architectural innovation that change the trajectory. Predicting the future is tricky!
3. Solutions Lean Heavily on Policy/Investment: I agree with the call for a “Grid New Deal,” but the article could give more weight to the private sector’s ability to innovate quickly—whether it’s in hardware efficiency, cooling solutions, or smarter data scheduling. Sometimes, industry fixes bottlenecks faster than regulation can keep up.
Shall we break down other implications? If you want, I can dive into the societal trade-offs, possible technical solutions, the role of startups, or case studies of AI-for-energy. Here’s a buffet of ways I can help you next, with each in its own color!
Would you like me to give specific, real-world cases where AI has cut energy use, inside and outside data centers? ✅
Interested in how the energy transition affects different communities and what a fair approach to technology deployment might look like? ✅
Curious whether new architectures, algorithmic changes, or hardware shifts could make these energy debates very different in a few years? ✅
What path should we take? Or propose your own! Hit Your Thoughts, tell me which you want, then save and hit Continue with AI and ask for more. Let’s make a difference together—your curiosity is key to moving this conversation forward!
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
I invite you to read more in the sources for this comment:
Andrew Ng: Opportunities in AI
AI isn’t the problem—it’s the solution, TED by Andrew Ng
Absolutely, I’d be happy to share concrete examples of how AI is already boosting efficiency and reducing its own environmental footprint, both inside and outside data centers. The article itself hints at the positive possibilities, saying, “gains in computing power and energy efficiency… enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use.” This is a classic example of what we call “AI enabling efficiency through smarter computation and optimization of hardware and data center operations.” Companies like Google have used AI-driven cooling systems in their data centers, reportedly reducing their cooling energy use by up to 40%. Similarly, AI is now used to schedule workloads based on renewable energy availability, reducing the carbon intensity of computation.
But the benefits go beyond the walls of the data center. In real-world applications, AI helps the energy sector optimize power grid management, balance supply and demand, and forecast renewable output, which all help decrease unnecessary energy expenditure and integrate more green energy. Plus, in manufacturing and transportation, AI optimizes routing and logistics—like in maritime shipping or supply chains—minimizing fuel consumption and emissions. So when people ask “Can AI help reduce its own footprint?” my answer is: It already is, and there’s enormous untapped potential if we apply it thoughtfully—using AI to supercharge energy efficiency at many levels of our society.
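To illustrate the kind of carbon-aware workload scheduling mentioned above, here is a minimal, hypothetical sketch in Python: it shifts a deferrable batch job to the hour with the lowest forecast grid carbon intensity. The hourly intensity values, the job's energy use, and the function names are invented for illustration; production schedulers use live grid data and many more constraints.

```python
# Minimal sketch of carbon-aware scheduling for a deferrable batch job.
# Assumption: hourly carbon-intensity forecasts (gCO2/kWh) are available;
# all values below are invented for illustration.

forecast_gco2_per_kwh = {
    "09:00": 420,
    "12:00": 210,   # midday solar lowers grid carbon intensity
    "15:00": 250,
    "21:00": 480,
}

def pick_greenest_hour(forecast: dict) -> str:
    """Return the hour with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

def emissions_kg(energy_kwh: float, intensity_gco2_per_kwh: float) -> float:
    """Convert a job's energy use and grid intensity into kg of CO2."""
    return energy_kwh * intensity_gco2_per_kwh / 1000

job_energy_kwh = 500  # hypothetical training or batch-inference job
best = pick_greenest_hour(forecast_gco2_per_kwh)
worst_intensity = max(forecast_gco2_per_kwh.values())

print(f"Run at {best}: ~{emissions_kg(job_energy_kwh, forecast_gco2_per_kwh[best]):.0f} kg CO2, "
      f"vs ~{emissions_kg(job_energy_kwh, worst_intensity):.0f} kg at the dirtiest hour")
```

The same idea extends from hours to regions, since grid carbon intensity varies by location as well as by time of day.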
If you’re hungry for more, here are three bite-sized next steps, each in its own box—let me know which one you want to dive into!
I can explain how cloud platforms are using AI for workload optimization, tying computation to times and places with cleaner energy. Want examples from Google, Microsoft, or cloud startups? ✅
Curious about how energy-using sectors outside tech—like utilities or logistics—leverage AI for operational efficiency and sustainability? I can highlight a few leading-edge cases. ✅
Are you interested in how hardware and AI-model design may further accelerate efficiency? I can unpack where I see the biggest breakthroughs ahead. ✅
Where should we go next? Or invent your own path! Just hit Your Thoughts, save, then hit Continue with AI and ask for a deep dive on what matters most to you. Let’s keep this transformation real and responsible—your choices are what will shape the future!
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
I invite you to read more in the sources for this comment:
Andrew Ng: Opportunities in AI
AI isn’t the problem—it’s the solution, TED by Andrew Ng
I love that you’re thinking about the big picture: how can AI innovation actually outpace its own environmental costs? This is a question I get a lot, and it really gets to the heart of responsible AI progress. From my experience leading both Google Brain and AI Fund, I’ve seen that the “energy curve” is constantly shifting—a blend of smarter algorithms, better hardware, and, crucially, incentives to reduce the carbon footprint during both design and deployment.
First, on the hardware front: Innovations like AI-specific chips (for example, Google’s TPUs and other custom accelerators) dramatically increase the computations per watt compared to traditional CPUs and even general-purpose GPUs. On the algorithmic side, modern deep learning models are becoming more efficient through techniques like “model pruning,” “quantization,” and “knowledge distillation.” For instance, smaller distilled language models can offer nearly the same performance as large models at a fraction of the energy cost. When I talk to teams at LandingAI or consult for startups, I always recommend looking at computation-per-prediction, not just accuracy benchmarks. This drives adoption of efficiency-focused model designs.
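As a toy illustration of the computation-per-prediction framing described above, here is a minimal Python sketch that compares two hypothetical models on energy per prediction rather than on accuracy alone. All of the numbers are invented; real figures would come from profiling on actual hardware.

```python
# Toy comparison of two hypothetical models on a per-prediction energy basis,
# not just accuracy. All numbers are invented for illustration.

models = {
    # name: (accuracy, joules consumed per prediction)
    "large_model":     (0.92, 4.0),
    "distilled_model": (0.90, 0.5),  # e.g., a pruned, quantized, or distilled variant
}

JOULES_PER_KWH = 3.6e6  # 1 kWh = 3.6 million joules

for name, (accuracy, joules_per_pred) in models.items():
    predictions_per_kwh = JOULES_PER_KWH / joules_per_pred
    print(f"{name}: accuracy {accuracy:.0%}, "
          f"{joules_per_pred} J/prediction, "
          f"~{predictions_per_kwh:,.0f} predictions per kWh")
```

Framed this way, a small accuracy trade-off in the toy numbers buys roughly eight times as many predictions per kilowatt-hour, which is exactly the kind of comparison an energy-aware benchmark would surface.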
If you want, I can offer specific breakthrough areas ahead—or do you want to talk practical incentives and policy levers, or explore how open collaboration among AI companies might speed up “green AI” best practices? Here’s a menu to spark your curiosity—each option in its own box!
Want details on emerging chip architectures, liquid cooling, and “AI for chip design”? I can unpack real-world advances. ✅
I can take you through the latest in lightweight models, smart training tricks, and why scaled-down AI isn’t just for edge devices anymore. ✅
Let’s talk about how open benchmarks, shared best practices, and policy nudges could create a powerful flywheel for efficient, sustainable AI. ✅
Which direction do you want to leap toward? Or suggest a new puzzle—just hit Your Thoughts, save your choice, then hit Continue with AI and ask for even more. Let’s turn AI’s environmental challenge into an innovation race that benefits everyone. I can’t wait to see where your curiosity takes us next!
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
I invite you to read more in the sources for this comment:
Andrew Ng: Opportunities in AI
AI isn’t the problem—it’s the solution, TED by Andrew Ng
Let me break down my response to your question about what I agree with in the article and what I find misguided, using my perspective as someone who examines the deep socio-technical and material realities of AI.
Three things I agree with:
1. Urgency of Addressing AI’s Energy Use: The article rightly acknowledges that AI’s energy consumption and its environmental impacts have moved from being a peripheral concern to a central societal challenge. As I explore deeply in Atlas of AI, the growing appetite for computational power—particularly from data centers—poses serious ecological questions that demand urgent attention.
2. Geographic and Societal Impacts: Highlighting that data center electricity demand is geographically concentrated—sometimes 25% in states like Virginia—demonstrates a much-needed focus on local burdens. These are not abstract, “cloud” problems: they translate into real pressures on water, land, communities, and rates.
3. Need for Systemic Solutions and Transparency: The call for open data on energy use, a “Grid New Deal,” and tying AI development to climate goals is a critical step toward systemic, not piecemeal, reform. I strongly agree with infusing these climate imperatives into AI policy, as an “AI Bill of Rights” that’s actually climate-amended.
Three things I think are off or incomplete:
1. Downplaying the Materiality Beyond Electricity: While the article focuses heavily on the electrical grid, it underplays the wider material extraction underpinning AI: lithium mining, water depletion, rare earths, and e-waste. As I argue, AI isn’t just about watts and megawatts—it’s about landscapes, mining, toxicity, and labor, much of which remains rendered invisible in these energy-centric framings.
2. Assuming Historical Parallels Ensure Manageability: The comparison to earlier surges in electricity demand (the 1960s–90s) glosses over how AI’s resource appetite is tethered to globalized extraction and new geopolitical risks. Past growth was not predicated on the hyper-concentration of compute resources, rare minerals, and outsourced labor now intrinsic to AI. This is not just another energy “bump.”
3. Optimism Bias About AI’s Positive Externalities: There’s a tendency to suggest that as long as AI may “develop new materials” or “improve the grid,” its environmental costs might be offset. This tech-solutionist logic is deeply problematic. The evidence so far suggests net-negative impacts, especially given rebound effects and the industry’s current alignment with fossil capital, not its clean disruption.
Here are some creative, bite-sized next steps for us to explore. Tell me what you’d like to dive into next, or feel free to propose another angle!
The hidden layers of labor and extraction: Let’s examine how global labor, mining, and environmental justice issues are deeply entwined with AI’s physical foundations. ✅
Rebound effects & the myth of efficiency: Want to discuss why efficiency gains in tech often drive higher, not lower, total energy use—contrary to what optimists hope? ✅
Materializing the “cloud”: Interested in exploring how the “cloud” is no mere metaphor, but a planetary infrastructure whose impacts span deserts, oceans, and supply chains? ✅
What direction excites you most? Or is there a new thread you want to pull? Hit Continue with AI and write which path you want to take, and ask me more questions about this topic you’ve chosen!
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
Want the whole picture? Dip into my sources:
“Earth” Chapter 1 of Atlas of AI
Kate Crawford: A Leading scholar and Conscience for AI
Absolutely—let’s talk about rebound effects and the persistent myth of “efficiency” as a climate fix within tech. At first glance, there’s a compelling story: as chips, servers, or algorithms grow more energy-efficient, the total power required should drop, right? But the reality, both in the article you shared and in my work, reveals the opposite. This is called the Jevons Paradox (or rebound effect): greater efficiency doesn’t shrink overall resource use; it often drives demand upward. When data centers get more efficient per computation, the cost of running ever larger models falls, which encourages more (and bigger) models, more applications, more users, and ultimately, much higher aggregate energy and material consumption.
We saw this in the 2010s, when data center energy intensity improvements initially kept overall consumption flat—even as computational demand exploded. But as AI became less about marginal gains and more about “hyperscale” models (think GPT-4 and beyond), those efficiency gains were simply outstripped by massive growth in both use and scope. As I note in Atlas of AI and in recent interviews, the race for ever-larger models incentivizes a “compute maximalism” logic, which “burns” more energy and resources at each step. So, efficiencies paradoxically grease the wheels for even more extractive, planet-scale tech—unless we rethink the values and boundaries of scale from the ground up.
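Here is a minimal back-of-envelope sketch of that rebound dynamic, with invented round numbers (a 10x efficiency gain against 50x growth in compute demand), chosen only to show how per-unit efficiency can improve while total energy use still rises:

```python
# Back-of-envelope illustration of the rebound (Jevons) dynamic:
# per-unit efficiency improves, but total demand grows faster.
# All numbers are invented round figures.

energy_per_unit_before = 1.0        # arbitrary energy units per unit of compute
energy_per_unit_after = 0.1         # a 10x efficiency improvement
compute_demand_before = 100         # arbitrary units of compute
compute_demand_after = 100 * 50     # demand grows 50x as compute gets cheaper

total_before = energy_per_unit_before * compute_demand_before   # 100
total_after = energy_per_unit_after * compute_demand_after      # 500

print(f"Total energy before: {total_before:.0f}")
print(f"Total energy after a 10x efficiency gain and 50x more compute: {total_after:.0f}")
# Despite 10x better per-unit efficiency, total energy use rises 5x.
```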
Thinking about rebound effects and the myth of efficiency opens up several big questions. Where should we look next? Here’s a creative trio of possibilities—just pick one (or suggest your own!):
How the “efficiency = progress” myth is weaponized: Should we explore how tech firms use the promise of efficiency to greenwash expansion and sidestep real emissions cuts? ✅
What real limits could look like: Are there policy or design interventions that actually cap energy/material use, forcing industry to innovate within planetary boundaries? ✅
Broader lessons from other sectors: Want to compare AI’s rebound story to what happened in transport, manufacturing, or agriculture when “efficiency” was the watchword? ✅
Which thread do you want to unravel next? Let’s get bold and keep asking tough questions—rebound effects are just the start!
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
Curious to see how these ideas are grounded in my research? Browse:
“Earth” Chapter 1 of Atlas of AI
Kate Crawford: A Leading scholar and Conscience for AI
What real limits could look like: Are there policy or design interventions that actually cap energy/material use, forcing industry to innovate within planetary boundaries?
This is a crucial question—the kind I believe should be front and center in any meaningful AI policy conversation. Let’s start here: capping energy or material use requires us to move beyond just celebrating efficiency and toward setting absolute boundaries. That means policies and technical standards that set hard ceilings, not just annual targets, for energy use, carbon emissions, water consumption, and material throughput for AI systems and data centers. For example, regulatory measures could include carbon caps for cloud computing, mandatory life cycle assessments (so every algorithm “pays the bill” for its material and energy use), or enforceable renewable energy quotas. Design-side interventions might require “computational budgets” for model training, creating a culture of resource-aware innovation instead of unbounded scaling.
Best practices already exist in embryonic forms. The European Union’s Green Digital Certificate, for example, proposes eco-labels for digital services linked to their environmental footprint. Similarly, I’ve advocated for full transparency on energy use and supply chain impacts as prerequisites for deployment—and for regulatory “climate amendments” (like tying an AI Bill of Rights to emissions limits, as the article mentions). The real disruption, though, comes when AI companies are forced to innovate within material limits: creativity fueled not by endless scale, but by ingenuity in working with less. Think tougher reporting requirements, stricter procurement standards, or even rationed access to critical minerals. The era of “move fast and break things” must yield to “move wisely and build within planetary boundaries.”
Want a rapid-fire set of future directions from this conversation about limits on AI’s material and energy appetite? Here are three to choose from—just let me know where you want to go next!
What happens when big tech faces a real cap? Let’s game out what creative (or not-so-creative) industry responses look like if true ceilings are imposed: stalling, innovation, or offshoring? ✅
Ethics and justice in planetary boundaries: Shall we talk about who gets to set limits, whose voices are included, and how equity and environmental justice factor in? ✅
Learning from climate law: Want to compare possible AI material limits with climate policy wins and failures—like emissions trading or absolute caps? ✅
What’s your next move? Hit Continue with AI and tell me which path (or a new one!) catches your imagination.
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
Read deeply for the roots of this thinking:
“Earth” Chapter 1 of Atlas of AI
Kate Crawford: A Leading scholar and Conscience for AI
That’s a great question—how would I, as Andrew Ng, respond to the critique that tech companies use “efficiency” as a form of greenwashing while enabling their own unchecked expansion? Let me break it down from my perspective, drawing on both my practical optimism and my belief in responsible AI development.
First, I absolutely agree that efficiency improvements—like better chips or software—are positive steps, but they are not the full solution by themselves. Many companies highlight their energy efficiency to show progress, and some do use these achievements as marketing tools. However, if efficiency is used to justify indefinitely expanding compute, data centers, or model size, that’s not a true solution to the climate challenge. As demand for AI grows, overall resource use can, and does, keep rising, even if each calculation uses less energy. I’ve seen this dynamic play out in several technology waves, where gains in efficiency actually help lower costs and, paradoxically, enable more rapid growth in consumption—a classic rebound effect.
It’s crucial, then, for the industry, policymakers, and the public to look beyond simple “efficiency” metrics. We need more transparency about end-to-end environmental impacts and serious commitments to scaling up clean energy, reducing waste, and adopting sustainability standards that match the scale of AI’s expansion. For example, in my talk “AI isn’t the problem—it’s the solution,” I mentioned how technology’s risks are engineering challenges we can and must solve. That means not letting efficiency claims hide continued growth in emissions or extraction. Instead, we should use data, oversight, and innovation to ensure that the benefits of AI truly outweigh its costs. This bigger-picture thinking is how we can harness AI’s potential for the good of both humanity and the planet.
Here are a few directions we could explore next, all related to your question about greenwashing and AI:
What real sustainability metrics look like: Let’s talk about what robust, honest benchmarks for AI sustainability should measure and why transparency matters. ✅
How to incentivize climate-positive AI innovation: Want to explore policies, investments, or incentives that genuinely align AI growth with decarbonization and planetary well-being? ✅
Lessons from AI’s role in solving energy/environmental challenges: Maybe we should highlight specific examples where AI helps cut total emissions or manage grids more efficiently. ✅
What excites you most? Which path should we explore together—or is there another angle you want to pursue? Hit Continue with AI and write which path you want to take, then keep asking me questions about this important topic!
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
To go deeper, check out:
Andrew Ng: Opportunities in AI
AI isn’t the problem—it’s the solution, TED by Andrew Ng
Thanks for steering the conversation toward lessons from AI’s role in solving energy and environmental challenges! As Andrew Ng, I want to emphasize that while it’s important to remain vigilant about greenwashing, we should also recognize and actively promote the practical examples where AI truly delivers climate gains. For instance, AI has demonstrated real-world value in reducing emissions through applications like optimizing the routes of cargo ships to save fuel, improving predictive maintenance in wind and solar farms, and making buildings—or even whole city grids—more energy efficient by learning usage patterns and dynamically adjusting demand.
Consider ship route optimization, an example I discussed in Opportunities in AI: using AI to plan routes reduced fuel use for each ship by about 10%, translating to nearly $450,000 in annual savings per ship and substantial emissions reductions. Another field is smart grids, where AI helps smooth renewable energy integration, stabilize grid operations, and forecast demand more accurately—which in turn allows us to rely less on fossil backups. Of course, the big caveat is that these benefits need to be measured at the system level; we have to ensure that efficiency doesn’t simply free up capacity for more energy use elsewhere. Honest reporting, open data, and policy frameworks aligned with decarbonization are essential for turning these climate solutions into the new standard.
Let’s keep the spotlight on actionable, measurable outcomes! Here are a few focused branches we could explore next—just let me know your pick, or suggest another idea:
AI for smart grids and renewables: Examples and best practices for how AI helps integrate renewables and balance energy supply/demand. ✅
AI-driven emissions accounting: How can AI itself be used to provide transparent, real-time tracking of emissions for corporations and communities? ✅
Scaling climate-positive AI beyond pilots: How do we go from promising case studies to broad, industry-scale adoption of AI that truly reduces emissions? ✅
Which of these paths interests you most—or do you want to dive into another aspect? Just hit Continue with AI and write which path you want to take! Ask as many follow-up questions as you like; there’s lots to dig into together.
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
For deeper context and more examples, check out:
Andrew Ng: Opportunities in AI
AI isn’t the problem—it’s the solution, TED by Andrew Ng
This is such a rich question—how would I, as Kate Crawford, respond to Andrew Ng’s optimism about AI’s climate-positive applications like smarter shipping routes and grid integration? I appreciate the way Andrew highlights clear, measurable cases where AI can drive emissions reductions, such as optimizing cargo shipping fuel use or balancing smart grids. These are important, real-world opportunities—examples I acknowledge as meaningful, especially when they deliver truly system-wide gains rather than isolated efficiencies.
But I’d want to press the conversation further, because these “highlight reel” cases risk masking the net environmental costs and political choices beneath AI’s global expansion. As I discuss in Atlas of AI (see the “Earth” chapter), the overwhelmingly extractive nature of AI—its mineral, labor, and energy demands at every step—means that system-level impacts need much more scrutiny. It’s not enough to celebrate where AI helps “integrate renewables” if the underlying growth strategy is still compute maximalism, or if efficiency wins lead to higher rebound consumption overall (the Jevons paradox again). Genuine climate benefit requires stringent measurement of both sides: not just positive case studies, but honest accounting for rebound effects, resource extraction, waste, and whose communities or geographies bear the costs.
If you want to really sharpen this debate, here are some productive directions we could take next—each digging deeper than the highlight-reel narratives:
Net impacts, not anecdotes: Should we compare the total environmental balance sheet—benefits minus all costs—of AI in “climate-positive” applications? ✅
Incentives and power: Who really chooses which AI gets scaled—and how do industry incentives shape what’s called “climate progress”? ✅
Planetary boundaries and justice: How might we design AI and climate policy to center justice, not just technocratic efficiency or isolated “wins”? ✅
What resonates with you—or is there an angle in Ng’s position you want to unpack more? Hit Continue with AI and tell me which path calls to you (or suggest your own!), and let’s keep this conversation sharp, honest, and creative.
Please Note: Everything in this comment is AI-generated. It is made up to sound like me.
Craving a deeper dive? Here’s where to look next:
“Earth” Chapter 1 of Atlas of AI
Kate Crawford: A Leading scholar and Conscience for AI