Artificial intelligence is using more power than many people realize. From the chatbots we use daily to powerful systems behind the scenes, AI needs lots of electricity to work.
Recent research shows AI’s power use is growing fast. By one estimate, training GPT-4, one of the models behind ChatGPT, used as much electricity as 175 American homes do in a year.
“What we’re seeing is just the beginning,” says Dr. Emma Chen at the Berkeley Energy Institute. “As AI systems get more complex, their energy needs increase dramatically.”
This matters for our planet. Most electricity still comes from burning fossil fuels. When AI systems use more power, they can add to climate change.
Tech companies know this problem exists. Google and Microsoft have built special computer chips that use less energy for AI tasks. They’re also placing data centers in cooler climates, where less energy is needed to keep the machines from overheating.
Some experts worry we’re moving too fast. “We’re rolling out AI everywhere before we understand the full environmental impact,” warns climate scientist James Rivera.
The energy problem gets bigger as more people use AI. Every time someone asks ChatGPT a question, servers in a data center somewhere draw electricity to generate the answer.
Some AI systems now run directly on phones and laptops instead of in big data centers. This approach, called “edge computing,” can use less total energy by cutting out the round trips to remote servers.
Companies are finding creative solutions. Meta recently designed an AI system that uses 90% less energy than earlier versions. Others are exploring ways to power AI with renewable energy.
Schools are also teaching future technologists about energy-efficient AI design. “Students today need to think about both what AI can do and what resources it requires,” says education researcher Priya Sharma.
The future might bring better options. Quantum computing could someday run AI with much less energy. New computer chips designed specifically for AI tasks continue to improve.
For now, we face important choices about AI’s growth. Should we limit certain AI uses that consume too much energy? How can we make sure the benefits of AI are worth the environmental costs?
As AI becomes part of more products and services, these questions will affect everyone. The decisions made today about AI and energy will shape our climate tomorrow.
“The challenge isn’t stopping AI development,” says environmental policy expert Thomas Lin. “It’s making sure that development happens in a way that respects planetary boundaries.”
For everyday users, small changes help too. Using AI tools less frequently or choosing energy-efficient devices can reduce your digital carbon footprint.
The next few years will be critical. As AI systems grow more powerful, finding ways to power them sustainably becomes one of technology’s biggest challenges.