Ampere’s Jeff Wittich: How Energy-Efficient Processors Can Cut AI’s Carbon Footprint

Artificial intelligence (AI) is rapidly changing the world, touching nearly every industry and aspect of our daily lives. The generative AI market alone is expected to reach a staggering $1.3 trillion by 2032, according to Bloomberg. However, this remarkable progress comes at a significant cost: energy consumption. The International Energy Agency predicts that AI's electricity usage will more than double between 2022 and 2026, driven by the immense computational resources and sprawling data centers required to power it.

Major companies are starting to address this growing energy concern. Microsoft, for instance, has acknowledged a 30% increase in its emissions since 2020, driven largely by the construction of new data centers. To counter this, it has invested heavily in carbon credits, purchasing allowances from companies with unused emissions headroom to offset its own increase. While carbon credits offer a stopgap, they are not a long-term answer to the emissions generated by AI: they move emissions from one ledger to another without reducing global energy usage or the overall carbon footprint.

To truly tackle AI's energy and emissions challenge, the tech industry needs to harness its core strength: innovation. That means looking at the source of the power draw – the processor. The AI boom, especially AI training, has fueled a surge in demand for power-hungry GPUs. At the same time, much of the existing server infrastructure is more than five years old and increasingly inefficient. The result is a data center capacity crisis: facilities often sit half-empty because a handful of installed servers is enough to hit their power limits. Companies respond by building new data centers to expand computing capacity, offsetting the added emissions with carbon credits to stay within their sustainability goals. But with data centers already accounting for around 2% of global electricity consumption, building more of them is no longer a sustainable answer to the capacity problem.
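
A quick back-of-the-envelope calculation makes the capacity squeeze concrete. The 40-60 kW AI rack figure comes from later in this article; the per-rack facility budget and rack count are illustrative assumptions, not figures from the source.

```python
# Back-of-the-envelope: why AI racks leave data center floor space stranded.
# Assumptions (illustrative only): a legacy hall provisioned at 8 kW per
# rack position, with 100 rack positions in total.
RACK_POSITIONS = 100
HALL_POWER_BUDGET_KW = 8 * RACK_POSITIONS  # total power the facility can deliver

AI_RACK_KW = 50  # midpoint of the 40-60 kW range cited in this article

# How many AI racks fit before the hall hits its power limit?
ai_racks = HALL_POWER_BUDGET_KW // AI_RACK_KW
occupied_fraction = ai_racks / RACK_POSITIONS

print(f"Power budget: {HALL_POWER_BUDGET_KW} kW across {RACK_POSITIONS} positions")
print(f"AI racks before the power limit: {ai_racks}")
print(f"Floor space actually used: {occupied_fraction:.0%}")
# -> 16 racks on 16% of the floor: the hall is power-bound, not space-bound.
```

Under these assumptions, the hall runs out of power with five-sixths of its floor space still empty, which is exactly why adding buildings rather than efficiency has become the default response.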

The answer lies in energy-efficient, scalable cloud-native processors. Replacing outdated systems with these processors and selecting the optimal processor for each task maximizes performance per rack, increases data center density, and provides the necessary computing power for AI adoption – all while minimizing environmental impact instead of simply offsetting it.

The rapid expansion of AI has driven a dramatic increase in energy usage, particularly in data centers, where some AI-ready racks consume a hefty 40-60 kilowatts each due to their GPU-dense hardware. This rising demand calls for direct, tangible ways to cut the carbon footprint; relying solely on carbon credits is not a sustainable long-term strategy. The AI industry spent $50 billion on GPUs last year to train advanced models, yet critics point out that the industry's revenue remains comparatively small, raising concerns about financial strain. This disparity underscores the need for more efficient and cost-effective approaches in the AI sector.

To make a real difference, companies need to move beyond carbon credits and adopt a fresh approach. Here are three key steps businesses can take to reduce their AI carbon footprint:

1. Right-size Inference Compute: Analyze AI workloads and identify those that can be processed efficiently with less powerful processors. This reduces overall energy consumption without sacrificing performance, as illustrated in the routing sketch after this list.

2. Switch to CPUs for Relevant AI Workloads: Modern CPUs offer high performance with significantly lower power requirements than GPUs. For tasks like AI inference, where real-time processing is critical, CPUs can deliver comparable performance while consuming less energy (see the CPU inference example after this list). Replacing outdated servers with new, energy-efficient CPU models can drastically reduce energy usage in data centers.

3. Refresh Aging Servers with Cloud Native Processors: Upgrading server infrastructure with cloud native processors enhances performance and reliability while reducing energy consumption. These processors are designed for AI inference tasks, ensuring efficiency even under demanding workloads; a consolidation calculation after this list shows the potential savings.
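
To illustrate step 1, here is a minimal sketch of how a serving layer might route each inference workload to the smallest adequate processor tier. The thresholds, tier names, and the `route_workload` helper are all hypothetical; real right-sizing would be driven by profiled latency and throughput data, not fixed cutoffs.

```python
# Hypothetical right-sizing heuristic (step 1): send each inference
# workload to the least power-hungry tier that can still meet its needs.
# Thresholds and tiers are illustrative assumptions, not vendor guidance.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    model_params_b: float  # model size in billions of parameters
    latency_slo_ms: float  # required p99 latency
    batch_size: int

def route_workload(w: Workload) -> str:
    """Pick the smallest adequate tier for an inference workload."""
    # Very large models or heavy batching: keep on GPU.
    if w.model_params_b > 13 or w.batch_size > 32:
        return "gpu"
    # Smaller models with tolerable latency budgets often fit on modern CPUs.
    if w.latency_slo_ms >= 50:
        return "cpu"
    return "gpu"

for w in [Workload("chat-small", 7, 200, 1),
          Workload("embeddings", 0.3, 100, 8),
          Workload("llm-70b", 70, 500, 16)]:
    print(f"{w.name}: routed to {route_workload(w)}")
```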
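For step 2, one common way to move an inference workload onto CPUs is ONNX Runtime with its CPU execution provider. This is a generic sketch, not Ampere-specific guidance: `model.onnx` and the input shape are placeholders for your own model, and the thread count should be tuned to the cores actually available.

```python
# Minimal CPU inference sketch (step 2) using ONNX Runtime.
# "model.onnx" and the input shape are placeholders for your own model.
import numpy as np
import onnxruntime as ort

opts = ort.SessionOptions()
opts.intra_op_num_threads = 8  # tune to the physical cores available

session = ort.InferenceSession(
    "model.onnx",
    sess_options=opts,
    providers=["CPUExecutionProvider"],  # pin inference to the CPU
)

input_name = session.get_inputs()[0].name
x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # example image batch

outputs = session.run(None, {input_name: x})
print(outputs[0].shape)
```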
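And for step 3, a quick consolidation calculation shows why a server refresh shrinks both footprint and energy use. The performance and power figures below are illustrative assumptions, not measurements from this article.

```python
# Consolidation sketch (step 3): replace aging servers with fewer,
# more efficient ones. All figures are illustrative assumptions.
OLD_SERVERS = 100
OLD_PERF_PER_SERVER = 1.0    # normalized throughput units
OLD_WATTS_PER_SERVER = 400

NEW_PERF_PER_SERVER = 3.0    # assume ~3x per-server performance
NEW_WATTS_PER_SERVER = 350

target_perf = OLD_SERVERS * OLD_PERF_PER_SERVER
new_servers = -(-target_perf // NEW_PERF_PER_SERVER)  # ceiling division

old_kw = OLD_SERVERS * OLD_WATTS_PER_SERVER / 1000
new_kw = new_servers * NEW_WATTS_PER_SERVER / 1000

print(f"Servers: {OLD_SERVERS} -> {int(new_servers)}")
print(f"Power:   {old_kw:.1f} kW -> {new_kw:.1f} kW "
      f"({1 - new_kw / old_kw:.0%} reduction)")
```

Under these assumptions, 100 aging servers collapse into 34 new ones, cutting power draw by roughly 70% for the same throughput, which is the density gain the article's cloud-native refresh argument rests on.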

Addressing AI’s energy challenges requires a holistic approach that goes beyond simply purchasing carbon credits. By right-sizing inference compute, switching to CPUs for relevant AI workloads, and refreshing aging servers with cloud native processors, companies can significantly reduce their AI carbon footprint. These strategies not only improve operational efficiency and sustainability but also offer significant cost savings.
