There is a conversation happening in boardrooms and sustainability committees right now that the technology function needs to be part of and, in many organizations, still is not. The company has made public commitments to carbon reduction targets. The Chief Sustainability Officer is reporting progress to the board. And simultaneously, the AI strategy is driving data center electricity consumption up by a factor of three. These two conversations are heading for a collision, and the CIO is sitting right in the middle of it.

The energy figures associated with AI at scale are genuinely striking. The International Energy Agency projects that data center electricity consumption could reach around 945 terawatt-hours by 2030, up from roughly 415 terawatt-hours in 2024. To put that in perspective, 945 terawatt-hours is comparable to Japan’s entire annual electricity consumption. A single large AI data center can now consume one gigawatt of power, equivalent to the electricity needs of roughly 876,000 households. AI is not a footnote in the energy conversation. It is becoming a headline.
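The household comparison follows from straightforward arithmetic, assuming an average household consumes roughly 10,000 kWh per year (about the US average):

$$
1\,\mathrm{GW} \times 8{,}760\,\mathrm{h/yr} = 8.76\,\mathrm{TWh/yr}, \qquad \frac{8.76 \times 10^{9}\,\mathrm{kWh/yr}}{10{,}000\,\mathrm{kWh/yr\ per\ household}} \approx 876{,}000 \text{ households}
$$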

PwC Belgium published analysis in 2025 showing that 85.5 percent of AI’s greenhouse gas impact comes from three sources: model training, inference operations, and network traffic. That breakdown matters because it points to where interventions are actually possible. Training large foundation models is energy-intensive, but most enterprise AI deployments are on the inference side, and inference optimization is an area where CIOs can have a real impact without waiting for hyperscalers to solve the problem upstream.
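To make the inference lever concrete, a rough per-request energy estimate can be built from accelerator power draw and serving throughput. The figures in this sketch (a 350 W accelerator serving 20 requests per second) are illustrative assumptions, not benchmarks:

```python
def inference_energy_kwh(gpu_power_watts: float,
                         requests_per_second: float,
                         num_requests: int) -> float:
    """Rough energy estimate for serving a batch of inference requests.

    Assumes constant average power draw while serving; real deployments
    also carry cooling and networking overhead, often modeled with a
    PUE multiplier on top of this figure.
    """
    seconds = num_requests / requests_per_second
    joules = gpu_power_watts * seconds
    return joules / 3_600_000  # joules -> kWh

# Illustrative figures only: a 350 W accelerator serving 20 req/s.
energy = inference_energy_kwh(gpu_power_watts=350,
                              requests_per_second=20,
                              num_requests=1_000_000)
print(f"~{energy:.1f} kWh per million requests")  # ~4.9 kWh
```

Even rough numbers like these let a team compare deployment options, model sizes, and batching strategies before committing, which is where inference optimization starts.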

The regulatory signal is already here

This is not just an ESG optics question. The EU AI Act’s General-Purpose AI obligations, which came into force in August 2025, explicitly include energy consumption reporting requirements for GPAI model providers. That reporting infrastructure is being built now, and the data it generates will be visible to regulators, customers, and investors. Organizations that have not started measuring their AI energy footprint will find themselves at a disadvantage when that information becomes a standard disclosure item.

IDC predicts that by 2027, 65 percent of CIOs will be directly responsible for integrating sustainability goals into technology project decisions. Gartner is already framing carbon intensity as a metric that technology leaders should optimize alongside latency, cost, and reliability. The trajectory is clear: green technology architecture is moving from nice-to-have to governance requirement.

What CIOs can actually do

The good news is that this is a problem with real levers. Six practical actions stand out, based on what leading organizations are doing in 2025. First, invest in prompt engineering training for your teams: poorly designed prompts drive unnecessary compute consumption, and a few hours of training on how to write efficient prompts can meaningfully reduce inference costs and energy use across the organization. Second, right-size your model selection: not every use case requires a frontier model, and smaller specialized models running on appropriate infrastructure are often more efficient and deliver comparable results for well-defined tasks, as the routing sketch below illustrates.
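As a minimal sketch of right-sizing, the router below sends short, well-defined tasks to a smaller model and reserves the large model for everything else. The model names and the length-based heuristic are illustrative assumptions; a production router would typically use a task taxonomy or classifier backed by evaluation data:

```python
# Minimal model-routing sketch. Model identifiers are placeholders,
# not real endpoints; substitute whatever your provider exposes.
SMALL_MODEL = "small-efficient-model"  # hypothetical: cheaper, lower energy
LARGE_MODEL = "frontier-model"         # hypothetical: reserved for hard tasks

SIMPLE_TASKS = {"classification", "extraction", "summarization"}

def choose_model(task_type: str, prompt: str) -> str:
    """Route well-defined, short tasks to the smaller model.

    Deliberately crude heuristic (task type plus prompt length) for
    illustration; validate any routing rule against quality metrics.
    """
    if task_type in SIMPLE_TASKS and len(prompt) < 2000:
        return SMALL_MODEL
    return LARGE_MODEL

print(choose_model("classification", "Label this support ticket: ..."))
# -> small-efficient-model
```

The point is not the heuristic itself but the pattern: make the frontier model the exception rather than the default.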

Third, implement AI metering and internal carbon pricing: an organization with no visibility into which teams are consuming AI compute, and at what volume, cannot manage the footprint. Chargeback models that include an energy cost component create the right behavioral incentives; a minimal sketch follows this paragraph. Fourth, include renewable energy SLAs in cloud procurement: the major hyperscalers offer data center regions with very different renewable energy profiles, and placement decisions for AI workloads can weigh energy intensity alongside latency and compliance.
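A chargeback model can start very simply: attribute token usage to teams and multiply by energy and carbon factors. Every factor below is a placeholder assumption to be replaced with measured values from your own metering and your provider's disclosures:

```python
# Illustrative chargeback sketch; all factors are assumptions.
WH_PER_1K_TOKENS = 0.3            # assumed inference energy per 1k tokens
GRID_KG_CO2_PER_KWH = 0.35        # assumed grid carbon intensity
INTERNAL_PRICE_PER_KG_CO2 = 0.10  # assumed internal carbon price, in EUR

def carbon_charge(tokens_used: int) -> tuple[float, float]:
    """Return (kg CO2e, internal charge) for a team's token usage."""
    kwh = tokens_used / 1000 * WH_PER_1K_TOKENS / 1000
    kg_co2 = kwh * GRID_KG_CO2_PER_KWH
    return kg_co2, kg_co2 * INTERNAL_PRICE_PER_KG_CO2

usage_by_team = {"marketing": 120_000_000, "engineering": 450_000_000}
for team, tokens in usage_by_team.items():
    co2, charge = carbon_charge(tokens)
    print(f"{team}: {co2:.1f} kg CO2e, EUR {charge:.2f}")
```

The absolute numbers matter less than the visibility: once a team sees a carbon line item on its AI consumption, that consumption becomes something it actively manages.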

Fifth, schedule flexible workloads for clean power: for batch inference and other non-real-time AI tasks, running during periods of high renewable grid availability is increasingly feasible with the tooling now available from major cloud providers, and the scheduling sketch below shows the basic idea. Sixth, demand sustainability transparency from your AI vendors: published model cards, energy consumption benchmarks, and scope 3 emissions reporting from technology suppliers should be part of standard procurement criteria.
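The core of carbon-aware scheduling is picking the lowest-intensity window from a grid forecast before submitting a batch job. The forecast values below are invented for illustration; in practice they would come from a grid-data API or a cloud provider's carbon-aware tooling:

```python
from datetime import datetime

# Hypothetical hourly grid carbon-intensity forecast (g CO2e per kWh);
# in practice this comes from a grid-data or cloud provider API.
forecast = {
    datetime(2025, 6, 1, 0): 320,
    datetime(2025, 6, 1, 6): 210,
    datetime(2025, 6, 1, 12): 95,   # midday solar peak
    datetime(2025, 6, 1, 18): 280,
}

def greenest_window(forecast: dict[datetime, int],
                    deadline: datetime) -> datetime:
    """Pick the lowest-carbon start time that still meets the deadline."""
    candidates = {t: g for t, g in forecast.items() if t <= deadline}
    return min(candidates, key=candidates.get)

start = greenest_window(forecast, deadline=datetime(2025, 6, 1, 18))
print(f"Submit batch inference at {start}")  # midday, when intensity is lowest
```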

The opportunity framing here is also real. AI is simultaneously contributing to the energy problem and offering tools to address it: optimizing energy grids, improving manufacturing efficiency, reducing logistics emissions, accelerating materials science for better battery technology. The organizations that engage seriously with Green AI are not just managing a liability. They are positioning technology as part of the sustainability solution, which is a much more compelling story for the board, for customers, and for the talent they are trying to attract.

