By 2035, data centers in Belgium could consume up to five times more electricity than today. This striking forecast comes from a recent study by the Boston Consulting Group. Artificial intelligence plays a key role in that projection - not only when we actively interact with it (like querying ChatGPT), but also when it runs in the background, continuously refining models through training.
Yes, AI is energy-hungry. But here’s the twist: AI can also be part of the solution. With the right mindset, technology choices, and policies, we can leverage AI to reduce digital waste, optimize software, and shrink our energy footprint.
AI is becoming embedded in nearly every business application, and its usage is rapidly expanding. But that growth comes at a cost.
Global data center electricity demand already exceeds 415 TWh annually, around 1.5% of total global consumption. According to the International Energy Agency, this figure could more than double to 945 TWh by 2030 if current trends continue.
Cooling, computation, storage, redundancy - everything consumes energy. The more AI models scale, the more cycles we run, the more power we need.
Some countries have already imposed moratoria on new data centers due to pressure on the power grid. Others are relocating data infrastructure to areas with more sustainable energy (hydro, geothermal) or better natural cooling (cold climates, access to water).
Tech giants are taking the challenge seriously - attacking the problem from both ends:
Infrastructure efficiency: better cooling systems, chip design, and energy-optimized data center architecture - for example, dedicated chips (ASICs, TPUs) tailored to specific workloads to improve efficiency.
Energy efficiency and sourcing: Google’s data centers already operate at an average PUE (power usage effectiveness) of 1.09, among the best in the industry. Microsoft has signed a deal to power data centers with nuclear energy, and is investing in modular reactors and nuclear fusion technologies.
But hardware alone won’t solve it. It’s time to focus on software. Optimizing software for energy and resource efficiency is often overlooked in sustainability discussions. Yet, it offers massive untapped potential. Some examples:
Use AI to optimize resource consumption. See my blog "The Hidden Potential of AI: Minimizing Resource Consumption in Software" (https://bankloch.blogspot.com/2024/12/the-hidden-potential-of-ai-minimizing.html) for more info.
The size of even the simplest executable varies massively with the programming language. A "Hello World" binary weighs in at roughly 3.6 MB in Rust, 1.9 MB in Go, 17 KB in C++, 16 KB in C, and 8.7 KB in assembly. There’s no reason why compiled output, whatever the source language, can’t be optimized down to near-assembly size: developers keep the convenience of expressive source code, while compilers and optimizers ensure minimal binaries.
The average web page now weighs 1.5 to 2 MB - often just to display a few paragraphs and images. Excessive libraries, fonts, scripts and trackers inflate everything from bandwidth to storage and caching costs. With AI-assisted audits, code can be trimmed and rewritten for performance and size.
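Such an audit can be partially automated. A minimal sketch in Python: the asset names, sizes, and the 500 KB budget below are hypothetical placeholders; in practice the numbers would come from a crawler or the browser's performance API.

```python
# Hypothetical asset manifest: (name, bytes transferred over the wire).
ASSETS = [
    ("index.html", 40_000),
    ("app.bundle.js", 900_000),
    ("styles.css", 120_000),
    ("webfont.woff2", 300_000),
    ("tracker.js", 150_000),
]

def audit(assets, budget_bytes=500_000):
    """Return total page weight and the assets worth trimming first.

    Any single asset above 10% of the budget is flagged, largest first.
    """
    total = sum(size for _, size in assets)
    big = sorted((a for a in assets if a[1] > budget_bytes * 0.1),
                 key=lambda a: a[1], reverse=True)
    return total, [name for name, _ in big]

total, offenders = audit(ASSETS)
```

An AI-assisted audit would go further, suggesting which scripts or fonts are actually unused, but even this crude ranking shows where the bytes go.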
Binary delta patching: most apps download entire packages for updates, even when changes are minor. AI-driven binary delta patching - calculating the minimal binary change and delivering only the difference - slashes bandwidth and energy use across millions of devices.
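The core delta idea (without any AI on top) can be sketched with Python's standard difflib; production patchers use dedicated formats such as bsdiff, and the payload here is purely illustrative.

```python
import difflib

def make_delta(old: bytes, new: bytes):
    """Compute a list of edit operations that turn old into new."""
    ops = []
    sm = difflib.SequenceMatcher(None, old, new, autojunk=False)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            ops.append(("copy", i1, i2))        # reuse bytes already on the device
        else:
            ops.append(("insert", new[j1:j2]))  # ship only the changed bytes
    return ops

def apply_delta(old: bytes, ops) -> bytes:
    """Reconstruct the new binary from the old one plus the delta."""
    out = bytearray()
    for op in ops:
        if op[0] == "copy":
            out += old[op[1]:op[2]]
        else:
            out += op[1]
    return bytes(out)
```

If only one configuration byte changes in a megabyte-sized binary, the shipped payload is a handful of bytes instead of the full file.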
Predictive applications: imagine applications that understand their own usage patterns and adjust in real time - identifying rarely accessed files and moving them to lower-energy storage tiers, compressing files that aren’t frequently used, predicting traffic spikes and spinning resources up or down accordingly, or adapting software behavior to the cloud provider’s infrastructure for maximum efficiency.
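The first of those ideas can be sketched as a simple tiering rule; the thresholds below are invented for illustration, and a genuinely predictive system would learn them from observed access patterns instead of hard-coding them.

```python
import time

# Hypothetical policy: days of inactivity before a file moves to a
# colder (lower-energy) storage tier.
TIERS = [(30, "hot"), (180, "warm"), (float("inf"), "cold")]

def pick_tier(last_access_ts, now=None):
    """Map a file's last-access timestamp to a storage tier name."""
    now = time.time() if now is None else now
    idle_days = (now - last_access_ts) / 86400
    for max_days, tier in TIERS:
        if idle_days <= max_days:
            return tier
    return "cold"
```

A background job could run this over a file inventory and queue migrations to colder tiers, trading a little retrieval latency for a lower energy footprint.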
Not all applications need high-performance compute. Many can shift data to cold or glacier storage, compress logs, or run on leaner hardware. These changes not only reduce cost — they also cut energy consumption.
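Compressing logs, for example, is a cheap win because log lines are highly repetitive. A small sketch with Python's standard gzip module, using a made-up log line:

```python
import gzip

# Illustrative log data: repetitive lines compress extremely well.
log = ("2025-01-01T00:00:00 INFO request handled in 12ms\n" * 500).encode()

compressed = gzip.compress(log)
ratio = len(compressed) / len(log)
```

Real-world logs compress less dramatically than this toy example, but routinely shrink by an order of magnitude, which cuts storage, bandwidth, and the energy behind both.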
There’s a growing field of research around Green AI, i.e. developing AI models and systems that deliver similar results with less computation. For example, a study on Data-Centric Green AI shows that with smaller datasets, fewer features, and smart model compression, energy use can drop by over 90%, often with minimal accuracy loss.
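One common compression technique, post-training quantization, can be sketched in a few lines of pure Python: weights are stored as 8-bit integers plus one scale factor instead of 32-bit floats, roughly a 4x size reduction. The weights here are random placeholders, not a real model.

```python
import array
import random

def quantize_int8(weights):
    """Linearly quantize a list of float weights to int8 codes plus a scale."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    codes = array.array("b", (round(w / scale) for w in weights))
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float weights from int8 codes."""
    return [c * scale for c in codes]

random.seed(0)
weights = [random.uniform(-1.0, 1.0) for _ in range(1000)]
codes, scale = quantize_int8(weights)
approx = dequantize(codes, scale)
```

Each weight now occupies one byte instead of four, and the reconstruction error is bounded by half a quantization step, which is why accuracy loss is often minimal.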
To accelerate this shift towards sustainable software, a few things must happen:
Make energy efficiency a KPI in software development. Just like performance or security.
Build tools and linters that analyze energy usage and suggest improvements.
Educate engineers on green development patterns — caching, compression, modular loading, etc.
Incentivize low-energy software with tax credits, grants, or certification programs.
Enforce transparency: require providers to disclose energy use, cooling efficiency, and carbon intensity.
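As a sketch of what such linting and measurement tooling could start from: a hypothetical Python decorator that reports CPU time and peak memory as crude proxies for energy. Real tools would measure actual power draw (e.g. via hardware counters such as Intel RAPL) rather than rely on these proxies.

```python
import functools
import time
import tracemalloc

def resource_report(fn):
    """Decorator that records CPU time and peak memory of each call.

    These are rough energy proxies, not real power measurements.
    """
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        tracemalloc.start()
        cpu_start = time.process_time()
        result = fn(*args, **kwargs)
        cpu_used = time.process_time() - cpu_start
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        wrapper.last_report = {"cpu_seconds": cpu_used, "peak_bytes": peak}
        return result
    return wrapper

@resource_report
def build_report(n):
    # Stand-in workload for illustration.
    return sum(i * i for i in range(n))
```

Hooked into CI, a report like this could fail a build when a change regresses resource usage, the same way performance budgets gate merges today.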
AI doesn’t have to be the villain in the energy story. It can be the hero - if we focus not only on the infrastructure and hardware, but on the code itself.
We need to code for intelligence and efficiency. Because sustainability isn’t just about clean energy. It’s about writing better software - software that runs lean, fast, and green.
