May 24, 2023

9 ways to reduce your computing carbon footprint

This is our final post in a series of four articles that dissects how to measure greenhouse gas emissions from operating data centres and cloud services. Today’s post explores practical ways you can lower your computing footprint.

Check out the other posts in this series to learn about the primary sources of greenhouse gas emissions in data centres, the role of the big three (AWS, Azure, and GCP), and how you can assess emissions from your computing activities, whether they run in the cloud, in on-prem data centres, or in hybrid set-ups. Sign up for our free newsletter to stay up-to-date on new articles.


Calculating your organisation’s computing footprint is a great first step, but the ultimate goal is to understand the scope of the emissions you’re generating so that you can make meaningful reductions. Computing emissions aren’t only the ICT industry’s concern: virtually every business now runs on digital infrastructure. No matter what your business does, there are smart ways to reduce emissions from your IT operations. Let’s look at a few ideas.

Migrate from on-prem to the cloud. 

In the last few years, businesses have realised the need to get smarter about data storage. This means looking beyond security and user safety to take sustainability into account, too. Data centres currently consume about 3% of the global electricity supply and account for approximately 2% of greenhouse gas emissions, and by 2030 they are projected to consume 13% of global electricity. As there’s no getting away from the fact that the data centre industry will grow, it’s imperative that businesses think long and hard about where their data is stored. Moving from on-prem data centres to cloud servers may be key to making meaningful reductions. Hyperscalers, in particular, can be up to 5 times less carbon-intensive than on-prem data centres, and one of Climatiq’s partners reported an even greater CO2e saving of 85% by moving from on-prem to a cloud-based service. As we’ve reported previously, hyperscalers and co-located data centres are more efficient (including when accounting for water consumption), driven by better energy utilisation, more efficient cooling systems, and higher workloads per server. Computing workloads in hyperscale data centres are also almost six times more water-efficient than in internal data centres.

Move to an efficient data centre. 

Once you’ve migrated to the cloud, or even if you’re already there, it’s time to think about which data centres you want to use. Ask your cloud provider about data centres that have a low PUE (Power Usage Effectiveness) and are supplied with a large percentage of renewable energy. Bear in mind that a data centre’s energy credentials can only be as good as the local grid mix, so do your research before picking a country to host in. Google Cloud has a helpful region picker, or you can use Climatiq’s data explorer.
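
To make this concrete, here is a minimal sketch in Rust of ranking candidate regions by estimated operational emissions, combining a workload’s energy use with each facility’s PUE and the local grid’s carbon intensity. The region names, PUE values, and intensity figures are illustrative assumptions, not real data.

```rust
// Rank candidate hosting regions by estimated operational emissions.
// All figures are illustrative placeholders, not real measurements.

struct Region {
    name: &'static str,
    pue: f64,                // Power Usage Effectiveness of the facility
    grid_gco2e_per_kwh: f64, // carbon intensity of the local grid
}

impl Region {
    // Estimated emissions (kg CO2e) for a workload drawing `it_kwh` of IT energy.
    fn estimated_kgco2e(&self, it_kwh: f64) -> f64 {
        it_kwh * self.pue * self.grid_gco2e_per_kwh / 1000.0
    }
}

fn main() {
    let workload_kwh = 500.0; // assumed monthly IT energy of the workload
    let mut regions = vec![
        Region { name: "region-a", pue: 1.1, grid_gco2e_per_kwh: 30.0 },
        Region { name: "region-b", pue: 1.2, grid_gco2e_per_kwh: 350.0 },
        Region { name: "region-c", pue: 1.6, grid_gco2e_per_kwh: 120.0 },
    ];

    // Lowest estimated emissions first.
    regions.sort_by(|a, b| {
        a.estimated_kgco2e(workload_kwh)
            .partial_cmp(&b.estimated_kgco2e(workload_kwh))
            .unwrap()
    });

    for r in &regions {
        println!("{}: ~{:.1} kg CO2e/month", r.name, r.estimated_kgco2e(workload_kwh));
    }
}
```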

Eliminate unused processes. 

Getting rid of processes and virtual machines that are idle or unused is a smart step towards optimising your computing and shaving unneeded energy costs off your operations. Google Cloud’s Active Assist offers this as a feature, providing recommendations that help you reduce costs, increase performance, improve security, and use energy more sustainably.
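
If you want a rough picture before turning to a tool like Active Assist, the sketch below shows the general idea: scan exported utilisation metrics and flag machines that look idle. The data structure and thresholds are assumptions for illustration only.

```rust
// Flag likely-idle virtual machines from exported utilisation metrics.
// The struct fields and thresholds are illustrative assumptions.

struct VmMetrics {
    name: String,
    avg_cpu_percent: f64,       // average CPU over the observation window
    network_bytes_per_day: u64, // average daily network traffic
}

fn idle_candidates(vms: &[VmMetrics]) -> Vec<&VmMetrics> {
    vms.iter()
        .filter(|vm| vm.avg_cpu_percent < 2.0 && vm.network_bytes_per_day < 1_000_000)
        .collect()
}

fn main() {
    let vms = vec![
        VmMetrics { name: "old-build-runner".into(), avg_cpu_percent: 0.4, network_bytes_per_day: 12_000 },
        VmMetrics { name: "api-prod".into(), avg_cpu_percent: 35.0, network_bytes_per_day: 80_000_000 },
    ];

    for vm in idle_candidates(&vms) {
        println!("Consider stopping or deleting: {}", vm.name);
    }
}
```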

Run serverless. 

Serverless architecture is a bit of a misnomer, as processes still run in the cloud, but working on-demand means resources are only consumed while the service is in use. This is a more efficient way for companies to operate and allows developers to write and deploy code without having to manage the underlying infrastructure. At Climatiq, we use Fastly, which allows us to reduce cost, scale quickly, deploy at speed, and reduce latency for our end-users. Because we aren’t running idle or underutilised servers, this also helps us reduce emissions. You can get deeper insight into our architecture on the Fastly blog.
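
For a flavour of what on-demand execution looks like, here is a minimal request handler in the style of Fastly’s public Compute starter template for Rust. Treat the crate details and method names as assumptions; this is a sketch, not our production code.

```rust
// Minimal sketch of a Fastly Compute request handler in Rust, modelled on the
// public starter template; method names are assumptions, not production code.
use fastly::{Error, Request, Response};

// The platform invokes this entry point once per request, so compute
// resources are only consumed while a request is actually being served.
#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    Ok(Response::from_body(format!("You requested {}", req.get_path())))
}
```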

Schedule batch jobs to green data centres.

Another way that efficiency can significantly reduce emissions is through smart scheduling. By setting batch jobs to run at green data centres when the share of renewable energy is high, you reduce the amount of CO2e released simply by planning ahead. For instance, Cloudflare’s Green Compute automatically runs scheduled jobs on parts of its edge network located in facilities powered by renewable energy.
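
A simple carbon-aware scheduler can be sketched in a few lines of Rust: defer the job until the grid is relatively clean, with a fallback so it always runs eventually. The intensity lookup, threshold, and waiting policy below are assumptions; in practice you would query a carbon-intensity API or rely on a feature like Green Compute.

```rust
// Defer a batch job until grid carbon intensity drops below a threshold.
// The lookup, threshold, and waiting policy are illustrative assumptions.
use std::thread::sleep;
use std::time::Duration;

fn current_grid_intensity_gco2e_per_kwh() -> f64 {
    // Placeholder value; replace with a real carbon-intensity lookup.
    180.0
}

fn run_batch_job() {
    println!("Running batch job...");
}

fn main() {
    let threshold = 200.0; // run only when the grid is relatively clean
    let max_checks = 12;   // assumed policy: stop deferring after 12 hourly checks

    for _ in 0..max_checks {
        if current_grid_intensity_gco2e_per_kwh() < threshold {
            run_batch_job();
            return;
        }
        sleep(Duration::from_secs(60 * 60)); // check again in an hour
    }
    // Fall back to running anyway so the job is never skipped entirely.
    run_batch_job();
}
```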

Choose efficient languages.

As computing has evolved, so too have the languages it runs on. As every dev knows, some languages are more efficient than others, and choosing the right one can make all the difference to your emissions, as well as to the speed of your processes. At Climatiq, we use Rust to reduce CPU and memory use, with the added bonus that it allows us to practise what we preach in terms of sustainability. Nothing is perfect, but for the last few years, Rust has consistently topped Stack Overflow’s most-loved languages list. Its main benefits are memory safety enforced by the compiler, easier concurrency thanks to an ownership model that prevents data races, and zero-cost abstractions. Boiled down to basics, Rust is so efficient that it uses less energy, and less energy means fewer emissions.
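
As a small taste of what that ownership model buys you, the sketch below shares a counter across threads. Leaving out the `Arc` and `Mutex` wrappers would be rejected at compile time, which is how data races are ruled out before the program ever runs, and the iterator chain at the end compiles down to the same machine code as a hand-written loop, which is what zero-cost abstraction means in practice.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared mutable state must be wrapped explicitly; handing a bare
    // mutable reference to several threads would not compile.
    let counter = Arc::new(Mutex::new(0u64));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                for _ in 0..1_000 {
                    *counter.lock().unwrap() += 1;
                }
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }
    println!("total = {}", counter.lock().unwrap());

    // Zero-cost abstraction: this iterator chain is as cheap as a manual loop.
    let sum_of_squares: u64 = (1..=10u64).map(|n| n * n).sum();
    println!("sum of squares = {}", sum_of_squares);
}
```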

Support dark themes.

It may be time to go dark. Dark themes are not only easier on many users’ eyes and phone batteries; supporting them is now common practice, and on OLED screens they can cut display electricity use by up to 40%. While different types of displays work differently, lighting pixels generally requires more energy than not lighting them, and on OLED and AMOLED screens in dark mode many pixels are simply turned off. The less lighting needed, the less electricity expended.

Extend equipment life cycle.

Make the equipment you work with last longer. In many cases, servers are replaced every 3-5 years, but if you can keep them running longer (up to 8 years), you avoid the emissions embodied in manufacturing new equipment. Observing best practices in server maintenance and taking good care of your hardware means you have to replace it less often and, as a result, produce fewer cumulative emissions over time. However, some of the latest-generation hardware is more energy-efficient and could reduce energy consumption significantly, so it’s crucial to understand the trade-off between operating older equipment to spread out embodied emissions and installing new components for better energy efficiency.
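
The trade-off is easy to sanity-check with a back-of-the-envelope calculation like the Rust sketch below. All of the embodied and operational figures are made-up placeholders; plug in your own estimates.

```rust
// Compare lifetime emissions of keeping an existing server vs. replacing it early.
// All figures are illustrative placeholders, not measured data.

fn lifetime_kgco2e(embodied_kg: f64, annual_operational_kg: f64, years: f64) -> f64 {
    embodied_kg + annual_operational_kg * years
}

fn main() {
    let horizon_years = 8.0;

    // Keep the existing server for the whole horizon (embodied emissions already spent).
    let keep = lifetime_kgco2e(0.0, 900.0, horizon_years);

    // Replace now with newer hardware: new embodied emissions,
    // but lower operational emissions per year.
    let replace = lifetime_kgco2e(1_300.0, 600.0, horizon_years);

    println!("Keep existing hardware: ~{keep:.0} kg CO2e over {horizon_years} years");
    println!("Replace with new model: ~{replace:.0} kg CO2e over {horizon_years} years");
}
```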

Move logic to the Edge. 

Edge computing means computing and data storage that happen at or near where data is produced. Edge computing still uses the cloud, but computation is done locally, and this geographic proximity can reduce the volume of data that needs to move, which in turn decreases traffic and latency. The less energy required to move data back and forth between clients and servers, the fewer emissions are produced. This holds especially true for serverless architecture, but has not been fully proven for general edge computing.
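
One hedged illustration of the idea: aggregate raw readings on the edge node and ship only a compact summary upstream, rather than every data point. The readings and summary shape below are invented for illustration; a real deployment would send the summary over the network.

```rust
// Summarise raw readings locally so only a small payload leaves the edge node.
// The readings and summary shape are invented for illustration.

struct Summary {
    count: usize,
    min: f64,
    max: f64,
    mean: f64,
}

fn summarise(readings: &[f64]) -> Summary {
    let sum: f64 = readings.iter().sum();
    Summary {
        count: readings.len(),
        min: readings.iter().cloned().fold(f64::INFINITY, f64::min),
        max: readings.iter().cloned().fold(f64::NEG_INFINITY, f64::max),
        mean: sum / readings.len() as f64,
    }
}

fn main() {
    // Thousands of raw sensor readings stay local...
    let readings: Vec<f64> = (0..10_000).map(|i| 20.0 + (i % 50) as f64 / 10.0).collect();

    // ...and only this small summary needs to travel to the central servers.
    let s = summarise(&readings);
    println!("sent upstream: count={} min={} max={} mean={:.2}", s.count, s.min, s.max, s.mean);
}
```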

So, there you have it: our non-exhaustive list of ways you can lower your computing footprint. If you have other tips to share with our community, be sure to let us know and we will pass that info on. The open-source ethos at the heart of the tech industry lends itself particularly well to knowledge-sharing and problem-solving. As our industry evolves, it’s our responsibility to make sure we’re supporting the future of the planet, not compromising it.
