KubeCon NA 2022: Eye-opening Sessions on Cloud Native Environmental Sustainability with GitOps
In two sessions at GitOpsCon and Telco Day, co-located events at KubeCon NA 2022, we discussed ways organizations can measure and reduce the energy usage and carbon footprint of their CI/CD pipelines with GitOps, with particular relevance to telcos.
Two eye-opening sessions on cloud native environmental sustainability
At KubeCon North America 2022 co-located events GitOpsCon and Telco Day, engineers from Weaveworks offered their insight in two separate discussions on a subject that is only becoming more important in the business of cloud native computing: environmental sustainability.
First came a Lightning Talk at GitOpsCon from software engineer Niki Manoledaki, in which she gave some context to the issue of energy consumption and CO2 emissions in cloud native computing. She showed an example of how to gather energy metrics for the Flux controllers and other Kubernetes resources using Kepler, a cloud native, eBPF-based energy measurement tool, before revealing some of the findings of Weaveworks’ recent investigation into the potential of various GitOps-compatible tools to reduce the footprint of IT operations.
You can watch Niki’s Lightning Talk in full here or below.
Next came a panel discussion at Telco Day, moderated by Niki, with speakers including her Weaveworks colleague Chris Lavery, Intel’s Marlow Weston and Red Hat’s William Caban. It looked specifically at the potential of various tools to reduce the environmental footprint and energy consumption incurred by telcos as they move their operations to the edge in support of 5G – a shift that has involved a supporting migration to container-based, cloud native architecture (read: Kubernetes).
You can watch the full panel discussion here or below.
A problem in urgent need of solutions
Practically everyone who works in the tech industry today is aware of the enormous amounts of energy consumed by the world’s datacenters. As Niki mentions in the panel discussion, according to the International Energy Agency, datacenters account for between 1% and 1.5% of global energy use at present (2022) – and the GSMA reported in 2019 that the telco sector specifically is responsible for between 2% and 3% of global electricity use.
The speakers discussed how granular data on the energy consumption and carbon emissions of individual processes is not always available, especially when many workloads run in VMs on a public cloud. So while datacenter operators themselves can take steps to optimize their hardware and cooling systems, it can be difficult for customers to reduce the energy requirements of their applications and infrastructure.
Despite growing regulatory requirements for organizations to report on the emissions resulting from their activities, there has until now been a distinct lack of tools for reporting on the emission levels of IT activities, particularly for resources running in Kubernetes clusters. As a result, many organizations – and, as William noted in the panel discussion, many telcos – are still not fully ready to report on their IT emissions at the level of detail required.
Tools that are available now
The good news is that tools are emerging with the capabilities to plug some of these data gaps. Niki’s Lightning Talk showcases an example of how to measure the energy consumption of the Flux controllers, which run as Pods in the “flux-system” namespace. This is done using Kepler, a new eBPF-based tool that works in conjunction with the monitoring tool Prometheus to provide data on the energy usage of Kubernetes resources such as Pods and Nodes. This data, reported in millijoules, can then be converted into watts or kilowatt-hours.
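To make this concrete, here is a minimal sketch, in Python, of how you might pull Kepler’s energy data for the Flux controllers out of Prometheus and convert it to kilowatt-hours. The Prometheus endpoint, the metric name and its labels are assumptions that may vary between Kepler versions and deployments, so treat it as illustrative rather than a reference implementation.

```python
import requests

PROMETHEUS_URL = "http://prometheus.example.com"  # assumed endpoint (in-cluster or port-forwarded)

# Total energy attributed to Pods in the flux-system namespace over the last hour.
# Depending on the Kepler version, energy counters may be reported in joules or
# millijoules; check your deployment and adjust the scaling below accordingly.
query = 'sum(increase(kepler_container_joules_total{container_namespace="flux-system"}[1h]))'

resp = requests.get(f"{PROMETHEUS_URL}/api/v1/query", params={"query": query}, timeout=10)
resp.raise_for_status()
result = resp.json()["data"]["result"]

if result:
    joules = float(result[0]["value"][1])
    kwh = joules / 3_600_000  # 1 kWh = 3.6 million joules
    print(f"Flux controllers used roughly {joules:.0f} J ({kwh:.6f} kWh) in the last hour")
else:
    print("No Kepler samples found for the flux-system namespace")
```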
She mentions how public data can also help. Carbon intensity data is available for electrical grids around the world via API providers like WattTime, enabling you to calculate a Marginal Emissions Rate and establish how much carbon is emitted in different locations per megawatt-hour of electricity generated. Organizations can leverage these APIs to inform provisioning, placing workloads where the emissions are likely to be lowest.
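As an illustration, the sketch below compares the current grid carbon intensity of a few candidate regions before choosing where to provision. The endpoint, parameters and response fields are hypothetical placeholders rather than the real WattTime API, which has its own authentication scheme and region identifiers.

```python
import requests

API_URL = "https://carbon-intensity.example.com/v1/intensity"  # hypothetical endpoint, not the real WattTime API
API_TOKEN = "..."  # credentials elided

candidate_regions = ["eu-north-1", "eu-west-2", "us-east-1"]  # example placement options


def grid_intensity(region: str) -> float:
    """Return an assumed marginal emissions rate (gCO2e per kWh) for a region."""
    resp = requests.get(
        API_URL,
        params={"region": region},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return float(resp.json()["marginal_gco2e_per_kwh"])  # assumed response field


greenest = min(candidate_regions, key=grid_intensity)
print(f"Lowest-carbon region right now: {greenest}")
```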
Solving the problem with GitOps
As Niki explained in her Lightning Talk, CI/CD pipelines are ripe for energy optimization, with a number of steps that can be taken to both measure and reduce CO2 emissions.
For many organizations, the transition to GitOps often begins by decoupling CI and CD – and that, in turn, represents an opportunity to measure the energy consumption of these processes independently.
OpenGitOps is looking at the use of cloud native tools such as Kepler in GitOps pipelines across various scenarios, with the aim of providing the business community with data it can use to make informed decisions about emissions when managing software delivery. Members of the community can join this conversation through the new OpenGitOps subgroup on Environmental Sustainability, which will continue this research.
But measurement is only one piece of the puzzle. Clearly, you still need to find ways to cut down on the energy used. Energy consumption and, crucially, energy waste can be reduced by implementing GitOps tools and patterns. Not only does the declarative nature of GitOps pipelines mean you have full visibility of the tools running in your clusters, but GitOps also gives you a mechanism to actually turn various tools off when they are not needed, as the sketch below illustrates.
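As a hedged sketch of that mechanism, assuming a Flux-style configuration repository and a hypothetical manifest path, “switching off” a non-production workload means changing the desired state in Git and letting the GitOps controller reconcile the cluster to match:

```python
import subprocess
import yaml  # PyYAML

# Hypothetical path inside a GitOps configuration repository.
MANIFEST = "clusters/staging/apps/batch-worker/deployment.yaml"

with open(MANIFEST) as f:
    deployment = yaml.safe_load(f)

# Switch the workload off outside working hours by declaring zero replicas.
deployment["spec"]["replicas"] = 0

with open(MANIFEST, "w") as f:
    yaml.safe_dump(deployment, f, sort_keys=False)

subprocess.run(["git", "add", MANIFEST], check=True)
subprocess.run(["git", "commit", "-m", "Scale batch-worker to zero overnight"], check=True)
subprocess.run(["git", "push"], check=True)
# Flux (or another GitOps controller) picks up the new commit and applies the change.
```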
Scheduling and workload placement
As the panel discussed in the Telco Day session, GitOps also facilitates scheduling via a number of open source tools (e.g. Karpenter, KEDA, Intel’s Telemetry Aware Scheduling or TAS, and Nomad) that enable organizations to choose when to run various processes and, better still, to automate this ‘switching off’ so that it takes place as and when required.
By employing these tools, it is possible to schedule resources dynamically based on metrics such as power consumption; the sketch below illustrates the kind of decision they automate.
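As a rough illustration only, and not a substitute for those schedulers, the following Python sketch ranks nodes by their recent power draw as estimated by Kepler via Prometheus. The Prometheus URL and metric name are assumptions and may differ in your environment.

```python
import requests

PROMETHEUS_URL = "http://prometheus.example.com"  # assumed endpoint

# rate() over a joules counter yields joules per second, i.e. average watts per node.
query = "sum by (instance) (rate(kepler_node_platform_joules_total[5m]))"

resp = requests.get(f"{PROMETHEUS_URL}/api/v1/query", params={"query": query}, timeout=10)
resp.raise_for_status()

watts_by_node = {
    sample["metric"].get("instance", "unknown"): float(sample["value"][1])
    for sample in resp.json()["data"]["result"]
}

for node, watts in sorted(watts_by_node.items(), key=lambda item: item[1]):
    print(f"{node}: about {watts:.1f} W")
# The first node listed is the lowest-power candidate for placing the next workload.
```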
As well as informed or automated scheduling, dynamic provisioning can help to reduce emissions. Red Hat’s William Caban explained in the telco panel session that, rather than the traditional approach of pre-provisioning to make a service available everywhere it might possibly be needed, telcos in particular can adopt a smart workload placement approach that involves monitoring network activity closely in order to predict when and where various services will be required and to respond in real time, making them available only in the geographical locations where they are needed.
Policies that save money and energy
As mentioned by Chris in the Telco Day session, the concept of FinOps has recently begun to gain traction, aiming to underpin operational models with tighter financial control. The aim is to reduce the wasted energy consumed by unused resources, via the implementation of system-wide observability and smarter resource allocation. Although the essential aim of FinOps is to save money, much of its practical application concerns a reduction in energy usage, which brings with it the additional benefit of improved environmental sustainability. As the saying goes, the greenest energy is the energy you don’t use at all.
Start your sustainability journey today
For many people reading this article, one of the fastest ways to improve the environmental sustainability of your CI/CD pipeline and IT operations as a whole could be to adopt GitOps as a controlling model. As outlined above (and in the two sessions at GitOpsCon and Telco Day), it enables you to decouple CI and CD, so you can focus more easily on the individual processes involved. As a model built around Kubernetes, it is compatible with all the tools in the Kubernetes ecosystem that can be brought to bear to measure energy consumption at the levels of granularity required, as well as those that can assist with automated scheduling and dynamic provisioning. It also enables you to implement policies that serve as guardrails for your teams, ensuring that nothing can be deployed in a manner that makes measurement difficult or contravenes your pre-defined rules on where software should run.
To evaluate GitOps, you can download and use Weave GitOps for free. To implement the model at scale, we recommend Weave GitOps Enterprise, which includes additional enterprise features and commercial support.