Multicluster GitOps on EKS-D with WKP

By Weaveworks
December 01, 2020



This post was authored by Michael Beaumont, Weaveworks Software Engineer. 

From the moment enterprises began considering cloud services, they have been talking about the hybrid cloud: one that leverages both a public cloud and on-premise data centers. The combination of EKS in the cloud and EKS-D on-premise now makes that possible.

We’ve put together a demo showing how easy it is to install Weave Kubernetes Platform and use GitOps to build a continuous deployment pipeline and promote deployments from a development cluster to production. The scenario covers a development cluster running in the cloud on AWS EKS together with a production deployment running on-premise with EKS-D.

EKS-D is a Kubernetes distribution from AWS that includes the same patches and runtime that EKS uses, guaranteeing the same behaviour on-premise, or in other environments, as you get from EKS. With EKS-D, AWS brings its experience with security and reliability to on-premise Kubernetes. The combination of EKS-D and EKS offers customers consistent Kubernetes clusters, as well as an extended support cycle, both in the cloud and on-premise. For customers, day two operations typically bring quite a bit of complexity, including managing cluster component configuration such as logging and metrics, as well as workload deployment and delivery. We’ve created Weave Kubernetes Platform (WKP) to address these challenges.

WKP provides you with an application-ready cluster that includes logging and metrics as well as tools for observability. We’ve baked GitOps directly into WKP, and in the demo we show you exactly how to start deploying workloads on your development cluster, as well as how to use WKP to gain insight into what’s running in your cluster. For an overview of how WKP can help you deliver on-premise Kubernetes, see Weaveworks Brings GitOps to Amazon EKS Distro.

Install WKP to Production and Development Clusters

To start, we install WKP on both our production and development clusters using wk, the CLI tool for WKP. wk offers a convenient interface to existing WKP installations and also provides a way to create new clusters and to install WKP onto existing clusters.

$ wk setup install
||*NOTE*|| Please edit the file 'setup/config.yaml' to configure your cluster
Then, enter 'wk setup run' to create your cluster
$ vim setup/config.yaml
$ wk setup run
…
Successfully created and initialized cluster: wkp-prod


With the initial WKP installation, wk creates a repository that holds the cluster configuration. We need to change only a few settings in our wk configuration file: first, point wk to the GitHub account where we’ll keep our repositories, and second, to the DockerHub account used to fetch images for the components that make up WKP.
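As a rough illustration, the relevant part of setup/config.yaml might look something like the following. The field names here are only an approximation and may not match the exact WKP configuration schema, so treat them as placeholders and check the WKP documentation for the real keys:

$ cat setup/config.yaml
# NOTE: field names below are illustrative, not the exact WKP schema
clusterName: wkp-prod
gitProvider: github
gitProviderOrg: michaelbeaumont       # GitHub account that will hold the cluster repos
dockerIOUser: michaelbeaumont         # DockerHub account used to pull WKP component images
dockerIOPasswordFile: /home/user/.wkp/docker-io-password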

For the development cluster, we repeat the process, ending up with two GitHub repositories for our two clusters. This gives each environment a clear commit history and instant rollback.

We can manage both repositories in the same directory using different git remotes, giving a simple and natural developer experience.

$ git remote -v |grep push
dev   git@github.com:michaelbeaumont/wkp-dev (push)
prod  git@github.com:michaelbeaumont/wkp-prod (push)
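For reference, this layout is set up by adding each cluster repository as its own remote in a single working directory (the repository names here match the demo; substitute your own):

$ git remote add dev git@github.com:michaelbeaumont/wkp-dev
$ git remote add prod git@github.com:michaelbeaumont/wkp-prod
$ git fetch --all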

GitOps for managing and deploying workloads

In the demo, we show how GitOps, the foundation of WKP, is used to deploy a workload to development by adding some Kubernetes manifests to our wkp-dev repository.

[Image: dev_master.png]

Flux, which is installed as part of WKP, watches the repo for changes. When it sees the new manifest, it ensures the resources are applied to the cluster. We then have the opportunity to test and verify our changes.
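As a rough sketch of that step, assuming a standard podinfo Deployment is committed under a directory that Flux watches in wkp-dev (the path, image tag and replica count below are illustrative):

$ mkdir -p workloads/podinfo
$ cat > workloads/podinfo/deployment.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: podinfo
spec:
  replicas: 2
  selector:
    matchLabels:
      app: podinfo
  template:
    metadata:
      labels:
        app: podinfo
    spec:
      containers:
      - name: podinfo
        image: stefanprodan/podinfo:5.0.3   # illustrative tag
        ports:
        - containerPort: 9898
EOF
$ git add workloads/podinfo
$ git commit -m "Add podinfo"
$ git push dev HEAD:master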

The final step we walk through is promoting the new workload to our production cluster.

We check out our prod/master branch, cherry-pick the change that adds podinfo into a new branch off of our production master branch, and then push.

$ git checkout prod/master
$ git checkout -b prodpodinfo
$ git cherry-pick dev/master
$ git push -u prod prodpodinfo

Because the production cluster is more strictly gated by a review process, we open a PR in our production repo, get a review from our teammates, and merge our changes into wkp-prod.
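If you prefer to stay on the command line, the same PR can be opened with the GitHub CLI (gh is not part of WKP or the demo, just a convenience; the title and body are illustrative):

$ gh pr create --repo michaelbeaumont/wkp-prod --base master --head prodpodinfo \
    --title "Add podinfo" --body "Promote podinfo from dev to prod"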

[Image: add-podinfo.png]

We now have the same workload running in both our EKS and our EKS-D cluster.

[Image: eks-eks-d.png]
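To double-check from the command line, something like the following works, assuming kubeconfig contexts named wkp-dev and wkp-prod for the two clusters (context names will vary with your setup):

$ kubectl --context wkp-dev get deployment podinfo
$ kubectl --context wkp-prod get deployment podinfo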

GitOps and the Cluster API for infrastructure management

WKP also brings GitOps to any cluster. When you use WKP to install a cluster on existing infrastructure, on-premise or otherwise, it uses a Cluster API provider to do so. The Cluster API project brings declarative, Kubernetes-style APIs to cluster lifecycle management. We’ve written the Cluster API Existing Infrastructure Provider (CAPEI), bringing declarative GitOps management of Kubernetes to bare-metal clusters. Together with other Cluster API providers for public and private clouds, this enables a consistent operations process across multi-cloud environments.
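To give a flavour of the declarative definition that lives in Git, here is a minimal sketch of a Cluster API resource referencing an existing-infrastructure cluster. The core Cluster resource is standard Cluster API (v1alpha3 at the time of writing); the group, version and kind used in infrastructureRef are assumptions about the CAPEI provider and may differ from the actual CRDs:

$ cat cluster.yaml
apiVersion: cluster.x-k8s.io/v1alpha3
kind: Cluster
metadata:
  name: wkp-prod
spec:
  clusterNetwork:
    pods:
      cidrBlocks: ["192.168.0.0/16"]
  infrastructureRef:
    apiVersion: cluster.weave.works/v1alpha3   # assumed group/version for CAPEI
    kind: ExistingInfraCluster
    name: wkp-prod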

Watch the demo in its entirety here: 


Making Hybrid Cloud Real

Across our customer base, we often see the exact hybrid cloud scenario we’ve demoed here: one where the dev/test cycle leverages a public cloud but, for security and other reasons, the production deployment runs on-premise. Using WKP, it is easy to run such a workflow on a consistent, battle-tested, and uniformly supported Kubernetes distribution across both EKS and EKS-D. DevOps productivity has proven business benefits, and the GitOps capabilities in WKP let teams use familiar tools to extend their development practice all the way out to runtime operations.


If you’d like to discuss a similar scenario in your own datacenter, please reach out to us.


