In my last post, I talked about the evolution of my lab. In this post, I’ll try to explain, as best I can, what goes on in there, from the moment I write the code to the moment it goes into “production”.

I will use this website as an example, but the same applies to virtually everything I maintain, from my personal websites and projects to PiVPN and other applications.

Development: it works on my PC!

Some of these things, like the website, blog, and CV, I develop myself using a variety of tools. I don’t have very strict criteria for choosing them; the choice usually comes down to what I want to learn. My focus is on learning more about cloud, DevOps, and GitOps, testing new applications, pipelines, a little bit of everything! This is where the learning journey begins, and it usually results in more serious implementations at the company where I work.

The process is very similar for almost everything. In the case of this blog, I currently use HUGO to generate the website. I won’t go into details about what HUGO is and does, but put very simply, each article is a Markdown document that is then used to generate a static website.
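To give an idea, a post source file (say, content/posts/my-lab-pipeline.md, an illustrative path) looks something like this; the front matter fields are the usual Hugo ones, not necessarily the exact ones this blog uses:

---
title: "From my PC to production"
date: 2023-03-16
draft: false
tags: ["homelab", "gitops"]
---

The rest of the article goes here, written in plain Markdown.

Running the hugo command then turns all of those files into a static website that can be served from anywhere.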

For other applications, it depends. But I always develop with the awareness that nothing will run on my computer forever; it will have to run somewhere else. Usually, everything ends up inside Docker images.

Version Control: Git is truth, Git is life, GitOps, GitOps, GitOps!

Since there is no DevOps or GitOps without version control, for that, I use Git. To host the code, I use GitLab for private repositories and GitHub for open-source projects. Yes… I could keep everything on the same platform, but that way I’d be depriving myself of knowledge, and this journey is all about knowledge!

By using both, I not only know both platforms relatively well, but I also get to know their CI/CD tools (GitLab CI, GitHub Actions, and even Travis CI in the case of PiVPN).

Back to the topic! After testing the application, website, script - whatever you want to call it - I commit the code. Each commit follows the Angular commit style (hold on a bit, you’ll soon understand why!). I push the code to the “test” branch and, later, merge it into the “master” branch.
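For reference, Angular-style commit messages look something like this (the messages are made up; the version bumps they trigger are the standard semantic-release behaviour):

fix(blog): correct a typo in the lab article      -> patch release
feat(docs): add a search page                     -> minor release
feat(config): drop the old settings format
BREAKING CHANGE: old settings files are ignored   -> major release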

And that’s where the manual work ends. I open a beer and let the machines do the work!

Pipelines and Docker: DevOps, a lifestyle!

As soon as the code “lands” on one of the defined branches, the CI/CD tools begin their dance in perfect synchrony! GitHub Actions or GitLab CI are responsible for running these pipelines, which run (more!) tests if necessary.

They build the container image and finally run semantic-release to calculate and apply the next version automatically. To do that, the tool analyzes each commit and, following the rules of semantic versioning, works out what the next version should be. Remember the Angular commit style? Well, that’s why it’s important! Finally, it tags the container images and pushes them to the registry (GitLab, Docker Hub, GitHub)!
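To give a rough idea, the GitLab side could be sketched like the pipeline below. This is a minimal sketch rather than my real configuration, so the job names, images, and branch rule are illustrative; the GitHub Actions version follows the same logic.

# .gitlab-ci.yml (minimal sketch, not the real pipeline)
stages:
  - build
  - release

build-image:
  stage: build
  image: docker:24
  services:
    - docker:24-dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

release:
  stage: release
  image: node:20
  script:
    # semantic-release reads the commit history, works out the next version,
    # creates the tag, and publishes the release
    - npx semantic-release
  rules:
    - if: '$CI_COMMIT_BRANCH == "master"'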

Kubernetes, Flux CD, Watchtower

Everything running in the cloud is currently operating on a Kubernetes cluster in Google’s cloud (GCP).

I use Terraform to create infrastructure and FluxCD to manage everything running in the cluster.

FluxCD constantly monitors the Git repository where the configurations of the applications running in the cluster are stored, and it reverts any change that is not reflected in those configurations. This means that if I run kubectl apply -f NewApplication.yaml directly from my PC, the change will be reverted minutes later, in true GitOps style!
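Under the hood, that behaviour comes from a couple of Flux objects pointing at the configuration repository: a GitRepository that tells Flux where to pull from, and a Kustomization that reapplies whatever is in there on every reconciliation. A minimal sketch, with a made-up repository URL, path, and intervals:

apiVersion: source.toolkit.fluxcd.io/v1
kind: GitRepository
metadata:
  name: cluster-config
  namespace: flux-system
spec:
  interval: 1m
  url: https://gitlab.com/example/cluster-config.git   # illustrative URL
  ref:
    branch: master
---
apiVersion: kustomize.toolkit.fluxcd.io/v1
kind: Kustomization
metadata:
  name: apps
  namespace: flux-system
spec:
  interval: 10m
  sourceRef:
    kind: GitRepository
    name: cluster-config
  path: ./apps
  prune: true   # objects removed from Git are removed from the cluster too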

However, if I commit the configuration and push it to the repository without applying it, FluxCD will detect the new application and apply it automatically! Yay!!!

In addition, FluxCD monitors the container registries, and whenever a new version of an application is released, it updates the configuration, commits and pushes it to the repository, and applies it, which in turn causes Kubernetes to update the application’s containers!

commit 844ec31b7d74838f50b72836d4d1c8c2fc2dd84

Author: fluxcdbot <fluxcdbot@4s3ti.net>
Date:   Thu Mar 16 20:27:51 2023 +0000

    pivpn/docs:1.4.0
diff --git a/pivpn/03-docs.yml b/pivpn/03-docs.yml
index 78f5c97..495269e 100644
--- a/pivpn/03-docs.yml
+++ b/pivpn/03-docs.yml
@@ -22,7 +22,7 @@ spec:
         node_pool: "cfplayground-global"
       containers:
       - name: pivpn
-        image: pivpn/docs:1.3.0 # {"$imagepolicy": "flux-system:pivpn-docs"}
+        image: pivpn/docs:1.4.0 # {"$imagepolicy": "flux-system:pivpn-docs"}
         imagePullPolicy: Always
         ports:
         - containerPort: 80

That’s beautiful!
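That commit isn’t magic: it comes from Flux’s image automation. The $imagepolicy marker in the diff above points at an ImagePolicy object which, together with an ImageRepository, could look roughly like this (only the names come from the marker; the scan interval and semver range are assumptions):

apiVersion: image.toolkit.fluxcd.io/v1beta2
kind: ImageRepository
metadata:
  name: pivpn-docs
  namespace: flux-system
spec:
  image: docker.io/pivpn/docs   # the registry repository to scan for new tags
  interval: 5m
---
apiVersion: image.toolkit.fluxcd.io/v1beta2
kind: ImagePolicy
metadata:
  name: pivpn-docs              # matches the "flux-system:pivpn-docs" marker
  namespace: flux-system
spec:
  imageRepositoryRef:
    name: pivpn-docs
  policy:
    semver:
      range: ">=1.0.0"          # pick the newest tag that satisfies this range

A separate ImageUpdateAutomation object is what actually writes the new tag into the manifest and pushes the commit shown above.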

The same applies to everything hosted on my servers but not developed by me, from PrivateBin and Whiteboard to monitoring tools and Helm charts!
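For the things I don’t build myself, Flux’s Helm controller does the same job. As a sketch, a monitoring chart could be declared roughly like this (the chart, namespace, and values are illustrative, not my actual setup):

apiVersion: source.toolkit.fluxcd.io/v1beta2
kind: HelmRepository
metadata:
  name: grafana
  namespace: monitoring
spec:
  interval: 1h
  url: https://grafana.github.io/helm-charts
---
apiVersion: helm.toolkit.fluxcd.io/v2beta1
kind: HelmRelease
metadata:
  name: grafana
  namespace: monitoring
spec:
  interval: 30m
  chart:
    spec:
      chart: grafana
      version: "6.x"            # illustrative version range
      sourceRef:
        kind: HelmRepository
        name: grafana
  values:
    persistence:
      enabled: true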

What runs in the home lab follows a simpler process: I use the :latest tag for everything, and Watchtower takes care of the rest!
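A minimal docker-compose sketch of that setup (the service, ports, and interval here are illustrative):

services:
  privatebin:
    image: privatebin/nginx-fpm-alpine:latest       # always the :latest tag
    ports:
      - "8080:8080"
    restart: unless-stopped

  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # lets Watchtower inspect and restart containers
    command: --interval 3600                        # check for new images every hour
    restart: unless-stopped

Whenever a new :latest image shows up in the registry, Watchtower pulls it and recreates the container.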

Summing up!

New content may not land here very often, and to tell the truth, its quality matters little to me. What really matters to me is the quality of what happens behind this “simple website.” And I’ll say even more: blessed be the artificial intelligence that helped me write, structure, correct, and translate this article!

But what I want to emphasize is that a “simple website” is sometimes all it takes to distinguish a professional from a good professional! A “simple website” is all it takes to gain knowledge and get to grips with a wide range of tools that matter in professional life: a bit of code in whatever language, infrastructure, cloud, CI/CD tools!

In today’s world, where there is more and more work to be done and the demand for new features keeps growing, there is less and less time for maintenance and an ever-greater need to automate routine tasks! Nobody wants to do the same things over and over like a robot; that’s what machines are for!

Now I’m just going to drink a beer while the machines take care of putting this article online, and spend a little more time asking myself: “how am I going to improve all of this and automate even more?”