My Journey into DevOps & Infrastructure as Code Tools

I had been hand-coding websites for almost 15 years when, in 2010, I started using WordPress as a CMS. Shortly thereafter I started attending the Portland WordPress Meetup on a regular basis. This gave a boost to my web development business, and I felt drawn to learn more about the infrastructure required for hosting websites. I was a Windows user and was already familiar with the WAMP stack (Windows, Apache, MySQL, PHP), but WordPress didn’t run all that well under Windows, and all of the web hosting companies I worked with ran on Linux. I already had a file server in my home office that I had built using the Ubuntu Server operating system, so I set up Apache, MySQL, and PHP on it and used that as my local hosting environment.

At the Portland WordPress Meetups, I was hearing about things like Git for version control, as well as using virtual machines to create environments which matched the various web hosting companies’ environments. So, I downloaded VirtualBox onto my Windows machine, started creating Linux VMs, and manually set them up with various configurations. I learned how to use Git and started keeping files under version control. At first, I only used Git under Linux on the file server and in the VMs. When Git for Windows was released, I started using it natively on my Windows machine as well.

My workflow at that point was as follows: under Windows, I had mapped the network drive from the Ubuntu file server and would do all of my coding using either Notepad++ or the NetBeans IDE I was learning to use for my Java programming classes. Since the files actually resided on the Ubuntu file server, editing them on my Windows machine changed them there. I would make my Git commits and then push the changes up to the origin repo on Bitbucket. When I was ready to go live with the changes, I would either SSH into the web host’s server, if I had that kind of access, or log into cPanel and use whatever tool would allow me to do a ‘git pull’.

This was all rather tedious, manual, and error-prone. In February of 2013, my workflow took a quantum leap forward when Jeremy Felt gave a talk at the Portland WordPress Meetup and revealed his recently created tool Varying Vagrant Vagrants, which was basically a sophisticated Vagrantfile that spun up an Ubuntu 12.04 VM in VirtualBox and provisioned it automatically with nginx, php-fpm, MySQL, and Memcached. In a matter of a few minutes, one could have a completely provisioned VM with a LEMP stack and WordPress up and running. I had heard of Vagrant, which, at the time, was a tool for automating the creation and provisioning of VMs using VirtualBox only, but I had never tried it.
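The pattern VVV automated can be sketched as a minimal Vagrantfile. This is only an illustration of the idea, not VVV’s actual configuration; the box name, IP address, and script path are placeholders:

```ruby
# Vagrantfile — a minimal sketch of the pattern VVV automated.
# Box name, IP, and provisioning script path are illustrative placeholders.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/precise64"          # an Ubuntu 12.04 base box
  config.vm.network "private_network", ip: "192.168.50.4"
  config.vm.synced_folder "www", "/srv/www"   # share site files with the host
  # A shell provisioner would install nginx, php-fpm, MySQL, Memcached, etc.
  config.vm.provision "shell", path: "provision/provision.sh"
end
```

Running `vagrant up` against something like this downloads the base box, boots the VM in VirtualBox, and runs the provisioning script — which is what made the “completely provisioned VM in a few minutes” experience possible.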

After that Meetup, I immersed myself in learning Vagrant and studying the nuances of the scripts Jeremy had written. I became enamored with the infrastructure side of things and with automation. This led to learning about Puppet and Chef, and after that Ansible and Salt. Mitchell Hashimoto, the creator of Vagrant, founded a company called HashiCorp and started building other DevOps/Infrastructure as Code tools. I started with Packer, which I used to create my own boxes for use with Vagrant. I heard Gene Kim speak about DevOps at a New Relic Future Talks Meetup and got interested in that new paradigm. I learned about Docker and containers, and about monitoring tools like New Relic.

After a couple of years of this learning and exploring, I started looking for opportunities to use these skills and tools on client projects. At one point I was hired by an IT consulting firm, an all-Microsoft shop, to build a couple of VMs for running a Linux/nginx/MySQL/PHP (LEMP) stack to host WordPress sites for their existing clients on one of their Windows Server machines. So, I had to learn how to use Hyper-V for hosting and running the VMs. I used Packer to create a Vagrant box which would run under Hyper-V instead of VirtualBox, and then Vagrant to spin up and provision the VMs. The files for this virtual infrastructure were kept in a Git repository. Later, I created an Ansible playbook to manage the infrastructure on the already running VMs in order to eliminate the need for manual updating, patching, etc. of the operating system and software packages inside the VMs.
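The Packer side of that build can be sketched as a template using Packer’s `hyperv-iso` builder and its `vagrant` post-processor. This is a hedged outline, not my actual template; the ISO URL, checksum, credentials, and script name are placeholders:

```json
{
  "builders": [
    {
      "type": "hyperv-iso",
      "iso_url": "http://releases.ubuntu.com/16.04/ubuntu-16.04-server-amd64.iso",
      "iso_checksum": "sha256:PLACEHOLDER",
      "ssh_username": "vagrant",
      "ssh_password": "vagrant",
      "shutdown_command": "sudo shutdown -P now"
    }
  ],
  "provisioners": [
    { "type": "shell", "script": "setup-lemp.sh" }
  ],
  "post-processors": [
    { "type": "vagrant" }
  ]
}
```

`packer build template.json` boots the ISO in a Hyper-V VM, runs the provisioning script, and the `vagrant` post-processor packages the result as a `.box` file that Vagrant can then spin up under Hyper-V.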

This led to my taking a break from over 20 years of self-employment and accepting a position as a Systems Engineer for 10up. Many of 10up’s original employees were participants in the Portland WordPress Meetup and we had become acquainted over the years. While at 10up I got the opportunity to learn about a wide variety of hosting platforms and infrastructure setups, including an in-house web hosting platform, which was being managed with Ansible. Several of the projects assigned to me were hosted on Amazon Web Services but I mostly just SSHed into the machines or used the AWS Console to manually manage them. At one point, we decided to migrate one of my clients off of Linode and onto AWS. This was an opportunity to automate the process and to get the infrastructure under version control using Ansible and Git. I was able to use Ansible to set up the VPCs and security groups, spin up EC2 and RDS instances, set up load balancing, etc. Future changes and updates were all handled by Ansible.
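A playbook for that kind of migration might have looked roughly like the sketch below. The module names are from Ansible’s AWS modules of that era, but the VPC CIDR, security group rules, instance type, and AMI are all hypothetical placeholders:

```yaml
# provision-aws.yml — hedged sketch of standing up AWS infrastructure with Ansible.
# All names, CIDRs, AMIs, and instance types are hypothetical placeholders.
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Create the VPC
      ec2_vpc_net:
        name: client-vpc
        cidr_block: 10.0.0.0/16
        region: us-west-2
      register: vpc

    - name: Create a security group for the web tier
      ec2_group:
        name: web-sg
        description: Allow HTTP/HTTPS in
        vpc_id: "{{ vpc.vpc.id }}"
        region: us-west-2
        rules:
          - proto: tcp
            ports: [80, 443]
            cidr_ip: 0.0.0.0/0

    - name: Launch a web server instance
      ec2_instance:
        name: client-web-1
        instance_type: t3.small
        image_id: ami-0123456789abcdef0   # placeholder AMI
        security_group: web-sg
        region: us-west-2
```

Keeping a playbook like this in Git meant every future change to the environment went through version control instead of the AWS Console.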

While at 10up I also got to use CI/CD tools like GitLab CI, Jenkins, Travis CI, Beanstalk, DeployBot, and CodeDeploy, and I became proficient with Jira and Confluence.

More recently I’ve gotten into using Terraform to create infrastructure. If the infrastructure uses immutable machines, meaning once they’re spun up they never change, I tend to continue using Terraform to manage it. Terraform works really well at creating and maintaining whatever state I tell it to without my having to tell it how to do so. It does a good job of spinning up new machines with new provisioning and getting rid of the old ones. However, if the machines are mutable, meaning they can be modified after being spun up, I like to use Ansible to manage the updating, the patching, the installing of new software packages, etc. I can use Ansible to set up infrastructure as well as maintain it, as I did with the Linode to AWS migration, but I prefer to use Terraform for the creation and then Ansible for the maintenance.
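As a sketch of that split: Terraform declares the desired end state, and a lifecycle rule can enforce the immutable style, standing up the replacement machine before tearing down the old one. The AMI, instance type, and names below are placeholders, not a real configuration:

```hcl
# main.tf — hedged sketch of Terraform managing an immutable instance.
# AMI ID, instance type, and names are placeholders.
resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0"  # e.g. an image baked with Packer
  instance_type = "t3.small"

  # Changing the AMI forces a replacement; create the new
  # instance before destroying the old one.
  lifecycle {
    create_before_destroy = true
  }

  tags = {
    Name = "web"
  }
}
```

For mutable machines, the equivalent ongoing work — patching, updating, installing packages — would instead happen in an Ansible playbook run against the already-provisioned hosts.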

My current workflow still happens mostly on a Windows laptop, but sometimes from my Linux laptop, which multi-boots RHEL, CentOS, Fedora, SUSE, and Debian. On my Windows machine, for WordPress development, I use 10up’s WP Local Docker under Docker for Windows and Hyper-V. I still live mostly at a Linux Bash prompt, either in Git BASH, a Linux terminal emulation for Windows, or in Ubuntu running on Windows Subsystem for Linux (WSL). With WSL I can run Ansible on Windows without having to spin up a Linux VM. When I need a Red Hat/CentOS flavor of Linux, which is not yet available to run under WSL, I either use my Linux laptop or a Hyper-V VM on the Windows machine. I am currently in the process of architecting a new file and web host server which will run Red Hat 8, and I plan to make extensive use of containers.

Published by Robert Lilly

Robert Lilly provides Information Technology consulting and services at RCLilly IT.
