At Least I Have Chicken
Sometimes programming is less about the code you’re writing and more about the tools you’re using. I do not manage this website in an industrial capacity, and I feel that my selection of tools reflects that.
I have written this website using .Net – Razor Pages, to be precise – so I feel Microsoft Visual Studio 2022 Community Edition and Microsoft SQL Server Management Studio suffice for my needs. I have been meaning to convert this website from Razor Pages to the more up-to-date Blazor, but that seems like a lot of effort for something that already works fine. Razor Pages might not be the hip new framework that the kids are using, and asynchronous component reloads are borderline impossible without writing your own AJAX scripts, but for a simple website like mine it does the job.
However, I recently did upgrade my toolbox with a whole new integrated system. See, I just finished reading a book called The DevOps Handbook. In my opinion it was mostly an underwhelming book, but it did contain some valuable nuggets of information. I read the second edition, which contained some real-life examples of implementing DevOps in big businesses. Most of the time a chapter would begin with theory about a certain aspect of DevOps and end with a concrete example, which amounted to the same as the beginning but with proper nouns mixed in. After finishing it I’m not sure if I learned more about DevOps or the business culture at Etsy.
Back to the topic. I use DevOps, kanban boards and CI/CD pipelines every day at my work. However, I realized that there is a staggering contrast between my work methods and my hobbyist methods. When I first started in web development in 2019, my job required me to develop and maintain a website using Web Forms and then deploy it to the server using an FTP tool. Eventually my job evolved to developing a Razor Pages project as well, but the methods stayed the same. Only two job changes later was I introduced to continuous integration and continuous deployment.
It has been five years since I started my career in web development, but even as late as this January, whenever I needed to update this very website, I would publish the files to a local folder, drag them over to the server using FileZilla and replace the existing server files with the new ones. I don’t want to pull back the curtain too much, but for security reasons this also involved some tinkering with the server firewall. It was all very inconvenient.
Even though I didn’t learn much new from The DevOps Handbook, it did give me the inspiration to finally implement a CI/CD pipeline on my server, so I could deploy website updates like a goddamn professional instead of relying on my former caveman methods.
I familiarized myself with Jenkins, an open-source automation server. Getting it installed and working on my server was a bit of an arcane process, but I finally managed to set it up and get it running. Mind you, this whole website generates zero revenue for me, so I use the lowest server tier available, meaning this whole website runs on crackers and spit. After installing Jenkins I did have to double the server’s memory, so that’s how much of an improvement I considered it.
I am a solo developer, so I rarely use more than one Git branch in my projects. When there is nobody else to mess up your version history, there are also no merge conflicts. With Jenkins up and running, however, I did create a new release branch alongside the master branch. I set up Jenkins to communicate with GitHub, so whenever I push to the release branch, an automatic build and deployment starts. This way I can safely work on the website in the master branch, and whenever I deem it ready for publishing, I merge it into the release branch and push the changes to GitHub.
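For the curious, a minimal sketch of what such a pipeline can look like as a declarative Jenkinsfile. This is a hypothetical example, not my actual configuration: the project file, deploy path and service name are placeholders, and the `githubPush()` trigger assumes the Jenkins GitHub plugin is installed and the job is restricted to the release branch.

```groovy
// Hypothetical Jenkinsfile sketch -- project file, deploy path and
// systemd service name ("mysite") are placeholders, not a real setup.
pipeline {
    agent any
    triggers {
        // Fired by the GitHub webhook on a push; the job itself is
        // configured to build only the release branch.
        githubPush()
    }
    stages {
        stage('Build') {
            steps {
                sh 'dotnet publish MySite.csproj -c Release -o publish'
            }
        }
        stage('Deploy') {
            steps {
                // Copy the published output over the live site files
                sh 'rsync -a publish/ /var/www/mysite/'
                sh 'sudo systemctl restart mysite'
            }
        }
    }
}
```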
Since I can now update my website whenever I have spare time, it could also go down at any hour of the day for the duration of an update. IIS has a convenient feature: when it detects an app_offline.htm file in the website root folder, it displays that file instead of the website, so the site can go down while users still see a maintenance message. This website runs on an Ubuntu server with NGINX, so I had to configure NGINX manually to get the same feature. I created a simple app_offline.htm file and committed it to version control. Then I added a new step to the Jenkins deploy process: the file is copied into the root folder at the beginning of the deployment and removed when everything is finished. In the NGINX configuration, I set it to serve the client the app_offline.htm file if it is detected in the root folder; otherwise it serves the actual website as usual.
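In NGINX terms, that kind of setup can be sketched roughly like this. Again, this is a hedged example rather than my real configuration: the domain, root path and Kestrel port are placeholders, and it assumes the app sits behind NGINX as a reverse proxy.

```nginx
# Hypothetical sketch -- domain, paths and port are placeholders.
server {
    listen 80;
    server_name example.com;
    root /var/www/mysite;

    # While the deploy step has app_offline.htm in place,
    # answer every request with the maintenance page.
    if (-f $document_root/app_offline.htm) {
        return 503;
    }
    error_page 503 @maintenance;
    location @maintenance {
        rewrite ^ /app_offline.htm break;
    }

    # Otherwise, pass requests to the app as usual.
    location / {
        proxy_pass http://127.0.0.1:5000;
    }
}
```

This is essentially the classic maintenance-mode pattern: the file check flips every request to a 503, and the named location serves the static maintenance page as the 503 body.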
Now I don’t have to bother with FTP tools or manage the server portion of the website manually; Jenkins does it for me. This system is fairly standard for people in web development, but I was only recently inspired to implement it, even though I consider myself a skilled professional. There are probably more methods I could put into practice, but there is a limit to cost-efficiency with hobbyist projects like this one. First and foremost, I really should update this website to use Blazor. Maybe down the line I could add more unit tests or create a whole program that runs the whole server for me. The best part of being a programmer is that when it comes to code, the sky’s the limit.