It's Netrobe here again. I'd love to introduce you to an open-source project I built with Django, Python, and many other amazing tools. It helped my team members at Spacepen deploy web products built with React, PHP, HTML, and Python without needing me, their Cloud Operations Engineer, to be involved every time. It was an amazing project: it allowed frontend designers and backend developers to enable auto-deployment on GitHub repositories, so a project is automatically redeployed from GitHub to the production server on every update to the repository. This sped up product delivery and made iteration testing in a production environment much easier.
Available on GitHub.
It's an open-source project and open to contributions.
Problem statement 😖
There are several team members and just one DevOps engineer. Most of the team members are frontend developers placed in separate groups to work on separate projects; some are backend developers. What they all have in common is that they need to deploy their projects to the business server so they can test in production without hassle. Most of these deployments are iterations, for example,
build -> deploy -> test -> debug -> build …. Each of these deployments needs the DevOps engineer, Netrobe, to be available, which is usually stressful for both him and the team. Mind you, there was no time for each developer to learn CI/CD tooling; we just needed a simple way for them to deploy without stress.
No one hates stress more than I do, I like automating everything. If it's possible to automate a certain task, I'm going to do it. 😁
Solution statement 💪🏽
Now the solution is to give team members the ability to deploy and test each product on the server without waiting for the DevOps engineer to be available. They never touch the server directly; instead, they deploy and iterate on new products seamlessly in production through a friendly web interface.
Solution Chart 😎
SSH Connection 🎯
I planned to use the Python Paramiko module to connect to the server through SSH (Secure Shell). This allows me to send commands to the server securely and automatically, for example, commands to deploy Nginx or Apache websites depending on which web server the connected server is configured to use.
GitHub API 🎯
Also, I used the GitHub API and GitHub Webhooks to automatically redeploy websites on the server whenever a new update is pushed to a deployed repository. A logged-in user can connect their GitHub account to the software and configure which repositories may be deployed and which should not be. With this implementation, they can easily select a repository they want to deploy to production.
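To make the webhook part concrete, here is a hedged sketch of receiving a GitHub push event: verifying GitHub's `X-Hub-Signature-256` header (an HMAC of the request body) and pulling out the repository name so a redeploy can be queued. The secret and the trimmed payload are illustrative assumptions, not the project's actual code:

```python
# Sketch: verify and handle a GitHub push webhook.
import hmac
import hashlib
import json

def verify_signature(payload: bytes, secret: str, signature_header: str) -> bool:
    """Check GitHub's X-Hub-Signature-256 header against an HMAC of the body."""
    expected = "sha256=" + hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_header)

def handle_push(payload: bytes) -> str:
    """Extract the repository name from a push event so a redeploy can be queued."""
    event = json.loads(payload)
    return event["repository"]["full_name"]

# A (heavily trimmed) payload shaped like a GitHub push event:
body = json.dumps({"repository": {"full_name": "netrobe/demo-app"}}).encode()
sig = "sha256=" + hmac.new(b"s3cret", body, hashlib.sha256).hexdigest()
assert verify_signature(body, "s3cret", sig)
print(handle_push(body))  # → netrobe/demo-app
```

In the real app these functions would sit behind a Django view that answers GitHub's POST requests.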
Linode API 🎯
Lastly, Linode was the VPS provider used by the company, so I used Linode's API, which is very robust, to automatically create, update, and delete domain records when deploying new repositories. When a user is about to deploy a repository, they can set the domain name for the product right there. This lets them deploy different versions of products to production under different domain names.
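For illustration, creating an A record through Linode's v4 API can look like the sketch below. The endpoint and fields follow Linode's public API documentation; the token, domain id, and helper name are placeholder assumptions:

```python
# Hedged sketch: create an A record via Linode's v4 API with requests.
import requests

LINODE_API = "https://api.linode.com/v4"

def create_a_record(token: str, domain_id: int, name: str, target_ip: str) -> dict:
    """POST a new A record so a freshly deployed repo gets its own subdomain."""
    resp = requests.post(
        f"{LINODE_API}/domains/{domain_id}/records",
        headers={"Authorization": f"Bearer {token}"},
        json={"type": "A", "name": name, "target": target_ip},
    )
    resp.raise_for_status()
    return resp.json()
```

Deleting or updating records works the same way against the `/domains/{id}/records/{record_id}` endpoint.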
How did I come up with this solution
First of all, this might not be the best solution for this problem. I knew I would need to talk to the server automatically to do some things. I use SSH almost every day to connect to several servers, so I knew all I needed to do was find a way to do it from Python. During my research, I came across Paramiko. Its documentation was quite poor, so I learned a lot about finding resources and code online. I had to do a good amount of searching on Google and narrow down my results by adding the right keywords. This is one of the best skills to have, so keep practicing it daily. I have never had a reason to ask a question on StackOverflow myself, even when I hit complex problems; I keep looking for clues on the web instead.
Also, as for the GitHub API, I simply searched "GitHub API" and "GitHub Webhook" to find out whether GitHub offered an API and webhooks. The documentation for their APIs is pretty easy to use. I searched for Linode's API too; it was nice, and it's even possible to launch new servers with it. I learned how to work with multiple APIs easily in a Pythonic way. Following the DRY principle, I created a base API class that API-specific classes for Linode and GitHub inherit from. This made it easy to avoid repeating header authorization, get, post, and other methods in both classes. With this implementation, if I want to add GoDaddy's API, Namecheap's API, or any other API, it would be easy to get started and focus on consuming the API instead of reinventing the wheel.
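A minimal sketch of that base-class idea: shared auth headers and HTTP verbs live in one place, and each provider class only adds its base URL. The class and method names here are illustrative, not the project's actual ones:

```python
# Sketch of the DRY base API class: subclasses only supply a base URL.
import requests

class BaseAPI:
    base_url = ""

    def __init__(self, token: str):
        # One session carries the Authorization header for every request.
        self.session = requests.Session()
        self.session.headers["Authorization"] = f"Bearer {token}"

    def get(self, path: str, **kwargs):
        return self.session.get(self.base_url + path, **kwargs)

    def post(self, path: str, **kwargs):
        return self.session.post(self.base_url + path, **kwargs)

class GitHubAPI(BaseAPI):
    base_url = "https://api.github.com"

class LinodeAPI(BaseAPI):
    base_url = "https://api.linode.com/v4"

# Adding, say, a hypothetical NamecheapAPI would be one subclass away.
```

The design choice here is inheritance for shared plumbing, keeping each provider class focused on its own endpoints.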
Furthermore, I learned more Linux Bash commands while working over SSH. I had to run some complex commands automatically, for example, finding files, checking if files exist, creating files, moving and copying files, reading log files from the server to the website, filtering log files, and many more. I also had to learn and use other design patterns, such as Composition over Inheritance, the Bridge pattern, the Adapter pattern, and the Singleton pattern, to create reusable components. I had to build dynamic Apache and Nginx configurations, which depend on the domain name, deploy folder, and log paths.
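For illustration, a dynamic Nginx server block can be generated from those three inputs (domain, deploy folder, log paths) with a simple template. The template below is a minimal assumption, not the project's real configuration:

```python
# Illustrative sketch: render an Nginx server block from deploy details.
NGINX_TEMPLATE = """server {{
    listen 80;
    server_name {domain};
    root {deploy_dir};
    access_log {log_dir}/{domain}.access.log;
    error_log {log_dir}/{domain}.error.log;
    location / {{
        try_files $uri $uri/ /index.html;
    }}
}}
"""

def render_nginx_conf(domain: str, deploy_dir: str, log_dir: str) -> str:
    """Fill the template; the result can be written to sites-available over SSH."""
    return NGINX_TEMPLATE.format(domain=domain, deploy_dir=deploy_dir, log_dir=log_dir)

print(render_nginx_conf("demo.example.com", "/var/www/demo", "/var/log/nginx"))
```

An Apache virtual host can be generated the same way with a different template string.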
Most of the things I learned are:
- Linux commands
- Bash Scripting
- Using the Python Requests module: authorization with headers
- Working with APIs
- Design Patterns: Composition over Inheritance, the Bridge pattern, the Adapter pattern, the Singleton pattern
- Redis Database: used for fast in-memory caching and as a message broker. The message broker part helps the Django app manage asynchronous deployment processes.
- The Python os module, its amazing pathlib module, the json module, and more
- Python Paramiko module
- Python Generators
- GitHub API and GitHub Webhooks: read and track repositories
- Consuming webhooks properly
- Linode API: managing domain records
- Apache and Nginx: web servers used to deploy the repositories on the connected servers
- UML diagrams and Flowcharts: helped me understand the process before implementing it in Python
These are some of the tools I learned while building the project but did not use for it.
- GitHub Actions: set up CI/CD for your repositories: build, test, deploy. I later used this knowledge for other projects; it's an amazing tool to learn. It's very good when collaborating with other developers, since you can easily set up tests to run anytime a branch is about to be merged into the master branch. It was only after I learned this tool that I found out most big open-source projects use it; they collaborate with thousands of other developers and have to release builds very frequently. Get started with GitHub Actions.
- Message Queues: Redis is not a full message queue, but it could do what I needed. Tools I got to know better were Kafka and RabbitMQ. Message queues are excellent and can be used when building decoupled services that need to communicate with each other.
I later found some tools that are better than Paramiko; it's hard to run sudo commands securely with Paramiko. A little tweaking of the way sudo commands are called makes it possible, but in the long run it's not secure: it's more of a hack, and that approach is ancient. The better tools to use instead of Paramiko are Fabric and Invoke. These tools are built on top of Paramiko, thanks to Jeff Forcier. My love for open-source projects is enormous. I am planning to use these modules in place of Paramiko. If you want to contribute, you can contact me on GitHub and let's learn together. 💪🏽😁
There are a lot of tools out there that a developer can learn to automate this kind of thing, but I mostly wanted to try it myself and help my team members along the way. By doing that, I learned a lot in the process. Don't say someone has done it before; push yourself to the limit. When you do that, you will know you have no limit.