Project: Deploying Two-Tier Application on AWS using Automated CI/CD Pipeline
Table of contents
- 🌟Introduction:
- 🌟Pre-requisites:
- 🌟Jenkins Installation:
- 🌟 Jenkins Setup for First Time:
- 🌟Docker & Docker Compose
- 🌟Create Dockerfile & Docker-Compose file
- 🌟Create a Declarative CI/CD Pipeline:
- 🌟 Two-Tier-App Project:
- 🌟GitHub Webhook Trigger:
- 🌟Create a Jenkins file For GITSCM Polling:
- 🌟 Build the Pipeline:
- 🎯Conclusion:
🌟Introduction:
This article will demonstrate how to create a two-tier Flask application🐍🌐 that has two primary components. First, there is the Flask application itself, written in Python 3.x. Second, there is the MySQL database: its integration is already provided by the 👩💻developer in the Python code, but installing and configuring the database server is our responsibility. The developer wrote the code and pushed it to GitHub 🔄📂. As DevOps engineers, our job is to establish a seamless pipeline that autonomously retrieves the developer's code from the GitHub repository and then builds and tests it through Jenkins CI/CD. Afterward, Jenkins deploys the code onto a separate AWS server within a 🏭🐳containerized environment.
🌟Pre-requisites:
✨ Familiar with AWS, Docker, Docker-Compose and Jenkins
✨ 2x AWS EC2 instances (Ubuntu) set up and in a running state
✨ Must have a GitHub account
✨ Fork the "2-tier-flask-app" repository from "ahmednisarhere" to your GitHub account.
✨ Must have a Docker Hub account
🌟Jenkins Installation:
🚀Java Installation
Jenkins requires 📦Java for its operation, yet certain distributions don't include it by default. Update the apt repositories, install OpenJDK 17, and verify the installation with the commands:

```shell
sudo apt update
sudo apt install openjdk-17-jre
java -version
```

Expected output:

```
openjdk version "17.0.7" 2023-04-18
OpenJDK Runtime Environment (build 17.0.7+7-Debian-1deb11u1)
OpenJDK 64-Bit Server VM (build 17.0.7+7-Debian-1deb11u1, mixed mode, sharing)
```
🚀Jenkins Installation
You can set up Jenkins using the following steps.
```shell
curl -fsSL https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key | sudo tee \
  /usr/share/keyrings/jenkins-keyring.asc > /dev/null
echo deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] \
  https://pkg.jenkins.io/debian-stable binary/ | sudo tee \
  /etc/apt/sources.list.d/jenkins.list > /dev/null
sudo apt-get update
sudo apt-get install jenkins
```
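On Debian and Ubuntu the package normally starts Jenkins as a systemd service on its own, but it is worth confirming before moving on. A quick sanity check, assuming a systemd-based host:

```shell
# Ensure Jenkins starts now and on every boot
sudo systemctl enable --now jenkins

# Confirm the service is active (it listens on port 8080 by default)
sudo systemctl status jenkins --no-pager
```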
Please visit Jenkins's official website to stay up to date with the latest release.
🌟 Jenkins Setup for First Time:
Following the successful installation of Jenkins, open your Jenkins EC2 instance's security group 🔒 in the AWS console and add "Inbound rules" according to the instructions displayed in the image below. Port "5000" corresponds to our application, while port "8080" is for Jenkins. To streamline the process, open both of these ports to any IPv4 address. This simplifies accessibility, allowing anyone with the correct 🌐 IP and port information to connect.
After modifying the "Inbound rules," locate the 🌐public IPv4 address and enter "your-IPv4:8080" into the browser's address bar. This takes you to the unlock page. Retrieve the "Administrator password"🔒🔑 as depicted in the image ("sudo cat /var/lib/jenkins/secrets/initialAdminPassword"), paste it into the provided field, and click the "Continue" button.
Once you've entered the initial 🔑 admin password, you'll be directed to the plugin installation page. Simply click on the "Install suggested plugins" option. 🛠️
Jenkins will take care of automatically installing 🛠️the suggested plugins, as shown below.
Once the recommended plugins are successfully installed on the following page, your next step is to establish the initial 🛠️👤admin user. This user account will be utilized moving forward to access Jenkins.
Afterward, Jenkins will validate its 🔗URL by displaying it. Just proceed by clicking on the "Save" button to 👌complete the setup.
Subsequently, you will be taken 🔜back to the main page🏠.
🌟Docker & Docker Compose
🐳Docker Installation
```shell
sudo apt update
sudo apt-get install docker.io
```
🐳 Docker-Compose Installation
```shell
sudo apt update
sudo apt-get install docker-compose
```
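One easy-to-miss step on the Jenkins instance: the jenkins service user must be allowed to talk to the Docker daemon, or the pipeline's docker and docker-compose commands will fail with permission errors. A common fix, assuming the default "jenkins" user created by the package, is:

```shell
# Allow the jenkins user (and optionally your own) to run Docker without sudo
sudo usermod -aG docker jenkins
sudo usermod -aG docker $USER

# Restart Jenkins so the new group membership takes effect
sudo systemctl restart jenkins
```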
🌟Create Dockerfile & Docker-Compose file
In our scenario, a developer 👨💻 has committed code to the GitHub repository "2-tier-flask-app," and our responsibility is to move this code into a containerized setting. To accomplish this, we need to create a 🐳 Dockerfile that will be used to build the image and subsequently launch the container.
Dockerfile:
```dockerfile
# Use an official Python runtime as the base image
FROM python:3.9-slim

# Set the working directory in the container
WORKDIR /app

# Install required packages for the system
RUN apt-get update \
    && apt-get upgrade -y \
    && apt-get install -y gcc default-libmysqlclient-dev pkg-config \
    && rm -rf /var/lib/apt/lists/*

# Copy the requirements file into the container
COPY requirements.txt .

# Install app dependencies
RUN pip install mysqlclient
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application code
COPY . .

# Specify the command to run your application
CMD ["python", "app.py"]
```
Next, we need to write a "docker-compose.yml" file to run the two containers that must interact for the application to work. In this compose file, the backend container depends on the database container. Compose will first fetch the "mysql:8" image from DockerHub 🐳 and launch the database container with a mounted volume and credentials set as environment variables. Once the database container is up, the second container, the backend, is built from the Dockerfile written above. Through environment variables, the backend is told how to connect to the database container with the provided credentials. Finally, the backend container is exposed on port "5000."🚀
docker-compose.yml:
```yaml
version: '3'

services:
  backend:
    build:
      context: .
    ports:
      - "5000:5000"
    environment:
      MYSQL_HOST: mysql
      MYSQL_USER: root          # ${MYSQL_USER}
      MYSQL_PASSWORD: ${MYSQL_ROOT_PASSWORD}
      MYSQL_DB: ${MYSQL_DATABASE}
    depends_on:
      - mysql

  mysql:
    image: mysql:8
    environment:
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
      MYSQL_DATABASE: ${MYSQL_DATABASE}
      MYSQL_USER: ${MYSQL_USER}
      MYSQL_PASSWORD: ${MYSQL_PASSWORD}
    volumes:
      - mysql-data:/var/lib/mysql  # Mount the volume for MySQL data storage

volumes:
  mysql-data:
```
🌟Create a Declarative CI/CD Pipeline:
To initiate a job 🛠️, press "Create a job," and in the subsequent window, opt for "Pipeline" 🔨as the job type. Assign a name to the pipeline and then proceed by clicking on the OK button.
Note: Don't use spaces in the pipeline name.
Upon generating a 📝pipeline job, the subsequent window will emerge as depicted below. At this point, provide a description for the pipeline, select the GitHub project, and paste the 🔗 URL of your GitHub repository.
Scroll down and mark the ✅checkbox labeled "GitHub hook trigger for GITScm polling." Enabling this checkbox informs the pipeline that it will be triggered through a GitHub webhook 🔗.
🚀 Now here comes the main part: the pipeline script📜. This is where you write a Groovy script for your pipeline.
🌟 Two-Tier-App Project:
Within this pipeline 🚀, it's essential to employ a ".env" file containing the MySQL and app container credentials, stored in the Jenkins home directory (/var/lib/jenkins). Alternatively, this can be achieved with a Jenkins secret file. Furthermore, Jenkins 🔐credentials are used here for 🐳DockerHub access and the SSH connection to the AWS deployment server.
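For reference, a minimal ".env" such as the pipeline reads from the Jenkins home directory could look like the following sketch. The variable names come from the docker-compose file; every value shown here is a placeholder you must replace with your own secrets:

```shell
# Write a sample .env; all values below are placeholders, not real secrets
cat > .env <<'EOF'
MYSQL_ROOT_PASSWORD=changeme-root
MYSQL_DATABASE=two_tier_db
MYSQL_USER=flaskapp
MYSQL_PASSWORD=changeme-app
EOF

# docker-compose reads this file via --env-file; the same variables can be
# loaded into the current shell for a quick sanity check:
set -a; . ./.env; set +a
echo "$MYSQL_DATABASE"   # prints two_tier_db
```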
Groovy Syntax:
```groovy
pipeline {
    agent any
    stages {
        stage("Clone the Code From GitHub") {
            steps {
                echo "Pull the Code from GitHub"
                git url: "https://github.com/ahmednisarhere/2-tier-flask-app.git", branch: "main"
            }
        }
        stage("Build & Test the Code") {
            steps {
                echo "Build the Code"
                sh "docker-compose down && docker-compose --env-file ~/.env up -d"
                sh "docker cp message.sql \$(docker ps -f name=mysql -q):/docker-entrypoint-initdb.d/"
                echo "Test the Code"
                sleep time: 35, unit: 'SECONDS'
                sh 'curl -Is http://localhost:5000 | head -n 1'
            }
        }
        stage("Push the Image to DockerHub") {
            steps {
                // Retrieve the DockerHub username and password from Jenkins credentials
                withCredentials([usernamePassword(credentialsId: "dockerHub", passwordVariable: "pass", usernameVariable: "user")]) {
                    sh "docker login -u ${env.user} -p ${env.pass}"
                    sh "docker commit \$(docker ps -f name=mysql -q) ${env.user}/two-tier-db:latest"
                    sh "docker commit \$(docker ps -f name=backend -q) ${env.user}/two-tier-app:latest"
                    sh "docker push ${env.user}/two-tier-app:latest"
                    sh "docker push ${env.user}/two-tier-db:latest"
                    sh "docker-compose down"
                    echo "Image Pushed"
                }
            }
        }
        stage("Deploy to AWS Server") {
            steps {
                withCredentials([sshUserPrivateKey(credentialsId: "my-key", keyFileVariable: "MY_SSH_KEY1", usernameVariable: "SSH_USER1")]) {
                    // Replace <deploy-server-ip> with your AWS deployment server's public IP
                    sh "ssh -o StrictHostKeyChecking=no -i ${MY_SSH_KEY1} ${SSH_USER1}@<deploy-server-ip> 'sudo rm -rf 2-tier-flask-app && git clone -b prod https://github.com/ahmednisarhere/2-tier-flask-app.git && cd 2-tier-flask-app && docker-compose up -d'"
                }
                echo "Deployed"
            }
        }
    }
}
```
🌟GitHub Webhook Trigger:
Navigate to your GitHub repository and access the settings, as indicated. 👉⚙️
In the settings, locate "Webhooks" in the left-hand 👈 column and then give it a click. 🔗
Click on the 🔗 "Add webhook" option located on the right side. 👆
Enter the Jenkins URL 🌐🔗 in this format: "http://your_ip:8080/github-webhook/", substituting the public IP of your Jenkins server and keeping the trailing slash.
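As a sketch of the expected payload URL shape (203.0.113.10 is a documentation placeholder, not a real address):

```shell
# Substitute your Jenkins server's public IPv4; note the trailing slash,
# which the GitHub plugin's /github-webhook/ endpoint expects
JENKINS_PUBLIC_IP=203.0.113.10
WEBHOOK_URL="http://${JENKINS_PUBLIC_IP}:8080/github-webhook/"
echo "$WEBHOOK_URL"   # prints http://203.0.113.10:8080/github-webhook/
```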
Once the webhook is set up successfully, a ✅ green checkmark will become visible.
🌟Create a Jenkinsfile For GITScm Polling:
🔧 The next step is configuring the Jenkins pipeline via a Groovy script stored on GitHub. We specify the exact file location as indicated below. To start, select "Pipeline script from SCM" from the pipeline definition options and set the parameters as follows: 🛠️
SCM : Git
Repository URL : Your-Github-Project-Repo
Credentials: None (Public Repository)
Branch: Your-Repo-Branch
Script Path: Name of Groovy Script/Path (Jenkinsfile)
🌟 Build the Pipeline:
Your project is all set to be built and will now trigger 🏗️ whenever there's a push 📦 to your GitHub repository! 🚀
🎯Conclusion:
This demo project showcases DevOps tools and practices, illustrating how an automated CI/CD pipeline functions. By utilizing these tools and methods, it becomes possible to release a constant flow of software updates into production, quickening release cycles, lowering costs, and reducing the risks associated with development.