In this video, we'll walk you through building a complete CI/CD pipeline using GitHub, Jenkins (running in a container), SonarQube (containerized), npm, Docker, Trivy, Amazon ECR, and Amazon ECS with Fargate. This end-to-end pipeline automates the process from code commit to production deployment, with security and quality checks along the way.
Watch the comprehensive tutorial on YouTube: CI/CD Pipeline with Jenkins, Docker, and AWS ECS
Comprehensive Guide to CI/CD Pipeline with Jenkins, Docker, and AWS ECS
This tutorial provides a detailed walkthrough of building a CI/CD pipeline using Jenkins, Docker, and AWS ECS, aimed at improving software development efficiency and security.
- Setting Up Jenkins: Automated code builds from GitHub repositories.
- SonarQube Integration: Vulnerability checks to ensure secure applications.
- Docker for Image Creation: Building and managing Docker images.
- AWS ECS Deployment: Using AWS Fargate for simplified infrastructure management.
- Resource Management: Cleaning up resources to prevent unnecessary costs.
The video demonstrates the creation of an end-to-end Continuous Integration and Continuous Deployment (CI/CD) pipeline with the following steps:
- Code Build and Vulnerability Checks
- Jenkins setup for GitHub integration.
- Running unit tests and scanning for vulnerabilities using SonarQube.
- Docker Integration
- Building Docker images.
- Pushing images to Amazon Elastic Container Registry (ECR).
- Deployment to AWS
- Deploying Docker images to Amazon ECS with AWS Fargate.
- Setting up Jenkins: Configuring GitHub credentials and defining pipeline stages using a Jenkinsfile.
- Running Tests: Unit tests with npm and code quality analysis with SonarQube.
- Building Docker Images: Creating lightweight Docker images using Alpine.
- Vulnerability Scanning: Using Trivy to scan Docker images.
- AWS Integration:
- Pushing Docker images to AWS ECR.
- Deploying to AWS ECS using Fargate.
- Infrastructure Setup:
- Creating ECS clusters and services.
- Configuring load balancers and security groups.
- Application Management:
- Deploying and managing containerized applications.
- Monitoring performance and ensuring public accessibility.
- Cost Optimization:
- Deleting unused resources to prevent unnecessary expenses.
Run the command below to create the Jenkins container:

```bash
docker run -d --name jenkins-dind \
  -p 8080:8080 -p 50000:50000 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v $(which docker):/usr/bin/docker \
  -u root \
  -e DOCKER_GID=$(getent group docker | cut -d: -f3) \
  jenkins/jenkins:lts
```

Check that the Jenkins container is running:

```bash
docker ps
```

Log into the Jenkins container:

```bash
docker exec -it jenkins-dind bash
```

Run the following commands in the Jenkins container's terminal:

```bash
groupadd -for -g $DOCKER_GID docker
usermod -aG docker jenkins
exit
```

Restart the Jenkins container:

```bash
docker restart jenkins-dind
```

Check the Jenkins container logs to get Jenkins' initial admin password:

```bash
docker logs -f jenkins-dind
```

In your browser, open `http://PublicIP:8080`, paste the initial admin password, and install the suggested plugins. Then create an admin user account with a username, password, email, etc.
Create a Classic Personal Access Token on GitHub. On your GitHub account:
- Click on your user account
- Click on Settings
- Go to Developer settings, select Personal access tokens, and click Tokens (classic)
- Click Generate new token, and select Generate new token (classic)
- Set Note: jenkins-git-dind and Expiration: 90 days
- Scopes (select the following): `repo` and `admin:repo_hook` (for webhooks)
- Generate the token and save it somewhere safe
Add the GitHub Personal Access Token to Jenkins Credentials On the Jenkins UI,
- Click on Manage Jenkins
- Click Credentials
- Under System, click global, and Add Credentials
- Select Kind: "Username with password", Scope: Global, Username: your GitHub username, Password: paste the jenkins-git-dind token here, ID: "jen-git-dind", and Description: "jen-git-dind". Click Create.
Create a Jenkinsfile (a.k.a. Pipeline as Code)
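A minimal declarative Jenkinsfile for this pipeline might look like the sketch below. The credential ID `jen-git-dind` matches the one created above; the SonarQube server name, repository URL, ECR registry/repository, and cluster/service names are placeholder assumptions you should replace with your own values:

```groovy
pipeline {
  agent any
  environment {
    // Placeholders below are assumptions -- substitute your own values
    AWS_REGION   = 'us-east-1'
    ECR_REGISTRY = '123456789012.dkr.ecr.us-east-1.amazonaws.com'
    ECR_REPO     = "${ECR_REGISTRY}/my-app"
    IMAGE_TAG    = "${BUILD_NUMBER}"
  }
  stages {
    stage('Checkout') {
      steps {
        git credentialsId: 'jen-git-dind',
            url: 'https://github.com/your-user/your-repo.git',
            branch: 'main'
      }
    }
    stage('Unit Tests') {
      steps {
        sh 'npm install && npm test'
      }
    }
    stage('SonarQube Analysis') {
      steps {
        // 'sonarqube' must match the server name under Manage Jenkins > System
        withSonarQubeEnv('sonarqube') {
          sh 'npx sonar-scanner'
        }
      }
    }
    stage('Build Docker Image') {
      steps {
        sh 'docker build -t $ECR_REPO:$IMAGE_TAG .'
      }
    }
    stage('Trivy Scan') {
      steps {
        sh 'trivy image $ECR_REPO:$IMAGE_TAG'
      }
    }
    stage('Push to ECR') {
      steps {
        sh '''
          aws ecr get-login-password --region $AWS_REGION | \
            docker login --username AWS --password-stdin $ECR_REGISTRY
          docker push $ECR_REPO:$IMAGE_TAG
        '''
      }
    }
    stage('Deploy to ECS') {
      steps {
        sh 'aws ecs update-service --cluster my-cluster --service my-service --force-new-deployment --region $AWS_REGION'
      }
    }
  }
}
```

The exact stage list can vary; the shape above simply mirrors the steps this tutorial walks through (test, scan, build, scan image, push, deploy).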
Set the following recommended values. Switch to the root user first:

```bash
sudo su
```

or

```bash
sudo -i
```

Run the commands:

```bash
sysctl -w vm.max_map_count=524288
sysctl -w fs.file-max=131072
ulimit -n 131072
ulimit -u 8192
```

Exit the root user:

```bash
exit
```

Run the command below to create the SonarQube container (give it a few minutes to get ready):

```bash
docker run -d --name sonarqube-dind \
  -e SONAR_ES_BOOTSTRAP_CHECKS_DISABLE=true \
  -p 9000:9000 \
  sonarqube:latest
```

Check the logs of the SonarQube container:

```bash
docker logs -f sonarqube-dind
```

In your browser, open `http://PublicIP:9000`. Username: admin, Password: admin. Update your password by providing a new one.
Create a local project in SonarQube with a project key, the GitHub branch (main), and "Use the global setting".
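If you run the scanner from the repository root, a minimal sonar-project.properties file can point it at the project created above. This is a sketch; `my-app` is a placeholder project key, and the server URL and token are usually injected by Jenkins' SonarQube configuration rather than stored in this file:

```properties
# Must match the project key created in the SonarQube UI (placeholder value)
sonar.projectKey=my-app
# Analyze everything in the repo root, excluding dependencies
sonar.sources=.
sonar.exclusions=node_modules/**
```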
Create a SonarQube token and add it to Jenkins credentials:
- Click on the user account icon, then click "My Account"
- Go to Security and create a token of type "Global Analysis Token" with an expiry date
- Copy and save the token somewhere safe
- Go to Jenkins Credentials, select Kind "Secret text", paste the Sonar token as the secret, and provide an ID and description
Go to Jenkins > Manage Jenkins > System > SonarQube servers, add the SonarQube URL (http://PublicIP:9000), and select the token. Apply and Save.
Go to Jenkins > Manage Jenkins > Tools, and add a SonarQube Scanner installation (Install automatically). Save and Apply.
Put both the Jenkins and SonarQube containers on the same Docker network. List the existing networks:

```bash
docker network ls
```

Create a Docker network:

```bash
docker network create dind-network
```

Add the Jenkins and SonarQube containers to dind-network:

```bash
docker network connect dind-network jenkins-dind
docker network connect dind-network sonarqube-dind
```

Log into the Jenkins container to establish communication with the SonarQube container:

```bash
docker exec -it jenkins-dind bash
```

Update and install ping in the Jenkins container once you've logged in:

```bash
apt-get update
apt-get install unzip curl iputils-ping -y
```

Ping the SonarQube container from the Jenkins container:

```bash
docker exec -it jenkins-dind ping sonarqube-dind
```

You should see bytes of data coming in, showing an established connection between the two containers. Exit the Jenkins container:

```bash
exit
```

Log in to the Jenkins container:
```bash
docker exec -it jenkins-dind bash
```

Check if Trivy is installed in the Jenkins container:

```bash
trivy --version
```

If it isn't, install it (the container runs as root, so sudo isn't needed):

```bash
curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/install.sh | sh -s -- -b /usr/local/bin v0.62.1
```

Check that the Trivy installation was a success, then exit:

```bash
trivy --version
exit
```

Restart the Jenkins container (optional but recommended):

```bash
docker restart jenkins-dind
```

Log in to the Jenkins container and install the AWS CLI:

```bash
docker exec -it jenkins-dind bash
apt-get update -y
apt-get install unzip curl -y
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
./aws/install
```

Check the AWS CLI version:

```bash
aws --version
```

Go to the AWS Management Console > IAM > Create an IAM user (attach the AmazonEC2ContainerRegistryFullAccess policy).
Go to the IAM User > Click Security credentials > Create Access Keys (with CLI use-case) > Create & Download Access keys.
On your terminal, log in to the Jenkins container:

```bash
docker exec -it jenkins-dind bash
```

Configure the AWS IAM user:

```bash
aws configure
```

Copy and paste the Access key ID, Secret access key, region, etc., and exit.
Note: Add Access key ID & Secret Access key to Jenkins credentials as Kind: AWS Credentials.
Create an Amazon ECR repository in the AWS Management Console; the console provides the commands to log in, tag, and push an image to ECR.
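The console's "View push commands" dialog shows repository-specific commands; they generally follow the shape below. This is a sketch only: the account ID, region, and repository name `my-app` are placeholders, not values from this tutorial.

```bash
# Authenticate Docker to the ECR registry (placeholder account/region)
aws ecr get-login-password --region us-east-1 | \
  docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

# Tag the locally built image for the ECR repository
docker tag my-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest

# Push the image
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
```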
Next, create an Amazon ECS cluster (Fargate) in the AWS Management Console to serve the Docker image pushed to Amazon ECR:
- Create cluster > Select Fargate
- Click Task definitions in the left panel and create a new one: set the launch type, task size (CPU/memory), task role, task execution role, container details, etc., and create it
- Go into the ECS cluster created earlier and create a Service to expose the container to the public internet
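As an illustration of what the console builds from those steps, a minimal Fargate task definition looks roughly like this sketch. The family name, account ID, region, repository, and container port are placeholder assumptions:

```json
{
  "family": "my-app-task",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "my-app",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest",
      "portMappings": [{ "containerPort": 3000, "protocol": "tcp" }],
      "essential": true
    }
  ]
}
```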
Jenkins Plugins Needed
Install the following plugins:
- SonarQube Scanner
- Sonar Quality Gates
- Pipeline
- Docker pipeline
- Docker plugin
- AWS SDK
To maintain cost efficiency, ensure proper deletion of:
- Target groups
- Load balancers
- ECS clusters
- Any associated AWS resources.
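The cleanup can also be done from the AWS CLI. A sketch, assuming the cluster, service, repository names, and ARNs below are placeholders you replace with your own:

```bash
# Scale the service to zero tasks, then delete it (placeholder names)
aws ecs update-service --cluster my-cluster --service my-service --desired-count 0
aws ecs delete-service --cluster my-cluster --service my-service --force

# Delete the cluster once its services and tasks are gone
aws ecs delete-cluster --cluster my-cluster

# Delete the load balancer and target group (use your actual ARNs)
aws elbv2 delete-load-balancer --load-balancer-arn <alb-arn>
aws elbv2 delete-target-group --target-group-arn <tg-arn>

# Optionally delete the ECR repository and its images
aws ecr delete-repository --repository-name my-app --force
```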
This guide empowers developers with a streamlined CI/CD pipeline setup. By combining Jenkins, Docker, and AWS ECS, you can build, test, and deploy applications effectively while maintaining high-quality standards and cost efficiency.