AWS CI/CD Project:

Ghazanfar Ali
Jul 6, 2023


Introduction:

In this project we will build a CI/CD pipeline using only AWS services: CodeCommit to store the source code instead of GitHub, CodeBuild to build the artifact the way we would with Jenkins, an S3 bucket to store the artifacts, and a deploy stage to push the artifact to targets such as Elastic Beanstalk, where the Tomcat platform will be running. Beanstalk will also connect to an RDS database because our application needs one. Finally, we will tie everything together with CodePipeline.

AWS Beanstalk:

First we will look at the AWS Elastic Beanstalk service. With Beanstalk you can easily run your applications without provisioning EC2 instances or load balancers yourself. It provides an environment where you can deploy and run your applications without worrying about server configuration details.

First we will create an application in Beanstalk:

In AWS Elastic Beanstalk, an “application” represents your entire web application or service. It includes all the necessary files, configurations, and code that make up your application. An application in Elastic Beanstalk is a container for one or more “environments.”
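If you prefer the CLI over the console, here is a minimal sketch of creating the application; the application name and region are assumptions, and the rest of this walkthrough uses the console:

# Create the Beanstalk application (name and region are placeholders)
aws elasticbeanstalk create-application \
    --application-name vprofile-app \
    --region ap-south-1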

An “environment” in Elastic Beanstalk represents a specific instance of your application running in a particular configuration. It consists of AWS resources such as EC2 instances, load balancers, databases, and more. Each environment can have its own unique settings and configurations.

Now we will create an environment inside the application:

Always name your environment according to its purpose, for example development, production, etc.:

Create an IAM role with the below-mentioned policies so that we can use that role in Beanstalk:
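As an illustration, here is a hedged CLI sketch of creating such a role with the AWS-managed Elastic Beanstalk instance policies; the role name, trust policy file, and exact policy list are assumptions, so adjust them to whatever you actually select in the console:

# trust.json should allow EC2 to assume the role:
# { "Version": "2012-10-17",
#   "Statement": [{ "Effect": "Allow",
#                   "Principal": { "Service": "ec2.amazonaws.com" },
#                   "Action": "sts:AssumeRole" }] }
aws iam create-role --role-name vprofile-bean-role \
    --assume-role-policy-document file://trust.json

# Attach the managed Beanstalk instance policies (adjust to your needs)
aws iam attach-role-policy --role-name vprofile-bean-role \
    --policy-arn arn:aws:iam::aws:policy/AWSElasticBeanstalkWebTier
aws iam attach-role-policy --role-name vprofile-bean-role \
    --policy-arn arn:aws:iam::aws:policy/AWSElasticBeanstalkWorkerTier
aws iam attach-role-policy --role-name vprofile-bean-role \
    --policy-arn arn:aws:iam::aws:policy/AWSElasticBeanstalkMulticontainerDocker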

Use this role in Beanstalk. Also create an EC2 key pair and select it in Beanstalk so that you can SSH into the Beanstalk instances later:

Choose the default VPC and enable the public IP option so that you can SSH in:

Do not enable a database here, as we will use RDS separately.

In the auto scaling section, select the load balanced environment type:

Updating all instances at once can cause downtime for your application, so set the rolling updates option to Rolling and choose a percentage of 50, so that only half of the instances are updated at a time:

We have created the Beanstalk environment. It automatically creates the EC2 instances (two, as we defined), a security group, a load balancer, a target group, and an auto scaling group that replaces any instance that goes down:

Now we will move on to RDS (Relational Database Service):

After deployment our application should connect to RDS database and instance health should be healthy.

Steps:

First we will create the RDS instance:

Create new security group for RDS:

Additional information:

Click on submit:

View your database credentials and save them somewhere:

Once the database is ready, also copy its endpoint from the “View connection details” option.

Now we will update the security group:

Now copy the instances’ security group ID; we will add it to the RDS security group so that the instances can access RDS through a security group rule:

Note: the security group ID is the same for both instances, so copy it from either one:

Edit the inbound rules of the RDS security group:
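A hedged CLI equivalent of this inbound rule, with both security group IDs as placeholders; 3306 is the MySQL port:

# Allow MySQL traffic from the Beanstalk instances' security group into the RDS security group
aws ec2 authorize-security-group-ingress \
    --group-id <rds-security-group-id> \
    --protocol tcp --port 3306 \
    --source-group <instance-security-group-id>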

Now we will SSH into a Beanstalk instance and connect to RDS from there:

To connect to the RDS instance we first need the MySQL client:

Now connect to RDS using its endpoint:
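A sketch of these steps, assuming an Amazon Linux instance; the key file, user name, public IP, and endpoint are placeholders, and the client package name varies by AMI:

# SSH into one of the Beanstalk instances using the key pair created earlier
ssh -i vprofile-bean-key.pem ec2-user@<instance-public-ip>

# Install a MySQL client (package name varies by AMI, e.g. mariadb105 on Amazon Linux 2023)
sudo yum install -y mysql

# Connect to RDS using the endpoint copied from "View connection details"
mysql -h <rds-endpoint> -u admin -p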

Now we will deploy our SQL file; for that we need Git:

We will clone the source code:

Check out the vpro-rem branch.

Here we have the SQL file that we will push to RDS:

Now push the file:

This will initialize the database.
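Put together, a hedged sketch of this initialization, run from the Beanstalk instance; the repository URL, database name, and dump file path are placeholders for the ones in your source tree:

# Install git, clone the source code, and switch to the vpro-rem branch
sudo yum install -y git
git clone https://github.com/<your-account>/vprofile-project.git
cd vprofile-project
git checkout vpro-rem

# Push the SQL dump into the RDS database
# (database name and dump path are assumptions; check the repo for the actual file)
mysql -h <rds-endpoint> -u admin -p <database-name> < path/to/your-dump.sql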

Now we will build the artifact and deploy:

Before that we need to make a change to the target group health check:

Elastic Beanstalk → vprofile-prod → Configuration → Instance traffic and scaling → Processes → Actions → Edit:

Also enable session stickiness there:

Once the settings are saved, click Apply:

Now our health status will change from Healthy to Severe, because we are monitoring health on the /login path, which is not available yet since we have not deployed our artifact:

Our Beanstalk environment and RDS instance are ready and tested, so it is time to set up the CI/CD pipeline. First we will look at the CodeCommit service, which is where we will store our source code instead of using GitHub.

Let's go to the CodeCommit service:

First create a repo there:

We will use the SSH method to access our repo:

We will follow the steps below:

First we need an IAM user, so create one:

Create a policy and search for CodeCommit:

Allow all actions on the CodeCommit repository that we created:

Add the repository ARN in the resource section:

Our vprofile-code-admin policy is created; now attach this policy to the user and click on Create user:

Now go to the user's security credentials section and upload an SSH public key for CodeCommit:

Generate SSH keys on your local machine:

Store the public key on the IAM user:

Now create an SSH config file:
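A hedged sketch of these steps; the key file name is a placeholder, and the SSH key ID comes from the IAM console after you upload the public key:

# 1. Generate an SSH key pair on your local machine
ssh-keygen -t rsa -b 4096 -f ~/.ssh/vpro-codecommit_rsa

# 2. Upload ~/.ssh/vpro-codecommit_rsa.pub under the IAM user's
#    "SSH public keys for AWS CodeCommit" section and note the SSH key ID

# 3. Tell SSH to use that key ID for CodeCommit endpoints
cat >> ~/.ssh/config <<'EOF'
Host git-codecommit.*.amazonaws.com
    User <SSH-KEY-ID-FROM-IAM>
    IdentityFile ~/.ssh/vpro-codecommit_rsa
EOF
chmod 600 ~/.ssh/config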

Now let's run a test:

Now you can also clone your CodeCommit repo to your local laptop:
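For example, assuming the ap-south-1 region used in this project and a placeholder repository name:

# Test the SSH connection to the CodeCommit endpoint in your region
ssh git-codecommit.ap-south-1.amazonaws.com

# Clone the CodeCommit repository to your laptop
git clone ssh://git-codecommit.ap-south-1.amazonaws.com/v1/repos/<your-repo-name>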

We have followed all four steps and successfully cloned the CodeCommit repo to our local machine.

Now we will migrate our GitHub repo to the CodeCommit repo. The issue is that this repo still has the GitHub address as its remote, so we have to change it to CodeCommit:

First check out the master branch:

⇒ git checkout master

Then list all the branches, filtering out the HEAD and master entries:

Now save that output to a file in /tmp:

Now check out all of those branches using a for loop:

Now remove the existing remote, which points to GitHub:

And add the CodeCommit repo as the new remote:

Now push all the branches and tags; the full command sequence is sketched below:
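A hedged sketch of the whole migration, run from the local clone of the GitHub repo; the CodeCommit URL is a placeholder:

# Start from master and capture every remote branch name except HEAD and master
git checkout master
git branch -a | grep -v HEAD | cut -d '/' -f3 | grep -v master > /tmp/branches

# Check out each branch locally so it can be pushed to the new remote
for i in $(cat /tmp/branches); do git checkout "$i"; done

# Replace the GitHub remote with the CodeCommit repo
git remote rm origin
git remote add origin ssh://git-codecommit.ap-south-1.amazonaws.com/v1/repos/<your-repo-name>

# Push all branches and tags to CodeCommit
git push origin --all
git push origin --tags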

Now it's time to build the artifact from this source code using the CodeBuild service:

This is the same as what we do in Jenkins: fetch the code, build, deploy, notify, and so on. Unlike an EC2 instance, it is not charged all the time; it follows a pay-only-when-you-build model.

Note: always try to use a unique project name (for example, add some numbers to it); otherwise this service can run into issues creating its service role.

Now define where it will fetch the code from, and also specify the branch name:

The image option is CodeBuild's own Docker image that the build runs in.

It will create a service role that grants access to a few services, such as the S3 bucket and the CodeCommit repository:

The build specification defines what to execute and which commands to run. All of that can be specified through a buildspec file, or you can use the insert option if you only have simple commands like "mvn install". However, our source code has an application.properties file that needs to be updated with the RDS (backend) connection information. So before we run "mvn install" we should run some commands that search for the database values in application.properties and replace them with the RDS values. For that we will write a buildspec file:

We will use the YAML below in our buildspec file:

version: 0.2
#env:
  #variables:
     # key: "value"
     # key: "value"
  #parameter-store:
     # key: "value"
     # key: "value"
phases:
  install:
    runtime-versions:
      java: corretto8
  pre_build:
    commands:
      - apt-get update
      - apt-get install -y jq
      - wget https://downloads.apache.org/maven/maven-3/3.8.8/binaries/apache-maven-3.8.8-bin.tar.gz
      - tar xzf apache-maven-3.8.8-bin.tar.gz
      - ln -s apache-maven-3.8.8 maven
      - sed -i 's/jdbc.password=admin123/jdbc.password=vEWiV4r7GFnjQLpZaYEl/' src/main/resources/application.properties
      - sed -i 's/jdbc.username=admin/jdbc.username=admin/' src/main/resources/application.properties
      - sed -i 's/db01:3306/vprofile-rds-prod.csz0m8h3t9vj.ap-south-1.rds.amazonaws.com:3306/' src/main/resources/application.properties
  build:
    commands:
      - mvn install
  post_build:
    commands:
      - mvn package
artifacts:
  files:
    - '**/*'
  base-directory: 'target/vprofile-v2'

Note: replace the RDS password and endpoint with your own.

Now we have to upload the artifacts to Amazon S3:
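If you don't already have a bucket for the artifacts, a hedged one-liner to create one; the bucket name is a placeholder and must be globally unique:

# Create an S3 bucket to hold the build artifacts (name and region are placeholders)
aws s3 mb s3://vprofile-artifact-store-123 --region ap-south-1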

Now make sure to stream the logs to CloudWatch:

Project created successfully:

Now we have to run the build job and then deploy to Beanstalk:

Before we integrate everything together, let's test our build job first:

Build logs streamed from CloudWatch:

Now it's time to connect everything together; we can do that through CodePipeline:

Define the source code location so that whenever a change or commit occurs in the source code it triggers the pipeline; change detection is done through CloudWatch Events:

Now define where we have to deploy our code:

As soon as the pipeline is created, it will be triggered automatically:

If the deployment completes successfully, go to Beanstalk to test:

Finally, our application is successfully deployed on Beanstalk using AWS CI/CD services:

Now log in to this application with the user admin_vp (the password is the same) to check that it is properly communicating with the RDS database:

Now make any commit in the repo; it will be detected by the CloudWatch event, which in turn triggers CodePipeline.

That's all for this project.
