Implementing DevSecOps in a CI/CD Pipeline:

Ghazanfar Ali
10 min read · Oct 17, 2023


Introduction:

DevSecOps is a set of practices and principles that integrate security into the DevOps (Development and Operations) process. It aims to ensure that security is an integral part of the software development lifecycle, rather than being treated as a separate phase. This approach helps identify and fix security issues early in the development process, reducing the likelihood of security vulnerabilities in the final product.

Here’s a simple explanation of some key components in DevSecOps:

  • 1- RBAC (Role-Based Access Control): RBAC is like assigning specific roles to individuals in a play. Just as actors have roles with different responsibilities, in RBAC, users or systems are given specific roles with defined access rights. This ensures that each person or system can only do what they are supposed to do, limiting unauthorized actions.
  • 2- SAST (Static Application Security Testing): Think of SAST as a spell checker for code. It scans the source code for vulnerabilities without executing the program. Similar to how a spell checker highlights potential errors in a document before it’s even written, SAST identifies security issues in code before it’s run.
  • 3- DAST (Dynamic Application Security Testing): DAST is like a security test while driving a car. It checks for vulnerabilities while the application is running, similar to how you inspect a car’s performance during a test drive. DAST evaluates the application from the outside, looking for potential weaknesses.
  • 4- SCA (Software Composition Analysis): Imagine making a sandwich with various ingredients. SCA is like checking the ingredients for expiration dates and recalls. It scans the software components used in an application to ensure they are up-to-date and free from known vulnerabilities, just as you’d check the ingredients to make sure they are safe to consume.
  • 5- Linting: Linting in the context of Docker involves the use of tools to analyze and validate Dockerfiles and container configurations for best practices, security, and potential issues. It ensures that Docker images are built efficiently, securely, and according to recommended guidelines. The main goal is to prevent problems and vulnerabilities before they become part of a container image.

One of the commonly used Docker linting tools is Hadolint. Hadolint is a linter specifically designed for Dockerfiles. It provides a set of rules to check Dockerfiles for adherence to best practices, and it can be easily integrated into your development process and CI/CD pipeline. You can customize the rules to suit your organization’s specific requirements and guidelines. The tool is distributed as a Docker container.
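For example, Hadolint can be run against a Dockerfile without installing anything locally, straight from its official container image (assuming Docker is available on the machine):

```shell
# Lint the Dockerfile in the current directory using the official Hadolint image
docker run --rm -i hadolint/hadolint < Dockerfile
```

Any rule violations are printed with their rule IDs (e.g. DL3008), which you can look up or selectively ignore.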

These components are crucial in DevSecOps to ensure that the development process is secure, controlled, and efficient. They help identify and address security concerns early, preventing issues that could impact the final product’s safety and performance.

DevSecOps is the practice of building security practices into the DevOps model. But before diving into DevSecOps, we should first understand what DevOps and vulnerabilities are.

Vulnerability:

A vulnerability is a flaw or misconfiguration in software code that an attacker can use to gain unauthorized access to a system or network.

Exploit:

An exploit is the method an attacker uses to take advantage of a vulnerability; it could be a piece of custom software or a sequence of commands.

Threat:

A threat is the actual event in which one or more exploits are used against a vulnerability to mount an attack.

Let’s Do Some Hands-on Practice with DevSecOps!

Introduction:

In this project we will see how we can implement DevSecOps in our Jenkins CI/CD pipeline using the SCA, SAST, and DAST approaches explained in this document. Mainly it is about how the SDLC can be made secure by adopting a fail-early, fail-fast methodology, so the main focus of this project is to find vulnerabilities in the SDLC as early as possible.

First we will create two Ubuntu-based t2.medium instances, one for Jenkins and one for Tomcat. Tomcat will act as the production server where we will host our application.

Now we will SSH into our Jenkins instance and install Java, Jenkins, and Docker on it. Also allow port 8080 in the security group of the Jenkins server, because Jenkins runs on port 8080:
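On Ubuntu the installation typically looks like this (a sketch: the Jenkins repository setup follows the official Debian instructions, and the OpenJDK version shown is an assumption — use whichever Java your Jenkins version requires):

```shell
# Install Java and Docker from the Ubuntu repositories
sudo apt update
sudo apt install -y openjdk-17-jdk docker.io

# Add the official Jenkins apt repository, then install Jenkins
curl -fsSL https://pkg.jenkins.io/debian-stable/jenkins.io-2023.key | \
  sudo tee /usr/share/keyrings/jenkins-keyring.asc > /dev/null
echo "deb [signed-by=/usr/share/keyrings/jenkins-keyring.asc] https://pkg.jenkins.io/debian-stable binary/" | \
  sudo tee /etc/apt/sources.list.d/jenkins.list > /dev/null
sudo apt update && sudo apt install -y jenkins
```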

Verify the installations:

Docker:

Jenkins:

Java:

Now we will install some plugins in Jenkins:

Now we will install Maven as the build tool on the same Jenkins instance:
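On Ubuntu this can be as simple as the following (a sketch; alternatively, you can let Jenkins install Maven automatically from its Tools page):

```shell
# Install Maven from the Ubuntu repositories and verify it
sudo apt install -y maven
mvn -version
```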

Add Maven in Jenkins:

Now we will set up the Tomcat server on the Tomcat instance:

⇒ wget https://dlcdn.apache.org/tomcat/tomcat-9/v9.0.82/bin/apache-tomcat-9.0.82.zip

⇒ unzip apache-tomcat-9.0.82.zip

⇒ cd apache-tomcat-9.0.82/bin

Run the startup.sh script; it will start the Tomcat server. But first assign executable permission to this file and to catalina.sh:
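Concretely, under the extracted Tomcat directory:

```shell
cd /home/ubuntu/apache-tomcat-9.0.82/bin
chmod +x startup.sh catalina.sh   # make both scripts executable
./startup.sh                      # starts Tomcat, which listens on port 8080 by default
```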

Check the tomcat on port 8080:

But to access the Manager app in Tomcat we need to assign some roles and create a user:

Edit the tomcat-users.xml file of Tomcat:
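Manager access is granted in conf/tomcat-users.xml. An illustrative entry looks like this (the username and password here are placeholders — choose your own):

```xml
<!-- conf/tomcat-users.xml: grant manager roles to a user -->
<role rolename="manager-gui"/>
<role rolename="manager-script"/>
<user username="admin" password="CHANGE_ME" roles="manager-gui,manager-script"/>
```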

Comment out the line shown below in this file:

If that does not work, then also comment out the same line in the copy of the file in /home/ubuntu/apache-tomcat-9.0.82/webapps/manager/META-INF
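In a default Tomcat 9 installation, the line in question is usually the RemoteAddrValve in the Manager app’s context.xml, which restricts access to localhost. Commented out, it looks like this:

```xml
<!-- webapps/manager/META-INF/context.xml: disable the IP restriction -->
<!--
<Valve className="org.apache.catalina.valves.RemoteAddrValve"
       allow="127\.\d+\.\d+\.\d+|::1|0:0:0:0:0:0:0:1" />
-->
```

Note that this opens the Manager app to remote access, so make sure the security group only allows trusted IPs.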

Restart Tomcat again with the startup.sh file:

Now access the Tomcat manager/html page with the manager username and password created in the tomcat-users.xml file:

Now we will start creating the Jenkins pipeline:

Provide the application’s repository URL in Jenkins so that Jenkins can fetch the code from GitHub:

Configure Jenkins to check for new commits every minute and automatically build the pipeline whenever a new commit lands on GitHub:
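In the job configuration this corresponds to the “Poll SCM” build trigger; a once-a-minute schedule in Jenkins cron syntax is:

```
* * * * *
```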

We will ask Jenkins to fetch the pipeline script from GitHub:

First we will verify the first two stages, up to the build:

Jenkins Blue Ocean is a good tool for analyzing the pipeline:

  • Initialize:
      • This is the first stage of our pipeline.
      • It sets up and configures the environment for the subsequent stages.
      • Inside the steps block, it runs a shell (sh) script that prints the values of two environment variables: PATH and M2_HOME. These environment variables are commonly used in Maven builds.
      • It’s useful for debugging and for ensuring that the expected environment variables are set correctly before proceeding with the build.
  • Build:
      • This is the second stage of our pipeline.
      • It’s responsible for building the project.
      • Inside the steps block, it runs the mvn clean package command. This is a common Maven command used to clean the project, compile the source code, run the tests, and package the application.
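Based on the description above, the two stages likely look something like this declarative Jenkinsfile sketch (the Maven tool name “M2_HOME” is an assumption — use whatever name you configured under Manage Jenkins → Tools):

```groovy
pipeline {
    agent any
    tools {
        // Assumed name of the Maven installation configured in Jenkins
        maven 'M2_HOME'
    }
    stages {
        stage('Initialize') {
            steps {
                // Print the environment variables used by the Maven build
                sh '''
                    echo "PATH = ${PATH}"
                    echo "M2_HOME = ${M2_HOME}"
                '''
            }
        }
        stage('Build') {
            steps {
                // Clean, compile, test, and package the application
                sh 'mvn clean package'
            }
        }
    }
}
```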

In the last stages we saw how to generate an artifact through the Maven build. Now we will see how to deploy the .war file created by the Maven build to the Tomcat server; we need to copy the artifact (the .war file) to our Tomcat server’s webapps directory:

First we need to store the Tomcat SSH credentials in our Jenkins:

Now add the new stage in our pipeline:
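A sketch of such a deploy stage, assuming the SSH Agent plugin, a credentials ID of “tomcat-ssh”, and a TOMCAT_HOST placeholder (all of these names are illustrative, not from the original pipeline):

```groovy
stage('Deploy to Tomcat') {
    steps {
        // "tomcat-ssh" is an assumed Jenkins credentials ID; TOMCAT_HOST
        // stands in for the Tomcat instance's address.
        sshagent(['tomcat-ssh']) {
            sh '''
                scp -o StrictHostKeyChecking=no target/*.war \
                    ubuntu@TOMCAT_HOST:/home/ubuntu/apache-tomcat-9.0.82/webapps/
            '''
        }
    }
}
```

Tomcat auto-deploys any .war file dropped into its webapps directory, so no restart is needed.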

It successfully ran this stage as well:

It also copied the .war file to the Tomcat server:

The webapp is hosted on the Tomcat server:

Now we will start integrating security checks into our Jenkins pipeline. First we will check whether any developer has unintentionally committed secrets to the GitHub repo; these secrets can be anything that should not be public. We will use the tool TruffleHog.

First we will pull its image from Docker Hub on our Jenkins instance:

Now, using this image, we check whether our GitHub repo has any password or secret in it that should not be there:

This command can be found on the official TruffleHog image page on Docker Hub:

docker run -it -v "$PWD:/pwd" ghcr.io/trufflesecurity/trufflehog:latest github --repo https://github.com/trufflesecurity/test_keys --debug

Fortunately it did not find any credentials in our repo. Now we will see how to integrate it into our pipeline; we will create a new stage, placed before the build stage:

This stage runs that same command and saves the output to a trufflehog file; it first checks whether the file is already present and deletes it if so, and at the end it displays the contents of the trufflehog file.
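The file-handling logic of that stage can be sketched in plain shell. Assumptions: the report file is named “trufflehog”, and the real scan command is the docker run shown earlier — replaced here by a placeholder echo so the sketch is self-contained:

```shell
#!/bin/sh
REPORT=trufflehog                 # report file name used in the stage
if [ -f "$REPORT" ]; then
    rm "$REPORT"                  # delete any stale report from a previous build
fi
# Placeholder for the actual trufflehog docker run command
echo "placeholder scan output" > "$REPORT"
cat "$REPORT"                     # display the results in the build log
```

In the real stage, the echo line is the docker run command redirected into the report file.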

Also add the jenkins user to the docker group so it can run Docker commands:

Then reboot the instance once. Then verify:

Now we will discuss another important aspect of DevSecOps: software composition analysis. For that we will integrate OWASP Dependency-Check for SCA; it will check all the third-party libraries used in the code for known vulnerabilities:

First we will save its script, available on Docker Hub, in our GitHub repo:

Then add a new stage to the Jenkins pipeline; this will also run before the build:
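Assuming the script saved to the repo is named owasp-dependency-check.sh (an illustrative name), the stage could look like this:

```groovy
stage('SCA - OWASP Dependency-Check') {
    steps {
        // "owasp-dependency-check.sh" is an assumed file name for the helper
        // script committed to the repo; it wraps the OWASP Dependency-Check
        // Docker image and scans the sources in the current directory.
        sh 'chmod +x owasp-dependency-check.sh'
        sh './owasp-dependency-check.sh'
    }
}
```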

Check the status:

In the above stage we run the script that we placed on GitHub; it scans for vulnerabilities in the code’s dependencies, which sit in the same directory as the script.

Now we will see how to integrate SAST into our CI/CD pipeline. SAST analyzes the source code and binaries to find vulnerabilities before the code goes to production. As we are using a Java-based application, we can use SonarQube for SAST. It sometimes produces false positives, so it is better to do manual code review as well.

First we will run SonarQube on the Jenkins machine as a Docker container:
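A common way to run it (the image tag is an assumption — pin whichever version you need — and SonarQube’s default port 9000 must also be opened in the instance’s security group):

```shell
# Run SonarQube in the background, exposing its web UI on port 9000
docker run -d --name sonarqube -p 9000:9000 sonarqube:lts-community
```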

Now install the sonarqube plugin in jenkins:

Now we need to configure SonarQube in Jenkins so that Jenkins knows which SonarQube installation to use. First, generate an authentication token in SonarQube that we will use in Jenkins:

Generate the token under My Account → Security → Generate Token:

Use this token in Jenkins:

Also add SonarQube under the Tools section of Jenkins so it installs the scanner when the pipeline runs:

Now we will add a new stage in which we tell Maven to use SonarQube and save the report in the target folder:
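With the SonarQube plugin configured, such a stage typically wraps the Maven goal in withSonarQubeEnv; “sonar-server” below is an assumed installation name, which must match the name configured under Manage Jenkins → System:

```groovy
stage('SAST - SonarQube') {
    steps {
        // "sonar-server" is the assumed SonarQube installation name in Jenkins;
        // withSonarQubeEnv injects the server URL and token into the build.
        withSonarQubeEnv('sonar-server') {
            sh 'mvn sonar:sonar'
        }
    }
}
```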

We have successfully integrated SAST with our pipeline:

Now we will integrate DAST into our pipeline. In DAST we try to attack the application from the outside like a real attacker, which is only possible once the application is deployed. We cannot integrate an authenticated DAST scan into the pipeline because it takes too much time, so we will integrate the ZAP baseline scan. We will create a new Ubuntu instance for ZAP, as it needs a lot of Docker disk space:

Install Docker on it:

Add its ssh credentials in jenkins:

Now we will add the DAST stage to our pipeline:
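A sketch of such a stage, running the baseline scan on the remote ZAP instance over SSH. Assumptions: “zap-ssh” is the credentials ID added above, and ZAP_HOST / the application URL are placeholders for your instances:

```groovy
stage('DAST - ZAP Baseline Scan') {
    steps {
        // "zap-ssh" is an assumed credentials ID; ZAP_HOST and TOMCAT_HOST
        // stand in for the ZAP and Tomcat instance addresses.
        sshagent(['zap-ssh']) {
            sh '''
                ssh -o StrictHostKeyChecking=no ubuntu@ZAP_HOST \
                  "docker run -t ghcr.io/zaproxy/zaproxy:stable zap-baseline.py \
                   -t http://TOMCAT_HOST:8080/webapp/ || true"
            '''
        }
    }
}
```

The `|| true` is there because zap-baseline.py exits non-zero when it finds warnings; drop it if you want findings to fail the build.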

It successfully ran the pipeline:

It also found vulnerabilities, but we need to verify whether they are real or false positives:

That’s all in this project!
