Introduction to Jenkins and CI/CD: From Building to Deploying Your Software

Ulises Magana
6 min read · May 2, 2023


Photo by Quinten de Graaf on Unsplash

Organizations are increasingly adopting DevOps practices to automate product releases in a controlled and secure manner. By reducing the overhead of making changes to production environments and improving operational efficiency, DevOps helps align development and operations teams. CI/CD pipelines are a key component of the DevOps approach, offering benefits such as faster time-to-market, improved collaboration, and more reliable releases. In this article, we’ll explore the concepts behind CI/CD and introduce one of the most popular tools for implementing pipelines.

Continuous Integration: Continuous Integration (CI) is a software development practice that involves frequently integrating code changes into a shared repository. This is done as soon as possible after the changes have been made, in order to detect any integration errors or conflicts early in the development process. The CI process typically includes automatic builds and tests, to ensure that the integrated code is working properly in the specified environment.

Continuous Delivery: Continuous Delivery (CD) is the process of automatically delivering changes to a specific environment after they have been tested and approved. This process ensures that code changes can be released quickly and reliably, with minimal risk to the production environment. Continuous Delivery typically involves a series of automated steps that start with building and testing the code changes and end with deploying them to a staging environment, where they can be reviewed and approved before being released to production.

Continuous Deployment: Continuous Deployment (CD) is an extension of Continuous Delivery that involves automatically deploying changes to the production environment, once they have been tested and approved in the staging environment. This process is typically used in environments where rapid and frequent releases are required, such as in software-as-a-service (SaaS) applications.

Jenkins: Jenkins is an open-source CI/CD tool that allows developers to set up automated builds using pipelines. These pipelines can include a series of steps, such as building and testing the software, deploying it to a staging environment, and then promoting it to production. Jenkins provides a flexible and extensible platform that can integrate with a variety of other tools and technologies, making it a popular choice for implementing CI/CD pipelines in software development projects.
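
To make that flow concrete, here is a minimal declarative Jenkinsfile sketch (the syntax is explained later in this article; the shell scripts are placeholders). The manual input gate before production is what distinguishes Continuous Delivery from Continuous Deployment, which would run that step automatically:

pipeline {
    agent any
    stages {
        stage('Build & Test') {
            steps {
                sh './build.sh && ./run-tests.sh'   // placeholder build and test commands
            }
        }
        stage('Deploy to staging') {
            steps {
                sh './deploy.sh staging'            // placeholder deploy script
            }
        }
        stage('Promote to production') {
            steps {
                // Continuous Delivery: a person approves the promotion.
                // Continuous Deployment would skip this gate and deploy automatically.
                input message: 'Promote this build to production?'
                sh './deploy.sh production'
            }
        }
    }
}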

Concepts and features in Jenkins

Node: A server in Jenkins that acts either as the controller (historically called the master node), which orchestrates builds, or as an agent (historically called a slave node) that runs them. Pipelines can be distributed across different nodes to optimize execution.

Build: The process of compiling, testing, and packaging software into a distributable format. In Jenkins, a build can consist of a single step or a series of steps.

Job/Project: A set of instructions for a build in Jenkins, including what code to build, what environment to build it in, and what steps to take to build it. Jenkins offers several job types, such as Freestyle project, Pipeline, Multibranch Pipeline, Multi-configuration project, and Folder.

Workspace: A directory on a node where a build can read and write files during its execution. In Jenkins, each job gets its own workspace on the node where it runs, holding the checked-out source code and the files generated by the build.
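
For example, a step can inspect its workspace through the WORKSPACE environment variable, and the dir step runs commands in a subdirectory of it (a minimal sketch):

pipeline {
    agent any
    stages {
        stage('Show workspace') {
            steps {
                // WORKSPACE points to the job's working directory on the node
                echo "Building in: ${env.WORKSPACE}"
                // dir() runs the enclosed steps in a subdirectory of the workspace
                dir('reports') {
                    sh 'pwd'
                }
            }
        }
    }
}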

Pipeline: A set of stages and steps that define a build pipeline in Jenkins. Pipelines provide a visual representation of the entire build process and can be configured to automatically trigger builds based on specific events.

Agent: A worker node in Jenkins that is responsible for running a build or a set of builds. An agent is usually a dedicated worker node, but the controller itself can run builds if no other nodes are configured.

SCM (Source Code Management): The system used to manage source code and track the changes made to it over time. Git and SVN are common examples of SCM tools.
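
For example, a pipeline typically pulls its code from the SCM at the start of a build; a minimal sketch where the repository URL and branch are placeholders:

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // Clone a Git repository (URL and branch are placeholders)
                git url: 'https://github.com/example/my-app.git', branch: 'main'
            }
        }
    }
}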

Artifact: The output produced by a build, such as a file, a package, or a container image.

Fingerprint: A unique identifier (checksum) generated for an artifact that lets you track where it was produced and used across your builds and deployments.
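
For example, the archiveArtifacts step stores build outputs and can fingerprint them at the same time; a minimal sketch assuming a Maven build that produces JARs under target/:

pipeline {
    agent any
    stages {
        stage('Package') {
            steps {
                // Build the artifact (placeholder command)
                sh 'mvn -B package'
                // Archive the generated JARs and record a fingerprint (checksum) for each one
                archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
            }
        }
    }
}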

Label: A tag assigned to specific agents or nodes in Jenkins, used to restrict which of them can execute a particular stage or job.
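
For example, in a declarative pipeline the agent block can require a label; the 'linux' label here is an assumption about how your agents are tagged:

pipeline {
    // Run the whole pipeline only on agents carrying the 'linux' label
    agent { label 'linux' }
    stages {
        stage('Build') {
            steps {
                sh 'make build'   // placeholder build command
            }
        }
    }
}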

Webhook: An HTTP callback, configured in the repository you are working on, that notifies Jenkins of events such as pushes so a build can start automatically.
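
For example, with the GitHub plugin installed and a webhook configured in the repository settings, a push to GitHub can start the build; a hedged sketch of that setup:

pipeline {
    agent any
    triggers {
        // Provided by the GitHub plugin: build when GitHub sends a push webhook to Jenkins
        githubPush()
    }
    stages {
        stage('Build') {
            steps {
                echo 'Triggered by a GitHub push webhook'
            }
        }
    }
}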

Chained projects: A set of projects linked together in sequential order, where the output of one job becomes the input of the next (see the sketch after this list). There are two roles:

  • Upstream: A job that comes before another job in the chain; its output is used as input for the downstream job.
  • Downstream: A job that comes after the upstream job in the chain and depends on the output of the previously executed job.
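
For example, a downstream pipeline can declare an upstream trigger so it runs whenever the upstream job finishes successfully; the job name 'build-app' is a placeholder:

pipeline {
    agent any
    triggers {
        // Run this (downstream) pipeline after the 'build-app' (upstream) job succeeds
        upstream(upstreamProjects: 'build-app', threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Integration tests') {
            steps {
                echo 'Running against the output of build-app'
            }
        }
    }
}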

Plugin: An extension that adds functionality to Jenkins, such as static code analysis. Some of the most common plugins are Docker, Maven, and SonarQube, in addition to the ones recommended when installing Jenkins for the first time.
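
For example, the Docker Pipeline plugin lets a pipeline run inside a container image; a minimal sketch assuming that plugin is installed and the agent can run Docker:

pipeline {
    // Requires the Docker Pipeline plugin and an agent able to run Docker
    agent {
        docker { image 'maven:3.9-eclipse-temurin-17' }
    }
    stages {
        stage('Build') {
            steps {
                // Runs inside the Maven container provided through the plugin
                sh 'mvn -B -DskipTests package'
            }
        }
    }
}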

Trigger: A condition or event that starts a build in Jenkins, such as a change to the source code, a schedule, an upstream build, or a manual run.
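
For example, triggers can be declared directly in a declarative pipeline, such as a nightly schedule or periodic SCM polling:

pipeline {
    agent any
    triggers {
        // Build every night around 2 AM (H spreads the exact minute to balance load)
        cron('H 2 * * *')
        // Also poll the SCM every 15 minutes and build only if there are new commits
        pollSCM('H/15 * * * *')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Triggered by schedule, polling, or a manual run'
            }
        }
    }
}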

Environment: The context in which a pipeline runs: both the place where it executes and the set of variables, configurations, and dependencies required to run it.
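
For example, a declarative pipeline can define environment variables for every stage and pull secrets from the Jenkins credentials store; the credential ID below is a placeholder:

pipeline {
    agent any
    environment {
        APP_ENV   = 'staging'
        // Loads a secret from the Jenkins credentials store (the ID is a placeholder)
        API_TOKEN = credentials('my-api-token')
    }
    stages {
        stage('Deploy') {
            steps {
                // API_TOKEN is available here and masked in the console output
                sh 'echo "Deploying to $APP_ENV"'
            }
        }
    }
}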

Declarative and Scripted Pipelines

Both use Groovy syntax and are written in a Jenkinsfile, a file that you create and commit to your project’s source code repository to define your pipeline.

Declarative Pipeline: This type of pipeline is easier to write, with a gentler learning curve than the Scripted Pipeline. It is designed to be simpler and more structured, using a declarative syntax to define the pipeline’s stages, steps, and other features.

pipeline {
    agent any

    stages {
        stage('Stage 1') {
            steps {
                // define steps here
            }
        }
        stage('Stage 2') {
            steps {
                // define steps here
            }
        }
    }

    post {
        always {
            // define post-build actions here
        }
    }
}

Scripted Pipeline: This type of pipeline is more flexible and allows for more advanced scripting capabilities than the Declarative Pipeline. It is designed for users who are more comfortable with programming and want more control over the pipeline’s logic and flow. It is written in Groovy and can include conditionals, loops, and other programming constructs. Note in the following example that this type of pipeline uses ‘node’ blocks to define the agents, uses a conditional to determine whether the branch being built is the main branch, and uses an input step to ask the user to confirm the deployment to production.

node {
    stage('Checkout') {
        checkout scm
    }
    stage('Build') {
        if (env.BRANCH_NAME == 'main') {
            echo 'This is the main branch'
        } else {
            echo "This is ${env.BRANCH_NAME} branch"
        }
    }
    stage('Test') {
        echo "Test stage"
    }
    stage('Deploy') {
        input "Deploy to production?"
        echo "Deploying to production"
    }
}

Key Takeaway

It is essential to grasp the key concepts of CI/CD and the main features of Jenkins to keep any project you work on up-to-date, reliable, and secure. It is equally important to know how to write a Jenkinsfile, since it defines the pipeline and its configuration, integrates it with source code management, and enables automated testing, making the process more efficient and scalable.
Beyond that, one of Jenkins’s strengths is that it improves collaboration, communication, and transparency between development and operations teams. Overall, the tool automates software builds, testing, and deployment, improving software quality and saving a great deal of time by catching failures early in the development process.


Ulises Magana

Cloud & Infrastructure Engineer with diverse experience in software development, database administration, SRE & DevOps.