CI/CD
AUTOMATION

// Automate everything. Deploy fearlessly.

JENKINS IS THE OLD GUARD—AND THAT'S EXACTLY WHY IT'S POWERFUL.

In an era of shiny new SaaS CI/CD platforms that want to lock you into their ecosystem, Jenkins remains the free, open-source workhorse that runs on your infrastructure. You own it. You control it. You modify it. No vendor lock-in. No per-minute billing. No arbitrary limits.

BUILD YOUR OWN CI/CD PIPELINE.

Jenkins gives you complete control over how software moves from developer keyboard to production. Every step is visible, every configuration is text, every pipeline is version-controlled. When something breaks, you have the logs, the scripts, and the power to fix it—not some support ticket waiting game.

MASTER THE AUTOMATION ENGINE.

Learn to build pipelines that test your code, build your artifacts, deploy to your servers, and notify your team. Learn Groovy scripting to customize every behavior. Integrate with Git, Docker, Kubernetes, Ansible, and hundreds of other tools. Jenkins is infinitely extensible because it's built by developers, for developers.

BEGIN YOUR JOURNEY →

// The Path to CI/CD Mastery

12 lessons. Complete Jenkins control.

LESSON 01

Introduction to Jenkins

What is CI/CD? Installing Jenkins, understanding the architecture, and your first build.

Beginner
LESSON 02

Jenkins Fundamentals

Jobs, builds, plugins, and the Jenkins dashboard. Understanding the core concepts.

Beginner
LESSON 03

Freestyle Projects

Creating your first freestyle job. Build triggers, source management, and build steps.

Beginner
LESSON 04

Pipeline as Code

Jenkinsfile basics. Declarative vs scripted pipelines. Writing your first pipeline.

Intermediate
LESSON 05

Pipeline Syntax Deep Dive

Stages, steps, agents, post-actions. Advanced pipeline syntax and best practices.

Intermediate
LESSON 06

Build Agents & Distributed Builds

Master-agent architecture. Configuring agents. Distributed build execution.

Intermediate
LESSON 07

Plugin Ecosystem

Essential plugins. Pipeline plugins, Docker integration, Git integration, and more.

Intermediate
LESSON 08

Automated Testing

Running tests in pipeline. Test reporting. Quality gates and code coverage.

Intermediate
LESSON 09

Artifact Management

Storing build artifacts. Archiving, artifact repositories, and artifact promotion.

Advanced
LESSON 10

Security & Access Control

Matrix authorization. Role-based access control. Securing your Jenkins instance.

Advanced
LESSON 11

CI/CD with Docker

Building Docker images in pipeline. Docker-in-Docker. Containerized builds.

Advanced
LESSON 12

Jenkins & Kubernetes

Jenkins on Kubernetes. Dynamic agents. Scaling your CI/CD infrastructure.

Advanced

LESSON 01: Introduction to Jenkins

What is Jenkins?

Jenkins is an open-source automation server that enables developers to build, test, and deploy software reliably. It's the most widely used CI/CD tool in the world, with over 200,000 active installations. Originally created by Kohsuke Kawaguchi in 2004 as Hudson, it was renamed to Jenkins in 2011 after a dispute with Oracle.

Jenkins automates the software development process through continuous integration and continuous delivery (CI/CD). It monitors version control systems for changes, automatically triggers builds, runs tests, and can deploy applications to various environments.

⚡ POWER MOVE: Unlike cloud-based CI/CD platforms, Jenkins runs on your infrastructure. You decide where it runs, how it's configured, and who has access. No vendor lock-in.

Why Jenkins in 2024?

With so many SaaS CI/CD platforms available—GitHub Actions, GitLab CI, CircleCI, Travis CI—you might wonder why Jenkins still matters. Here's why:

  • Complete Control: Run Jenkins on your own servers, in your data center, or on a Raspberry Pi in your closet.
  • No Costs: Completely free. No per-minute billing, no build minutes limits, no enterprise pricing tiers.
  • Infinitely Extensible: Over 1,800 plugins available. If a tool has an API, there's probably a Jenkins plugin for it.
  • Mature Ecosystem: Decades of documentation, community support, and best practices.
  • Air-Gapped Deployments: Perfect for secure environments that can't connect to the public internet.

Installing Jenkins

Jenkins can run on any machine that supports Java. Here's how to install it on Linux:

# Add Jenkins repository
curl -fsSL https://pkg.jenkins.io/jenkins.io-2023.key | sudo gpg --dearmor -o /usr/share/keyrings/jenkins.gpg
echo "deb [signed-by=/usr/share/keyrings/jenkins.gpg] https://pkg.jenkins.io/debian binary/" | sudo tee /etc/apt/sources.list.d/jenkins.list

# Update and install
sudo apt update
sudo apt install jenkins -y

# Start Jenkins
sudo systemctl start jenkins
sudo systemctl enable jenkins
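
Jenkins needs a recent Java runtime (current LTS lines support Java 11 and 17). A quick pre-flight sketch for checking a version string before installing — the parsing handles both the legacy (1.8.x) and modern (11+) numbering schemes:

```shell
# Extract the major version from a Java version string.
# Legacy scheme: "1.8.0_392" -> 8; modern scheme: "17.0.8" -> 17.
java_major() {
  v="$1"
  case "$v" in
    1.*) echo "$v" | cut -d. -f2 ;;   # legacy 1.x scheme
    *)   echo "$v" | cut -d. -f1 ;;   # modern scheme
  esac
}

# In practice you'd feed it the installed runtime, e.g.:
#   ver=$(java -version 2>&1 | awk -F'"' '/version/ {print $2}')
ver="17.0.8"   # stubbed here for illustration
major=$(java_major "$ver")
if [ "$major" -ge 11 ]; then
  echo "Java $major OK for Jenkins"
else
  echo "Java $major too old - install Java 11 or newer" >&2
fi
```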

After installation, access Jenkins at http://localhost:8080. You'll need the initial admin password from:

sudo cat /var/lib/jenkins/secrets/initialAdminPassword

Jenkins Architecture

Understanding Jenkins architecture is crucial for effective usage:

  • Jenkins Controller (Master): The central node that manages jobs, builds, and user interface. Handles web requests, coordinates builds, and maintains configuration.
  • Jenkins Agents: Worker nodes that execute builds. Can be static (permanently connected) or dynamic (spin up when needed).
  • Executors: Slots on agents that run builds. Each agent can have multiple executors for parallel execution.
  • Plugins: Extensions that add functionality. Everything from Git integration to Docker support comes from plugins.

Your First Build

Let's create your first Jenkins job—a simple freestyle project:

  1. Click "New Item" on the Jenkins dashboard
  2. Enter a name (e.g., "hello-world")
  3. Select "Freestyle project"
  4. Under "Build Steps", add an "Execute shell" step
  5. Enter: echo "Hello from Jenkins!"
  6. Click "Save"
  7. Click "Build Now" to run your first build

Watch the console output. You've just executed your first automated build!

LESSON 02: Jenkins Fundamentals

Understanding Jobs and Builds

In Jenkins, a job (now called "item") is a configurable task that Jenkins executes. A build is a single execution of that job. Each build has a number, timestamp, and outcome (success, failure, or unstable).

Jobs can be configured with:

  • Source Code Management: Git, Subversion, Mercurial, etc.
  • Build Triggers: Scheduled, on commit, on upstream build, etc.
  • Build Steps: Shell scripts, batch commands, Ant, Maven, Gradle, etc.
  • Post-build Actions: Archive artifacts, trigger other jobs, send notifications.

Build Triggers

Triggers define when Jenkins starts a build:

1. Poll SCM

Jenkins periodically checks your repository for changes:

# Check every 5 minutes
H/5 * * * *

# Check every 15 minutes during work hours
H/15 9-17 * * 1-5
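
The H symbol tells Jenkins to pick a stable, pseudo-random value derived from the job name, so hundreds of jobs all configured with `H/5 * * * *` don't hammer your SCM at the same instant. Jenkins' real hash is internal, but the idea looks roughly like this sketch (not the actual algorithm):

```shell
# Derive a stable per-job minute offset from the job name, so
# identical schedules are spread across the hour instead of colliding.
job_offset() {
  # cksum gives a deterministic checksum of the job name
  echo $(( $(printf '%s' "$1" | cksum | cut -d' ' -f1) % 60 ))
}

echo "frontend-build fires at minute $(job_offset frontend-build)"
echo "backend-build  fires at minute $(job_offset backend-build)"
```

The same job name always maps to the same minute, which is why H-based schedules are both spread out and predictable.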

2. Webhook Trigger

Your version control system notifies Jenkins of changes (more efficient than polling):

# In GitHub, add webhook:
# http://YOUR_JENKINS_URL/github-webhook/

3. Scheduled Build

Run builds on a schedule (useful for reports, maintenance):

# Daily at midnight
H 0 * * *

# Every 6 hours
H H/6 * * *

Parameterized Builds

Make your builds flexible with parameters:

  1. In job configuration, check "This project is parameterized"
  2. Add parameters: String, Boolean, Choice, Password, etc.
  3. Reference parameters in build scripts: $PARAM_NAME
#!/bin/bash
# Build with parameter
echo "Building version: $VERSION"
echo "Environment: $ENVIRONMENT"

# Use in Docker builds
docker build -t myapp:$VERSION .
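
If the same job can also be triggered without parameters (by a webhook, say), guard against empty values with shell defaults — a small sketch:

```shell
#!/bin/bash
# Fall back to sane defaults when a parameter is unset or empty
VERSION="${VERSION:-0.0.1}"
ENVIRONMENT="${ENVIRONMENT:-staging}"

echo "Building version: $VERSION"
echo "Environment: $ENVIRONMENT"
```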

Workspace

Each job gets a dedicated workspace directory—typically /var/lib/jenkins/workspace/JOB_NAME. This is where:

  • Source code is checked out
  • Builds are executed
  • Artifacts are created

The workspace persists between builds unless you configure cleanup.
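
If you'd rather reclaim disk yourself than rely on a cleanup plugin, a cron-able sketch that lists workspaces untouched for a week (the path is the typical default, adjust for your install):

```shell
# List job workspaces not modified in 7+ days (cleanup candidates).
# Add -exec rm -rf {} + only once you trust the selection.
WS_ROOT="${WS_ROOT:-/var/lib/jenkins/workspace}"
if [ -d "$WS_ROOT" ]; then
  find "$WS_ROOT" -mindepth 1 -maxdepth 1 -type d -mtime +7
fi
```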

Build History & Console Output

Every build provides:

  • Build Number: Sequential identifier
  • Timestamp: When the build started
  • Duration: How long it took
  • Console Output: Complete stdout/stderr from the build
  • Changes: What files changed since the last build
  • Artifacts: Files produced by the build

LESSON 03: Freestyle Projects

Freestyle Project Overview

Freestyle projects are the most flexible job type in Jenkins. They allow you to configure various build triggers, source code management, build steps, and post-build actions through a web interface.

⚡ NOTE: While freestyle projects are easy to start with, Jenkinsfile-based pipelines (covered in later lessons) are better for complex workflows. They can be version-controlled and are easier to maintain.

Creating a Freestyle Project with Git

Let's build a real project that pulls from Git:

  1. Create a new freestyle project
  2. Under "Source Code Management", select Git
  3. Enter your repository URL:
    https://github.com/yourusername/your-repo.git
  4. Specify branches: */main or */master
  5. Add build steps (in order):

Step 1: Checkout Code

Jenkins automatically checks out code, but you can customize this.

Step 2: Run Tests

#!/bin/bash
npm install
npm test

Step 3: Build

#!/bin/bash
npm run build

Step 4: Post-Build Actions

  • Archive artifacts: build/**/*
  • Publish test results: **/test-results/*.xml
  • Trigger other projects (downstream)

Environment Variables

Jenkins provides built-in environment variables:

# Useful Jenkins environment variables
$WORKSPACE          # Path to job workspace
$BUILD_NUMBER       # Current build number
$BUILD_URL          # URL to this build
$JOB_NAME           # Name of the job
$GIT_COMMIT         # Current Git commit hash
$GIT_BRANCH         # Current Git branch
$GIT_PREVIOUS_COMMIT # Commit hash from the previous build

You can also define custom environment variables in job configuration.
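
A common use is baking build metadata into artifact names so any file on disk can be traced back to the build that produced it — a sketch, with the Jenkins-injected variables stubbed for illustration:

```shell
# In a real build step Jenkins injects these; stubbed here for illustration
JOB_NAME="${JOB_NAME:-hello-world}"
BUILD_NUMBER="${BUILD_NUMBER:-42}"
GIT_COMMIT="${GIT_COMMIT:-0123456789abcdef}"

# Short commit hash plus build number give every file a traceable name
SHORT_SHA=$(printf '%s' "$GIT_COMMIT" | cut -c1-7)
ARTIFACT="${JOB_NAME}-${BUILD_NUMBER}-${SHORT_SHA}.tar.gz"
echo "Would archive: $ARTIFACT"
```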

Build Thresholds

Configure when builds should fail:

  • Build Step Failure: Fail immediately if any step returns non-zero
  • Test Results: Mark unstable if tests fail, or mark failed if tests don't run
  • Archive Artifacts: Fail if no artifacts are archived

Practice Exercise

Create a freestyle project that:

  1. Polls a Git repository every 10 minutes
  2. Runs a shell script that prints the current date and Git commit
  3. Archives a simple text file as an artifact
  4. Sends an email notification on failure

LESSON 04: Pipeline as Code

Why Pipelines?

Pipeline as Code (PaC) means defining your entire CI/CD workflow in a file called Jenkinsfile that's version-controlled alongside your code. This brings:

  • Version Control: Track changes to your build process
  • Code Review: Pull requests for build changes
  • Branch-Specific Pipelines: Different builds for different branches
  • Single Source of Truth: The pipeline definition travels with the code

Jenkinsfile Basics

A Jenkinsfile can be written in two syntaxes:

Declarative Pipeline (Recommended)

pipeline {
    agent any
    
    environment {
        MY_VAR = 'value'
    }
    
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
                sh 'npm install'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing...'
                sh 'npm test'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying...'
            }
        }
    }
    
    post {
        always {
            echo 'Pipeline complete'
        }
        success {
            echo 'Build succeeded!'
        }
        failure {
            echo 'Build failed!'
        }
    }
}

Scripted Pipeline

node {
    stage('Build') {
        sh 'npm install'
    }
    stage('Test') {
        sh 'npm test'
    }
    stage('Deploy') {
        sh './deploy.sh'
    }
}

Creating a Pipeline Job

Two ways to use Jenkinsfile:

Option 1: Pipeline Script

Paste Jenkinsfile directly into Jenkins job configuration.

Option 2: Pipeline Script from SCM

Jenkinsfile lives in your repository:

  1. Create a new Pipeline job
  2. Under "Pipeline", select "Pipeline script from SCM"
  3. Configure your Git repository
  4. Specify Jenkinsfile path (e.g., Jenkinsfile)

Basic Pipeline Structure

pipeline {
    // Where to run (any agent, or specific label)
    agent { label 'my-agent' }
    
    // Environment variables
    environment {
        APP_NAME = 'myapp'
        REGISTRY = 'docker.io'
    }
    
    // Define stages
    stages {
        stage('Checkout') {
            steps {
                // Get source code
                checkout scm
            }
        }
        stage('Build') {
            steps {
                // Build your application
                sh 'make build'
            }
        }
        stage('Test') {
            steps {
                // Run tests
                sh 'make test'
            }
        }
    }
    
    // Actions after stages
    post {
        always { cleanWs() }
        success { echo 'Done!' }
    }
}

Practice Exercise

Create a Jenkinsfile that:

  1. Has three stages: Build, Test, Deploy
  2. Uses environment variables for app name
  3. Has post-build actions for success and failure
  4. Save it to a Git repository and configure a Pipeline job to use it

LESSON 05: Pipeline Syntax Deep Dive

Stages and Steps

Stages are logical groupings of work. Steps are the actual commands that do the work.

stage('Build') {
    steps {
        sh 'npm install'
        sh 'npm run build'
    }
}

You can run multiple steps in sequence. Each step must succeed for the next to run.

When Directive

Conditionally execute stages:

stages {
    stage('Deploy Prod') {
        when {
            branch 'main'
        }
        steps {
            sh './deploy-prod.sh'
        }
    }
    stage('Deploy Staging') {
        when {
            branch 'develop'
        }
        steps {
            sh './deploy-staging.sh'
        }
    }
}

Other when conditions:

when {
    environment name: 'DEPLOY_TO', value: 'production'
    expression { return params.DEPLOY }
}

Parallel Stages

Speed up builds by running stages in parallel:

stages {
    stage('Test') {
        parallel {
            stage('Unit Tests') {
                steps {
                    sh 'npm run unit-test'
                }
            }
            stage('Integration Tests') {
                steps {
                    sh 'npm run integration-test'
                }
            }
            stage('E2E Tests') {
                steps {
                    sh 'npm run e2e-test'
                }
            }
        }
    }
}

Matrix Configuration

Run the same stage with different configurations:

matrix {
    axes {
        axis {
            name 'NODE_VERSION'
            values '14', '16', '18', '20'
        }
        axis {
            name 'OS'
            values 'ubuntu', 'centos'
        }
    }
    stages {
        stage('Test') {
            steps {
                sh 'npm test'
            }
        }
    }
}

Post Actions

Actions that run after all stages:

post {
    always {
        echo 'Runs regardless of result'
    }
    success {
        echo 'Only runs on success'
    }
    failure {
        echo 'Only runs on failure'
    }
    unstable {
        echo 'Runs when build is unstable'
    }
    changed {
        echo 'Runs when build result differs from previous'
    }
    cleanup {
        echo 'Runs after always, even if failed'
        cleanWs()
    }
}

Advanced: Timeout and Retry

stage('Deploy') {
    steps {
        timeout(time: 10, unit: 'MINUTES') {
            retry(3) {
                sh './deploy.sh'
            }
        }
    }
}
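
retry(3) simply reruns the enclosed block until it succeeds, up to three attempts. The equivalent logic in plain shell, for intuition — the flaky "deploy" here is a stand-in that fails twice before succeeding:

```shell
# A deliberately flaky "deploy" that fails twice then succeeds,
# standing in for a real deploy script
tries_file=$(mktemp)
echo 0 > "$tries_file"
flaky_deploy() {
  n=$(( $(cat "$tries_file") + 1 ))
  echo "$n" > "$tries_file"
  [ "$n" -ge 3 ]
}

# The shell equivalent of retry(3): rerun until success, max 3 attempts
attempt=1
until flaky_deploy; do
  if [ "$attempt" -ge 3 ]; then
    echo "deploy failed after 3 attempts" >&2
    exit 1
  fi
  attempt=$((attempt + 1))
  echo "retrying (attempt $attempt)..."
done
echo "deploy succeeded on attempt $attempt"
```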

LESSON 06: Build Agents & Distributed Builds

×

Understanding Agent Architecture

Jenkins can distribute builds across multiple machines. The architecture consists of:

  • Controller (Master): Handles web UI, API requests, scheduling builds
  • Agents: Execute the actual build jobs
  • Executors: Slots on agents that run jobs in parallel

⚡ BEST PRACTICE: Don't run builds on the Jenkins controller. Use agents. This keeps the controller responsive and allows you to scale builds horizontally.

Agent Types

Permanent Agents

Static machines that are always available:

# In Jenkins UI:
# Manage Jenkins > Manage Nodes > New Node
# Configure:
#   - Number of executors: 2
#   - Remote root directory: /var/jenkins/agents/agent1
#   - Labels: linux docker
#   - Launch method: via SSH

JNLP Agents

Agents that connect via Java Web Start:

# On agent machine:
java -jar agent.jar -jnlpUrl http://jenkins:8080/computer/agent1/slave-agent.jnlp -secret SECRET_TOKEN

Cloud Agents

Dynamic agents that spin up on demand (AWS, Azure, Kubernetes):

# Install Amazon EC2 plugin
# Configure cloud in Manage Jenkins > Manage Clouds
# Agents automatically launch and terminate based on demand

Configuring Agents in Pipeline

pipeline {
    // Run on any available agent. (A pipeline accepts only one top-level
    // agent directive - the alternatives below are shown for comparison.)
    agent any
    
    // Or specify by label
    agent { label 'docker && linux' }
    
    // Or use multiple labels (AND logic)
    agent { label 'docker && ubuntu && high-memory' }
    
    // Or exclude labels
    agent {
        node {
            label '!windows'
        }
    }
    
    stages {
        stage('Build') {
            agent { label 'docker' }
            steps {
                sh 'docker build .'
            }
        }
    }
}

Docker Agents

Run builds in isolated Docker containers:

pipeline {
    agent {
        docker {
            image 'node:18-alpine'
            label 'docker-host'
            args '-v /tmp:/tmp'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'node --version'
            }
        }
    }
}

This pulls the Docker image, runs a container, and executes the pipeline inside it.

Kubernetes Agents

For dynamic scaling, use Kubernetes:

  1. Install Kubernetes plugin
  2. Configure Kubernetes cloud in Jenkins settings
  3. Define pod templates in pipeline:
pipeline {
    agent {
        kubernetes {
            label 'pod-template'
            defaultContainer 'jnlp'
            yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: builder
    image: node:18
    command:
    - cat
    tty: true
'''
        }
    }
    stages {
        stage('Build') {
            steps {
                container('builder') {
                    sh 'npm install'
                }
            }
        }
    }
}

LESSON 07: Plugin Ecosystem

Essential Jenkins Plugins

Jenkins' power comes from its plugin ecosystem. Here are must-have plugins:

Pipeline & Workflow

  • Pipeline: Core pipeline support (usually pre-installed)
  • Blue Ocean: Modern pipeline visualization
  • Pipeline: Stage View: Classic stage visualization

Source Control

  • Git: Git integration (usually pre-installed)
  • GitHub Integration: GitHub webhooks and PR support
  • GitLab: GitLab integration
  • Bitbucket: Bitbucket integration

Build Tools

  • Docker Pipeline: Docker support in pipelines
  • Maven Integration: Maven build support
  • Gradle: Gradle build support

Installing Plugins

# Via Jenkins UI:
# Manage Jenkins > Manage Plugins > Available
# Search and install

# Via REST API (the XML body names the plugin and version):
curl -X POST http://jenkins:8080/pluginManager/installNecessaryPlugins \
  -u admin:API_TOKEN \
  -H 'Content-Type: text/xml' \
  -d '<jenkins><install plugin="git@latest"/></jenkins>'

# Or download .hpi file and copy to:
# /var/lib/jenkins/plugins/

Using Pipeline Plugins

GitHub Integration

pipeline {
    agent any
    triggers {
        githubPush()
    }
    stages {
        stage('Build') {
            steps {
                echo 'GitHub push triggered this build'
            }
        }
    }
}

Docker Pipeline

pipeline {
    agent {
        docker {
            image 'maven:3.8-openjdk-11'
            args '-v ~/.m2:/root/.m2'
        }
    }
    stages {
        stage('Build with Maven') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}

Plugin Management Best Practices

  • Keep plugins updated but test in staging first
  • Minimize plugin count - only install what you need
  • Check compatibility with your Jenkins version
  • Review plugin permissions - some plugins require risky permissions
  • Backup before updates - backup $JENKINS_HOME

⚡ SECURITY: Only install plugins from trusted sources. Malicious plugins can execute arbitrary code on your Jenkins server.

LESSON 08: Automated Testing

Why Test in CI/CD?

Automated testing in CI/CD catches bugs early, prevents bad code from reaching production, and gives confidence in your deployments. A proper CI/CD pipeline should:

  1. Run unit tests on every commit
  2. Run integration tests before deployment
  3. Fail the build if tests fail
  4. Report test results clearly

Running Tests in Pipeline

pipeline {
    agent any
    stages {
        stage('Install Dependencies') {
            steps {
                sh 'npm install'
            }
        }
        stage('Unit Tests') {
            steps {
                sh 'npm test -- --coverage'
            }
        }
        stage('Integration Tests') {
            steps {
                sh 'npm run integration-test'
            }
        }
    }
    post {
        always {
            junit 'test-results/*.xml'
            cobertura coberturaReportFile: 'coverage/cobertura-coverage.xml'
        }
    }
}

Test Reporting

Jenkins can display beautiful test reports:

JUnit XML Format

# Most test frameworks can emit JUnit XML via a reporter flag, e.g.:
#   jest --reporters=jest-junit
#   pytest --junitxml=test-results/junit.xml
#   mvn test   (Surefire writes target/surefire-reports/*.xml)

# Jenkins configuration:
post {
    always {
        junit 'test-results/**/*.xml'
    }
}
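
If your tool can't emit JUnit XML natively, the format is simple enough to generate yourself. A minimal sketch of the shape the junit step consumes (the suite and test names are illustrative):

```shell
# Minimal JUnit-style XML report (the shape Jenkins' junit step parses)
mkdir -p test-results
cat > test-results/junit.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="smoke" tests="2" failures="1" time="0.12">
  <testcase classname="smoke" name="starts_up" time="0.05"/>
  <testcase classname="smoke" name="responds" time="0.07">
    <failure message="expected 200, got 500"/>
  </testcase>
</testsuite>
EOF
echo "wrote test-results/junit.xml"
```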

Other Report Formats

  • Cobertura: Java code coverage
  • JaCoCo: Java code coverage (more modern)
  • Coverage.py: Python coverage
  • Istanbul: JavaScript coverage

Quality Gates

Enforce minimum quality standards:

pipeline {
    agent any
    stages {
        stage('Test & Coverage') {
            steps {
                sh 'npm test && npm run coverage'
            }
            post {
                always {
                    jacoco()
                }
            }
        }
        stage('Quality Gate') {
            steps {
                script {
                    // readJSON comes from the Pipeline Utility Steps plugin;
                    // the JSON path assumes an Istanbul-style summary
                    def coverage = readJSON(file: 'coverage/coverage.json').total.lines.pct
                    def threshold = 80
                    if (coverage < threshold) {
                        error "Code coverage ${coverage}% is below threshold ${threshold}%"
                    }
                }
            }
        }
    }
}
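
A gate doesn't have to live in Groovy — the same check works as a plain sh step that exits non-zero when coverage is too low. A sketch assuming an Istanbul-style JSON summary (the file is generated inline here so the extraction is visible; adjust the grep for your coverage tool):

```shell
# Fail the step if line coverage is below the threshold.
threshold=80

# Stand-in for the summary your coverage tool would write
cat > coverage-summary.json <<'EOF'
{"total":{"lines":{"pct":85.3}}}
EOF

# Pull the first "pct" value out of the summary
pct=$(grep -o '"pct":[0-9.]*' coverage-summary.json | head -1 | cut -d: -f2)

# Integer comparison is enough for a gate; drop the decimals
if [ "${pct%.*}" -lt "$threshold" ]; then
  echo "Coverage ${pct}% is below ${threshold}%" >&2
  exit 1
fi
echo "Coverage ${pct}% meets the ${threshold}% gate"
```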

Test Parallelization

Speed up test execution with parallelization:

pipeline {
    agent any
    stages {
        stage('Parallel Tests') {
            parallel {
                stage('Unit Tests') {
                    steps {
                        sh 'npm run unit-tests'
                    }
                }
                stage('Integration Tests') {
                    steps {
                        sh 'npm run integration-tests'
                    }
                }
                stage('E2E Tests') {
                    steps {
                        sh 'npm run e2e-tests'
                    }
                }
            }
        }
    }
}

LESSON 09: Artifact Management

What are Artifacts?

Artifacts are files produced by a build—binaries, Docker images, installers, reports. Jenkins can archive these files so they're available after the build completes.

Archiving Artifacts

In Freestyle Jobs

Configure in post-build actions:

  • Archive the artifacts
  • Files to archive: build/**/*.jar
  • Archive artifacts even if the build fails? (optional)

In Pipeline

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'npm run build'
            }
        }
    }
    post {
        success {
            archiveArtifacts artifacts: 'build/**/*', fingerprint: true
        }
    }
}

The fingerprint option helps track which build produced which artifact.
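
Under the hood, a fingerprint is just an MD5 checksum of the artifact's bytes, recorded against every build that produces or uses the file — conceptually:

```shell
# A fingerprint is an MD5 of the artifact's contents: identical
# bytes -> identical fingerprint, wherever the file travels.
printf 'release build v1' > app.jar
fp=$(md5sum app.jar | cut -d' ' -f1)
echo "fingerprint: $fp"

# A byte-identical copy produces the same fingerprint
printf 'release build v1' > copy.jar
fp2=$(md5sum copy.jar | cut -d' ' -f1)
[ "$fp" = "$fp2" ] && echo "same artifact"
```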

Artifact Retention

// Retention isn't an archiveArtifacts argument - use the
// buildDiscarder option to age out old builds and their artifacts
options {
    buildDiscarder(logRotator(
        artifactDaysToKeepStr: '5',   // keep artifacts 5 days
        artifactNumToKeepStr: '10'))  // and only for the last 10 builds
}

archiveArtifacts artifacts: 'build/**', allowEmptyArchive: true

Or configure globally in Manage Jenkins > Configure System > Artifact Manager

Promoting Builds

Promote builds to different "environments" (e.g., from staging to production):

// Using promotion plugin
pipeline {
    stages {
        stage('Build') {
            steps {
                sh 'make build'
            }
        }
    }
}

// Promotion process:
// 1. Configure in job > Promotion
// 2. Name: "Production Ready"
// 3. Criteria: Manual approval + tests passed
// 4. Actions: Copy artifacts, trigger deployment

External Artifact Storage

For production, use external artifact repositories:

Artifactory

// Using Artifactory plugin
pipeline {
    stages {
        stage('Build & Publish') {
            steps {
                rtBuildInfo()
                rtUpload (
                    serverId: 'artifactory-server',
                    spec: '''{
                        "files": [{
                            "pattern": "build/**/*.jar",
                            "target": "libs-release-local/"
                        }]
                    }'''
                )
            }
        }
    }
}

AWS S3

// Using S3 plugin
pipeline {
    post {
        success {
            s3Upload(
                bucket: 'my-artifacts',
                file: 'build/app.jar',
                path: "builds/${BUILD_NUMBER}/app.jar"
            )
        }
    }
}

LESSON 10: Security & Access Control

Jenkins Security Overview

Jenkins is powerful, which means it needs strong security. A compromised Jenkins can execute arbitrary code on your infrastructure.

⚡ CRITICAL: Never expose Jenkins to the public internet unprotected. Put it behind a VPN or firewall, and always require authentication.

Global Security Configuration

Configure in Manage Jenkins > Security:

Security Realm

  • Jenkins' own user database: Simple, for small teams
  • LDAP: Integrate with corporate directory
  • Unix user/group database: Use system users
  • SAML/OAuth: SSO integration

Authorization

  • Logged-in users can do anything: Simple, risky
  • Matrix-based security: Fine-grained permissions
  • Role-based strategy: Define roles and assign to users

Role-Based Access Control

Install Role Strategy Plugin for granular control:

  1. Install "Role-based Authorization Strategy" plugin
  2. Enable in Security > Authorization > Role-Based Strategy
  3. Manage and Assign Roles:
# Manage Roles:
# - Global roles: admin, developer, viewer
# - Project roles: regex patterns for job names
# - Agent roles: for agent management

# Example roles:
# Global:
#   - admin: Overall/Administer
#   - developer: Overall/Read + Job/Build + Job/Create
#   - viewer: Overall/Read
#
# Project (regex):
#   - dev-.*: Job/Build on jobs matching dev-*
#   - prod-.*: Job/Build + Job/Deploy on prod-*

Credential Management

Store secrets securely with Jenkins Credentials:

  1. Go to Manage Jenkins > Manage Credentials
  2. Add credentials: Username/password, SSH key, Secret file
  3. Reference in pipeline:
pipeline {
    environment {
        DOCKER_CREDS = credentials('docker-hub-credentials')
    }
    stages {
        stage('Deploy') {
            steps {
                sh 'docker login -u $DOCKER_CREDS_USR -p $DOCKER_CREDS_PSW'
            }
        }
    }
}

For plain secrets:

withCredentials([string(credentialsId: 'my-secret', variable: 'MY_SECRET')]) {
    // Jenkins masks bound credentials in console output, but avoid
    // echoing secrets in real pipelines - this only shows the binding
    sh 'echo $MY_SECRET'
}

CSRF Protection

Enable crumb issuing to prevent cross-site request forgery:

  • Go to Manage Jenkins > Security > CSRF Protection
  • Enable the "Default Crumb Issuer"
  • For API calls, pass the crumb header or use API tokens

Hardening Jenkins

  • Use HTTPS: Configure behind a reverse proxy
  • Firewall: Only allow access from trusted IPs
  • Disable CLI: Unless needed
  • Limit script execution: Use Groovy sandbox
  • Audit logging: Enable audit trail plugin
  • Regular updates: Keep Jenkins and plugins patched

LESSON 11: CI/CD with Docker

Docker in Jenkins

Docker transforms CI/CD by providing consistent build environments. Let's explore how to use Docker in Jenkins pipelines.

Building Docker Images

pipeline {
    agent any
    environment {
        REGISTRY = 'docker.io'
        IMAGE_NAME = 'myapp'
        REGISTRY_CREDS = credentials('docker-hub')
    }
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        stage('Build Image') {
            steps {
                sh '''
                    docker build -t $IMAGE_NAME:$BUILD_NUMBER .
                    docker build -t $IMAGE_NAME:latest .
                '''
            }
        }
        stage('Test Image') {
            steps {
                sh '''
                    docker run --rm $IMAGE_NAME:$BUILD_NUMBER npm test
                '''
            }
        }
        stage('Push Image') {
            steps {
                sh '''
                    echo $REGISTRY_CREDS_PSW | docker login $REGISTRY -u $REGISTRY_CREDS_USR --password-stdin
                    docker push $IMAGE_NAME:$BUILD_NUMBER
                    docker push $IMAGE_NAME:latest
                    docker logout
                '''
            }
        }
    }
}

Docker-in-Docker (DinD)

Sometimes you need to run Docker inside Docker:

// In agent configuration or pipeline
pipeline {
    agent {
        docker {
            // Option A: true Docker-in-Docker (needs --privileged)
            image 'docker:24-dind'
            args '--privileged'
            // Option B: share the host daemon instead (no DinD):
            //   image 'docker:24-cli'
            //   args '-v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'docker build .'
            }
        }
    }
}

⚡ SECURITY: DinD with privileged mode is risky. Only use in trusted environments. Consider using Docker socket mounting instead when possible.

Multi-Stage Docker Builds

Build efficient, small images with multi-stage builds:

# Dockerfile
# Build stage
FROM node:18 AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Production stage
FROM node:18-alpine AS production
WORKDIR /app
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
CMD ["node", "dist/index.js"]

# In pipeline
stage('Build Optimized Image') {
    steps {
        sh 'docker build --target production -t myapp:optimized .'
    }
}

Docker Build Secrets

Handle secrets during build without exposing them in layers:

# BuildKit secrets
# Enable BuildKit
export DOCKER_BUILDKIT=1

# Build with secrets (env= needs a recent BuildKit; src=FILE also works)
docker build --secret id=npm,env=NPM_TOKEN .

# Dockerfile
RUN --mount=type=secret,id=npm \
    NPM_TOKEN=$(cat /run/secrets/npm) npm install

Docker Registry Integration

Amazon ECR

pipeline {
    environment {
        AWS_REGION = 'us-east-1'
        ECR_REGISTRY = "${AWS_ACCOUNT}.dkr.ecr.${AWS_REGION}.amazonaws.com"
    }
    stages {
        stage('Build & Push to ECR') {
            steps {
                sh '''
                    aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin $ECR_REGISTRY
                    docker build -t $ECR_REGISTRY/myapp:$BUILD_NUMBER .
                    docker push $ECR_REGISTRY/myapp:$BUILD_NUMBER
                '''
            }
        }
    }
}

LESSON 12: Jenkins & Kubernetes

Jenkins on Kubernetes

Running Jenkins in Kubernetes provides dynamic scaling, resource efficiency, and infrastructure as code. Let's set up Jenkins on Kubernetes and configure dynamic agents.

Deploying Jenkins to Kubernetes

# jenkins.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: jenkins
spec:
  replicas: 1
  selector:
    matchLabels:
      app: jenkins
  template:
    metadata:
      labels:
        app: jenkins
    spec:
      containers:
      - name: jenkins
        image: jenkins/jenkins:lts
        ports:
        - containerPort: 8080
        - containerPort: 50000
        volumeMounts:
        - name: jenkins-home
          mountPath: /var/jenkins_home
      volumes:
      - name: jenkins-home
        persistentVolumeClaim:
          claimName: jenkins-pvc
---
apiVersion: v1
kind: Service
metadata:
  name: jenkins
spec:
  type: LoadBalancer
  ports:
  - port: 80
    targetPort: 8080
  selector:
    app: jenkins
# Apply
kubectl apply -f jenkins.yaml

# Get admin password
kubectl exec -it jenkins-xxxxx -- cat /var/jenkins_home/secrets/initialAdminPassword

Kubernetes Plugin Configuration

  1. Install "Kubernetes" plugin
  2. Go to Manage Jenkins > Manage Clouds > Add a new cloud > Kubernetes
  3. Configure Kubernetes URL (or leave empty for in-cluster)
  4. Set namespace (or leave for default)
  5. Add Kubernetes pod templates:
# Pod template configuration:
# - Name: builder
# - Label: builder
# - Container template:
#   - Name: jnlp
#   - Image: jenkins/inbound-agent:latest
#   - Working directory: /home/jenkins/agent
#   - Command: "" (leave empty)
#   - Arguments: "" (leave empty)

Dynamic Agents in Pipeline

pipeline {
    agent {
        kubernetes {
            label 'jenkins-builder'
            defaultContainer 'builder'
            yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: builder
    image: node:18-alpine
    command:
    - cat
    tty: true
    volumeMounts:
    - name: workspace
      mountPath: /workspace
  volumes:
  - name: workspace
    emptyDir: {}
'''
        }
    }
    stages {
        stage('Build') {
            steps {
                container('builder') {
                    sh '''
                        cd /workspace
                        npm install
                        npm test
                    '''
                }
            }
        }
    }
}

Custom Agent Images

Create custom agent images for your tech stack:

# Dockerfile for custom agent - use the alpine variant so apk is
# available (package names below assume a recent Alpine release)
FROM jenkins/inbound-agent:alpine

# Install tools as root
USER root
RUN apk add --no-cache \
    nodejs \
    npm \
    python3 \
    py3-pip \
    docker-cli \
    kubectl \
    helm

# Create working directory
WORKDIR /home/jenkins/agent

# Switch back to the unprivileged jenkins user
USER jenkins

# Push to your registry
docker build -t myregistry/jenkins-agent:node18 .
docker push myregistry/jenkins-agent:node18

Then use in pipeline:

agent {
    kubernetes {
        label 'custom-agent'
        yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: jnlp
    image: myregistry/jenkins-agent:node18
'''
    }
}

Resource Management

Configure resource requests and limits:

agent {
    kubernetes {
        label 'builder'
        yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: builder
    image: node:18
    resources:
      requests:
        memory: "512Mi"
        cpu: "250m"
      limits:
        memory: "1Gi"
        cpu: "500m"
'''
    }
}

Conclusion

You've completed the Jenkins mastery guide. You now know how to:

  • Install and configure Jenkins
  • Create freestyle and pipeline jobs
  • Use Jenkinsfile for pipeline as code
  • Configure agents and distributed builds
  • Extend Jenkins with plugins
  • Run automated tests
  • Manage build artifacts
  • Secure your Jenkins installation
  • Integrate with Docker
  • Scale with Kubernetes

Next steps:

  • Build your own CI/CD pipelines
  • Explore the plugin ecosystem
  • Set up Jenkins on Kubernetes
  • Integrate with your deployment tools