Block current Jenkins Job until the other job is completed

  • Install Build Blocker Jenkins plugin

Build Blocker plugin

  • Go to Manage Jenkins > Manage Plugins and install Build Blocker plugin

  • Open a Jenkins job, click Configure, and observe the Block build if certain jobs are running option added to the job configuration

  • You can add multiple jobs, one per line; a string prefix or suffix is allowed too
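  • For example, the plugin's blocking-jobs text area accepts one job-name pattern per line, and regular expressions are allowed, so prefix or suffix patterns work (the job names below are hypothetical):

```
deploy_production
.*_integration_tests
nightly_.*
```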

  • Simple Jenkins Job DSL for Build Blocker
blockOn('Job_name_to_wait')
  • Jenkins DSL for Build Blocker with options (single job)
blockOn('Job_name_to_wait', {
    // blockLevel: 'GLOBAL' or 'NODE'; scanQueueFor: 'DISABLED', 'BUILDABLE' or 'ALL'
    blockLevel('GLOBAL')
    scanQueueFor('DISABLED')
})
  • Jenkins DSL for Build Blocker with multiple jobs
blockOn(['job_name_1', 'job_name_2']) {
    blockLevel('GLOBAL')
    scanQueueFor('DISABLED')
}


Auto-trigger Jenkins job on GitHub commit

  • Go to Jenkins job > Build Triggers, select GitHub hook trigger for GITScm polling, and save the job (make sure the GitHub plugin is installed)

  • A similar action can be performed using the following Jenkins Job-DSL snippet
triggers {
    githubPush()
}
  • Go to GitHub project Repository > Settings > Webhooks
  • Click Add webhook and provide the WebHook URL
  • The payload URL below is what Jenkins uses to receive a request from the remote GitHub repository whenever a commit (push) is made

  • If your Jenkins server has a public IP, use it. For local and testing purposes, use the ngrok tool or just port-forward your local machine IP
  • ngrok exposes local servers behind NATs and firewalls to the public internet over secure tunnels
  • Download ngrok from the below link
  • Unzip it and open a terminal to run the given cmd
./ngrok http 8080
  • Copy the public forwarding URL that ngrok generates

  • Copy & paste it as the payload URL in GitHub Webhooks so GitHub posts to the API whenever a git push is made
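  • Putting the pieces together, the payload URL is the ngrok forwarding address with the Jenkins GitHub plugin's /github-webhook/ endpoint appended (the ngrok subdomain below is hypothetical):

```shell
# hypothetical ngrok forwarding address; replace with the URL ngrok prints
NGROK_URL="https://abc123.ngrok.io"
# the Jenkins GitHub plugin listens for push payloads on /github-webhook/
PAYLOAD_URL="${NGROK_URL}/github-webhook/"
echo "$PAYLOAD_URL"
```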

  • Go to Jenkins > Manage Jenkins > Configure System
  • You will see GitHub section if the github plugin is installed
  • Click on Advanced options

  • Copy & paste the same hook URL into the GitHub configuration

  • Observe the ngrok client when a push is made in GitHub (GitHub sends API calls through the given payload URL)

  • Eventually, Jenkins recognizes the commit made on the remote and triggers the respective Jenkins job

Dockerize and integrate SonarQube with Jenkins

This post builds on basic knowledge from the previous post. Let’s see how to integrate SonarQube with Jenkins for code-quality analysis in a live Docker container

Dockerize SonarQube

  • Create a docker-compose.yml file with the sonarqube and postgres latest images
  • Make sure you have the sonar and sonar-scanner libs pre-installed on your local machine
  • Set the login username and password as admin while executing the runner
sonar-scanner -Dsonar.projectKey=project_key -Dsonar.sources=. -Dsonar.login=admin -Dsonar.password=admin
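A minimal docker-compose.yml along these lines might look like the sketch below (the service names, credentials, and SONAR_JDBC_* variables are assumptions based on the official images' documented defaults):

```yaml
version: "3"
services:
  sonarqube:
    image: sonarqube:latest
    depends_on:
      - db
    ports:
      - "9000:9000"   # SonarQube UI at http://localhost:9000
    environment:
      - SONAR_JDBC_URL=jdbc:postgresql://db:5432/sonar
      - SONAR_JDBC_USERNAME=sonar
      - SONAR_JDBC_PASSWORD=sonar
  db:
    image: postgres:latest
    environment:
      - POSTGRES_USER=sonar
      - POSTGRES_PASSWORD=sonar
      - POSTGRES_DB=sonar
```

Bring it up with `docker-compose up -d` and wait until SonarQube reports it is operational before running the scanner.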

Jenkins Integration

  • Install SonarQube Scanner Jenkins plugin

  • Go to Manage Jenkins > Configure System and update the SonarQube servers section

  • Go to Manage Jenkins > Global Tool Configuration and update SonarQube Scanner as in the below image

  • Now, create a Jenkins job and set up SCM (say, git)
  • Choose Build > Execute SonarQube Scanner in the job configuration

  • Now, provide the required sonar properties in the Analysis properties field (mention the path to the test source directories in the sonar.sources key)
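  • As a sketch, the Analysis properties field could contain key=value pairs like the following (the project key, source path, and host URL are placeholders):

```
sonar.projectKey=project_key
sonar.projectName=Your Project
sonar.sources=.
sonar.host.url=http://localhost:9000
```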

  • These sonar properties can also be served from a file inside the project (the scanner conventionally reads sonar-project.properties; see GitHub for more details)
  • Now, update the Path to project properties field in the job's build step

  • Observe the results at the Docker container’s host URL

Jenkins environment variable setup

Manual Feed

# set custom job name
# (note: by default, you can use $JOB_NAME, which doesn't require any env setup)

# create a shell file and add the below statement
echo 'export VN_NAME="$jobName"' > ~/

# set user privilege
chmod 750 ~/

# load the env variable into the current session
source ~/
echo $VN_NAME

Jenkins Plugin to inject ENV variable

  • By default, you will have the Jenkins plugin EnvInject API Plugin installed
  • Go to Job > Build Environment and select Inject environment variables to the build process
  • Set the variable key and value as below
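  • The same injection can be sketched in Job DSL (assuming the EnvInject plugin is installed; the job and variable names below are hypothetical):

```groovy
job('your_jenkins_job_name') {
    wrappers {
        environmentVariables {
            env('VN_NAME', 'custom_job_name')
        }
    }
}
```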

Default Jenkins Configure

The environment variables can also be set through Global properties

  • Go to Manage Jenkins > Configure System > Global properties
  • Add the variable key and value


Trigger multi Jenkins jobs in parallel – Pipeline project

This post helps you understand how to parallelize multiple Jenkins jobs in a single run. If you need a basic understanding of how a pipeline Jenkins job works, please follow this post

Approach #1

  • All the stages declared under a parallel block will be executed in parallel; let’s say,
parallel {
    // the below 3 stages execute at the same time
    stage('Parallel Test 1') {
        steps {
            build(job: "jenkins_job_1")
        }
    }
    stage('Parallel Test 2') {
        steps {
            // the below jobs run sequentially one after the other
            build(job: "jenkins_job_1")
            build(job: "jenkins_job_2")
            build(job: "jenkins_job_3")
        }
    }
    stage('Parallel Test 3') {
        steps {
            echo "executing at last"
        }
    }
}
  • The below image depicts how the multiple Jenkins jobs were triggered at the same time

  • Copy the below example Jenkinsfile for practice
  • Blue Ocean view after completion of the pipeline job
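The example Jenkinsfile itself is not reproduced here; a minimal declarative sketch wrapping a parallel block like the one above might look as follows (the job names are hypothetical):

```groovy
pipeline {
    agent any
    stages {
        stage('run in parallel') {
            parallel {
                stage('Parallel Test 1') {
                    steps {
                        build(job: 'jenkins_job_1')
                    }
                }
                stage('Parallel Test 2') {
                    steps {
                        build(job: 'jenkins_job_2')
                        build(job: 'jenkins_job_3')
                    }
                }
            }
        }
    }
}
```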


Approach #2

In this approach, we will see how to group multiple jobs and run the groups in parallel (this extends the first approach to a further level of parallelization in a Jenkins pipeline project)

  • Custom syntax to group multiple jobs to run in parallel
stages {
    stage('single run') {
        parallel {
            stage('Parallel Test 1') {
                steps {
                    script {
                        def group1 = [:]
                        group1['test_1'] = {
                            build(job: 'jenkins_job_1')
                        }
                        group1['test_2'] = {
                            build(job: 'jenkins_job_2')
                        }
                        parallel group1
                    }
                }
            }
            stage('Parallel Test 2') {
                steps {
                    script {
                        def group2 = [:]
                        group2['test_3'] = {
                            build(job: 'jenkins_job_3')
                        }
                        group2['test_4'] = {
                            build(job: 'jenkins_job_4')
                        }
                        parallel group2
                    }
                }
            }
        }
    }
}
  • Copy the below example Jenkinsfile for practice
  • Blue Ocean view after completion of the pipeline job

Minikube Dockerized Selenium Grid

Minikube runs a local Kubernetes cluster orchestrating Docker containers, which can be used locally for testing purposes

  • Install minikube (this config is for macOS); for other platforms, follow this
brew cask install minikube

# check version
minikube version

# or download the binary directly (this URL assumes macOS amd64)
curl -Lo minikube https://storage.googleapis.com/minikube/releases/latest/minikube-darwin-amd64 && chmod +x minikube && sudo cp minikube /usr/local/bin/ && rm minikube
  • Let’s configure the Kubernetes system in interactive mode; what we need first is to start minikube
minikube start

# check status
minikube status
  • Now, try to launch the minikube dashboard; this opens the default browser with the interface seen below,
minikube dashboard

# to get the url alone
minikube dashboard --url=true

  • Now, click on the create link at the top corner of the dashboard page

  • Import the below kubernetes_selenium_grid.json file created by me; it generates a Selenium hub service, 1 Hub, and 1 Chrome Node; you can increase the Nodes manually from the dashboard itself
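  • To scale the Chrome nodes from the command line instead of the dashboard, kubectl's scale command can be used (the deployment name selenium-node-chrome is an assumption; check `kubectl get deployments` for the actual name):

```
# scale the chrome-node deployment to 3 replicas (deployment name assumed)
kubectl scale deployment selenium-node-chrome --replicas=3
```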

  • Observe the Service created, which is up and running
kubectl get service

  • Observe the created Selenium Hub and Chrome node, which is up and running
# to check all the deployments
kubectl get deployments
kubectl get deployments --namespace default

# to check all the pods (containers)
kubectl get pods
kubectl get pods --namespace default

  • Get the newly generated Selenium host address and port number; here, selenium-hub is the custom service created by me
minikube service selenium-hub --url
minikube service --namespace default selenium-hub --url

  • Now, configure the driver initialization
options = {
  'chromeOptions' => {
    'args' => ['disable-infobars', 'disable-gpu', 'privileged', 'ignore-certificate-errors']
  }
}

caps = Selenium::WebDriver::Remote::Capabilities.chrome(options)
# paste the hub URL printed by `minikube service selenium-hub --url`
@driver = Selenium::WebDriver.for :remote, :url => "", desired_capabilities: caps
  • Run your tests; they will pick up the Chrome node that we created through Kubernetes

Jenkins Job DSL for beginners

This post helps you create a basic Jenkins Job-DSL Groovy file for a previously created ready-made free-style Jenkins job. Follow the previous post to set up a main DSL job that automates all the Jenkins jobs:

  • Create a Jenkins Job-DSL groovy file, your_jenkins_job_name.groovy
freeStyleJob('your_jenkins_job_name') {
    description 'your project description'
    logRotator {
        numToKeep(100)
    }
    scm {
        git {
            remote {
                // provide the credentials to let you clone the git repo
                url('https://github.com/your_org/your_repo.git')  // placeholder repo URL
                credentials('your_credentials_id')                // placeholder credentials ID
            }
        }
    }

    // insert multiple shell files to execute sequentially
    steps {
        shell('bash your_shell_1.sh')  // placeholder script names
        shell('bash your_shell_2.sh')
    }

    // enable Jenkins Global passwords
    wrappers {
        injectPasswords { injectGlobalPasswords() }
    }

    // publish a consolidated html report in the Jenkins dashboard
    publishers {
        publishHtml {
            report('your_project/reports/') {
                reportName('Report Title')
                reportFiles('your_report_1.html, your_report_2.html, your_report_3.html')
            }
        }
    }
}
  • The above Groovy script can also be generated ready-made through a Jenkins plugin, XML Job to Job DSL
  • Go to Manage Jenkins > Manage Plugins > Available
  • Search for the XML Job to Job DSL plugin, install it, and restart Jenkins after the download

  • Now, you will find an option to convert any of your Jenkins jobs to DSL
  • Click on the XML Job to DSL link, select the job to be converted, and click on the Convert selected to DSL button

  • Finally, download the DSL file