Set Up a Jenkins Local DevOps Environment Using Docker and WSL2

  • Test a simple Python app, which will be hosted in a repository on the local Git server
  • Deploy the script/app to a target S3 bucket on our local mock S3 service, which in a real-world scenario might be a component of an AWS-hosted process

Before Commencing…

  • Windows 10 Pro (2004)
  • WSL2 enabled with Ubuntu set as the default distro, i.e. running the following commands produces output similar to:
PS C:\> wsl --list --verbose
  NAME      STATE           VERSION
* Ubuntu    Running         2

$ lsb_release -a
Description:    Ubuntu 20.04.1 LTS
Release:        20.04
Codename:       focal
  • WSL2 based Docker engine has been enabled via Docker settings
  • All commands throughout the article have been executed from a WSL prompt, with related folders also residing on the WSL2 filesystem under parent $HOME/dev-env

1. Local Git Service for SCM — using Gogs

Running Gogs as a Local Git Server

$ mkdir -p $HOME/dev-env/gogs/data
$ docker run -d --name=gogs -p 10022:22 -p 10080:3000 -v $HOME/dev-env/gogs/data:/data gogs/gogs

Configure Git Service

Signup to Local Service and Create Repo

  • Create a new repo named first-repo

Method 1: WSL2 Using HTTP

  • To initialise the repo and create/push a README, run the following on the host:
$ mkdir -p $HOME/dev-env/my-repos/first-repo
$ cd $HOME/dev-env/my-repos/first-repo
$ echo "# My First Repo" > README.md
$ git init
$ git add README.md
$ git commit -m "first commit"
$ git remote add origin http://git-user@localhost:10080/git-user/first-repo.git
$ git push -u origin master
  • You’ll be asked for the local server’s Git credentials
Password for 'http://git-user@localhost:10080': ********
Enumerating objects: 3, done.
Counting objects: 100% (3/3), done.
Writing objects: 100% (3/3), 208 bytes | 208.00 KiB/s, done.
Total 3 (delta 0), reused 0 (delta 0)
  • The repo’s config file at .git/config should contain the correct remote
[remote "origin"]
url = http://git-user@localhost:10080/git-user/first-repo.git
fetch = +refs/heads/*:refs/remotes/origin/*
  • To add a git identity for the repo:
$ git config -f $HOME/dev-env/my-repos/first-repo/.git/config \
    --add user.name git-user

$ git config -f $HOME/dev-env/my-repos/first-repo/.git/config \
    --add user.email git-user@localhost
Method 2: WSL2 Using SSH

  • Ensure that your SSH public key has been installed onto the local Git server
  • Update the SSH config on the host to reflect the correct private key location to use for the SSH connection to the Git service:
host localhost
    IdentityFile ~/.ssh/id_rsa
  • To test SSH connectivity:
$ ssh -T git@localhost -p 10022

Hi there, You've successfully authenticated, but Gogs does not provide shell access.
If this is unexpected, please log in with password and setup Gogs under another user.
  • Create and commit/push from host to remote:
$ mkdir -p $HOME/dev-env/my-repos/first-repo
$ cd $HOME/dev-env/my-repos/first-repo
$ echo "# My First Repo" > README.md
$ git init
$ git add README.md
$ git commit -m "first commit"
$ git remote add origin ssh://git@localhost:10022/git-user/first-repo.git
$ git push -u origin master
  • A quick check of the repo’s config file shows the correct SSH endpoint address in use:
[remote "origin"]
url = ssh://git@localhost:10022/git-user/first-repo.git
fetch = +refs/heads/*:refs/remotes/origin/*
  • Once configuration is complete, you can stop/remove the container. We will be using it later on within the docker-compose service definitions
  • At this point, you can start to work with the repo in VSCode by running:
$ cd $HOME/dev-env/my-repos/first-repo
$ code .

Working with Repos Outside of WSL2

2. Local S3 Cloud Service

$ mkdir $HOME/dev-env/ninja
$ docker run -d -p 9444:9000 -v $HOME/dev-env/ninja:/home/sirius/data scireum/s3-ninja:6.4
  • Make note of the Access/Secret Keys. These will be used to configure connection settings for use in pipelines.
  • Stop/remove the container

3. Jenkins Docker Plugin and Agent Image


Creating a Custom Jenkins Agent Image

FROM jenkins/agent:latest-stretch-jdk11

ARG user=jenkins
# Docker CLI version to install; adjust to the current stable release as needed
ARG DOCKERVERSION=19.03.12

LABEL Description="This image is derived from jenkins/agent openjdk11. \
It includes docker static binary"

USER root

# Download the static Docker binary and extract the docker CLI into the PATH
RUN curl -fsSLO https://download.docker.com/linux/static/stable/x86_64/docker-${DOCKERVERSION}.tgz \
    && tar xzvf docker-${DOCKERVERSION}.tgz --strip 1 \
       -C /usr/local/bin docker/docker \
    && rm docker-${DOCKERVERSION}.tgz

WORKDIR /home/${user}

USER ${user}
  • Build and tag the image using:
docker build --rm -t jenkins/agent:custom .

4. Jenkins Master Node

$ mkdir $HOME/dev-env/jenkins_home

$ docker run -p 8080:8080 -p 50000:50000 -d \
-v $HOME/dev-env/jenkins_home:/var/jenkins_home jenkins/jenkins:lts
  • Run the container and configure Jenkins from address http://localhost:8080
  • The initial admin password can be located via the host, at:
$ cat $HOME/dev-env/jenkins_home/secrets/initialAdminPassword
  • During the setup process, choose Install Suggested Plugins and follow the instructions below to add the required plugins for Docker/Docker Pipelines.

Install Docker, Docker Pipeline & AWS Plugins

  • Once config is complete, install the Docker plugin by choosing Manage Jenkins --> Manage Plugins --> Available and searching for Docker.
  • Choose Install without restart
  • Repeat the procedure for the Docker Pipeline and AWS plugins

Configure Docker Cloud Template

  • Go to Manage Jenkins --> Manage Nodes and Clouds --> Configure Clouds
  • From the Add New Cloud drop down, choose Docker
  • Configure a new Docker Agent template, referencing the custom image jenkins/agent:custom built earlier.

Add Credential Provider for Git & AWS

  • Navigate to Manage Jenkins --> Configure Credential Providers and add the required credential types

Configure GIT & AWS Credentials on Jenkins Master Node

  • Manage Jenkins --> Manage Credentials --> New Item
  • Using the S3 Secret/Access keys noted in section Local S3 Cloud Service, configure access credentials for the S3 bucket.

5. Create docker-compose.yml

  • $HOME/dev-env/docker-compose.yml
version: '3.7'

services:

  github_mock:
    image: gogs/gogs
    container_name: github_mock
    volumes:
      - ./gogs/data:/data
    ports:
      - "10022:22"
      - "10080:3000"
    networks:
      - dev_ops

  s3_mock:
    image: scireum/s3-ninja:6.4
    container_name: s3_mock
    hostname: aws-s3
    volumes:
      - ./ninja:/home/sirius/data
    ports:
      - "9444:9000"
    networks:
      - dev_ops

  jenkins_master:
    image: jenkins/jenkins:lts
    container_name: jenkins_master
    depends_on:
      - github_mock
      - s3_mock
    volumes:
      - ./jenkins_home:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock
    links:
      - s3_mock
      - github_mock
    ports:
      - "8080:8080"
      - "50000:50000"
    networks:
      - dev_ops

networks:
  dev_ops:
    ipam:
      driver: default
      config:
        - subnet: ""
  • jenkins_master: the Jenkins master node
  • github_mock: the Gogs Git service
  • s3_mock: the S3 Ninja mock S3 service
  • The Jenkins docker agent container is not referenced in the docker-compose. It is provisioned as requested by pipeline node/agent directives. In the configuration of the agent Cloud template, we’ve included the name of the docker compose network that the agent container should join during initialisation.

6. Start the Environment

  • Stop/remove all containers launched earlier, if you haven’t already done so
  • Start the services using docker-compose
$ cd $HOME/dev-env
$ sudo docker-compose up -d
$ sudo chmod a+rwx /var/run/docker.sock
  • Note: The sudo chmod a+rwx /var/run/docker.sock, included in the above block of commands, was required to address a Permission Denied issue when attempting to connect from the Jenkins node to the Docker daemon at the internal container mount point /var/run/docker.sock. This should not be done in a production environment; it was used as a workaround for this specific setup.
  • To check the containers & internal IPs:
$ docker ps -q | xargs docker inspect \
--format='{{ .Name }} {{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}'

  • The above shows 3 containers with their respective IP addresses on the internal docker network created as part of the docker-compose.yml
$ docker network ls

NETWORK ID     NAME              DRIVER    SCOPE
41b15682d16f   dev-env_dev_ops   bridge    local

7. Testing the Environment

Jenkins Pipeline Overview

  • Fetch the Git repo (first-repo)
  • Build the image using the repo’s Dockerfile and assign name/tag as hello_py:1
  • Run the image as a container on the agent/slave and execute some commands from within this container, including running the Python script
  • Fetch the same git repo first-repo into the master node's workspace
  • Push the app to our self hosted S3 service bucket, s3://my-app/

Create & Upload Components to Git

  • Create a Python script containing print("Hello World")
  • Create a Dockerfile containing the necessary steps to derive an image based on python:3.8, which we will use to test our Python script
FROM python:3.8

# set the working directory in the container
WORKDIR /usr/src/app

# copy the content of the local src directory to the working directory
COPY src/ .

# command to run on container start
CMD ["python", "./hello.py"]
  • Commit/push our code and Dockerfile to the repo
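The commit/push step can be sketched as a runnable sequence; here a local bare repository stands in for the Gogs remote so the flow can be exercised anywhere, and the file contents are abbreviated (against the real service, the remote URL would be the http://git-user@localhost:10080/git-user/first-repo.git endpoint configured earlier):

```shell
set -e

# Stand-in for the Gogs remote: a local bare repository
remote="$(mktemp -d)/first-repo.git"
git init --bare "$remote"

# Working copy with the app script and Dockerfile
work="$(mktemp -d)/first-repo"
mkdir -p "$work"
cd "$work"
git init .
git config user.name  git-user
git config user.email git-user@example.com     # placeholder identity

echo 'print("Hello World")' > hello.py         # the app script (name assumed)
echo 'FROM python:3.8'      > Dockerfile       # abbreviated; full contents as above

git add hello.py Dockerfile
git commit -m "add python app and Dockerfile"
git remote add origin "$remote"
git push -u origin HEAD                        # HEAD avoids master/main naming differences
```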

Creating the pipeline

  • Create a new pipeline from Jenkins Master node via the portal’s New Item selection.
  • Enter local-devops-net-test as the name of the pipeline
  • In the Definition section, choose Pipeline script
  • Paste the following pipeline definition into the Script input area
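A declarative pipeline along the following lines covers the stages listed in the overview. The agent label, credential IDs, and internal hostnames/ports here are assumptions for illustration; substitute the values from your own Cloud template, credentials, and docker-compose definitions:

```groovy
pipeline {
    agent { label 'docker-agent' }   // label assigned to the custom agent Cloud template
    stages {
        stage('Checkout') {
            steps {
                // Gogs service via its internal compose hostname/port
                git url: 'http://github_mock:3000/git-user/first-repo.git',
                    credentialsId: 'gogs-creds'          // assumed credential ID
            }
        }
        stage('Build image') {
            steps {
                sh 'docker build -t hello_py:1 .'
            }
        }
        stage('Test in container') {
            steps {
                // run the image; CMD executes the python script
                sh 'docker run --rm hello_py:1'
            }
        }
        stage('Push to S3') {
            steps {
                // assumed credential ID for the S3 Ninja access/secret keys
                withAWS(endpointUrl: 'http://aws-s3:9000', credentials: 'aws-s3-creds') {
                    s3Upload(bucket: 'my-app', includePathPattern: '*.py')
                }
            }
        }
    }
}
```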
  • Once the pipeline has been saved, select Build Now from pipeline menu choices. If all goes well, you should see something similar to the following in the console output.

Running the Pipeline using Jenkinsfile from Git

  • Create a Jenkinsfile with the contents of the pipeline definition above, and commit/push to first-repo
  • Modify the pipeline configuration to use the Jenkinsfile from repository first-repo
  • Rerun the build, and if all goes well, the console output should show that the Jenkinsfile was fetched from the Git repo.

Closing Remarks



Tony Tannous

Learner. Interests include Cloud and Devops technologies.