Deploy AWS Lambda Functions to Local/Cloud Environments with Jenkins and Terraform

In a previous post, I documented learnings from setting up a local DevOps environment.

This article applies a potential use case for that environment by adding support for Terraform. The following overview of the goals should help you decide whether to read on:

Tweak/expand the local DevOps environment, mentioned above, to include support for Terraform and Localstack.

Use the environment to create a Continuous Delivery Jenkins pipeline to deploy a Python-based Lambda function to Localstack and AWS Cloud.

The Lambda function is to be triggered by an S3 event, in response to a csv file being uploaded to an input bucket. Once triggered, the Lambda processes the file and exports results to an output bucket.
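A minimal sketch of such a handler is shown below. This is illustrative only, not the repository's actual source; the output key suffix, helper name and lazy boto3 import are assumptions based on the behaviour described later in the article:

```python
import io
import os

import pandas as pd


def transpose_csv(csv_bytes: bytes) -> bytes:
    """Load CSV bytes into a Pandas DataFrame, transpose it, return CSV bytes."""
    df = pd.read_csv(io.BytesIO(csv_bytes), header=None)
    return df.T.to_csv(index=False, header=False).encode()


def lambda_handler(event, context):
    """Triggered by an S3 ObjectCreated event on the input bucket."""
    import boto3  # imported lazily so transpose_csv stays testable without AWS

    s3 = boto3.client("s3")
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

    # e.g. "input.csv" -> "input_transposed.csv" (naming convention assumed)
    out_key = key.rsplit(".", 1)[0] + "_transposed.csv"
    s3.put_object(
        Bucket=os.environ["OUTPUT_BUCKET"],
        Key=out_key,
        Body=transpose_csv(body),
    )
    return {"output_key": out_key}
```

The transpose logic is kept separate from the S3 plumbing so it can be tested locally without any AWS endpoint.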

The key topics and learning objectives are:

  • Use docker-compose to bring up a stack comprising three networked containers running the following services:
    ◽ Gogs — locally hosted Git SCM
    ◽ Localstack — mock AWS environment
    ◽ Jenkins node — CI/CD
  • Using the stack, create a Jenkins Continuous Delivery pipeline to package and deploy a sample Python-based Lambda function using Terraform
  • Configure the pipeline to deploy the Lambda to
    ◽ Localstack, deployed automatically on each run
    ◽ AWS Cloud environment, deployed only after interactive prompt confirmation
  • The Lambda’s primary purpose is to load a csv file from an input bucket and transpose the contents, using a Pandas DataFrame, into a given output bucket
  • Configure the Lambda function to be triggered by an event, attached to the input bucket configuration
  • Set up a Gogs Webhook to automatically trigger the Jenkins job when changes are pushed to the locally hosted SCM
  • Test the deployed Lambda function by uploading a sample file to the input bucket

All components required to achieve the final framework are included in a git repo, which forms the basis for establishing the setup.

Preparing Local Environment with docker-compose

Clone Git Repository

Clone the repository mentioned above from GitHub. This repository will eventually make its way onto our local, self-hosted Git SCM, aka Gogs.

$ cd $HOME
$ git clone \
$ cd jenkins-deploy-lambda-terraform
$ rm -fr .git

Create Host Directories

Create host directories to be used for persisting data from container mounts.

$ mkdir -p /tmp/gogs/data
$ mkdir -p /tmp/jenkins_home
$ mkdir -p /tmp/localstack

Build Custom Jenkins Docker Image

Build the custom Jenkins docker image, which includes the AWS CLI, Terraform and Python distribution.

$ cd $HOME/jenkins-deploy-lambda-terraform
$ docker build -t jenkins/jenkins:master .

Bring up the Stack

Using docker-compose, bring up the stack using:

$ AWS_CRED=$HOME/.aws TMPDIR=/tmp/localstack docker-compose up -d

This brings up 3 containers: Gogs, Localstack and the Jenkins node. Localstack is accessible from the host at http://localhost:4566 or, if accessing from the internal docker-compose network, at http://localstack:4566. The Gogs and Jenkins endpoints are exposed via the port mappings defined in docker-compose.yml.
Note on AWS Credentials

In the command used to bring up the stack, AWS_CRED is set to the host location of the AWS config/credentials folder ($HOME/.aws), which is mounted into the home directory (/var/jenkins_home) of the user running within the Jenkins container.

version: '3.7'
...
    volumes:
      - ${AWS_CRED}:/var/jenkins_home/.aws

Two profiles should be added to the host’s ~/.aws/config file to support deployment to each target environment, i.e. Localstack (aws_local) and AWS Cloud (aws_cloud).


[profile aws_local]
region = us-east-1
output = json
aws_access_key_id = test
aws_secret_access_key = test

[profile aws_cloud]
region = us-east-1
output = json
aws_access_key_id = <your AWS key>
aws_secret_access_key = <your AWS secret>
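Note that in the AWS CLI config file (as opposed to the credentials file), profile sections are prefixed with the word `profile`. As a quick sanity check, the section naming can be verified with Python's configparser; this is just a sketch parsing an in-memory copy of the config shown above:

```python
import configparser

# In-memory copy of the ~/.aws/config content shown above (keys are dummies)
aws_config = """
[profile aws_local]
region = us-east-1
output = json
aws_access_key_id = test
aws_secret_access_key = test

[profile aws_cloud]
region = us-east-1
output = json
"""

parser = configparser.ConfigParser()
parser.read_string(aws_config)

# Strip the "profile " prefix to recover the names used with --profile
profiles = [section.split(" ", 1)[1] for section in parser.sections()]
```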

Configure Gogs SCM and Upload Code Repository

Configure Gogs — Locally Hosted Git

To configure Gogs as a local Git SCM host, open the Gogs web UI from the host and complete the configuration. Refer to Appendix A for config values; the remainder of the article uses these values.

Once Gogs has been configured, we can create an empty repository to host our source.

Create Git Repository on Gogs Host and Push Code

From the Gogs web UI, create an empty repository named jenkins-deploy-lambda-terraform, leaving "Initialise this repository with selected files and template" unticked.

Initialise the local repo we cloned earlier and push it to our self-hosted remote.

The repository should then be accessible from the Gogs web UI.


Overview of Repository Components

The following is a listing of the repository source.

├── Dockerfile
├── Jenkinsfile
├── Pipfile
├── Pipfile.lock
├── my-lambda
│ ├── iam
│ │ ├── lambda_iam_policy.json
│ │ └── lambda_iam_trust_policy.json
│ └── src
│ └──
├── lambda-input-test-file.csv
├── aws_local
│ ├──
│ ├──
│ └──
├── aws_cloud
│ ├──
│ ├──
│ └──
├── docker-compose.yml

The main components are:

  • my-lambda: Lambda source code and IAM policy definitions
  • lambda-input-test-file.csv: Test input file for Lambda function
  • aws_local: Terraform components for deployment to Localstack
  • aws_cloud: Terraform components for deployment to AWS Cloud account
  • Jenkinsfile: Contains the deployment pipeline

Walkthrough of Terraform Code for Lambda Deployment

The folder structures hosting Terraform code, aws_local and aws_cloud, are almost identical, therefore only one of them (aws_local) is covered in the walkthrough. The differences are mainly related to provider configuration, i.e. Localstack vs AWS Cloud.


The variables file, variables.tf, contains the Terraform variables along with comments describing their purpose.

# AWS Region
variable "aws_region" {
  type    = string
  default = "us-east-1"
}

# Application name to include in names of AWS resources
variable "app_name" {
  type    = string
  default = "transposer"
}

# AWS Account (for Localstack, value is zeroes)
variable "aws_account" {
  type    = string
  default = "000000000000"
}

# AWS profile to source credentials
variable "aws_profile" {
  type    = string
  default = "aws_local"
}

# Source name and location containing Lambda zip.
# Zip is created during the Jenkins pipeline.
variable "lambda_zip" {
  type    = string
  default = "../dist/"
}

# Deployment target - AWS Cloud (aws_cloud)
# or Localstack (aws_local)
variable "env" {
  description = "Env - localstack or cloud"
  type        = string
  default     = "aws_local"
}

The main file, main.tf, contains the Terraform resources required for deploying the Lambda. Comments/descriptions are included for each resource.

# Localstack provider configuration
provider "aws" {
  region                      = var.aws_region
  profile                     = var.aws_profile
  s3_force_path_style         = true
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true
  insecure                    = true

  endpoints {
    acm            = "https://localstack:4566"
    apigateway     = "https://localstack:4566"
    cloudformation = "https://localstack:4566"
    cloudwatch     = "https://localstack:4566"
    dynamodb       = "https://localstack:4566"
    ec2            = "https://localstack:4566"
    es             = "https://localstack:4566"
    firehose       = "https://localstack:4566"
    iam            = "https://localstack:4566"
    kinesis        = "https://localstack:4566"
    kms            = "https://localstack:4566"
    lambda         = "https://localstack:4566"
    rds            = "https://localstack:4566"
    route53        = "https://localstack:4566"
    s3             = "https://localstack:4566"
    secretsmanager = "https://localstack:4566"
    ses            = "https://localstack:4566"
    sns            = "https://localstack:4566"
    sqs            = "https://localstack:4566"
    ssm            = "https://localstack:4566"
    stepfunctions  = "https://localstack:4566"
    sts            = "https://localstack:4566"
  }
}

# Create IAM role with trust relationship for lambda service
resource "aws_iam_role" "iam_role_lambda" {
  name = var.app_name

  assume_role_policy = file("${path.module}/../my-lambda/iam/lambda_iam_trust_policy.json")
}

# IAM policy template
data "template_file" "lambda_iam_policy" {
  template = file("${path.module}/../my-lambda/iam/lambda_iam_policy.json")
  vars = {
    app_name    = var.app_name
    aws_region  = var.aws_region
    aws_account = var.aws_account
  }
}

# Create the policy
resource "aws_iam_policy" "iam_policy" {
  name        = var.app_name
  path        = "/"
  description = "IAM policy for lambda"
  policy      = data.template_file.lambda_iam_policy.rendered
}

# Attach policy to IAM role
resource "aws_iam_role_policy_attachment" "policy_for_lambda" {
  role       = aws_iam_role.iam_role_lambda.name
  policy_arn = aws_iam_policy.iam_policy.arn
}

# Input bucket used by Lambda
resource "aws_s3_bucket" "in_bucket" {
  bucket = "${var.app_name}-input"
}

# Output bucket used by lambda
resource "aws_s3_bucket" "out_bucket" {
  bucket = "${var.app_name}-output"
}

# Create Lambda function
resource "aws_lambda_function" "func" {
  filename      = "../dist/${var.lambda_zip}"
  function_name = var.app_name
  role          = aws_iam_role.iam_role_lambda.arn
  handler       = "my-lambda.lambda_handler"

  depends_on = [aws_iam_role_policy_attachment.policy_for_lambda]

  source_code_hash = filebase64sha256("../dist/${var.lambda_zip}")

  runtime = "python3.8"

  # Lambda function environment variables
  environment {
    variables = {
      OUTPUT_BUCKET = "${var.app_name}-output"
    }
  }
}

# Add permissions to allow s3 to trigger lambda function
resource "aws_lambda_permission" "allow_s3_trigger" {
  statement_id  = "AllowTriggerFromS3"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.func.arn
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.in_bucket.arn
}

# s3 event config:
# Configure bucket notification for triggering lambda
# when s3 csv file uploaded to input bucket
resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = aws_s3_bucket.in_bucket.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.func.arn
    id                  = "s3_trigger"
    events              = ["s3:ObjectCreated:*"]
    filter_suffix       = "csv"
  }

  depends_on = [aws_lambda_permission.allow_s3_trigger]
}

Configuring Terraform Variables for AWS Cloud Deployment

Before deployment to AWS Cloud, the aws_account variable in the aws_cloud Terraform configuration needs to be updated to reflect details of the target AWS Account.


Modify the variables file, replacing xxxxxxxxxxxx with the target AWS Cloud account ID:

variable "aws_account" {
  type    = string
  default = "xxxxxxxxxxxx"
}

Jenkins Configuration and Deployment Pipeline

Configure Jenkins and Install Gogs Plugin

Visit the Jenkins web UI and complete the Jenkins setup.

Once complete, add the Gogs plugin via Jenkins’ Manage Plugins screen. This plugin is required for establishing Webhooks between Gogs and Jenkins.

Create New Jenkins Job

Create a new Jenkins job:

  • New Item
  • Enter an item name → jenkins-deploy-lambda-terraform
  • Choose Pipeline for job type

On the job configuration screen, select/enter the following values:

Gogs Webhook

  • Enable option → Use Gogs secret
  • Secret → <choose a password> (example, mysecretlambda)
    This will be required in the following section, when setting up the Webhook from within Gogs

Build Triggers

  • Enable option → Build when a change is pushed to Gogs


Pipeline

  • Definition → Pipeline script from SCM
  • SCM → Git
  • Repository URL → the repository’s URL on the Gogs host

Note: we use the container’s internal host and port in the repo URL, since Jenkins reaches Gogs over the internal docker-compose network

  • Credentials → Add → Jenkins → Kind → Username with password
    Username → <your Gogs username>
    Password → <your password as chosen during Gogs config>
    ID → git-creds

Note: the username and password should correspond to your Gogs git username and password, as created during the configuration of Gogs (see the Appendix).

  • Branches to build → */master
  • Script Path → Jenkinsfile

Configure Gogs Webhook

To configure the Gogs Webhook for the new job,

  • Head back to Gogs
  • Go to the repository’s Settings and choose Webhooks from the left panel
  • For Payload URL, the format is as follows:

    http://<jenkins-host>:<port>/gogs-webhook/?job=<jobname>

which, for our job, uses the Jenkins container’s internal host and port, with <jobname> set to jenkins-deploy-lambda-terraform.


Again, we use the container’s internal network host and port in the URL.

  • Content Type → application/json
  • Secret → <secret for the webhook>
    The secret must match what was entered during the Jenkins job setup for the Gogs Webhook (e.g. mysecretlambda)
  • When should this webhook be triggered? → Just the push event
  • Add Webhook

Once complete, any changes pushed to the repository should automatically trigger the Jenkins job.

We can perform a test from Gogs to ensure the Webhook has been correctly configured. Gogs offers a Test Delivery option which attempts to trigger the job on request.

Test Gogs->Jenkins Webhook

To test the Webhook,

  • Go to Gogs
  • Go to Settings → Webhooks for the repo
  • Choose the Webhook we just configured
  • Click Test Delivery
  • This should trigger the Jenkins job
  • Go back to Jenkins and view the build history for our job
  • The job should automatically deploy the Lambda to Localstack (i.e. the first 4 stages execute automatically)
  • The very last stage (stage 5) deploys the Lambda to the configured AWS Cloud account only when Proceed is selected in response to an interactive prompt. The stage automatically aborts after a timeout duration (60 sec) if a response has not been received
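The prompt-and-timeout behaviour of the final stage can be sketched in a declarative Jenkinsfile along these lines. This is a simplified illustration, not the repository's actual pipeline; the stage names, shell commands and earlier-stage omissions are assumptions:

```groovy
pipeline {
    agent any
    stages {
        // ...packaging and earlier stages omitted...
        stage('Deploy to Localstack') {
            steps {
                // Runs automatically on every build
                sh 'cd aws_local && terraform init && terraform apply -auto-approve'
            }
        }
        stage('Deploy to AWS Cloud') {
            steps {
                // Abort the stage automatically if no response within 60 seconds
                timeout(time: 60, unit: 'SECONDS') {
                    input message: 'Deploy to AWS Cloud?', ok: 'Proceed'
                }
                sh 'cd aws_cloud && terraform init && terraform apply -auto-approve'
            }
        }
    }
}
```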

Test Deployed Lambda using Localstack

With the Lambda successfully deployed to Localstack, we run a test using the supplied input file (it contains 3 sample rows with 5 columns). To trigger the Lambda, copy the file to the S3 input bucket:

$ cd $HOME/jenkins-deploy-lambda-terraform
$ aws --profile aws_local \
--endpoint-url http://localhost:4566 \
s3 cp lambda-input-test-file.csv \
s3://transposer-input/

This should trigger the function in a separate container. The container is not removed after execution completes; this allows for troubleshooting errors by examining the container logs.

If the Lambda executes successfully, a transposed version of the input file should be located at s3://transposer-output/lambda-input-test-file_transposed.csv

A quick look at the contents should show 5 rows, 3 columns:

$ aws --profile aws_local \
--endpoint-url http://localhost:4566 \
s3 cp \
s3://transposer-output/lambda-input-test-file_transposed.csv -



From here onwards, Lambda code changes applied to the local repo, and pushed to the self-hosted Git instance, will automatically trigger the Jenkins pipeline for redeployment.

Deploying to AWS Cloud

Once the Lambda is tested successfully on Localstack, it can be deployed to an AWS Cloud account for further testing by responding with “Proceed” to the interactive prompt at the last stage of the Jenkins pipeline. The additional testing would target aspects that cannot practically be tested in a local environment.


So what do we gain from all this?

My personal take-home points from having gone through the set up:

  • An understanding/appreciation of technologies used in DevOps
  • An insight into Terraform
  • Having my own local environment which allows for more destructive testing
  • The typical AWS resources associated with creating and implementing a Lambda
  • Working with Lambda S3 event triggers
  • Packaging Python-based Lambdas containing non-standard libraries
  • Reducing AWS costs by using Localstack for the majority of testing and keeping Cloud usage to a minimum
  • Minimising manual tasks by incorporating automation tweaks — such as the Gogs Webhook

Why did you use Terraform and not Ansible?

Some may argue that Ansible is a more suitable option for the task, and there are surely several reasons for and against Ansible vs Terraform. This article focused more on learning the “how to” with Terraform, rather than the “why nots”.


Appendix A: Configure Gogs Git Service

Visit the Gogs web UI and complete the initial setup. Below are sample config values.

After selecting Install Gogs, a redirect to the container’s internal port occurs; however, since that internal port is mapped to a different external host port, we need to change the port in the URL accordingly.

Sign up

The first user to go through the signup process is allocated as Administrator.


