How to deploy an AWS Lambda REST API using Chalice and AWS CodePipeline

This article shows how to deploy an AWS Lambda function using the Chalice library and AWS CodePipeline.

First, we'll define the lambda function. Then we'll write the deployment script and the CodePipeline step that runs it.

AWS Lambda with Chalice

We have to create a new Chalice application and wrap our lambda function in the route decorator. We’ll put the Python code in the app.py file.

from chalice import Chalice

app = Chalice(app_name="predictor")

@app.route("/", methods=["POST"])
def index():
    return ""
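
The handler above returns an empty string just to prove that the deployment works. In a real predictor, we would read the JSON request body via app.current_request.json_body and return the prediction. Here is a minimal sketch; run_model is a hypothetical stand-in for the actual prediction code:

from chalice import Chalice

app = Chalice(app_name="predictor")

def run_model(features):
    # hypothetical placeholder for the real prediction logic
    return sum(features.get("values", []))

@app.route("/", methods=["POST"])
def index():
    # current_request.json_body contains the parsed JSON payload of the POST request
    payload = app.current_request.json_body or {}
    return {"prediction": run_model(payload)}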

In addition to the Python code, we must specify the configuration in the .chalice/config.json file:

{
  "version": "2.0",
  "app_name": "lambda-function-name",
  "autogen_policy": false,
  "automatic_layer": true,
  "environment_variables": {
      "NAME_OF_ENV_VARIABLE": "ITS VALUE"
  },
  "stages": {
    "dev": {
      "api_gateway_stage": "api"
    }
  }
}
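
Chalice passes the environment_variables section to the Lambda function configuration, so the handler can read the values like any other environment variable. A minimal sketch, assuming NAME_OF_ENV_VARIABLE from the config above holds, say, a model path:

import os

# Values from "environment_variables" in .chalice/config.json end up as
# regular Lambda environment variables
model_path = os.environ.get("NAME_OF_ENV_VARIABLE")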

If the lambda function requires any IAM permissions, we put the IAM policy definition in the .chalice/policy-dev.json file. Note that the dev part of the file name refers to the Chalice stage (the key under stages in .chalice/config.json), so if we want to use a different stage, we have to define it in the config and rename the policy file to match.
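
For example, a minimal policy-dev.json that only lets the function write its CloudWatch logs could look like the snippet below; adjust the statements to whatever AWS services your function actually calls:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}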

Finally, we have to create a requirements.txt file listing the lambda function's dependencies. Be careful because there is a 50 MB limit on the zipped AWS Lambda deployment package (your code plus all dependencies) and a 250 MB limit on the unzipped package.
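
If you are not sure whether you fit under the limit, you can build the package locally before deploying and inspect its size (the output directory name below is arbitrary):

# Build the deployment package locally without deploying it
chalice package packaged/
ls -lh packaged/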

Deployment script

Before we start, we create the requirements-dev.txt file containing the libraries used for deploying the code:

-r requirements.txt

boto3==1.14.12
chalice==1.20.0

After defining the development dependencies, we create the deploy_lambda.sh file in which we use Chalice to deploy our code:

#!/bin/bash

# Deploy the application using the "dev" stage defined in .chalice/config.json
chalice deploy --stage dev

Expected directory structure

When we finish, the directory structure should look like this:

lambda
  - .chalice
    - config.json
    - policy-dev.json
  - app.py
  - deploy_lambda.sh
  - requirements.txt
  - requirements-dev.txt

AWS CodePipeline deployment step

To deploy the code, we define an AWS CodeBuild project in the pipeline's CloudFormation template. CodePipeline will run it as a build action:

EndpointDeploymentProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: !Sub ${AWS::StackName}-pipeline-endpointdeployment
      Description: Deploys a Lambda
      ServiceRole: !GetAtt CodeDeploymentRole.Arn
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        Type: LINUX_CONTAINER
        ComputeType: BUILD_GENERAL1_SMALL
        Image: aws/codebuild/python:3.6.5
      Source:
        Type: CODEPIPELINE
        BuildSpec: !Sub |
          version: 0.2
          phases:
            pre_build:
              commands:
                - echo "Installing requirements"
                - pip install --upgrade pip
                - pip install -r lambda/requirements-dev.txt
            build:
              commands:
                - echo "Running Chalice"
                - cd lambda
                - bash deploy_lambda.sh
            post_build:
              commands:
                - echo "Deployed"
          artifacts:
            files:
              - '**/*'
      TimeoutInMinutes: 30

Besides the build step definition, we must define the code deployment role and its permissions. CodeBuild runs chalice deploy with this role's credentials, which is why it needs the Lambda, API Gateway, and IAM permissions listed below. It is better to limit the permissions as much as possible, but if you don't care about configuring fine-grained permissions, you can use this configuration:

CodeDeploymentRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: !Sub ${AWS::StackName}-codedeploy-role
      AssumeRolePolicyDocument:
        Statement:
          - Action: ["sts:AssumeRole"]
            Effect: Allow
            Principal:
              Service: [codebuild.amazonaws.com]
        Version: "2012-10-17"
      Path: /
      Policies:
        - PolicyName: UploadAccess
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Action:
                  - codepipeline:*
                  - s3:*
                  - logs:*
                  - iam:GetRole
                  - iam:CreateRole
                  - iam:PutRolePolicy
                  - iam:PassRole
                  - lambda:*
                  - apigateway:*
                Effect: Allow
                Resource: "*"

Finally, we add a stage with the deployment action to the AWS CodePipeline definition:

- Name: Deploy_Endpoint
  Actions:
    - Name: EndpointDeployment
      ActionTypeId:
        Category: Build
        Owner: AWS
        Provider: CodeBuild
        Version: "1"
      Configuration:
        ProjectName: !Ref "EndpointDeploymentProject"
      InputArtifacts:
        - Name: dpl
      OutputArtifacts:
        - Name: enddpl
      RunOrder: "4"
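
Once the pipeline runs, chalice deploy prints the Rest API URL in the CodeBuild logs, so we can smoke-test the endpoint with a plain POST request. The URL below is a placeholder; use the one from your build output:

# Replace the URL with the "Rest API URL" printed by chalice deploy
curl -X POST -H "Content-Type: application/json" -d '{}' \
  https://abc123.execute-api.eu-west-1.amazonaws.com/api/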

Have you noticed that I deploy the code using a CodeBuild step? If you know how to get the same result using CodeDeploy, let me know!

Did you enjoy reading this article?
Would you like to learn more about software craft in data engineering and MLOps?

Subscribe to the newsletter or add this blog to your RSS reader (does anyone still use them?) to get a notification when I publish a new essay!
