
In this tutorial we will look at how to migrate an AWS Lambda function (Node.js) to OpenFaaS.


Why Migrate to OpenFaaS?

Cloud functions are awesome as they are: super cheap and a fit for most use-cases. Then you get OpenFaaS, which has a lot of pros when you compare it with cloud functions.

Below are a couple of pros from my experience using OpenFaaS:

  • Hosting functions on your own infrastructure to comply with data-localization standards.
  • Hosting functions on resources that suit the use-case (CPU-, memory-, or GPU-intensive tasks).
  • You can deploy OpenFaaS on your existing Kubernetes or Docker Swarm cluster.
  • No limits on TTL, which allows long-running functions.
  • You are not locked into a specific cloud provider.
  • A function store and an active community that contributes to it, which is super helpful for bootstrapping your projects.
  • Auto-scaling enabled by default.
  • A range of supported programming languages, and you can even use bash, which is super awesome!
  • Super easy to learn and, in my experience, easier to work with.
  • The CLI client, faas-cli, makes working with OpenFaaS even easier.
  • Grafana, Prometheus and AlertManager come with the framework out of the box, which lets you view metrics for your functions and set up alerting.

In my case, I already have a Docker Swarm cluster running, where the resources are managed by a cloud provider and monitoring, high availability and self-healing are in place.

Being able to run OpenFaaS on my current setup is just amazing and suits my use-case perfectly.


Our end goal: migrate our AWS Lambda function to OpenFaaS.


Our Application

Our serverless application in AWS consists of API Gateway, DynamoDB and Lambda (Node.js).

For this demonstration, I kept the application very basic: it executes a GetItem on our DynamoDB table when we make a GET request to our API Gateway resource.

In this scenario, I have hard-coded the hash-key value to ruan.bekker.

The flow will look like this:

-> API: /dev/person,
-> Lambda calls DynamoDB: {"id": "ruan.bekker"},
-> Response: {"id": "ruan.bekker", "name": "ruan", ...}
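The flow above relies on Lambda's proxy integration with API Gateway: the handler must wrap the item in a statusCode/body envelope, and the HTTP client only ever sees the JSON body. A minimal sketch of that envelope (plain Node.js, no AWS dependencies):

```javascript
// What the Lambda hands back to API Gateway: a proxy-integration envelope.
const item = { id: 'ruan.bekker', name: 'ruan', surname: 'bekker' };

const response = {
  statusCode: 200,
  body: JSON.stringify(item), // the body must be a JSON *string*
};

// What the HTTP client ultimately receives: just the parsed body.
console.log(JSON.parse(response.body)); // { id: 'ruan.bekker', ... }
```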

AWS Setup

For full transparency, I will set up the AWS stack with the Serverless Framework:

$ mkdir -p ~/dev/aws-node-get-dynamodb \
  && cd ~/dev/aws-node-get-dynamodb
$ npm install -g serverless
$ serverless create --template aws-nodejs

Create the lambda function:

$ mkdir function
$ cat function/handler.js

'use strict';

const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.identity = (event, context, callback) => {
  const params = {
    TableName: process.env.DYNAMODB_TABLE,
    Key: {
      id: 'ruan.bekker',
    },
  };

  dynamoDb.get(params, (error, result) => {
    if (error) {
      callback(null, {
        statusCode: error.statusCode || 501,
        headers: { 'Content-Type': 'text/plain' },
        body: 'GetItem Failed',
      });
      return;
    }

    const response = {
      statusCode: 200,
      body: JSON.stringify(result.Item),
    };
    callback(null, response);
  });
};

Our serverless definition file:

$ cat serverless.yml

service: aws-node-get-dynamodb
frameworkVersion: ">=1.1.0 <2.0.0"

provider:
  name: aws
  runtime: nodejs10.x
  environment:
    DYNAMODB_TABLE: my-dynamodb-table
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:GetItem
      Resource: "arn:aws:dynamodb:${opt:region, self:provider.region}:*:table/${self:provider.environment.DYNAMODB_TABLE}"

functions:
  get:
    handler: function/handler.identity
    events:
      - http:
          path: person
          method: get
          cors: true

resources:
  Resources:
    PersonDynamoDbTable:
      Type: 'AWS::DynamoDB::Table'
      DeletionPolicy: Retain
      Properties:
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        TableName: ${self:provider.environment.DYNAMODB_TABLE}

Deploy the stack:

$ serverless deploy --region eu-west-1
Serverless: Packaging service...
Serverless: Excluding development dependencies...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Uploading service file to S3 (7.38 MB)...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
Serverless: Stack update finished...
Service Information
service: aws-node-get-dynamodb
stage: dev
region: eu-west-1
stack: aws-node-get-dynamodb-dev
resources: 12
api keys:
  None
endpoints:
  GET - https://<api-id>.execute-api.eu-west-1.amazonaws.com/dev/person
functions:
  get: aws-node-get-dynamodb-dev-get
Serverless: Run the "serverless" command to setup monitoring, troubleshooting and testing.

Now that our stack is deployed, let's write an item to DynamoDB.

Since the focus is on migration, I have hard-coded the hash key to ruan.bekker, so let's create that item in DynamoDB:

$ aws dynamodb put-item \
  --table-name my-dynamodb-table --item '{
    "id": {"S": "ruan.bekker"},
    "name": {"S": "ruan"},
    "surname": {"S": "bekker"},
    "country": {"S": "south africa"},
    "age": {"N": "32"}
  }'
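Note that put-item uses DynamoDB's low-level attribute-value format ({"S": …}, {"N": …}), while the DocumentClient in our handler returns plain JavaScript objects. A rough sketch of the conversion (unmarshal is a hypothetical helper, just to illustrate the mapping):

```javascript
// Hypothetical helper: convert DynamoDB attribute-value JSON
// (as used by `aws dynamodb put-item`) into the plain object
// shape the DocumentClient returns.
function unmarshal(item) {
  const out = {};
  for (const [key, attr] of Object.entries(item)) {
    if ('S' in attr) out[key] = attr.S;               // string
    else if ('N' in attr) out[key] = Number(attr.N);  // number (stored as string)
    else if ('BOOL' in attr) out[key] = attr.BOOL;    // boolean
  }
  return out;
}

const raw = {
  id: { S: 'ruan.bekker' },
  name: { S: 'ruan' },
  age: { N: '32' },
};

console.log(unmarshal(raw)); // { id: 'ruan.bekker', name: 'ruan', age: 32 }
```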

Make a GET request against the API Gateway URL:

$ curl https://<api-id>.execute-api.eu-west-1.amazonaws.com/dev/person
{"id":"ruan.bekker","surname":"bekker","name":"ruan","country":"south africa","age":32}

And you can see the item has been retrieved from DynamoDB.

Setup the OpenFaaS Function

Create a new Node.js OpenFaaS function (note that I have set my own image prefix and gateway URL for my setup, as shown below):

$ mkdir -p ~/dev/lambda-to-openfaas-migration \
  && cd ~/dev/lambda-to-openfaas-migration
$ faas-cli new \
  --lang node person \
  --prefix=ruanbekker \
  --gateway=https://<your-openfaas-gateway-url>
$ mv person.yml stack.yml

In my scenario, I will create the AWS Access Keys and Secret Keys as OpenFaaS Secrets:

$ faas-cli secret create my-aws-access-key --from-literal="your-access-key"
$ faas-cli secret create my-aws-secret-key --from-literal="your-secret-key"

Provide the aws-sdk dependency in our package.json, which we will require to interact with AWS:

$ cat person/package.json
{
  "name": "function",
  "version": "1.0.0",
  "description": "",
  "main": "handler.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "aws-sdk": "latest"
  }
}

Our stack definition:

$ cat stack.yml

provider:
  name: openfaas
  gateway: https://<your-openfaas-gateway-url>

functions:
  person:
    lang: node
    handler: ./person
    image: ruanbekker/person:latest
    environment:
      content_type: application/json
      DYNAMODB_TABLE: my-dynamodb-table
      AWS_REGION: eu-west-1
    secrets:
      - my-aws-access-key
      - my-aws-secret-key
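The handler will pick up DYNAMODB_TABLE and AWS_REGION from this environment block via process.env. A small defensive-read sketch (requireEnv is my own hypothetical helper, not part of any framework):

```javascript
// Read a required configuration value from the environment,
// failing fast with a clear error when it is missing.
function requireEnv(name, fallback) {
  const value = process.env[name] || fallback;
  if (value === undefined) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Simulated environment for this sketch:
process.env.DYNAMODB_TABLE = 'my-dynamodb-table';
delete process.env.AWS_REGION; // force the fallback path below

console.log(requireEnv('DYNAMODB_TABLE'));          // my-dynamodb-table
console.log(requireEnv('AWS_REGION', 'eu-west-1')); // eu-west-1 via fallback
```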

We still have our AWS Lambda function code from our initial setup, but let's say our stack was already provisioned and we don't have a local copy of it.

We will download the lambda deployment package:

$ mkdir aws-lambda \
  && cd aws-lambda
$ lambda_url=$(aws lambda get-function \
    --function-name aws-node-get-dynamodb-dev-get | jq -r .Code.Location)
$ curl -o deployment-package.zip "${lambda_url}"

Extract the deployment package and replace the generated OpenFaaS handler with the Lambda function handler:

$ unzip deployment-package.zip
$ cd ..
$ mv aws-lambda/function/handler.js person/handler.js
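The main mechanical change is the handler signature: Lambda uses (event, context, callback), while the OpenFaaS Node.js template uses (context, callback) and returns the payload directly rather than an API Gateway envelope. A hypothetical adapter (toOpenFaaS is my own name, not part of either framework) illustrating the mapping:

```javascript
// A Lambda-style handler: (event, context, callback), returning a proxy envelope.
const lambdaHandler = (event, context, callback) => {
  callback(null, { statusCode: 200, body: JSON.stringify({ id: event.id }) });
};

// Hypothetical adapter producing an OpenFaaS-style handler: (context, callback).
function toOpenFaaS(handler) {
  return (context, callback) => {
    // The OpenFaaS template passes the request payload as `context`;
    // Lambda calls the same thing `event`.
    handler(context, {}, (err, res) => {
      if (err) return callback(err);
      // Unwrap the API Gateway envelope: OpenFaaS returns the body directly.
      callback(null, JSON.parse(res.body));
    });
  };
}

const faasHandler = toOpenFaaS(lambdaHandler);
faasHandler({ id: 'ruan.bekker' }, (err, res) => {
  console.log(res); // { id: 'ruan.bekker' }
});
```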

Next we will need to modify our handler to include our secrets and environment variables:

$ cat person/handler.js

'use strict';

const fs = require('fs');
const secretAK = "/var/openfaas/secrets/my-aws-access-key";
const secretSK = "/var/openfaas/secrets/my-aws-secret-key";
const accessKey = fs.readFileSync(secretAK, "utf-8");
const secretKey = fs.readFileSync(secretSK, "utf-8");

const AWS = require('aws-sdk');
AWS.config.update({
  region: process.env.AWS_REGION,
  credentials: new AWS.Credentials({
    accessKeyId: accessKey,
    secretAccessKey: secretKey
  })
});

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports = (context, callback) => {
  const params = {
    TableName: process.env.DYNAMODB_TABLE,
    Key: {
      id: 'ruan.bekker',
    },
  };

  dynamoDb.get(params, (error, result) => {
    if (error) {
      callback(null, {
        statusCode: error.statusCode || 501,
        headers: { 'Content-Type': 'text/plain' },
        body: 'GetItem Failed',
      });
      return;
    }

    const response = result.Item;
    callback(null, response);
  });
};

Deploy our OpenFaaS function:

$ export OPENFAAS_URL=https://<your-openfaas-gateway-url>
$ faas-cli up
Deploying: person.
Deployed. 202 Accepted.

Let's test our function by making a GET request against our OpenFaaS gateway URL:

$ curl $OPENFAAS_URL/function/person
{"id":"ruan.bekker","surname":"bekker","name":"ruan","country":"south africa","age":32}

Boom! We have migrated our AWS Lambda function to OpenFaaS, easy as that.