NGINX Gateway On Heroku


Control access, monitor traffic, and manage your API with 3scale

At 3scale, we recommend using NGINX as an API proxy for several reasons, among them its outstanding performance and its extensibility through Lua scripting. When used to integrate with the 3scale API Management Platform, it's hands-down the best way to add an access control layer to your existing API.

The following tutorial describes the steps required to deploy NGINX as an API gateway proxy on the Heroku platform. Heroku provides a fantastic fully managed platform as a service for your application, so the maintenance effort on your part will be minimal. Since NGINX is so lightweight, the free offering from Heroku will be enough for most cases.

High-level overview

Create a Heroku application with a custom buildpack including Lua and LuaRocks. Deploy the OpenResty distribution of NGINX using a Lua rock specifically tailored for Heroku. Then use the NGINX configuration files generated by 3scale from your Admin Portal, and tweak them to work properly with your NGINX on Heroku.


Prerequisites

  • 3scale account – sign up here
  • Heroku account
  • The Heroku toolbelt must be installed on your computer – get it here


Tutorial steps

  • Set up your API in 3scale and download the auto-generated NGINX config files
  • Clone the Git repository with template files
  • Create an empty Heroku app
  • Copy your 3scale NGINX config files into your project directory to replace the templates
  • Modify the NGINX config files to adapt them for deploying in Heroku
  • Push the Git repository to deploy NGINX on Heroku
  • Check that everything works

This tutorial will show you how to use NGINX hosted in Heroku as a proxy to integrate your API with 3scale. If you already have an API, you can use it. Otherwise, jump now to the "Deploying a sample API to Heroku" section of this tutorial, where you will deploy a test API in minutes.

Once you have an API to use with the proxy, you can carry on from here.

Create an empty Heroku app

You will need to deploy the OpenResty web app server, which is essentially a bundle of the standard NGINX core with almost all the necessary third-party NGINX modules built in.

For an easy start, you can fork and clone the repository at

git clone

This repository contains some boilerplate files required by Heroku. Later on, you will only need to modify the nginx.sample.conf and nginx_3scale_access.lua files. You will also see that a package.rockspec file is included. This is a special type of file that defines Lua dependencies and makes sure that the declared dependencies are fetched and installed when you push your repository to the server. There is only one dependency in this case: OpenResty.

After cloning the repository, change into the directory that was just created, "api-proxy-3scale-heroku".
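To give a flavor of what such a dependency file contains, here is a minimal hypothetical rockspec sketch – the package name, version, and dependency string below are illustrative placeholders; the real values live in the file shipped with the repository:

```lua
-- Hypothetical sketch of a rockspec file (real names/versions may differ)
package = "api-proxy-3scale-heroku"
version = "0.1-1"
source = {
  url = "..." -- location of this repository
}
dependencies = {
  -- the single dependency mentioned above: the OpenResty rock
  "openresty"
}
```

When you push to Heroku, the Lua buildpack reads this file and lets LuaRocks fetch and build everything listed under dependencies.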

Heroku doesn't support the Lua runtime by default, so you will need to create the application using a custom buildpack that includes:

  • Lua 5.1
  • LuaRocks, a Lua package manager

All you have to do is execute this:

heroku create --buildpack

The output of this command will show the URL of your new app, with a random name generated by Heroku, for example,

This is where you will be able to reach your NGINX API proxy.

Heroku create command output


If you plan to implement the OAuth 2.0 Server-side/Authorization code flow, you will also need to install Redis on your Heroku instance.

For this, we recommend using the Redis To Go add-on. To install this, execute:

heroku addons:add redistogo

Configure and deploy 3scale API gateway

Now you've created an empty Lua application in Heroku. Before deploying NGINX with your configuration, you'll need to replace the nginx.sample.conf and nginx_3scale_access.lua with your own NGINX configuration files, which you'll download from 3scale.

Go through the following steps to download your customized configuration files:

  • Sign in to your 3scale account.
  • Go to the API(s) section.
  • Select the service (it will be named "API" if you only have one).
  • Click on the Integration link.
  • Make sure that the Deployment Option is set to Self-managed Gateway. If it's not, click on edit integration settings in the top right corner of the Integration page and select Self-managed Gateway as the production deployment option.
    If you are planning to use OAuth, make sure you select the OAuth authentication option.

There will be some fields that you'll have to fill in with information about your API. This data will be used to automatically generate configuration files for NGINX already customized to integrate your API with 3scale.

The most important field right now is the one at the top of the Staging section, Private Base URL. Here, type the complete URL of your API service, including the port, for instance:

Click on the Update & Test Staging Configuration button to save the settings.

If you need further guidance on how to fill the rest of these fields, visit this tutorial.

In case your API is also hosted on Heroku, you will also need to fill the Host Header field. Check the previously linked tutorial to find out more about this.

In the Production section at the bottom, fill in the Public Base URL field with the URL of the gateway that you created in the previous step.

Finally, click on Update Production Configuration to save the settings, and after that you can download the configuration files by clicking on Download the NGINX Config files.

NGINX proxy configuration screen

The files you just downloaded will require a few changes.

  • Move them into your "api-proxy-3scale-heroku" directory.
  • Delete the existing nginx.sample.conf and nginx_3scale_access.lua files.
  • Rename your new .conf file, downloaded from 3scale, to nginx.conf.
  • Modify nginx.conf:
    • Add the following two lines at the top of the file:
      daemon off;
      error_log stderr;
    • Replace listen 80; at the beginning of the server block with listen ${{PORT}};. This is necessary because Heroku can change the port where your process is listening; this way, the port will be filled in from an environment variable.
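Putting the list above together, the top of the adapted nginx.conf would look roughly like this – everything marked "..." stands for the content generated by 3scale, which stays as downloaded:

```nginx
daemon off;          # Heroku expects the process to run in the foreground
error_log stderr;    # send errors to stderr so `heroku logs` captures them

http {
  # ... configuration generated by 3scale ...
  server {
    listen ${{PORT}};  # placeholder filled in from the PORT environment variable
    # ...
  }
}
```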

Set the port of the gateway as an environment variable:

heroku config:set PORT=80

Now you're all set and ready to deploy your customized NGINX. Commit the changes and push them to Heroku:

git add .
git commit -m "deploying proxy with custom configuration from 3scale"
git push heroku master


Additionally, for OAuth 2.0 server-side/authorization code flow, you will also need to make some changes to configure the connection to your Redis instance.

To use Redis To Go we need to do the following:

  1. Extract the host and port from the REDISTOGO_URL environment variable.
  2. Authenticate using the password provided.

You will want to replace the existing connect_redis function in threescale_utils.lua with the following to connect to your Redis To Go instance:

function M.connect_redis(red)
  -- Redis To Go exposes all connection details in a single URL, e.g.
  -- redis://redistogo:password@host.redistogo.com:9402/
  local redisurl = os.getenv("REDISTOGO_URL")
  local redisurl_connect = string.split(redisurl, ":")[3]
  local redisurl_user = string.split(redisurl_connect, "@")[1]
  local redisurl_host = string.split(redisurl_connect, "@")[2]
  local redisurl_port = tonumber(string.split(redisurl, ":")[4])
  local ok, err = red:connect(redisurl_host, redisurl_port)
  if not ok then
    ngx.say("failed to connect: ", err)
    return ok, err
  end
  -- the part before the "@" is the password used for authentication
  local res, err = red:auth(redisurl_user)
  if not res then
    ngx.say("failed to authenticate: ", err)
  end
  return ok, err
end
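As a quick sanity check of the splitting that the function above performs, the same extraction can be sketched in plain shell against a made-up URL. The host, port, and password below are illustrative placeholders, not real credentials:

```shell
# Hypothetical Redis To Go URL (the real value comes from the
# REDISTOGO_URL config variable on your Heroku app):
REDISTOGO_URL="redis://redistogo:s3cretpass@viperfish.redistogo.com:9402/"

# Strip the scheme, then split into credentials and host:port
stripped="${REDISTOGO_URL#redis://}"       # redistogo:s3cretpass@viperfish...
credentials="${stripped%%@*}"              # redistogo:s3cretpass
hostport="${stripped#*@}"                  # viperfish.redistogo.com:9402/
password="${credentials#*:}"               # s3cretpass
host="${hostport%%:*}"                     # viperfish.redistogo.com
port="${hostport#*:}"; port="${port%%/*}"  # 9402

echo "host=$host port=$port password=$password"
```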

You will also need to declare the environment variable at the top of nginx.conf:
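Presumably the declaration in question is NGINX's standard env directive, which whitelists an environment variable so that worker processes can read it (by default NGINX clears the environment at startup):

```nginx
# At the top of nginx.conf, in the main context (outside any block):
env REDISTOGO_URL;
```

Without this line, the os.getenv("REDISTOGO_URL") call in the Lua code above would return nil.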

To check the value of the REDISTOGO_URL environment variable you can use the following command:
heroku config:get REDISTOGO_URL


Now you have NGINX adding an access control layer on top of your API. Make sure that it's working properly by making an authenticated request to your API.

Go back to the 3scale Admin Portal, and get the credentials of one of the sample applications that you created when you first logged in to your 3scale account (if you missed that step, create a developer account and an application within that account).

If your API endpoint is:

To use the API proxy, you should now make a request to the following URL:
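For illustration only, here is how such a request could be put together – the gateway host, endpoint path, and key below are hypothetical placeholders; substitute your own values:

```shell
# Hypothetical values – replace with your own:
GATEWAY="https://young-gorge-1234.herokuapp.com"  # your NGINX gateway app URL
USER_KEY="REPLACE_WITH_A_REAL_USER_KEY"           # credentials from 3scale

# With user_key authentication, the credentials travel as a query
# parameter appended to the normal endpoint path:
url="${GATEWAY}/v1/word/awesome.json?user_key=${USER_KEY}"
echo "$url"
# To send the request: curl "$url"
```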

Deploying a sample API to Heroku

To test your API proxy, you will need an API backend. Here are the steps to deploy an API in Heroku. This tutorial will use the Sentiment API, an API that returns the sentiment rating of a given word.

Bear in mind that you need to create an additional Heroku application to host this API. You cannot have NGINX and the API in the same Heroku application.

The basic endpoint of this API is:

The previous request would return:


We have already prepared a repository containing the Sentiment API and the required configuration files to deploy it in Heroku. You can find this repository here:

If you prefer to deploy your own API to Heroku, the way you should prepare it depends on your programming language of choice. You can find all the information on the Heroku Dev Center.

To deploy the Sentiment API on Heroku:

  • Clone the repository onto your computer: git clone
  • Change into the repository directory that was created on your computer, "sentiment-api-heroku".
  • Run heroku create. The console output will show a URL. This will be your API domain.
  • Run git push heroku master.

To test that the API was deployed successfully, you can make a request like the following one to the URL that was generated after executing heroku create:

Of course, this is completely bypassing the access control layer. Note that once your API gateway is in place, you will want to make sure that public access to your API is blocked. For this purpose, you can define your own "proxy secret token" in your 3scale Integration Proxy Wizard under advanced settings. Then your backend app simply verifies that the shared secret is correct.
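As a sketch of the shared-secret approach, the gateway can attach the secret as a request header when proxying to the backend. The header name, value, and upstream below are illustrative placeholders, not the wizard's actual output:

```nginx
# Inside the location block that proxies traffic to the backend:
location / {
  # ... 3scale authorization logic ...
  proxy_set_header X-Proxy-Secret-Token "my-shared-secret";  # placeholder name/value
  proxy_pass http://your-api-backend;                        # placeholder upstream
}
```

The backend then only needs to check that incoming requests carry the expected header value, and reject any that don't – those did not come through the gateway.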


If there are any problems after deploying to Heroku, it's always a good idea to check the production logs. You can do so by running the following command:

heroku logs

During debugging, it may be useful to temporarily increase the verbosity of error logging. You just need to edit the line that was added at the top of the nginx.conf file and append the desired level – all possible options are described here:

error_log stderr info;

For further help, we strongly recommend checking the Heroku Dev Center.


Full credit goes to Taylor Brown for proposing this solution as an alternative to deploying NGINX on Amazon EC2. Check it out here.

Also many thanks to Leaf Corcoran, creator of the Lua buildpack for Heroku and the awesome OpenResty Lua rock.