Set Up Varnish as an API Proxy
At the end of this how-to you will have set up a Varnish instance on your server and connected it to 3scale's traffic management system.
The architecture for API delivery via Varnish is shown in Figure 1. No changes are required to application software in this deployment model, and Varnish instances (one or many, depending on the number of data centers or load-balancing requirements) are deployed in front of the application.
The 3scale Varnish module (available on GitHub) reserves part of the Varnish cache space for traffic control storage.
Once the 3scale Varnish module is installed, Varnish checks the validity of each API request against 3scale. If the request is authorized, Varnish serves the result from its cache or fetches the data from your origin server; finally, Varnish reports the call to 3scale to maintain consistent state and real-time analytics.
By default, the connection between your local Varnish proxy and 3scale's backend is asynchronous, using an eventual consistency model:
- Each time a call reaches the API, Varnish first tries to authorize from cached data.
- After the response has been served to the API client, a background call is triggered to 3scale to update the current policy status in the cache.
This configuration means that for cached keys, API traffic is served without interruption from traffic management roundtrips; furthermore, the cached data is kept fresh by background calls that add no wait time for API clients.
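The asynchronous reporting step can be sketched in VCL as follows. This is an illustrative fragment only, not the module's shipped configuration: it assumes the module is imported as threescale and reuses the threaded call shown in the module readme (VCL 3.x syntax; header names and the su1.3scale.net endpoint are taken from the sample configuration).

```
import threescale;

sub vcl_deliver {
    # The client response has already been prepared at this point, so the
    # threaded call below reports the hit to 3scale in the background
    # without adding latency to the client request.
    set resp.http.X-tmp = threescale.send_get_request_threaded("su1.3scale.net", "80", req.url, "");
}
```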
Step by Step
Step 1: Download Varnish and the 3scale Module
For this configuration you require:
- The latest version of Varnish (version 3.0 or above): http://www.varnish-cache.org.
- The 3scale Varnish Module: https://github.com/3scale/libvmod-3scale/.
Step 2: Configure the Module and VCL Script
The module is configured and compiled as described in the module readme. Once compiled, it is imported into your VCL configuration as follows (note that this is a sample only; see the bundle readme for precise instructions):
set req.http.X-tmp = threescale.send_get_request_threaded("su1.3scale.net", "80", req.url, "");
set req.http.X-tmp = threescale.send_get_request("su1.3scale.net", "80", req.url, "X-ur-header: true;");
In addition, a mapping file is required to determine which calls are checked against rate limits and reported to 3scale. A sample mapping file is provided in the VCL directory of the module download: Mapping File Sample.
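As a hypothetical sketch of how such a mapping might look in VCL itself (the actual format and rules live in the sample mapping file shipped with the module), calls could be filtered by URL pattern before being reported, so that only API paths count against rate limits. The /api/ prefix here is a placeholder.

```
sub vcl_recv {
    # Only report calls under /api/ to 3scale; all other traffic bypasses
    # the traffic management layer entirely. Illustrative only: consult
    # the module's mapping file for the supported rule syntax.
    if (req.url ~ "^/api/") {
        set req.http.X-tmp = threescale.send_get_request_threaded("su1.3scale.net", "80", req.url, "");
    }
}
```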
Step 3: Add Varnish into your Traffic Flow Pipeline
Once Varnish is set up, you'll need to ensure that API traffic reaches Varnish before it reaches your application.
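In the in-band (proxy) model this is typically done by pointing Varnish's backend at the application origin and directing clients (via DNS or a load balancer) at Varnish. A minimal backend declaration might look like this; the hostname and port are placeholders for your own origin server.

```
# Example only: route API traffic client -> Varnish -> origin by declaring
# the application server as Varnish's backend (VCL 3.x syntax).
backend origin {
    .host = "api.example.com";   # placeholder: your application host
    .port = "8080";              # placeholder: your application port
}
```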
In-Band and Out-of-Band Modes
Varnish can be used either as described above, as a proxy filter for API traffic, or as a stand-alone out-of-band system. An out-of-band configuration is shown in Figure 2. In this mode, calls are made to the Varnish instance by the API application itself or by another component such as a load balancer, and the API traffic does not flow directly through Varnish.
API Content Caching and Varnish
The default Varnish setup described here adds an authentication and reporting layer in front of the API; it does not require Varnish to cache the API responses themselves. Response caching is an independent choice, and if desired, the standard cache controls described on the Varnish website (http://www.varnish-cache.org) can be used.
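If you do decide to cache responses as well, a minimal starting point might look like the fragment below. This is an assumption-laden sketch, not a recommendation: it uses VCL 3.x syntax (vcl_fetch; in Varnish 4+ this became vcl_backend_response), and the five-minute TTL is an arbitrary example value to adjust to your API's freshness requirements.

```
sub vcl_fetch {
    # Cache successful GET responses for five minutes; everything else
    # falls through to Varnish's default caching behavior.
    if (req.request == "GET" && beresp.status == 200) {
        set beresp.ttl = 5m;
    }
}
```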