If you are using nginx proxy_pass as the front end for your node app, you should also be taking advantage of nginx’s ability to serve static content much more efficiently than node. This takes quite a bit of load off your node app, which shouldn’t be spending its time serving the hundreds of static files people are requesting, and it should also speed up your page load times.

The config looks like this:

server {
    listen 80;
    server_name ksloan.net;

    root /home/kevin/app/public;

    location / {
        # Try to serve the request as a static file from the root
        # directory first; fall back to the node app if no file matches.
        try_files $uri @backend;
    }

    location @backend {
        # Anything that isn't a static file is proxied to the node app.
        proxy_pass http://127.0.0.1:8080;
        # Don't write access log entries for proxied requests.
        access_log off;

        # Use HTTP/1.1 and pass the Upgrade headers through so
        # WebSocket connections can reach node.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        # Forward the original Host header to the node app.
        proxy_set_header Host $host;
        # Strip the X-Frame-Options header from backend responses.
        proxy_hide_header X-Frame-Options;
    }
}

Let’s walk through what this is doing…

First, we’re setting up the server to listen on the standard HTTP port 80 for people visiting ksloan.net.

We set the root directory to the location of our public files. This directory is part of my node application structure, and it’s where I keep the static files I want users to have access to: JavaScript, stylesheets, images, and third-party libraries. Your public directory could look something like this:

public/
    js/
    css/
    img/
    lib/

Keeping your public files in their own directory lets nginx check that one folder to see whether the user is requesting a file that lives there. If they are, nginx can serve that file itself without involving your node server at all.

The first location block tells nginx that for every request, it should first try to serve the matching file from the public directory, and if that fails, hand the request over to the second location block. @backend is a named location, which is what allows us to redirect requests this way.

The $uri variable contains the path the user is requesting. So if someone requests http://ksloan.net/css/index.css, $uri would be /css/index.css. nginx appends that to the root directory, and if /home/kevin/app/public/css/index.css exists, it serves that file immediately and ends the request.
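To make that concrete, here’s the same location block again with the resolution order spelled out in comments. Nothing new here, just an annotated view of the config above:

location / {
    # For a request to /css/index.css, try_files checks in order:
    #   $uri      ->  root + /css/index.css
    #                 = /home/kevin/app/public/css/index.css
    #                 (served directly if it exists)
    #   @backend  ->  otherwise, the named location that proxies to node
    try_files $uri @backend;
}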

When someone requests a path that doesn’t exist as a file, for example http://ksloan.net/register, nginx checks for /home/kevin/app/public/register, and since that file doesn’t exist, it forwards the request to your node app like normal.

Depending on how many static files your website serves, you should see a dramatic decline in the number of requests hitting node.js, and a decrease in your page load times!

There are several other things you can do to speed up your website. If it’s under heavy load, you’ll want to enable both server-side and client-side caching in nginx, or possibly look into using a CDN. However, this setup works well enough for the majority of websites handling moderate traffic.
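For example, client-side caching of the static files can be turned on by adding an expires header in an extra location block. Here’s a minimal sketch, where the matched extensions and the 7-day lifetime are just placeholder choices you’d tune for your own assets:

location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
    # Tell browsers they can cache these static assets for a week.
    expires 7d;
    add_header Cache-Control "public";
    # Still fall back to node if the file doesn't exist on disk.
    try_files $uri @backend;
}

Server-side caching of the node app’s responses can be done with nginx’s proxy_cache directives, but whether that’s appropriate depends entirely on what your app serves.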