Creating a reverse proxy and load balancer in 2 minutes using Caddy

by PressRex
Setting Up Your Computer:

To perform what will be demonstrated in this example, you'll need these tools:

  • Caddy (v2)
  • Node.js
  • autocannon (run via npx, no installation required)

Introduction:

Two well-known approaches in DevOps are reverse proxy and load balancer. Let's understand them:

  • Reverse proxy:

A reverse proxy is a server that receives requests and acts as a middleware layer, applying conditions and overrides to decide which backend server each request should be forwarded to. After that server responds, the reverse proxy is responsible for relaying the response back to the end user.

For example, suppose the user makes a request to localhost:8080 or service2.com.br; the reverse proxy then routes it based on its conditions, which in this case would be:

  • If a header "x-service-1" exists, it will call Service 1
  • If the host is "service2.com.br", it will call Service 2
  • If the path contains "/test", it will call Service 3
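Those three conditions can be expressed directly in a Caddyfile using named matchers. Here's a hedged sketch (the upstream addresses localhost:3001–3003 are placeholders for illustration, not part of the original example; check the Caddy matcher docs for your version):

```
:8080 {
    # If a header "x-service-1" exists (any value), call Service 1
    @service1 header x-service-1 *
    reverse_proxy @service1 localhost:3001

    # If the host is "service2.com.br", call Service 2
    @service2 host service2.com.br
    reverse_proxy @service2 localhost:3002

    # If the path contains "/test", call Service 3
    @service3 path */test*
    reverse_proxy @service3 localhost:3003
}
```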

A use case for using a reverse proxy is when you don't want to expose multiple domains to the internet or when these applications are on a private network. In this scenario, you can leave only your reverse proxy public while keeping the applications private.

Here, all services are private, meaning you cannot access them directly. Instead, you reach them through the reverse proxy using a single public domain: api.domain.com
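To make this concrete, here is a minimal single-upstream reverse proxy sketched in Node.js. The upstream port 8081 is an assumption for illustration; real proxies such as Caddy add routing rules, connection pooling, TLS, and far more robust error handling.

```javascript
import http from "node:http";

// Minimal reverse-proxy sketch: every request is forwarded to one
// fixed upstream, and the upstream's response is relayed back.
const proxy = http.createServer((clientReq, clientRes) => {
    const upstreamReq = http.request(
        {
            host: "127.0.0.1",
            port: 8081, // assumed upstream for illustration
            path: clientReq.url,
            method: clientReq.method,
            headers: clientReq.headers,
        },
        (upstreamRes) => {
            // Relay status, headers, and body back to the original client
            clientRes.writeHead(upstreamRes.statusCode, upstreamRes.headers);
            upstreamRes.pipe(clientRes);
        }
    );
    upstreamReq.on("error", () => {
        clientRes.writeHead(502);
        clientRes.end("Bad Gateway");
    });
    // Stream the request body (if any) to the upstream
    clientReq.pipe(upstreamReq);
});

proxy.listen(8080, () => console.info("Proxy listening on :8080"));
```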

  • Load balancer:

A load balancer is an approach used to handle multiple instances of a single application (or multiple applications under certain conditions, though that setup is not ideal and is typically used only to reduce costs).

A load balancer works much like a reverse proxy; the key difference is that it uses an algorithm to decide which instance to call. Consider an example:

Suppose you have an API with 4 instances. How would you distribute traffic effectively to prevent one instance from being overloaded while another remains idle?

Using a Load Balancer solves this problem, as you can employ the most common algorithm, Round Robin, which is equitable and efficient in distributing traffic.
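A round-robin selector can be sketched in a few lines of JavaScript. This only mimics the policy to show the idea; it is not Caddy's actual implementation:

```javascript
// Round-robin sketch: cycle through the upstream list so that each
// instance receives requests in turn, wrapping around at the end.
const upstreams = [":8081", ":8082", ":8083", ":8084", ":8085"];
let next = 0;

function pickUpstream() {
    const target = upstreams[next];
    next = (next + 1) % upstreams.length;
    return target;
}

// Seven requests in a row: traffic cycles :8081 → :8085, then wraps.
console.log(Array.from({ length: 7 }, pickUpstream).join(" "));
// :8081 :8082 :8083 :8084 :8085 :8081 :8082
```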

Unlike a reverse proxy, a Load Balancer points to a single application, distributing traffic equally and avoiding instance overload.

Another interesting feature of load balancers is health checking: you can specify a route that the balancer polls periodically to check whether each server is up. If a server is not running, the load balancer stops sending traffic to the broken instance.
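In Caddy, this periodic probing is configured with active health checks inside the reverse_proxy block. A sketch (the /healthz path is a hypothetical endpoint; the demo API later in this article does not implement one):

```
:8080 {
    reverse_proxy :8081 :8082 :8083 :8084 :8085 {
        lb_policy round_robin
        # Poll each upstream's /healthz every 10s; upstreams that fail
        # the check are taken out of rotation until they pass again
        health_uri /healthz
        health_interval 10s
    }
}
```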

Are they the same thing?

Definitely not. A load balancer can point to a reverse proxy, which in turn points to N load balancers, among many other possible arrangements.

Putting into Practice:

Now that you've learned about reverse proxy and load balancer, it's time to implement a basic example using Caddy and Node.js.

We'll use Caddy to create a server that will act as a reverse proxy and balance load using round-robin, and Node.js to create multiple instances of the same API.

Starting with Node.js, we'll create two files: server.js and start_workers.js

For server.js (which creates a simple server):

import { workerData } from "node:worker_threads";
import http from "node:http";

const sleep = async (ms) => {
    return new Promise(res => setTimeout(res, ms));
}

const server = http.createServer(async (req, res) => {
    // We use sleep to get a real example of a request that takes 2 seconds to respond
    await sleep(2000);

    const data = `Hello, this is the same API in different instance ${workerData.instanceId}`

    console.info(data);
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({
      data,
    }));
});

server.listen(workerData.port, () => {
    console.info(`Server is running on port ${workerData.port}`);
});

  • It will start on the port passed by the parent process
  • Each call will display a message showing the instance ID

For start_workers.js:

import { Worker } from "node:worker_threads";
import path from "node:path";

for (let i = 0; i < 5; i++) {
    const worker = new Worker(path.resolve(process.cwd(), "server.js"), {
        workerData: {
            port: 8081 + i,
            instanceId: i
        }
    });
}
  • It will create 5 workers
  • Each worker will be an API running on different ports: :8081, :8082, :8083, :8084, :8085
  • Each worker will have a corresponding ID: 0, 1, 2, 3, 4

To use Caddy, we'll create a file called Caddyfile with the following configuration:

{
    admin 0.0.0.0:5555
}

:8080 {
    reverse_proxy :8081 :8082 :8083 :8084 :8085 {
        lb_policy round_robin
    }
}

The above file means we'll receive requests on port :8080, and Caddy should perform reverse proxy to ports :8081, :8082, :8083, :8084, :8085. Additionally, we specify that the Load Balancer will use Round Robin to decide which instance to call.

Running the Project

Once these three files are created, open 3 terminals in the project folder:

First terminal, start the Caddy server:

caddy run

Second terminal, start Node servers:

node start_workers.js

Third terminal, use autocannon to create parallel requests:

npx --yes autocannon http://localhost:8080

This command calls the reverse proxy, which will distribute traffic equally among our 5 Node servers.

In the autocannon summary and the Node logs, you'll see that even without directly calling ports :8081, :8082, :8083, :8084, :8085, the reverse proxy distributed traffic equally without overloading any server.

If you can't reproduce this, you can verify the implementation in the GitHub repository: https://github.com/ekerdev/example-reverse-proxy-and-load-balancer

Author of article: Erik Santana
