[Linux] Using Nginx to Reverse Proxy and Forward Service Requests to Different Ports on a Single Port

Last Updated on 2024-08-04 by Clay

Introduction

Nginx is a high-performance HTTP server and reverse proxy server often used for web traffic management, load balancing, and HTTP caching. This article focuses on how to configure Nginx to route different API requests to the corresponding services.

In a real-world scenario, I ran into the following problem: public-facing servers and online services are constantly probed by malicious bots looking for vulnerabilities, and the firewall protects them by restricting which ports are publicly exposed.

A very effective approach is to host the individual services on different internal ports and have Nginx proxy requests to them based on the API path. This way, the firewall only needs to open a single port, and Nginx handles the reverse proxying for all the services behind it.

Diagram of Nginx Request Forwarding

Let's go through the steps to implement this locally:

  1. Start two small services using Python FastAPI + Uvicorn
  2. Set up the Nginx proxy service
  3. Test the forwarding effect

Creating a Test API Service with Python

First, install the required fastapi and uvicorn packages.

pip install fastapi uvicorn


Next, write a test API service (saved here as fastapi_service.py). Command-line arguments let us start multiple instances, each serving a different path on a different port.

from fastapi import FastAPI
import uvicorn
import argparse


def create_app(api_name: str, port: int) -> FastAPI:
    app = FastAPI()

    @app.get(f"/api/{api_name}")
    async def read_root():
        return {"message": f"Hi, I am the endpoint /api/{api_name} on port {port}"}

    return app


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="FastAPI server")
    parser.add_argument("--port", type=int, default=8000, help="Port to run the server on")
    parser.add_argument("--api-name", type=str, required=True, help="API name to be used in the endpoint")
    args = parser.parse_args()

    app = create_app(
        api_name=args.api_name,
        port=args.port,
    )
    uvicorn.run(app, host="0.0.0.0", port=args.port)


Then we start the two services, each in its own terminal:

python3 fastapi_service.py --api-name service1 --port 8001
python3 fastapi_service.py --api-name service2 --port 8002


Output:

INFO:     Started server process [481342]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8001 (Press CTRL+C to quit)


And for the second service:


INFO:     Started server process [452845]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8002 (Press CTRL+C to quit)

We now have two test services running, exposing the endpoints /api/service1 and /api/service2, respectively.

You can use the curl command to confirm that both are working properly.

curl "http://127.0.0.1:8001/api/service1"


Output:

{"message":"Hi, I am the endpoint /api/service1 on port 8001"}

Setting Up the Nginx Proxy Service

With multiple services running, we now set up Nginx as the proxy that routes every API request to the correct backend address.

First, we install Nginx:

sudo apt update
sudo apt install nginx


Then create the configuration file /etc/nginx/conf.d/default.conf (any *.conf file in this directory is included by the main nginx.conf):

sudo vim /etc/nginx/conf.d/default.conf


And write the configuration:

upstream service1 {
    server 127.0.0.1:8001;
}

upstream service2 {
    server 127.0.0.1:8002;
}

server {
    listen 443;

    location /api/service1 {
        proxy_pass http://service1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /api/service2 {
        proxy_pass http://service2;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}


The upstream blocks in this configuration define where the backend servers live; in other words, they tell Nginx where to forward the API requests.
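
Incidentally, an upstream block may list more than one backend server, in which case Nginx load-balances between them (round-robin by default). A minimal sketch, assuming a hypothetical second instance of service1 were running on port 8003:

upstream service1 {
    server 127.0.0.1:8001;
    server 127.0.0.1:8003;   # hypothetical second instance of the same service
}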

The server block tells Nginx to listen on port 443, so any request arriving on that port is handled by this server. Note that in this example Nginx serves plain HTTP on port 443 and does not terminate TLS; adding an SSL certificate is one of the topics mentioned at the end of this article.

The location blocks are the crucial routing rules: requests whose path starts with /api/service1 are proxied to the service1 upstream (port 8001), and requests whose path starts with /api/service2 are proxied to the service2 upstream (port 8002).
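
One detail worth noting: because proxy_pass above is written without a URI part, Nginx forwards the original request path unchanged, so /api/service1 reaches the backend as /api/service1, which is exactly what our FastAPI routes expect. If you instead wanted to strip the prefix, a proxy_pass with a URI does that, as in the following illustrative snippet (which would not suit our test services, since their routes include the /api/<name> prefix):

    location /api/service1/ {
        # With a URI in proxy_pass, the part of the request path that matched
        # the location ("/api/service1/") is replaced by that URI ("/"),
        # so "/api/service1/foo" reaches the backend as "/foo".
        proxy_pass http://service1/;
    }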

Additionally, Nginx locations can also be matched with regular expressions. For example, the two locations above could be written in the following form (the (/.*)? part ensures the bare /api/serviceN path still matches, not only its sub-paths):

...
    # ~ means a case-sensitive regular expression match
    location ~ ^/api/service1(/.*)?$ {
        proxy_pass http://service1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # ~* means a case-insensitive regular expression match
    location ~* ^/api/service2(/.*)?$ {
        proxy_pass http://service2;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}


Next, we check the Nginx configuration for errors:

sudo nginx -t
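
If the configuration is valid, the output should look something like this:

nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful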


Then restart the Nginx service:

sudo systemctl restart nginx.service
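
If Nginx is already running and only its configuration changed, a reload is sufficient and avoids dropping existing connections:

sudo systemctl reload nginx.service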


Finally, let's test the forwarding by sending requests to port 443:

curl "http://127.0.0.1:443/api/service1"
curl "http://127.0.0.1:443/api/service2"


Output:

{"message":"Hi, I am the endpoint /api/service1 on port 8001"}
{"message":"Hi, I am the endpoint /api/service2 on port 8002"}


We can see that Nginx successfully received the requests sent to port 443 and forwarded them to the corresponding ports based on the API paths!

This is about the simplest useful Nginx configuration. In practice, Nginx offers many more conveniences, such as integrating with certbot for free SSL certificates or using htpasswd for HTTP basic authentication.
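
As a small taste of the latter, a minimal sketch of protecting one of the locations above with basic authentication might look like this (assuming a password file has already been created with the htpasswd tool, here at the hypothetical path /etc/nginx/.htpasswd):

    location /api/service1 {
        auth_basic "Restricted";
        auth_basic_user_file /etc/nginx/.htpasswd;   # assumed path to the htpasswd file

        proxy_pass http://service1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }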

I plan to compile these more advanced Nginx topics into a follow-up article soon; if you're interested, keep an eye on my website, where it should appear.

Thank you for reading this far. If you have any questions, feel free to ask, and I will respond as soon as possible.

