Load balancing is an approach that distributes incoming network traffic efficiently across a group of backend servers. This group of backend servers is often called a server farm or a server pool.
Nginx is a web server that can also be used as a reverse proxy, load balancer, mail proxy, or HTTP cache. It is free and open-source software, released under the terms of the 2-clause BSD license. It is one of the most popular software choices when it comes to load balancing over HTTP.
- Learn how to install and run the Nginx server on Ubuntu.
- Learn how to create a server pool and distribute incoming workload across it.
- See how load balancing works hands-on.
This repository contains a simple service written in Python. The service provides a simple REST API over HTTP. The API consists of a single endpoint, `GET /colour`, that returns a simple string with a colour, e.g., `BLUE`, `GREEN`, etc. The colour string is always returned in upper case.

The service listens for requests on port `80`.
The service reads the colour it returns from the environment variable called `COLOUR`. It will fail to start if the variable is not set. The value of this variable is case-insensitive.

The colour may be set to one of the following:

`Red`, `Green`, `Blue`, `Orange`, `Pink`, `Fuchsia`, `Amber`, `Coral`, `Maroon`

The service will also fail to start if the specified colour is not one from the list above.
Since the service is written in Python and requires some dependencies to be installed, it is better to package it into a container image. The service was tested with Python 3.8. Instead of using Ubuntu as the base image, you may use the official image called `python`, which already has both Python and pip pre-installed.

Tags on that image correspond to the version of Python installed in the image. Images tagged with just a Python version are based on Debian and are rather heavy. It is therefore recommended to use images tagged `*-alpine`: those are based on Alpine and are much smaller.
Do not forget to install the service's dependencies from the `requirements.txt` file. Use `pip install -r requirements.txt` for that.
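As a rough sketch of how the image could be built (the entry point `app.py` and the image name `colour-service` are assumptions; adjust them to the actual service files):

```sh
# Hypothetical Dockerfile for the colour service; app.py is an assumed entry point.
cat > Dockerfile <<'EOF'
FROM python:3.8-alpine
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 80
CMD ["python", "app.py"]
EOF

# Build the image and give it a tag.
docker build -t colour-service .
```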
Run a container with `docker run` as usual. Do not forget to map port 80 to one of the ports available on the server. Ports 8080-8090, as well as ports 80 and 443, are open. I do not recommend mapping container ports to the server's port 80 because it will conflict with the following tasks.

Do not forget to configure the colour.
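For example, a possible invocation (the image name `colour-service`, the container name `blue-1`, and host port 8081 are assumptions):

```sh
# Map container port 80 to host port 8081 and set the colour via the environment.
docker run -d --name blue-1 -p 8081:80 -e COLOUR=blue colour-service
```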
- Use `docker logs` to check the service logs. The `-f` switch allows you to "follow" new log entries; if you use the `-f` switch, press `Ctrl+C` to exit the logs.
- With your browser or with `curl`, check whether you can get a colour string from the service by sending the appropriate HTTP request (see the example below). Please refer to Pre-requisites for details.
- Discuss what you observe.
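A minimal check with `curl`, assuming the container's port 80 was mapped to host port 8081 and the command runs on the server itself (otherwise replace `localhost` with the server's address):

```sh
# Request the colour endpoint through the mapped host port.
curl http://localhost:8081/colour
# Expected output: the configured colour in upper case, e.g. BLUE
```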
- Use `docker stop` and `docker rm` to stop and remove the container; this will simulate a failure of the service (see the commands below).
- Check if you can still get the colour the same way you did in the previous task.
- Discuss what you observe.
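For example, assuming the container was named `blue-1` as in the earlier sketch:

```sh
# Stop and remove the container to simulate a service failure.
docker stop blue-1
docker rm blue-1
```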
- Run 3 or more containers with the service the same way you did in Task 2. For now, configure a different colour for each service instance (see the sketch below).
- Do not forget to map port 80 of every container to a different port on the server.
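One possible way to start three instances, assuming the image name `colour-service` and host ports 8081-8083:

```sh
# Start three instances, each with its own colour and host port.
docker run -d --name colour-1 -p 8081:80 -e COLOUR=red   colour-service
docker run -d --name colour-2 -p 8082:80 -e COLOUR=green colour-service
docker run -d --name colour-3 -p 8083:80 -e COLOUR=blue  colour-service
```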
This task is similar to Task 3, but you have to do it for every service instance.
- Check that the service instances are running with `docker logs`.
- With your browser or with `curl`, try getting a colour from the different instances.
- Discuss what you observe.
Nginx listens on port 80 by default. To avoid errors, please stop and remove any containers whose ports were mapped to port 80 of the server.
For the purpose of this lab, all your personal user accounts have been granted sudo permissions.
- Install Nginx on the server. The package name is `nginx` (see the commands below).
- With your browser, check that it works by sending a GET request to `/` at port 80. The response should show you the default Nginx site.
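On Ubuntu the installation and a quick check could look like this:

```sh
# Install Nginx from the Ubuntu repositories.
sudo apt update
sudo apt install -y nginx

# Check that the default site responds on port 80.
curl http://localhost/
```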
Nginx stores its configuration in multiple files. The location of those files may differ between Linux distributions, but they are normally located somewhere under the `/etc` directory, since that is the directory for storing configuration files.
On Ubuntu, Nginx stores its main configuration file with some global settings at `/etc/nginx/nginx.conf`. In our case we do not need to configure it. Please explore this file and see what settings are configured there.
Individual sites are normally configured in individual configuration files. On Ubuntu, Nginx reads every text file from the `/etc/nginx/sites-enabled/` directory and treats it as a configuration file.
Standard practice is to put individual configuration files for different sites into the `/etc/nginx/sites-available/` directory. This directory is not read by Nginx. To enable a specific site, you add a symlink for its configuration file from `/etc/nginx/sites-available/` into `/etc/nginx/sites-enabled/`. If you need to disable a site, you delete the appropriate symlink from `/etc/nginx/sites-enabled/`.

This method allows you to enable and disable individual sites while keeping their configuration files safe.
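As a sketch, assuming a site configuration file with the hypothetical name `mysite`:

```sh
# Enable the site by symlinking its configuration into sites-enabled.
sudo ln -s /etc/nginx/sites-available/mysite /etc/nginx/sites-enabled/mysite

# Disable the site again by removing only the symlink;
# the file in sites-available stays untouched.
sudo rm /etc/nginx/sites-enabled/mysite
```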
- Disable the `default` site by removing the appropriate symlink from `/etc/nginx/sites-enabled/`.
- Create a new configuration file in `/etc/nginx/sites-available/` for your site with the following content:
```nginx
upstream backend {
    # Adjust the following list to the list of the service instances.
    server 127.0.0.1:8081;
    server 127.0.0.1:8082;
}

server {
    listen 80;
    server_name _;

    location / {
        proxy_pass http://backend;
    }
}
```
- Enable your site by creating a symlink for the added configuration file in `/etc/nginx/sites-enabled/`.
- Test the Nginx configuration by running `nginx -t`.
- Since Nginx is running as a daemon (remember what that is?), you have to restart it with the daemon management system, which on Ubuntu (as well as on many other Linux distributions) is called `systemd`. The command for restarting the Nginx daemon is `systemctl restart nginx` (see the sequence below).
- Use `systemctl status nginx` to check the status of the Nginx daemon.
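Putting the last steps together, a possible command sequence:

```sh
# Validate the configuration, then restart and inspect the daemon.
sudo nginx -t
sudo systemctl restart nginx
sudo systemctl status nginx
```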
Your site should be available on port 80 now.
- With your browser or `curl`, try to get colours several times from your service by accessing it via the load balancer at port 80.
- What do you observe? How are requests distributed?

Note that browsers may cache responses, so you may keep getting the same response. On Mac, use `Cmd+Shift+R` to reload the page without the cache. Alternatively, use `curl`: it does not cache responses.
- Based on how the colour changes between requests, explore how the default Round Robin load-balancing strategy works (see the loop sketch below).
- You may also use `docker logs` to confirm that requests are being distributed.
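A quick way to observe the round-robin distribution is to send several requests in a row, for example (run on the server itself, or replace `localhost` with the server's address):

```sh
# Send ten requests through the load balancer and print each returned colour.
for i in $(seq 1 10); do
  curl -s http://localhost/colour
  echo
done
```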
Sometimes we may want to send more requests to certain instances, for example when they are running on more powerful hardware. Nginx supports specifying a weight for every backend instance. The bigger the weight, the more requests that instance will get.
- In the site's configuration file, set the weight of the instances by adding a `weight=N` parameter to every `server` directive inside the `upstream` block:
```nginx
upstream backend {
    # Adjust the following list to the list of the service instances.
    server 127.0.0.1:8081 weight=1;
    server 127.0.0.1:8082 weight=5;
}

..........
```
- Test the configuration with `nginx -t` and restart the daemon if the test is successful.
- Try getting colours several times.
- Discuss what you observe.
- Remove all `weight` parameters from the Nginx configuration and restart the daemon.
Now it's time to simulate a real-life situation. In production all instances of a service normally have the same configuration and process queries the same way.
- Stop and remove all running containers with the service.
- Start 3 or more instances of the service with the same configuration.
- Adjust Nginx's configuration if needed (e.g., if ports were changed).
- Try getting the colours and examine the request distribution with `docker logs`.
- Simulate a failure by stopping and removing one container.
- Try getting the colours several times.
- Did the failure affect your user experience like it did in Task 4?
- You may play around by removing more containers.
- Commit the created Nginx configurations to this repo, into the `nginx_config` folder (see the sketch below).
- Stop the server.
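A possible sequence for committing the configuration, assuming the hypothetical site file name `mysite` used earlier:

```sh
# Copy the site configuration into the repository and commit it.
mkdir -p nginx_config
cp /etc/nginx/sites-available/mysite nginx_config/
git add nginx_config
git commit -m "Add nginx load balancer configuration"
git push
```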