uWSGI is one of those interesting projects that keeps adding features with every new release without becoming totally bloated, slow, and/or unstable. In this post, we’ll look at some of its lesser-used features and how you might use them to simplify your Python web service.
Let’s start by looking at a common Python web project’s deployment stack.
- Nginx: Static file serving, SSL termination, reverse proxy
- Memcached: Caching
- Celery: Background task runner
- Redis or RabbitMQ: Queue for Celery
- uWSGI: Python WSGI server
Five services. That’s a lot of machinery to run for a basic site. Let’s see how uWSGI can help you simplify things:
Static File Serving
uWSGI can serve static files quite efficiently. It can even do so without tying up the same worker/thread pool your application uses, thanks to its offloading subsystem. There are a bunch of configuration options around static files, but the common ones we use are:
- `offload-threads`: the number of threads to dedicate to serving static files
- `check-static`: works like Nginx’s `try_files` directive, checking for the existence of a static file before hitting the Python application
- `static-map`: does the same, but only when a URL pattern is matched
Other options let you control gzipping and expires headers, among other things. An ini configuration for basic static file serving might look like this:
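Something along these lines should work (the module name and filesystem paths here are placeholders for your own project):

```ini
[uwsgi]
module = myapp.wsgi:application
; threads dedicated to offloaded transfers (static files, etc.)
offload-threads = 4
; serve a file directly if it exists under this document root
check-static = /srv/myapp/public
; map a URL prefix to a directory on disk
static-map = /static=/srv/myapp/static
```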
More information on static file handling is available on a topic page in the uWSGI docs. When placed behind a CDN, this setup is sufficient for even high-traffic sites.
SSL Termination
uWSGI can handle SSL connections and even the SPDY protocol. Here’s an example configuration that serves HTTPS (and optionally SPDY) and redirects HTTP requests to HTTPS:
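A minimal sketch, with placeholder certificate paths and module name (the exact options available depend on your uWSGI version and build, so check the HTTPS/SPDY docs):

```ini
[uwsgi]
module = myapp.wsgi:application
master = true
; bind both ports up front so we can drop root privileges afterwards
shared-socket = 0.0.0.0:80
shared-socket = 0.0.0.0:443
uid = www-data
gid = www-data
; redirect plain HTTP (shared socket 0) to HTTPS
http-to-https = =0
; terminate SSL on shared socket 1
https = =1,/etc/ssl/mysite.crt,/etc/ssl/mysite.key
; for SPDY, uWSGI 1.9+ offers the https2 option instead, e.g.:
; https2 = addr==1,cert=/etc/ssl/mysite.crt,key=/etc/ssl/mysite.key,spdy=1
```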
Reverse Proxy
uWSGI speaks HTTP and can efficiently route requests to multiple workers. Here’s an example that will start an HTTP listener on port 80:
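For example (the module name and worker count are placeholders):

```ini
[uwsgi]
; built-in HTTP router listening on port 80
http = 0.0.0.0:80
module = myapp.wsgi:application
master = true
processes = 4
; drop root privileges once the socket is bound
uid = www-data
gid = www-data
```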
In this scenario, you’ll need to start `uwsgi` as the `root` user to access port 80, but it will drop privileges to an unprivileged account via the `uid`/`gid` arguments.
You can also do routes and redirects (see the docs for more complex examples):
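A couple of illustrative rules using the internal routing subsystem (which requires uWSGI to be built with PCRE support; the paths and backend address are made up):

```ini
[uwsgi]
; issue a permanent redirect for a retired URL
route = ^/old-page$ redirect-permanent:/new-page
; proxy anything under /legacy/ to another HTTP backend
route = ^/legacy/ http:127.0.0.1:8080
```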
Note: It is unclear to me whether uWSGI’s HTTP server is vulnerable to DoS attacks such as Slowloris. Please leave a comment if you have any more information here.
Caching
Did you know uWSGI includes a fast in-memory caching framework? The configuration for it looks like this:
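Roughly like this (the `store_sync` interval is an arbitrary example value):

```ini
[uwsgi]
; a 5000-item LRU cache named "default", periodically persisted to disk
cache2 = name=default,items=5000,purge_lru=1,store=/tmp/uwsgi_cache,store_sync=30
```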
This will configure a cache named `default`, capable of holding up to 5000 items and purging the least recently used keys in the event of an overflow. The cache will periodically be flushed to disk asynchronously (`/tmp/uwsgi_cache`) so the uWSGI process can be restarted without also dropping the entire cache.
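For completeness, here’s a rough sketch of the low-level API (the `uwsgi` module is only importable inside a running uWSGI process, and cached values are bytes/strings):

```python
import uwsgi  # only available when running under uWSGI

# store a value in the "default" cache for 300 seconds
uwsgi.cache_update("greeting", b"hello", 300, "default")

# returns the stored bytes, or None on a miss
value = uwsgi.cache_get("greeting", "default")
```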
You can find the caching framework docs here, and a Django-compliant cache backend, `django-uwsgi-cache`, is available on PyPI.
Task Queuing
Yes, that’s right, uWSGI includes a task queue too. The uWSGI spooler can not only queue tasks for immediate execution, but also provide cron-like functionality to schedule tasks to run at some point in the future. It is configured simply by providing a directory to store the queue and the number of workers to run:
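For example (the spool directory is a placeholder):

```ini
[uwsgi]
; directory where queued tasks are written
spooler = /var/spool/uwsgi
; number of spooler processes to run alongside the web workers
spooler-processes = 2
```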
The `uwsgi` Python package provides a `uwsgidecorators` module that can be used to place jobs on the queue for execution. A simple example:
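The task names here are invented for illustration; the decorators themselves (`spool`, `cron`) come from `uwsgidecorators`:

```python
from uwsgidecorators import cron, spool


@spool
def send_welcome_email(arguments):
    # runs in a spooler process, not in a web worker;
    # "arguments" holds the values passed to .spool()
    print("sending welcome email with %r" % (arguments,))


@cron(0, 3, -1, -1, -1)
def nightly_cleanup(signum):
    # cron-style scheduling: runs every day at 03:00
    print("running nightly cleanup")


# enqueue a task from anywhere in the application (e.g. a view)
send_welcome_email.spool(address="user@example.com")
```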
Conclusion
As you can see, uWSGI really is a Swiss Army knife for serving Python web services. Actually, it’s not even limited to Python; you can use it for Ruby and Perl sites as well. We’ve used many of these features on production sites with great success. While specialized services are certainly going to be more robust for high-volume workloads, they are simply overkill for the majority of sites.
Distributed microservice architectures may be all the rage, but the reality is that most sites can run on a single server. Reducing the number of services and dependencies makes deployment easier and removes points of failure in your system. Before you jump to add more tools to your stack, it’s worth checking if you can make do with what you already have.