Server-Side Scalability
Overview
In this part of the course, we shift our focus from the database to server-side performance. We look at HTTP and its evolution, horizontal scaling of server-side functionality, and communication between services.
This part is organized as follows:
- HTTP and Caching starts by looking at how HTTP has evolved over the years and how HTTP headers can be used to tell clients how resources should be cached (a minimal header sketch follows this list).
- Statelessness and Horizontal Scaling with Containers discusses stateful and stateless applications, horizontal scaling, and how to create multiple replicas of a service in Docker.
- Traffic Distribution and Load Balancing continues from the previous section by adding a load balancer that distributes incoming requests across the replicas, and discusses load balancing more broadly (see the round-robin sketch after this list).
- Docker Networking Overview takes a step back and provides an overview of how services communicate with each other in a Docker network and why services must be explicitly exposed to be reachable from the host machine.
- Microservices and Message Passing revisits common architectural patterns in web applications and discusses how a microservices architecture and message passing can improve scalability.
- Message Queues with Redis showcases how message queues enable asynchronous communication between services and briefly evaluates the impact of using Redis as a message broker on server throughput (a small queue sketch follows this list).
- Overarching Project continues with the overarching project, focusing on exercise submissions and a separate grading API service.
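To give a first taste of the HTTP and Caching chapter, the sketch below shows a server attaching a Cache-Control header to a response so that clients know how long they may cache it. This is a minimal sketch, assuming the Deno runtime and its built-in Deno.serve API; the one-hour lifetime is an arbitrary illustration.

```ts
// Minimal sketch: a server that marks its response as cacheable.
// Assumes the Deno runtime; run with `deno run --allow-net server.ts`.
Deno.serve((_req) =>
  new Response("Hello, cache!", {
    headers: {
      // Tells clients and shared caches the response stays fresh for 3600 s.
      "Cache-Control": "max-age=3600",
    },
  })
);
```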
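Similarly, the core idea of the Traffic Distribution and Load Balancing chapter can be previewed with a round-robin picker: requests are forwarded to replicas in turn. The replica addresses below are hypothetical, and the toy proxy forwards only the request path and query string for brevity.

```ts
// Hypothetical replica addresses; in practice these would be
// Docker service names rather than hard-coded hosts.
const replicas = [
  "http://app-1:8000",
  "http://app-2:8000",
  "http://app-3:8000",
];
let next = 0;

// Round-robin: each call returns the next replica, wrapping around.
function pickReplica(): string {
  const replica = replicas[next];
  next = (next + 1) % replicas.length;
  return replica;
}

// A toy load balancer (again assuming Deno.serve): forwards each
// incoming request's path and query string to the chosen replica.
Deno.serve((req) => {
  const url = new URL(req.url);
  return fetch(pickReplica() + url.pathname + url.search);
});
```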
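Finally, the Message Queues with Redis chapter builds on patterns like the one sketched below: a producer pushes messages onto a Redis list and a consumer blocks until one arrives. This is a sketch only, assuming the npm `redis` client; the queue name `submissions` and the message shape are illustrative.

```ts
import { createClient } from "redis";

// Producer: connect and push a message onto the head of the list.
const producer = createClient({ url: "redis://localhost:6379" });
await producer.connect();
await producer.lPush(
  "submissions",
  JSON.stringify({ id: 1, code: "print('hi')" }),
);

// Consumer: a separate connection pops from the tail of the list,
// blocking until a message is available (timeout 0 = wait forever).
const consumer = producer.duplicate();
await consumer.connect();
const msg = await consumer.brPop("submissions", 0);
if (msg) {
  console.log("Processing", JSON.parse(msg.element));
}
```

Using LPUSH together with BRPOP gives first-in, first-out delivery, and the blocking pop means the consumer does not need to poll the queue.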
Finally, the part ends with a recap and feedback chapter that briefly summarizes the material and asks for your feedback.