Set Up Workflows
Environment
Install PDM
The dependency manager used in this project is pdm. To install it, run the following command:
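The original command was not preserved here; a minimal sketch, assuming pipx is available (pipx is one of the installation methods supported by pdm, but check the pdm documentation for the currently recommended installer):

```shell
# Install pdm in an isolated environment using pipx
$ pipx install pdm
```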
Alternatively, any of the other installation methods described in the pdm documentation can be used.
Install Dependencies
The dependencies are broken into groups:
- Default dependencies: required for the core functionality of the project in production.
- Development dependencies: required for development, testing, and documentation.
The Python version specified in pyproject.toml is >=3.11, so a Python 3.11 interpreter should be used.
Conda
To do so with conda:
$ conda search python | grep " 3\.\(11\)\."
$ conda create --name booking_service_api -y python=3.11.9
$ conda activate booking_service_api
$ pdm use -f $(which python3)
$ pdm install
Pyenv
To do so with pyenv and virtualenv, use the pdm venv command:
$ pyenv install --list | grep " 3\.\(11\)\."
$ pyenv install 3.11.9
$ pdm venv create --name booking_service_api --with virtualenv 3.11.9
# To activate the virtual environment
$ eval $(pdm venv activate booking_service_api)
$ pdm install
Use --with venv to create the virtual environment using venv (included in the standard library) instead of virtualenv (a third-party package, but more performant). To create the venv in the project root instead of the location set by pdm config venv.location, omit the --name option:
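The command for this case was not preserved; a sketch, assuming the same Python version as above:

```shell
# Creates the virtual environment in the project root using the stdlib venv module
$ pdm venv create --with venv 3.11.9
```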
Docker Compose for Local Development
Docker Compose is used for local development to isolate the application and its dependencies in a containerized environment. Two services are defined in the compose.yml file:
- test: The service for running the application and the unit, integration, and end-to-end tests. The following directories and files are bind-mounted from the host to the container:
  - app/**: The application code for the FastAPI application.
  - tests/**: The test code, as all tests are run in the container.
  - pyproject.toml: The project configuration file, which includes the dependencies, pytest configuration, and other project settings.
  - migrations: The Alembic migrations directory used to manage the database schema. It is mounted so that migrations are applied each time a new container is created (i.e., docker compose up --detach --build).
- db-test: The service for running the PostgreSQL database for testing purposes. The database is initialized with the schema and data required for testing, based on the migration files in the migrations directory.
The entrypoint script of test.Dockerfile waits for the PostgreSQL database to start before running the application. It does this by checking whether port 5432 is open using the nc command. The script is as follows:
#!/bin/sh
echo "Waiting for postgres to start..."
while ! nc -z db-test 5432; do
# nc (netcat) is a utility for checking network connectivity
# The -z flag makes nc scan for open ports without sending any data
# If the connection fails (i.e., PostgreSQL isn't up yet), the loop continues
sleep 0.1
done
echo "PostgreSQL started"
# 'exec' replaces the current shell process with the command and arguments passed to the script, preserving all arguments as separate arguments
# For example: ./script.sh command --option value -> exec command --option value
exec "$@"
Build and Run the Containers
To build the images and run the containers in the background:
This setup allows for automatic reloading of the application when changes are made to the code during development. The application is available at http://localhost:8004 or whichever port is specified in the compose.yml file.
To stop the containers without removing them:
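A sketch of the standard Compose command for this:

```shell
# Stop running containers without removing them
$ docker compose stop
```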
To stop, remove the containers, and remove named volumes:
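A sketch of the standard Compose command for this:

```shell
# Stop and remove containers, networks, and named volumes
$ docker compose down --volumes
```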
To view the logs of the services:
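A sketch of the standard Compose command for this; append a service name (e.g. test or db-test) to view a single service:

```shell
# Stream the logs of all services
$ docker compose logs --follow
```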
To run an interactive shell in a service container:
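A sketch of the standard Compose command for this, using the test service defined in compose.yml as an example:

```shell
# Open an interactive shell in the running test container
$ docker compose exec test sh
```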
Testing with Pytest
The test suite is run using pytest and is divided into three categories: unit, integration, and end-to-end tests.
The integration tests are run against a PostgreSQL database running in a Docker container. The DATABASE_URL_TEST database connection string is set as an environment variable in the compose.yml file.
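As a sketch of how the suite can be invoked inside the test container (the exact pytest selectors and markers used by this project are assumptions, not confirmed by the source):

```shell
# Run the full pytest suite inside the test service container
$ docker compose exec test python -m pytest
```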
The end-to-end tests are run against the FastAPI application running in dev mode on AWS. The .github/workflows/ci_cd_end_to_end.yml workflow is configured to run after the .github/workflows/ecr_ecs_dev.yml workflow completes successfully. It sets up the AWS CLI and fetches the authentication credentials from AWS Secrets Manager to run the end-to-end tests.
Automation
Formatting with Black and Isort
The scripts directory contains a run_black_isort.sh shell script that runs black and isort on the project files. The script is also used in the GitHub Actions workflows to ensure consistent code formatting.
Run the script as follows:
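The invocation was not preserved here; a sketch, assuming the script is executable (otherwise prefix it with sh):

```shell
# Format the project files with black and isort
$ ./scripts/run_black_isort.sh
```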