In a Hurry?
- Docker Compose setup for PostgreSQL
- Environment variables for database configuration
- SQLAlchemy ORM integration
- Database health checks and error handling
Long Story:
I enjoy using SQLite, a no-fuss, file-based database. To be honest, SQLite works great almost every time. Don't judge me for saying this, but I have deployed production applications on SQLite. 😪 However, wherever I have worked, almost 100% of the time I have used PostgreSQL.
Today, we're going to connect our FastAPI application to PostgreSQL using Docker Compose, and I'll show you all the gotchas and best practices I've learned along the way.
Why PostgreSQL Over SQLite?
Before we dive in, let me tell you why we're making this jump. SQLite is fantastic for development - it's simple, requires no setup, and works out of the box. But when you're building something that multiple people will use, you need something more robust.
PostgreSQL gives us:
- Concurrent connections: Multiple users can access your API simultaneously
- ACID compliance: Atomicity, Consistency, Isolation, and Durability, all out of the box
- Rich data types: JSON, arrays, and custom types
- Scalability: Can handle millions of records without breaking a sweat
The best part? With Docker Compose, setting up PostgreSQL is almost as easy as SQLite!
Setting Up Our Database Dependencies
First, let's update our requirements.txt file. I'm going to add the new dependencies with comments so you know what each one does:
fastapi==0.115.12
uvicorn==0.27.1
#new
python-dotenv==1.1.1
psycopg2-binary==2.9.10
sqlmodel==0.0.24
Here's what each package does:
- python-dotenv: Loads environment variables from a .env file (super important for keeping secrets safe)
- psycopg2-binary: The PostgreSQL adapter for Python (this is what actually talks to the database)
- sqlmodel: Our ORM (Object-Relational Mapping) layer, built on SQLAlchemy and Pydantic, that makes database operations feel like Python
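If you want a quick sanity check that the new packages installed correctly (entirely optional, and assuming you're working inside the project's environment), something like this will do:
pip install -r requirements.txt
python -c "import sqlmodel, psycopg2, dotenv; print('dependencies OK')"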
Docker Compose Magic
Now comes the fun part - Docker Compose. This is where we define both our FastAPI application and PostgreSQL database as services. Add the db section to the docker-compose.yml file:
services:
  web:
    build: .
    command: uvicorn main:app --host 0.0.0.0 --port 8000 --reload
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    restart: on-failure:3
  db:
    image: postgres:16
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    env_file:
      - .env
    environment:
      - POSTGRES_PASSWORD=${DB_PASSWORD}
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_DB=${DB_NAME}
    ports:
      - "5432:5432"
    restart: always

volumes:
  postgres_data:
Let me break this down:
- web service: This is our FastAPI application
- db service: This is our PostgreSQL database
- volumes: This ensures our database data persists even when containers restart
- restart policies: on-failure:3 for web means it'll try to restart up to 3 times if it crashes; always for db means it'll always restart
The beauty of this setup is that Docker Compose creates a network where our services can talk to each other using their service names. So our FastAPI app can reach the database at db:5432!
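If you ever want to prove to yourself that the web container can really reach the db service by name, here's a quick throwaway check you can run once both services are up (assuming the service names from the file above):
docker-compose exec web python -c "import socket; socket.create_connection(('db', 5432), timeout=3); print('db is reachable')"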
Environment Variables: Keeping Secrets Safe
Create a .env file in your project root. This is where we'll store our database credentials:
DB_USER=nofoobar
DB_PASSWORD=top_secret
DB_HOST=db
DB_PORT=5432
DB_NAME=genai_blog
Pro tip: Never commit your .env file to version control! Add it to your .gitignore file. I learned this the hard way when I accidentally pushed database credentials to GitHub once. Not fun.
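If your .gitignore doesn't already cover it, the relevant addition is a single line:
# keep secrets out of version control
.env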
Configuration Management
Now let's update our core/config.py file to handle database configuration:
import os
from dotenv import load_dotenv

load_dotenv()


class Settings:
    TITLE: str = "GenAI API"
    DESCRIPTION: str = "Blog API Powered by Generative AI"
    VERSION: str = "1.0.0"
    DATABASE_URL: str = f"postgresql://{os.getenv('DB_USER')}:{os.getenv('DB_PASSWORD')}@{os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}/{os.getenv('DB_NAME')}"


settings: Settings = Settings()
This approach centralizes all our configuration in one place. Notice how we're building the DATABASE_URL using the environment variables. This URL follows the format: postgresql://username:password@host:port/database_name
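With the example .env values from above, the assembled URL would look like this (which is exactly what the print statement in the next section will show in your logs):
postgresql://nofoobar:top_secret@db:5432/genai_blog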
Database Connection Setup
Create a new file database/db.py. This is where the real work happens:
from sqlmodel import create_engine, Session
from core.config import settings
print(f"DATABASE_URL is: {settings.DATABASE_URL}")
engine = create_engine(settings.DATABASE_URL)
Let me explain what's happening here:
create_engine:
engine = create_engine(settings.DATABASE_URL)
- Connects to your PostgreSQL database using the URL from environment variables
- The engine manages the actual database connection pool
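If you want more visibility or resilience while developing, create_engine also accepts a few optional keyword arguments. This is just a sketch of a couple of knobs worth knowing about, not something you have to add:
engine = create_engine(
    settings.DATABASE_URL,
    echo=True,           # log every SQL statement - noisy, but great while learning
    pool_pre_ping=True,  # test connections before handing them out, avoids stale-connection errors
)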
Database session as a dependency:
We will be creating a dependency function named get_db in apis/deps.py. This will be used by every single API route that needs a connection to the database.
from sqlmodel import Session
from database.db import engine
def get_db():
    with Session(engine) as session:
        yield session
Database Session Generator
- Creates a new database session for each request
- The with statement ensures the session is properly closed after use
- yield makes this a generator function - perfect for FastAPI's dependency injection
- Each API endpoint that needs database access will call this function
Usage in FastAPI:
@app.get("/users/")
def get_users(db: Session = Depends(get_db)):
# db is now a database session ready to use
return db.exec(select(User)).all()
This pattern ensures each request gets its own clean database session that's automatically cleaned up when done!
Updating Our Main Application
Now let's update our main.py file to include a database health check:
from fastapi import FastAPI
from typing import Dict
from sqlalchemy import text
from core.config import settings
from apis.deps import get_db

app: FastAPI = FastAPI(title=settings.TITLE, description=settings.DESCRIPTION, version=settings.VERSION)


@app.get("/")
def read_root() -> Dict[str, str]:
    return {"message": "Hello World"}


@app.get("/health/db")
def health_check_db() -> Dict[str, str]:
    try:
        db = next(get_db())
        db.execute(text("SELECT 1"))
        return {"status": "ok"}
    except Exception as e:
        return {"status": "error", "message": str(e)}
The health check endpoint is crucial - it lets you verify that your database connection is working. In production, you'll want to monitor this endpoint to ensure your database is always reachable.
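Once the stack is up, you can also poke the endpoint from the command line (assuming the default port mapping from the docker-compose.yml above):
curl http://127.0.0.1:8000/health/db
# expected response when the database is reachable: {"status":"ok"}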
Running Everything Together
Now comes the moment of truth! Run:
docker-compose up --build
If everything is set up correctly, you should see both services starting up. Visit http://127.0.0.1:8000/health/db (or open the interactive docs at http://127.0.0.1:8000/docs) to check whether your database connection is working.
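If you're not sure whether both containers actually came up, open a second terminal and run:
docker-compose ps
It will list the web and db services along with their current state.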
Common Mistakes:
- os.getenv not reading values: Make sure the os.getenv('DB_...') calls are actually picking up values from the .env file. I have added the print(f"DATABASE_URL is: {settings.DATABASE_URL}") line so you can check in the terminal logs that the database URL is correct.
- Don't use @ in the password: it can break the format of DATABASE_URL, since @ separates the credentials from the host (see the sketch below for a workaround).
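If you really can't avoid special characters like @ in the password, one workaround (a small sketch, not part of the code above) is to URL-encode the password before building the connection string:
from urllib.parse import quote_plus
import os

password = quote_plus(os.getenv("DB_PASSWORD", ""))  # "p@ss" becomes "p%40ss"
DATABASE_URL = f"postgresql://{os.getenv('DB_USER')}:{password}@{os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}/{os.getenv('DB_NAME')}"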