
Building REST APIs with Python: A Practical Guide for Intermediate Developers

Tags: python, rest-api, api-design, fastapi, flask, authentication, validation, testing, pagination, error-handling

This tutorial walks through building a production-leaning REST API in Python, focusing on practical decisions: project layout, validation, persistence, authentication, pagination, error handling, observability, and deployment. You’ll implement a small but realistic API (a “tasks” service) using FastAPI, SQLAlchemy, Alembic, and PostgreSQL—with commands you can run locally.


Table of Contents

  1. What “REST” Means in Practice
  2. Project Setup
  3. Designing the API: Resources, Endpoints, and Status Codes
  4. Implementing the API with FastAPI
  5. Data Validation and Serialization with Pydantic
  6. Database Layer with SQLAlchemy
  7. Migrations with Alembic
  8. CRUD Endpoints (Create, Read, Update, Delete)
  9. Filtering, Sorting, and Pagination
  10. Error Handling and Consistent Error Responses
  11. Authentication with JWT (Bearer Tokens)
  12. Testing the API
  13. Observability: Logging and Health Checks
  14. Deployment Notes (Docker + Uvicorn)
  15. Next Steps

What “REST” Means in Practice

REST is an architectural style, not a framework. In day-to-day API development, “RESTful” typically implies:

  - Resources identified by nouns in URL paths (/tasks, /tasks/{id})
  - HTTP methods used with their defined semantics (GET reads, POST creates, PATCH partially updates, DELETE removes)
  - Meaningful status codes (201 on create, 404 for missing resources, 409 for conflicts)
  - Stateless requests: each request carries everything needed to process it, including credentials
  - JSON as the de facto representation format

This guide focuses on the parts you need to build a maintainable API: correctness, consistency, and a structure that grows with your codebase.


Project Setup

Prerequisites

  - Python 3.11 or newer
  - Docker (for running PostgreSQL locally)
  - Working knowledge of Python and HTTP basics

Create and activate a virtual environment

mkdir tasks-api
cd tasks-api

python -m venv .venv
# Linux/macOS
source .venv/bin/activate
# Windows PowerShell
# .\.venv\Scripts\Activate.ps1

Install dependencies

We’ll use:

  - fastapi + uvicorn — the web framework and ASGI server
  - sqlalchemy + psycopg — ORM and PostgreSQL driver
  - alembic — database migrations
  - pydantic-settings — configuration from environment variables
  - python-jose + passlib[bcrypt] — JWT signing and password hashing
  - email-validator — required by Pydantic’s EmailStr
  - pytest + httpx — testing

pip install fastapi uvicorn sqlalchemy psycopg alembic pydantic-settings \
  python-jose "passlib[bcrypt]" python-multipart email-validator \
  pytest httpx

Project layout

A common mistake is putting everything in main.py. Instead, separate concerns:

tasks-api/
  app/
    __init__.py
    main.py
    core/
      config.py
      security.py
      logging.py
    db/
      session.py
      models.py
      migrations/   (created by Alembic)
    schemas/
      task.py
      user.py
      errors.py
    api/
      deps.py
      routes/
        tasks.py
        auth.py
    services/
      tasks.py
      users.py
  tests/
    test_tasks.py
  alembic.ini
  pyproject.toml (optional)

You can start minimal and grow into this structure. We’ll implement enough pieces to make it real.


Designing the API: Resources, Endpoints, and Status Codes

We’ll implement a “tasks” resource:

  - POST /tasks — create a task
  - GET /tasks — list tasks (with filtering and pagination)
  - GET /tasks/{task_id} — fetch one task
  - PATCH /tasks/{task_id} — partially update a task
  - DELETE /tasks/{task_id} — delete a task

We’ll also implement authentication:

  - POST /auth/register — create a user
  - POST /auth/login — exchange credentials for a JWT access token

Task representation (JSON)

Example task:

{
  "id": 123,
  "title": "Write API tutorial",
  "description": "Cover validation, migrations, auth, and testing",
  "status": "open",
  "created_at": "2026-02-14T10:00:00Z",
  "updated_at": "2026-02-14T10:30:00Z"
}

Key design choices:

  - Integer IDs for simplicity (UUIDs are a fine alternative)
  - status as a closed enum (open, in_progress, done) rather than free text
  - Timestamps stored in UTC and serialized as ISO 8601
  - PATCH for partial updates, so clients send only the fields they change


Implementing the API with FastAPI

Create app/main.py:

from fastapi import FastAPI
from app.api.routes import tasks, auth
from app.core.logging import configure_logging

def create_app() -> FastAPI:
    configure_logging()
    app = FastAPI(title="Tasks API", version="1.0.0")

    app.include_router(auth.router, prefix="/auth", tags=["auth"])
    app.include_router(tasks.router, prefix="/tasks", tags=["tasks"])

    return app

app = create_app()

Create app/api/routes/tasks.py and app/api/routes/auth.py soon; first set up configuration and database.


Data Validation and Serialization with Pydantic

FastAPI relies on Pydantic models to:

  - Validate request bodies and query parameters
  - Serialize ORM objects into response JSON
  - Generate the OpenAPI schema that powers the interactive docs

Create app/schemas/task.py:

from datetime import datetime
from enum import Enum
from pydantic import BaseModel, Field, ConfigDict

class TaskStatus(str, Enum):
    open = "open"
    in_progress = "in_progress"
    done = "done"

class TaskCreate(BaseModel):
    title: str = Field(min_length=1, max_length=200)
    description: str | None = Field(default=None, max_length=2000)

class TaskUpdate(BaseModel):
    title: str | None = Field(default=None, min_length=1, max_length=200)
    description: str | None = Field(default=None, max_length=2000)
    status: TaskStatus | None = None

class TaskOut(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    id: int
    title: str
    description: str | None
    status: TaskStatus
    created_at: datetime
    updated_at: datetime

Important details:

  - Separate schemas per operation: TaskCreate has no id or timestamps, and TaskUpdate makes every field optional so PATCH can be partial.
  - from_attributes=True lets TaskOut be built directly from SQLAlchemy model instances.
  - TaskStatus subclasses str, so it serializes to plain JSON strings.


Database Layer with SQLAlchemy

We’ll use SQLAlchemy 2.x style.

Configuration

Create app/core/config.py:

from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    database_url: str = "postgresql+psycopg://postgres:postgres@localhost:5432/tasks"
    jwt_secret_key: str = "change-me"
    jwt_algorithm: str = "HS256"
    access_token_exp_minutes: int = 60

settings = Settings()

This reads environment variables (optionally from .env). For production, don’t hardcode secrets; inject via environment.
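For local development, a .env file like the following works (placeholder values only; pydantic-settings matches variable names to field names case-insensitively):

```
DATABASE_URL=postgresql+psycopg://postgres:postgres@localhost:5432/tasks
JWT_SECRET_KEY=dev-only-not-a-real-secret
ACCESS_TOKEN_EXP_MINUTES=60
```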

SQLAlchemy session

Create app/db/session.py:

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, DeclarativeBase
from app.core.config import settings

engine = create_engine(settings.database_url, pool_pre_ping=True)

SessionLocal = sessionmaker(bind=engine, autoflush=False, autocommit=False)

class Base(DeclarativeBase):
    pass

Models

Create app/db/models.py:

from datetime import datetime, timezone
from sqlalchemy import String, Text, DateTime, Enum
from sqlalchemy.orm import Mapped, mapped_column
from app.db.session import Base
from app.schemas.task import TaskStatus

def utcnow() -> datetime:
    return datetime.now(timezone.utc)

class Task(Base):
    __tablename__ = "tasks"

    id: Mapped[int] = mapped_column(primary_key=True)
    title: Mapped[str] = mapped_column(String(200), nullable=False)
    description: Mapped[str | None] = mapped_column(Text, nullable=True)
    status: Mapped[TaskStatus] = mapped_column(Enum(TaskStatus, name="task_status"), default=TaskStatus.open)
    created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default=utcnow, nullable=False)
    updated_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default=utcnow, onupdate=utcnow, nullable=False)

Why this matters:

  - DateTime(timezone=True) stores timezone-aware timestamps; naive datetimes cause subtle comparison bugs.
  - onupdate=utcnow keeps updated_at current without extra application code.
  - Naming the enum type (task_status) gives Alembic a stable name for the PostgreSQL ENUM type it creates.


Migrations with Alembic

Start PostgreSQL (Docker)

Run a local PostgreSQL container:

docker run --name tasks-postgres -e POSTGRES_PASSWORD=postgres -e POSTGRES_DB=tasks \
  -p 5432:5432 -d postgres:16

Confirm it’s running:

docker ps

Initialize Alembic

From project root:

alembic init app/db/migrations

Edit alembic.ini and set sqlalchemy.url to match your config (or keep placeholder and override in env.py). A cleaner approach is to load from settings.

Edit app/db/migrations/env.py to use your app’s engine URL and metadata:

from logging.config import fileConfig
from alembic import context
from sqlalchemy import engine_from_config, pool

from app.core.config import settings
from app.db.session import Base
from app.db import models  # noqa: F401

config = context.config
fileConfig(config.config_file_name)

target_metadata = Base.metadata

def run_migrations_offline() -> None:
    context.configure(
        url=settings.database_url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )
    with context.begin_transaction():
        context.run_migrations()

def run_migrations_online() -> None:
    configuration = config.get_section(config.config_ini_section)
    configuration["sqlalchemy.url"] = settings.database_url

    connectable = engine_from_config(
        configuration,
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata)
        with context.begin_transaction():
            context.run_migrations()

if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()

Create and apply the first migration

alembic revision --autogenerate -m "create tasks table"
alembic upgrade head

Verify tables exist (optional):

docker exec -it tasks-postgres psql -U postgres -d tasks -c "\dt"

CRUD Endpoints (Create, Read, Update, Delete)

Dependency for DB session

Create app/api/deps.py:

from collections.abc import Generator
from app.db.session import SessionLocal
from sqlalchemy.orm import Session

def get_db() -> Generator[Session, None, None]:
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

A service layer helps keep route handlers thin and testable.

Create app/services/tasks.py:

from sqlalchemy.orm import Session
from sqlalchemy import select
from app.db.models import Task
from app.schemas.task import TaskCreate, TaskUpdate, TaskStatus

def create_task(db: Session, data: TaskCreate) -> Task:
    task = Task(title=data.title, description=data.description, status=TaskStatus.open)
    db.add(task)
    db.commit()
    db.refresh(task)
    return task

def get_task(db: Session, task_id: int) -> Task | None:
    return db.get(Task, task_id)

def list_tasks(db: Session, *, offset: int, limit: int) -> list[Task]:
    stmt = select(Task).offset(offset).limit(limit).order_by(Task.id.asc())
    return list(db.scalars(stmt).all())

def update_task(db: Session, task: Task, data: TaskUpdate) -> Task:
    if data.title is not None:
        task.title = data.title
    if data.description is not None:
        task.description = data.description
    if data.status is not None:
        task.status = data.status
    db.commit()
    db.refresh(task)
    return task

def delete_task(db: Session, task: Task) -> None:
    db.delete(task)
    db.commit()

Routes

Create app/api/routes/tasks.py:

from fastapi import APIRouter, Depends, HTTPException, status, Query
from sqlalchemy.orm import Session
from app.api.deps import get_db
from app.schemas.task import TaskCreate, TaskOut, TaskUpdate
from app.services import tasks as tasks_service

router = APIRouter()

@router.post("", response_model=TaskOut, status_code=status.HTTP_201_CREATED)
def create_task(payload: TaskCreate, db: Session = Depends(get_db)):
    return tasks_service.create_task(db, payload)

@router.get("", response_model=list[TaskOut])
def list_tasks(
    db: Session = Depends(get_db),
    offset: int = Query(0, ge=0),
    limit: int = Query(50, ge=1, le=200),
):
    return tasks_service.list_tasks(db, offset=offset, limit=limit)

@router.get("/{task_id}", response_model=TaskOut)
def get_task(task_id: int, db: Session = Depends(get_db)):
    task = tasks_service.get_task(db, task_id)
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")
    return task

@router.patch("/{task_id}", response_model=TaskOut)
def patch_task(task_id: int, payload: TaskUpdate, db: Session = Depends(get_db)):
    task = tasks_service.get_task(db, task_id)
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")
    return tasks_service.update_task(db, task, payload)

@router.delete("/{task_id}", status_code=status.HTTP_204_NO_CONTENT)
def delete_task(task_id: int, db: Session = Depends(get_db)):
    task = tasks_service.get_task(db, task_id)
    if not task:
        raise HTTPException(status_code=404, detail="Task not found")
    tasks_service.delete_task(db, task)
    return None

Run the server

uvicorn app.main:app --reload --port 8000

Open interactive docs:

  - Swagger UI: http://127.0.0.1:8000/docs
  - ReDoc: http://127.0.0.1:8000/redoc

Try real requests with curl

Create a task:

curl -s -X POST http://127.0.0.1:8000/tasks \
  -H "Content-Type: application/json" \
  -d '{"title":"Write tutorial","description":"Make it practical"}' | jq

List tasks:

curl -s "http://127.0.0.1:8000/tasks?offset=0&limit=10" | jq

Update status:

curl -s -X PATCH http://127.0.0.1:8000/tasks/1 \
  -H "Content-Type: application/json" \
  -d '{"status":"in_progress"}' | jq

Delete:

curl -i -X DELETE http://127.0.0.1:8000/tasks/1

Filtering, Sorting, and Pagination

Pagination is not optional for real APIs. Even if your dataset is small today, unbounded GET /tasks becomes a performance and memory risk.

We already used offset/limit. For larger datasets, consider keyset pagination (a.k.a. cursor-based), because offset pagination gets slower as offsets grow.

Add filtering by status

Update list_tasks service:

from sqlalchemy import select
from app.db.models import Task
from app.schemas.task import TaskStatus

def list_tasks(db, *, offset: int, limit: int, status: TaskStatus | None = None) -> list[Task]:
    stmt = select(Task)
    if status is not None:
        stmt = stmt.where(Task.status == status)
    stmt = stmt.order_by(Task.id.asc()).offset(offset).limit(limit)
    return list(db.scalars(stmt).all())

Update route:

from app.schemas.task import TaskStatus

@router.get("", response_model=list[TaskOut])
def list_tasks(
    db: Session = Depends(get_db),
    offset: int = Query(0, ge=0),
    limit: int = Query(50, ge=1, le=200),
    status: TaskStatus | None = None,
):
    return tasks_service.list_tasks(db, offset=offset, limit=limit, status=status)

Test:

curl -s "http://127.0.0.1:8000/tasks?status=open&limit=5" | jq

Sorting

Sorting should be constrained to known fields to avoid SQL injection-like issues and to keep query plans stable. A safe pattern is to map allowed sort keys to columns.

For example, accept sort keys like created_at or -created_at (leading dash for descending) and translate them through a dictionary of allowed columns, rejecting anything else.


Error Handling and Consistent Error Responses

FastAPI already returns structured validation errors for invalid inputs (HTTP 422). But your own errors (404, 409, etc.) can become inconsistent if you handcraft messages everywhere.

Define an error schema (optional but useful)

Create app/schemas/errors.py:

from pydantic import BaseModel

class ErrorOut(BaseModel):
    detail: str

You can then document error responses in routes, but more importantly, you can standardize patterns:

  - Always raise HTTPException with a short, stable detail string clients can match on.
  - Keep status-code usage consistent: 404 for missing resources, 409 for conflicts, 422 for validation.
  - Declare error responses via the responses= argument so they appear in the OpenAPI docs.

Handle database integrity errors

If you later add unique constraints (e.g., unique task title per user), you should catch IntegrityError and return 409 Conflict rather than 500.

Pattern:

from sqlalchemy.exc import IntegrityError
from fastapi import HTTPException

try:
    db.commit()
except IntegrityError:
    db.rollback()
    raise HTTPException(status_code=409, detail="Conflict")

This is a big difference in API quality: clients can react programmatically to 409, but not to random 500s.


Authentication with JWT (Bearer Tokens)

For intermediate developers, JWT is a common next step. The goal:

  - Register users with hashed (never plaintext) passwords
  - Issue a signed, expiring access token on login
  - Require a valid Bearer token on protected endpoints

User model and migration

Add a user table. Create app/schemas/user.py:

from pydantic import BaseModel, EmailStr, Field, ConfigDict

class UserCreate(BaseModel):
    email: EmailStr
    password: str = Field(min_length=8, max_length=128)

class UserOut(BaseModel):
    model_config = ConfigDict(from_attributes=True)
    id: int
    email: EmailStr

class TokenOut(BaseModel):
    access_token: str
    token_type: str = "bearer"

Update app/db/models.py with User:

from sqlalchemy import String
from sqlalchemy.orm import Mapped, mapped_column

class User(Base):
    __tablename__ = "users"

    id: Mapped[int] = mapped_column(primary_key=True)
    email: Mapped[str] = mapped_column(String(320), unique=True, index=True, nullable=False)
    password_hash: Mapped[str] = mapped_column(String(255), nullable=False)
    created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), default=utcnow, nullable=False)

Generate migration:

alembic revision --autogenerate -m "create users table"
alembic upgrade head

Security utilities

Create app/core/security.py:

from datetime import datetime, timedelta, timezone
from jose import jwt
from passlib.context import CryptContext
from app.core.config import settings

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

def hash_password(password: str) -> str:
    return pwd_context.hash(password)

def verify_password(password: str, password_hash: str) -> bool:
    return pwd_context.verify(password, password_hash)

def create_access_token(subject: str) -> str:
    now = datetime.now(timezone.utc)
    exp = now + timedelta(minutes=settings.access_token_exp_minutes)
    payload = {"sub": subject, "iat": int(now.timestamp()), "exp": exp}
    return jwt.encode(payload, settings.jwt_secret_key, algorithm=settings.jwt_algorithm)

Notes:

  - Passwords are hashed with bcrypt; never store or log plaintext.
  - HS256 is symmetric: anyone holding jwt_secret_key can mint valid tokens, so treat the key like a password.
  - The exp claim makes tokens self-expiring; python-jose rejects expired tokens during decode.
  - passlib 1.7.x can error against newer bcrypt releases (4.1+); if you hit this, pinning bcrypt below 4.1 is a common workaround.

User service

Create app/services/users.py:

from sqlalchemy.orm import Session
from sqlalchemy import select
from app.db.models import User
from app.core.security import hash_password, verify_password

def get_user_by_email(db: Session, email: str) -> User | None:
    stmt = select(User).where(User.email == email)
    return db.scalars(stmt).first()

def create_user(db: Session, *, email: str, password: str) -> User:
    user = User(email=email, password_hash=hash_password(password))
    db.add(user)
    db.commit()
    db.refresh(user)
    return user

def authenticate(db: Session, *, email: str, password: str) -> User | None:
    user = get_user_by_email(db, email)
    if not user:
        return None
    if not verify_password(password, user.password_hash):
        return None
    return user

Auth routes

Create app/api/routes/auth.py:

from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session
from app.api.deps import get_db
from app.schemas.user import UserCreate, UserOut, TokenOut
from app.services import users as users_service
from app.core.security import create_access_token

router = APIRouter()

@router.post("/register", response_model=UserOut, status_code=status.HTTP_201_CREATED)
def register(payload: UserCreate, db: Session = Depends(get_db)):
    existing = users_service.get_user_by_email(db, payload.email)
    if existing:
        raise HTTPException(status_code=409, detail="Email already registered")
    return users_service.create_user(db, email=payload.email, password=payload.password)

@router.post("/login", response_model=TokenOut)
def login(payload: UserCreate, db: Session = Depends(get_db)):
    user = users_service.authenticate(db, email=payload.email, password=payload.password)
    if not user:
        raise HTTPException(status_code=401, detail="Invalid credentials")
    token = create_access_token(subject=str(user.id))
    return TokenOut(access_token=token)

This uses UserCreate for login too (email + password). In a real system you might create a UserLogin schema, but it’s fine here.

Protect endpoints

FastAPI provides OAuth2PasswordBearer to parse the Bearer token. We’ll validate JWT ourselves.

Create app/api/deps.py additions:

from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2PasswordBearer
from jose import jwt, JWTError
from sqlalchemy.orm import Session
from app.core.config import settings
from app.db.models import User

oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/auth/login")

def get_current_user(
    db: Session = Depends(get_db),
    token: str = Depends(oauth2_scheme),
) -> User:
    try:
        payload = jwt.decode(token, settings.jwt_secret_key, algorithms=[settings.jwt_algorithm])
        sub = payload.get("sub")
        if not sub:
            raise HTTPException(status_code=401, detail="Invalid token")
        user_id = int(sub)
    except (JWTError, ValueError):
        raise HTTPException(status_code=401, detail="Invalid token")

    user = db.get(User, user_id)
    if not user:
        raise HTTPException(status_code=401, detail="Invalid token")
    return user

Now require auth on task routes. Example: only authenticated users can create tasks.

In app/api/routes/tasks.py:

from app.api.deps import get_current_user
from app.db.models import User

@router.post("", response_model=TaskOut, status_code=status.HTTP_201_CREATED)
def create_task(
    payload: TaskCreate,
    db: Session = Depends(get_db),
    current_user: User = Depends(get_current_user),
):
    # current_user is available; you can later associate tasks with user_id
    return tasks_service.create_task(db, payload)

Try it with curl

Register:

curl -s -X POST http://127.0.0.1:8000/auth/register \
  -H "Content-Type: application/json" \
  -d '{"email":"dev@example.com","password":"supersecret123"}' | jq

Login:

TOKEN=$(curl -s -X POST http://127.0.0.1:8000/auth/login \
  -H "Content-Type: application/json" \
  -d '{"email":"dev@example.com","password":"supersecret123"}' | jq -r .access_token)

echo "$TOKEN"

Create a task with Bearer token:

curl -s -X POST http://127.0.0.1:8000/tasks \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"title":"Secure task","description":"Requires auth"}' | jq

Testing the API

Testing is where intermediate developers often level up. You want:

  - Fast, isolated tests that don’t depend on leftover state
  - A dedicated test database, never your development data
  - The ability to override dependencies (like get_db) per test

For simplicity, the example below focuses on API-level tests. In a real project, you’d create a separate test database and override get_db.

Install test dependencies (already installed above)

We’re using pytest and httpx.

Create tests/test_tasks.py:

import pytest
from httpx import ASGITransport, AsyncClient
from app.main import create_app

@pytest.mark.anyio  # uses the anyio pytest plugin (installed with httpx)
async def test_docs_available():
    app = create_app()
    # httpx 0.27 removed the app= shortcut; wrap the app in ASGITransport
    # so requests go straight to the ASGI app without a network socket.
    transport = ASGITransport(app=app)
    async with AsyncClient(transport=transport, base_url="http://test") as ac:
        r = await ac.get("/docs")
        assert r.status_code == 200

This verifies the app boots. To test DB-backed endpoints, you should:

  1. Spin up a test Postgres (Docker in CI) or use SQLite for tests.
  2. Override get_db dependency to point to a test session.
  3. Run migrations or create schema.

A practical approach is to run Postgres in Docker and use a separate DATABASE_URL for tests:

docker run --name tasks-postgres-test -e POSTGRES_PASSWORD=postgres -e POSTGRES_DB=tasks_test \
  -p 5433:5432 -d postgres:16

Then set:

export DATABASE_URL="postgresql+psycopg://postgres:postgres@localhost:5433/tasks_test"

And run migrations before tests:

alembic upgrade head
pytest -q

To fully automate this, you’d add fixtures that:

  - Create the schema (or run migrations) before the test session and drop it after
  - Override the get_db dependency to hand out sessions bound to the test database
  - Yield a ready-to-use HTTP client to each test

That’s beyond a minimal snippet, but the pattern is consistent: override dependencies and control your DB lifecycle.


Observability: Logging and Health Checks

Logging

Create app/core/logging.py:

import logging

def configure_logging() -> None:
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(name)s %(message)s",
    )

For production, consider structured JSON logs and include request IDs. FastAPI middleware can inject correlation IDs, but keep it simple until you need it.

Health endpoint

Add to app/main.py:

from fastapi import FastAPI

# inside create_app()
@app.get("/health")
def health():
    return {"status": "ok"}

In real deployments, you might add a DB check, but be careful: a “health” endpoint used by load balancers should be fast and reliable. Often you split:

  - /health (liveness): no dependencies, answers “is the process up?”
  - /ready (readiness): checks dependencies such as the database, answers “is it safe to route traffic here?”

Deployment Notes (Docker + Uvicorn)

A minimal Dockerfile:

FROM python:3.11-slim

WORKDIR /app

COPY . /app

RUN pip install --no-cache-dir fastapi uvicorn sqlalchemy psycopg alembic pydantic-settings \
    python-jose "passlib[bcrypt]" python-multipart email-validator

EXPOSE 8000

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]

Build and run:

docker build -t tasks-api:latest .
docker run --rm -p 8000:8000 \
  -e DATABASE_URL="postgresql+psycopg://postgres:postgres@host.docker.internal:5432/tasks" \
  -e JWT_SECRET_KEY="replace-in-prod" \
  tasks-api:latest

Notes:

  - Run alembic upgrade head before starting the app (as an entrypoint step or a deploy job), so the schema matches the code.
  - host.docker.internal works on Docker Desktop; on Linux, use a shared Compose network or the host’s IP instead.
  - For real builds, pin dependencies in a requirements.txt or pyproject.toml and COPY that file before the source, so the install layer caches.
  - For more throughput, run multiple workers (uvicorn --workers N, or gunicorn with uvicorn workers).


Next Steps

To make this API genuinely production-ready, consider implementing:

  1. Task ownership: add user_id to tasks, restrict access to own resources.
  2. Keyset pagination: replace offset/limit for large datasets.
  3. Idempotency keys for POST endpoints that may be retried.
  4. Rate limiting (e.g., via a reverse proxy or middleware).
  5. OpenAPI-driven client generation: FastAPI docs enable generating typed clients.
  6. Better error envelopes: include code, detail, and fields for validation-like errors.
  7. CI pipeline: run pytest, ruff, mypy, and migrations in CI.
  8. Background tasks: use Celery/RQ or FastAPI background tasks for async jobs.

Recap

You built a REST API with:

  - A layered structure (routes → services → models) that grows with the codebase
  - Pydantic schemas for validation and serialization
  - SQLAlchemy 2.x models and Alembic migrations on PostgreSQL
  - JWT authentication with bcrypt-hashed passwords
  - Filtering and bounded pagination
  - Consistent error handling, a basic test setup, logging, and a Docker deployment path

Natural extensions from here: task ownership (multi-tenant security), a full test-database fixture setup, and a Docker Compose configuration for the API, Postgres, and migrations.