Local Development
Nice, this is a good moment to lock this down properly.
Below is a full, concrete local development guide plus all the Docker-related files you need so you can literally paste them into your repo and start wiring things up.
I’ll assume your repo root is mbpanel/.
1. Prerequisites¶
On your local machine you need:
- Docker (Docker Desktop on Windows/macOS, or Docker Engine on Linux)
- Docker Compose v2 (comes with recent Docker Desktop / Engine)
You do not need local Python/Node to run the app if you always use Docker.
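To confirm both are available:

```bash
docker --version
docker compose version
```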
2. Final directory layout (with Docker files)¶
Target layout:
```
mbpanel/
├── docker-compose.yml
├── .env.local                     # root env for local dev
├── backend/
│   ├── Dockerfile
│   ├── .dockerignore
│   ├── app/
│   │   └── ... your FastAPI code ...
│   ├── requirements.txt           # or pyproject.toml if you prefer
│   └── .env.backend.local
└── frontend/
    ├── Dockerfile
    ├── .dockerignore
    ├── package.json
    ├── package-lock.json / pnpm-lock.yaml / yarn.lock
    ├── src/
    │   └── ... your Next.js code ...
    └── .env.frontend.local
```
We’ll fill in each of those files now.
3. Root: .env.local¶
This file defines shared environment variables used by Docker Compose:
mbpanel/.env.local
```
# Which logical environment we're in
APP_ENV=local

# Host ports to expose
BACKEND_PORT=8000
FRONTEND_PORT=3000

# Postgres credentials (used by db container + backend)
DB_NAME=mbpanel_local
DB_USER=mbpanel_local
DB_PASSWORD=local_password

# Paths to service-specific env files
BACKEND_ENV_FILE=backend/.env.backend.local
FRONTEND_ENV_FILE=frontend/.env.frontend.local
```
You can commit this with safe placeholder creds, or keep it out of git and maintain a .env.local.example.
4. Backend: Dockerfile¶
This is a straightforward FastAPI image using uvicorn. You can later swap to gunicorn+uvicorn workers if you want.
mbpanel/backend/Dockerfile
```dockerfile
# backend/Dockerfile
FROM python:3.12-slim AS base

ENV PYTHONUNBUFFERED=1 \
    PYTHONDONTWRITEBYTECODE=1 \
    PIP_NO_CACHE_DIR=1

WORKDIR /app

# System dependencies (adjust if you need others, e.g. for psycopg)
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential \
    && rm -rf /var/lib/apt/lists/*

# Install Python dependencies
COPY requirements.txt /app/requirements.txt
RUN pip install --upgrade pip && \
    pip install --no-cache-dir -r /app/requirements.txt

# Copy application code
COPY app /app/app

# Environment selector (local/dev/prod) – value comes from env file
ENV APP_ENV=${APP_ENV:-local}

EXPOSE 8000

# Single command for all envs; behaviour controlled by env, not Dockerfile
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```
If you’re using pyproject.toml + uv / Poetry instead of requirements.txt, adjust the install step accordingly.
5. Backend: .dockerignore¶
Keeps the image small and builds fast.
mbpanel/backend/.dockerignore
```
__pycache__/
*.py[cod]
*.pyo
*.pyd
*.sqlite3

.env
.env.*

.git
.gitignore
.vscode
.idea
.mypy_cache
.pytest_cache

# Virtualenvs
venv/
.venv/

# OS crap
.DS_Store
Thumbs.db
```
6. Backend: .env.backend.local¶
This is loaded inside the backend container.
mbpanel/backend/.env.backend.local
```
APP_ENV=local

# Database (matches service name in docker-compose.yml)
DB_HOST=db
DB_PORT=5432
DB_NAME=mbpanel_local
DB_USER=mbpanel_local
DB_PASSWORD=local_password

# Redis (if you use it)
REDIS_HOST=redis
REDIS_PORT=6379

# CORS – allow local frontend
BACKEND_CORS_ORIGINS=http://localhost:3000

# Anything else your FastAPI app needs:
# SECRET_KEY=change_me
# ACCESS_TOKEN_EXPIRE_MINUTES=30
```
Your FastAPI config.py should read these via Pydantic settings.
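A minimal sketch of that config (assuming pydantic-settings v2; the field list simply mirrors the variables above):

```python
# backend/app/config.py (sketch)
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # pydantic-settings matches env vars case-insensitively,
    # so DB_HOST populates db_host, and so on.
    app_env: str = "local"
    db_host: str = "db"
    db_port: int = 5432
    db_name: str = "mbpanel_local"
    db_user: str = "mbpanel_local"
    db_password: str = "local_password"
    backend_cors_origins: str = "http://localhost:3000"


settings = Settings()
```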
7. Frontend: Dockerfile (Next.js)¶
This is a standard multi-stage Next.js Dockerfile that runs next start everywhere.
mbpanel/frontend/Dockerfile
```dockerfile
# syntax=docker/dockerfile:1
# frontend/Dockerfile
# (the syntax directive must be the very first line, before any other comment)

FROM node:22-alpine AS deps
WORKDIR /app

# Copy dependency manifests
COPY package.json ./
# include exactly one of these depending on what you use
COPY package-lock.json* pnpm-lock.yaml* yarn.lock* ./

# Install deps (tries npm -> pnpm -> yarn)
RUN npm ci || pnpm install || yarn install

FROM node:22-alpine AS builder
WORKDIR /app
ENV NEXT_TELEMETRY_DISABLED=1
COPY --from=deps /app/node_modules ./node_modules
COPY . .

# Production build (even for local; same image everywhere)
RUN npm run build

FROM node:22-alpine AS runner
WORKDIR /app
ENV NODE_ENV=production \
    NEXT_TELEMETRY_DISABLED=1

# Copy runtime artifacts (node_modules is needed at runtime for `next start`)
COPY --from=builder /app/public ./public
COPY --from=builder /app/.next ./.next
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json

EXPOSE 3000

CMD ["npm", "run", "start"]
```
This assumes you’re using Next.js App Router with npm run build / npm run start. If you’re using pnpm/yarn, adjust the install/build commands.
8. Frontend: .dockerignore¶
mbpanel/frontend/.dockerignore
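A sensible default, mirroring the backend one plus the usual Next.js build artifacts:

```
node_modules/
.next/
out/
npm-debug.log*

.env
.env.*

.git
.gitignore
.vscode
.idea

# OS files
.DS_Store
Thumbs.db
```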
9. Frontend: .env.frontend.local¶
This will be loaded inside the frontend container.
mbpanel/frontend/.env.frontend.local
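At minimum, the URL the frontend uses to reach the backend (anything else your app reads from the environment goes here too):

```
# Base URL the frontend uses to call the FastAPI backend
NEXT_PUBLIC_API_URL=http://localhost:8000
```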
Your frontend code should read this with process.env.NEXT_PUBLIC_API_URL.
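For example, a small helper along these lines (the file path and function name are just illustrations):

```typescript
// frontend/src/lib/api.ts (illustrative location)
export const API_URL =
  process.env.NEXT_PUBLIC_API_URL ?? "http://localhost:8000";

// Minimal fetch wrapper so components don't hard-code the base URL
export async function apiGet<T>(path: string): Promise<T> {
  const res = await fetch(`${API_URL}${path}`);
  if (!res.ok) {
    throw new Error(`API request failed: ${res.status}`);
  }
  return res.json() as Promise<T>;
}
```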
10. Root: docker-compose.yml¶
Single stack, used for local (and later for dev if you want).
mbpanel/docker-compose.yml
```yaml
version: "3.9"

services:
  db:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_DB: ${DB_NAME}
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - db_data:/var/lib/postgresql/data
    networks:
      - mbpanel

  redis:
    image: redis:7-alpine
    restart: unless-stopped
    networks:
      - mbpanel

  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    restart: unless-stopped
    env_file:
      - ${BACKEND_ENV_FILE}
    depends_on:
      - db
      - redis
    ports:
      - "${BACKEND_PORT}:8000"
    networks:
      - mbpanel

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    restart: unless-stopped
    env_file:
      - ${FRONTEND_ENV_FILE}
    depends_on:
      - backend
    ports:
      - "${FRONTEND_PORT}:3000"
    networks:
      - mbpanel

networks:
  mbpanel:

volumes:
  db_data:
```
How this connects:
- `backend` gets its `env_file` from `${BACKEND_ENV_FILE}` → `backend/.env.backend.local` (because `.env.local` defines it).
- `frontend` gets its `env_file` from `${FRONTEND_ENV_FILE}` → `frontend/.env.frontend.local`.
11. How to run locally (step-by-step)¶
From repo root (mbpanel/):
11.1 One-time setup¶
- Create all the files above (`.env.local`, service-specific envs, Dockerfiles, compose).
- Make sure `requirements.txt` in `backend/` and `package.json` in `frontend/` are valid.
- If you use Alembic migrations, ensure `env.py` and `versions/` exist under `backend/app/infrastructure/database/migrations/` (or wherever you placed them).
11.2 Start the whole stack¶
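Build and start everything, pointing Compose at the root env file so the `${...}` variables resolve:

```bash
docker compose --env-file .env.local up --build
```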
What happens:
- `db` (Postgres) starts
- `redis` starts
- `backend` image builds and starts on `localhost:8000`
- `frontend` image builds and starts on `localhost:3000`
Logs stream in your terminal.
To run detached:
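```bash
docker compose --env-file .env.local up -d
```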
11.3 Stop everything¶
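```bash
docker compose --env-file .env.local down
# add -v if you also want to wipe the Postgres volume (db_data)
```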
12. Initial dev tasks (DB + seed)¶
You already planned scripts under backend/scripts/. With containers up:
12.1 Run DB migrations (if you use Alembic)¶
From repo root:
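```bash
# assumes Alembic is configured inside the backend image (see note below)
docker compose --env-file .env.local exec backend alembic upgrade head
```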
(adjust the command if you organize Alembic differently).
12.2 Seed dev data¶
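```bash
# module path is illustrative; point it at your actual seed script
docker compose --env-file .env.local exec backend python -m scripts.seed_dev_data
```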
(or whatever the actual module path is; for example python -m app.scripts.seed_dev_data if you put scripts inside app/).
13. Quick sanity checks¶
- Open http://localhost:3000 → Next.js UI loads.
- Open http://localhost:8000/docs → FastAPI docs appear.
- From inside the `backend` container:

```bash
docker compose --env-file .env.local exec backend python -c "import os; print(os.getenv('DB_HOST'))"
```

Should print `db`.
- From inside the `frontend` container:
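```bash
# prints the API URL the container sees
docker compose --env-file .env.local exec frontend node -e "console.log(process.env.NEXT_PUBLIC_API_URL)"
```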
Should print `http://localhost:8000`.
If those work, your local Docker setup is wired correctly and matches the decoupled front/back structure we designed.
If you want, after this we can bolt on a tiny Makefile to wrap the docker compose commands, so your standard local flow is literally something like:
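```bash
make up     # wraps: docker compose --env-file .env.local up --build
make down   # wraps: docker compose --env-file .env.local down
```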