CI/CD Local Validation Guide¶
Date: 2026-01-08 | Related: ULTRATHINK Analysis | Status: Ready for Use
Overview¶
This guide explains how to validate the optimized CI/CD workflow locally before pushing to GitHub. Use these commands to ensure your changes pass CI before creating a PR.
Prerequisites¶
1. Install act (GitHub Actions Runner)¶
act runs GitHub Actions workflows locally using Docker.
# macOS
brew install act
# Linux
curl https://raw.githubusercontent.com/nektos/act/master/install.sh | sudo bash
# Windows (via Chocolatey)
choco install act-cli
Official Source: https://nektosact.com/
2. Install Test Dependencies¶
Installing the project's development dependencies (`pip install -e ".[dev]"`, as in Next Steps) provides:
- pytest-xdist - Parallel test execution
- pytest-rerunfailures - Flaky test retry
- pytest-timeout - Prevent hanging tests
- pytest-cov - Coverage reporting
- mutmut - Mutation testing
Local Testing Commands¶
1. Run Tests with pytest-xdist (Parallel Execution)¶
Replicates CI parallel execution:
cd backend
# Run all unit tests in parallel (auto-detects CPUs)
pytest tests/unit -n auto --dist loadscope --cov=app --cov-report=term-missing
# Run all integration tests in parallel
pytest tests/integration -n auto --dist loadscope --cov=app --cov-report=term-missing
# Run specific test files in parallel
pytest tests/unit/core/test_jwt_tokens.py tests/unit/core/test_security_hashing.py \
-n auto --dist loadscope
Key Options:
- -n auto: Use all available CPU cores
- --dist loadscope: Isolate tests by class (avoid DB lock contention)
- --maxfail=10: Stop after 10 failures (fail fast)
- --reruns 3: Retry flaky tests automatically
2. Run Tests with Flaky Test Retry¶
cd backend
# Retry failed tests up to 3 times
pytest tests/unit/core -n auto --reruns 3 --cov=app
# Only retry on specific failures (e.g., AssertionError)
pytest tests/unit/core -n auto --reruns 3 --only-rerun AssertionError
# Add a 1-second delay between retry attempts
pytest tests/unit/core -n auto --reruns 3 --reruns-delay 1
3. Run Tests with Timeout¶
cd backend
# Timeout each test after 300 seconds (5 minutes)
pytest tests/unit -n auto --timeout=300 --cov=app
# Timeout per test (10 seconds)
pytest tests/unit/core/test_jwt_tokens.py --timeout=10
4. Run Mutation Testing¶
cd backend
# Run mutation testing on JWT module
mutmut run --paths-to-mutate=app/core/jwt.py \
--runner="pytest -q tests/unit/core/test_jwt_tokens.py"
# Run mutation testing on security module
mutmut run --paths-to-mutate=app/core/security.py \
--runner="pytest -q tests/unit/core/test_security_hashing.py"
# View mutation results
mutmut results
# Generate HTML report
mutmut html
5. Run Full CI Test Suite Locally¶
cd backend
# Replicate full CI test suite (unit + integration)
pytest tests/unit tests/integration \
-n auto \
--dist loadscope \
--maxfail=10 \
--reruns 3 \
--timeout=300 \
--cov=app \
--cov-report=term-missing:skip-covered \
--cov-report=json \
--cov-report=xml \
--json-report \
--json-report-file=test-results.json \
-m "not load"
Testing with act (GitHub Actions Locally)¶
Run Full Workflow¶
# Run the entire ci-backend workflow
act -j test
# Run with secrets
act -j test \
--secret CI_JWT_SECRET_KEY="your-test-secret" \
--secret CI_ENCRYPTION_KEY="your-test-encryption-key"
Run Specific Job¶
# Run only the lint job
act -j lint
# Run only the unit test job
act -j test --matrix test-suite:unit
# Run only the integration test job
act -j test --matrix test-suite:integration
# Run mutation testing
act -j mutation
Run with Custom Python Version¶
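act forwards matrix selections with `--matrix`, the same mechanism used for `test-suite` above; the `python-version` key below is an assumption about this workflow's matrix, so adjust it to whatever key the workflow actually defines. The `-P` alternative swaps the runner image instead.

```shell
# Select a Python version via the workflow's matrix
# (assumes the matrix key is named python-version)
act -j test --matrix python-version:3.12

# Or pick a runner image that ships the toolchain you need
act -j test -P ubuntu-latest=catthehacker/ubuntu:act-latest
```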
Dry Run (Show Workflow Without Executing)¶
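act can list or plan jobs without executing any steps, which is a quick way to verify the workflow parses:

```shell
# List the jobs act detects in .github/workflows/
act -l

# Dry run: print the execution plan without running any steps
act -j test -n
```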
Validation Checklist¶
Before pushing to GitHub, ensure:
- Lint passes: `ruff check .` or `make lint`
- Type check passes: `mypy app` (non-blocking until strict typing is achieved)
- Unit tests pass: `pytest tests/unit -n auto --cov=app`
- Integration tests pass: `pytest tests/integration -n auto --cov=app`
- Coverage threshold met: 85% overall, 95% core
- Mutation tests pass (if applicable): `mutmut run`
- No warnings or deprecations: check pytest output
Performance Testing¶
Measure Test Runtime¶
cd backend
# Time the full test suite
time pytest tests/unit tests/integration -n auto --cov=app
# Expected results (local, 8-core CPU):
# - Unit tests: ~30-60 seconds
# - Integration tests: ~60-120 seconds
# - Total: ~2-3 minutes (vs ~6-8 minutes on CI with 2 cores)
Profile Slowest Tests¶
cd backend
# Show 10 slowest tests
pytest tests/unit --durations=10
# Profile with pytest-profiling (if installed)
pip install pytest-profiling
pytest tests/unit --profile
Debugging Failed Tests¶
Run Tests in Single Process (No Parallelization)¶
cd backend
# Useful for debugging test isolation issues
pytest tests/unit/core/test_jwt_tokens.py -v --pdb
Run with Verbose Output¶
cd backend
# Show test output
pytest tests/unit/core -v -s
# Show captured logs
pytest tests/unit/core -v --log-cli-level=DEBUG
Run Specific Test¶
cd backend
# Run specific test function
pytest tests/unit/core/test_jwt_tokens.py::test_create_access_token -v
# Run specific test class
pytest tests/unit/core/test_jwt_tokens.py::TestJWTTokenCreation -v
Enter PDB on Failure¶
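A sketch of dropping into the debugger at the point of failure; note that `--pdb` must be run without `-n`, since pdb cannot attach through xdist worker processes.

```shell
cd backend
# Stop at the first failure (-x) and open pdb at the failing frame (--pdb)
pytest tests/unit/core -x --pdb
```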
CI-Only Features¶
These features only work in GitHub Actions (not locally with act):
- Service containers: Redis, PostgreSQL, RabbitMQ
  - Local alternative: use `make up` to start services
- Artifact uploads: test results, coverage reports
  - Local alternative: files saved to the `backend/` directory
- Secrets management: `CI_JWT_SECRET_KEY`, `CI_ENCRYPTION_KEY`
  - Local alternative: set in `backend/.env.backend.local`
- Parallel job execution: matrix strategy
  - Local alternative: run `act -j test` sequentially
Common Issues and Solutions¶
Issue 1: pytest-xdist Import Errors¶
Symptom: ImportError: No module named 'pytest_xdist', or pytest: error: unrecognized arguments: -n
Solution:
Issue 2: Tests Pass Locally But Fail in CI¶
Possible Causes:
- Race conditions (parallel execution reveals them)
- Environment differences (Python version, dependencies)
- Missing environment variables or secrets
Debug Steps:
1. Run tests with -n auto locally to simulate parallel execution
2. Check dependency versions match CI
3. Verify all required environment variables are set
Issue 3: Coverage Reports Don't Combine¶
Symptom: Coverage shows 0% or only partial coverage
Solution:
cd backend
# Remove old coverage data
rm -f .coverage*
# Run tests again
pytest tests/unit tests/integration -n auto --cov=app
# Combine coverage data (if multiple files)
coverage combine
coverage report
Issue 4: Mutation Testing Takes Too Long¶
Symptom: mutmut run takes 10+ minutes
Solution:
- Only run mutation testing on security-critical modules
- Use --runner to specify minimal test suite
- Consider reducing mutation targets
Issue 5: Flaky Tests¶
Symptom: Tests fail intermittently
Solution:
cd backend
# Identify flaky tests
pytest tests/unit -n auto --reruns 3
# Common causes:
# - Missing fixtures (use `autouse=True`)
# - Test isolation issues (use `--dist loadscope`)
# - Time-dependent tests (use freezegun or similar)
# - Race conditions (use proper async/await)
Performance Comparison: Before vs After¶
| Metric | Before | After | Improvement |
|---|---|---|---|
| PR validation time | 15-20 min | 6-8 min | 60% faster |
| Main branch time | 20-25 min | 12-15 min | 40% faster |
| pytest runs | 2 (duplicate) | 1 (single) | 50% reduction |
| Mutation frequency | Every run | Main branch only | 80% reduction |
| Flaky test handling | Manual retry | Auto-retry (3x) | 100% automated |
| Test parallelization | None | pytest-xdist (2 cores) | 1.8x speedup |
Next Steps¶
- Install `act` (if not already installed)
- Update dependencies: `pip install -e ".[dev]"`
- Run local validation: Use commands in this guide
- Push to feature branch: Create PR to test workflow
- Monitor CI execution: Check GitHub Actions tab
- Iterate: Fix issues and re-validate locally
Official Sources¶
GitHub Actions¶
- GitHub Actions documentation: https://docs.github.com/en/actions
pytest Plugins¶
- pytest-xdist: https://pytest-xdist.readthedocs.io/
- pytest-rerunfailures: https://github.com/pytest-dev/pytest-rerunfailures
- pytest-timeout: https://github.com/pytest-dev/pytest-timeout
- pytest-cov: https://pytest-cov.readthedocs.io/
- mutmut: https://mutmut.readthedocs.io/
Local Testing with act¶
- act User Guide: https://nektosact.com/
- act repository: https://github.com/nektos/act
Document Version: 1.0 | Last Updated: 2026-01-08 | Author: DevOps Automation Architect (Claude Code)