TL;DR
I wanted the same thing again: keep unit tests fast, but let a subset of tests run against a real Postgres database with migrations applied automatically, similar to Django's default experience. In my monorepo PoC, I wired Prisma to support that workflow.
Note: This post describes a proof of concept. The core idea works, but I'll be explicit about what's still missing for a production-grade testing setup.
Django's test experience is integrated: test runner + ORM + migrations + DB lifecycle all come together. In the TS ecosystem, Prisma is “just” the ORM layer; you still need to decide:
- How to provision the DB
- When migrations run
- How to reset data
- How to prevent parallel tests from stepping on each other
So I built a pragmatic approach inside my monorepo PoC.
Repo: abel-castro/blog-monorepo (NestJS API + frontend in one workspace). 
The approach (conceptually)
My Prisma integration-testing recipe is:
- Provide a real Postgres for tests
  - easiest path: docker-compose (local + CI)
- Use a dedicated DATABASE_URL for integration tests
  - never point integration tests at dev/prod DBs
  - ideally a separate DB name or schema
- Ensure the schema is up to date before tests
  - run migrations programmatically, or via a pre-test command
  - be consistent: either “migrate” (schema history) or “db push” (fast, but less realistic)
- Reset data deterministically
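To make the first three steps concrete, here is a minimal sketch of a Jest globalSetup. The file name, port, and database name are my assumptions for illustration, not necessarily the repo's actual wiring:

```typescript
// Hypothetical test/global-setup.ts, wired via `globalSetup` in the Jest config.
import { execSync } from "node:child_process";

// A dedicated test-database URL: never the dev or prod one.
// Host, port, and DB name here are illustrative assumptions.
export function testDatabaseUrl(
  base = "postgresql://postgres:postgres@localhost:5433",
): string {
  return `${base}/blog_test`;
}

export default async function globalSetup(): Promise<void> {
  // Apply all committed migrations so the schema is current before any
  // spec runs, mirroring Django's automatic test-database setup.
  execSync("npx prisma migrate deploy", {
    env: { ...process.env, DATABASE_URL: testDatabaseUrl() },
    stdio: "inherit",
  });
}
```

Using `prisma migrate deploy` (rather than `db push`) keeps the test schema on the same migration history as production, which is the "be consistent" point above.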
Why Prisma feels different than Django here
Prisma is great, but it's intentionally not a full “framework”.
- Prisma doesn't know your test runner (Jest/Vitest/node:test).
- Prisma doesn't own your DI container (NestJS does).
- Prisma migrations are a separate workflow (CLI / migration engine).
- There's no universally “right” cleanup strategy.
So: you build a thin layer that standardizes your project's conventions.
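In my case that thin layer is little more than a shared helper module. A sketch; the helper and table names are illustrative, not taken from the repo, and the client is typed structurally so the snippet stays dependency-free (in the real project it would be a PrismaClient):

```typescript
// Minimal shape of the Prisma client the helper needs.
type RawExecutor = { $executeRawUnsafe(sql: string): Promise<number> };

// Build one TRUNCATE statement covering every table, so the reset is
// atomic and identity sequences restart (deterministic IDs per test).
export function truncateSql(tables: string[]): string {
  const quoted = tables.map((t) => `"${t}"`).join(", ");
  return `TRUNCATE TABLE ${quoted} RESTART IDENTITY CASCADE;`;
}

// Called from a beforeEach hook in integration specs.
export async function resetDb(
  client: RawExecutor,
  tables: string[],
): Promise<void> {
  // "Post", "Tag", etc. would be passed in from one central list.
  await client.$executeRawUnsafe(truncateSql(tables));
}
```

A single TRUNCATE with RESTART IDENTITY CASCADE keeps resets deterministic: IDs start at 1 again and foreign-key ordering between tables stops mattering.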
What my PoC focuses on
This PoC focuses on the “Django feeling” of:
- “I can write tests that hit a real DB”
- “DB schema is ready when tests run”
- “Unit tests remain fast and simple”
In other words: integration tests become easy to add.
Good points of this direction
- More realistic tests for anything query-heavy:
  - filters, ordering, pagination
  - relations, constraints, unique indexes
  - migration-related behavior
- Less mocking around the ORM boundary
- Higher confidence refactors (especially around query logic)
Current trade-offs
Even if the PoC works, there are predictable pain points:
- Speed
  - DB startup + migrations cost time.
  - Resetting DB between tests can dominate runtime if done naively.
- Isolation
  - The best isolation approaches add complexity (transactions, schemas, per-worker DBs).
  - The easiest approaches (shared DB, truncate) can allow ordering-dependent tests if you're sloppy.
- Parallel tests
  - Test workers + one DB can cause flakiness unless you isolate per worker.
- Developer friction
  - “Run DB before integration tests” has to be one command, or people won't use it.
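For the parallel-tests pain point specifically, a common pattern is to derive one database per worker. A sketch assuming Jest, which sets JEST_WORKER_ID for each worker process:

```typescript
// Derive a per-worker database URL so parallel test workers never share state.
// JEST_WORKER_ID is set by Jest itself; the fallback covers non-Jest runs.
export function workerDatabaseUrl(
  baseUrl: string,
  workerId: string = process.env.JEST_WORKER_ID ?? "1",
): string {
  const url = new URL(baseUrl);
  url.pathname = `${url.pathname}_worker_${workerId}`; // e.g. /blog_test_worker_3
  return url.toString();
}
```

Each derived database still has to be created and migrated before its worker runs; that is exactly the extra complexity the isolation bullet above warns about.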
What I'd implement next (turning PoC into a solid template)
If you want this to feel as boring as Django:
- A single command, something like:
  - pnpm test (unit only, no DB)
  - pnpm test:e2e (starts DB, migrates, runs integration specs, tears down)
- A robust reset strategy
- Per-worker DB isolation (parallel-safe)
  - create DB/schema names based on worker id
  - avoid cross-test collisions
- CI hardening
  - healthchecks
  - deterministic order
  - run integration tests in a separate job/stage
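The single-command idea could be wired through a small script. Everything below (file name, compose service name, config paths) is a sketch of one possible setup, not the repo's actual scripts:

```typescript
// Hypothetical scripts/test-e2e.ts, run via `pnpm test:e2e`.
import { execSync } from "node:child_process";

// Ordered steps: start the DB, apply migrations, run the integration specs.
export const steps: string[] = [
  "docker compose up -d --wait postgres-test",
  "npx prisma migrate deploy",
  "jest --config jest.e2e.config.ts --runInBand",
];

export function run(): void {
  try {
    for (const cmd of steps) execSync(cmd, { stdio: "inherit" });
  } finally {
    // Tear down even when a step fails, so reruns start clean.
    execSync("docker compose down -v", { stdio: "inherit" });
  }
}

// Guarded so importing this module has no side effects.
if (process.env.RUN_E2E === "1") run();
```

Note that `docker compose up --wait` blocks until the service's healthcheck passes, which also covers the healthcheck bullet under CI hardening.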
Closing
Django sets an expectation: “integration tests with a real DB should be trivial.” With Prisma + NestJS, that experience is absolutely achievable, but you have to assemble the pieces intentionally.
Repo PoC: abel-castro/blog-monorepo.