Backend Integration Testing

Testing with real dependencies to verify system behavior and prevent regressions

Related Concepts: Unit Testing | Clean Architecture | Repository Pattern | Acceptance Testing

Table of Contents

  1. The Role of Integration Tests
  2. Integration Testing at Synapse
  3. Core Principles
  4. Types of Integration Tests
  5. Database Testing Philosophy
  6. Test Independence and Isolation
  7. The API Arrangement Pattern
  8. Performance Trade-offs
  9. Best Practices
  10. Summary

The Role of Integration Tests

Why Integration Tests Matter

Integration tests occupy a unique position in our testing strategy. While unit tests verify that our pure business logic works correctly in isolation, integration tests verify that our system actually works when all the pieces come together.

Think of it this way: Unit tests confirm that each gear in your machine is properly formed. Integration tests confirm that when you connect those gears, the machine actually runs.

The Integration Testing Sweet Spot

Integration tests provide exceptional value because they:

  • Catch real bugs: Database constraints, transaction issues, and query problems that unit tests can't see
  • Verify component interaction: Ensure that your carefully designed components actually work together
  • Prevent regressions: Changes that break existing functionality are caught before production
  • Test realistic scenarios: Use real databases with real constraints and real behavior
  • Avoid black-box complexity: More focused than E2E tests while still testing real behavior

Complementing Unit Tests

Integration tests aren't a replacement for unit tests—they're a critical complement:

Unit Tests:
- Test pure business logic
- Fast feedback (milliseconds)
- High volume, fine-grained
- Catch logic errors

Integration Tests:
- Test with real dependencies
- Slower feedback (seconds)
- Lower volume, coarse-grained
- Catch integration errors

A healthy test suite has both: many fast unit tests for rapid feedback on business logic, and thorough integration tests for confidence that the system actually works.

Integration Testing at Synapse

Our Integration Test Philosophy

At Synapse, integration tests have a specific meaning: tests that use a real PostgreSQL database. This isn't about mocking or stubbing—it's about testing with actual dependencies to verify real behavior.

The distinction is simple but important:

  • Unit tests: Test pure domain logic without dependencies
  • Integration tests: Test with real PostgreSQL database

This clear boundary helps us choose the right test type for each scenario and maintains consistency across our projects.

Why Real Dependencies Matter

Using real PostgreSQL in our tests isn't just about being thorough—it's about catching the bugs that actually happen in production:

  • Database constraints: Foreign keys, unique indexes, and check constraints
  • Transaction behavior: Isolation levels, deadlocks, and rollback scenarios
  • Query performance: N+1 queries, missing indexes, and inefficient joins
  • Data integrity: Referential integrity and cascade operations
  • Concurrency issues: Race conditions and optimistic locking failures

These are the issues that cause production incidents. Unit tests can't catch them. Integration tests with real databases can.

Core Principles

Test Independence

Every integration test must be completely independent. This means:

  • Tests can run in any order
  • Tests can run in parallel
  • Tests don't share state
  • Tests don't depend on previous test results

Independence is non-negotiable because it ensures reliable, maintainable tests that don't create cascading failures.

State Initialization, Not Cleanup

A fundamental principle at Synapse: tests should ensure correct state at the beginning, not clean up afterward.

Why? Because:

  • The test itself knows what state it needs in order to run
  • Resetting the whole database to a clean state after every test is slow, and usually unnecessary
  • Wiping the database to an empty state after every test makes parallel execution impossible
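One way to apply this principle is an idempotent `ensure_*` helper: the test declares the state it needs up front and never tears anything down. The dict-backed `accounts` store is a stand-in so the sketch runs anywhere; in practice the helper would upsert rows in the PostgreSQL test database.

```python
# Hypothetical in-memory stand-in for the database, so the sketch runs
# anywhere; in practice this would be the PostgreSQL test instance.
accounts = {}

def ensure_account(account_id, balance=0):
    """Initialize required state at test start (an idempotent upsert).

    The test declares the state it needs; no teardown is required, and
    leftover rows from other tests are irrelevant.
    """
    accounts[account_id] = {"id": account_id, "balance": balance}
    return accounts[account_id]

def test_deposit():
    acct = ensure_account("acct-42", balance=100)  # arrange: known state
    acct["balance"] += 50                          # act
    assert accounts["acct-42"]["balance"] == 150   # assert

test_deposit()
test_deposit()  # safe to re-run: state is re-initialized, never cleaned up
```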

Focus on Behavior, Not Implementation

Integration tests verify what the system does, not how it does it. They test:

  • API contracts are honored
  • Business rules are enforced
  • Data persists correctly
  • Errors are handled appropriately
  • Side effects occur as expected

The implementation can change—different query strategies, caching layers, or internal refactoring—as long as the behavior remains correct.

Types of Integration Tests

Route Tests

Route tests are our primary integration tests. They test the full API request/response cycle with a real database, verifying:

  • Request validation and parsing
  • Authentication and authorization
  • Business logic execution
  • Database interactions
  • Response formatting
  • Error handling

These tests provide the most confidence because they exercise the entire stack exactly as production requests would.
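The shape of a route test can be sketched with nothing but the standard library: boot the application, send a real HTTP request, and assert on the response. The tiny handler below is illustrative; a real route test would start the actual application wired to the PostgreSQL test database.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal illustrative app; a real route test would boot the actual
# application connected to the PostgreSQL test database.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The test exercises the full request/response cycle over real HTTP,
# exactly as a production client would.
url = f"http://127.0.0.1:{server.server_port}/health"
with urllib.request.urlopen(url) as resp:
    status = resp.status
    payload = json.load(resp)

server.shutdown()
assert status == 200
assert payload == {"status": "ok"}
```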

Repository Tests

Repositories are our data access layer, and they need integration tests because:

  • Unit testing database queries without a database is meaningless
  • Complex queries need verification against real data
  • Transaction behavior must be tested with actual transactions
  • Database-specific features (arrays, JSON, full-text search) need real testing

Repository tests verify that our data access patterns work correctly with PostgreSQL's specific behavior and constraints.

Database Testing Philosophy

Always Use Real PostgreSQL

We never use in-memory databases, SQLite, or H2 for testing. Why?

  • Different SQL dialects: What works in H2 might fail in PostgreSQL
  • Different constraint behavior: Foreign keys, triggers, and checks behave differently
  • Missing features: Arrays, JSON operations, CTEs might not be supported
  • False confidence: Tests pass but production fails
  • It's easy: Running a real PostgreSQL instance in Docker is cheap and simple

The small speed gain from in-memory databases isn't worth the risk of missing production bugs.

Test Data Management

Test data should be:

  • Minimal: Only create what's needed for the test
  • Explicit: Clear about what data exists and why
  • Isolated: Not shared between tests
  • Representative: Similar to production data patterns

Good test data management makes tests faster, clearer, and more maintainable.
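A builder function is one way to get minimal, explicit test data: sensible defaults, unique identifiers for isolation, and overrides only for the fields a test actually cares about. The `make_order` helper and its fields are hypothetical, shown purely to illustrate the pattern.

```python
import uuid

def make_order(**overrides):
    """Build minimal, explicit test data; callers override only what matters."""
    order = {
        "id": str(uuid.uuid4()),  # unique per test: safe for parallel runs
        "status": "pending",
        "total_cents": 1000,
    }
    order.update(overrides)
    return order

# Each test states only the fields relevant to its behavior.
cancelled = make_order(status="cancelled")
assert cancelled["status"] == "cancelled"
assert cancelled["total_cents"] == 1000  # sensible default, not shared state
```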

Test Independence and Isolation

Why Independence Matters

Test independence isn't just a nice-to-have—it's essential for:

  • Parallel execution: Run tests simultaneously for speed
  • Debugging: Failed tests don't cascade to others
  • Maintenance: Change one test without breaking others
  • Reliability: Consistent results regardless of execution order

Achieving Isolation

True test isolation requires:

  • State isolation: No shared variables or singletons
  • Time isolation: Tests control their own time when needed
  • External service isolation: Mocked or sandboxed external dependencies

Each test should be a self-contained experiment that proves a specific behavior.

The API Arrangement Pattern

Using Public APIs for Test Setup

A critical principle: when arranging test data, use your application's public API rather than direct database manipulation. This ensures:

  • Validation runs: Business rules and constraints are enforced
  • Side effects occur: Audit logs, events, and calculations happen
  • Realistic state: Data looks like it would in production
  • Refactoring safety: Internal changes don't break test setup

Why Not Direct Database Setup?

Direct database manipulation in tests:

  • Bypasses business logic
  • Creates unrealistic data states
  • Couples tests to schema details
  • Misses important side effects
  • Makes tests brittle to refactoring

The API arrangement pattern keeps tests focused on behavior rather than implementation.
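The contrast can be sketched in a few lines. Both `api_create_user` and `db_insert_user` are illustrative stand-ins (a dict plays the role of the database): the API path runs validation, normalization, and side effects, while the direct write bypasses all of them and produces a state production could never create.

```python
# Illustrative stand-ins: `db` plays the role of the database table and
# `api_create_user` the application's public API.
db = {"users": []}

def api_create_user(email):
    """Arrange via the public API: validation and side effects run."""
    if "@" not in email:
        raise ValueError("invalid email")
    # Normalization and an audit entry, as production would produce.
    user = {"email": email.lower(), "audit": ["user.created"]}
    db["users"].append(user)
    return user

def db_insert_user(email):
    """Arrange via direct DB writes: business logic is bypassed."""
    db["users"].append({"email": email})

# Arranging through the API yields realistic, production-shaped state...
api_user = api_create_user("Ada@Example.com")
assert api_user["email"] == "ada@example.com"
assert api_user["audit"] == ["user.created"]

# ...while direct insertion silently accepts data validation would reject.
db_insert_user("not-an-email")
assert db["users"][-1] == {"email": "not-an-email"}
```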

Performance Trade-offs

Speed vs Confidence

Integration tests are slower than unit tests—typically 50-100x slower. But they provide confidence that unit tests can't:

Unit Test:        ~5ms    - Tests if calculation is correct
Integration Test: ~200ms  - Tests if system actually works
E2E Test:         ~5000ms - Tests if user can complete task

The question isn't "how fast?" but "how much confidence per second?" Integration tests often win this calculation.

Optimization Without Compromise

You can make integration tests faster without sacrificing realism:

  • Parallel execution: Run independent tests simultaneously
  • Shared connections: Reuse database connections across tests
  • Transaction rollback: When appropriate for the test scenario
  • Minimal data: Create only what's necessary
  • Smart scheduling: Run critical tests first

The goal is reasonable speed while maintaining test integrity.
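The payoff of parallel execution can be sketched directly: four independent "tests" that each sleep to simulate database round-trips finish in roughly the time of one when run concurrently. The `slow_test` function is a hypothetical stand-in for an isolated integration test.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_test(name):
    """Stand-in for an independent integration test (owns its own rows)."""
    time.sleep(0.2)  # simulates database round-trips
    return f"{name}: passed"

tests = [f"test_{i}" for i in range(4)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(slow_test, tests))
elapsed = time.perf_counter() - start

# Four 0.2s tests complete in roughly 0.2s rather than 0.8s, because
# they share no state and can safely run at the same time.
assert all(r.endswith("passed") for r in results)
assert elapsed < 0.6
```

This only works because of the independence rules above: tests that share state cannot be parallelized safely.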

Best Practices

DO ✅

  • Use real PostgreSQL for all database tests
  • Ensure test independence through proper isolation
  • Initialize state at test start rather than cleanup after
  • Use API for test arrangement to ensure realistic setup
  • Test error scenarios including constraints and concurrency
  • Verify side effects to ensure all changes occurred
  • Keep tests focused on single behaviors or scenarios
  • Make tests readable with clear arrangement, action, and assertion

DON'T ❌

  • Don't mock the database - use real PostgreSQL
  • Don't manipulate database directly for test setup
  • Don't rely on test execution order
  • Don't clean up after every test - initialize instead
  • Don't use different databases for testing (SQLite, H2)
  • Don't ignore flaky tests - fix the root cause
  • Don't share test data between tests
  • Don't skip integration tests to save time

Summary

Integration tests at Synapse provide unique and essential value to our testing strategy:

  1. Real confidence: Testing with real PostgreSQL catches real bugs
  2. Regression prevention: Changes that break existing functionality are caught
  3. Behavioral verification: Tests focus on what the system does, not how
  4. Practical approach: More realistic than unit tests, more focused than E2E tests

Remember:

  • Unit tests verify your logic is correct
  • Integration tests verify your system actually works
  • Both are essential for a robust test suite

The key is balance: use unit tests for rapid feedback on business logic, and integration tests for confidence that components work together correctly with real dependencies.

Integration tests might be slower, but they catch the bugs that matter—the ones that break production. That makes them invaluable.


For implementation details, see our framework-specific testing guides. For pure logic testing strategies, see our Unit Testing guide. For full user journey testing, see our Acceptance Testing guide.