
Answer by Jeff Bowman for Are (database) integration tests bad?

Write the smallest useful test you can. For this particular case, an in-memory database might help with that.
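
For instance, here is a minimal sketch of what that smallest useful test could look like, using Python's built-in sqlite3 module as the in-memory engine; the find_overdue function, table, and column names are invented for illustration:

    import sqlite3
    import unittest

    def find_overdue(conn, cutoff):
        """Hypothetical query under test: ids of invoices due before cutoff."""
        rows = conn.execute(
            "SELECT id FROM invoices WHERE due_date < ? ORDER BY id", (cutoff,)
        ).fetchall()
        return [row[0] for row in rows]

    class FindOverdueTest(unittest.TestCase):
        def setUp(self):
            # ":memory:" gives a throwaway database per test: fast and isolated.
            self.conn = sqlite3.connect(":memory:")
            self.conn.execute("CREATE TABLE invoices (id INTEGER, due_date TEXT)")
            self.conn.executemany(
                "INSERT INTO invoices VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-06-01")],
            )

        def test_returns_only_invoices_before_cutoff(self):
            self.assertEqual(find_overdue(self.conn, "2024-03-01"), [1])

    if __name__ == "__main__":
        unittest.main()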

It is generally true that everything that can be unit-tested should be unit-tested, and you're right that unit tests will take you only so far and no further—particularly when writing simple wrappers around complex external services.

A common way of thinking about testing is as a testing pyramid. It's a concept frequently connected with Agile, and many have written about it, including Martin Fowler (who attributes it to Mike Cohn in Succeeding with Agile), Alistair Scott, and the Google Testing Blog.

        /\                           --------------
       /  \        UI / End-to-End    \          /
      /----\                           \--------/
     /      \     Integration/System    \      /
    /--------\                           \----/
   /          \          Unit             \  /
  --------------                           \/

  Pyramid (good)                   Ice cream cone (bad)

The notion is that fast-running, resilient unit tests are the foundation of the testing process. There should be more focused unit tests than system/integration tests, and more system/integration tests than end-to-end tests. As you move up the pyramid, tests tend to take more time and resources to run, tend to be more brittle and flaky, and are less specific about which system or file is broken; naturally, it's preferable to avoid being "top-heavy".

To that point, integration tests aren't bad, but heavy reliance on them may indicate that you haven't designed your individual components to be easy to test. Remember, the goal here is to test that your unit is performing to its spec while involving a minimum of other breakable systems: You may want to try an in-memory database (which I count as a unit-test-friendly test double alongside mocks) for heavy edge-case testing, for instance, and then write a couple of integration tests with the real database engine to establish that the main cases work when the system is assembled.
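
A sketch of how that split might look in Python follows; the schema, the total_revenue query, and the REAL_DB_DSN environment variable are assumptions for illustration, not anything from the question. The edge cases run against sqlite3's in-memory engine on every build, while the real-engine check is skipped unless a connection string is supplied:

    import os
    import sqlite3
    import unittest

    SCHEMA = "CREATE TABLE orders (id INTEGER, total REAL)"

    def total_revenue(conn):
        """Hypothetical query under test."""
        (value,) = conn.execute("SELECT COALESCE(SUM(total), 0) FROM orders").fetchone()
        return value

    class TotalRevenueUnitTest(unittest.TestCase):
        """Fast edge-case coverage against an in-memory engine."""

        def setUp(self):
            self.conn = sqlite3.connect(":memory:")
            self.conn.execute(SCHEMA)

        def test_empty_table_yields_zero(self):
            self.assertEqual(total_revenue(self.conn), 0)

        def test_sums_all_rows(self):
            self.conn.executemany("INSERT INTO orders VALUES (?, ?)",
                                  [(1, 10.0), (2, 2.5)])
            self.assertEqual(total_revenue(self.conn), 12.5)

    @unittest.skipUnless(os.environ.get("REAL_DB_DSN"),
                         "integration test: set REAL_DB_DSN to run")
    class TotalRevenueIntegrationTest(unittest.TestCase):
        """A couple of main-case checks against the real database engine."""

        def test_main_case_against_real_engine(self):
            # Connecting to the real engine (via your DBMS's driver) is
            # deployment-specific and omitted from this sketch.
            ...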

As you noted, it's possible for tests to be too narrow: you mentioned that the mocks you write simply test how something is implemented, not whether it works. That's something of an antipattern: A test that is a perfect mirror of its implementation isn't really testing anything at all. Instead, test that every class or method behaves according to its own spec, at whatever level of abstraction or realism that requires.
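
To make that concrete, here is a contrived contrast (the count_users function and users table are invented): the first test merely restates the implementation by asserting the exact call made on a mock, so a mistake in the SQL would be copied into the expectation and still pass; the second asserts the observable behavior against an in-memory engine:

    import sqlite3
    import unittest
    from unittest import mock

    def count_users(conn):
        # Hypothetical method under test.
        return conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

    class MirrorStyleTest(unittest.TestCase):
        def test_issues_expected_sql(self):
            # Antipattern: this only mirrors the implementation line by line.
            conn = mock.Mock()
            conn.execute.return_value.fetchone.return_value = (3,)
            self.assertEqual(count_users(conn), 3)
            conn.execute.assert_called_once_with("SELECT COUNT(*) FROM users")

    class BehaviorStyleTest(unittest.TestCase):
        def test_counts_rows(self):
            # Behavior check: the query actually runs against a (similar) engine.
            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE users (id INTEGER)")
            conn.executemany("INSERT INTO users VALUES (?)", [(1,), (2,), (3,)])
            self.assertEqual(count_users(conn), 3)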

In that sense your method's spec might be one of the following:

  1. Issue some arbitrary SQL or RPC and return the results exactly (mock-friendly, but doesn't actually test the query you care about)
  2. Issue exactly the SQL query or RPC and return the results exactly (mock-friendly, but brittle, and assumes SQL is OK without testing it)
  3. Issue an SQL command to a similar database engine and check that it returns the right results (in-memory-database-friendly, probably the best solution on balance)
  4. Issue an SQL command to a staging copy of your exact DB engine and check that it returns the right results (probably a good integration test, but may be prone to infrastructure flakiness or difficult-to-pinpoint errors)
  5. Issue an SQL command to your real production DB engine and check that it returns the right results (may be useful to check deployed behavior, same issues as #4 plus the dangers of modifying production data or overwhelming your server)

Use your judgment: Pick the quickest and most resilient solution that will fail when you need it to and give you confidence that your solution is correct.

