Unit Testing Best Practices – Handling Large or Complex Fixtures

testing, unit-testing

I am trying to use JUnit to test a class (FooWriter) that is responsible for writing my application's data to disk using a custom file format. The data set being written is potentially very large, and I want to at least be able to test simple things, such as whether or not the expected number of lines were written out.

My problem is that I'm not sure what the best way is to set the test up with realistic test data. It seems like my options are:

  1. Read the test data in from disk and then use that data for the FooWriter test case
  2. Write a hard-coded test fixture for the FooWriter test case to use

(1) seems like a bad idea because then the FooWriter test case would no longer be isolated from other complex, breakable operations. (2) seems like a bad idea because it would result in about a thousand lines of code to compose the data structure, along the lines of:

foo.addBar(new Bar("bof", "bam"));
foo.addBar(new Bar("baz", "boom"));

Not only is that tedious and error-prone, but if I later changed the way the data is constructed, the test could fail whether or not the application itself is actually broken.

In cases like this where I see no good options, I've found it's usually because I'm being stupid. Where's the stupid?

Best Answer

(1) seems like a bad idea because then the FooWriter test case would no longer be isolated from other complex, breakable operations.

That's largely irrelevant. Reading a lot of test data should be trivial. CSV format. JSON format. YAML format. There are lots and lots of ways of handling this that are neither complex nor breakable.
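As a minimal sketch of what that can look like with JUnit 4: the test below builds the fixture from a plain CSV resource and only asserts on the line count, which is all the question asks for. The file path, the two-column layout, and the write(Foo, File) signature on FooWriter are assumptions for illustration, not part of the original question.

    import static org.junit.Assert.assertEquals;

    import java.io.File;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    import org.junit.Test;

    public class FooWriterTest {

        @Test
        public void writesOneLinePerBar() throws Exception {
            // Build the fixture from a CSV file kept with the test sources.
            // (Path and column layout are assumptions for illustration.)
            Foo foo = new Foo();
            List<String> rows = Files.readAllLines(Paths.get("src/test/resources/bars.csv"));
            for (String row : rows) {
                String[] cols = row.split(",");
                foo.addBar(new Bar(cols[0], cols[1]));
            }

            FooWriter writer = new FooWriter();
            File out = File.createTempFile("foo", ".dat");
            writer.write(foo, out);   // hypothetical write(Foo, File) method

            // The simple check from the question: one output line per Bar.
            assertEquals(rows.size(), Files.readAllLines(out.toPath()).size());
        }
    }

If the CSV parsing ever breaks, it breaks loudly and obviously, which is why it doesn't really compromise the isolation of the test.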

(2) seems like a bad idea because it would result in about a thousand lines of code to compose the data structure...

What I've done in the past is write a small utility to convert a large amount of test data (either random or in CSV form) into TestCase code.
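For illustration, a sketch of that kind of one-off generator, assuming the same hypothetical two-column bars.csv; it simply prints addBar lines that can be pasted into the test class, so nobody has to type a thousand of them by hand:

    import java.nio.file.Files;
    import java.nio.file.Paths;

    // Converts a CSV of test data into the hard-coded fixture statements
    // from option (2). File name and column layout are assumptions.
    public class FixtureCodeGenerator {
        public static void main(String[] args) throws Exception {
            for (String row : Files.readAllLines(Paths.get("bars.csv"))) {
                String[] cols = row.split(",");
                System.out.printf("foo.addBar(new Bar(\"%s\", \"%s\"));%n", cols[0], cols[1]);
            }
        }
    }

Once the generated code is checked in, the test no longer depends on the generator or the CSV at all.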

In sum, they both work and they're both good choices.

Toss a coin to choose. Heads.