Best practices for avoiding test smells

An important challenge in creating automated tests is designing test fixtures, i.e., the setup code that initializes the system under test before the actual tests can run.

Test designers have to choose between different setup approaches, trading off maintenance overhead against test execution speed.

Over time, test code quality can erode and test smells can develop. Examples of such smells are overly general fixtures, obscure in-line setup code, and dead fields.
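To make the "general fixture" smell concrete, here is a minimal sketch (all class, field, and method names are hypothetical, and the JUnit-style setUp lifecycle is simulated in plain Java so the example is self-contained): the shared setup initializes more state than most individual test methods actually use.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical test class exhibiting the "general fixture" smell:
// every field is initialized before every test, although each test
// method uses only a fraction of them.
public class OrderServiceTest {
    private List<String> customers;
    private List<String> products;
    private List<String> openOrders;

    // Simulated @Before method: runs before each test method.
    void setUp() {
        customers = new ArrayList<>(List.of("alice", "bob"));
        products = new ArrayList<>(List.of("book", "pen"));
        openOrders = new ArrayList<>(); // never used by the tests below
    }

    void testCustomerCount() {
        // uses only 'customers'; 'products' and 'openOrders' are noise here
        if (customers.size() != 2) throw new AssertionError("expected 2 customers");
    }

    void testProductCount() {
        // uses only 'products'
        if (products.size() != 2) throw new AssertionError("expected 2 products");
    }

    public static void main(String[] args) {
        OrderServiceTest t = new OrderServiceTest();
        t.setUp();
        t.testCustomerCount();
        t.setUp();
        t.testProductCount();
        System.out.println("both tests passed, but each used only part of the fixture");
    }
}
```

Each test pays the setup cost of all three fields, and a reader must inspect the whole fixture to understand what a single test actually depends on.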

In a large-scale study, I investigate how fixture-related test smells evolve over time by analyzing several thousand revisions of five open source systems (Voldemort, PMD, Azure Java SDK, Jsoup, Checkstyle).

Lehman's Law

Lehman’s law says that when a system evolves, its complexity increases unless work is done to maintain or reduce it. Therefore, I expected test fixture smell density to continually increase over time.

Figure: test smell evolution across projects (trends for the General Fixture smell)

The findings were surprising, because they show that test fixture smells do not generally grow over time.

There are, however, fluctuations: test smells increase and decrease over time. I manually investigated strong fluctuations in the smell trends. This investigation revealed that smells are often introduced or removed by structural changes, such as adding or removing features, or refactoring.


In general, I could see that setup management strategies strongly influence the types of test fixture smells that emerge in code. It was also visible that several types of fixture smells often emerge at the same time.

Figure: different types of test smells across projects

Test smells seldom get resolved, but developers learn over time which strategies are appropriate for creating test fixtures, and fewer new test smells are introduced.

Further, I correlated test fixture smell density with the number of test methods in a class. The statistics indicate that classes with more test methods have a higher smell density.


This is most likely because, as classes grow, they perform more tasks and touch more parts of the code. As the number of test methods in a class increases, so does the likelihood that several unrelated functionalities are tested together. This results in general test fixtures that set up more than any individual test method needs.

So what can you do to reduce or avoid test smells?

Here are several research-backed strategies and recommendations:

  • Keep test classes small (and therefore coherent), by reducing the number of test methods within a class.
  • Keep inheritance structures flat, and also limit the scope of super classes (e.g., do not implement one super test class that provides functionality for all other test classes).
  • Use composition instead of inheritance to provide test classes with helper functionality.
  • Create data classes that provide test inputs to avoid overloading test classes with field declarations that are only used for a single test method. This improves performance and understandability of test classes.
  • Be aware that declaring fields in the header may impact tests in terms of understandability and test performance.
  • Reconsider the “one test class per class” organization in case test methods within a single test class require diverse states and configurations for the system under test.
  • Carefully consider the consequences of the chosen fixture strategy and the impact this decision will have on the project, as such decisions are not likely to change later.
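Two of the recommendations above, composition over inheritance and dedicated test-data classes, can be sketched as follows (all names are hypothetical, and the example is plain Java rather than an actual JUnit test so it stays self-contained):

```java
import java.util.List;

public class CompositionFixtureExample {

    // Data class: supplies inputs on demand, instead of declaring fields
    // in the test class header that only one test method uses.
    static class TestOrders {
        static List<String> twoOpenOrders() {
            return List.of("order-1", "order-2");
        }
    }

    // Fixture helper owned by the test class (composition), rather than
    // functionality inherited from a broad super test class.
    static class OrderFixture {
        private List<String> orders;

        OrderFixture withOrders(List<String> o) {
            orders = o;
            return this;
        }

        int openCount() {
            return orders.size();
        }
    }

    public static void main(String[] args) {
        // Each test builds exactly the state it needs, nothing more.
        OrderFixture fixture = new OrderFixture().withOrders(TestOrders.twoOpenOrders());
        if (fixture.openCount() != 2) throw new AssertionError("expected 2 open orders");
        System.out.println("open orders: " + fixture.openCount());
    }
}
```

Because the helper is composed rather than inherited, each test class pulls in only the fixture functionality it needs, which keeps setup local, visible, and cheap.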

More details can be found in the paper or the presentation.

Michaela Greiler

I am a software engineer and an entrepreneur. I have a lot of experience analyzing software development processes, especially understanding code review practices and software testing approaches.
