Add Zing to your unit tests

Introducing a framework for generic, productive, reliable, and maintenance-free unit tests

Unit testing is an integral part of extreme programming, and many open source tools and frameworks help developers write unit tests. IDE plug-ins can create skeleton unit test cases, and Ant tasks and Maven goals automatically run test cases and generate reports during continuous integration.

However, unit test cases are not generic. Any functional method handles multiple data scenarios. We write one test method for every data scenario because we create the test data, fire the test, and validate the output in the same test method. If a new requirement adds additional data scenarios for the method, then we end up writing more test methods. Thus, test-case maintenance requires effort. The complexity of tests further increases when testing server-side components against data that changes during transactions. In addition, we must ensure that the tests are always correct. All of these issues require considerable time to address and increase the complexity of the test cases. Overall, we yearn for something that will remove the complications involved in writing test cases and provide a generic set of test cases that are free from maintenance.

This article first outlines a comprehensive list of issues faced during unit testing and then details the creation of a testing framework that facilitates the writing of generic and configurable unit test cases by integrating with multiple open source testing tools and frameworks. In this article, we refer to the JUnit, JUnitPerf, Cactus, JUnitEE, and DbUnit frameworks and tools such as Ant, CruiseControl, and XStream. Please note that this article is not a tutorial for these frameworks and tools.

Issues in unit testing

Here is a summary of the issues that must be addressed during unit testing:

  • The effort required to create/maintain unit tests: We create test data within the test method's body. Thus, to have robust unit testing, we must create different test methods for every possible combination of data for that functional method. Let's call each such combination of data a data scenario. We do not generalize data creation and output assertion. Hence, we end up writing test methods for every data scenario that needs testing, which causes maintenance issues. If we change the method to handle more data scenarios, we end up writing more test methods. Also, changing the data for a scenario isn't always straightforward, since the test data is embedded in the code and we must change the test-case code and rebuild the entire test suite.
  • Maintaining data consistency: We also need to maintain data consistency for the methods that complete database transactions. For example, if a method creates a customer in the database and we try to create the same customer again, the unit test case will fail.
  • Providing the same approach to writing unit tests but leveraging high-quality tools and frameworks: Some JUnit tools/frameworks available in the open source community enhance unit testing. One such tool is JUnitPerf. It allows basic performance testing of the code. However, it has its own approach to creating test cases and running them. It needs information about the load and response time to create tests. This information is hard-coded in the test methods and the developer ends up writing more unit test cases. Abstracting that information out of the test case so the same unit test case can be run as a load test or response-time test without the developer knowing the intricacies of JUnitPerf would prove beneficial.
  • Complexity of test cases when testing server-side components: When testing server-side components, we need Java Enterprise Edition (JEE) features—e.g., JNDI (Java Naming and Directory Interface) lookup and an EJB (Enterprise JavaBeans) container—which further adds to the complexity of test cases. Tools are available for server-side testing, with the best examples being Cactus and JUnitEE. If we want to leverage the advantages of both, we must glue them together and hide their configuration and intricacies from the developer. We must provide a simple way for the developer to do server-side testing.
  • Encouraging maximum test effectiveness (all the data scenarios are tested): Unit testing is a critical element of continuous integration. During continuous integration, we can use unit testing to measure the build progress. That is, if the unit test reports show a 50-percent success rate, then we should be able to get an idea of the overall build progress. To achieve such evaluation of build progress, the unit tests should be functionally correct and reliable.
  • Reducing repetitive test code: Most unit test code follows the same pattern. On a higher level, the code looks repetitive. Abstracting such repetitive code from the test cases minimizes the effort required to write unit test cases.

These issues can be resolved by writing a unit test framework that reduces testing effort by making test cases generic and configurable, and that provides seamless integration with excellent open source unit test frameworks. The building blocks of the framework are:

  • Test types
  • Data abstraction, initialization, and assertion techniques
  • Infrastructure support (logging, JNDI lookup, property reader, XML binding, etc.)
  • DbUnit (for database consistency), JUnitEE, and Cactus (for server-side testing), and JUnitPerf
  • Built-in tools (for creating configurations and test skeletons)

We will discuss each building block in subsequent sections.

We call our framework Zing. For this article's sample application, we use the two classes that appear in the code below. The sample code and Zing code are available for download from Resources.

 

   package sample.currency;

   public class Money {
       private int amount;
       private String currency;

       public Money(int amount, String currency) {
           this.amount = amount;
           this.currency = currency;
       }
       // Getters and setters
   }

   package sample.currency;

   public class MoneyUtils {
       public Money addMoney(Money m1, Money m2) {
           if (m1 == null || m2 == null) {
               throw new IllegalArgumentException("Arguments can't be null");
           } else if (!(m1.getCurrency().equals(m2.getCurrency()))) {
               throw new IllegalArgumentException("Arguments should be in same currency");
           }
           return new Money(m1.getAmount() + m2.getAmount(), m1.getCurrency());
       }
   }

Test types

To integrate multiple unit test frameworks and provide a simple interface for the developer to write test cases, we need to classify test cases into multiple types. These test types form the basis of the Zing framework and allow integration among multiple unit test frameworks; each type is implemented as an abstract class in Zing.

In this article, we discuss the following types:

  • Simple test: A simple test case that extends from JUnit's TestCase
  • Servlet test: A test case that extends from Cactus's ServletTestCase
  • PerfLoadTest: A performance test case that extends from JUnitPerf's LoadTest
  • PerfTimedTest: A performance test case that extends from JUnitPerf's TimedTest

All of the above test types must provide some common functionality so they can be integrated. We achieve this by creating a common interface, zing.tests.ITest. We extend our test case from either SimpleTest or ServletTest and write the test code. Depending on the test-case configuration, PerfLoadTest and PerfTimedTest instances are created while the test runs. Using these test types, we abstract the complexity of using different test frameworks into one place, so we can concentrate on writing test code instead of worrying about different implementations for different test frameworks. This approach also enables us to plug in more unit-testing frameworks.

For example, the test case for MoneyUtils looks like:

   public class MoneyUtilsTests extends zing.tests.SimpleTest{
    ...
    }
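Internally, the test types can share a common contract through zing.tests.ITest. The following is only a rough sketch of how such a hierarchy might be wired together; the method names are illustrative assumptions, not Zing's actual API:

   // ITest.java -- hypothetical common contract implemented by every Zing test type
   public interface ITest {
       // Load input/output data and configuration for one data scenario (assumed method)
       void initScenario(String scenarioId);
   }

   // SimpleTest.java -- plain JUnit-based test type; ServletTest would extend
   // Cactus's ServletTestCase in the same fashion
   import junit.framework.TestCase;

   public abstract class SimpleTest extends TestCase implements ITest {
       public void initScenario(String scenarioId) {
           // Read the scenario's XML data here (details covered in the next section)
       }
   }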

Data abstraction, initialization, and assertion

Next, we concentrate on abstracting all the data out of the test code. Unit testing deals with three categories of data:

  • Test data: Input and output data
  • Configuration data
  • Database data

Test data

We abstract the test data (input and output objects) out of the test-case code to make it more generic. We can extract this data into XML files. For every data scenario, we create XML files for input and output objects (including exceptions). For testing MoneyUtils's addMoney() method, we need input objects m1 and m2, and one output object of type Money to assert this method's success. Instead of hard-coding the object values in the test-case code, we create an XML representation of the required instances of Money. So we create three XML files, one for each instance (the generation of these files can be automated using XDoclet annotations). The XML file for m1 looks like:

   <sample.currency.Money>
        <amount>20</amount>
        <currency>dollar</currency>
    </sample.currency.Money>

The file for m2 looks like:

   <sample.currency.Money>
        <amount>30</amount>
        <currency>dollar</currency>
    </sample.currency.Money>

The file for output looks like:

   <sample.currency.Money>
        <amount>50</amount>
        <currency>dollar</currency>
    </sample.currency.Money>

We just need to create the XML input and output files instead of writing new test methods for new scenarios of the same method. The best way to abstract data into XML files is with an XML serializer, such as the open source XStream. XStream serializes Java objects to XML and deserializes XML back into objects. This way, we can store input and output values for multiple scenarios that run against the same test method.
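To illustrate, here is a minimal, self-contained round trip with XStream; by default XStream names the root element after the fully qualified class name, which is exactly the format of the files above (the MoneyXmlDemo class exists only for this illustration):

   import com.thoughtworks.xstream.XStream;
   import sample.currency.Money;

   public class MoneyXmlDemo {
       public static void main(String[] args) {
           XStream xstream = new XStream();
           // Serialize: produces <sample.currency.Money>...</sample.currency.Money>
           String xml = xstream.toXML(new Money(20, "dollar"));
           // Deserialize: reads the XML back into a Money instance
           Money m1 = (Money) xstream.fromXML(xml);
           System.out.println(m1.getAmount() + " " + m1.getCurrency());
       }
   }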

If we want to write generic code that reads these XML files and converts them to Money instances, we must follow certain naming conventions for the file names. We use the following naming convention: <test method name>_<data scenario name>_<argument name>.xml. Hence, for m1, the name is testAddMoney_DEFAULT_SCENARIO_m1.xml. Adhering to a common naming convention makes it possible to write a utility class to read the data from XML files for a given scenario of a given test method and feed that data to a test method. In later sections, we explain how to automatically create these input and output XML files with a utility class.
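A data-reading utility built around this convention can stay very small. The sketch below is illustrative only; the class and method names are assumptions rather than Zing's actual utility:

   import java.io.InputStream;
   import com.thoughtworks.xstream.XStream;

   // Hypothetical helper that locates and deserializes scenario data from the classpath
   public class TestDataReader {
       private final XStream xstream = new XStream();

       public Object read(String testMethod, String scenario, String argName) {
           // File name follows <test method name>_<data scenario name>_<argument name>.xml
           String fileName = testMethod + "_" + scenario + "_" + argName + ".xml";
           InputStream in = getClass().getResourceAsStream("/" + fileName);
           return xstream.fromXML(in);
       }
   }

A test method could then obtain its input with, for example, (Money) reader.read("testAddMoney", "DEFAULT_SCENARIO", "m1").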

We still require a mechanism to configure the same test method to run multiple times for different data scenarios. We achieve that by abstracting configuration data out of the Java code.

Configuration data

Our aim is to write only one test method for one functional method, but test it for all possible data scenarios. To achieve this, we also need a way to configure the same test multiple times. In addition, we may need optional configuration to apply test decorators such as load test and response-time test. Such configurable tests can be achieved by following these steps:

  • Create a configuration schema
  • Use XML binding (Castor, XMLBeans) to read the configuration
  • Create a utility class to create tests from the configuration
  • Create a suite class that can assemble all the tests, and initialize and run them

Please note that the test framework handles everything except the schema; the developer doesn't have to worry about these details. A sample configuration looks like:

 <test-suite name="zing-suite">
    <test-case name="sample.currency.MoneyUtilsTest">
        <test-method name="testAddMoney">
            <test-data connection="SAMPLE_CONNECTION" dataset="dataset/money.xml"/>
            <test-scenario id="DEFAULT_SCENARIO">
                <input-data name="m1" value="input/sample/currency/MoneyUtils/addMoney_DEFAULT_SCENARIO_m1.xml"/>
                <input-data name="m2" value="input/sample/currency/MoneyUtils/addMoney_DEFAULT_SCENARIO_m2.xml"/>
                <output-data name="m3" value="output/sample/currency/MoneyUtils/addMoney_DEFAULT_SCENARIO_m3.xml"/>
            </test-scenario>
            <test-scenario id="INVALID_MONEY">
                <input-data name="m1" value="input/sample/currency/MoneyUtils/addMoney_INVALID_MONEY_m1.xml"/>
                <input-data name="m2" value="input/sample/currency/MoneyUtils/addMoney_INVALID_MONEY_m2.xml"/>
                <exception-data name="Exception" type="java.lang.IllegalArgumentException"/>
            </test-scenario>
            <test-type>
                <testtypeid>TIMED</testtypeid>
                <perfparams>
                    <maxelapsedtime>40</maxelapsedtime>
                </perfparams>
            </test-type>
        </test-method>
    </test-case>
</test-suite>

In the configuration XML, we specify the test-case name and test-method name. For each test method, we specify multiple scenarios. A test is added to the suite for every scenario of every method. Additionally, if we specify a test type of TIMED or LOAD, then the test created for every scenario of that method will be decorated accordingly. In the configuration above, each test checks that the method completes within 40 milliseconds while still verifying its functional correctness.
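Behind the scenes, a TIMED or LOAD entry maps naturally onto JUnitPerf's decorators. The factory below is only a sketch of how the framework might translate the configuration into decorated tests (the factory class itself is hypothetical; TimedTest and LoadTest are real JUnitPerf classes):

   import junit.framework.Test;
   import com.clarkware.junitperf.LoadTest;
   import com.clarkware.junitperf.TimedTest;

   // Hypothetical factory: wraps an ordinary JUnit test in a JUnitPerf decorator
   // according to the configured test type and performance parameters
   public class PerfTestFactory {
       public static Test create(Test test, String testTypeId, long maxElapsedTime, int users) {
           if ("TIMED".equals(testTypeId)) {
               // Fails if the wrapped test takes longer than maxElapsedTime milliseconds
               return new TimedTest(test, maxElapsedTime);
           }
           if ("LOAD".equals(testTypeId)) {
               // Runs the wrapped test concurrently with the given number of simulated users
               return new LoadTest(test, users);
           }
           return test; // other test types run undecorated
       }
   }

With the maxelapsedtime of 40 from the configuration, new TimedTest(test, 40) fails the test if it runs longer than 40 milliseconds while still reporting ordinary assertion failures.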

For every suite of unit tests, we must create one configuration XML. This XML file is named TestConfig.xml and should be in the classpath. For the above configuration, two timed tests will be created.
