SBST Java Unit Testing Tool Contest

We invite developers of tools for Java unit testing at the class level—both SBST and non-SBST—to participate in a tools competition! Competition entries are in the form of short papers (maximum of 4 pages) describing an evaluation of your tool against a benchmark supplied by the workshop organisers.

The results of the tools competition will be presented at the workshop. We additionally plan to co-ordinate a journal paper jointly authored by the tool developers that evaluates the results of the competition.

The Contest

The contest is targeted at developers/vendors of testing tools that generate test input data for unit testing Java programs at the class level. Each tool will be applied to a set of Java classes taken from open-source projects and selected by the contest organisers. The participating tools will be compared on:

  • execution time
  • achieved branch coverage
  • test suite size
  • mutation score

Each participant should install a copy of their tool on the server where the contest will be run. To this end, each participant will have SSH access to the contest server. The benchmark infrastructure will run the tools and measure their outputs fully automatically; therefore, tools must be capable of running without human intervention.

Participate

To participate, please send an email to Tanja Vos describing the following characteristics of your tool: name, testing techniques implemented (SBST or other), compatible operating systems, tool inputs and outputs, and optionally any evaluative studies already published.

You will be sent credentials to log in to the sbst-contest-server.

Instructions

To allow automatic execution of the participating tools, they must be installed and configured on the contest server:

  • Host: sbstcontest.dsic.upv.es 
  • OS: Ubuntu 12.04 LTS

You should install and configure your testing tool in the home directory of your account. You may do this in any way you like, with the following exceptions:

  • You must have an executable (or shell script) named $HOME/runtool that implements the protocol described below
  • Your tool must store intermediate data in $HOME/temp/data
  • Your tool must output the generated test cases in JUnit4 format in $HOME/temp/testcases

The Benchmark Automation Protocol

The runtool script/binary is the interface between the benchmark framework and the participating tools. Communication between runtool and the benchmark framework takes place through a very simple line-based protocol over the standard input and output channels. The following table describes the protocol; every step consists of a line of text received by the runtool program on STDIN or sent to STDOUT.

Step  Message on STDIN    Message on STDOUT   Description
1     BENCHMARK                               Signals the start of a benchmark run; the directory $HOME/temp is cleared
2     directory                               Directory with the source code of the SUT
3     directory                               Directory with the compiled class files of the SUT
4     number                                  Number of entries in the class path (N)
5     directory/jar file                      Class path entry (repeated N times)
6     number                                  Number of classes to be covered (M)
7                         CLASSPATH           Signals that the testing tool requires additional class path entries
8                         number              Number of additional class path entries (K)
9                         directory/jar file  Additional class path entry (repeated K times)
10                        READY               Signals that the testing tool is ready to receive challenges
11    class name                              The name of the class for which unit tests must be generated
12                        READY               Signals that the testing tool is ready to receive more challenges; the test cases in $HOME/temp/testcases are analyzed and the directory is then cleared; go to step 11 until all M class names have been processed

To ease the implementation of a runtool program according to the protocol above, we provide a skeleton implementation in Java.
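
For orientation, the sketch below shows one possible shape of a runtool program following the protocol above. It is not the official skeleton: the class name RunTool and the trivial message handling are only illustrative, and a real tool would invoke its test generator for each challenge class and write the resulting JUnit 4 files to $HOME/temp/testcases before answering READY.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of a runtool program, not the official skeleton.
public class RunTool {
    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));

        in.readLine();                                  // step 1: BENCHMARK
        String sourceDir = in.readLine();               // step 2: source code of the SUT
        String binDir = in.readLine();                  // step 3: compiled classes of the SUT
        int n = Integer.parseInt(in.readLine().trim()); // step 4: number of class path entries (N)
        List<String> classPath = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            classPath.add(in.readLine());               // step 5: class path entry, N times
        }
        int m = Integer.parseInt(in.readLine().trim()); // step 6: number of classes to cover (M)

        System.out.println("CLASSPATH");                // step 7
        System.out.println(0);                          // step 8: no additional entries (K = 0)
                                                        // step 9 is skipped because K = 0
        System.out.println("READY");                    // step 10
        System.out.flush();

        for (int i = 0; i < m; i++) {
            String className = in.readLine();           // step 11: class under test
            // A real tool would generate JUnit 4 tests for className here and
            // write them to $HOME/temp/testcases before answering READY.
            System.out.println("READY");                // step 12
            System.out.flush();
        }
    }
}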

Test the Protocol

To test whether your runtool implements the protocol correctly, we have installed a utility called sbstcontest on the machine. If you run it, it will output:

sbstcontest <benchmark> <tooldirectory> <debugoutput>
Available benchmarks: [mucommander, jedit, jabref, triangle, argouml]

The first line shows how the utility is invoked. <benchmark> is one of the installed benchmarks as shown in the second line, <tooldirectory> is the directory where your runtool resides, and <debugoutput> is a boolean value to enable debug information. The benchmarks are collections of classes from different open-source projects. triangle lends itself to testing the basic functionality, as it is a very simple benchmark consisting of only 2 classes. An example invocation would be:

sbstcontest triangle . true

If you have implemented the protocol correctly and generated all the test cases, sbstcontest will create a transcript file in the runtool’s directory. This file reports several statistics, such as the achieved branch coverage and mutation score.

Test Case Format

The tests generated by the participating tools must be stored as one or more Java files containing JUnit 4 tests. In particular (see the example after this list):

  • test classes must be declared public
  • test classes must have a zero-argument public constructor
  • test methods must be annotated with @Test
  • test methods must be declared public
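
For illustration, a generated test file could look like the following minimal example; the class and method names are made up, and a real generated test would of course exercise the class under test.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Minimal example of a generated test class in the required format
// (hypothetical names; a real test would exercise the class under test).
public class ExampleGeneratedTest {

    // zero-argument public constructor
    public ExampleGeneratedTest() {
    }

    // test methods are public and annotated with @Test
    @Test
    public void testSomething() {
        assertEquals(4, 2 + 2);
    }
}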

The generated test cases will be compiled against:

  • JUnit 4.10
  • The benchmark SUT
  • Any dependencies of the benchmark SUT