Performance Tests HowTo

By Sonia Dimitrov, Christof Marti, Andre Weinand
2005/02/24

The Eclipse performance test plugin (org.eclipse.test.performance) provides infrastructure for instrumenting programs to collect performance data and to assert that performance doesn't drop below a baseline. The infrastructure is supported on Windows, Linux, and MacOS X.

The first part of this document describes how performance tests are written and executed; the second part explains how performance data is collected in a database and how this database is installed and configured.

Writing Performance Tests

Setting up the environment

Writing a performance test case

A performance test case is an ordinary JUnit TestCase.
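As a minimal sketch, a test obtains a PerformanceMeter from the Performance facade, brackets the measured code with start()/stop() calls, and finally commits the measurements and asserts them against the reference data. The method toMeasure() is a hypothetical stand-in for the operation under test:

import junit.framework.TestCase;

import org.eclipse.test.performance.Performance;
import org.eclipse.test.performance.PerformanceMeter;

public class MyPerformanceTest extends TestCase {

	public void testMyOperation() {
		Performance perf= Performance.getDefault();
		PerformanceMeter performanceMeter= perf.createPerformanceMeter(perf.getDefaultScenarioId(this));
		try {
			for (int i= 0; i < 10; i++) {
				performanceMeter.start();
				toMeasure();
				performanceMeter.stop();
			}
			performanceMeter.commit();
			perf.assertPerformance(performanceMeter);
		} finally {
			performanceMeter.dispose();
		}
	}

	private void toMeasure() {
		// hypothetical stand-in for the code to measure
	}
}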

Participating in the performance summary (aka "Performance Fingerprint")

If the number of performance tests grows large, it becomes harder to get a good overview of the performance characteristics of a build. A solution for this problem is a performance summary that condenses a small subset of key performance tests into a chart that fits onto a single page. Currently the performance infrastructure supports two levels of summaries: one global summary and any number of "local" summaries. A local summary is typically associated with a component.

A summary bar chart shows the performance development of about 20 tests relative to a reference build in an easy-to-grasp red/green presentation.

[Figure: summary graph]

Depending on the total number of components, every Eclipse component can tag one or two tests for inclusion in the global summary and up to 20 tests for a local performance summary. Tests marked for the global summary are automatically included in the local summary.

Marking a test for inclusion is done by passing a performance meter into the method Performance.tagAsGlobalSummary(...) or Performance.tagAsSummary(...). Both methods should be called outside of start/stop calls, but before the call to commit().

Performance perf= Performance.getDefault();
PerformanceMeter pm= perf.createPerformanceMeter(perf.getDefaultScenarioId(this));
perf.tagAsGlobalSummary(pm, "A Short Name", Dimension.CPU_TIME);
try {
	pm.start();
	toMeasure();
	pm.stop();
	pm.commit();
	perf.assertPerformance(pm);
} finally {
	pm.dispose();
}

In order to keep the overview graph small, only a single dimension (CPU_TIME, USED_JAVA_HEAP, etc.) of the test's data is shown, and only a short name is used to label the data (instead of the rather long scenario ID). Both the short label and the dimension must be supplied in the calls to tagAsGlobalSummary and tagAsSummary. The available dimensions can be found in org.eclipse.test.performance.Dimension.

The PerformanceTestCase class provides similar methods, which must be called before startMeasuring():

public class MyPerformanceTestCase extends PerformanceTestCase {

	public void testMyOperation() {
		tagAsSummary("A Short Name", Dimension.CPU_TIME);
		for (int i= 0; i < 10; i++) {
			startMeasuring();
			toMeasure();
			stopMeasuring();
		}
		commitMeasurements();
		assertPerformance();
	}
}

Running a performance test case (from a launch configuration)
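A minimal sketch, assuming the test is run via a JUnit (or JUnit Plug-in) Test launch configuration: pass the performance properties described in the second part of this document as VM arguments on the launch configuration's Arguments tab, for example (the build and host names are placeholders):

	-Declipse.perf.dbloc=/tmp/derby
	-Declipse.perf.config=build=N20040914;host=myhost
	-Declipse.perf.assertAgainst=build=R3.0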

Running a performance test case (from the command-line)
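A minimal sketch, assuming a plain JUnit test run from a shell on Linux or MacOS X (my.tests.MyPerformanceTest is a hypothetical test class; the classpath must contain JUnit, the performance plugin, and Derby). Property values containing semicolons must be quoted so the shell does not interpret them:

	java -Declipse.perf.dbloc=/tmp/derby \
	     "-Declipse.perf.config=build=N20040914;host=myhost" \
	     -Declipse.perf.assertAgainst=build=R3.0 \
	     -cp <your classpath> junit.textui.TestRunner my.tests.MyPerformanceTest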

Running a performance test case (within the Automated Testing Framework on each build)

If the test.xml of your test plug-in already exists and looks similar to the one in jdt.text.tests, add targets similar to those shown below. The performance target is the entry point for performance testing, just as the run target is for correctness testing.
<!-- This target defines the performance tests that need to be run. -->
<target name="performance-suite">
  <property name="your-performance-folder" value="${eclipse-home}/your_performance_folder"/>
  <delete dir="${your-performance-folder}" quiet="true"/>
  <ant target="ui-test" antfile="${library-file}" dir="${eclipse-home}">
     <property name="data-dir" value="${your-performance-folder}"/>
     <property name="plugin-name" value="${plugin-name}"/>
     <property name="classname" value="<your fully qualified test case class name>"/>
  </ant>
</target>
                    
<!-- This target runs the performance test suite. Any actions that need to happen -->
<!-- after all the tests have been run should go here. -->
<target name="performance" depends="init,performance-suite,cleanup">
  <ant target="collect" antfile="${library-file}" dir="${eclipse-home}">
    <property name="includes" value="org*.xml"/>
    <property name="output-file" value="${plugin-name}.xml"/>
  </ant>
</target>

Running a performance test case (within the Automated Testing Framework, locally)
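A minimal sketch, assuming Ant is available and the Eclipse test framework (org.eclipse.test, which provides the library file referenced above) is installed; all values are placeholders:

	ant -f test.xml performance -Declipse-home=<your eclipse install> -Dlibrary-file=<path to the test framework's library file> -Dplugin-name=<your test plug-in>

The performance properties described below (eclipse.perf.dbloc etc.) must additionally be passed to the VM that runs the tests, otherwise no data is stored in the database.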

Setting up the Derby database

Performance tests are only valuable if measured data can be monitored over time and compared against reference data. For this functionality the Eclipse performance plugin makes use of the Apache project's Derby database (formerly called Cloudscape).

Derby is a database engine written in Java that can be accessed via JDBC. Derby is easily embeddable in Java programs or can run as a network server.

This section describes how to install Derby and how to configure the performance test plugin to use Derby.

Getting and installing Derby

The performance infrastructure does not include Derby. If you want to use Derby, you need to download and install it yourself.

The performance plugin has an optional prerequisite on an "org.apache.derby" library project. Since the prerequisite is optional, you won't see any compile-time errors when loading the performance plugin from the Eclipse repository even if the Derby project is not available in your workspace. You will, however, get runtime errors when the tests try to access the database.

If you have access to the following repository you can get the org.apache.derby library project from there:

  :pserver:anonymous@ottcvs1.ott.oti.com:/home/cvs/zrheclipse
Otherwise get Derby from the Apache Derby web site (http://db.apache.org/derby/). Unpack the archive to any directory.
To create a library project for Derby, open the Java project wizard and enter "org.apache.derby" as the project's name. Go to the next page and select the "Libraries" tab. Remove the JRE and add the five JAR files (db2jcc.jar, db2jcc_license_c.jar, derby.jar, derbynet.jar, derbytools.jar) from Derby's lib directory via the "Add External JARs" button. Switch to the "Order and Export" tab and check all five libraries. Press "Finish". Create a new file "plugin.xml" inside the Derby project and paste the following contents into it:
<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.0"?>
<plugin
   id="org.apache.derby"
   name="Derby"
   version="1.0.0">
   <runtime>
      <library name="db2jcc.jar">
         <export name="*"/>
      </library>
      <library name="db2jcc_license_c.jar">
         <export name="*"/>
      </library>
      <library name="derby.jar">
         <export name="*"/>
      </library>
      <library name="derbynet.jar">
         <export name="*"/>
      </library>
      <library name="derbytools.jar">
         <export name="*"/>
      </library>
   </runtime>
</plugin>
In addition you'll need to load the performance plugin (org.eclipse.test.performance) and, if you are running on Windows, the associated fragment (org.eclipse.test.performance.win32).
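To verify the installation, a hypothetical smoke test (not part of the performance plugin) can connect to an embedded Derby database via plain JDBC. A minimal sketch, assuming Derby's JARs are on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;

public class DerbySmokeTest {
	public static void main(String[] args) throws Exception {
		// load the embedded Derby JDBC driver
		Class.forName("org.apache.derby.jdbc.EmbeddedDriver");
		// ";create=true" creates the database if it does not exist yet
		Connection con= DriverManager.getConnection("jdbc:derby:/tmp/derby/testDB;create=true");
		System.out.println("connected: " + !con.isClosed());
		con.close();
	}
}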

Configuring the performance plugin for using Derby

The performance test plugin is configured via the three Java properties eclipse.perf.dbloc, eclipse.perf.config, and eclipse.perf.assertAgainst.

The eclipse.perf.dbloc property specifies where the Derby DB is located. If no value is given

	-Declipse.perf.dbloc=
Derby runs in embedded mode (not as a separate server) and the DB will live in your home directory.

If an absolute or relative path is given, Derby uses or creates the DB in that location. For example, on Linux and MacOS X, with

	-Declipse.perf.dbloc=/tmp/derby
Derby runs in embedded mode and creates the database under /tmp/derby.

To connect to a Derby server running locally (or remotely), use the following:

	-Declipse.perf.dbloc=net://<tcp-ip address>
With the properties eclipse.perf.config and eclipse.perf.assertAgainst you specify the name under which performance data is stored in the database and the name of the reference data to compare against. This "name" is not a single string but a set of key/value pairs separated by semicolons:
	-Declipse.perf.config=key1=value1;key2=value2;...;keyn=valuen
	-Declipse.perf.assertAgainst=key1=value1;key2=value2;...;keyn=valuen
The key/value pairs can be used to associate the collected performance data with information about the configuration that was used to generate the data. Typically this includes the name of the build, the system on which the tests were run, or the Java VM used. So in this example:
	-Declipse.perf.config=build=N20040914;host=relengwin;jvm=j9
performance data for the nightly build N20040914 is stored in the database under a "name" that consists of three key/value pairs.

If the tests are run multiple times with the same arguments, the new data does not replace the old data but is added under the same name. Programs that visualize the data are expected to aggregate it, for example by calculating the average of all runs.

To assert that performance data collected for another build does not degrade with respect to some reference data, the assertAgainst property is used similarly:

	-Declipse.perf.assertAgainst=build=R3.0;host=relengwin;jvm=j9
This property enables any "assertPerformance" calls in your performance tests and compares the newly measured data against the data specified by the three key/value pairs. Please note that the order of the pairs does not matter when looking up the data in the database. However, the number of key/value pairs must be identical.

Because in most cases you want to store newly collected data and assert against other reference data at the same time, you'll need to specify both properties. In this case the assertAgainst property only needs to list those key/value pairs that differ from the config property:

	-Declipse.perf.config=build=N20040914;host=relengwin
	-Declipse.perf.assertAgainst=build=R3.0
So in the example above, the new performance data is stored in the database under the build name "N20040914" and the host "relengwin", and "assertPerformance" compares this data against data tagged with the build name "R3.0" and the implicitly specified host "relengwin".

If you want to assert the average of multiple runs (instead of the data of a single run) against the reference data, do the following:

	// Run program 4 times to collect data under build name "I20040914"
	... -Declipse.perf.config=build=I20040914
	... -Declipse.perf.config=build=I20040914
	... -Declipse.perf.config=build=I20040914
	... -Declipse.perf.config=build=I20040914
	
	// Run program a 5th time and collect more data under I20040914
	// and assert the average of 5 runs of I20040914 against some baseline data
	... -Declipse.perf.config=build=I20040914 -Declipse.perf.assertAgainst=build=R3.0

Viewing the data

Since we do not (yet) have fancy visualization tools, the performance test plugin provides a class org.eclipse.test.internal.performance.db.View that can be run as a standalone program for viewing the data contained in the database in a tabular format.

You need to specify the database location via the eclipse.perf.dbloc property (most easily done via a launch configuration). Select the data to view by either specifying a variation via the eclipse.perf.config property or by directly setting the key/value pairs of the variation at the beginning of the program's main method. If you only want to view specific scenarios, use an appropriate pattern for the local variable scenarioPattern. The local variable seriesKey specifies what variation is shown on the x-axis of the table.

So the following setup:

public class View {

    public static void main(String[] args) {
        
        Variations variations= PerformanceTestPlugin.getVariations();
        variations.put("host", "relengwin");
        variations.put("build", "I%");
        
        String scenarioPattern= "%RevertJavaEditorTest%";

        String seriesKey= "build";
        
        // ...
creates a table showing all dimensions of the (single) scenario selected by the pattern "%RevertJavaEditorTest%" for all integration builds (that is, builds starting with a capital 'I').
Scenario: org.eclipse.jdt.text.tests.performance.RevertJavaEditorTest#testRevertJavaEditor()
Builds:              I200409240800    I200409281200  I200410050800  I200410190941  I200410260800
CPU Time:          1.02 s [284 ms]  1.05 s [327 ms]         971 ms            1 s         481 ms
Committed:              69K [246K]      119K [389K]           103K           111K         -97484
Elapsed Process:   1.02 s [286 ms]  1.07 s [345 ms]         981 ms         1.01 s         481 ms
Kernel time:         41 ms [27 ms]    48 ms [40 ms]          46 ms          28 ms          22 ms
Page Faults:             145 [125]        148 [125]            176            191            143
System Time:       1.02 s [285 ms]  1.06 s [345 ms]         981 ms         1.01 s         477 ms
If you are interested in creating performance charts and tables similar to those available on the Eclipse platform download pages, you could try the standalone Java program org.eclipse.test.performance.ui.Main stored in the org.eclipse.releng.basebuilder project. Refer to the readme.html in org.eclipse.releng.basebuilder/plugins/org.eclipse.test.performance.ui for more details.

How to set up a Derby server (on Linux and MacOS X)