This is an unordered list of implementation design decisions. Each topic tries to follow the same structure: a short description of the chosen approach, its benefits, and its drawbacks or limitations.
Coverage information has to be collected at runtime. For this purpose JaCoCo creates instrumented versions of the original class definitions. The instrumentation process happens on-the-fly during class loading using so-called Java agents.
There are several different approaches to collect coverage information. For each approach different implementation techniques are known. The following diagram gives an overview with the techniques used by JaCoCo highlighted:
Byte code instrumentation is very fast, can be implemented in pure Java and works with every Java VM. On-the-fly instrumentation with the Java agent hook can be added to the JVM without any modification of the target application.
The Java agent hook requires a Java 1.5 VM or later. For reporting, class files compiled with debug information (line numbers) allow a good mapping back to source level. However, some Java language constructs are compiled in a way that coverage highlighting leads to unexpected results, especially in case of implicitly generated code like default constructors or the control structures for finally statements.
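For illustration, the following is a minimal sketch of the agent hook mentioned above, not the actual JaCoCo agent: the class named in the Premain-Class manifest attribute is activated with the -javaagent JVM option and may rewrite every class definition while it is being loaded. The instrument() step is just a placeholder here.

    import java.lang.instrument.ClassFileTransformer;
    import java.lang.instrument.Instrumentation;
    import java.security.ProtectionDomain;

    public class CoverageAgent {

        // Called by the JVM before main() when started with -javaagent:agent.jar
        public static void premain(String agentArgs, Instrumentation inst) {
            inst.addTransformer(new ClassFileTransformer() {
                public byte[] transform(ClassLoader loader, String className,
                        Class<?> classBeingRedefined, ProtectionDomain protectionDomain,
                        byte[] classfileBuffer) {
                    // Return an instrumented version of the class, or null to keep it unchanged.
                    if (className == null || className.startsWith("java/")) {
                        return null;
                    }
                    return instrument(classfileBuffer);
                }
            });
        }

        private static byte[] instrument(byte[] original) {
            // Placeholder: a real agent would insert coverage probes here.
            return original;
        }
    }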
Instrumentation means inserting probes at certain check points in the Java byte code. A probe is a generated piece of byte code that records the fact that it has been executed. JaCoCo inserts probes at the end of every basic block.
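At source level the effect of a probe can be pictured roughly like the following sketch; the boolean array and its name are purely illustrative, JaCoCo inserts the equivalent byte code and does not modify source files.

    public class Example {

        // Illustrative probe array, one slot per probe inserted into this class.
        private static final boolean[] probes = new boolean[2];

        int max(int a, int b) {
            if (a > b) {
                probes[0] = true;   // probe at the end of the "then" block
                return a;
            }
            probes[1] = true;       // probe at the end of the fall-through block
            return b;
        }
    }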
A basic block is a piece of byte code that has a single entry point (the first byte code instruction) and a single exit point (like a jump, throw or return instruction). A basic block must not contain jump targets except at the entry point. One can think of basic blocks as the nodes in a control flow graph of a method. Using basic block boundaries to insert code coverage probes has proven very successful in EMMA.
Basic block instrumentation works regardless of whether the class files have been compiled with debug information for source lines. Source code highlighting will of course not be possible without this debug information, but percentages on method level can still be calculated. Basic block probes result in reasonable overhead regarding class file size and execution time. As, for example, multi-condition statements form several basic blocks, partial line coverage is possible. Calculating basic blocks relies on Java byte code only, therefore JaCoCo is independent of the source language and should also work with other Java VM based languages like Scala.
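To illustrate the partial line coverage mentioned above, here is a small sketch: the single source line below compiles to several basic blocks, one per operand of the && operator; if b is never evaluated, only part of the line is reported as covered.

    public class Guard {

        static boolean accept(int a, int b) {
            // Two basic blocks behind one source line: evaluating a != 0 and,
            // only if that succeeds, evaluating b != 0.
            return a != 0 && b != 0;
        }
    }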
The huge drawback of this approach is the fact that basic blocks are actually much smaller in the Java VM: nearly every byte code instruction (especially method invocations) can result in an exception. In this case the block is left somewhere in the middle without hitting the probe, which leads to unexpected results, for example in case of negative tests. A possible solution would be to add exception handlers that trigger special probes.
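A hypothetical negative test shows the effect: parse(null) throws inside the method's only basic block, so the probe at the end of the block is never hit and the instructions that actually ran are reported as not covered.

    import org.junit.Test;

    public class ParserNegativeTest {

        static int parse(String s) {
            int value = Integer.parseInt(s); // throws NumberFormatException for null
            return value * 2;                // probe at the end of the block is never reached
        }

        @Test(expected = NumberFormatException.class)
        public void rejectsNull() {
            parse(null);
        }
    }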
The Java agent is loaded by the application class loader. Therefore the classes of the agent live in the same name space as the application classes, which can result in clashes, especially with the third party ASM library. The JaCoCo build therefore moves all agent classes into a unique package.
The JaCoCo build renames all classes contained in the jacocoagent.jar into classes with an org.jacoco.<randomid> prefix, including the required ASM library classes. The identifier is created from a random number. As the agent does not provide any API, no one should be affected by this renaming. This trick also allows the JaCoCo tests themselves to be verified with JaCoCo.
JaCoCo requires Java 1.5.
The Java agent mechanism used for on-the-fly instrumentation became available with Java 1.5 VMs. Coding and testing at the Java 1.5 language level is more efficient, less error-prone – and more fun. JaCoCo can still be run against Java code compiled for older versions.
Instrumentation requires mechanisms to modify and generate Java byte code. JaCoCo uses the ASM library for this purpose.
Implementing the Java byte code specification would be an extensive and error-prone task. Therefore an existing library should be used. The ASM library is lightweight, easy to use and very efficient in terms of memory and CPU usage. It is actively maintained and includes a huge regression test suite. Its simplified BSD license is approved by the Eclipse Foundation for usage with EPL products.
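A minimal sketch of byte code instrumentation with ASM, not JaCoCo's actual instrumenter: it adds a call to a hypothetical CoverageRuntime.hit(String) at the beginning of every method of a class.

    import org.objectweb.asm.ClassReader;
    import org.objectweb.asm.ClassVisitor;
    import org.objectweb.asm.ClassWriter;
    import org.objectweb.asm.MethodVisitor;
    import org.objectweb.asm.Opcodes;

    public class ProbeInserter {

        public static byte[] instrument(byte[] classBytes) {
            ClassReader reader = new ClassReader(classBytes);
            ClassWriter writer = new ClassWriter(reader, ClassWriter.COMPUTE_MAXS);
            reader.accept(new ClassVisitor(Opcodes.ASM9, writer) {
                @Override
                public MethodVisitor visitMethod(int access, String name, String desc,
                        String signature, String[] exceptions) {
                    final String methodName = name;
                    MethodVisitor mv = super.visitMethod(access, name, desc, signature, exceptions);
                    return new MethodVisitor(Opcodes.ASM9, mv) {
                        @Override
                        public void visitCode() {
                            super.visitCode();
                            // Insert the probe: CoverageRuntime.hit("<method name>")
                            visitLdcInsn(methodName);
                            visitMethodInsn(Opcodes.INVOKESTATIC,
                                    "com/example/CoverageRuntime", "hit",
                                    "(Ljava/lang/String;)V", false);
                        }
                    };
                }
            }, 0);
            return writer.toByteArray();
        }
    }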
Each class loaded at runtime needs a unique identity to associate coverage data with. JaCoCo creates such identities from a CRC64 hash code of the raw class definition.
In multi-classloader environments the plain name of a class does not unambiguously identify a class. For example OSGi allows different versions of the same class to be loaded within the same VM. In complex deployment scenarios the actual version of the test target might differ from the current development version. A code coverage report should guarantee that the presented figures are extracted from a valid test target. A hash code of the class definition makes it possible to differentiate between classes and between versions of a class. The CRC64 hash computation is simple and fast, resulting in a small 64 bit identifier.
The same class definition might be loaded by different class loaders, which results in distinct classes for the Java runtime system. For coverage analysis this distinction should be irrelevant. Class definitions might be altered by other instrumentation based technologies (e.g. AspectJ). In this case the hash code will change and the identity gets lost. On the other hand, code coverage analysis based on classes that have somehow been altered will produce unexpected results anyway. The CRC64 hash code might produce so-called collisions, i.e. the same hash code for two different classes. Although CRC64 is not cryptographically strong and collision examples can easily be computed, for regular class files the collision probability is very low.
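A sketch of how such an identifier could be computed, assuming the bit-reversed ISO CRC64 polynomial 0xD800000000000000; the exact parameters JaCoCo uses may differ, but the table-driven computation shown here is the standard technique.

    public final class ClassId {

        private static final long POLY64REV = 0xd800000000000000L;
        private static final long[] TABLE = new long[256];

        static {
            // Precompute the CRC64 lookup table once.
            for (int i = 0; i < 256; i++) {
                long v = i;
                for (int bit = 0; bit < 8; bit++) {
                    v = ((v & 1) == 1) ? (v >>> 1) ^ POLY64REV : v >>> 1;
                }
                TABLE[i] = v;
            }
        }

        /** Computes the 64 bit identifier for the given raw class definition. */
        public static long classId(byte[] classBytes) {
            long sum = 0;
            for (byte b : classBytes) {
                sum = (sum >>> 8) ^ TABLE[(int) (sum ^ b) & 0xff];
            }
            return sum;
        }

        private ClassId() {
        }
    }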
Instrumented code typically gets a dependency on a coverage runtime which is responsible for collecting and storing execution data. JaCoCo uses JRE types and interfaces only in generated instrumentation code.
Making a runtime library available to all instrumented classes can be a painful or impossible task in frameworks that use their own class loading mechanisms. Therefore JaCoCo decouples the instrumented classes and the coverage runtime through official JRE API types only.
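One conceivable way to achieve such decoupling, shown here only as a sketch and not necessarily the mechanism JaCoCo uses: the runtime registers itself under a key in the JVM-wide system properties, so instrumented classes can reach it through java.lang.System and java.lang.Object alone, regardless of which class loader defined either side.

    public class RuntimeAccess {

        // Illustrative property key; a real key would have to be unique per agent instance.
        static final String KEY = "coverage-runtime";

        /** Called once by the agent to publish the runtime object. */
        public static void register(Object runtime) {
            System.getProperties().put(KEY, runtime);
        }

        /** Equivalent of the lookup code an instrumented class would execute. */
        public static Object lookup() {
            return System.getProperties().get(KEY);
        }
    }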
TODO: Streaming, Deep first
The Java language and the Java VM use different string representation formats for Java elements. For example, while a type reference in Java reads like java.lang.Object, the VM references the same type as Ljava/lang/Object;. The JaCoCo API is based on VM identifiers only.
Using VM identifiers directly does not cause any transformation overhead at runtime. There are several programming languages based on the Java VM that might use different notations. Specific transformations should therefore only happen at the user interface level, for example during report generation.
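A small sketch of such a transformation at the reporting level, using the ASM Type utility as one possible helper; the class and method names here are illustrative only.

    import org.objectweb.asm.Type;

    public class Names {

        /** Converts "Ljava/lang/Object;" or "java/lang/Object" to "java.lang.Object". */
        static String toSourceName(String vmName) {
            if (vmName.startsWith("L") && vmName.endsWith(";")) {
                return Type.getType(vmName).getClassName();
            }
            return Type.getObjectType(vmName).getClassName();
        }

        public static void main(String[] args) {
            System.out.println(toSourceName("Ljava/lang/Object;")); // java.lang.Object
        }
    }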
JaCoCo is implemented in several modules providing different functionality. These modules are provided as OSGi bundles with proper manifest files, but there are no dependencies on OSGi itself.
Using OSGi bundles allows well defined dependencies at development time and at runtime in OSGi containers. As there are no dependencies on OSGi, the bundles can also be used as regular JAR files.