JMH - Great Java Benchmarking
Learn how to get started with JMH, an open source framework for measuring the performance of your Java code.
If you still measure execution time like this:
long before = System.currentTimeMillis();
doMagic();
long now = System.currentTimeMillis();
System.out.println("Seconds elapsed: " + (now - before) / 1000F);
Then it's time to use the JMH framework.
This rich open source framework gives you a proper way to measure the performance of your Java code. With JMH, you can easily:
- Add a JVM warm-up stage.
- Specify the number of runs and the number of threads per run.
- Calculate the precise average or single-shot execution time.
- Share common state between benchmark methods (see the sketch after this list).
- Configure the result output format.
- Pass specific JVM-related parameters (e.g. disable inlining).
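The shared-state point deserves a quick illustration. The following is a minimal sketch, not part of the article's sample project, of how a @State class lets a benchmark method reuse prepared data; the class and field names are invented for the example.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;
import org.openjdk.jmh.annotations.*;

public class SharedStateBenchmark {

    // Scope.Benchmark: a single instance of this class is shared by all benchmark threads.
    @State(Scope.Benchmark)
    public static class SharedData {
        List<Integer> numbers;

        @Setup(Level.Trial)
        public void prepare() {
            // Runs once per trial, before any measurement iterations.
            numbers = new ArrayList<>();
            for (int i = 0; i < 1_000; i++) {
                numbers.add(i);
            }
        }
    }

    @Benchmark
    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.MICROSECONDS)
    public int sumSharedList(SharedData data) {
        // Returning the result keeps the JIT from eliminating the computation.
        int sum = 0;
        for (int n : data.numbers) {
            sum += n;
        }
        return sum;
    }
}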
Getting Started
To generate a hello world project, just execute this Maven command:
mvn archetype:generate -DinteractiveMode=false -DarchetypeGroupId=org.openjdk.jmh -DarchetypeArtifactId=jmh-java-benchmark-archetype -DgroupId=org.sample -DartifactId=test -Dversion=1.0
Writing Your First JMH Hello World Benchmark
In our simple example, we will estimate the average execution time of Thread.sleep(2000). In MyBenchmark.java, we put:
package org.sample;

import org.openjdk.jmh.annotations.*;

import java.util.concurrent.TimeUnit;

public class MyBenchmark {

    @Benchmark
    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.MICROSECONDS)
    public void testMethod() {
        doMagic();
    }

    public static void doMagic() {
        try {
            Thread.sleep(2000);
        } catch (InterruptedException ignored) {
        }
    }
}
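Mode.AverageTime is just one of JMH's measurement modes. As a hedged aside (these methods are not part of the generated sample), the same doMagic() call could also be measured for throughput or as single invocations by adding methods like these to MyBenchmark:
@Benchmark
@BenchmarkMode(Mode.Throughput)
@OutputTimeUnit(TimeUnit.SECONDS)
public void testThroughput() {
    // Reports how many operations complete per second.
    doMagic();
}

@Benchmark
@BenchmarkMode(Mode.SingleShotTime)
@OutputTimeUnit(TimeUnit.MILLISECONDS)
public void testSingleShot() {
    // Measures one invocation at a time, without continuous looping.
    doMagic();
}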
Now we need to build it by executing the familiar Maven command:
mvn clean install
Starting the Benchmark
Once the Maven build completes, you can find the executable benchmark.jar in the target folder. We will run this jar with the following benchmark parameters:
- number of forks = 1
- warm-up iterations = 2
- measurement iterations = 5
java -jar target/benchmark.jar -f 1 -wi 2 -i 5
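The jar accepts many more options; running it with -h prints the full list. For instance, the output-format feature mentioned earlier maps to the -rf and -rff flags. As an assumed variation of the command above (result.json is an invented file name):
java -jar target/benchmark.jar -f 1 -wi 2 -i 5 -rf json -rff result.json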
Results of the Benchmark
# JMH version: 1.19
# VM version: JDK 1.8.0_60, VM 25.60-b23
# VM invoker: C:\Program Files (x86)\Java\jre1.8.0_60\bin\java.exe
# VM options: <none>
# Warmup: 2 iterations, 1 s each
# Measurement: 5 iterations, 1 s each
# Timeout: 10 min per iteration
# Threads: 1 thread, will synchronize iterations
# Benchmark mode: Average time, time/op
# Benchmark: org.sample.MyBenchmark.testMethod
# Run progress: 0,00% complete, ETA 00:00:07
# Fork: 1 of 1
# Warmup Iteration 1: 1999653,736 us/op
# Warmup Iteration 2: 1999914,772 us/op
Iteration 1: 2000066,040 us/op
Iteration 2: 1999286,499 us/op
Iteration 3: 1999159,327 us/op
Iteration 4: 1999529,242 us/op
Iteration 5: 1999628,748 us/op
Result "org.sample.MyBenchmark.testMethod":
1999533,971 ±(99.9%) 1352,808 us/op [Average]
(min, avg, max) = (1999159,327, 1999533,971, 2000066,040), stdev = 351,320
CI (99.9%): [1998181,163, 2000886,779] (assumes normal distribution)
# Run complete. Total time: 00:00:14
Benchmark Mode Cnt Score Error Units
MyBenchmark.testMethod avgt 5 1999533,971 ± 1352,808 us/op
As you can see, JMH measured the average time as 1.999533971 seconds.
Alternative JMH Configuration
There are at least two ways to configure your JMH benchmark.
#1 Using annotations:
@Benchmark
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MICROSECONDS)
@Fork(value = 1)
@Warmup(iterations = 2)
@Measurement(iterations = 5)
public void testMethod() {
    doMagic();
}
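The annotation route can also cover the JVM-related parameters mentioned at the start. As a hedged sketch (the method name is invented), extra JVM flags such as disabling inlining can be appended to the forked JVM via @Fork:
@Benchmark
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MICROSECONDS)
@Fork(value = 1, jvmArgsAppend = {"-XX:-Inline"})  // the forked JVM is started with this extra flag
@Warmup(iterations = 2)
@Measurement(iterations = 5)
public void testMethodWithoutInlining() {
    doMagic();
}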
#2 Using the Options object provided by JMH out of the box:
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

Options opt = new OptionsBuilder()
        .include(MyBenchmark.class.getSimpleName())
        .warmupIterations(2)
        .measurementIterations(5)
        .forks(1)
        .shouldDoGC(true)
        .build();
new Runner(opt).run();
You can download the source code and build files here.