Restful Java Metering by Dropwizard Metrics
Dropwizard has taken the Java world by storm with its ease of use and great functionality. The metrics library is particularly powerful for anyone wanting to measure statistics in their Java applications.
We saw how to do RESTful Java metering using Jersey event listeners (read here) in one of our earlier articles.
Here we are going to see how to use the Dropwizard Metrics framework to meter our RESTful resource methods. Dropwizard Metrics uses Jersey event listeners internally to achieve this; it provides a nice wrapper and plenty of plug-ins to gather the performance of each resource method without much effort.
Ok! Ready…
There are three steps involved in order to achieve this:
- Metrics dependency in Maven pom
- Register ‘MetricRegistry’ and ‘ConsoleReporter’ in our resource configuration
- Add the @Timed or @Metered annotation to resource methods
Maven pom:
Make the following changes in your pom.xml under dependencies.
<dependency>
    <groupId>io.dropwizard.metrics</groupId>
    <artifactId>metrics-core</artifactId>
    <version>3.1.0</version>
</dependency>
<dependency>
    <groupId>io.dropwizard.metrics</groupId>
    <artifactId>metrics-jersey2</artifactId>
    <version>3.1.0</version>
</dependency>
Since we are going to use the Metrics framework inside the Jersey (RESTful Java) framework, the second dependency is required. If your REST service implementation is not using Jersey, you can skip the ‘metrics-jersey2’ dependency. After a Maven sync, the required libraries will be available in our application.
Register ‘MetricRegistry’ and ‘ConsoleReporter’:
Both the MetricRegistry and ConsoleReporter implementations come from the Metrics framework. Together they capture the performance of our resource methods and emit the aggregated results to the console as a report.
So make the following changes in our ResourceConfig implementation:
import java.util.HashSet;
import java.util.Set;
import java.util.concurrent.TimeUnit;

// Logging imports (assuming Log4j 2, based on LogManager.getLogger)
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.glassfish.jersey.server.ResourceConfig;

import com.codahale.metrics.ConsoleReporter;
import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.jersey2.InstrumentedResourceMethodApplicationListener;

public class RestSkolApplication extends ResourceConfig {

    private static final Logger logger = LogManager.getLogger(RestSkolApplication.class);

    private Set<Class<?>> classes = new HashSet<Class<?>>();

    public RestSkolApplication() {
        initializeApplication();
    }

    private void initializeApplication() {
        registerListeners(); // Register listeners
    }

    private void registerListeners() {
        // Holds all the metrics gathered for the application
        final MetricRegistry metricRegistry = new MetricRegistry();

        // Hooks Metrics into Jersey's request event lifecycle
        register(new InstrumentedResourceMethodApplicationListener(metricRegistry));

        // Prints an aggregated report to the console every minute
        ConsoleReporter.forRegistry(metricRegistry)
                .convertRatesTo(TimeUnit.SECONDS)
                .convertDurationsTo(TimeUnit.MILLISECONDS)
                .build()
                .start(1, TimeUnit.MINUTES);

        logger.info("Console reporter is enabled successfully!");
    }
}
The console reporter prints the performance metrics every minute. This interval is configurable, so adjust it based on your needs.
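For example, here is a minimal sketch of the same reporter configured with a shorter interval; the 30-second value and the explicit MetricFilter.ALL filter (from com.codahale.metrics) are illustrative choices, assuming the metricRegistry created above:
ConsoleReporter.forRegistry(metricRegistry)
        .convertRatesTo(TimeUnit.SECONDS)
        .convertDurationsTo(TimeUnit.MILLISECONDS)
        .filter(MetricFilter.ALL) // report all metrics; this is the default filter
        .build()
        .start(30, TimeUnit.SECONDS); // report twice per minute instead of once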
@Timed or @Metered annotation:
The final step is to add either the @Timed or the @Metered annotation to the REST resource methods, like below:
@Path("books")
public class BookResource {
@GET
@Produces(MediaType.APPLICATION_JSON)
@Timed
public Response getAllBooks() {
System.out.println("Get all books resource is called");
final List<Book> books = BookDataStore.getInstance().getBooks();
return Response.ok()
.entity(books)
.build();
}
@Path("{id}")
@GET
@Produces(MediaType.APPLICATION_JSON)
@Timed
public Response getBook(@PathParam("id") String id) {
final Book book = BookDataStore.getInstance().getBook(id);
return Response.ok() // (Response code)
.entity(book) // (response value)
.build();
}
}
Here I have used the @Timed annotation; you can use the @Metered annotation as well to track the performance numbers. Both annotations are provided by the Metrics framework: @Timed records call rates and latency distributions, while @Metered records only call rates.
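For comparison, here is a minimal sketch of a @Metered resource method. The DELETE endpoint and the removeBook data-store call are hypothetical additions for illustration only; the annotations themselves live in the com.codahale.metrics.annotation package pulled in by the dependencies above.
// Hypothetical addition to BookResource, shown only to illustrate @Metered
// (needs import com.codahale.metrics.annotation.Metered and javax.ws.rs.DELETE)
@Path("{id}")
@DELETE
@Metered // records only call counts and rates (a Meter), not latencies
public Response deleteBook(@PathParam("id") String id) {
    BookDataStore.getInstance().removeBook(id); // hypothetical data store method
    return Response.noContent()
            .build();
}
With @Metered, such a method should show up under a '-- Meters' section of the console report with only count and rate values, whereas @Timed also produces the latency percentiles you see in the Timers section below.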
Good Job!
When you compile and run the application, you will see output similar to the following in your console:
-- Timers ----------------------------------------------------------------------
com.cloudskol.restskol.resources.BookResource.getAllBooks
             count = 5
         mean rate = 0.08 calls/second
     1-minute rate = 0.04 calls/second
     5-minute rate = 0.01 calls/second
    15-minute rate = 0.01 calls/second
               min = 0.09 milliseconds
               max = 0.23 milliseconds
              mean = 0.13 milliseconds
            stddev = 0.05 milliseconds
            median = 0.11 milliseconds
              75% <= 0.12 milliseconds
              95% <= 0.23 milliseconds
              98% <= 0.23 milliseconds
              99% <= 0.23 milliseconds
            99.9% <= 0.23 milliseconds
com.cloudskol.restskol.resources.BookResource.getBook
             count = 2
         mean rate = 0.03 calls/second
     1-minute rate = 0.02 calls/second
     5-minute rate = 0.01 calls/second
    15-minute rate = 0.00 calls/second
               min = 0.42 milliseconds
               max = 14.78 milliseconds
              mean = 7.44 milliseconds
            stddev = 7.17 milliseconds
            median = 0.42 milliseconds
              75% <= 14.78 milliseconds
              95% <= 14.78 milliseconds
              98% <= 14.78 milliseconds
              99% <= 14.78 milliseconds
            99.9% <= 14.78 milliseconds
The first resource was requested 5 times and the second resource 2 times; for each timer, the report shows call rates along with latency percentiles.
Dropwizard Metrics is a great tool for measuring the performance of our REST service methods. By watching the numbers carefully, we can address many performance problems during the development phase itself!
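If you want to act on these numbers in code rather than just read them in the console, the registry can also be queried directly. Below is a minimal sketch, assuming you keep a reference to the MetricRegistry created in RestSkolApplication; the SlowEndpointCheck class name and the 100 ms threshold are purely illustrative:
import java.util.Map;

import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.Snapshot;
import com.codahale.metrics.Timer;

public final class SlowEndpointCheck {

    // Hypothetical helper: scans every timer in the registry and flags endpoints
    // whose 99th-percentile latency exceeds an illustrative 100 ms threshold.
    public static void logSlowEndpoints(MetricRegistry metricRegistry) {
        for (Map.Entry<String, Timer> entry : metricRegistry.getTimers().entrySet()) {
            final Snapshot snapshot = entry.getValue().getSnapshot();
            final double p99Millis = snapshot.get99thPercentile() / 1_000_000.0; // snapshot values are in nanoseconds
            if (p99Millis > 100.0) {
                System.out.printf("%s 99th percentile is %.2f ms%n", entry.getKey(), p99Millis);
            }
        }
    }
}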
I hope you have enjoyed reading this article. Please share it with your friends and leave your comments or feedback below.
All the code samples are available at https://github.com/cloudskol/restskol
Published at DZone with permission of Thamizh Arasu, DZone MVB. See the original article here.