Friday, October 20, 2006

JChav 1.1.0 Released

JChav 1.1.0 is now available, featuring:
  • The y-axis (response time) is now uniform across all charts by default. This means when you look at the summary page you're comparing like with like. This behaviour can be controlled by the Ant attribute uniformyaxis="true|false".
  • Bug fix: the max value for some tests was not set if the first value in the results was both the min and max values. This showed up as a chart with a huge y-axis scale.
  • Minor chart changes: the thumbnail graphs now do not show the (unreadable) x-axis labels, saving a bit of screen space.
  • A Maven 2 plugin has been contributed (thanks to Tim McCune for this). You will also want to check out his JMeter Maven plugin.
We're not (yet) regular Maven users, so apologies if we've messed it all up and undone Tim's good work. The instructions that seem to work for us: download the 1.1.0 ZIP release; cd into the etc folder; run mvn -f maven-jchav-plugin-pom.xml install.


Friday, October 06, 2006

About JChav

JChav is a way to see the change in performance of your web application over time, by running a benchmark test for each build you produce.

Example of the JChav reports output.

How does it work?

  • You build and deploy your application.
  • You write a JMeter test plan to exercise your application.
  • From Ant, you run the JMeter test plan and log the results, using the Ant JMeter task.
  • JChav reads all the JMeter logs from each of your runs (one per build), and produces a set of charts for each test in each run.
  • Each time you deploy, re-run the JMeter tests and the JChav tool to update the charts to show the change in performance.
By running this often you can see the effect of code change on your application performance.
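In Ant terms, that loop boils down to two targets run after each deploy: one driving JMeter, one driving JChav. Here is a minimal sketch of the wiring (target and property names here are illustrative; the full versions appear in the Download & Quick Start post below):

```xml
<!-- Sketch of the per-build cycle: run the JMeter plan, log the
     results to a file unique to this build, then chart every run. -->
<target name="benchmark" depends="run-jmeter, run-jchav"/>

<target name="run-jmeter">
  <!-- The Ant JMeter task writes one result log per build id -->
  <jmeter jmeterhome="${jmeter.install.dir}"
      testplan="${jmeter.testplan}"
      resultlog="jmeter-results/result-${build.id}.xml"/>
</target>

<target name="run-jchav">
  <!-- JChav reads every log in srcdir, one per build, and charts them -->
  <jchav srcdir="jmeter-results" destdir="jchav-results"/>
</target>
```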

Getting started

JChav is made available under the Apache 2.0 license.

Live Demo

If you would like to see what the reports look like we have a live demo of the site available here. It shows a series of threads accessing the site. The example section below shows the scripts used to produce these results.

Thursday, October 05, 2006

Download & Quick Start

  • JChav needs Java 1.5 (or later).

  • Download and install Apache Ant.

  • Download and install JMeter.
  • If you've not used Ant or JMeter before, check out the detailed example we provide.

  • Download and extract the latest release of JChav from the Google Code site.

  • Read on to configure JChav and run it against your own site. Alternatively, if you'd like to run the example we supply, which runs a few tests against the Digg web site, you can: in the docs/examples directory of JChav, edit the two lines that tell JChav where it's installed and where to find JMeter. Then, from the docs/examples directory, run ant -f build-example.xml. It'll take a while to run as it samples the web site. When it's done, open digjchavresults/index.html in your web browser.
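For reference, the two lines in question set where JMeter and the JChav jars live. Judging by the property names used in the example build file, they look something like this (the paths shown are illustrative):

```xml
<property name="jmeter.install.dir" value="/opt/jakarta-jmeter-2.2"/>
<property name="jchav.libs.dir" value="/opt/jchav"/>
```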

  • Modify or create an Ant build file to run your JMeter test and record the results. Also set up the location of JMeter, where you'd like the results to be written, etc. Here's an example, which you can use by modifying the value of various properties:
    <target name="init">
      <!-- Produce a build id. If using a continuous build
           process, inherit the build id from that. -->
      <tstamp>
        <format property="build.id" pattern="dMMhhmmss" locale="en"/>
      </tstamp>
      <property description="The location of the install of JMeter"
          name="jmeter.install.dir" value="DIRECTORY_TO/jakarta-jmeter-2.2"/>
      <property description="The directory containing the jchav jars"
          name="jchav.libs.dir" value="DIRECTORY_TO/jchav"/>
      <property description="The JMeter test plan script we want to run"
          name="jmeter.testplan" value="YOUR_PLAN.jmx"/>
      <property description="The location to store the per run files"
          name="jmeter.result.dir" value="jmeter-results"/>
      <property description="The resulting file location, make sure this is unique for each build"
          name="jmeter.result.file" value="${jmeter.result.dir}/result-${build.id}.xml"/>
      <property description="The location to generate the html and charts to"
          name="jchav.result.dir" value="jchav-results"/>
    </target>

    <target name="run-jmeter" depends="init"
        description="Execute the JMeter test plan, recording the results to a file.">
      <taskdef name="jmeter"
          classname="org.programmerplanet.ant.taskdefs.jmeter.JMeterTask"
          classpath="${jmeter.install.dir}/extras/ant-jmeter.jar"/>
      <jmeter jmeterhome="${jmeter.install.dir}"
          testplan="${jmeter.testplan}"
          resultlog="${jmeter.result.file}">
        <property name="jmeter.save.saveservice.output_format" value="xml"/>
      </jmeter>
    </target>
  • Run your ant task to gather statistics: ant run-jmeter. Based on the simple example above, you'll end up with an XML file in the jmeter-results output directory.
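That result file is JMeter's own XML log, which is what JChav reads. With the output format set to xml as above, each sample is recorded roughly like this (attribute names are from the JMeter 2.2 log format; the values shown are illustrative):

```xml
<testResults version="1.2">
  <!-- One entry per sample: t = response time (ms), lb = the label
       JChav uses to group charts, ts = timestamp, s = success -->
  <httpSample t="231" lb="Home Page" ts="1161340923437" s="true"/>
  <httpSample t="187" lb="Search" ts="1161340924812" s="true"/>
</testResults>
```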

  • Modify your Ant build file to run JChav on the JMeter results. Here's an example:
    <target name="run-jchav" depends="init"
        description="Produce JChav report from the JMeter results">
      <taskdef name="jchav" classname="com.googlecode.jchav.ant.JChavTask">
        <classpath>
          <fileset dir="${jchav.libs.dir}/">
            <include name="**/*.jar"/>
          </fileset>
        </classpath>
      </taskdef>
      <jchav srcdir="${jmeter.result.dir}" destdir="${jchav.result.dir}"/>
    </target>
  • Run the ant task, e.g., ant run-jchav. This will create an output directory such as jchav-results. Open index.html there to see your reports.

  • Problems? Issues? Fixes? Suggestions? Post them on the project issue tracker.
Thanks for trying JChav.

Wednesday, October 04, 2006

Continuous Performance Monitoring With JChav

We think you will get the most benefit from using JChav alongside a continuous integration tool like CruiseControl or Continuum. By integrating JChav into the automated build/test/deploy cycle you get the additional benefit of being able to see whether the changes made are having a positive or negative effect on performance, and those results are immediately available to all interested parties.

The typical build arrangement we use with Ant and CruiseControl is as follows: as soon as a developer commits a change to the source code repository, an automated build is triggered. The normal Ant build is run, including any Checkstyle constraints, JUnit tests, etc. The build also deploys the application before triggering the JMeter scripts and performance chart generation through JChav. The notification email to all the build participants contains a link to the generated charts.

Bringing performance monitoring into every cycle in a simple way stops any nasty surprises at a later date. Trying to add performance testing to a large established application can seem insurmountable, but take it one test at a time: seeing the graph going the right way will help to motivate you to add more.
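As a sketch of that arrangement, a CruiseControl project entry along these lines can trigger the JMeter and JChav targets on each commit. The project name, paths, addresses and five-minute polling interval here are illustrative, not part of JChav itself:

```xml
<cruisecontrol>
  <project name="myapp">
    <!-- Watch the repository for commits (Subversion shown here) -->
    <modificationset quietperiod="60">
      <svn localWorkingCopy="checkout/myapp"/>
    </modificationset>
    <!-- Poll every five minutes; on change, run the build target that
         deploys the app, runs JMeter, then runs JChav -->
    <schedule interval="300">
      <ant buildfile="checkout/myapp/build.xml" target="run-jchav"/>
    </schedule>
    <!-- Mail all build participants a link to the results and charts -->
    <publishers>
      <htmlemail mailhost="localhost" returnaddress="build@example.com"/>
    </publishers>
  </project>
</cruisecontrol>
```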


The first screen: showing thumbnails of all the reports, and each thumbnail links to the details for the report.

The details screen: Showing the full report for a single test and how the test ran for different builds.

Example

Here's an example of running JChav, using the Digg web site as the subject of the test.

Produce A JMeter Script

The first step is to produce a JMeter script that will exercise the site you wish to test. The Jakarta JMeter web pages describe how to do this in depth.

JChav produces the top level set of images based upon the labels that you put into the JMeter configuration file. So where possible give your JMeter tasks meaningful names. This is especially important if you are using the http proxy tool built into JMeter to make test generation easy. JChav will do its best to turn URLs into something meaningful, but if you take the trouble to set the name on the task the resulting pages will be better. Make sure that the test plan you have created is running as you expect it to inside the JMeter workbench. When you are happy with this script save the script as a jmx file.

We have included a small example called digwalk.jmx which performs a series of simple calls to the Digg web site.

If you'd rather jump in and use JMeter to test your own site, we've provided a localhost.jmx JMeter test which simply requests the home page of a site running on your localhost on port 80. It's a template to get you up and running quickly. Also check out the examples that ship with JMeter in their docs/demos/ directory.

Running our script from ant

JChav includes build-example.xml, an Ant build file set up to run the tests and produce a JChav report. If you go into jchav/docs/examples you'll see it there. Run it with ant -f build-example.xml.

If you're logged in to your computer as the user jimbo, you can customize the build for your machine by creating a file in jchav/docs/examples/ which can contain something along these lines:


Alternatively, if you don't want to create a file, go into the docs/examples directory and edit the two lines that tell JChav where it's installed and where to find JMeter. Then, from the docs/examples directory, run ant -f build-example.xml. It'll take a while to run as it samples the web site. When it's done, open digjchavresults/index.html in your web browser.

How the script works

The good folks over at Programmer Planet have produced an Ant task for running JMeter, the JMeter Ant Task, which is included in recent JMeter distributions in the jmeter/extras folder.

The job of this Ant task is to run our tests and produce an output file containing the results. It is important that we produce a different output file each time we perform our tests: by default the task appends to one file, but we want one file per run so that we can judge whether the changes we have made to the software have improved or degraded performance over time.

To do this, simply create a unique build id for each build performed. If you are using a continuous integration tool such as CruiseControl then use its build id; otherwise use something like a timestamp, e.g.:

    <tstamp>
      <format property="build.id" pattern="dMMhhmmss" locale="en"/>
    </tstamp>

Every time we run the jmeter target we will create a new file in the ${basedir}/test/results/ directory. This is the directory that we pass to the JChav task to produce our performance graphs.

Adding JChav to the build

Initially you need to ensure that the jchav task is available to your Ant build. The following taskdef will import the task:

  <taskdef name="jchav" classname="com.googlecode.jchav.ant.JChavTask">
    <classpath>
      <fileset dir="${jchav.libs.dir}/">
        <include name="**/*.jar"/>
      </fileset>
    </classpath>
  </taskdef>

Once the task is available you simply need to add it to the build. It requires two attributes. The first is srcdir, the directory that contains all the logs from the JMeter runs, i.e. the directory you chose to write the JMeter output to in the example above. The second is destdir, the output directory for the images and HTML produced by the task.

  <jchav srcdir="${basedir}/test/results/"
      destdir="${basedir}/test/chavoutput/"
      reporttitle="Digg Example Walk"/>

JChav Ant Task Details



Generate a series of charts based upon the stored data from a series of tests. The charts show the performance over time of each page tested in the script.


Attribute     Description                                           Required
---------     -----------                                           --------
srcdir        The location of the JMeter XML files to process.      Yes
destdir       The location to produce the HTML results/images.      Yes
reporttitle   The page title for the produced reports.              No
uniformyaxis  If true, all charts share the same y-axis, computed   No
              from the largest and smallest values seen in the
              dataset. Set to false to give each chart its own
              range. Default is true. Since 1.1.0.


   <jchav srcdir="${basedir}/test/results/"
       destdir="${basedir}/test/chavoutput/"
       reporttitle="Digg Example Walk"/>
Runs the report, setting the page title to be "Digg Example Walk".
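To give each chart its own y-axis range instead of the uniform default, the uniformyaxis attribute described in the table above can be switched off (the paths here are as in the example above):

```xml
<jchav srcdir="${basedir}/test/results/"
    destdir="${basedir}/test/chavoutput/"
    reporttitle="Digg Example Walk"
    uniformyaxis="false"/>
```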

Show me the code

It's at the project's Google Code site.