Archive for category jenkins

Ansible – tighten your quality feedback loop

Testing your ansible playbook…




How do you puppet?

Infrastructure as Code : lessons learned.

Raise your #Reproducibility, #Maintainability, #Testability, #Reusability up to 98%.

Prepared and presented for the Tech Talk #1 @ ICAB thanks to 8thcolor.



Improved jenkins-github navigation

At work we use git feature branches extensively, and we have a jenkins job configured to build every appearing branch (origin/feature/*), but it's hard to know which commit/branch is linked to a given build. So I will show you how we use the Groovy Postbuild Plugin to add a github link and the name of the branch that was built.

Jenkins + GitHub + Groovy Postbuild = cheap and useful navigation links

Continuous build

Add a groovy post build action :

def matcher = manager.getLogMatcher(".*Commencing build of Revision (.*) (.*)\$")
if (matcher?.matches()) {
    // group 1 is the full sha, group 2 the branch, as printed by the git plugin
    branch = matcher.group(2)
    commit = matcher.group(1).substring(0, 6)
    githuburl = manager.build.project
        .getProperty("com.coravy.hudson.plugins.github.GithubProjectProperty")
        .getProjectUrl().commitId(matcher.group(1))
    description = "<a href='${githuburl}'>${commit}</a>" + " - " + branch
    manager.build.setDescription(description)
}

It assumes that you have configured the GitHub project url in the job configuration page from the github plugin.
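The log-parsing part can be tried outside Jenkins with plain Groovy; the log line below is a made-up example of what the git plugin prints, and the pattern puts the branch in its own group:

```groovy
// a sample line as printed by the jenkins git plugin (sha and branch are invented)
def line = 'Commencing build of Revision 4f2a9c1d8e7b6a5f4e3d2c1b0a9f8e7d6c5b4a39 (origin/feature/login)'

def matcher = (line =~ /.*Commencing build of Revision (.*) \((.*)\)$/)
assert matcher.matches()

def commit = matcher.group(1).substring(0, 6)   // short sha, as used in the build description
def branch = matcher.group(2)

assert commit == '4f2a9c'
assert branch == 'origin/feature/login'
```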

Don’t forget to install the Extra columns plugin and configure your main view to display the build description.


Deployment pipeline

For a deployment job, inspired by GitHub, let's say that you have an url on your website returning the currently deployed sha, like /site/sha. You have a jenkins job that tracks commits on origin/develop and triggers a deployment.

Let’s add a shell script step in your job :

# fetch the /site/sha endpoint (url omitted here)
DEPLOYED_SHA="`wget --no-check-certificate -qO-`"
# echo it so the postbuild groovy script can read it back from the build log
echo "CURRENTLY_DEPLOYED_SHA $DEPLOYED_SHA"

Then a postbuild groovy script that shows the deployed sha and the github diff against the previously deployed version:

def matcher = manager.getLogMatcher(".*commit (.*)\$")
if (matcher?.matches()) {
    branch = 'develop'
    commit = matcher.group(1).substring(0, 6)
    projectUrl = manager.build.project
        .getProperty("com.coravy.hudson.plugins.github.GithubProjectProperty")
        .getProjectUrl()
    githuburl = projectUrl.commitId(matcher.group(1))
    def matcher_currently_deployed = manager.getLogMatcher(".*CURRENTLY_DEPLOYED_SHA (.*)\$")
    commit_from = matcher_currently_deployed.group(1).substring(0, 6)
    description = "<a href='${githuburl}'>${commit}</a> - ${branch} - <a href='${projectUrl.baseUrl}compare/${commit_from}...${commit}'>diff</a>"
    manager.build.setDescription(description)
}

The diff link gives you a GitHub compare view between the previously deployed commit and the new one.
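The compare link is plain string interpolation over the project url; a quick standalone check (the url and shas here are made up):

```groovy
def projectUrlBase = 'https://github.com/acme/webapp/'   // hypothetical github project url
def commit_from = '1a2b3c'
def commit = '4d5e6f'

def diffUrl = "${projectUrlBase}compare/${commit_from}...${commit}"
assert diffUrl == 'https://github.com/acme/webapp/compare/1a2b3c...4d5e6f'
```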

If you have other hacks around GitHub and jenkins, keep me posted!



The puppet-lint –fix effect

Learning puppet by myself, I found it useful to avoid common mistakes in my modules/manifests. Nobody knows puppet in my current position, so it's hard to get reviews of my work. I started looking at automated code review/lint tools and stumbled upon puppet-lint. In its latest pre-release, the tool implements autofixing of common errors. Let's see how to measure my progress and learn from my mistakes 😉

First step, to gain visibility, I've plugged puppet-lint into our jenkins instance, setting up jenkins to collect puppet-lint warnings. One small modification on my side: I don't abort the build, to enable warnings collection, using this modified rake task.

Install the pre-release of puppet-lint.

gem install --pre puppet-lint -v 0.4.0.pre1

Uninstall the previous version

gem uninstall puppet-lint

Select gem to uninstall:
 1. puppet-lint-0.3.2
 2. puppet-lint-0.4.0.pre1
 3. All versions
> 1
Successfully uninstalled puppet-lint-0.3.2

Launch puppet-lint with the auto-fix option (puppet-lint --fix) on your modules/manifests.

ERROR: two-space soft tabs not used on line 15
ERROR: two-space soft tabs not used on line 19
FIXED: unquoted resource title on line 14
WARNING: line has more than 80 characters on line 5
WARNING: line has more than 80 characters on line 6

Double check the changes, launch the rspecs and a vagrant provision, then commit:

autofix with puppet-lint
 8 files changed, 82 insertions(+), 63 deletions(-)

Let's see the jenkins statistics.

puppet-lint effect

Thanks puppet-lint! Now it's time to fix the trivial ones (two-space soft tabs) and the less trivial ones ("foo::bar not documented", "class inheriting from params class", "define defined inside a class",…) 😉

What I've learned with this experiment:
— if you don't show your errors, you are not really pushed to fix them
— it takes time to set up these quality tools, but it's worth it
— when you are overloaded by warnings, fixing the stupid ones automatically and the easy ones by hand makes you more optimistic/confident to attack the harder ones
— warnings are easier to fix while they are fresh in your mind



Jenkins as monitoring platform of the poor

The goal

The goal was to monitor some html pages and wsdl availability. I don't really have access to the full monitoring infrastructure and wanted to check my development servers, so I was looking for a lightweight way of monitoring them. I've mixed jenkins and groovy and ended up with a pretty, low-cost monitoring solution 😉

Install the necessary plugin and tools

Manage Jenkins > Manage Plugins :
Groovy plugin : This plugin adds the ability to directly execute Groovy code.
Green Balls : Changes Hudson to use green balls instead of blue for successful builds
Groovy Postbuild Plugin : This plugin executes a groovy script in the Jenkins JVM. Typically, the script checks some conditions and changes accordingly the build result, puts badges next to the build in the build history and/or displays information on the build summary page.

Manage Jenkins > Configure System : Groovy > Groovy installations or Install automatically
For the groovy plugin you can use the built-in tool installer or just point it to an unzipped binary distribution of groovy
GROOVY_HOME : /opt/groovy/

So let’s create a free-style jenkins job with the following settings

Discard Old Builds : Max # of builds to keep : 100
Build periodically : */10 * * * *
Execute Groovy Script : groovy command :

servers = ['','','']   // hostnames omitted

// build the two url lists from the server list (the suffixes are examples)
wsdls = []
simpleurls = []
servers.each() { host ->
    wsdls.add("http://${host}/services?wsdl")
    simpleurls.add("http://${host}/login")
}

def koCount = 0
def slowCount = 0

def checkUrl = { url, check ->
    def status = 'KO'
    def host = ''
    start = System.currentTimeMillis()
    try {
        myurl = new URL(url)
        host = myurl.getHost()
        def text = myurl.getText(connectTimeout: 10000, readTimeout: 10000)
        def ok = check(url, text)
        status = ok ? 'OK' : 'KO'
        if (!ok) { koCount++ }
    } catch (Throwable t) {
        koCount++
    }
    end = System.currentTimeMillis()
    if ((end - start) > 100) { slowCount++ }
    println "$host\t" + status + '\t' + (end - start) + '\t' + ' ' + url
}
def checkAllUrl = { urls, check -> urls.each() { url -> checkUrl(url, check) } }
def wsdlCheck = { url, content -> content.contains("wsdl:definitions") }
def pingCheck = { url, content -> content.contains("status=NORMAL") }
def contentCheck = { url, content -> content.contains("login") }

checkAllUrl(wsdls, wsdlCheck)
checkAllUrl(simpleurls, contentCheck)

println "ko.count=" + koCount
println "slow.count=" + slowCount

// a non-zero exit code marks the jenkins build as failed
if (koCount > 0 || slowCount > 0) { System.exit(1) }

Build two lists of urls (wsdls, simpleurls) based on a list of servers.
A first closure, checkUrl, fetches the content of an url and updates the ok/ko counters; it also receives another closure that checks the expected content of the page.
Then, depending on the kind of content, call checkAllUrl with the matching check closure (wsdlCheck, contentCheck,…).
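Since the check closures are plain predicates on the page content, they can be unit-tested without any server, using canned payloads:

```groovy
def wsdlCheck = { url, content -> content.contains("wsdl:definitions") }
def pingCheck = { url, content -> content.contains("status=NORMAL") }
def contentCheck = { url, content -> content.contains("login") }

// canned payloads standing in for real responses
assert wsdlCheck('any', '<wsdl:definitions xmlns:wsdl="..."/>')
assert !wsdlCheck('any', '<html>404</html>')
assert pingCheck('any', 'status=NORMAL uptime=42')
assert contentCheck('any', '<form action="/login">')
```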

Add a Groovy Postbuild : Groovy script:

def addShortTextSlow = { comp, shortcomp ->
    matcher = manager.getLogMatcher(".*" + comp + ".count=(.*)\$")
    if (matcher?.matches()) {
        // add a badge with the counter value next to the build in the history
        manager.addShortText(shortcomp + ' ' + matcher.group(1), "grey", "white", "0px", "white")
    }
}
addShortTextSlow('ko', 'ko')
addShortTextSlow('slow', 'slow')

That’s it !

Subscribe to the jenkins "RSS for failures" feed, or your preferred jenkins notification tool, and benefit from the jenkins built-in ui!

You get a history of the checks:


And trending

You can easily embed this graph or the green/red ball in jira or your wiki:

<a href="">
    <img src="" alt="200" title="200" border="0"/>
</a>

<a href="">
    <img src="" border="0"/>
</a>

The sky is the limit !

Ok, now you get the idea… let's add some more checks:

— check some open ports :

try {
    s = new Socket(host, port)
    s.withStreams { input, output -> }
    println "management port ok $host $port"
} catch (Exception e) {
    println "management port KO for $host $port : " + e.getMessage()
}
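The same probe can be exercised locally by opening a listener on an ephemeral port first, so the snippet runs anywhere:

```groovy
// open a local listener so the probe has something to hit
def listener = new ServerSocket(0)          // 0 = pick a free ephemeral port
def host = 'localhost'
def port = listener.getLocalPort()

def portOk = false
try {
    def s = new Socket(host, port)
    s.withStreams { input, output -> }
    portOk = true
    s.close()
} catch (Exception e) {
    println "management port KO for $host $port : " + e.getMessage()
}
listener.close()
assert portOk
```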

— access jmx beans

import javax.management.remote.JMXConnectorFactory
import javax.management.remote.JMXServiceURL

// the host:port of the rmi registry is omitted here
def serverUrl = new JMXServiceURL('service:jmx:rmi:///jndi/rmi://')
def server = JMXConnectorFactory.connect(serverUrl).MBeanServerConnection
def memory = new GroovyMBean(server, 'java.lang:type=Memory')
println memory.listAttributeNames()
println memory.listOperationNames()
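To experiment without a remote connector, the same data is reachable through the raw JMX api against the JVM's own platform MBean server:

```groovy
import java.lang.management.ManagementFactory
import javax.management.ObjectName

// the platform MBean server of the current JVM : no remote connector needed
def server = ManagementFactory.getPlatformMBeanServer()
def heap = server.getAttribute(new ObjectName('java.lang:type=Memory'), 'HeapMemoryUsage')

println heap        // a CompositeData with init/used/committed/max
assert heap != null
```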

— some jamon statistics :


jamonurls = []
// jamonurlsuffix (the path of the jamon xml page) is omitted here
servers.each() { host -> jamonurls.add("http://${host}" + jamonurlsuffix) }

def fixJamonXml = { xml ->
	if (xml.indexOf("No data was returned") != -1) {
		return '<JAMonXML></JAMonXML>'
	}
	String content = xml.substring(xml.indexOf('<JAMonXML>'))
	rangeLabels = ["0_10ms", "10_20ms", "20_40ms", "40_80ms", "80_160ms", "160_320ms", "320_640ms", "640_1280ms", "1280_2560ms", "2560_5120ms", "5120_10240ms", "10240_20480ms"]
	content = content.replaceAll('<Label>', '<Label><![CDATA[')
	content = content.replaceAll('</Label>', ']]></Label>')
	rangeLabels.each() { rangeLabel ->
		content = content.replaceAll(rangeLabel, "range_" + rangeLabel)
	}
	content = content.replaceAll("LessThan_0ms", "range_LessThan_0ms")
	content = content.replaceAll("GreaterThan", "range_GreaterThan")
	return content
}
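The CDATA-wrapping trick used by fixJamonXml can be checked on a one-line sample (the monitor label is made up):

```groovy
def xml = '<JAMonXML><Label>com.acme.Service.find()</Label></JAMonXML>'   // made-up monitor label

// wrap the label content in CDATA so odd characters do not break the xml parsing
def content = xml.replaceAll('<Label>', '<Label><![CDATA[')
                 .replaceAll('</Label>', ']]></Label>')

assert content == '<JAMonXML><Label><![CDATA[com.acme.Service.find()]]></Label></JAMonXML>'
```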

def jamonCheck = { url, content ->
	monitors = []
	def JAMonXML = new XmlSlurper().parseText(fixJamonXml(content))
	def parseLong = { t -> if (t.text().equals("")) return null; Long.valueOf(t.text().replaceAll(',', '')) }
	def parseLongString = { t -> if (t.equals("")) return null; Long.valueOf(t.replaceAll(',', '')) }
	def parseRange = { rangeText ->	// e.g. 15/10.2 (0/0/0)
		rangeFormat = /(.*)\/(.*) \((.*)\/(.*)\/(.*)\)/
		matched = (rangeText.text() =~ rangeFormat)
		if (matched.matches()) {
			return [label: rangeText.text(), hits: parseLongString(matched[0][1]), average: matched[0][2]]
		}
		return [label: rangeText.text(), hits: 0, average: 0.0]
	}
	println "************************" + url
	JAMonXML.children().each() { row ->
		monitors.add([
			'label' : row.Label,
			'units' : row.Units,
			'hits' : parseLong(row.Hits),
			'avg' : parseLong(row.Avg),
			'total' : parseLong(row.Total),
			'stddev' : parseLong(row.StdDev),
			'lastvalue' : parseLong(row.LastValue),
			'min' : parseLong(row.Min),
			'max' : parseLong(row.Max),
			'active' : parseLong(row.Active),
			'lastAccess' : row.LastAccess,
			'ranges' : [
				'range_LessThan_0ms' : parseRange(row.range_LessThan_0ms),
				'range_0_10ms' : parseRange(row.range_0_10ms),
				'range_10_20ms' : parseRange(row.range_10_20ms),
				'range_20_40ms' : parseRange(row.range_20_40ms),
				'range_40_80ms' : parseRange(row.range_40_80ms),
				'range_80_160ms' : parseRange(row.range_80_160ms),
				'range_160_320ms' : parseRange(row.range_160_320ms),
				'range_320_640ms' : parseRange(row.range_320_640ms),
				'range_640_1280ms' : parseRange(row.range_640_1280ms),
				'range_1280_2560ms' : parseRange(row.range_1280_2560ms),
				'range_2560_5120ms' : parseRange(row.range_2560_5120ms),
				'range_5120_10240ms' : parseRange(row.range_5120_10240ms),
				'range_10240_20480ms' : parseRange(row.range_10240_20480ms),
				'range_GreaterThan_20480ms' : parseRange(row.range_GreaterThan_20480ms)]
		])
	}
	/* the 14 jamon range buckets :
	    1      0      10ms
	    2     10      20ms
	    3     20      40ms
	    4     40      80ms
	    5     80     160ms
	    6    160     320ms
	    7    320     640ms
	    8    640    1280ms
	    9   1280    2560ms
	   10   2560    5120ms
	   11   5120   10240ms
	   12  10240   20480ms
	   13  >>      20480ms */
	def getPercentiles = { monitor ->
		def ps = [0.5, 0.8, 0.9, 0.95, 0.98, 0.99]
		def ranges = []
		monitor.ranges.eachWithIndex() { it, i -> ranges.add(it.value.hits) }
		def rangesCumulative = []
		(0..13).each() { i -> rangesCumulative.add(monitor.hits > 0 ? ranges[i] / monitor.hits : 0) }
		def percentages = (0..13).collect() { i -> rangesCumulative[0..i].sum() }
		def percentiles = ps.collect { percentile -> percentages.findIndexOf { it >= percentile } }
		return percentiles
	}
	percentileserrors = []
	monitors.each {
		percentiles = getPercentiles(it)
		println percentiles.join('\t') + "\t" + it.label
		if (percentiles[2] > 8) {	// p90 beyond the 640-1280ms bucket
			percentileserrors.add(it.label)	// recording the offending monitor here is an assumption
		}
	}
	// the check closure must return true when everything is ok
	return percentileserrors.isEmpty()
}
checkAllUrl(jamonurls, jamonCheck)
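The jamon range format parsed by parseRange (hits/average followed by three slash-separated values in parentheses, e.g. 15/10.2 (0/0/0)) can be verified on its own:

```groovy
// same regexp as in parseRange, applied to the sample value from the comment above
def rangeFormat = /(.*)\/(.*) \((.*)\/(.*)\/(.*)\)/
def matched = ('15/10.2 (0/0/0)' =~ rangeFormat)

assert matched.matches()
assert matched[0][1] == '15'      // hits
assert matched[0][2] == '10.2'    // average
```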



Web’s fear and maven

All is well and good on your jetty or tomcat servers; then one of your clients wants to deploy your application in websphere application server, and the trouble begins:

 - JNDI lookup for datasources
 - Classloading mess
    - Verbose classloading and parent last
    - Jboss tattletale
    - Cleanup undesired dependencies
        - Maven exclusions
        - Correct scope
        - Maven war plugin : packagingExcludes
        - Patched jar
    - Keep it clean
         - maven-enforcer-plugin and friends
         - Combining Groovy-Sonar-Jenkins

Jndi lookup

If your client plans to use websphere, maybe he wants to use the built-in websphere datasource, an implementation collecting various statistics about connections, prepared statements,…
You probably want to keep your jetty/tomcat compliance and, when you are on websphere, switch to the specific implementation (jndi datasource, jta transaction manager,…).
You can use spring profiles to look up your datasource via jndi instead of using dbcp or another datasource implementation.

	<bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean" abstract="false">
		<property name="lookupOnStartup" value="false" />
		<property name="cache" value="true" />
		<property name="proxyInterface" value="javax.sql.DataSource" />
		<property name="expectedType" value="javax.sql.DataSource" />
		<property name="jndiName" value="java:comp/env/jdbc/MyDataSource" />
	</bean>
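A sketch of the profile split (the bean ids, profile names and dbcp settings are assumptions): keep a plain dbcp datasource in a standalone profile and the jndi proxy in a websphere-only profile, then activate one with -Dspring.profiles.active=websphere.

```xml
<!-- standalone (jetty/tomcat) profile : plain dbcp datasource -->
<beans profile="standalone">
    <bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource">
        <property name="driverClassName" value="org.h2.Driver" />
        <property name="url" value="jdbc:h2:mem:test" />
    </bean>
</beans>

<!-- websphere profile : jndi lookup of the container datasource -->
<beans profile="websphere">
    <bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
        <property name="jndiName" value="java:comp/env/jdbc/MyDataSource" />
        <property name="proxyInterface" value="javax.sql.DataSource" />
    </bean>
</beans>
```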

If you plan to use jta transactions and multiple datasources/queues, don't forget to use XA transactions, or tweak your transactions to avoid mixing access in a single transaction (and design for possible data loss).
Also reduce the isolation level through the datasource property webSphereDefaultIsolationLevel (the default is repeatable read).
If you have long running transactions, like quartz jobs, test them extensively.

Classloading mess

We are in 2012… osgi has been around for a long time, and I'm still struggling with websphere and its bundled xerces.

Verbose classloading and parent last

To diagnose classloading issues (NoClassDefFoundError, constraint violations,…) you can enable verbose classloading.
To minimize the side effects of the jars bundled in websphere, set the classloader policy of your application and module to parent last.

Jboss tattletale

I know it's ironic, but this tool developed by JBoss will save you hours of trial and error.
To audit your web-inf/lib, jboss tattletale is THE tool to identify:
– undesired dependencies, like the ones bundling javax.** classes
– duplicate jars (often due to maven relocation)
– duplicate classes


Launch mvn clean package, then take a look at the report; you will perhaps discover duplicate classes, like the ones from commons-logging, and switch to jcl-over-slf4j,

or duplicate quartz jar :


and many other undesired dependencies.

Cleanup Undesired dependencies

Maven exclusions

Since maven 2.x resolves dependencies transitively, it is possible for unwanted dependencies to be included in your project’s classpath. Projects that you depend on may not have declared their set of dependencies correctly, for example. In order to address this special situation, maven 2.x has incorporated the notion of explicit dependency exclusion. Exclusions are set on a specific dependency in your POM, and are targeted at a specific groupId and artifactId. When you build your project, that artifact will not be added to your project’s classpath by way of the dependency in which the exclusion was declared.
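A minimal exclusion sketch (the versions and the spring artifact are just examples): drop the transitive commons-logging in favour of jcl-over-slf4j.

```xml
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context</artifactId>
    <version>3.1.0.RELEASE</version>
    <exclusions>
        <!-- banned : duplicated by jcl-over-slf4j below -->
        <exclusion>
            <groupId>commons-logging</groupId>
            <artifactId>commons-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>jcl-over-slf4j</artifactId>
    <version>1.6.4</version>
</dependency>
```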


Correct scope

For example, exclude test artifacts by specifying the correct scope.


Exclude jdbc drivers by defining them as provided (idem for your datasource implementation)




In extreme cases… adding exclusions is just too long and boring. Configuring the maven war plugin to exclude the jars can be a faster way, but remember that if such a dependency breaks something in your application, it's still there in your unit tests.


Patched jars

Some open source jars bundle the same classes multiple times; for example org.w3c.dom.UserDataHandler is bundled in xom, jaxen and many more.
This interface was also bundled in websphere and in two jars of the web-inf/lib, one of them sealed, leading to java.lang.LinkageError: loading constraint violation: loader.
So I removed the duplicated classes from the jar and uploaded a xom-1.1.patched.jar to the corporate maven repository. It's really ugly, but it's working.

Keep it clean

maven-enforcer-plugin and friends

Maven provides a default set of enforcement rules, one of them being bannedDependencies.

But there is another set of rules provided by the pedantic pom enforcer

Have you ever experienced symptoms like headaches, unfocused anger or a feeling of total resignation when looking at a Maven project where everybody adds and changes stuff just as they need? Do people call you a “POM-Nazi” if you show them how to setup proper and well organized projects?

If so, the Pedantic POM Enforcers are absolutely the thing you need!

And you also have an extra rule set @codehaus.

Combining Groovy-Sonar-Jenkins

It's quite easy to create a small groovy script that:
– checks the jars in web-inf/lib against a baseline list
– fails the build, or if you are less paranoid…
– sends a mail to your team,
– or just contributes to a sonar manual measure

Let's define our baseline: for some jars you want to be notified if a different version is bundled; for your own modules you accept any version.
Then use this baseline as a whitelist: if a different version shows up, or there's no match at all, it's a new dependency -> it requires testing a websphere deployment.
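The baseline matching itself is a regex prefix test per jar; a standalone check (the jar names are invented):

```groovy
// each baseline line is a regex prefix : pin the version for risky jars, accept any for your own modules
def allowed = ['commons-lang-2\\.6', 'mymodule-.*']

def matches = { jar -> allowed.find { allow -> (jar =~ '^' + allow) } != null }

assert matches('commons-lang-2.6.jar')
assert !matches('commons-lang-2.4.jar')     // different version -> flagged
assert matches('mymodule-1.2.3.jar')        // own module, any version accepted
assert !matches('quartz-1.8.5.jar')         // not in the baseline -> flagged
```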


Then create the manual measure in sonar

You can define a manual measure

And now the groovy script to analyze the latest war file and post the manual measure to sonar and send you a mail 😉 :


// authenticated post
def postSonarMeasure = { resource, metric, val, sonarhost, token ->
	def script = "resource=${resource}&metric=${metric}&val=${val}&text=fromgroovy&description=fromgroovy"
	println script
	URL url = new URL("${sonarhost}/api/manual_measures?" + script)
	URLConnection conn = url.openConnection()
	conn.setRequestProperty("Authorization", "Basic ${token}")
	conn.setDoOutput(true)
	OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream())
	wr.write(script)
	wr.flush()
	result = conn.getInputStream().getText()
	println 'metrics created ' + result
	return result
}

def sonar = ''
def mavencoordinate=''
def token = 'myuser:mypassword'.bytes.encodeBase64().toString()
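The token is just the basic-auth header payload; the encoding round-trips cleanly:

```groovy
def token = 'myuser:mypassword'.bytes.encodeBase64().toString()

assert token == 'bXl1c2VyOm15cGFzc3dvcmQ='
// decoding gives back the original user:password pair
assert new String(token.decodeBase64()) == 'myuser:mypassword'
```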

// <not_supported/>
// 1. define the metrics
// 2. add a measure manually
// 3. launch an analysis
// 4. check the data through api/manual_measures
def postSonarUnverifiedWebInfJars = { value ->
	// delegate to the authenticated post above ; the metric key is an assumption
	postSonarMeasure(mavencoordinate, 'unverified_webinf_jars', value, sonar, token)
}

import java.util.zip.ZipFile

def getActualContentOfWebInfLibFromLastestWar = {
	// find latest war file in target directory
	fileWar = new File("./target").listFiles().findAll() { it.getName().endsWith('.war') }.sort() { a, b ->
		a.lastModified().compareTo b.lastModified()
	}.last()
	println "Checking WEB-INF/lib from " + fileWar.canonicalPath
	// and create actuals with its content
	ZipFile file = new ZipFile(fileWar)
	actuals = file.entries().collect { entry -> if (entry.getName().startsWith('WEB-INF/lib/')) return entry.getName().substring('WEB-INF/lib/'.length()) }
	actuals = actuals.findAll { it != null && !it.equals('') }
	return actuals
}

def getBaseLine = {
	allowed = []
	new File("./baseline.txt").eachLine { if (!it.trim().isEmpty()) allowed.add(it) }
	return allowed
}

actuals = getActualContentOfWebInfLibFromLastestWar()
allowed = getBaseLine()

println "************************************ "
println "actuals " + actuals.size()
println "allowed " + allowed.size()
println "************************************ "

unallowed = []
unmatched = []
allowedNonMatching = []

actuals.each { actual ->
	ok = allowed.find() { allow ->
		boolean match = (actual =~ '^' + allow)
		if (match) {
			println "matching " + actual + " " + allow
		}
		return match
	}
	if (ok == null) {
		unallowed.add("unmatched dependencies ! '${actual}' ")
		println "unmatched dependencies ! '${actual}' "
	}
}
if (!unallowed.isEmpty() || !allowedNonMatching.isEmpty()) {
	def msg = "The ${project} problem dependencies : \n" + unallowed.join('\n') + " \n add exclusions or adapt baseline.txt, check if the websphere deployment is ok.\nplease.\n" + actuals.join('\n')
	ant = new AntBuilder()
	// mailhost and recipients omitted here
	ant.mail(mailhost: '', subject: "${project} : undesired dependencies detected !", tolist: '') {
		message(msg.toString())
	}
	println msg.toString()
}
println "************************************ unused constraint from baseline.txt"
allowedNonMatching.each { println it }
println "*************************"
println "************************************ append content to baseline.txt"
unmatched.each { println it }
println "*************************"

Enable the run of this script via a maven plugin in a dedicated profile,


or via jenkins groovy post scripts.



Personal taste for test and quality toolbox


* Software Quality Axes
  - Focus on test
  - Quality of a good test
* Libraries to write maintainable tests
  - Junit, fest, mockito, openpojo, spring-mock
  - Build forge with jenkins and sonar
* IDE features and plugins
  - Eclipse : shortcuts, static imports, eclipse template for junit4, eclipse save actions
  - Plug-ins : more unit, Eclemma codecoverage, infinitest, sonar ide
* Coding style
  - Test as running specifications
  - Write testable code
  - Junit best practices
* Conclusion

Software Quality Axes

Sonar defines software quality through these 7 axes. I will mainly focus on unit tests: their code coverage, their maintainability, and making them fun to write.

As Java developers we are really lucky:

  • there's a plethora of libraries to write maintainable tests easily, and
  • we also have a great IDE with a lot of plug-ins and built-in features to maximize our productivity!

Qualities of a good test, from Kent Beck:

  • isolated (unaffected by the presence, absence, or results of other tests)
  • automated
  • easy to write
  • easy to read :  expressive -> maintainable (fix or add new ones)
  • easy to run
  • unique : don’t overlap with others -> code coverage

I picked up some plugins and libraries to help on these areas.

Libraries to write maintainable tests

JUnit (easy to run)

JUnit3's last release was in 2002, so it's time to move to junit4! All the tooling around junit is available: ide, maven, jenkins, sonar,…
Don't forget to use the 'new' features.

I know testNg does this better… but junit is so widespread in my development environment.

Fest Assert (easy to read)

Default junit assertions are not really expressive; custom error messages are often required to make failures understandable to the developer. Fest-Assert is a library that lets you, as a developer, write assertions fluently, like assertThat(frodo.getAge()).isEqualTo(33); or assertThat(fellowshipOfTheRing).hasSize(9);. More samples in this snippet. After reading this article you will understand why I prefer fest-assert over hamcrest. Note also that fest-assert can be extended to support your model: find a sample for jodatime assertions.

Mockito (easy to write)

Mockito is a mocking framework that tastes really good. It lets you write beautiful tests with a clean & simple API. Mockito doesn't give you a hangover, because the tests are very readable and they produce clean verification errors. Mockito will help you test nominal use cases but also extreme situations, like unexpected/rare/hard to reproduce exceptions (db connection issues, corrupted database,…).

org.springframework.mock.web (easy to write)

The org.springframework.mock.web package contains a comprehensive set of Servlet API mock objects, targeted at usage with Spring’s Web MVC framework, which are useful for testing web contexts and controllers. These mock objects are generally more convenient to use than dynamic mock objects such as mockito.

Openpojo (easy to write)

Trivializing POJO testing and Identity.

It's perhaps not a great idea to test getters/setters. The main point of this library is in fact to let you focus on the non-trivial code, by covering the trivial one. You will be surprised how easy it is when you use openpojo. You can also easily add your own testers, for example verifying equals/hashCode transitivity, reflexivity, null-safety,…

Build forge with jenkins and sonar (easy to run)

Getting fast feedback on your commits, and on the ones from your coworkers, is what jenkins provides you.

Sonar will follow the evolution of your project, keeping an eye on potential bugs, code duplication, uncovered code, fragile tests, integration tests,…

 IDE features and plugins

Eclipse shortcuts (easy to write)

To be a productive programmer, don't use the mouse 😉 Eclipse already offers a lot of shortcuts. My favorites: ctrl-shift-T and ctrl-shift-R to go to a type or a resource.
If you don't know them well, some articles here and here, and a great plugin to learn them.

Static imports (easy to write)

Open Preferences (Window -> Preferences)
Browse to Java -> Editor -> Content Assist -> Favorites
Click 'New Type...'
enter org.junit.Assert.*
enter org.mockito.Mockito.*
enter org.mockito.Matchers.*
enter org.fest.assertions.Assertions.*

If you type mock and then press Ctrl + Space, the static method Mockito.mock is proposed and the static import is added.
See it in action here.

Eclipse template for junit4 (easy to write)

These templates, once imported, will help you produce junit4 test methods via content assist:

enter test
Ctrl + Space and pick test4
your test method is generated; just fill in the name of the test case

Eclipse save actions

Sonar checks for unused imports and the systematic use of braces. The good news is that you can configure eclipse to fix these two plagues automatically with save actions:

Under "Preferences": Java > Editor > Save Actions
Check "Additional actions"
Click "Configure…"
Go to the "Code Style" tab
Check "Use blocks in if/while/for/do statements" and configure to your preferences

MoreUnit plugin (easy to write and run)

MoreUnit assists you in writing more unit tests, with features like easy navigation between code and unit test, skeleton generation,…

ctrl-j jump to test or code
ctrl-r run associated test

If you don't really follow the naming convention proposed by the plugin, you can modify the preference 'enable extended search for test methods'.
If needed, there's a similar plugin: fast code.

Eclemma codecoverage plugin (unique and isolated)

Detecting uncovered code allows you to easily identify missing unit tests.
Eclemma brings code coverage analysis directly into the Eclipse workbench:

  • Fast develop/test cycle: launches from within the workbench, like JUnit test runs, can directly be analyzed for code coverage.
  • Rich coverage analysis: coverage results are immediately summarized and highlighted in the Java source code editors.
  • Non-invasive: EclEmma does not require modifying your projects or performing any other setup.

Infinitest (easy to run)

Infinitest is a never-ending unit test runner that displays assertion failures or errors as problems, just like compilation errors! To make infinitest your new best friend you need fast test suites, or to filter out the slow ones.

Let's see fest and infinitest in action: errors are displayed in front of contains, and the fest message shows the actual content of fellowshipOfTheRing.


Setting up checkstyle, findbugs and pmd in all your developers' workspaces, and keeping all the plugins and rules up to date, is quite hard. This last plugin covers all the other axes. A good habit is to launch it when you think you are done… you will perhaps discover that you are not done yet 😉

The Sonar Eclipse Plugin provides a comprehensive integration of Sonar in Eclipse for Java projects. The objective of this integration is that developers do not leave their favorite IDE anymore to manage the quality of their source code. Most information displayed in the Sonar Web interface is now available in Eclipse. You can run local analysis based on the ruleset configured in your central sonar server.

Coding style

Writing tests is always easier if your code is designed to be testable and your tests show their intent.

Writing unit tests as running specifications:

A good naming convention makes it clear what you expect from the MailInboxSyncer implementation.

class MailInboxSyncerTest {
     itShouldSyncMessagesFromPopSinceLastSync() {...}
     itShouldCloseConnectionEvenWhenExceptionThrown() {...}
     itShouldSyncMessagesOnlyWhenNotAlreadyInInbox() {...}
     itShouldIgnoreMessagesOlderThenLastSync() {...}
     itShouldIgnoreMessagesFailingFilter() {...}
     itShouldDefaultToDefaultTimeWhenNeverSynced() {...}
}

Write testable code

In general, don't follow any of the rules from the article How To Write Unmaintainable Code and Ensure a Job for Life ;-).
Global state is undesirable (statics, singletons,…), see testability-explorer, while injectability is good: dataSource = new org.apache.commons.dbcp.BasicDataSource() vs setDataSource(DataSource dataSource).
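The injectability point can be sketched in a few lines (the class and field names are hypothetical): a dao that receives its datasource is trivial to test with a stand-in, while one that news up BasicDataSource itself needs a real database.

```groovy
// hypothetical example : a dao that receives its datasource instead of creating it
class ReportDao {
    def dataSource                      // injected, easy to replace in a test
    int countRows() { dataSource.rowCount }
}

// in a test, a plain map can stand in for the real datasource
def fake = [rowCount: 42]
assert new ReportDao(dataSource: fake).countRows() == 42
```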

Code design principles that could help make a code base more testable:

Layered design
Code against interfaces
Put the code where it belongs
Prefer composition over inheritance.
Favor polymorphism over conditionals
Avoid static methods and the Singleton pattern.
Isolate dependencies.
Inject dependencies

Code practices that don’t help

Mixing concerns (business logic, caching, transactions,...)
Mixing service objects with value objects
Kilometric classes/methods
A lot of dependencies between your classes
Doing complicated creation work in objects
Avoiding interfaces, depending on concrete classes
Static methods/fields
Stateful singletons
A lot of global attributes
Making the scope of a variable broader than required

Junit Best practices


Don't write a unit test without any assert
Don't log… assert
Don't write complex conditional logic: either you are not testing something, or the code is too complex and should be refactored
Avoid hard-coded datasets
Avoid overly complex asserts


Small and fast tests
No side effects
A bug means a new unit test
Externalize your dataset if possible (xml, csv,...)
Avoid time dependencies via jodatime

For spring integration tests: @RunWith(SpringJUnit4ClassRunner.class).
Assert results (not null, not empty,…).
Delta assertion: the pattern is 1. measure, 2. exercise, 3. measure a second time, 4. assert on the deltas.
Guard assertion: assert some state before exercising the SUT, for example an empty case for the current dataset.
Create custom assertions if needed: when complex assertions are "copy-pasted" over the test code, or when the assertion code "hides" the goal of the test, extract the code into an assertion method.


No excuse! You now have my personal toolbox to write more (maintainable) tests.
I'd like to thank all the developers contributing to better quality, more productivity and gains in visibility via their tools and libraries!

Anything missing from this list?
Do you have experience with bdd frameworks?
Share your own testing pearls!
