I developed the original ESB Performance Test Framework in June 2007, while actively working on Apache Synapse and the WSO2 ESB. Since then, we've run three rounds of testing comparing both proprietary and open source ESBs - including Mule, Apache ServiceMix, Apache Synapse/WSO2 ESB, a leading proprietary ESB, and the proprietary version of an open source ESB.
It was interesting to see other vendors such as Mule and BEA, in addition to WSO2, pick up this test framework to publish results. However, because different hardware configurations were used, and vendors performed their own advanced tuning and optimizations, the results were somewhat questionable and could not be compared fairly.
This latest round - Round 4 - takes the framework to Amazon EC2, and is designed to let end-users run the tests on an EC2 node for less than $2 of computing time! This eliminates hardware differences entirely, and allows users to see exactly how the tests have been configured, tuned, and run.
I hope Mule, ServiceMix, JBoss, OpenESB, Petals, BEA/Oracle, IBM, WSO2, and any other ESB vendors I've missed will make use of this opportunity and publicly share the necessary configurations, so that we can all rely on an accurate and fair benchmark - one that can be independently verified on demand by any end-user!
Hence, unlike in the past, I will refrain from naming any competitor unless its vendor or open source project team requests that it be included and the results publicly shared. It is now left to the users of ESBs to demand from vendors the configurations necessary to run these tests, and to decide for themselves which one to select after executing the tests on Amazon EC2. I hope to make this process even simpler in the next round, with a custom AMI and more automation.
The Round 4 results include the three test cases conducted earlier (i.e. Direct Proxy, Content Based Routing [CBR] Proxy, and XSLT Proxy), and add a new scenario for WS-Security processing.
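To give a feel for what one of these scenarios involves, here is a minimal sketch of what the XSLT Proxy case might look like as an Apache Synapse proxy service configuration. The proxy name, stylesheet keys, and backend endpoint URI are all hypothetical placeholders, not the actual benchmark configuration - the real configurations are what vendors are being asked to publish:

```xml
<!-- Hypothetical sketch of an XSLT Proxy scenario in Apache Synapse.
     The request payload is transformed before being forwarded to the
     backend service, and the response is transformed on the way back. -->
<proxy name="XSLTProxy">
  <target>
    <inSequence>
      <!-- apply a stylesheet (registry key is a placeholder) to the request -->
      <xslt key="transform_request.xslt"/>
      <send>
        <!-- backend endpoint URI is a placeholder -->
        <endpoint>
          <address uri="http://localhost:9000/services/EchoService"/>
        </endpoint>
      </send>
    </inSequence>
    <outSequence>
      <!-- transform the response before returning it to the client -->
      <xslt key="transform_response.xslt"/>
      <send/>
    </outSequence>
  </target>
</proxy>
```

The Direct Proxy case would be the same structure without the `<xslt>` mediators (pure pass-through), while the CBR case would route on message content instead of transforming it.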