1. Overview

This article is based on the 5th round of public ESB Performance benchmarking since June 2007. This performance framework has become the de facto ESB performance test suite, with multiple vendors having used it to publish benchmark results in the past.

The significant changes in this round include:

  • The inclusion of 8 free and open source ESBs

    • AdroitLogic UltraESB v1.6.0 [link]

    • WSO2 ESB v4.0.0 [link]

    • Mule Community Edition v3.2.0 [link]

    • Apache ServiceMix v4.3.0 [link]

    • Fuse ESB v4.4.0 [link]

    • Talend ESB SE v4.2.0 [link]

    • JBoss ESB v4.10 [link]

    • Petals ESB v3.1.3 [link]

  • The publication of a standalone Amazon EC2 AMI image of the complete test scenario and resources (ami-49428f20)

  • The publication of the ESB Performance Benchmark resources project on BitBucket

  • The inclusion of content based routing [CBR] test cases for SOAP header and Transport header based routing

2. Disclaimer

These configurations have been built by AdroitLogic, partly based on previously published resources and contributions submitted to the resources repository. The solutions we’ve come up with may have inherent limitations due to our limited understanding of some of the other ESBs and the limited time we spent on them. Hence these configurations may not be optimal, and may not be optimally tuned for the scenarios tested.

However, AdroitLogic has spent many days and nights getting the other ESBs to work under conditions as close to identical as possible. We would be happy to receive feedback from vendors and/or end users to improve the configurations and tuning parameters used for the different ESBs. We will include such updates in future rounds; they can also be submitted directly to the Mercurial repository as a pull request.

3. Scenarios tested

  • Direct Proxy Service

  • Content Based Routing Proxy

    • on SOAP body payload

    • on SOAP header

    • on Transport Header

  • XSLT Transformation Proxy

  • WS-Security Proxy

Each of these scenarios is described in detail under Performance Test Cases.
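To make the three content based routing variants concrete, here is a minimal, illustrative sketch (in Python, using only the standard library) of the kind of decision each CBR proxy makes. The header names, element names and endpoint names here are hypothetical and are not taken from the benchmark configurations:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def route(soap_xml, transport_headers):
    """Return a backend endpoint name for a request, checking (in order)
    a transport header, a SOAP header, and the SOAP body payload."""
    # 1. Transport header based routing (e.g. a custom HTTP header).
    #    "X-Route" is a made-up header name for illustration only.
    if transport_headers.get("X-Route") == "secondary":
        return "backend-b"
    root = ET.fromstring(soap_xml)
    # 2. SOAP header based routing (hypothetical <routing> header element).
    hdr = root.find(f"{{{SOAP_NS}}}Header/routing")
    if hdr is not None and hdr.text == "secondary":
        return "backend-b"
    # 3. SOAP body payload based routing on the first child element's name.
    body = root.find(f"{{{SOAP_NS}}}Body")
    if body is not None and len(body) and body[0].tag.endswith("buyStocks"):
        return "backend-a"
    return "default-backend"
```

In the actual ESBs these decisions are expressed in each product's own configuration language (XPath expressions, routing rules and so on); the sketch only shows the order and nature of the checks, not any vendor's implementation.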

4. Performance Test Environment and Resources

4.1. Amazon EC2 AMI image

The performance test was carried out on an Amazon EC2 High CPU Extra Large (c1.xlarge) instance running the public AMI "ami-49428f20", based on the EBS backed official Ubuntu 10.10 image. This was a 64-bit instance with 7GB of RAM, an 8GB EBS backed storage volume, 20 ECUs and 8 cores. All tests against each ESB were carried out on the same instance, started on the 3rd of October 2011, and executed one after another. Network isolation was not required, as all test traffic flowed only within the local machine hosting the load generator, the ESB and the backend service.

Each ESB was allocated 2GB of Java heap memory. The Sun JDK 1.6.0_26-b03 was used, with the unlimited strength policy files applied to allow WS-Security testing with larger key sizes. The instance had a 2GB RAM disk, which was utilized by the UltraESB. We attempted to allow approximately 300 worker threads and a socket timeout of 120 seconds for each ESB, but this was not possible for some ESBs due to time and/or documentation limitations.

4.2. Performance scenario configurations for each ESB

The configurations used for each ESB are published in the BitBucket repository at https://bitbucket.org/adroitlogic/esbperformance. This includes a README.TXT file for each ESB, with basic instructions to configure each ESB from scratch, and to build and deploy the configurations.

4.3. Results Analysis Worksheet hosted on Google Docs

ESB Performance Testing Round 5 is a publicly accessible Google document hosting the summarized results.

4.4. Raw Execution Logs

The raw execution logs and the ESB log files for each performance run can be found in the ~/client-scripts directory and in each ESB specific directory under ~/esbs.

5. Results

The following ESBs did not complete the performance test successfully, and hence were excluded from the analysis:

  • Apache ServiceMix v4.3.0

  • Fuse ESB v4.4.0

  • JBoss ESB v4.10

  • Petals ESB v3.1.3

The following ESB completed the performance test, although many errors were encountered:

  • Talend ESB v4.2.0

The Talend ESB encountered a large number of errors and failures, i.e. a total of 131,165 out of 3,590,400 requests, or 3.65%, but still managed to complete the test. This is highlighted in the "Summary" sheet of the ESB Performance Testing Round 5 Google document.
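As a quick sanity check of the reported failure rate (a trivial calculation, not part of the benchmark suite), the percentage follows directly from the two figures above:

```python
# Failure rate reported for the Talend ESB: failed requests over the
# total number of requests issued across the whole run.
failed, total = 131_165, 3_590_400
rate = failed / total
print(f"{rate:.2%}")  # → 3.65%
```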

The following ESBs completed the performance test with only socket timeout errors, which can be expected under load for various reasons:

  • Mule ESB CE v3.2.0

  • WSO2 ESB v4.0.0

  • UltraESB v1.6.0 - Vanilla

The following ESB completed the performance test without a single error or socket timeout, and very clearly led all other ESBs across all use cases:

  • UltraESB v1.6.0 - Enhanced

Here are more details on the ESBs that failed and were eliminated from the final analysis.

5.1. Apache ServiceMix v4.3.0 - more details

At times the ServiceMix ESB did not start as expected, and the easiest way to get it running again was to re-install it. Messages appeared to be stuck inside the ESB, and to be loaded again after a restart. For a single request, all scenarios worked as expected. However, when running the WS-Security smoke test, the following error was seen:

16:57:48,056 | WARN  | qtp339044420-242 | PhaseInterceptorChain            |  -  -  | Interceptor for {http://services.samples/xsd}SecureProxy#{http://services.samples/xsd}buyStocksOperation1K has thrown exception, unwinding now
 at org.apache.cxf.staxutils.FastStack.pop(FastStack.java:31)[116:org.apache.cxf.bundle:2.3.2]

During the load test iteration with 20 concurrent users against the XSLTProxy, the ServiceMix ESB became almost unusable while the pre-load-test warm-up phase was still executing. It was barely servicing requests and had become extremely slow (as observed via a tcpdump), yet the log files did not report any faults.

5.2. Fuse ESB v4.4.0 - more details

For a single request, all scenarios were successful. However, running the smoke performance test started filling up the log file with the following error, and hence we abandoned the performance tests for the WS-Security scenarios:

12:58:32,478 | WARN  | tp1807330842-482 | PhaseInterceptorChain            |  -  -  | Interceptor for {http://services.samples/xsd}SecureProxy#{http://services.samples/xsd}buyStocksOperation1K has thrown exception, unwinding now
 at org.apache.cxf.staxutils.FastStack.pop(FastStack.java:31)[123:org.apache.cxf.bundle:2.4.1.fuse-00-43]

The load test iteration with 640 concurrent users and 100K messages against the DirectProxy rendered the Fuse ESB unusable. It was not servicing any requests after the apparent crash, and the logs contained stack traces such as the following:

09:44:52,804 | WARN  | p1776334754-1113 | PhaseInterceptorChain            | ?  ? |  -  -  | Interceptor for {http://services.samples/xsd}DirectProxy#{http://services.samples/xsd}buyStocksOperation5K has thrown exception, unwinding now
org.apache.cxf.interceptor.Fault: Current event not START_ELEMENT
 at org.apache.servicemix.cxfbc.CxfBcConsumer$JbiInvokerInterceptor.handleMessage(CxfBcConsumer.java:896)[168:servicemix-cxf-bc:2011.02.0.fuse-00-43]
Caused by: java.lang.IllegalStateException: Current event not START_ELEMENT

5.3. JBoss ESB v4.10 - more details

At a concurrency level of 320 users against the CBR Proxy service, the ESB failed and became unable to process any further requests. On an earlier run that produced the same effect, the following error messages could be seen:

20:05:40,699 INFO  [ServiceInvoker] Unresponsive EPR: InVMEpr [ PortReference < <wsa:Address invm://7365727669636524242424242424242424242443425250726f7879/false?false#10000/>, <wsa:ReferenceProperties jbossesb:passByValue : false/> > ] for message: header: [  ]
20:05:54,371 INFO  [ServiceInvoker] Delivering message [header: [  ]] to DLQ.
20:05:54,389 ERROR [[http-cbr-body]] Servlet.service() for servlet http-cbr-body threw exception
org.jboss.soa.esb.listeners.message.ResponseTimeoutException: No response received for service [service:CBRProxy].
 at org.jboss.soa.esb.client.ServiceInvoker.post(ServiceInvoker.java:427)

5.4. Petals ESB v3.1.3 - more details

The Petals ESB started to give errors during the Direct Proxy test case, and crashed when the run with 5K messages started. The tail end of the log file looked as follows:

WARNING 2011-10-03 07:05:34,943 [Petals.Container.Components.petals-bc-soap]
 Catch an exception on the WS invocation : null
WARNING 2011-10-03 07:06:20,054 [Petals.Container.Components.petals-bc-soap]
 Catch an exception on the WS invocation : Attempted read on closed stream.
The console log repeated the following line almost endlessly:
at org.apache.axiom.om.impl.llom.OMNodeImpl.setComplete(OMNodeImpl.java:181)
at org.apache.axiom.om.impl.llom.OMNodeImpl.setComplete(OMNodeImpl.java:181)
at org.apache.axiom.om.impl.llom.OMNodeImpl.setComplete(OMNodeImpl.java:181)

6. Observations

[Figure: ESB Performance Round 5 summary chart]

Refer to the Google document ESB Performance Testing Round 5 for the complete results.

  • The UltraESB - Enhanced version had a very clear lead over all ESBs across all scenarios, followed by the UltraESB - Vanilla version. The main difference in the "Enhanced" version is the inclusion of the Saxon library (for XSLT performance) and the VTD-XML and Fast VTD-XML libraries (for CBR performance). These libraries are easily downloadable by end users, but are not shipped by default due to licensing issues related to them.

  • The WSO2 ESB would be the rightful owner of the 3rd place, but its XSLT performance suffered due to a bug in the code that surfaced under load (more details). The fourth place would have gone to the Talend ESB had it not failed on 3.65% of the requests. However, it is interesting to note that the Talend ESB slightly outperformed the UltraESB for one of the WS-Security use cases. Mule ESB CE performed well, with fewer errors than the Talend ESB.

  • Compared to Round #4, where we compared the UltraESB against the WSO2 ESB, we can see that the WSO2 ESB has improved its WS-Security processing performance, as it was previously about 4X slower than the UltraESB. At the same time, we note that in Round #4 the WSO2 ESB did better in the CBR cases, due to the SAX parsing advantage it had. In v1.6.0 the UltraESB utilizes VTD-XML to perform CBR over XML payloads without full XML parsing, and hence gains a significant boost over the previous round.

7. Reproducing the results

Full details on reproducing these results on an Amazon EC2 instance can be found here.

8. Authors

Asankha C. Perera, Founder and CTO AdroitLogic
Ruwan Linton, Director of Engineering, AdroitLogic

9. Notices

WSO2 and WSO2 ESB are trademarks of WSO2 Inc. MuleSoft and Mule ESB are trademarks of MuleSoft. Fuse ESB is a trademark of IONA Technologies PLC. Petals is a trademark of EBM WebSourcing. JBoss is a trademark of Red Hat, Inc. Talend is a trademark of Talend. Apache ServiceMix is a trademark of the Apache Software Foundation. UltraESB and AdroitLogic are trademarks of AdroitLogic Private Ltd. All other product and company names and marks mentioned are the property of their respective owners and are mentioned for identification purposes only.
