This article is based on the 5th round of public ESB Performance benchmarking since June 2007. This performance framework has become the de-facto ESB Performance test suite, with multiple vendors using it to publish benchmark results in the past.
The significant changes in this round include:
- The inclusion of 8 free and open source ESBs
- The publication of a standalone Amazon EC2 AMI image of the complete test scenario and resources (ami-49428f20)
- The publication of the ESB Performance Benchmark resources project on BitBucket
- The inclusion of content-based routing (CBR) test cases for SOAP header and transport header based routing
These configurations were built by AdroitLogic, partly based on previously published resources and on contributions submitted to the resources repository. The solutions we arrived at may have inherent limitations due to our limited understanding of some of the other ESBs and the limited time we could spend on them. Hence these configurations may not be optimal, and/or may not be optimally tuned for the scenarios tested.
However, AdroitLogic has spent many days and nights getting the other ESBs to work under conditions as close to identical as possible. We would be happy to receive feedback from the vendors and/or end users on the configurations and tuning parameters used for the different ESBs. We will include such updates in future rounds; they can also be submitted directly to the Mercurial repository as a pull request.
The following test scenarios were used against each ESB:
- Direct Proxy Service
- Content Based Routing Proxy
- on SOAP body payload
- on SOAP header
- on Transport Header
- XSLT Transformation Proxy
- WS-Security Proxy
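The content-based routing scenarios above select a backend by inspecting the SOAP body payload, a SOAP header, or a transport header. The actual per-ESB routing configurations live in the BitBucket repository; purely as an illustration of the three routing styles (with hypothetical element names, header names and backend URLs), a minimal sketch might look like:

```python
# Illustrative sketch only -- NOT any ESB's actual configuration.
# All element names, header names and URLs below are hypothetical.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def route(soap_message: str, transport_headers: dict) -> str:
    """Pick a backend URL based on a transport header, SOAP header, or SOAP body."""
    # Transport-header routing: cheapest, since the payload is never parsed
    if transport_headers.get("X-Route") == "fast":
        return "http://backend-fast.example.com/service"

    root = ET.fromstring(soap_message)

    # SOAP-header routing: only the header block needs to be examined
    hdr = root.find(f"{{{SOAP_NS}}}Header/{{urn:example}}routingKey")
    if hdr is not None and hdr.text == "priority":
        return "http://backend-priority.example.com/service"

    # SOAP-body routing: requires parsing into the payload itself
    body = root.find(f"{{{SOAP_NS}}}Body")
    if body is not None and len(body) and body[0].tag.endswith("largeOrder"):
        return "http://backend-bulk.example.com/service"

    return "http://backend-default.example.com/service"
```

The ordering reflects relative cost: transport-header routing avoids XML parsing entirely, SOAP-header routing can stop at the header block, while body-based routing must read into the payload, which is why the three cases are benchmarked separately.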
Each of these scenarios is described in detail below.
Performance Test Environment and Resources
Amazon EC2 AMI image
The performance test was carried out on an Amazon EC2 High-CPU Extra Large (c1.xlarge) instance running the public AMI "ami-49428f20", based on the EBS-backed official Ubuntu 10.10 image. This was a 64-bit instance with 7GB of RAM, 8GB of EBS-backed storage, 20 ECUs and 8 cores. All tests against each ESB were carried out on the same instance, started on the 3rd of October 2011, and executed one after another. Network isolation was not required, as all test traffic flowed only between the local machine hosting the load generator, the ESB and the backend service.
Each ESB was allocated 2GB of Java heap memory. The Sun JDK 1.6.0_26-b03 was used, with the unlimited-strength policy files applied to allow WS-Security testing with larger key sizes. The instance had a 2GB RAM disk, which was utilized by the UltraESB. We attempted to allow approximately 300 worker threads and a socket timeout of 120 seconds for each ESB, but this was not possible for some ESBs due to time and/or documentation limitations.
Performance scenario configurations for each ESB
The configurations used for each ESB are published in the BitBucket repository at https://bitbucket.org/adroitlogic/esbperformance . This includes a README.TXT file for each ESB, with basic instructions to configure the ESB from scratch and to build and deploy the configurations.
Results Analysis Worksheet hosted on Google Docs
ESB Performance Testing Round 5 is a publicly accessible Google document hosting the summarized results.
Raw Execution Logs
The raw execution logs and the ESB log files for each performance run can be found in the ~/client-scripts directory and in each ESB-specific directory under ~/esbs.
The following ESBs did not complete the performance test successfully, and hence were excluded from the analysis:
- Apache ServiceMix v4.3.0
- Fuse ESB v4.4.0
- JBoss ESB v4.10
- Petals ESB v3.1.3
The following ESB completed the performance test, although many errors were encountered:
- Talend ESB v4.2.0
The Talend ESB encountered a large number of errors/failures: 131,165 of the 3,590,400 requests, or 3.65%, failed. It nevertheless managed to complete the test. This is highlighted in the "Summary" sheet of the ESB Performance Testing Round 5 Google document.
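As a quick sanity check, the quoted 3.65% failure rate follows directly from the raw request counts:

```python
# Reproduce the Talend ESB failure rate quoted in the text
failed, total = 131_165, 3_590_400
rate = failed / total * 100
print(f"{rate:.2f}% of requests failed")  # → 3.65% of requests failed
```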
The following ESBs completed the performance test with only socket timeout errors, which can be expected under load for various reasons:
- Mule ESB CE v3.2.0
- WSO2 ESB v4.0.0
- UltraESB v1.6.0 - Vanilla
The following ESB completed the performance test without a single error or socket timeout, and had a very clear lead over all other ESBs across all use cases:
- UltraESB v1.6.0 - Enhanced
Here are more details on the ESBs that failed and were eliminated from the final analysis.
Apache ServiceMix v4.3.0 - more details
We sometimes found that the ServiceMix ESB did not start as expected, and the easiest way to get it going again was to re-install it. Messages appeared to get stuck inside the ESB, and a restart appeared to load them again. For a single request, all scenarios worked as expected. However, when running the WS-Security smoke test itself, the following error was seen:
During the load test iteration with 20 concurrent users against the XSLTProxy, the ServiceMix ESB became almost unusable while the pre-load-test warmup phase was still executing. It was not really servicing requests but had become extremely slow (as observed via tcpdump), although the log files did not report any faults.
Fuse ESB v4.4.0 - more details
For a single request, all scenarios were successful. However, running the smoke performance test itself started filling up the log file with the following error, and hence we abandoned the performance tests for the WS-Security scenarios:
The load test iteration with 640 concurrent users and 100K messages against the DirectProxy rendered the Fuse ESB unusable. It was not servicing any requests after the apparent crash, and the logs contained stack traces as shown below:
JBoss ESB v4.10 - more details
At a concurrency level of 320 users, while running the CBR Proxy service, the ESB failed and became unable to process any further requests. On an earlier run that produced the same effect, the following error messages could be seen:
Petals ESB v3.1.3 - more details
The Petals ESB started to produce errors during the Direct Proxy test case, and crashed when the run with 5K messages started. The tail end of the log file looked as follows:
Refer to the Google document ESB Performance Testing Round 5 for the complete results.
- The UltraESB - Enhanced version had a very clear lead over all other ESBs across all scenarios, followed by the UltraESB - Vanilla version. The only difference in the "Enhanced" version is that the Saxon library (for XSLT performance) and the VTD XML and Fast VTD XML libraries (for CBR performance) are dropped in. These libraries are easily downloadable by end users, but are not shipped by default due to licensing issues.
- The WSO2 ESB would be the rightful owner of third place, but its XSLT performance suffered due to a bug in its code that was triggered under load (more details). Fourth place would have been taken by the Talend ESB had it not failed on 3.65% of the requests. However, it is interesting to note that the Talend ESB performed slightly better than the UltraESB on one of the WS-Security use cases. The Mule ESB CE performed well, with fewer errors than the Talend ESB.
- Compared to Round #4, where we compared the UltraESB against only the WSO2 ESB, we can see that the WSO2 ESB has improved its WS-Security processing performance, as it was previously about 4X slower than the UltraESB. At the same time, we note that in Round #4 the WSO2 ESB did better in the CBR cases, due to the SAX parsing advantage it had. In v1.6.0 the UltraESB utilizes VTD XML to perform CBR over XML payloads without full XML parsing, and hence gains a significant boost over the previous round.
Reproducing the results
Full details on reproducing these results on an Amazon EC2 instance can be found here.
WSO2 and WSO2 ESB are trademarks of WSO2 Inc. MuleSoft and Mule ESB are trademarks of MuleSoft. Fuse ESB is a trademark of IONA Technologies PLC. Petals is a trademark of EBM WebSourcing. JBoss is a trademark of Red Hat, Inc. Talend is a trademark of Talend. Apache ServiceMix is a trademark of the Apache Software Foundation. UltraESB and AdroitLogic are trademarks of AdroitLogic Private Ltd. All other product and company names and marks mentioned are the property of their respective owners and are mentioned for identification purposes only.