Thursday, September 27, 2007

SCA Rock Rolls Forward

My colleagues who have been working over in open source land at Apache inform me that the Apache Tuscany SCA incubator project has released its 1.0 level. The formal announcement, source, and binaries are located here. This is a significant milestone in the life of SCA as an open initiative.

Monday, September 24, 2007

Perspective in Benchmarks - My thoughts on Microsoft StockTrader

I wanted to take the time to respond to claims coming from Microsoft. On June 4th, Greg Leake, Microsoft’s Technical Marketing Manager for .NET, started to discuss an effort with which he was involved: “porting” IBM’s WebSphere Performance Sample – Trade 6.1 to Microsoft .NET 3.0. Since then, Microsoft has been interviewed by the trade press, has created video podcasts, and has started discussions on Microsoft forums and other development community sites. Please do not be fooled by these marketing statements.

In my opinion, Microsoft uses its report to focus on three points:

- Attempt to prove that Microsoft is interoperable with WebSphere software
- Attempt to prove Microsoft’s SOA relevancy and readiness for the enterprise
- Attempt to prove Microsoft performance is better than WebSphere software performance

I’d like to state up front that I was personally involved with the development of IBM’s Trade benchmark starting with Trade 3 and ending with Trade 6.1. Others on my team have made significant contributions to the code as well, to ensure that Trade continues to be a credible and useful tool for expressing performance concepts and release-to-release enhancements to IBM customers for J2EE (Java EE) programming.

Some level of interoperability

I applaud any effort by Microsoft or any other software vendor in pursuit of interoperability. However, the interoperability the Microsoft report speaks to is basic web services functionality (SOAP/WSDL/XML) only. It does not address interoperability of transactions, security, reliability, or durability, and it does not use industry-standard schemas that many of our customers need for cross-enterprise (B2B) or intra-enterprise web services interoperability with .NET clients calling WebSphere servers. Independent of the Microsoft report, Microsoft and IBM have already focused on interoperability of these higher-value qualities of service in web services through industry-standard WS-I efforts, as shown here and here. I am very happy to see IBM and Microsoft continue to focus on standards-based interoperability, and I am confident that WS-I will continue to facilitate customer-focused web services interoperability in these higher-value web service functionalities.

Is Microsoft enterprise ready?

Press coverage has cited the Microsoft report as helping Microsoft prove it is ready for the enterprise. From our quick look at Microsoft’s code, this doesn’t seem to be the case. Many of the “enterprise ready” features stressed by the Microsoft report are hand coded into the application. Areas such as multiple-vendor database support, load balancing, high availability, scalability, and enterprise-wide configuration for services contribute to the significantly higher lines-of-code count in the Microsoft application as compared to the WebSphere implementation. A higher lines-of-code count translates into more maintenance and support cost over the long term, so think about the value of these capabilities being provided in the application server product versus in the application itself. Some of the increased lines of code are comments, but the comments themselves point out the many places where Microsoft deviated from using framework classes and instead implemented custom extensions to fill gaps in their framework’s functionality.
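To make the maintenance point concrete, here is a minimal sketch, in Java, of the kind of plumbing the report describes being hand coded into the application. The class name and logic are hypothetical illustrations of mine, not the StockTrader code: a hand-rolled round-robin selector over service endpoints. Every such utility class is extra application code to test, maintain, and support; in WebSphere technology this concern is handled by the server runtime rather than the application.

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch -- not the StockTrader code. Illustrates "load
// balancing" implemented in application code instead of the runtime.
class HandRolledBalancer {
    private final List<String> endpoints;
    private final AtomicInteger next = new AtomicInteger(0);

    HandRolledBalancer(List<String> endpoints) {
        this.endpoints = endpoints;
    }

    // Rotate through the configured endpoints, one per call.
    String nextEndpoint() {
        int i = Math.floorMod(next.getAndIncrement(), endpoints.size());
        return endpoints.get(i);
    }

    public static void main(String[] args) {
        HandRolledBalancer lb = new HandRolledBalancer(
                Arrays.asList("http://node1/svc", "http://node2/svc"));
        for (int k = 0; k < 4; k++) {
            System.out.println(lb.nextEndpoint());
        }
    }
}
```

Note that even this toy version says nothing about failover, health checking, or configuration management, each of which would add still more application code.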

As an author of Trade, I must admit an embarrassing fact. Four years ago, my team added web services capability to Trade as an optional component – to demonstrate web services as an alternative to remote stateless session bean methods. Since that time, I personally have worked hand in hand with many enterprise customers adopting web services. We have found that the Trade approach to web services wasn’t the best. Specifically, the fine-grained “services” in Trade average around 400-800 bytes of passed data. As you’ll see in my recent blog post, industry-standard schemas for B2B typically have much larger payloads. While Trade was a fun exercise for my team to learn web services, it in no way mirrors what our customers have told us are good web service practices. Current SOA principles motivate more coarse-grained services. The embarrassing fact is that we have never gotten around to removing the poor examples of web services usage from Trade. However, it is interesting to note that Microsoft did not recognize or point out the obvious flaws in these web service patterns during their analysis – they merely parroted what they saw.
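The fine-grained versus coarse-grained distinction can be sketched in Java. This is a hypothetical illustration, not Trade’s actual API: the counter stands in for the network round trips a real deployment would pay, and the method and class names are mine. The fine-grained style makes one remote call per attribute per symbol, each carrying only a few hundred bytes; the coarse-grained style returns a complete business document for every requested symbol in a single call.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch -- not Trade's actual API. Contrasts fine-grained
// and coarse-grained service design by counting simulated round trips.
class ServiceGranularityDemo {
    static int roundTrips = 0;

    // Fine-grained style: one remote call per attribute per symbol.
    static double getPrice(String symbol)  { roundTrips++; return 100.0; }
    static long   getVolume(String symbol) { roundTrips++; return 5000L; }
    static double getChange(String symbol) { roundTrips++; return 1.25; }

    // A complete business document for one symbol.
    static final class Quote {
        final String symbol; final double price; final long volume; final double change;
        Quote(String s, double p, long v, double c) {
            symbol = s; price = p; volume = v; change = c;
        }
    }

    // Coarse-grained style: one remote call, one schema-backed payload
    // covering every requested symbol.
    static List<Quote> getQuotes(List<String> symbols) {
        roundTrips++;
        List<Quote> out = new ArrayList<Quote>();
        for (String s : symbols) {
            out.add(new Quote(s, 100.0, 5000L, 1.25));
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> symbols = Arrays.asList("IBM", "MSFT", "SUNW");

        roundTrips = 0;
        for (String s : symbols) { getPrice(s); getVolume(s); getChange(s); }
        System.out.println("fine-grained round trips:   " + roundTrips);

        roundTrips = 0;
        getQuotes(symbols);
        System.out.println("coarse-grained round trips: " + roundTrips);
    }
}
```

For three symbols, the fine-grained style pays nine round trips where the coarse-grained style pays one, which is why a benchmark built around the fine-grained pattern measures marshaling overhead more than realistic service behavior.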

Microsoft titled this paper “Service Oriented”, and in later press coverage alluded to this “Service Oriented” benchmark test to lend credibility to Microsoft’s SOA strategy. Given what I just said about Trade’s web services, you’ll see why IBM has never used Trade and its web services to help customers understand web service performance. Meanwhile, the Microsoft report focuses on these trivial-payload web services to prove Microsoft can support “Service Oriented” benchmarks, and follow-up posts and press coverage echoed the “Service Oriented” title, proclaiming that this in some way helps Microsoft’s SOA strategy. Drawing that conclusion is incorrect. The industry agrees that SOA is more than trivial web services. SOA is a business-centric IT architecture that provides more agile and flexible IT support of business needs. Realistic web service usage is a cornerstone of SOA, but web services alone do not define SOA.

Is Microsoft performance better than WebSphere performance?

I believe Microsoft is trying to draw IBM away from our commitment to standard benchmarking organizations in order to confuse customers about performance. You can draw your own conclusions from specific comments apparently made by Greg Leake in press coverage, asking for IBM to “collaborate”. Some commenters in community discussions have correctly noted that the Microsoft report isn’t specific enough in terms of topology, hardware, and tuning to support any sensible conclusions based on the performance data. It is a compelling story that this Microsoft report weaves – Microsoft beating IBM on its own benchmark. However, IBM didn’t run Trade as a benchmark in the way shown in the paper’s results. As a customer, you should always be careful how much you trust proprietary benchmark results produced by a single vendor. These things can always be coerced to create FUD and confusion.

We have reviewed the paper and its results and found inconsistencies with best practices for running WebSphere Application Server. Accounting for configuration details Microsoft chose not to document, along with the performance improvements that following best practices allows, we in fact believe that IBM WebSphere Application Server would win across all the scenarios shown in the results.

You may well ask: if WebSphere Application Server would win, why not say so and publish a contrary IBM report? I don’t believe publishing a proprietary view of performance would help our customers – for all the reasons stated above. At best, if IBM were to respond to this paper, you could expect both vendors to degrade to the lowest-common-denominator coding styles for implementing the benchmark so they would “win”. As Microsoft’s implementation already shows, Microsoft wouldn’t choose to use their framework classes and features, but would instead code optimal patterns into the application code. When those patterns are not tested and supported by the formal product, no customer wins by seeing the resulting performance.

IBM has a history of competing in standard benchmarking organizations such as SPEC. We do so because such organizations are standards based and unbiased and therefore trusted. Standards-based benchmarking processes give all participating vendors equal opportunity to review run rules, implementation code, and the results of other vendors. Given this, if you find Trade and SOA benchmarks useful, maybe it is time for IBM and Microsoft to jointly propose a SOA benchmark to a standard benchmarking organization. SOA benchmarking under standard benchmarking organizations is where our customers and the industry can truly benefit.


I personally talk to many customers about performance, SOA, and web services. I stand behind everything I say technically, independent of marketing. I build strong, long-lasting relationships with these customers, many of whom know me personally. In good faith, I can stand in front of them with results from a standards benchmarking organization. I can stand in front of them showing our SOA leadership based on both customer references and analyst reports. On the other hand, I cannot in good faith show a one-off competitive benchmark run by a single vendor. I hope you can understand this position and that it helps you assess the coverage of this Microsoft report and any similar efforts by any vendor that follow.

This Microsoft report shows basic levels of interoperability, while work within the WS-I shows higher levels of interoperability. The Microsoft report points out that Microsoft has to hand code many enterprise-ready features that are simply taken care of for you in WebSphere technology. The title of this Microsoft report and its follow-on press coverage attempt to confuse the industry about SOA, which points out how desperate Microsoft is to get press coverage in SOA. This Microsoft report doesn’t do a good job of telling the true performance story. On performance specifically, I encourage all customers to put their trust in standard benchmarking organizations.

Tuesday, September 11, 2007

New Redbook with SOA customer case study

A new Redbook has been published called "Implementing and Testing SOA on IBM System z: A Real Customer Case".

The Redbook does a good job of breaking down SOA implementation into a series of logical decisions, with real world implications. It covers not only the overall architecture, but the steps to get there. The final solution includes much of the WebSphere Business Process Modeling stack, including WebSphere Application Server, WebSphere Process Server, and WebSphere Portal, as well as tooling from WebSphere Integration Developer and WebSphere Business Modeler.

In particular, the Redbook shows how the SOA solutions fit into the existing ecosystem at the customer shop. SOA is evolutionary, not rip-and-replace. A good read.