Wednesday, September 30, 2009

XML-In-Practice Day #1 Summary

I'm at IDEAlliance XML-In-Practice 2009 in DC this week talking about IBM WebSphere XML and learning about other XML products and technologies. There are four tracks: Publishing and Media; Applications; Foundation and Interoperability (AKA the technology track); and Electronic Medical Records, President Obama's Economic Plan, and e-Government. Based on my totally unscientific head count, attendance at each track is roughly 50%, 25%, 12%, and 13%, respectively.

I attended the following sessions today -

Keynote - XML Enabled Medical Records - Dr. Clement McDonald

I learned:
  • how technology is used to create repositories of information within hospitals and caregivers
  • how much workload these systems exchange in the form of HL7 messages
  • how distributed systems share data within a localized region for decision making and consistency of care
  • how Web 2.0 is helping replace the very complicated forms-based desktop apps that are trusted today
  • how doctors are happy to have a wealth of electronic information to help them, but see entering new data into the system as something they cannot afford given already limited time with patients (a call for smarter devices and speech-to-text)
  • how varied the data is across medical interactions, ranging from very structured to very narrative

The two best parts of the talk were the 0.5 seconds he showed an XML document, which stressed that the business aspect of this is key (the technology just has to exist behind the scenes to make it possible), and seeing a doctor throw stuffed pigs to the audience (a joke on how the LOINC standard sounds like OINK).

Overview of President Obama's Electronic Medical Records Plan and Health Information Technology Architecture - John Quinn

I learned a) how much money is set aside to reward providers that move to standardized electronic health records and what timelines exist to earn those rewards, b) how aggressive those timelines are given typical system implementation times, and c) how the rewards depend on a certified system demonstrating "meaningful use," which is challenging to guarantee. I really took away a deeper appreciation not only for the complexity inside a single hospital, but also for how challenging a national mandate will be (especially for individual physicians).

XSLT Stylesheets from Version 1.0 to 2.0 - Priscilla Walmsley

I didn't take a lot of notes in this session since I'm rather knowledgeable about this topic. However, I'd say it was a great presentation given the example-based (before and after) approach.

Customer Use Case: How IBM Simplifies Complex Content Developing and Publishing Across the Enterprise. - Daniel Dionne

Great presentation that didn't just cover what DITA for content development/publishing is, but showed the entire lifecycle and processes needed to make wide adoption work. It went into some rather impressive use cases of the technology, along with challenges, within IBM.

Technical Overview of RELAX NG - Bob DuCharme

Can't say I was a huge fan of this talk, but that is likely because I'm a data-oriented XML guy working on standards and customer situations that depend heavily on XML Schema. Bob discussed areas where RELAX NG is better than XML Schema, mostly for document-oriented scenarios. I would have liked to see more mention of XML Schema 1.1 and how it changes the story. I did get some good value out of understanding why some document-centric customers are still using DTDs.

HL7's use of XML - Paul Knapp

Learned how HL7 V3 XML isn't really used yet in US e-healthcare apps (every hospital is exchanging internal messages in HL7 V2). Abroad, new projects less than three years old are very likely to use HL7 V3. We should see more of this in the States on new projects, especially as we start to consider the need to share information beyond a single hospital. Paul also mentioned binary XML and how it would help with many of HL7 V3's current issues.

MarkLogic Beer and Demo Jam

I did a 4-minute demo along with nine others during the reception. You get 5 minutes to do a demo with no preparation, and the best demos win free stuff. I demoed the XML Feature Pack and the 40 samples we ship, along with the end-to-end blog checker sample written in XPath 2.0, XSLT 2.0, and XQuery 1.0. The samples I showed have a nice CSS and dashboard we've added since Beta 4, and that visual skinning over the XML technologies drew positive comments from the crowd. Didn't win anything in the end. Oh well.

After hours

Finally, I was able to do dinner with about 15 folks who regularly attend these conferences. Some great conversation with people from all parts of the industry.

Sunday, September 20, 2009

WebSphere eXtreme Scale cache provider for Dynacache

The dynamic cache engine is the default cache provider for the Dynacache APIs and frameworks. WebSphere Application Server now also allows WebSphere eXtreme Scale to act as the core caching engine behind Dynacache.

You can configure the dynamic cache service to use WebSphere eXtreme Scale as your cache provider instead of the default dynamic cache engine.

This gives customers the ability to leverage transactional support, improved scalability, high availability, and other XTP features without making changes to their existing Dynacache caching code.

This capability can also be enabled on existing WAS service packs via APAR PK85622.
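As a rough sketch of how the switch is made (the property and class names below are from my recollection of the eXtreme Scale documentation, so verify them there before relying on them), you point a cache instance at the eXtreme Scale provider with a custom property rather than changing any application code:

```
# cacheinstances.properties -- illustrative sketch only; names may differ
cache.instance.0=/services/cache/myCacheInstance
cache.instance.0.cacheProviderName=com.ibm.ws.objectgrid.dynacache.CacheProviderImpl
```

The application keeps calling the same Dynacache APIs; only the engine behind them changes.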


Tuesday, September 15, 2009

XML Feature Pack Thin Client Demo - Zero to running in 6 minutes

NOTE: This post is out of date. For the same demo on the released product, see this link

As we announced, the latest beta release of the XML Feature Pack contains the Thin Client for XML. Besides letting you use the XML API in client applications that connect to WebSphere Application Server, the thin client allows for quick and easy evaluation of the technology. Here I show a quick demo using the following simple XML, XPath, XSLT, and XQuery files, along with Java files to invoke them.

Demo 7 - XML Feature Pack Beta 4 Thin Client for XML

Direct Link (HD Version)

Here are the files for the demo:

Which contains (…, simple.xsl, simple.xq, locations.xml)

Monday, September 14, 2009

Rational Automation Framework for WebSphere

Leigh and David spent the better part of 8 years working on the WebSphere Foundation Architecture and WebSphere Application Server, specifically in the areas of administration, configuration, systems management, and performance tooling. In mid-2007, we both took the opportunity to expand our horizons and explore new options within IBM, though never really moving far from WebSphere systems management. Since leaving the WebSphere Architecture and Development organization in 2007, we have been working in the IBM Rational brand, focusing on software delivery automation. We are excited to announce the result of that effort: the Rational Automation Framework for WebSphere, available as of May 15, 2009.

IBM Rational Automation Framework for WebSphere is an optional feature that extends and enhances IBM Rational Build Forge around WebSphere Application Server and WebSphere Portal environments. This customizable management framework is designed specifically to automate installation, patching, configuration management, and application deployments for IBM WebSphere Application Server and IBM WebSphere Portal.

Rational Automation Framework for WebSphere reduces the complexity of managing your IBM WebSphere Application Server and IBM WebSphere Portal environments by addressing common pain points, such as:
  • The lack of consistency and/or repeatability in the installation, configuration, and application deployments in IBM WebSphere Application Server and IBM WebSphere Portal environments as a part of the Software Delivery Lifecycle.
  • The challenge of connecting disparate application development, test, and operations groups into a single traceable and enforceable process for the Software Delivery Lifecycle.
  • The inability to manage IBM WebSphere Application Server and IBM WebSphere Portal environments across multiple Software Delivery Lifecycle environments and/or beyond the cell scope leading to the development of costly, difficult to support, homegrown solutions.
  • The lack of change history, auditability, and governance around the changes to the IBM WebSphere Application Server and IBM WebSphere Portal environment configurations.
  • The need to be able to quickly reproduce IBM WebSphere Application Server and IBM WebSphere Portal environments in the case of a disaster.

For those companies facing IBM WebSphere Application Server and IBM WebSphere Portal infrastructure management challenges, the key to delivering greater operational productivity with quality is automation. By eliminating manual and complex tasks when managing IBM WebSphere Application Server and IBM WebSphere Portal environments, Rational Automation Framework for WebSphere can provide accuracy, reliability, repeatability, and consistency to help cut costs and improve productivity and quality.

David Brauneis
Chief Architect, Rational Automation Framework for WebSphere

Leigh Williamson
Distinguished Engineer & Chief Architect, Rational Software Delivery Automation

XML Feature Pack Beta 4 - Now With Thin Client

A month ago, we announced the Beta 3 refresh, which was specification-complete on XPath 2.0, XSLT 2.0, and XQuery 1.0. On Friday we released a Beta 4 refresh which continues to remove remaining restrictions and adds one major new feature: the Thin Client for XML with WebSphere Application Server.

As noted on the open beta download page,

The beta includes the IBM Thin Client for XML with WebSphere Application Server. The thin client allows access to the same Feature Pack API and runtime functionality (XPath 2.0, XSLT 2.0, XQuery 1.0) available in the WebSphere Application Server Feature Pack for XML. The thin client can be copied to multiple clients running Java SE 1.6 in support of a WebSphere Application Server V7.0 installation.

This means that if you have client applications that connect to WebSphere Application Server, you can copy the XML Feature Pack thin client jar to those clients and get the same XML programming model support there.

We also believe this thin client support will help "new to WebSphere" folks evaluate this technology. As such, we have added a download link to the jar file on the open beta website. Click on that link and then click on "Local install using Download Director or HTTP" and follow through to download "IBM Thin Client for XML with WebSphere Application Server". I hope to show a demo of how fast you can get up and going with the thin client in the next day or so.

Saturday, September 12, 2009

Hidden nodes in XPath - fail on namespaces by me

I was working on a sample with the XML Feature Pack last week to show good integration between the XML Feature Pack Beta and databases that support XML columns, such as DB2 pureXML.

I ran into an issue that stumped me for a while and wanted to write about it so maybe others won't be slowed down as long as I was. I was writing an XCollectionResolver and an XResultsResolver that connected to the database. For some reason, while these resolvers returned data that looked valid, it couldn't be navigated by XPath 2.0. I saw things like this in XQuery:

let $a := trace($domainSpammers/spammers/spammer/email, "email =")

Traced nothing, while

let $a := trace(node-name($domainSpammers/*/*/*), "threestars = ")

Traced email, uri, and name. I even put $domainSpammers into the output of the XQuery and could see the spammers/spammer/email tree:

<spammers xmlns="http://www.w3.org/1999/xhtml" xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <name>Joe Smith</name>

I looked at this for a few hours. Luckily, one of my team members spotted the issue. As you can see from the title of this post and the XML above, the elements are in the XHTML namespace.

It turns out I was writing the document from XSLT 2.0 using the new multiple result documents feature. While I wanted the page returned to the browser to be in the XHTML default namespace, I didn't want the data written to the database to be in the XHTML namespace. However, since I didn't say so explicitly, it was mistakenly written in that namespace.
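A minimal sketch of the fix (the element names and href here are illustrative, not from the actual sample): reset the default namespace with xmlns="" on the literal result elements inside the xsl:result-document that writes the database copy, so only the browser page inherits the stylesheet's XHTML default namespace:

```xml
<!-- Illustrative sketch only; the real sample's names differ -->
<xsl:stylesheet version="2.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns="http://www.w3.org/1999/xhtml">

  <xsl:template match="/">
    <!-- Browser page: inheriting the XHTML default namespace is what we want -->
    <html>
      <body>Report goes here</body>
    </html>

    <!-- Database copy: xmlns="" resets the default namespace so that
         <spammers> and its children land in no namespace at all -->
    <xsl:result-document href="spammers.xml">
      <spammers xmlns="">
        <spammer>
          <email>joe@example.com</email>
        </spammer>
      </spammers>
    </xsl:result-document>
  </xsl:template>
</xsl:stylesheet>
```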

Next time, and maybe this will help you, I'll add namespace-uri() to my debugging arsenal:

let $a := trace(namespace-uri($domainSpammers/*/*/*), "threestars = ")

Which would clearly have shown that email was in the XHTML namespace:

threestars = http://www.w3.org/1999/xhtml

Which would have saved me a few hours of pulling my hair out.
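The same trap is easy to reproduce outside the feature pack. As a small illustration in Python (using the standard library's ElementTree, not the XML Feature Pack API), an unprefixed path step never matches elements that are in a namespace, while a wildcard happily does, which is exactly the symptom I saw:

```python
import xml.etree.ElementTree as ET

# The data accidentally ended up in the XHTML namespace.
doc = """<spammers xmlns="http://www.w3.org/1999/xhtml">
  <spammer><email>joe@example.com</email></spammer>
</spammers>"""
root = ET.fromstring(doc)

# An unprefixed step matches only no-namespace names: finds nothing.
print(root.find("spammer/email"))   # None

# A wildcard matches regardless of namespace, and the tag reveals the culprit.
print(root.find("*/*").tag)         # {http://www.w3.org/1999/xhtml}email
```

The wildcard probe plus checking the returned name plays the same debugging role as node-name() and namespace-uri() in XQuery.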

Friday, September 11, 2009

SCA 1.0.1 Beta Refresh Available

The SCA team in WebSphere has revved the 1.0.1 Beta yet again. It now uses the Rational Install Manager (IM) software, which allows the 1.0.1 code to be laid down on a vanilla WAS without having to install the 4Q08 GA (1.0.0) level first.

Rather than go into detail about the additional capability, I'll just tease you: go look at the official early program website for our beta.


Wednesday, September 9, 2009

SPEC working on standard SOA Benchmark

I continue to be interested in helping customers understand the performance of Service Oriented Architecture (SOA) applications. As you can see here, I'm working (as the chair) in this SPEC working group, along with considerable input from Oracle and VMware, to push forward a standard benchmark for SOA-based applications and the middleware infrastructure on which they run.

The interesting parts (in my opinion) of this press release are:

The benchmark will be developed by a trusted benchmarking organization with input from all SPEC members. As mentioned in the press release, we're also looking for participation from other interested parties. If you're interested in joining SPEC or providing input, let Bob Cramblitt know. I'm truly excited to see an SOA benchmark come from SPEC, as they have a proven track record of creating industry-trusted benchmarks for middleware performance.

While the initial focus is Web Services, Enterprise Service Buses, and Business Process Management (BPEL), the group realizes these technologies are only part of the entire SOA picture. It's good to see the group start with a sensible core and grow the effort over time.

The group is working to stay flexible in its support of multiple approaches to implementing these technologies. This is key, as SOA is an architectural approach and there are multiple ways to implement such technologies. However, in an industry-standard benchmark it's important to audit and standardize common implementations to confirm they reflect typical customer deployments.

I'll continue to post publicly shareable information as the working group makes progress. If you have any quick questions, post them here and I'll raise them at the working group.