In-Depth

SOAP testing easier said than done

Testing tools and methodologies for Web services are in their infancy, most observers agree. The good news, though, is that some concepts and tools to help get you started do exist, and more are coming in the next few years.

How you test, and what you test for, depends on the nature of the SOAP-based Web service you are creating. A service that is intended to help integrate several internal applications behind the firewall is a very different beast from a composite Web service that grabs pieces of components coming from multiple sources that you did not create and cannot control.

Also, this second type of application -- called an external-facing Web service -- may be accessed from anything from a PDA running Windows to a laptop running Java. Smart developers will test against all the possibilities. So an external-facing application requires more testing than an internal one, where all the pieces are presumably within your control.

How testing is different

Testing any Web service is fundamentally different from testing a "traditional" application. There are several testing layers, from ensuring that the Web service correctly adheres to all protocols, to checking out the types and amounts of traffic that flow from all ends of the application. Testing the business logic of the application does not change, nor does unit testing of the code itself, but other things certainly do.

The first step is testing that the SOAP application's envelope has the correct headers and that the body correctly adheres to the Web Services Description Language (WSDL) that describes the Web service. These standards-related issues are covered by the SOAP 1.2 specification, currently being finalized by the World Wide Web Consortium (W3C).
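To make that first layer concrete, here is a minimal sketch in Java of the kind of structural check such a test performs, using the SAAJ API (javax.xml.soap, bundled with older JDKs and available as a separate library). The captured file name, the operation name and its namespace are placeholder assumptions, not details from any service described in this article.

    import java.io.FileInputStream;
    import javax.xml.soap.MessageFactory;
    import javax.xml.soap.SOAPBody;
    import javax.xml.soap.SOAPEnvelope;
    import javax.xml.soap.SOAPMessage;

    public class EnvelopeCheck {
        public static void main(String[] args) throws Exception {
            // Load a captured SOAP request from disk (the path is hypothetical).
            SOAPMessage msg = MessageFactory.newInstance()
                    .createMessage(null, new FileInputStream("captured-request.xml"));

            SOAPEnvelope env = msg.getSOAPPart().getEnvelope();
            SOAPBody body = env.getBody();

            // Structural checks: a header is optional, but the body must exist
            // and should not already carry a fault.
            if (body == null || body.hasFault()) {
                System.out.println("Body is missing or carries a fault");
                return;
            }

            // The expected operation element would come from the service's WSDL;
            // "getQuote" and its namespace are illustrative only.
            boolean matches = body.getChildElements(
                    env.createName("getQuote", "ns", "http://example.com/quotes")).hasNext();
            System.out.println("Body matches expected WSDL operation element: " + matches);
        }
    }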

Another body, the Web Services Interoperability Organization (WS-I), is creating profiles and test tools to ensure that software conforms to SOAP, WSDL, XML Schema and the other pieces of the Web services stack. This is particularly important if the Web service is accessed from multiple, different clients, but those profiles and tools are not yet complete.

The next piece is where things get interesting. There are tools that help monitor the message traffic -- to see if the SOAP-based data streams passing between client and server are legal. These tools are not always integrated with existing IDEs or other application development packages, however.
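As an illustration of the idea (and not a description of any particular product), even a crude pass-through logger written against the plain JDK makes the over-the-wire exchange visible. The ports and host below are invented, and a real monitor would go on to check the captured XML against the SOAP envelope schema and the service's WSDL.

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class WireLogger {
        public static void main(String[] args) throws Exception {
            ServerSocket listener = new ServerSocket(8081);      // point the client here
            while (true) {
                Socket fromClient = listener.accept();
                Socket toServer = new Socket("localhost", 8080); // the real service
                relay(fromClient.getInputStream(), toServer.getOutputStream(), "request");
                relay(toServer.getInputStream(), fromClient.getOutputStream(), "response");
                fromClient.close();
                toServer.close();
            }
        }

        // Copy bytes in one direction, echoing them to the console for inspection.
        static void relay(InputStream in, OutputStream out, String label) throws Exception {
            System.out.println("---- " + label + " ----");
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) > 0) {
                System.out.write(buf, 0, n);
                out.write(buf, 0, n);
                if (in.available() == 0) break;  // crude: stop after the current burst
            }
            out.flush();
            System.out.println();
        }
    }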

So, a Web service is like a distributed application -- but one where you cannot always predict which client will access it, which Web services components the application will grab from day to day, or the quality of those components. Pieces of the code are scattered among clients and servers, and all of the relevant information may not be in front of the developer or the tester to analyze in familiar ways. And, of course, being the Web, the nature of the traffic will vary as well.

All of this was summed up by Jim Hebert, a contractor working at San Jose, Calif.-based Adobe Systems Inc. to test the installer piece of Acrobat. His team set up a network of 15 machines running a homegrown SOAP-based application to test the installer on different operating systems and various other configurations.

Hebert said that with a traditional application, a developer or tester can step through the code and usually figure out where any problems are. The process is well understood, and there are numerous tools and methodologies to help figure it out.

But with a Web service, he said, "you get to a point in the client code where it calls over to the Web service, and then it's this big black box." Hebert said that there is a dearth of tools that can treat the issue holistically -- that can follow the client call all the way through -- so "you end up solving the problem piecemeal."

The common solution is to use one tool to deal with the server end, another for the client and yet a third to track and analyze the data flowing between them.

Sometimes, it is a simple glitch that will get you. Hebert recalled a situation where the SOAP request ran for a very long time, then failed. "We realized that the server that hosts the Web service needed to be configured with a longer time-out. And in this case, the server's time-out was different from the client's." The result of that learning experience was that there is now an extremely detailed list for when a new machine is added to the test farm. "There are so many opportunities for things to be misconfigured," Hebert said.
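One way to keep that particular surprise from recurring is to make the client-side time-outs explicit rather than relying on platform defaults. The sketch below, in plain Java, uses an invented endpoint URL and illustrative time-out values; the server hosting the service still needs its own time-out configured to match.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.SocketTimeoutException;
    import java.net.URL;

    public class TimeoutAwareCall {
        public static void main(String[] args) throws Exception {
            URL endpoint = new URL("http://localhost:8080/installer/service"); // placeholder
            HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
            conn.setConnectTimeout(10_000);   // 10 seconds to reach the host at all
            conn.setReadTimeout(120_000);     // 120 seconds for a long-running request
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
            conn.setRequestProperty("SOAPAction", "\"\"");
            conn.setDoOutput(true);

            String soapRequest =
                "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
              + "<soap:Body><ping xmlns=\"http://example.com/test\"/></soap:Body>"
              + "</soap:Envelope>";

            try (OutputStream out = conn.getOutputStream()) {
                out.write(soapRequest.getBytes("UTF-8"));
            }

            try {
                System.out.println("HTTP status: " + conn.getResponseCode());
            } catch (SocketTimeoutException e) {
                // Fail loudly instead of hanging: the mismatch Hebert describes
                // shows up here as a clear, reproducible exception.
                System.err.println("Server did not answer within the read time-out: " + e);
            }
        }
    }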

Ease the test burden with 'defensive programming'

In general, Hebert suggests developers engage in "more defensive programming." The notion is that instead of expecting that things will go mostly right and that any bugs can be found and dealt with, assume the worst up front and plan for it.

Another anecdote from Hebert illustrates this point. When a paragraph of data was passed from one machine to another -- one Macintosh, one Windows -- the carriage returns were converted and the line endings no longer matched. Because the testers had thought to checksum the data, they realized what was happening.

"We never would have seen the silent data munching" otherwise, he recalled.

Ted Schadler, a principal analyst at Forrester Research in Cambridge, Mass., and a programmer himself, agrees that developers need to be more proactive. "Application developers want to assume that a system is going to work," he said, but with Web services it is important to design the system so that it can crash gracefully. "This means you don't leave a user in an unknown state, and the Web site says something like 'the service is down, try again in an hour.'"

Although this sounds simple, he said, it is anything but.
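A hedged sketch of what "crashing gracefully" can look like in code: the remote call is wrapped so that any failure leaves the user in a known state with a plain message, rather than an unhandled exception. The QuoteService interface and the failing stub are hypothetical stand-ins for a real back end.

    public class GracefulFacade {

        interface QuoteService {
            String latestQuote(String symbol) throws Exception; // the remote SOAP call
        }

        private final QuoteService service;

        GracefulFacade(QuoteService service) {
            this.service = service;
        }

        String quoteForPage(String symbol) {
            try {
                return service.latestQuote(symbol);
            } catch (Exception e) {
                // Log for the operators, but give the user a known, honest state
                // instead of a stack trace or a half-rendered page.
                System.err.println("quote service unavailable: " + e);
                return "The quote service is down. Please try again in an hour.";
            }
        }

        public static void main(String[] args) {
            // Wire in a stub that always fails, to show the fallback path.
            GracefulFacade facade = new GracefulFacade(symbol -> {
                throw new Exception("connection refused");
            });
            System.out.println(facade.quoteForPage("ACME"));
        }
    }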

A technology that will reportedly ease things on that front is the Business Process Execution Language (BPEL), co-authored by Microsoft and IBM, said Arthur Ryman, IBM Corp.'s development manager for Web services tools for WebSphere Studio. BPEL allows for a higher-level description of a Web service, one that keeps track of what the application has invoked. If a step fails, it automatically triggers a "compensating action" -- canceling the hotel room that was in mid-order or rolling back a financial transaction. This is called "choreography" in Web services parlance, and the language sees that it happens without the developer having to manually keep track of it all.
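BPEL expresses this declaratively rather than in application code, so the following is only an illustration of the underlying pattern in plain Java: each completed step registers a compensating action, and a failure part-way through runs those actions in reverse order. The hotel and payment steps are invented examples.

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class CompensationSketch {
        public static void main(String[] args) {
            Deque<Runnable> compensations = new ArrayDeque<>();
            try {
                String booking = bookHotelRoom();                       // step 1
                compensations.push(() -> cancelHotelRoom(booking));

                String payment = chargeCard();                          // step 2
                compensations.push(() -> refundCard(payment));

                confirmItinerary();                                     // step 3 fails
            } catch (Exception e) {
                System.err.println("process failed: " + e.getMessage());
                // Undo whatever already happened, most recent step first.
                while (!compensations.isEmpty()) compensations.pop().run();
            }
        }

        static String bookHotelRoom()           { System.out.println("room booked");    return "room-42"; }
        static void   cancelHotelRoom(String r) { System.out.println("cancelled " + r); }
        static String chargeCard()              { System.out.println("card charged");   return "txn-7"; }
        static void   refundCard(String t)      { System.out.println("refunded " + t);  }
        static void   confirmItinerary()        { throw new IllegalStateException("airline service unreachable"); }
    }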

The bottom line: You never really know how and where the Web service is going to be used. (And, after all, is that not the whole point of a Web service?) So design and test accordingly.

Who does the testing?

Mark Eshelby, a product manager at Compuware Corp. in Farmington Hills, Mich., said that companies need to decide who will do component-level testing of SOAP-based Web services or Enterprise JavaBeans (EJBs) or anything else along these lines. He maintains that the "people doing the testing are not necessarily aware of what's lurking underneath the covers." Traditionally, testers are in a different part of the organization from developers.

Eshelby said most testers are not trained enough in the complexities of object-oriented programming to be able to effectively test a Web service. In fact, Compuware dropped support for DCOM from its QALoad testing product because "it was too complex for the audience. Nobody understood it," said Eshelby.

And now that problem is back, he maintains. "If you're testing simple objects, that's one thing," Eshelby said. "But in the real world, complex objects are being passed between the client and the server. People don't know what they are and why they were built."

He added, "No matter how easy it is to invoke WSDL, if you don't know what the object was supposed to do, I don't believe you can test it." He advocates that developers should test Web services.

But for now, component-level testing "hasn't really found a home," Eshelby said. "Someone needs to take ownership. So either the developers have to do more testing, or we need to help QA come up to speed with how to test these complex objects."

What is available to help?

It will be a while before one product can do it all -- standards testing, stress and load testing, monitoring SOAP traffic over the wire -- within a familiar IDE, as well as across different server and client platforms. That is a lot to ask from one tool.

In the meantime, different companies are carving out various pieces of the pie. Mercury Interactive Corp., based in Sunnyvale, Calif., has added a few SOAP-related features to its LoadRunner testing tool. First, the company integrated LoadRunner with several IDEs, including Sun ONE Studio 4 and Borland's JBuilder. Then it added support for SOAP within LoadRunner itself so customers can send the same data stream to the server and client and see where the stresses are.

In addition, Mercury offers a hosted service called ActiveTest, which drives real data loads against a Web service from Mercury's server farm.

In the future, the company plans to provide a set of templates to enable compatibility testing, said Ido Sarig, Mercury's vice president of strategy. This will allow customers to test for different SOAP configurations for different middleware and clients "at a click of a button," he promised.

For its part, Compuware has added SOAP and XML parsing features to its QALoad 5.0 product, now in beta.

Cape Clear Software Inc., San Mateo, Calif., has several tools. One generates sample requests for a specific Web service, creating the requests and testing the responses at the basic wire level; another does the same for Java clients. A third tests the WSDL description.

"You wouldn't believe the amount of WSDL rubbish we see here," said John Maughan, director of engineering at Cape Clear Software. "People are creating WSDL in Notepad. So we provide an editor to create the Web service descriptions and then test the validity of those descriptions."

SOAPscope, from Mindreef LLC in Hollis, N.H., is a monitor that checks the validity of the data passed back and forth between client and server. The plan is to eventually team up with partners to integrate SOAPscope or its follow-on products with tools that actually generate tests and exercise regressions.

In the meantime, SOAPscope allows customers to see all the traffic going over the wire, and to then collect that information in a relational database. A Web browser interface allows users to view that data in different ways, as well as analyze the data and resend it.

The Mind Electric in Addison, Texas, is building testing tools into the next generation of its "Web services fabric," Gaia. Intended for serious enterprise Web services adoption, Gaia will allow systems to be assembled from heterogeneous services. Graham Glass, chief architect at The Mind Electric, likens Gaia to an electricity grid that will provide services like clustering, load balancing and failover to all Web services regardless of their hosting platform.

Gaia 1.0 will be available by June. The next version, coming by the end of the year, will incorporate Web services testing tools.

"In our opinion, the testing infrastructure should be part of the fabric, independent of the servers," Glass said. "Rather than do all the testing at the end point, you want to embed technology so it's done independent of the end points."

This will, in turn, eliminate the need for different toolkits and frameworks, he added.

Be prepared

In the meantime, it is better to be prepared for anything, both technologically and culturally.

"You're probably not going to have nearly the level of comfort in testing a large-scale Web services application that you had previously," said Jim Moskun, a Mindreef co-founder. "You will have to solve the problems that hadn't come up before, and there's no way to predict all the possible permutations."

Mark Ericson, another Mindreef co-founder, made the point that testing will have to change to accommodate SOAP and Web services. "Over time, we've given up on the waterfall approach to development. But the one place we hold onto it is in testing.

"We expect there's a phase where everything stops so we can test it and then throw it over the wall to deployment," he added. "Web services means that the testing process is continuous. We will not have the luxury of freezing everything to test [it]."

Web services are new enough that there are no real end-to-end testing solutions out yet. Until more tools arrive, it will be up to the development community to do what they always have done -- figure out ways to solve new problems.

Please read the associated article "SOAP interoperability testing coming along" by Johanna Ambrosio.