In-Depth

SOAP test: A guided tour

SOAP-based Web services implementations were on the absolute bleeding edge just a year or two ago. Now organizations are asking Web services to do more, while simultaneously wrestling with the testing problems the technology raises.

Web services testing spans at least three distinct areas. As with any application, there is functional or regression testing. Then, as John Maughan, director of engineering at Cape Clear Software, a Dublin, Ireland-based Web services integration vendor, notes, there is availability testing to ensure that Web services are actually accessible and usable. Finally, there is interoperability testing, an area that has received a lot of attention because of very public efforts by standards groups -- particularly the Web Services Interoperability (WS-I) organization -- to fix the interop problem.

WS-I recently announced Basic Profile 1.0, which consists of implementation guidelines on how core Web services specifications should be used together to develop interoperable Web services. The specifications covered by the Basic Profile include SOAP 1.1, WSDL 1.1, UDDI 2.0, XML 1.0 and XML Schema.

WS-I also plans to release test tools and sample applications that support Basic Profile 1.0 for teams that want to verify that a Web service meets the profile's interoperability requirements.
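The WS-I Test Tools run a long battery of checks, but the flavor of the rules is easy to convey. The Python sketch below flags one well-known Basic Profile 1.0 violation: bindings that declare use="encoded" rather than the literal XML the profile requires. It is a toy spot-check, not a stand-in for the WS-I tools, and the default WSDL file name is hypothetical.

    # Toy Basic Profile spot-check: BP 1.0 requires use="literal" in
    # soap:body, soap:fault and soap:header binding elements, so any
    # use="encoded" is non-conformant. Illustrative only.
    import sys
    import xml.etree.ElementTree as ET

    SOAP_BINDING_NS = "http://schemas.xmlsoap.org/wsdl/soap/"

    def find_encoded_uses(wsdl_path):
        """Return the tags of binding elements declared use='encoded'."""
        violations = []
        for elem in ET.parse(wsdl_path).iter():
            if elem.tag.startswith("{%s}" % SOAP_BINDING_NS) and \
               elem.get("use") == "encoded":
                violations.append(elem.tag)
        return violations

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "service.wsdl"
        encoded = find_encoded_uses(path)
        if encoded:
            print("Not BP 1.0-conformant: %d use='encoded' element(s)" % len(encoded))
        else:
            print("No SOAP encoding found (one check of many)")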

Web services have grown rapidly, especially over the last six months, in response to a "dramatic evolution" in the available tool sets, said Gautam Desai, vice president of research at Doculabs, a Chicago consulting firm. "Developing Web services is now a no-brainer," he said. But one of the problems is that Web services do not quite live up to their full promise of easy and assured interoperability. "There are about 220 different tests that must be passed to deem a Web service interoperable," Desai said. And for many Web services, passing those tests is not easy.

That is because, under existing standards, it is still easy for developers to inadvertently bake platform-centric features into their Web services -- exposing a vendor-specific data type in a service's interface, for example. And particularly when Web services are exposed across an enterprise -- or, worse still, to anyone, anywhere -- those platform-specific idiosyncrasies can wreak havoc with interoperability.

Even within an enterprise there can be significant challenges. Consider the experience of Jim Hebert, a computer scientist at San Jose, Calif.-based Adobe Systems Inc. who is in charge of a nightly product regression-testing operation. (Hebert also spoke with ADT for the April 2003 issue. See "SOAP testing easier said than done" by Johanna Ambrosio.) Earlier this year, Hebert was in the early stages of getting his homegrown SOAP apps to run across an internal farm of 15 servers set up to emulate the wide range of operating systems Adobe products must encounter in the real world. SOAP was supposed to make it easy. But, he noted, when the client code calls over to the Web service, it becomes a "big black box" in terms of getting visibility into what is working and what is not.

When Hebert began the process, most existing tools proved to be of limited use. "With a typical debugger, you make the call and get the results back, but you can't see the internals," he said.

After months of head-scratching and laborious attempts at manual solutions, Hebert discovered SOAPscope from Hollis, N.H.-based Mindreef Inc., one of a new generation of SOAP testing tools.

Hebert explained that SOAPscope installs a packet-capture agent and "sniffs" the SOAP calls as they go by. "It's great if I have an application where I don't want to touch it, even in terms of proxy settings," he said. That capability allows Hebert to "take the system as it is, capture traffic, and look at applications that are in production or were used days ago," he explained.

Because SOAPscope maintains its own database, Hebert said, he can simply leave it running overnight and learn from the morning log what broke down and -- most importantly -- why.
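The capture-everything, mine-it-later pattern Hebert describes can be approximated in a few lines. The sketch below assumes captured SOAP messages have been written out as individual XML files (the directory name is hypothetical) and scans them for soap:Fault elements, reporting each faultcode and faultstring; SOAPscope's packet-level capture and database are, of course, far more involved.

    # Illustrative after-the-fact fault mining: scan a directory of
    # captured SOAP messages and report any faults they contain.
    import glob
    import xml.etree.ElementTree as ET

    SOAP_ENV_NS = "http://schemas.xmlsoap.org/soap/envelope/"

    def report_faults(capture_dir="captures"):  # hypothetical directory
        for path in sorted(glob.glob("%s/*.xml" % capture_dir)):
            try:
                root = ET.parse(path).getroot()
            except ET.ParseError as exc:
                print("%s: unparseable message (%s)" % (path, exc))
                continue
            for fault in root.iter("{%s}Fault" % SOAP_ENV_NS):
                # faultcode/faultstring are unqualified children in SOAP 1.1.
                code = fault.findtext("faultcode", default="?")
                text = fault.findtext("faultstring", default="?")
                print("%s: fault %s -- %s" % (path, code, text))

    if __name__ == "__main__":
        report_faults()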

It can also help in SOAP development, said Hebert, because it can compare similar SOAP calls on an ad hoc basis -- for instance, when the parameters or arguments differ. "That's a help because sometimes the fault might occur on some particular function, but the issue might have come from something several steps before," he explained.
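That kind of side-by-side comparison is easy to picture. As a rough sketch -- not how SOAPscope itself works -- the following code normalizes two captured SOAP requests and diffs them with Python's standard difflib, so a parameter that changed several calls earlier stands out; the sample messages are invented.

    # Compare two similar SOAP calls: pretty-print each message so the
    # layout matches, then show a unified diff of the differences.
    import difflib
    import xml.dom.minidom

    def diff_soap(xml_a, xml_b):
        a = xml.dom.minidom.parseString(xml_a).toprettyxml(indent="  ").splitlines()
        b = xml.dom.minidom.parseString(xml_b).toprettyxml(indent="  ").splitlines()
        return "\n".join(difflib.unified_diff(a, b, "call_a", "call_b", lineterm=""))

    CALL_A = """<soap:Envelope
      xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body><getQuote><symbol>ADBE</symbol></getQuote></soap:Body>
    </soap:Envelope>"""

    # The same operation with a different argument; the diff isolates it.
    CALL_B = CALL_A.replace("ADBE", "MSFT")

    if __name__ == "__main__":
        print(diff_soap(CALL_A, CALL_B))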

Sandra Rogers, director of Web services and integration software at Framingham, Mass.-based analyst firm IDC, said there are far more problems with Web services than many people expect, but that the technology's future looks bright because there is so much movement on testing. She pointed out that while the standards have not all been finalized and continue to change, they are becoming more comprehensive, with "proposals to focus on everything from reliability of messaging to management.

"The key will be the momentum that vendors maintain regarding seeing these standards into initial release and then making any necessary adjustments," she said.

Even with standards still in flux, Rogers said Web services have seen a slow and steady uptake with a lot of companies using the technology to do discrete interconnections. Increasingly, she noted, companies are developing Web services specifically for reuse as opposed to just as finite, granular integration points. "It changes the complexion of testing when companies begin to really use Web services in a reusable fashion," she said.

Rogers said niche players such as Mindreef and Monrovia, Calif.-based Parasoft are currently addressing the evolving market most pointedly, offering standalone testing products that either test the actual Web services or determine whether they fully adhere to the relevant Web services specifications.

Mindreef, for its part, recently released SOAPscope 2.0 (Version 1.0 was released in November 2002). The company calls the release a Web services "diagnostics system" aimed at helping developers isolate and solve problems in Web services apps throughout development, testing and management. Version 2.0 focuses especially on the interfaces between Web services. These interfaces are described in a Web Services Description Language (WSDL) document. Errant WSDL can result in Web services that are not interoperable or do not function properly.

"Analysis tools are often hard to use and oriented toward specialists," said Jim Moskun, co-founder of Mindreef. Interface design using WSDL is a complex but necessary skill for making Web services interoperable. Moskun said his company's product makes WSDL easier by automating the process. "We tried to make SOAPscope's WSDL analysis easy to use with descriptive help so developers and testers can find problems as they emerge," he said.

Meanwhile, Parasoft offers SOAPtest 2.1, which includes extended testing capabilities for WS-I Basic Profile 1.0. Version 2.1 tests WSDL documents for conformance to the profile using the Test Tools developed by WS-I. (Parasoft helped write the Test Tools as a member of the WS-I Test Tools Working Group.)

"Being actively involved in the development of Basic Profile 1.0 and the Test Tools has allowed us to tightly couple them with the new version of SOAPtest," said Gary Brunell, Parasoft's vice president of professional services. Brunell said the special challenge of testing Web services lies in the fact that the problem could be in the application server, the network or the Web service itself -- making it difficult to pinpoint the root cause of failure.

SOAPtest takes the URL of a WSDL document and tests it for conformance to Basic Profile 1.0: it parses the WSDL, passes it to the Test Tools and produces a WS-I conformance report that can be viewed in a Web browser. Because the Test Tools are tightly integrated with SOAPtest, Parasoft said, users can also employ the conformance reports to create regression tests. According to the company, SOAPtest automatically performs server functional testing, load testing and client testing, and can create test suites from WSDL documents to test the operations those documents describe.
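Generating tests from a WSDL is conceptually straightforward, even though a commercial tool does far more. The sketch below shows the first step only: fetch a WSDL by URL and enumerate the operations it declares, which are the raw material for generated test cases. The endpoint URL is hypothetical, and the code assumes a simple WSDL 1.1 document.

    # First step of WSDL-driven test generation: list the operations a
    # service declares, as candidates for generated test cases.
    import urllib.request
    import xml.etree.ElementTree as ET

    WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

    def list_operations(wsdl_url):
        """Fetch a WSDL and return the operation names its portTypes declare."""
        with urllib.request.urlopen(wsdl_url) as resp:
            root = ET.parse(resp).getroot()
        return [op.get("name")
                for port_type in root.iter("{%s}portType" % WSDL_NS)
                for op in port_type.iter("{%s}operation" % WSDL_NS)]

    if __name__ == "__main__":
        # Hypothetical endpoint, for illustration only.
        for name in list_operations("http://example.com/service?wsdl"):
            print("would generate a test case for operation:", name)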

In addition, said company reps, the same test suites used for functional testing can be used for load testing and monitoring, enabling users to verify whether traffic levels, patterns or combinations cause functionality problems or decreased performance. SOAPtest verifies that clients send appropriate requests to services and are able to handle responses as expected. The product also works as a proxy server, enabling users to view and verify messages between a client and a Web service. SOAPtest 2.1 runs on Windows 2000/XP, Linux and Solaris.
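Reusing a functional test as a load test is a simple idea to sketch: wrap the single-call check in a function, then run it concurrently while recording response times. The snippet below does this with Python's standard library against a hypothetical endpoint and payload; a product like SOAPtest layers ramp-up control, monitoring and reporting on top of the same principle.

    # Reuse one functional test as a load test: run the same SOAP call
    # concurrently and record latency, failing loudly on bad responses.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    ENDPOINT = "http://example.com/soap"  # hypothetical service endpoint
    REQUEST = b"""<?xml version="1.0"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body><ping/></soap:Body>
    </soap:Envelope>"""

    def one_call():
        """The functional test: one SOAP request, checked for an HTTP 200."""
        req = urllib.request.Request(ENDPOINT, data=REQUEST, headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": '""',
        })
        start = time.monotonic()
        with urllib.request.urlopen(req, timeout=10) as resp:
            assert resp.status == 200, "unexpected HTTP %s" % resp.status
            resp.read()
        return time.monotonic() - start

    if __name__ == "__main__":
        # The load test: the same call, 50 times across 10 worker threads.
        with ThreadPoolExecutor(max_workers=10) as pool:
            times = list(pool.map(lambda _: one_call(), range(50)))
        print("mean %.3fs, worst %.3fs" % (sum(times) / len(times), max(times)))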

But while Mindreef and Parasoft are getting attention for blazing new trails in SOAP testing, there are other vendors, large and small, that have already begun to address SOAP-related issues -- or seem likely to do so in the near future. "Most of the traditional testing vendors have been rolling out Web services testing capability in their most recent release," noted IDC's Rogers. Doculabs' Desai cited the example of traditional load-testing players like Empirix, IBM and Mercury Interactive.

According to Simon Berman, director of product marketing for load testing at Sunnyvale, Calif.-based Mercury Interactive Corp., the company's Mercury Quality Center and Mercury Performance Center help developers and testers ensure that their Web services are unit-tested for correct functionality and data integrity, optimized for performance and scalability, and reliable.

"The creation of Web services as widely distributed entities poses new challenges for developers and testers in ensuring that they perform and scale under a vast range of usage conditions," noted Berman.

Even Computer Associates (CA) has gotten into the act with Unicenter Web Services Distributed Management (Unicenter WSDM), its product for managing Web services in a Service-Oriented Architecture (SOA). The product focuses on discovering and managing Web services and on identifying service interruptions. According to the company, it also provides real-time "in band" observation of XML transactions and is platform-independent, supporting both J2EE and .NET.

Like SOAPscope, Unicenter WSDM uses what CA calls "observers" to discover and monitor a range of service characteristics of Web services transactions. WSDM "managers" aggregate data from the observers and can automatically set alerting thresholds, helping IT respond to technical issues before they result in service disruptions.
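Stripped to its essentials, the observer/manager split is a familiar monitoring pattern: observers emit per-transaction measurements, and a manager aggregates them and raises an alert when a threshold is crossed. The miniature sketch below illustrates only the pattern; the service name, sample times and threshold are all invented, and CA's product naturally does far more.

    # Miniature observer/manager pattern: observers report per-call
    # response times; the manager aggregates and alerts past a threshold.
    from collections import defaultdict
    from statistics import mean

    class Manager:
        def __init__(self, threshold_secs=2.0):  # invented threshold
            self.samples = defaultdict(list)
            self.threshold = threshold_secs

        def record(self, service, response_time):
            """Called by an observer for each transaction it sees."""
            self.samples[service].append(response_time)

        def check(self):
            for service, times in self.samples.items():
                avg = mean(times)
                if avg > self.threshold:
                    print("ALERT: %s averaging %.2fs (threshold %.1fs)"
                          % (service, avg, self.threshold))

    if __name__ == "__main__":
        mgr = Manager()
        for t in (0.4, 3.1, 2.8, 2.9):  # times an observer might report
            mgr.record("getQuote", t)
        mgr.check()  # getQuote averages 2.30s, so the alert fires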

Theresa Lanowitz, an analyst at Gartner Inc., Stamford, Conn., counts more than a half-dozen firms on her list of major vendors tackling Web services testing. Lanowitz includes a large number of testing vendors active in the distributed systems arena in assessing Web services testing options. She said the overall leader, based on new license revenue, is Mercury Interactive, with a 54% share of the market. Other companies that figure as significant in her calculations include Empirix, Compuware, Segue, IBM Rational, Quest Software, MKS and Radview, as well as Parasoft. Other less-prominent vendors she identified include Borland, Telelogic, Solstice Software and Keynote Systems.

Lanowitz said most organizations are still using Web services in an internal mode. Often, they are just taking existing applications and seeing how they can make them work better, make them more robust, or share them more widely with Web services.

An underlying problem, she explained, is that most organizations have poor software practices and perform testing, if they do it at all, as an afterthought. "That's simply not adequate when you are talking about Web services," said Lanowitz.

What testing is done varies widely in focus and depth, from system testing to end-user testing. Now, with Web services, Lanowitz said it is critical to test down to the API level. In fact, thanks to the impetus she believes Web services will provide, where testing is done and by whom is likely to change significantly in the next 24 months, she said.

"As Web services begin to be exposed to trusted partners and beyond, we will see this specified in the requirements phase of software documents," said Lanowitz.

Because of the nature of Web services, failing to test at all levels will have consequences, said Lanowitz. Unlike traditional applications, where a modest level of unavailability could be tolerated, Web services will often be bound together in loosely coupled chains in which failures compound. If three Web services are each available 80% of the time, she noted, a transaction that depends on all three is available only about 51% of the time (0.8 x 0.8 x 0.8 = 0.512). Thus, she said, the necessity of building in quality up front will become readily apparent.

At present, she estimated, only about 25% to 30% of applications are thoroughly tested -- meaning the testing market is under-penetrated and there is plenty of room for improvement. "If Web services are not thoroughly tested, they will fail and people will not be inclined to depend on them," she added. Indeed, through 2007, Lanowitz estimates, the majority of Web services attempted in a production environment "will fail due to poor quality issues."

"It is much deeper than just interoperability," added Lanowitz. What is more, she remarked, testing alone is not the key to Web services: quality is. Only 10% to 15% of enterprises have good software engineering practices. Instead, most operate in what she calls the chaotic or reactive mode. "Most organizations feel that testing is overhead or something you do only at the end of the process: That mode of thinking must change," she said.