This is part of IEEE Spectrum's special report: What's Wrong—What's Next: 2003 Technology Forecast & Review.
The last two years have been a disaster for the software industry. Companies that loaded up on new systems during the Y2K upgrade program and the dot-com boom are now facing a turbulent stock market and the threat of a double-dip recession. "The software industry normally grows at 10 percent minimum," says Tony Picardi, senior vice president of global software at IDC, a consulting company headquartered in Framingham, Mass. "In 2001 it grew 0.3 percent, and in 2002 it grew 0.8 percent."
It's a bad time to be knocking on doors trying to sell new technology. But, paradoxically, the very conditions that have created the current slump may contain the seeds of the next wave of growth. Businesses are now focusing on getting the most out of their expensive suites of enterprise software, including applications that track finances and maintain databases. Helping them do that are industrywide efforts aimed at making software interoperate much better. "Now it's a question of getting it all to work together to improve interdepartmental work flow, and of building analytic applications to mine data," says Picardi.
Introduced in 2001, Microsoft's .NET TECHNOLOGY is designed for Web services and, as usual with this company, comes with plenty of development support. But because it is based on the Microsoft platform, some worry that adopting it would lock them into using Microsoft products
LINUX 2.6, a new version of the Linux kernel, is expected this summer. If it can deliver on its promise of improving reliability and scalability on high-end hardware, look for it to appear in ever more corporate server rooms
UNITEDLINUX was created by four major Linux manufacturers (notably, not including Red Hat) to produce a standardized, enterprise-class distribution of Linux. The first version of UnitedLinux's operating system was released in November, and if it gains widespread acceptance, it could reduce training and administration costs for companies that use Linux
Getting software to work together is known as interoperability, a catchall term that often means different things to different people. To some, it simply indicates that two programs can exchange data files. To others, the implication is that software written for one computer platform can run without modification on others. For this article, interoperability means that two pieces of software can work together without requiring significant alterations and that the interface between them is transparent to the end user.
For Fred Hoch, vice president of software programs at the Washington, D.C.-based Software and Information Industry Association (SIIA), dealing with interoperability issues will also help the software industry win over a skeptical corporate world. Software has gotten "a bad rap right now," he says, in good part because of guilt by association following the collapse of the dot-coms.
But by helping its customers achieve a real return on the investment in their existing software, the industry will make buying its future products a much more palatable proposition.
Of course, getting software to talk to each other is no easy task, not least because of entrenched attitudes and rivalries in the industry. Almost two decades ago, journalist Charles Platt coined his sardonic Fifth Law of Computing: "Computer designers talk a lot about compatibility, but secretly they hate the idea of standardization. It cramps their style."
In an effort to outsmart the competition and to lock in customers, vendors have long added their own bells and whistles to basic products, making it impossible to mix and match computer systems in the way that, say, an audiophile can with amplifiers, CD players, and speakers. This probably accounts for why IBM Corp. estimates that 40 to 60 percent of information technology (IT) development and maintenance costs goes to integration issues.
The road to interoperability
Before the 1990s, most computer systems were islands unto themselves. But the advent of the Internet meant that companies had to interface with computer systems over which they had no control, such as the desktop PCs of potential customers in e-commerce. Merging with or buying another company became an integration nightmare, and as time went on, dealing with a heterogeneous mass of legacy systems grew ever more difficult—getting the accounting department's spreadsheets to talk to the warehouse database, for example. [See "Why Managers Are Learning to Love Linux".]
By the end of the 1990s, even giants like Microsoft Corp. (Redmond, Wash.) were beginning to recognize the impracticality of the traditional solution of wholesale homogenization to a single platform. Steven VanRoekel, director of Web services technical marketing at Microsoft, recalls how customers began saying, "We don't want to be forced to have the same vendor solution on both ends—that's not where we want to take computing."
The need was for standard ways for applications from different vendors to communicate. Initially, just passing around static data files, such as text documents, posed something of a problem. In the late 1990s, the industry finally began making some headway. It "ended up coalescing together around XML," says VanRoekel. XML (eXtensible Markup Language) allows data to be combined with a description of what the data is supposed to represent, rather than being an opaque jumble of numbers, as with earlier formats. XML is a stablemate of HTML (HyperText Markup Language), the language of the World Wide Web, but it is a much more flexible format that can be used to do considerably more than simply describe Web pages.
Every community that wants to exchange information—be it of bankers or biologists—creates an XML template, or schema, for its use. That template defines how documents are to be structured, what exactly is meant by the various data labels, and so on. While this situation has created an explosion of XML schemas, complete with battles over formats reminiscent of the chaos XML was supposed to replace, things have, in fact, gotten better, explains Picardi.
First, unlike earlier formats, XML has a high degree of interoperability built in. "XML schemas are designed to work over a network, so they're platform independent," he says. Second, every XML schema provides some minimal guaranteed information about the data contained, information that can, at the very least, be used to figure out what application should be used to process it. Finally, if all else fails, as every XML schema that hopes to be used by a community is published online, developers have a dictionary of sorts to fall back on.
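The self-describing quality Picardi points to is easy to see in a fragment of XML. Here is a minimal sketch using Python's standard library; the invoice schema and its field names are invented for illustration.

```python
# Parse a small, hypothetical XML document with Python's standard
# library. Every value travels with a label saying what it represents,
# unlike an opaque record such as "1042,2003-01-15,129.95".
import xml.etree.ElementTree as ET

document = """
<invoice id="1042">
  <date>2003-01-15</date>
  <amount currency="USD">129.95</amount>
</invoice>
"""

root = ET.fromstring(document)
print(root.tag)                             # the element name: invoice
print(root.get("id"))                       # the id attribute: 1042
print(root.find("amount").get("currency"))  # USD
print(root.find("amount").text)             # 129.95
```

Because the labels travel with the data, a program that has never seen this particular schema can still discover that the document is an invoice and route it to an appropriate application.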
Now the push is on to move beyond simply exchanging data and to allow software applications to employ other applications in the same way that a person might use multiple programs to achieve a task. For example, a person has no problem using a Web browser to download an image, another program to sharpen the image, and finally an e-mail program to send the image to someone else. VanRoekel says this effort to expose functionality has ultimately resulted in plans to create so-called Web services [more about this below]. These plans originated as companies tried to figure out distributed computing, which spreads heavy computational tasks over multiple computers in a network and yet makes the results appear to the end user as if they were the product of running an application on a single, superpowerful machine.
Steve Vinoski, the Boston-based chief architect at Iona Technologies PLC (Dublin, Ireland), agrees with Picardi that business needs were becoming a driver for interoperability even before the recent recession. Iona sells software based on the Common Object Request Broker Architecture (CORBA), which provides a common interface that lets software components access each other. Vinoski remembers that when CORBA was developed in the early 1990s, it was aimed at "people like me who were building systems. In the mid-1990s, end users started to speak up about wanting to use these systems we'd developed to start solving their business problems." For example, Boeing Co. (Chicago) used Iona's products to tie together its disparate engineering and business databases and applications so that the company could keep track of all the elements that went into designing and building major aerospace projects such as jetliners.
Although collaborating in interoperability consortia, companies still jockey to have their PET SOLUTION adopted as the standard
The OPEN-SOURCE COMMUNITY has yet to settle on interoperability standards for desktop applications
Browser makers commit to the Web standards established by Tim Berners-Lee's World Wide Web Consortium (better known as the W3C) so that OPEN WEB STANDARDS have a better chance of being uniformly adopted
A UNIFORM INSTANT-MESSAGING PROTOCOL has yet to emerge; one would help instant messaging become an enterprise-class tool
A certain standard of behavior
Long established at the hardware level, standards-based approaches to interoperability are finally taking hold among software makers. "We've really taken an approach on working with standards to embrace the industry in solving [interoperability problems]," says Microsoft's VanRoekel. Microsoft was a founding member of the Web Services Interoperability Organization (WS-I), whose aim is to enable Internet-based software to work together, regardless of the language, platform, or operating system used to create and run it.
A recent addition to WS-I is Microsoft's chief rival in the interoperability arena, Sun Microsystems Inc. (Palo Alto, Calif.). "All the major players support Web services standards," says Graham Hamilton, a distinguished engineer in Sun Microsystems' Java Web services group. "We want to make sure our Java Web services can interact well with Microsoft's .Net Web services."
Definitions of Web services vary, but the core concept is that applications are built out of self-describing components that can be activated on the fly over a network. Consider, for example, what happens when a user loads a word processor with a document containing a complex calculation. The word processor could ask the network for an appropriate software component to display the calculation, then ask for another component from a different application to process the calculation, which in turn could rely on accessing a corporate database.
To the user, this functionality should appear as seamless as if it had been installed along with the word processor in the first place. Obviously, this can't happen without a high degree of interoperability—all software components must be written to conform to a universally agreed-on set of rules for understanding commands, returning results, and reporting errors.
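The activate-a-component-over-the-network idea can be sketched with XML-RPC, a simple XML-over-HTTP protocol of the same era that Python's standard library supports. The "evaluate" service, its address, and its behavior are invented for this sketch; real Web services of the period typically used richer protocols such as SOAP.

```python
# A toy Web-services sketch: expose a function over the network via
# XML-RPC, then call it from a client as if it were local. The
# "evaluate" component (summing a list of terms) is hypothetical.
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

def evaluate(terms):
    """Stand-in for a remote calculation component."""
    return sum(terms)

# Server side: bind to any free local port and serve in the background.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(evaluate, "evaluate")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the caller knows only an address and a method name; the
# request and reply travel as XML, independent of platform or language.
proxy = xmlrpc.client.ServerProxy("http://127.0.0.1:%d/" % port)
result = proxy.evaluate([19, 5, 1])
server.shutdown()
print(result)  # 25
```

To the calling code, `proxy.evaluate` looks like an ordinary local function, which is exactly the seamlessness Web services aim for.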
In theory, Web services make it possible never to install the word processor locally at all, but instead use it over the Internet on a subscription basis from some third-party server. But this would represent a huge shift in how businesses account for software costs and hence estimate their return on investment. SIIA's Hoch observes that if companies "move away from, say, a capital cost of US $2 million to a monthly cost of $5000, how do you easily quantify the long-term value to the company?"
Another challenge is that there is still a gap between creating a standard and actually achieving interoperability. "You can be conformant to a standard and not be interoperable," warns Scott Valcourt, director of the University of New Hampshire's InterOperability Lab (Durham), which works with various industry consortia on interoperability issues. Not surprisingly, the devil is in the details—in this case, in the standard's written specification. If it contains undetected ambiguities, different vendors may implement the standard in a different way, quite unaware that they are introducing incompatibilities. All can claim in good faith to be adhering to a standard, yet each makes products unable to talk to those of other vendors.
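Valcourt's point, that conformance does not guarantee interoperability, can be illustrated with a contrived specification. Suppose a spec says only that a date is "two numeric fields separated by a slash"; both parsers below conform, yet they disagree. The spec wording and vendor names are invented.

```python
# Two "conformant" readings of an ambiguous, made-up spec: "a date is
# two numeric fields separated by a slash." The spec never says which
# field is the month, so each vendor guesses differently.
def vendor_a_parse(field):
    # Vendor A assumes month/day.
    month, day = field.split("/")
    return int(month), int(day)

def vendor_b_parse(field):
    # Vendor B assumes day/month.
    day, month = field.split("/")
    return int(month), int(day)

a = vendor_a_parse("04/07")  # April 7th
b = vendor_b_parse("04/07")  # July 4th
print(a, b, a == b)          # same input, different dates
```

Each vendor can claim conformance in good faith; only testing the implementations against each other exposes the mismatch.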
Teasing these ambiguities out is a large part of the lab's mission. It works with standards bodies like the IEEE to look at standards as they emerge for areas where ambiguity can arise, explains Valcourt.
Eliminating vagueness in the specification is only one reason why creating an acceptable standard can often be a tortuous process. "I have a few gray hairs. Some of them are from my children, the rest are from standards," says Iona's Vinoski ruefully. The principal criticism of the formal standards process is that it can be very slow, as it typically requires achieving a consensus among distrustful technology competitors.
However, despite this drawback, support within the software industry for such formal standards is growing, in contrast to the former tendency to rely on rough-and-ready de facto standards.
But what impact does all this have on the antistandardization individualistic behavior described by Platt's Fifth Law? Perhaps the best evidence is to be found in such end-user application companies as Mathsoft (Cambridge, Mass.), maker of the engineering mathematics application Mathcad. Allen Razdow, senior vice president of strategic planning for Mathsoft, explained how, in the latest version of Mathcad, "we focused not so much on improving the user interface or adding more math functions, but in making sure that it will work with a company's enterprise-level systems."
Finally, software developers may have learned to work with standards without cramping their style.
To Probe Further
The Web Services Interoperability Organization (WS-I) can be found at http://www.ws-i.org/
The Linux Standards Base's Web site is at http://www.linuxbase.org/
Links to the various industry consortia set up under the University of New Hampshire InterOperability Lab can be found at http://www.iol.unh.edu/