Simulating Networks

For quick and accurate results, hybrid simulation is becoming the technique of choice


This is part of IEEE Spectrum's special report: Always On: Living in a Networked World.

Network Model: A simple network model used in a simulation typically consists of PCs and X-terminals (diskless computers, also known as thin clients), servers, routers, switches, hubs, printers, and communication lines such as a T1 line.

Simulation tools, and the relatively recent class of hybrid simulation tools in particular, are just the ticket for those faced with designing a network from scratch or keeping one operating in the face of ever-increasing traffic. Simulation is the most cost-effective way to predict how the myriad PCs, servers, printers, routers, hubs, and switches, as well as all the applications, protocols, and communication link technologies making up a network, will handle an ever-growing flood of data packets.

Fortunately, lots of tools are available to do such straightforward tasks as specifying the topology of a local-area or a frame relay network, and finding bottlenecks or points of delay. There are library models for the simulators containing the attributes of devices such as routers, switches, and workstations, and of communication links such as T1 lines, 1-Gb/s Ethernet, and wireless local-area networks (LANs). With these libraries, entire networks of anywhere from several nodes to tens of thousands of nodes can be modeled, depending on the level of detail required. "Some of our customers run detailed simulations of one node that last for hours, or days. Others run 10 000 nodes [with less detail] in a few minutes," noted Todd Kaloudis, vice president of marketing at Opnet Technologies Inc., Washington, D.C.

To check out different operating scenarios, network designers can rerun the simulators using different device throughputs specified in bits or packets per second, together with transaction rates, routing protocols, and applications such as Web browsing, videoconferencing, and so on. The payoff is in how well the designers can balance user needs with network resources and cost, noted Arnold W. Bragg, principal member of the technical staff with Fujitsu Network Communications, Raleigh, N.C.

The Modeler network simulator from Opnet Technologies is the leading hybrid simulation tool. At US $29 000 per license, it combines two types of simulation. Analytical simulation relies on mathematical models of the network, using such data as the average packet rate at a node. The other, discrete-event simulation, tracks the movement of individual packets through the network. The hybrid approach is helpful because it can speed up performance considerably while still providing the required information [see "Hybrid Simulation's Ingredients"].
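The difference between the two approaches can be sketched for the simplest possible network element, a single queue feeding one link. The discrete-event version below walks through every packet one by one; the analytical version gets the same average delay from a single closed-form formula (the classic M/M/1 result). This is a toy illustration of the two techniques, not Modeler's actual internals.

```python
import random

def simulate_mm1(arrival_rate, service_rate, num_packets, seed=42):
    """Discrete-event simulation of a single-server (M/M/1) queue.

    Returns the mean time a packet spends in the system (waiting
    plus transmission), measured packet by packet.
    """
    rng = random.Random(seed)
    t = 0.0                 # clock: arrival time of the current packet
    server_free_at = 0.0    # when the link finishes its current packet
    total_delay = 0.0
    for _ in range(num_packets):
        t += rng.expovariate(arrival_rate)       # next packet arrives
        start = max(t, server_free_at)           # wait if link is busy
        service = rng.expovariate(service_rate)  # transmission time
        server_free_at = start + service
        total_delay += server_free_at - t        # time in system
    return total_delay / num_packets

def analytical_mm1(arrival_rate, service_rate):
    """Closed-form mean delay for the same queue: 1 / (mu - lambda)."""
    return 1.0 / (service_rate - arrival_rate)
```

With 5 packets/s arriving at a link that serves 10 packets/s, both routes converge on a mean delay of 0.2 s, but the analytical one costs a single division, which is why hybrid tools lean on it wherever packet-level detail is not needed.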

For example, to study an application's performance, a designer may simulate that application precisely, that is, simulate every packet--the creation of the packet, its IP address and size, segmentation and reassembly at each layer in the protocol stack, plus any overhead, routing, and so on, Kaloudis explained. The other traffic on the network is modeled mathematically as background traffic, even if its rate is as high as gigabits per second. The math model includes basic information such as the source and destination of the traffic, and its basic properties.
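One crude way to picture how background traffic folds into a packet-level study is a residual-capacity model: the mathematically modeled background flows simply eat a fixed share of the link, and each simulated foreground packet sees only what is left. This is an assumption for illustration; commercial tools use far more refined queueing models to combine the two.

```python
def hybrid_link_delay(fg_packet_bytes, link_bps, bg_bps):
    """Transmission delay for one simulated foreground packet on a
    link whose background traffic is modeled analytically as a
    constant load rather than packet by packet (a toy model)."""
    residual_bps = link_bps - bg_bps
    if residual_bps <= 0:
        raise ValueError("background load saturates the link")
    return fg_packet_bytes * 8 / residual_bps

# Example: a 1500-byte packet on a T1 line (1.544 Mb/s) that is
# already carrying 1 Mb/s of background traffic -- the simulated
# packet effectively sees only ~0.544 Mb/s of capacity.
delay = hybrid_link_delay(1500, 1_544_000, 1_000_000)
```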


Increasingly, modeling packages are put on-line to acquire real traffic data. That's what Opnet's application characterization environment, or ACE, does. Announced last May, the software identifies and visually depicts the packet traffic of networked applications, as picked up by probes on the network. It can then derive the rate of data flow (in bytes per second), as well as other information about the application.
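The basic derivation a probe-fed tool performs is simple once the packets are captured: total the bytes of a flow and divide by the capture interval. A minimal sketch, assuming a hypothetical trace format of (timestamp, size) pairs rather than ACE's actual record layout:

```python
def flow_rate(packets):
    """Average data rate (bytes/s) of a captured application flow.

    `packets` is a list of (timestamp_seconds, size_bytes) tuples as
    a network probe might record them (hypothetical format).
    """
    if len(packets) < 2:
        raise ValueError("need at least two packets to measure a rate")
    times = [t for t, _ in packets]
    total_bytes = sum(size for _, size in packets)
    return total_bytes / (max(times) - min(times))

# Three captured packets spanning one second: 3500 bytes in 1.0 s.
trace = [(0.0, 1500), (0.5, 1500), (1.0, 500)]
```

The resulting rate, together with source and destination addresses and per-packet sizes, is exactly the kind of summary a simulator can replay in "what-if" scenarios.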

The ACE data is subsequently combined with Opnet's Modeler or DecisionGuru software to simulate the captured application traffic in "what-if" scenarios, to predict the outcome.

The blessing of modules

Users of simulation software often turn to software modules to address specialized needs, rather than having to add code to general network-simulation packages to meet those needs. Some companies--Opnet Technologies and Netcracker, Waltham, Mass.--are already offering such modules, typically in the form of libraries that are included with the main software or available as an option at an added cost. "I would like [vendors] to build more of these," Bharat Doshi, senior director of performance analysis at Lucent Technologies Inc., Murray Hill, N.J., told IEEE Spectrum. He noted that his staff writes a lot of simulation modules for wireless networks because "new problems [show up] all the time." He remarked that today's simulation tools do not have modules for universal mobile telecommunications systems (UMTS), the next generation of wireless systems.

Doshi also noted that the commercially available graphic modeling environment is inadequate for simulating such complex problems as the restoration of an optical network following a disruption caused by, say, a cable break or power interruption. Very sophisticated mathematical algorithms are needed to solve a problem like who gets how much capacity, and when, during the restoration. "This is better done in C, or C++," he added, meaning that Lucent engineers must write their own code to solve these problems.

Ease of use is important

Ease of use is a blessing to those trying to handle such tough problems, and it is the chief claim of NetRule by Analytical Engines Inc., McLean, Va., a software package that relies primarily on analytical techniques to predict network performance. Ever since Analytical Engines began marketing NetRule in late 1998, the company has emphasized the tool's simple data model and its intuitive graphical depiction of networks [see illustration].

Although NetRule scales well and can model networks with over 100 000 nodes, it is inexpensive enough ($7500) to be used to explore performance issues on a network of any size. NetRule 3.0, expected to become available at press time, models such quality-of-service (QoS) techniques as class-based weighted fair queuing, which ensures that short messages get fair access to the network compared with the access afforded longer messages, such as graphics files. NetRule 3.0 also predicts all of the QoS measures, as does Opnet's Modeler. Typical measures include system availability (up time), packet drop rate (the percentage of packets that fail to reach their destination), and packet latency (end-to-end packet delay).
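The fairness property of weighted fair queuing comes from serving packets in order of a virtual finish time, which grows per class in proportion to bytes sent divided by the class's weight. The sketch below captures that ordering rule in its simplest form (all packets assumed present at once, equal weights); it is an illustration of the idea, not NetRule's or any router's implementation.

```python
import heapq

def wfq_schedule(packets, weights):
    """Weighted fair queuing sketch.

    Each packet gets a virtual finish time: the class's previous
    finish time plus size/weight. Serving packets in finish-time
    order keeps a class of short messages from being starved by a
    class sending large transfers.

    `packets`: list of (class_name, size_bytes) in arrival order.
    `weights`: dict mapping class name -> relative link share.
    """
    finish = {c: 0.0 for c in weights}
    heap = []
    for seq, (cls, size) in enumerate(packets):
        finish[cls] += size / weights[cls]       # virtual finish time
        heapq.heappush(heap, (finish[cls], seq, cls, size))
    order = []
    while heap:
        _, _, cls, size = heapq.heappop(heap)
        order.append((cls, size))
    return order

# A 9000-byte graphics transfer and two 500-byte web messages, equal
# weights: the short messages are served before the long transfer.
order = wfq_schedule([("bulk", 9000), ("web", 500), ("web", 500)],
                     {"bulk": 1.0, "web": 1.0})
```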

Research focal points

While vendors update and improve commercial simulation tools, researchers are tackling longer-range goals. John Heidemann, researcher with the Information Sciences Institute at the University of Southern California in Marina del Rey, noted the increasingly pressing need to simulate very large networks, or perhaps even the entire Internet, as one of the main research focal points today. Of course, this latter goal is easier said than done [see "Simulation is Crucial"].

Heidemann added that researchers in network simulation are also preoccupied with how to study the network at different time scales, say, at 1-, 10-, and 100-second intervals, and with how to validate the simulation results. Such studies are needed because "there is increasing evidence that different protocols behave very differently at different time scales," Heidemann noted. For example, Web traffic is bursty across all time scales, which may not be the case with audio traffic, he explained.
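One common way to quantify burstiness at a given time scale is to bin packet arrivals and compute the coefficient of variation (standard deviation over mean) of the per-bin counts; repeating this at 1-, 10-, and 100-second bins shows whether the burstiness persists as the scale grows. A minimal sketch of that measurement:

```python
def burstiness(timestamps, bin_seconds, duration):
    """Coefficient of variation of packet counts per time bin.

    For bursty, self-similar traffic such as the Web, this stays
    high as bin_seconds grows; smooth traffic falls toward zero.
    """
    n_bins = int(duration / bin_seconds)
    counts = [0] * n_bins
    for t in timestamps:
        idx = min(int(t / bin_seconds), n_bins - 1)
        counts[idx] += 1
    mean = sum(counts) / n_bins
    if mean == 0:
        return 0.0
    var = sum((c - mean) ** 2 for c in counts) / n_bins
    return (var ** 0.5) / mean

# Evenly spaced arrivals over 10 s vs. 1000 packets crammed into
# the first second: same packet count, very different burstiness.
smooth = [i * 0.01 for i in range(1000)]
bursty = [i * 0.001 for i in range(1000)]
```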

As for validation, the networking community is developing better techniques for demonstrating that simulations actually match real-world networks, Heidemann noted.

To Probe Further

Insight into the use of network design and simulation tools is provided in "Which Network Design Tool is Right for You?," by Arnold W. Bragg, writing in IT Professional, a new IEEE magazine (September/October 2000, pp. 23-31).


