In the new world of competition, power traders, grid managers, public service boards, and the public itself all need to take in what's happening at a glance.
Visualization software packs a large amount of information into a single computer-generated image, enabling viewers to interpret the data more rapidly and more accurately than ever before. This kind of software will become still more useful, even indispensable, as electricity grids are integrated over ever-larger areas, as transmission and generation become competitive markets, and as transactions grow in number and complexity.
Tracking and managing these burgeoning transaction flows puts operating authorities on their mettle. While the electric power system was designed as the ultimate in plug-and-play convenience, the humble wall outlet has become a gateway to one of the largest and most complex of man-made objects. For example, barring a few islands and other small isolated systems, the grid in most of North America is just one big electric circuit. It encompasses billions of components, tens of millions of kilometers of transmission line, and thousands of generators with power outputs ranging from less than 100 kW to 1000 MW and beyond. Grids on other continents are similarly interconnected.
In recent years, a further complicating factor has emerged. Along with the broadening integration of power systems has come the increased transfer of large blocks of power from one region to another. In the United States, because of varying local power loads and availability, utilities purchase electricity from distant counterparts and independent suppliers, exploiting price differentials to economize on costs. The Tennessee Valley Authority, for one, which provides power to more than 8 million residents in seven states using over 27 000 km of transmission lines, handled a mere 20 000 transaction requests through its service territory in 1996, compared with 300 000 in 1999.
The net effect is that data once of interest mainly to small cadres of utilities now must be communicated to the new entities being established to manage restructured grids. In the United States, that means independent system operators (ISOs) and regional transmission organizations (RTOs), which have to be able to grasp fast-changing situations instantaneously and evaluate corrective strategies nearly as fast.
Power marketers' needs, too, become more urgent, as access to the grid is opened and competition among generators is introduced across the United States and elsewhere. They must be able to see just how much existing and proposed transactions will cost, and how much electricity is available at any time and at any point in the system.
Finally, concepts like power flow, loop flow, and reactive power, which once mattered only to the engineers directly involved in grid operations, now must be made intuitive. This is because they must be communicated to public service commissions and the consumer-voters to whom such boards are answerable.
In short, whether the client/user is a power marketer, a grid operator or manager, a public authority, or a member of the public, power system visualization tools can aid their comprehension by lifting the truly significant above background noise. Such tools can expedite decision-making for congestion management, power trading, market organization, and investment planning for the long term.
The visualization tools illustrated here are available from PowerWorld Corp., Urbana, Ill. Visualization tools offered by others rely largely on regularly updated text displays. ABB, Alstom ESCA, GE Harris, and Siemens, for example, offer tools that are part of larger energy management systems packages.
How flows are managed
The usual reason that a large transfer of power can be hard to handle is that there are few mechanisms to control its route through the transmission system from generator to distant load. Often that route is indirect, dictated by the impedances of the lines and places where power enters or leaves the system. In effect, a single transaction between a generator and a utility spreads throughout a large portion of the grid--a phenomenon termed loop flow.
(To be sure, current can be and is directly guided during high-voltage direct-current [HVDC] transmission. And ac current is being nudged in desired directions by devices like phase-shifting transformers and series compensation capacitors, often lumped together as flexible ac transmission system (FACTS) devices. However, so few of these devices are installed in most large power systems that, in effect, transmission flows are not controllable.)
The percentage of a transfer that flows on any component in the grid--a transformer, say--is known, in language developed for the U.S. Eastern Interconnect, as the power transfer distribution factor (PTDF). A transaction that would send power through an overloaded component, in a direction to increase the loading, may not be allowed, or if already under way, may have to be curtailed. The U.S. procedure for ordering such curtailments is known as transmission-line loading relief (TLR). Its developer was the North American Electric Reliability Council (NERC), the utilities' voluntary reliability organization in Princeton, N.J.
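Under the usual DC power-flow approximation, PTDFs fall out of a linear network model: inject one unit of power at the seller's bus, withdraw it at the buyer's, and compute what fraction appears on each line. Here is a minimal sketch in Python on a hypothetical three-bus network with equal line reactances (all names and values are illustrative, not drawn from any real system):

```python
import numpy as np

# Hypothetical 3-bus network: lines keyed by (from-bus, to-bus),
# values are reactances in per unit. Bus "1" is the slack bus.
lines = {("1", "2"): 0.1, ("1", "3"): 0.1, ("2", "3"): 0.1}
nonslack = ["2", "3"]
idx = {b: i for i, b in enumerate(nonslack)}

# Build the reduced susceptance matrix B' (slack row/column removed).
B = np.zeros((2, 2))
for (i, j), x in lines.items():
    b = 1.0 / x
    for k in (i, j):
        if k in idx:
            B[idx[k], idx[k]] += b
    if i in idx and j in idx:
        B[idx[i], idx[j]] -= b
        B[idx[j], idx[i]] -= b

# Transfer of 1 p.u. from bus 2 (seller) to bus 3 (buyer).
p = np.zeros(2)
p[idx["2"]] += 1.0
p[idx["3"]] -= 1.0
theta = dict(zip(nonslack, np.linalg.solve(B, p)))
theta["1"] = 0.0  # slack bus angle reference

# PTDF of each line = its DC flow per unit of transfer.
ptdf = {(i, j): (theta[i] - theta[j]) / x for (i, j), x in lines.items()}
```

On this symmetric triangle, two-thirds of the transfer takes the direct line from bus 2 to bus 3, while one-third loops the long way around through bus 1--exactly the loop flow described above.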
To reiterate, a grid component owner that detects overloading files notice with the relevant authority--an ISO or RTO, for example--and asks for relief. The independent operator, or whoever, thereupon orders loading relief measures. For the component in question, any transaction involving a distribution factor higher than a predetermined level--set by NERC at 5 percent of the transaction--is a candidate for curtailment. In other words, if more than 5 percent of the power transferred as part of a transaction would flow over a grid component subject to a TLR, the transaction may be scaled back or canceled.
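The 5 percent screening rule itself amounts to a simple filter over transaction PTDFs. A sketch, with invented transaction IDs and PTDF values for illustration:

```python
# NERC's curtailment-candidate threshold for a component under TLR.
TLR_THRESHOLD = 0.05

# Hypothetical transactions: ID -> PTDF on the overloaded element.
transactions = {
    "T-101": 0.012,   # negligible impact on the element
    "T-102": 0.084,   # above threshold: curtailment candidate
    "T-103": 0.051,   # just above threshold: also a candidate
}

# Transactions whose share of flow on the element exceeds 5 percent.
candidates = sorted(t for t, f in transactions.items() if f > TLR_THRESHOLD)
```

Real TLR procedures also rank candidates by priority and firmness of service; the filter shown here is only the first screening step.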
Those TLR measures in turn will affect other existing and proposed transactions, requiring further near-instantaneous analysis by utilities, grid supervisors, and power marketers. The need at every level for state-of-the-art visualization tools is obvious, since any bottleneck in this complex system can quickly cause brownouts, blackouts, or nasty price spikes.
Averting price spikes, islanding
Problems with grid management are not necessarily the cause of electricity outages or price spikes--California's current electricity crisis seems to have been induced primarily by unforeseen generating shortages and misguided public policy. Here, visualization can help only indirectly, by better showing policy-makers the potential impact policy decisions can have on grid operation.
But when grid congestion is at the root of problems and floods of data are involved, visualization tools like contouring, dynamic pie charts, animated diagrams, and two- and three-dimensional outlines have much more to offer.
Congestion played a pivotal role, for example, in the notorious U.S. midwestern price spikes of June 1998. That month, spot market prices for electricity soared three-hundredfold from US $25 to $7500 per megawatt-hour. Though there were many contributing factors, the most important were barriers to importing electricity from outside the region. Electricity was available elsewhere on the grid to the east and west, but could not be transferred because of overloads (congestion) on just two elements: a transmission line in northwest Wisconsin and a transformer in southeast Ohio.
The situation at the time of the June 1998 price spikes is diagrammed in [ Fig. 1], where the small ovals represent operating areas in the Eastern Interconnect, each a potential seller. In the transaction illustrated, the buyer was a utility in northern Illinois. The contour indicates what percentage of the power transfer requested would have flowed through overloaded devices; shaded areas on the left could not sell because of the overload in northwest Wisconsin, those on the right because of the overload in southeast Ohio.
The visualization provides a picture of the complex interaction between the grid and the power market, allowing market participants to respond more quickly to changing conditions. With the market segmentation visualized in [ Fig. 1], power buyers in the affected areas could move quickly to procure long-term power capacity contracts, rather than having to buy at the astronomical spot market prices.
In the past, to form a mental picture of how line-loading relief measures might affect a market or reliability area, marketers or operators would have had to scan a long numerical list of distribution factors--no easy task once the list grows beyond a hundred or so entries. This is because in any large grid system, there are huge numbers of distribution factor sets, each dependent on a particular pair of buyers and sellers. Contouring provides a good solution, making the impact of loop flow apparent at a glance.
Another way of mapping the implications of TLRs is illustrated in [ Fig. 2]: the map shows the distribution factors for a hypothetical power transfer from a utility in eastern Wisconsin to the Tennessee Valley Authority. Note that the transfer affects lines as far away as Nebraska and eastern Virginia. Of the 45 000 lines modeled in the case, 171 had PTDFs above 5 percent, while for 578 the PTDFs were above 2 percent.
With the aid of such tools, a marketer can easily start considering a host of what-if scenarios. How might loading relief on a transmission line affect market participants other than those directly involved in a transaction? What if there is an outage of a major transmission line? What is the outlook for other potential buyers?
Visualizing voluminous flows
To determine how power moves through a transmission network from generators to loads, it is necessary to calculate the real and reactive power flow on each and every transmission line or transformer, along with associated bus voltages (in other words, the voltages at each node). With networks containing tens of thousands of buses and branches, such calculations yield a lot of numbers. Traditionally they were presented either in reams of tabular output showing the power flows at each bus or else as data in a static so-called one-line diagram. (One-line diagrams are so named because they represent the actual three conductors of the underlying three-phase electric system with a single equivalent line.)
The visualization challenge is to make these concepts intuitive. One simple yet effective technique to depict the flow of power in an electricity network is to use animated line flows [see Fig. 3]. Here, the size, orientation, and speed of the arrows indicate the direction and magnitude of power flow on each line, bringing the system almost literally to life.
Dynamically sized pie charts are another visualization idea that has proven useful for quickly detecting overloads in a large network. On the one-line, the percentage fill in each pie chart indicates how close each transmission line is to its thermal limit.
When thousands of lines must be considered, however, checking each and every value is not an option. Of course, tabular displays can be used to sort the values by loading percentage, but with a loss of geographical relevance. Because engineers and traders are mostly concerned with transmission lines near or above their limits, low-loaded lines can be eliminated by dynamically sizing the pie charts to become visible only when the loading is above a certain threshold.
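The thresholding described above reduces to a simple filter over line loadings. A sketch, with hypothetical line names and loading fractions (a real display engine would attach this filter to its drawing loop):

```python
# Hypothetical line loadings as a fraction of each line's thermal limit.
loadings = {"A-B": 0.42, "B-C": 0.87, "C-D": 1.03, "D-E": 0.15}

# Only draw pie charts for lines loaded above 80 percent of their limit.
THRESHOLD = 0.80

# Keep the heavily loaded lines, worst first, so the display surfaces
# the overloaded line (C-D, above 100 percent) immediately.
visible = {line: pct
           for line, pct in sorted(loadings.items(), key=lambda kv: -kv[1])
           if pct >= THRESHOLD}
```

Unlike a sorted tabular display, the surviving pie charts stay anchored to their lines on the map, so the geographic context is preserved.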
Contouring the grid
Using pie charts to visualize these values is helpful, unless a whole host of them appear on the screen. Here, an entirely different visualization approach is useful--contouring.
For decades, power system engineers have represented bus-based values by drawing one-line diagrams embellished with digital numerical displays of the nearest bus's values. The results, being numerical, are precise and displayed next to the bus to which they refer. But for more than a handful of buses, it takes a lot of time to find a pattern. Contours are a familiar way of displaying continuous, spatially distributed data. The equal-temperature contours provided in a newspaper's weather forecast form a well-known example.
The trouble with contouring power system data is that it is not spatially continuous. Bus voltage magnitudes exist only at buses, and power only as flows on the lines, yet the spaces between buses and lines appear in contour maps as continuous gradients, not as gaps.
In practice the artificially blended spaces between nodes and lines do not matter much, as the main purpose of a contour is to show trends in data. Values are exact only at the buses or on the lines. Colors can be used to represent a weighted average of nearby data-points. This color gradation brings out the spatial relationships in the data.
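One common way to produce such a weighted average is inverse-distance weighting, which is exact at the buses themselves and blends smoothly in between. A minimal sketch (coordinates and voltages are hypothetical, and production contouring engines use more sophisticated gridding):

```python
import numpy as np

# Hypothetical bus locations (x, y) and per-unit voltage magnitudes.
bus_xy = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
bus_voltage = np.array([1.02, 0.97, 1.00])

def contour_value(point, power=2.0, eps=1e-9):
    """Inverse-distance-weighted average of bus values at a map point.

    Each bus is weighted by 1/distance**power, so the result is exact
    at a bus and a smooth blend of nearby buses everywhere else.
    """
    d = np.linalg.norm(bus_xy - point, axis=1)
    if d.min() < eps:                      # sitting on a bus: exact value
        return float(bus_voltage[d.argmin()])
    w = 1.0 / d**power
    return float(np.sum(w * bus_voltage) / np.sum(w))
```

Evaluating `contour_value` over a grid of screen points and mapping the results to a color scale yields the kind of contour described in the text.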
Power flows matter not only to operations engineers and power traders, but also to the authorities charged with deciding whether two utilities should be allowed to merge, whether a new combustion turbine is needed in a trendy suburb, or whether the absence of a single transmission line could send electricity prices soaring. Overloads on just a few transmission lines can segment, or separate, even large power markets, causing prices to spike, with some players reaping huge windfall profits. The end result: serious misgivings about the whole process of restructuring and deregulation.
The central concern is that benefits from breaking up the old vertically integrated utilities will be for nought if the newly unbundled generation and transmission companies are able to exercise quasi-monopolistic power over local and regional markets. Collusion is one method, and another is "gaming" the system--taking advantage of legal loopholes and operational quirks to create or exploit bottlenecks and chokepoints.
Such abusive power, dubbed market power in the electricity context, refers to the ability of one seller or a group of sellers to maintain prices above competitive levels for a significant period of time. This can be done in various ways, depending on how markets have been organized to set prices in jurisdictions adopting market mechanisms--notably the United Kingdom, Norway, New Zealand, and, in the United States, the California Power Exchange, the PJM (Pennsylvania New Jersey Maryland) Interconnection, the New England Power Pool and New York ISO.
In most jurisdictions introducing competition, markets are organized so that spot prices can be determined at every node (or bus) in the system. The U.S. name for this is the locational marginal price (LMP). Under truly competitive conditions, it equals the marginal cost of providing electricity to that point in the transmission system, where the provider is any generator bidding into the system.
In the absence of overloads, spot marginal prices are about equal across an entire power market (though this depends somewhat on how resistive line losses are taken into account). But when overloads occur, spot prices can rapidly diverge. Because LMPs are bus-based values, contouring is again extremely useful for showing market-wide patterns.
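How congestion splits prices can be seen in a toy two-bus market: a cheap generator serves a remote load over a tie line, and a dearer local unit must make up whatever the line cannot carry. A sketch with invented costs and limits (losses and bidding behavior are ignored):

```python
def lmp_two_bus(load_mw, cheap_cost, dear_cost, line_limit_mw):
    """Marginal prices at the two buses of a toy congested market.

    cheap_cost: $/MWh of the remote generator (behind the tie line).
    dear_cost:  $/MWh of the local generator at the load bus.
    Returns (LMP at the generator bus, LMP at the load bus).
    """
    imported = min(load_mw, line_limit_mw)   # cheap power over the line
    local = load_mw - imported               # remainder from the dear unit
    lmp_gen_bus = cheap_cost
    # If the line binds, the next MW at the load must come from the
    # expensive local unit, so the load-bus LMP jumps to its cost.
    lmp_load_bus = cheap_cost if local == 0 else dear_cost
    return lmp_gen_bus, lmp_load_bus
```

With a 100-MW load, a $20/MWh remote unit, a $50/MWh local unit, and a 60-MW line, the two LMPs split to $20 and $50; raise the line limit above the load and both buses settle at $20.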
As an example, [ Fig. 4] shows a contour of the LMPs generated by an optimal power flow (OPF) study using a 9270-bus system to model those in the northeast [see "A Brief History of the Power Flow"]. An OPF sets the outputs of generators to minimize the total cost of operating the power system, while at the same time ensuring that no transmission system elements are overloaded. In the study, marginal prices were calculated for 5774 buses, and some 2000 of these values were used in creating the contour. The contour is superimposed on a map of the high-voltage transmission lines in the Northeast, and the pie charts indicate which transmission lines or corridors are congested.
Note the price differential between New York and New England caused by a congested line on the boundary between northern New York and New England. The pocket of high prices in western New York is due to a constraint on a single 230/115-kV transformer. A transmission element is said to constrain the power system when generation must be moved from the most economical operating point in order to reduce the loading on the element. This constraint can be eliminated by bringing on stream a relatively small 85-MW generator on the constrained side of the transformer.
To be truly effective, however, computer visualization must be interactive and it must be fast. Using a standard desktop computer, the contour in [ Fig. 4] can be re-created, at a reasonable resolution, within a few seconds. Fast contouring, coupled with easy zooming and panning, equips the market analyst with an interactive tool with which to quickly explore a power system data set. For instance, zooming could be used to provide more details about pricing across Massachusetts, while dynamically sized pie charts could be used to show lines that are close to, but not yet exceeding, their limits.
The grid goes 3-D
Contouring can be quite helpful when one is primarily concerned with the visualization of a single type of spatially oriented data, such as bus voltages or transmission line flows. But the data of interest in a power system could include a long list of independent and dependent variables. Bus voltage magnitudes and prices, transmission line loadings and PTDFs, generator reserves and bids, and scheduled flows between areas all come to mind.
In more advanced applications involving OPF and available transfer capability (ATC) calculations, this list of variables is even longer. ATC calculations determine maximum amounts of megawatt transfers that can occur across the transmission system. One solution is to leave the two-dimensional views behind and enter the third dimension.
Interactive, 3-D visualization is certainly nothing new. Nevertheless, applying it to power systems raises several issues. First and foremost, in visualizing power system data there is usually no corresponding "physical" representation for the variables. For example, there is no physical correlate to the reactive power output of a generator, or to the marginal cost of enforcing a transmission line constraint. These are abstract calculated values, to be added as desired to diagrams in which physical flows are represented in the first two dimensions.
The abstract nature of the data makes this kind of visualization different from the characteristic use of interactive 3-D for some types of scientific visualization, in which the purpose of the environment is to visualize physical phenomena, such as flows in a wind tunnel or molecular interactions. To address this issue, an environment based upon the traditional one-line representation (in which the three ac phases are represented by single lines) serves as a good starting point. Those concerned with power systems are familiar with it. The new environment differs from the old in that a traditional one-line is a two-dimensional representation, whereas the new is 3-D, opening a world of possibilities of how to use this additional dimension.
The 3-D environment must also be highly interactive. In power systems, there is too much data for everything of interest to be displayed. Rather the user should be able to access the data of interest quickly and intuitively.
This leads to the question of how to interact with the 3-D environment. With a 2-D one-line, there are just three degrees of freedom associated with viewing: panning in either the x or y directions, and zooming. The 2-D one-line could be thought of as lying in the xy -plane, with the viewing "camera" poised above. Panning the one-line can then be thought of as moving the camera in the x and y directions. And zooming is simply changing the height of the camera above the one-line. A 3-D environment has the same three degrees of freedom, but adds three more, because now the camera itself can be rotated about each of its three axes. Navigation can, however, be simplified by restricting camera rotation.
One useful approach is to allow rotation about two axes only. If the camera rotates about the axis passing through its sides, it can change its angle with respect to the horizon (elevation). Alternatively, it can rotate about the axis passing through it from top to bottom. Rotation about the axis passing through the camera from front to back (twist) is not allowed.
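The restricted navigation model can be captured in a few lines: pan, zoom, heading, and elevation are allowed, while twist is deliberately omitted and elevation is clamped between the horizon and straight down. A sketch (the class and parameter names here are my own, not from any particular package):

```python
from dataclasses import dataclass

@dataclass
class Camera:
    """Five-degree-of-freedom camera over a one-line in the xy-plane.

    Pan (x, y), zoom (height), heading (rotation about the vertical
    axis), and elevation are supported; twist is deliberately excluded
    to keep navigation simple.
    """
    x: float = 0.0
    y: float = 0.0
    height: float = 100.0
    heading: float = 0.0      # degrees about the vertical axis
    elevation: float = 90.0   # 90 = looking straight down at the plane

    def pan(self, dx: float, dy: float) -> None:
        self.x += dx
        self.y += dy

    def zoom(self, factor: float) -> None:
        # Zooming in is just lowering the camera toward the plane.
        self.height /= factor

    def rotate(self, d_heading: float, d_elevation: float) -> None:
        self.heading = (self.heading + d_heading) % 360.0
        # Clamp so the view never drops below the horizon (0 degrees)
        # or rotates past straight down (90 degrees).
        self.elevation = max(0.0, min(90.0, self.elevation + d_elevation))
```

With the 2-D one-line as a special case (elevation fixed at 90 degrees), users can fall back to the familiar flat view at any time.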
The results can be stunning. Suddenly the one-lines come to life. The 3-D environment gives the viewer a greater sense of involvement with the system, making important information harder to overlook and otherwise hidden relationships easier to see.
For example, one limitation on the transmission system is the need to maintain high enough bus voltages--values that contouring can display quite effectively but minus information about controlling factors, such as the reactive power output of generators, that could correct problems with the voltages. Voltage security analysis requires a simultaneous awareness of both the bus voltage magnitudes and the generator reactive characteristics, including the generator reactive reserves. Showing all this information numerically on a 2-D one-line would only be effective for a very small system.
The 3-D alternative is to draw the 2-D one-line in the xy -plane using a perspective projection, in which closer objects appear larger. The generator reactive output and reserves are then shown using cylinders in the third dimension. In [ Fig. 5], the height of each cylinder is proportional to the maximum reactive capacity of the generator; the dark lower part of the cylinder indicates the present reactive output, while the lighter part indicates the reactive reserves. The bus voltage values are indicated by a contour in the xy-plane; voltages below 98 percent of desired values are shaded.
Thus the 3-D one-line now shows, at one and the same time, the location of low voltages, the present generator reactive power outputs, and the reserves. It does a good job of conveying qualitative information about the magnitude of these values, but not exact numerical values. [ Fig. 5], for instance, shows only that the reactive power at bus 20 is about 50 percent of its maximum, not its actual value. In some situations this could be a serious limitation. Hence the 3-D one-lines are meant to supplement, rather than to replace, existing display formats.
The last issue to address is performance. For effective use of the 3-D environment, fast display refresh is crucial. This in turn depends on, among other things, the speed of the computer's processor, the speed of the graphics card, whether the graphics card has hardware support for 3-D, and software considerations such as the level of display detail. At present, PCs with the best mainstream display cards are approaching display rates of 10 million polygons per second, enough for good refresh rates plus a fair amount of detail.
Performance is driven strongly by computer games--their popularity, indeed, could not come at a better time for power visualization.
Spectrum Editor: William Sweet