Keeping the Lights ON!

Maintaining reliable grids in a deregulated power industry will get harder, as temptations to cut corners multiply

Of all the energy conversion processes in existence, the U.S. electric power system is the largest and most complex. Unlike such industries as communications and transportation, where demand in excess of supply produces a "busy signal" or temporary gridlock, the electric power system must match supply and demand instantaneously. Failure to sustain this balancing act can result in partial or complete breakdown of the grid. Even a brief disruption in supply or an inadequate voltage can cause key industries like oil refining and high-technology manufacturing to suffer expensive shutdowns and lengthy production-line recovery times.

With deregulation introducing market principles into the power industry, concern over the reliability of the electricity supply has magnified. This is because the emphasis seems to be shifting from reliability as the mainstay of the nation's essential power base to reliability as a commodity in the power market.

At the root of all the changes is the industry's movement from simple "wheeling" (trading power) between utilities to wholesale and retail competition among utilities and distributors, a move that was initiated in part by the 1992 Energy Policy Act and Order 888, issued by the Federal Energy Regulatory Commission (FERC), Washington, D.C., in 1996. Now, nonutility generators not only have the right to sell into the market, but also are afforded open and equal access to the transmission grid--all to foster competition, increase efficiency, and lower energy costs. Consequently, issues of reliability and security have come under pressure from financial interests, and utilities' previous "obligation to serve" has been supplanted by entrepreneurial vigor [see "What is power system reliability?"].

Since the issuance of the commission's Order 888, the paramount concerns within the industry have been that:

Market economics would define the optimal cost/benefit tradeoff that determines how system reliability is maintained and provided.

Voluntary cooperation between utilities and integrated planning would disappear.

Voluntary compliance with reliability issues would be lacking to the detriment of the global network.

Open access would lead to multiple transactions, system overloads, and operational difficulties.

The current and near future

In the face of these concerns, the U.S. electric power industry has performed a mammoth task in moving forward in its restructuring efforts while keeping the lights on. It is true that a high degree of chaos still exists, but it must be remembered that, despite the high level of cooperation that has existed in the past, the U.S. industry is greatly fractured. Today's electric utilities exist in many forms--investor-owned, state-owned, federally owned, and municipals. What's more, each state has its own Public Utility Commission, and interconnections cross utility and state boundaries.

To review progress in North America, it is often helpful to look overseas to see how the global move to privatization and restructuring is functioning there. But the comparisons for electric power are often futile. Much progress has been made, for example, in nations whose countrywide systems consist of a single entity. A case in point is the United Kingdom, where privatizing the system was a comparatively simple process. The Central Electricity Generating Board both generated and transmitted power to 12 area distribution companies within an area that is about the size of New York State.

In the midst of chaos in North America, however, several key initiatives in the energy market are focusing on reliability issues, notably the work of the North American Electric Reliability Council (NERC), based in Princeton, N.J., and its regional reliability councils.

NERC at work

One such initiative focuses on simplifying and standardizing information on complex energy transactions taking place in the interconnected networks of North America. To improve the flow of such information and describe transmission limits in a consistent and commercially viable manner, NERC has developed such tools as the transaction information system and the interchange distribution calculator.

The transaction information system tags all interchange transactions with information on sources, intermediary entities, and destinations. These data are in turn fed into any one of the 22 designated security coordinators that are spread throughout the NERC regions and are tasked with identifying conditions that threaten system security. The coordinators use the interchange distribution calculators to determine which transactions are affecting the loading of critical transmission facilities.
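
The arithmetic behind such a calculator can be pictured with a small sketch. The example below is only a schematic stand-in for NERC's actual software: it assumes that power transfer distribution factors (PTDFs) for one critical line have already been computed from a network model, and the area names and numbers are invented.

```python
# Illustrative sketch (not NERC's interchange distribution calculator itself):
# estimate how much each tagged transaction contributes to the flow on one
# critical transmission line, using precomputed power transfer distribution
# factors (PTDFs). All names and numbers are hypothetical.

# PTDF: the fraction of a source-to-sink transfer that appears on the monitored line.
ptdf_on_line = {
    ("AreaA", "AreaB"): 0.42,
    ("AreaC", "AreaB"): 0.17,
    ("AreaA", "AreaD"): -0.08,   # negative: this transfer relieves the line
}

# Tagged interchange transactions: (source, sink, scheduled MW).
transactions = [
    ("AreaA", "AreaB", 300.0),
    ("AreaC", "AreaB", 150.0),
    ("AreaA", "AreaD", 200.0),
]

def transaction_impacts(transactions, ptdf_on_line):
    """Return each transaction's MW contribution to the monitored line."""
    impacts = []
    for source, sink, mw in transactions:
        factor = ptdf_on_line.get((source, sink), 0.0)
        impacts.append(((source, sink, mw), factor * mw))
    return impacts

if __name__ == "__main__":
    for (src, snk, mw), loading in transaction_impacts(transactions, ptdf_on_line):
        print(f"{src} -> {snk} ({mw:.0f} MW): {loading:+.1f} MW on the critical line")
```

Ranking the largest contributors first is, in essence, what lets a security coordinator decide which transactions to curtail when a critical line approaches its limit.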

Building on today's area-to-area limit expressions, NERC panels also developed a so-called flowgate, which is intended to measure and monitor total network and transaction impacts on a relatively small number of whole network interface points.

Another development introduced through NERC is the transmission reservation and scheduling process. This procedure lets transmission customers reserve and schedule transmission service reliably, in accordance with the actual flows that proposed interchange transactions would produce on the system.

An early parallel information source, intended to promote competition, is the so-called available transmission capability, or ATC. Designed to communicate to generation buyers and sellers whether sufficient transmission capability is available between sources and sinks, ATC is calculated and published on an Internet-based bulletin board system called Oasis (for Open Access Same-time Information System) [see "Midnight at Oasis: tapping into scarce transmission capacity"]. If transmission capacity is available, a buyer and seller can reserve a firm or non-firm transaction on the interface, reducing the ATC available for the next potential user of that transmission.
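
The bookkeeping behind an ATC posting can be sketched in a few lines. The example below follows the commonly quoted decomposition of ATC as total transfer capability less reliability and capacity-benefit margins less existing reservations; the class, method names, and numbers are illustrative and are not Oasis's actual implementation.

```python
# Hedged sketch of ATC bookkeeping on a single interface. The decomposition
# ATC = TTC - TRM - CBM - existing reservations follows common industry usage;
# everything else here (names, numbers) is invented for illustration.

class InterfaceATC:
    def __init__(self, ttc_mw, trm_mw=0.0, cbm_mw=0.0):
        self.ttc_mw = ttc_mw      # total transfer capability
        self.trm_mw = trm_mw      # transmission reliability margin
        self.cbm_mw = cbm_mw      # capacity benefit margin
        self.reserved_mw = 0.0    # firm and non-firm reservations already granted

    def atc(self):
        """Capability still available to the next prospective user."""
        return self.ttc_mw - self.trm_mw - self.cbm_mw - self.reserved_mw

    def reserve(self, mw):
        """Grant a reservation if capability remains; otherwise refuse it."""
        if mw <= self.atc():
            self.reserved_mw += mw
            return True
        return False

interface = InterfaceATC(ttc_mw=1000.0, trm_mw=50.0, cbm_mw=100.0)
print(interface.reserve(600.0), interface.atc())   # True  -> 250 MW left
print(interface.reserve(300.0), interface.atc())   # False -> request refused, 250 MW left
```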

Of great importance in this restructuring effort is NERC's desire to transform itself from a watchdog entity into a self-regulating reliability organization, authorized to mandate compliance with planning and operating standards. (Established by the utility industry following the Northeast blackout of 1965, NERC had as its objectives the development of planning and operating standards, criteria, and guidelines that would ensure the overall reliability of the system.)

So far this plan has worked rather successfully on a voluntary basis. But NERC must now extend its charter to recognize new industry participants, such as merchant plants, energy marketers, and power brokers. More importantly, NERC must address the concern of traditional members that "independence" and market economics will not lead to acceptable system reliability.

To that end, NERC has issued fresh planning and operating standards, very much in line with its own existing standards and those of its regions, fleshing out information and performance reporting requirements. The group has also moved to define compliance requirements for all entities using the bulk power system, including erstwhile non-NERC participants, and has proposed penalties for noncompliance. A pilot program begun this year is testing the compliance processes and evaluating the efficacy of the information reporting and associated sanctions.

Mandating compliance

What NERC will need, of course, is legislated authority to mandate compliance. Of relevance are the findings of the Secretary of Energy's Advisory Board Task Force on Electric System Reliability. The task force has emphasized the need for congressional clarification of exactly what authority FERC has over a self-regulating industry reliability organization, and of possible expansion of its jurisdiction for reliability.

In parallel with that finding is the DeLay-Markey Bill, HR-4432, which would amend the Federal Power Act to grant FERC the jurisdiction it needs over electric reliability organizations, operators, and users of the bulk-power system so as to enforce compliance with U.S. standards developed by NERC. That bill, introduced last year, missed congressional deadlines and is destined to be re-introduced.

FERC's order for open access and the industry's concern for maintaining reliability amid all the chaos are answered in great part by the establishment of regional operating bodies called independent system operators (ISOs). The requirement for independence stems from the need for nondiscrimination and clear separation of system operation from participation in the market. As the name suggests, the ISO "operates" the system and, depending on its terms of reference, has daily responsibility for such matters as processing requests for and scheduling transmission service, managing congestion, ensuring provision of ancillary services, coordinating maintenance, and generally maintaining security. Whether or not an ISO does planning will depend on how its responsibilities are defined and how its efforts are coordinated with transmission owners.

The structure and function of ISOs appear well established. California set the trend by starting ISO operation in 1998, coordinating ISO efforts with a power exchange and defining procedures for bidding line load relief and congestion management. Before it formally became an operating entity, however, the California ISO's trustee had the foresight to investigate and identify "must-run" units to ensure reliability of the network. This study highlighted the crucial nature of generation locations in relieving transmission constraints and providing voltage support, emphasizing the fact that generation is more than just a source of energy.

The ISO's future role in reconciling the facilitation of the marketplace with the maintenance of system security is already taking shape. In looking at the ISOs now functioning throughout the continent--in New England, the Pennsylvania-New Jersey-Maryland area, New York, and Texas--it is interesting to note that the models and the mandates are not the same. New York, in particular, provides for a separate entity, independent of its ISO, to define reliability objectives and measures. Many of these structures can be expected to continue through a transitional period as the industry sorts out its needs.

Distribution issues

At the distribution level, utilities have always tried to maintain and improve reliability, but in a relatively haphazard way. With today's sophisticated analytical tools, reliability levels can be quantified and cost/benefit tradeoffs can be plotted. Some states have even mandated reliability targets and use performance-based rates to enforce adherence to those levels. The difficulty with this approach lies in identifying the cause of unreliability, since faults on the transmission system can propagate into the distribution system.
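
As an illustration of what "quantified" means here, the sketch below computes two indices widely used to characterize distribution reliability, SAIFI and SAIDI, from a handful of invented outage records; it is not tied to any particular state's performance-based rate scheme.

```python
# Illustrative calculation of two common distribution reliability indices.
# The outage records and customer count below are invented.

outages = [
    # (customers interrupted, outage duration in minutes)
    (1200, 45),
    (300, 120),
    (5000, 10),
]
customers_served = 40_000

# SAIFI: average number of sustained interruptions per customer served.
saifi = sum(n for n, _ in outages) / customers_served
# SAIDI: average outage duration per customer served, in minutes.
saidi = sum(n * t for n, t in outages) / customers_served

print(f"SAIFI = {saifi:.3f} interruptions per customer")
print(f"SAIDI = {saidi:.1f} minutes per customer")
```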

A further concern is the disparity among types of distribution customers. A reliable supply is more costly to provide in a rural area than in a town or city. Besides, it is not necessarily appropriate to provide every customer with the same level of reliability.

Still, in high-density areas, it is difficult, if not currently impossible, to have customers each select their own level of reliability on a tariff basis. Under a competitive structure, it will be important to ensure that distribution companies face no disincentive to maintain reliability in rural areas. The Texas Public Utilities Commission, for instance, took steps to reduce the risk of pockets of unreliability by mandating that utilities not let two feeders from the same substation appear on the 10 worst-performing feeders list two years in a row.

The nightmare of inadequacies

With all the activity taking place, it could be imagined that reliability is in good hands. Most of this movement, however, is reactive rather than proactive. While this statement may appear to be contentious under other circumstances, it is perfectly appropriate when applied to the disorder prevailing today as the power industry makes the transition to a market environment.

The philosophy--that competition will introduce low-cost energy and the marketplace will stimulate adequate generation capacity--gains credence from the number of merchant plants being readied for the Northeast. (At last count, about 25 000 MW of new generation capacity has been proposed in New England to serve a peak load of 20 000 MW already being served by existing generation and imports.) There is some comfort to be taken from this phenomenon, at least with respect to the reliability of generation in that area.

From the point of view of security of operations, however, the industry could be headed toward a nightmare if transmission planning is inadequate. The concern stems from the fact that in a market environment, there is less control, and larger uncertainty, in the near-term dispatch and longer-term source and availability of generation.

The characteristics of modern, gas-fired generation favored by entrepreneurs include high efficiency and fast installation, among others. The ability of transmission providers to supply additional capacity, meanwhile, continues to be hampered by siting, licensing, and environmental issues. Where generation planning once led to transmission planning in an achievable time frame, generation expansion is now happening much more rapidly and, more importantly, without the benefit of global planning. The transmission planner is thus faced with a future full of uncertainties and unknowns.

Utilities worldwide are confronting this problem, some more successfully than others. Efforts have been made in Mexico, Central America, and Southeast Asia to identify planning methodologies to deal with it. Eletrobrás in Brazil, for instance, is embarking on the development of planning methodologies for the short, medium, and long term in full knowledge that the restructuring of the power industry will introduce great uncertainty in generator rating and location--to the planners' chagrin. But utilities in the United States, under the pressure of a rapidly changing environment, have not yet collectively faced this prospect.

It is clear that many of the current initiatives for ensuring reliability focus on facilitating transactions on the basis of "available" transmission. Two issues are worth considering. First, an increasing number of transactions does not, in itself, introduce transmission congestion (call it the congestion myth). Second, a system operator has at its disposal only those lines and equipment installed as a result of planning.

As an initial examination of the congestion myth, suppose that on the first day after FERC issued its Order 888, every generating unit in the system were divested (if not already independent) such that each was subject to a bilateral transaction (total trades equal to total demand). At that time, to within a few percent, the power-flow conditions would be identical to those prior to Order 888, despite the existence of thousands of transactions.

Simply stated, neither the generating units nor the load centers would have been relocated. Power flow is a spatial phenomenon. It is the combined location of demand and generation that determines the loading on each transmission line. With the influx of new generation units now plugging into the existing network at random locations, stress is created [Fig. 1].

The top of Fig. 1 shows two load centers being served by two plants so that 500 MW must flow from area A to area B. The assumed transmission limit is 500 MW. Now assume that a new industrial demand of 200 MW locates at B and sets up a contract with a new 400-MW merchant plant. That plant sells 200 MW to the industrial factory and the balance to the rest of the world. If the new plant sites at A, then the transmission flow would increase to 700 MW.

Apparently, the new transaction has resulted in congestion. If, however, the new plant sites at B, the new load will be served and the transmission path will be relieved. The proverbial rocket scientist is out of work here. The transaction does not cause congestion. Rather, it frees up transmission capacity. Clearly, plant siting is the important factor. When ATC calculations are performed as a function of transaction impact, they are merely identifying the impact of plant location.
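
The bookkeeping of the example is simple enough to write down. The sketch below reproduces the Fig. 1 arithmetic for the two siting choices; the flow accounting is deliberately crude (a real study would use a network power-flow model), and the megawatt figures are those of the example.

```python
# Toy version of the Fig. 1 argument: the same 400-MW merchant plant either
# overloads or relieves the A-to-B path, depending solely on where it is sited.
# Numbers are taken from the example; the single-path accounting is a
# deliberate simplification of a real power-flow study.

LIMIT_MW = 500.0
base_flow_a_to_b = 500.0    # flow serving area B before the new plant
new_load_at_b = 200.0       # new industrial demand locating at B
new_plant_mw = 400.0        # merchant plant: 200 MW to the factory, 200 MW exported

def flow_with_new_plant(site):
    """Crude estimate of the A-to-B interface flow for each siting choice."""
    if site == "A":
        # B's extra demand must now also be imported across the interface.
        return base_flow_a_to_b + new_load_at_b
    if site == "B":
        # The plant serves B's new load locally and exports the rest toward A,
        # backing off imports across the interface.
        return base_flow_a_to_b + new_load_at_b - new_plant_mw
    raise ValueError("site must be 'A' or 'B'")

for site in ("A", "B"):
    flow = flow_with_new_plant(site)
    status = "OVERLOAD" if flow > LIMIT_MW else "within limit"
    print(f"Plant sited at {site}: {flow:.0f} MW on the A-to-B path ({status})")
```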

'Planning is everything'

In examining the system operators' problem, it is useful to remember Dwight D. Eisenhower's observation that "plans are nothing; planning is everything." Given that a high-cost energy region can attract merchant plants like bees to honey, transmission providers are attacking the need to accommodate new plant offerings the way they know best: through legitimate application of regional and NERC deterministic planning criteria. The task can be onerous, especially when financial forces conflict with the need for comprehensive evaluations of the impact of each additional plant on the transmission system's capability. The idea is to handle injection at the plants' preferred locations. As Fig. 1 shows, plant location plays the key role, and multiple plant additions require an evaluation of their separate and joint impacts.

What Fig. 2 illustrates is that supply to the two load centers cannot be achieved if generator C is not dispatched. The transmission path limit is again 500 MW. Putting this another way, generators A and B cannot be dispatched together at their full capacities of 400 MW and 300 MW, respectively, without overloading the transmission. To avoid line overloading, generator C could be dispatched instead of generator B. Another of many possible solutions is to require that generator C always be dispatched at a minimum of 200 MW against any combination of A and B totaling no more than 500 MW. This simple example is what operators' nightmares are made of.
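
The same example can be expressed as a feasibility check, as in the sketch below. Total demand of 700 MW and a 300-MW rating for generator C are inferred from the figures in the example; a real security-constrained dispatch would, of course, work from a full network model rather than a single path limit.

```python
# Feasibility check for the Fig. 2 example. Generators A and B sit behind a
# 500-MW path; generator C sits at the load. Total demand of 700 MW and C's
# 300-MW rating are inferred from the example, not stated explicitly in it.

PATH_LIMIT_MW = 500.0
DEMAND_MW = 700.0
CAPACITY_MW = {"A": 400.0, "B": 300.0, "C": 300.0}

def dispatch_is_feasible(dispatch):
    """Check unit limits, supply-demand balance, and the path limit."""
    if any(not 0.0 <= dispatch[g] <= CAPACITY_MW[g] for g in dispatch):
        return False
    if abs(sum(dispatch.values()) - DEMAND_MW) > 1e-6:
        return False
    # Only A and B must cross the constrained path to reach the load centers.
    return dispatch["A"] + dispatch["B"] <= PATH_LIMIT_MW

print(dispatch_is_feasible({"A": 400, "B": 300, "C": 0}))     # False: path overloaded
print(dispatch_is_feasible({"A": 400, "B": 0, "C": 300}))     # True: C instead of B
print(dispatch_is_feasible({"A": 300, "B": 200, "C": 200}))   # True: C carries 200 MW
```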

In the real world of dynamically complex networks with dozens of interdependent units, planners are currently attempting to accommodate all new applicants through an ongoing design of future, robust transmission networks that have the capacity to handle many dispatch combinations.

Under financial and regulatory pressures, however, planners could be forced to shortcut their analyses in order to find any workable dispatch for a plant or group of plants. The number of untenable dispatch conditions could grow rapidly. At the best of times, with the elements put at their disposal by the planners, operators struggle to maintain security on a day-to-day basis. If planning requirements are to fall short of traditional exigencies, the operators' task will become even more arduous.

One example of the impact of open access on operational security occurred on 25 June 1998, when a large area of the U.S. Mid-Continent Area Power Pool (MAPP) region became separated from the rest of the Eastern Interconnection. Although open access was not cited as a direct cause, the report on the disturbance discussed the difficulty system operators face in the deregulated environment when an outage leaves the system unable to withstand a second severe contingency. While this is a problem with a solution, it portends a more complex life for operators who must deal with an expectation of reduced security.
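
The operators' difficulty can be pictured as the kind of screening exercise sketched below: with one element already out of service, ask whether any single further outage would overload a monitored path. The flows, limits, and element names here are invented, and a real assessment would recompute flows from a network model rather than read them from a table; the sketch is not drawn from the MAPP disturbance report itself.

```python
# Illustrative second-contingency screening (invented data, not the MAPP study).
# With a first outage already in place, list the further single outages that
# would overload a monitored path. A real tool would compute post-contingency
# flows from a network model instead of looking them up in a table.

LIMIT_MW = {"path1": 500.0, "path2": 400.0}

# Post-contingency flows, indexed by (element already out, next outage).
POST_FLOW_MW = {
    ("lineX", "lineY"): {"path1": 540.0, "path2": 310.0},
    ("lineX", "unitG"): {"path1": 480.0, "path2": 430.0},
    ("lineX", "lineZ"): {"path1": 450.0, "path2": 380.0},
}

def insecure_second_contingencies(first_outage):
    """Return the second outages that would overload at least one monitored path."""
    problems = []
    for (first, second), flows in POST_FLOW_MW.items():
        if first != first_outage:
            continue
        overloaded = [path for path, flow in flows.items() if flow > LIMIT_MW[path]]
        if overloaded:
            problems.append((second, overloaded))
    return problems

print(insecure_second_contingencies("lineX"))
# -> [('lineY', ['path1']), ('unitG', ['path2'])]
```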

Exacerbating the situation, the well-meaning hand of the "regulator" is becoming somewhat intrusive. In 1992, the National Grid Co. PLC, in Britain, initiated a review of security standards in response to the regulator's concern over the high cost of security constraints. Under the National Grid's current security standards, specific generation may have to be constrained off during maintenance and other outages to ensure system security.

Among the regulator's suggestions was to relax the National Grid's fault criteria to a less stringent level. While arguably in the public interest in terms of the overall cost of energy, this advice is tantamount to trading off security for lower prices. In an excellent and comprehensive review that examined various responses to the regulator's concerns and suggestions, National Grid concluded that a relaxation of the existing standards would markedly decrease reliability.

FERC, too, has showed its hand following a complaint from Champion International Corp., Stamford, Conn., and Bucksport Energy Corp., Bucksport, Me. The complainants charged that "...their access to the New England Power Pool (NEPOOL) Pool Transmission Facilities has been made uncertain and prohibitively costly as a result of delayed placement of Bucksport in the NEPOOL System Impact Study (SIS) transmission request queue and that, under ... existing SIS requirements, complainants will be required to pay for system upgrade costs that may be unnecessary."

FERC found that "NEPOOL's existing SIS procedures are based on unrealistic assumptions, produce unreliable cost estimates and are not otherwise justified....Bucksport and other project applicants who may be similarly situated should be allowed to connect to the NEPOOL [system] without regard to the expansion cost estimates resulting from NEPOOL's existing SIS criteria....and Bucksport's request to use economic redispatch [should be granted] in lieu of paying for [system] upgrades until such time as NEPOOL implements revised SIS procedures."

Of interest is the commission's ruling that the connection is allowed before the cost of necessary reinforcements is evaluated, which translates to "before necessary reinforcements are understood and planned." The ruling demonstrates the manner in which financial forces, not reliability, will drive, and are driving, the structure of transmission networks.

In California, the ISO has resorted to signing short-term (one-year) "reliability must-run" contracts to ensure availability of generation during extreme conditions--in the hopes that a transmission solution will be available at some future date. Who will fund and construct transmission reinforcements--and how--are key questions. Perhaps this is another case where the market concept has leapt forward into reality while a complementary structure for reliability was yet to be defined.

So far, generation additions are running ahead of the planning of transmission enhancements, so the operator's role will no doubt be one of coping with constraint management rather than accommodating bids. Economic redispatch and congestion management will not further the cause of either open access or reduced energy cost to consumers, and, with a mountain of untenable dispatches and constraint procedures to handle, the operator's life will be a crapshoot.

So, what to do?

Several measures are in order:

First, educate. Both FERC and prospective plant owners need to recognize that integration of a power plant into a large-scale electric power system requires a comprehensive and sophisticated analysis of transmission requirements. It is not akin to plugging in the latest videocassette recorder. What's more, without a robust transmission system, there can be neither "open access" nor cost savings to consumers. Unless time is allowed to plan transmission reinforcements and build them (if they can be built), both generation and transmission reliability will suffer. Transmission system security will be eroded by complexity and lack of capacity. And generation reserves, no matter how large, will be inaccessible.

For the future, operator training and enhanced on-line security assessment tools are essential, primarily because transmission capacity is likely to keep losing ground. Two encouraging footholds are that new gas-fired generation can be tendered and installed quite quickly, and that crucial work is already being done on operator training and on enhanced software tools for operations.

Next, improve planning. The NEPOOL situation highlights the complexities of handling many requests for access in a short time frame. More importantly, it emphasizes the need for development of planning methodologies that can handle future uncertainties. This is an area in which NERC, despite its important endeavors, has fallen short. Its move to a self-regulating reliability organization, coupled with the development of (not entirely) new planning standards, has not dealt with the problem of how to plan a network when information on future resources is unavailable or uncertain.

Essentially, NERC has regurgitated rather than reformulated. While conventional deterministic planning criteria and methods have been advocated, enhanced now with more comprehensive data and information reporting for mandated compliance, they fail to meet future needs.

There is also a need to embrace already available planning methodologies capable of handling new complexities. Methods should accommodate uncertainty and risk, statistical analysis should play a greater role, and cost/benefit and customer impact need to be factored in. Some utilities are already embracing these ideas. Furthermore, as market hedges and insurance against failure to deliver or transmit gain prominence, statistical analysis of the system will take on a greater significance.
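
What a more statistical treatment might look like can be suggested with a minimal Monte Carlo sketch: estimating the probability that random generator outages leave a small fleet unable to cover its peak load. The unit ratings and forced-outage rates below are invented, and real planning studies would also model transmission, load shapes, and maintenance schedules.

```python
# Minimal Monte Carlo sketch of a probabilistic adequacy calculation:
# estimate the loss-of-load probability at peak for a small, invented fleet.
# Real studies model far more (transmission, load shapes, maintenance).

import random

UNITS = [  # (capacity in MW, forced-outage rate) -- invented data
    (400, 0.05),
    (300, 0.08),
    (300, 0.08),
    (200, 0.10),
]
PEAK_LOAD_MW = 900.0
TRIALS = 100_000

def loss_of_load_probability(units, load_mw, trials, seed=1):
    """Fraction of trials in which surviving capacity falls short of the load."""
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(trials):
        available = sum(cap for cap, fo_rate in units if rng.random() > fo_rate)
        if available < load_mw:
            shortfalls += 1
    return shortfalls / trials

print(f"Estimated LOLP at peak: {loss_of_load_probability(UNITS, PEAK_LOAD_MW, TRIALS):.4f}")
```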

Among other gaps to be filled, generator-siting techniques should be developed to include the effects of available resources, existing energy costs, available transmission, and environmental acceptability. New plant owners should also be encouraged (presumably financially) to build where their location will enhance the overall system and maximize their own availability.

As an adjunct to that goal, the potential impact of distributed generation should be recognized. There are pluses and minuses. Locating generation close to or at loads will reduce dependence on transmission. But injecting power into a distribution system has always been an exercise fraught with difficulties of protection, stability, and equipment rating. It is not too early to begin formulating new ideas on the structure and topology of distribution systems to exploit these new technologies. The question is far bigger than mere connection-facilities design.

Keep oversight nonintrusive. FERC should continue to look after the public good through firm oversight of procedures and market making, but it should avoid too intrusive a role in the complexities of system design and operation. Similarly, congressional action on reliability should not go beyond oversight and the granting of authority to mandate compliance with reliability standards. Unlike a telephone system, the market is not in the network but in the providers of energy.

Above all, address transmission inadequacies. Open access was predicated on the false belief that the transmission system was a transportation network. In fact, it is a system of conductive paths, largely uncontrolled, which at any time may collapse just from the manner in which it is used. Without an available, robust transmission network, operators wage an endless, losing battle. Special protection schemes, application of FACTS (flexible AC transmission system) devices, and congestion management tools cannot replace capacity. In the desire to find cheaper, more efficient ways to supply power for the lights, the industry may have risked its ability to keep the lights on.

Spectrum editor: William Sweet

About the Author

John D. Mountford is manager of the system planning and operations department at Power Technologies Inc. (PTI), Schenectady, N.Y. During the past 28 years there, he has been responsible for designing major transmission systems throughout South America, Southeast Asia, and the United States. He is now on assignment in Brazil, working with Eletrobrás on the development of planning methodologies for the restructured electric power industry there.

Ricardo R. Austria is manager of PTI's transmission reliability services unit. His recent activities include consulting for utilities and merchant developers to help address reliability issues in the new competitive power market environment. He also leads efforts in developing methods and analytical software to meet the changing needs in power system assessment.

To Probe Further

For a discussion on the ambitions of the North American Electric Reliability Council (NERC) to transform itself into a self-regulating reliability organization with authority to mandate compliance, see "Reliable Power: Renewing the North American Electric Reliability Oversight System," prepared by NERC's Electric Reliability Panel, Princeton, N.J., 22 December 1997.

For a report on the Department of Energy's deliberations about the Federal Energy Regulatory Commission's authority over NERC and over power reliability in general, see "DOE Task Force Emphasizes Necessity of Increased Efforts to Ensure System Reliability," printed in Washington Letter, Edison Electric Institute, 16 October 1998.

A detailed description of the outage that occurred in the mid-continental region (the agrarian Midwest) appears in "Transmission System Open Access Versus System Reliability: A Case History-The MAPP Disturbance of June 25, 1998," presented by Karl N. Mortensen at the winter IEEE Power Engineering Society meeting in New York City, January 1999.

The costs of transmission security constraints are discussed in a British context in "A Review of Transmission Security Standards," The National Grid Co. PLC [London], August 1994. For background on the Bucksport case, see USA FERC Docket No. EL98-69-000.
