By Siddarth Ramesh
Over the last two decades, the electric power grid has made more advances in electric energy technology than the public may realize. This rapidly evolving grid is now experiencing a significant shift toward a market-oriented, consumer-focused distribution of electric power. By collecting real-time data about consumer power use, power plants and distributors will be better able to optimize power efficiency and costs at the wholesale level. Thus, in January 2016, the United States Department of Energy’s (DOE) Advanced Research Projects Agency–Energy (ARPA-E) awarded close to $2 million in research funding to a collaborative team led by UW–Madison’s Christopher DeMarco, professor of power engineering and Wisconsin Energy Institute affiliate. The goal is to develop data sets for large-scale, realistic, open-access network models of the electrical power grid.
The project, called EPIGRIDS (Electric Power Infrastructure & Grid Representations in Interoperable Data Sets), will be led by DeMarco, whose team includes Bernie Lesieutre, professor of electrical and computer engineering, Qunying Huang, assistant professor of geography, and Michael Ferris, professor of computer sciences. The team will also consult with the DOE’s Argonne National Laboratory and private companies such as GE/Alstom and GAMS Development Corporation, and will coordinate efforts with Commonwealth Edison, the largest electric utility in Illinois.
A key piece of the project’s backdrop is the Midcontinent Independent System Operator (MISO). MISO is responsible for operating the electric power grid in the Midwestern region of the U.S., and it oversees the market for electric power at the wholesale level. All of the power-generating companies participating in this market update their prices at five-minute intervals on the MISO network. “In reality they may not change their prices that often, but in theory a generator can have a new price schedule every five minutes of the day,” says DeMarco. MISO’s job is to monitor all of the generating companies offering to provide power, predict the load they have to serve, and set prices accordingly. The resulting prices are plotted geographically on a map. On a fairly quiet day without extreme temperatures, the whole region shows a nearly uniform color-coding, with the darker regions typically priced lower than the lighter ones. “It is a very dynamic situation that changes with supply and demand,” DeMarco says. The computation still relies on a good deal of approximation, which is what makes it possible to recompute prices every five minutes. What matters for the project is that prices change with the availability of the underlying transmission system. “For example, if a transmission line becomes unavailable, suddenly a path that used to be delivering power has to be re-routed, and it is not easy to re-route that power,” DeMarco says.
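The way a transmission limit splits wholesale prices by location can be seen in a toy example. The sketch below is not EPIGRIDS or MISO code, and all generators, costs, and limits are invented: a cheap remote generator serves a load over a capacity-limited line, with a pricier local generator making up the difference, so the local price jumps once the line is full.

```python
# Toy illustration (hypothetical numbers, not EPIGRIDS/MISO software) of how
# a congested transmission line makes prices differ by location.

def dispatch_two_nodes(load_b, cap_cheap, cost_cheap, cost_local, line_limit):
    """Serve a load at node B from a cheap remote generator at node A
    (limited by the A-B line) plus a pricier local generator at B.
    Returns (MW imported from A, MW generated at B, marginal price at B)."""
    # Import as much cheap power as the line (and the cheap unit) allows.
    from_a = min(load_b, cap_cheap, line_limit)
    from_b = load_b - from_a  # local generator covers the remainder
    # The price at B is set by whichever unit would serve one *more* MW there:
    # once the line is at its limit, that extra MW must come from the local unit.
    price_b = cost_local if (from_b > 0 or from_a == line_limit) else cost_cheap
    return from_a, from_b, price_b

# Uncongested: the whole 50 MW load rides the line; node B sees the cheap price.
print(dispatch_two_nodes(50, 100, 20.0, 50.0, 60))  # (50, 0, 20.0)
# Congested: imports cap at 60 MW, so node B's price jumps to the local cost.
print(dispatch_two_nodes(80, 100, 20.0, 50.0, 60))  # (60, 20, 50.0)
```

Real market software solves a full optimization over thousands of nodes and constraints, but the qualitative effect DeMarco describes, a lost or saturated line re-routing power and moving prices, is already visible in this two-node case.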
At the national level, the Federal Energy Regulatory Commission (FERC) works to make these markets more efficient. “The origin for why this project is interesting comes, in part, from the FERC with the goal of making these national-scale wholesale electricity markets more efficient, more flexible, and more accommodating,” DeMarco says. That, in turn, makes the markets better able to accommodate new sources like solar and wind, whose generation outputs vary rapidly.
One focus of the project, according to DeMarco, “is to recognize what it takes to do the computation to figure out the best possible prices and best possible patterns of purchase, which is all predicated on an underlying electrical computation.” The fine details of these computations get complex, and small shifts in the numbers can have an outsized effect on the grid’s power distribution. Another focus of the project is to see how the electric power grid has been shaped by social factors over the years. The whole point of the power grid is to deliver electrical energy to customers. “We basically want to synthetically grow a power grid. This is a rough analogy, but think of it like a SimCity game where we actually grow a power grid,” DeMarco says. Following the same rules that shaped the actual power grid, this new construction will use Geographic Information Systems (GIS) data on population density, industrial and commercial energy consumption patterns, and land use over geographic footprints ranging from the city level up to a continental scale. “We will try to realistically mimic the requirements of the current power grid,” DeMarco says.
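One way to picture the “grow a power grid” idea is as a simple greedy process. The sketch below is only an analogy to the approach DeMarco describes, not the EPIGRIDS method, and the demand-density values are invented: substations are placed where demand density is highest, and each new substation is wired to its nearest already-placed neighbor.

```python
# A minimal sketch (not the EPIGRIDS algorithm) of "growing" a synthetic grid
# from a demand-density map: site substations at the densest cells, then
# connect each new site to the nearest existing one. Data here is invented.
import math

def grow_grid(density, n_substations):
    """density: dict mapping (x, y) grid cell -> demand weight.
    Returns (substation sites, transmission edges)."""
    # Seed at the densest cell, then add cells in decreasing density order.
    ranked = sorted(density, key=density.get, reverse=True)
    sites = [ranked[0]]
    edges = []
    for cell in ranked[1:n_substations]:
        # Attach the new substation to its geographically nearest neighbor.
        nearest = min(sites, key=lambda s: math.dist(s, cell))
        edges.append((nearest, cell))
        sites.append(cell)
    return sites, edges

# Hypothetical 4-cell region where one "city" cell dominates demand.
density = {(0, 0): 9.0, (1, 0): 4.0, (0, 3): 2.5, (5, 5): 1.0}
sites, edges = grow_grid(density, 3)
print(sites)   # [(0, 0), (1, 0), (0, 3)]
print(edges)   # [((0, 0), (1, 0)), ((0, 0), (0, 3))]
```

The real project would layer in land use, consumption patterns, and electrical feasibility rather than this single nearest-neighbor rule, but the spirit is the same: let geography and demand data dictate where the synthetic network grows.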
All of this translates into optimizing this circuit with all of the market conditions as input. “The goal is to push state-of-the-art software both for these markets and other kinds of studies that would benefit from it,” DeMarco says. The team at UW–Madison plans to execute this project over two years at a couple of different scales. One effort will start with the state of Wisconsin, treated as a stand-alone region, and will ultimately grow to encompass the entire eastern and western U.S.
“A big part of this project is that it will provide open-access data,” DeMarco says. Although real-world data exists, current federal regulations severely limit public access to any real-world data that could be used to guide software development or experimentation. “The exact same data that would let you figure out how to best optimize this system could be misused by malicious actors trying to design attacks against our critical electric power infrastructure,” DeMarco says. FERC has increasingly recognized this tension, and professional organizations have created synthetic test systems in the past, but that recent work has not been very successful: the data available for open-environment software experimentation, free of security constraints, lags what the real world needs by several orders of magnitude. “We feel if we keep the number of components that have some association to real-world data at 50 percent or less, we will come up with data sets where it would be pretty close to impossible to use it as guidance to do damage to the real-world system,” DeMarco says.
As new technologies are added to the power grid, DeMarco hopes to provide study tools that effectively address the costs and benefits of new generation sources, commonly known as distributed generation. By producing data sets for modeling purposes, DeMarco hopes to incorporate some of these new technologies and to “really open the door for better system-wide evaluation of the benefits of these new technologies down the road.”
However, there are challenges to collecting accurate data that are tied to factors such as geography and population. For example, severe weather is the number one cause of power outages in the United States, costing the economy between $18 billion and $33 billion every year in lost output and wages, delayed production, and damage to grid infrastructure. Among the most common events of somewhat lower severity are lightning strikes, which can hit transmission towers or other large metal objects. When lightning hits a transmission line, the impact on grid operations is significant, because operators almost always have to briefly de-energize the line to clear the effects of the strike. “When you see lights flicker in a lightning storm, that’s probably what happened,” DeMarco says. Part of what the team wants to model, then, is the reliability impact of lightning, tornadoes, flooding, hurricanes, and other major weather events.
Looking to the future, DeMarco envisions a power grid more tightly integrated across hundreds of kilometers, an all-encompassing system with minimal environmental impact, lower cost, and greater reliability. “We made a lot of advances in how each individual device operates and we are seeing the benefits of that in higher levels of production from those sources,” DeMarco says. To take it to the next level, the question of how the grid can control and coordinate many discrete devices across that large geographic footprint will be the next frontier in more effective and efficient energy technology. Perhaps DeMarco and his team are paving the way toward a smarter, more efficient power grid.