Chicago: Battery Central

At the end of November, the U.S. Department of Energy announced that it had selected Argonne National Laboratory in suburban Chicago to host the Joint Center for Energy Storage Research (JCESR), and bestowed upon it a $120 million grant over 5 years, alongside a $35 million commitment for a new 45,000 square foot facility from the State of Illinois.

As noted in this article in the Chicago Tribune, the goal for the JCESR is to improve battery technologies by a factor of five — five times cheaper, with five times higher performance — within five years.

The JCESR, one of the nation’s newly launched Energy Innovation Hubs, has an impressive list of collaborators.  In addition to Argonne, four other national laboratories – Lawrence Berkeley, Pacific Northwest, Sandia and SLAC National Accelerator – will also conduct research under the JCESR umbrella.  University research partners include Northwestern University, the University of Chicago, the University of Illinois at Chicago, the University of Illinois at Urbana-Champaign, and the University of Michigan.  A long list of the leading venture capital firms active in the cleantech arena – including ARCH Ventures, Khosla Ventures, Kleiner Perkins, Technology Partners and Venrock – will serve on an advisory panel to help focus the research on commercially interesting opportunities.  Corporate titans Applied Materials (NASDAQ: AMAT), Dow Chemical (NYSE: DOW) and Johnson Controls (NYSE: JCI) have loaned their names to the effort.

Whether because the team didn’t want their influence or because they declined to participate, no corporate representatives from the automotive or electricity industries are part of the JCESR constellation.

Especially when paired with the Galvin Center for Electricity Innovation just 30 miles away at the Illinois Institute of Technology, where smart-grid research is a primary focus, the JCESR announcement arguably leapfrogs the Windy City into the top echelon of cleantech technology research clusters, particularly as it relates to electricity management.

Small Hydro Emerging as Viable Sector for Renewable Energy Development

by David Niebauer

With many states adopting renewables portfolio standards (RPS) and the prospect of a federal RPS somewhere on the horizon, more attention is being given to hydroelectric power generation.  Renewable resources such as sun, wind and water are those that can be harvested in a sustainable manner to provide the electric power that our society depends on.  Water (or, more precisely, gravity moving water) has received less attention from project developers than wind and solar.  But that may be changing.

Approximately 18% of the total world energy supply is hydroelectric.  But of course, not all hydro is created equal.  The bulk is large hydro, which employs dams and weirs that disrupt the environment in unalterable ways.  Most large hydroelectric facilities are not considered “renewable” – at least not by environmentalists.  Large man-made reservoirs change habitats forever and are often blights on the natural settings in which they are built.

Small hydro – facilities that generate up to 30 MW – can be developed without harming the environment.  So-called run-of-river facilities are designed to take advantage of flowing water in rivers and streams in such a way as to have minimal impact on fish habitats and natural settings.  Also, many of the dams in the US are not powered.  These facilities, where the environmental impact of the dams cannot be undone, are ripe for small hydro development.  In September 2009, U.S. Energy Secretary Steven Chu said the hydro industry could add 70,000 MW of capacity by installing more efficient turbines at existing dams, increasing the use of pumped-storage projects and encouraging the use of run-of-river turbines.  That capacity is equivalent to 70 nuclear plants or 100 coal-fired plants.
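The plant-equivalence arithmetic is easy to check.  A quick sketch, assuming the typical plant sizes implied by the quoted figures (roughly 1,000 MW for a nuclear unit and 700 MW for a coal plant – assumptions, not numbers from the source):

```python
# Secretary Chu's estimate: 70,000 MW of additional hydro capacity.
additional_hydro_mw = 70_000

# Assumed typical plant sizes implied by the "70 nuclear / 100 coal"
# equivalence (illustrative, not from the article):
nuclear_plant_mw = 1_000
coal_plant_mw = 700

nuclear_equivalents = additional_hydro_mw / nuclear_plant_mw
coal_equivalents = additional_hydro_mw / coal_plant_mw

print(f"~{nuclear_equivalents:.0f} nuclear plants")  # ~70
print(f"~{coal_equivalents:.0f} coal plants")        # ~100
```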

Until recently, the major impediment to the development of small hydro has been regulatory.  There are two major federal agencies responsible for hydroelectric power development – the Federal Energy Regulatory Commission (FERC) and the U.S. Army Corps of Engineers – neither of which is known for its nimble, user-friendly ways.  While wind and solar projects can often avoid federal regulation, relying instead on individual state authority, FERC is responsible for licensing all non-federal hydroelectric projects that touch navigable waterways or affect interstate commerce (i.e., if the system is to be connected to a regional electric transmission grid).  Horror stories abound of FERC applying the same licensing and fee structure to a 500 kW run-of-river system as it would to a 500 MW hydroelectric dam project.  This appears to be changing.

FERC has been investigating ways to simplify the process of obtaining small hydropower licenses and exemptions and, on August 31, 2010, unveiled its Small/Low Impact Hydropower Program Internet site, explaining how developers can quickly and efficiently win FERC approval to build and operate small hydro projects.  The website is part of a FERC plan to expedite small hydro projects.  Another important component is an initiative to enter into memoranda of understanding with state governments to advance FERC exemptions for small hydro projects in those states.  In August 2010, FERC announced a pilot program with the State of Colorado, and has entered into similar MOUs with the states of Washington, Oregon, California and Maine.

Developers appear to be rising to the challenge.  FERC issued 50 preliminary permits to study small sites in 2009, compared to 15 in 2007.  There is money available at both the state and federal level, mostly untapped, in the form of low interest loans, and investors appear to be warming to the sector. An Internet search uncovered at least one developer engaged in a strategy of rolling-up small hydro assets, and undoubtedly more will follow.  A logical approach for a developer would be to acquire a portfolio of revenue-generating assets as a way to demonstrate satisfactory investor returns.  From this base, a developer should be able to build profitable projects at existing unpowered dam sites, and to pursue run-of-river and pumped storage opportunities.

Much attention has been paid to wind turbines and solar PV as ways to harness nature’s abundant energy resources.  Hydroelectric power has often been overlooked due primarily to its scale and the high regulatory hurdles facing developers.  That may be changing in regard to small hydro.  The country has countless unpowered dams that are ripe for development.  This, combined with the prospect of streamlined permitting and exemption processes at FERC for run-of-river and pumped storage facilities, has developers exploring ways to advance small hydro in the service of the nation’s renewable energy goals.

David Niebauer is a corporate and transaction attorney, located in San Francisco, whose practice is focused on financing transactions, M&A and cleantech.



“Cost Causer Pays” or Where is the Incentive for T&D Grid Upgrade?

by David Niebauer

In representing a utility-scale solar developer client recently, I was surprised to learn (naively, I now realize) that the general rule for transmission upgrades is “cost causer pays”.  What that means for my developer client is that, regardless of how desirable the project, the developer will have to pay the full cost of upgrades to the grid network to bring the generation online.  This is the case even though most of the positive effects of the upgrades will benefit the utility and electricity consumers in general, and even competitors that will be able to piggyback on the investment.

This has led me to ask the question in the title of this article:  who has the incentive to invest in upgrades to the nation’s electricity transmission and distribution system?

It is common knowledge to anyone working in Cleantech that the transmission grid requires extensive upgrades.  These upgrades are required in order to allow more renewable resources to be brought online, and they are necessary for modernization and expansion.   The grid was built a long time ago and infrastructure investment in the area has lagged for decades.  The most recent (and reliable) estimate that I have seen anticipates that $165 billion will be deployed over the next 20 years upgrading and expanding the grid.

Deregulation has forced utilities to cede control of transmission assets to Regional Transmission Organizations (RTOs) and Independent System Operators (ISOs) in order to open the transmission grid to all participants.  Under the current regulations, RTOs and ISOs, being non-profit entities, have neither the incentive nor the ability to acquire existing transmission assets or develop new ones.  Some observers believe that independent for-profit transmission companies will emerge, with regulatory and financial incentives that will permit a roll-up of transmission assets into stand-alone businesses.  Should such a structure emerge, the right incentives for grid upgrades might exist, but it is only one of a number of possible solutions, and only time will tell whether it materializes.  In the meantime, ISOs/RTOs are unlikely candidates to spend money on transmission upgrades.

Ultimately, of course, we will all pay through higher electric utility bills.  David J. Leeds of Greentech Media makes the case that utilities will drive investment in T&D upgrades.

“When you consider that the U.S. electric utility sector, with it’s annual revenues of roughly $300 billion, is 30 percent larger than the automobile industry and twice as large as the telecommunications industry, and then bring to mind the craze of dotcom investments and telecom M&A which occurred in the mid to late 1990s, a reasonable picture starts to emerge of what can be expected of in terms of Smart Grid investments and M&A in the next five to 10 years. Many of the senior level employees working for privately held companies in Smart Grid, have backgrounds working in either telecom or IT.”

From a macro perspective, I am sure this is true.  However, given the difficulty that utilities have in passing costs on to ratepayers, the build-out will almost certainly go slower than most observers would like.  The so-called “SmartGrid City” being built out by Xcel Energy in Boulder, Colorado is a case in point.  Xcel has been allowed to pass on to ratepayers $45 million of the estimated $100 million cost of that project, and the good citizens of Boulder are not happy about it.  No doubt this will be read as a cautionary tale for other utilities with plans to move forward on their own with T&D upgrades.

The Federal government will be able to stimulate some of the upgrades through grants and tax incentives, but its impact is both jurisdictionally and fiscally limited.  While FERC regulates wholesale prices, it has no authority to mandate the construction of new transmission lines – these decisions are all made at the state level.  But the grid is a network of interconnected transmission lines that of necessity cross state and regional borders.  Without a central planning authority, development occurs in a piecemeal and halting fashion.

The American Recovery and Reinvestment Act of 2009 (ARRA) is providing about $4 billion in Smart Grid stimulus funding, but given the enormity of the required work, this is really a drop in the bucket.  Yes, we desperately need a national energy policy that would include construction and upgrade of regional transmission lines.  But given the legacy of the transmission grid and the desire of state and local governments to have control over energy costs, I have a hard time seeing how coordinated activity can occur.  Add on top of this the debacle of deregulation and you can begin to see the quagmire we are in.

State governments have big plans for bringing large amounts of renewable energy online.  The Texas CREZ (Competitive Renewable Energy Zone) is a $5B plan to move 18 GW of wind power from west Texas and the panhandle to the major load centers in east Texas over 2,300 miles of new 345 kV transmission.  Search “Intl_ROW_012710.pdf” for more information.  In California and the west, the Western Governors’ Association has developed the Western Renewable Energy Zones (WREZ) initiative to bring wind, solar and geothermal power into the western load centers.  The WREZ initiative seeks to develop 30 GW of clean energy by 2015.  This initiative calls for the construction of significant new interstate transmission lines.

The CREZ will be paid for by ratepayers, but the WREZ has no funding for its ambitious plans.

To highlight the problem, the WREZ initiative states:

“In order to plan and support the permitting and construction of new transmission lines, there must, at a minimum, be close coordination among resource planners, transmission providers, sub-regional and interconnection-wide transmission planners, transmission developers, federal land use agencies, renewable developers, state, provincial and federal regulators, and environmental organizations.”

With benefits to be derived by 11 US states, 2 Canadian provinces and some areas of Mexico, how do the costs get allocated?

The Brattle Group has done a study on the cost allocation and recovery approaches to transmission grid upgrades.  They explore a number of the methodologies being used and being developed.  They document the complexity of current cost allocation approaches.  While some single state approaches appear to be working, regional transmission upgrades, which are by far the most important to the national grid, are more difficult.  The final takeaway from the report:  “Despite years of effort, cost allocation remains the number one barrier for multi-state, multi-utility transmission projects.”

Obviously, “cost causer pays” is not going to get the job done.  We need a national energy policy with a strong transmission and distribution grid upgrade component.  The task is complicated by overlapping and sometimes competing federal and state objectives, but failing to act is simply not an option.  Both financial and policy incentives must be made clear for stakeholders so that the greenpower superhighway that many envision can become a reality.

David Niebauer is a corporate and transaction attorney, located in San Francisco, whose practice is focused on clean energy and environmental technologies.

Power Flow Control Devices: Hardware for the Smart Grid

by David Niebauer

A significant amount of attention (and money) is directed at the communications and IT upgrades necessary to empower a Smart Grid.  The very concept of a more intelligent power transmission system implies a vast increase in data.  The more that can be “known” about conditions of the system, the better and more efficiently the system will operate.  This is the world of software, of information gathering, of machines “talking” to machines.

But there is the other side of the equation, a side that is not as often discussed in the media and research reports on the Smart Grid: hardware.  One improvement to the electricity transmission infrastructure that unquestionably falls in the category of hardware is the construction of new transmission lines.  The limitations on building new transmission lines, however, become apparent as soon as one starts to think about them.  Not only do zoning, environmental and NIMBY (Not In My Backyard) concerns add delay and uncertainty to the construction of new lines, the cost is truly exorbitant.  Estimates from the National Council on Electricity Policy in a 2004 report (Electricity Transmission: A Primer) peg the cost at between $285,000 and $1.71 million per mile, depending on terrain and line type.

Power Flow Control

Another (far less expensive) hardware solution for the Smart Grid is a class of devices designed to control the flow of electrons.  Power flow control devices increase the capacity of the overall system without the need to construct new transmission lines.  These devices are an integral component of the Smart Grid and utilize a variety of strategies to modulate the flow of power and thereby increase the efficiency of the power grid.

To understand how power flow control devices work, one must first understand the “meshed” nature of the Grid.  Historically, the power grid was built in a radial structure.  That is, transmission and distribution lines were constructed to directly connect the generating facility with the ultimate load centers.  This is the simplest structure and provides the most control: essentially one on/off switch is all that is needed.  However, it is also highly unreliable – a fault along any part of the line can cause the entire downstream system to collapse.  System operators have begun to address this problem by constructing transmission and distribution networks in a meshed structure.  A meshed system is more reliable because congestion and faults can be isolated in discrete segments of the mesh without affecting other segments.  See D. Divan, H. Johal, “A Smarter Grid for Improving System Reliability and Asset Utilization,” in Proc. IEEE Power Electronics and Motion Control Conference, 2006.

Power Flow Control Increases Capacity

The increased reliability of meshed networks is obtained at the cost of capacity underutilization and inefficiency.  In a radial structure, transmission and distribution lines are kept at or near capacity. In a meshed system, capacity is limited by the lowest-capacity segment. Electricity always follows a “path of least resistance” (lowest impedance), so the first line to reach its thermal capacity limits the capacity of the entire system, even though a majority of the lines of the system are significantly below their limit.  It is estimated that US grid capacity utilization rates are typically only 45% to 60% of theoretical capacity.
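The limiting effect of the lowest-capacity segment can be sketched with a simple current-divider model.  This is an illustrative DC approximation with made-up impedances and thermal limits (none of these numbers come from the article): flow divides across parallel paths in inverse proportion to impedance, so the first line to hit its thermal limit caps the transfer of the whole system.

```python
# Two parallel paths between the same two buses (DC approximation).
# Impedances in per-unit, thermal limits in MW -- illustrative values only.
lines = [
    {"impedance": 0.10, "limit_mw": 100.0},
    {"impedance": 0.20, "limit_mw": 100.0},
]

# Flow divides in inverse proportion to impedance ("path of least resistance").
total_admittance = sum(1.0 / ln["impedance"] for ln in lines)
for ln in lines:
    ln["share"] = (1.0 / ln["impedance"]) / total_admittance

# The first line to reach its thermal limit caps the total transfer.
max_transfer_mw = min(ln["limit_mw"] / ln["share"] for ln in lines)

installed_mw = sum(ln["limit_mw"] for ln in lines)
utilization = max_transfer_mw / installed_mw

print(f"max transfer: {max_transfer_mw:.0f} MW")  # 150 MW
print(f"utilization: {utilization:.0%}")          # 75%
```

Here the low-impedance line carries two-thirds of the flow and saturates first, stranding a quarter of the installed capacity – the same underutilization effect the 45% to 60% estimate describes at grid scale.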

Power flow control devices steer the current in a line in order to balance the loading on all the lines, allowing the overall system to operate at its theoretical maximum capacity.  Visualizing electricity flowing like water through a hose helps in understanding how this works.  Power flow control devices are like valves on the water hose.  One important difference: electrical energy propagates at nearly the speed of light, so controlling the flow requires highly sophisticated solutions.

FACTS devices

A number of solutions have been proposed and are being deployed by utilities and transmission operators.  The most common are power-electronics-based Flexible AC Transmission Systems (FACTS) devices.  FACTS devices work by either controlling the voltage or modifying the impedance of transmission lines, thereby controlling the power flow.  For under-utilized lines, increasing voltage allows additional current to be pulled into the line; in congested areas, increasing line impedance pushes excess current onto other parallel paths.  The combined effect is an increase in system capacity and line utilization.
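The impedance lever can be illustrated with the textbook power-transfer relation for a lossless line, P = (V1·V2/X)·sin(δ).  A minimal sketch with illustrative per-unit values (voltages, angle and the 30% compensation figure are assumptions, not figures from the article): lowering a line's effective reactance pulls proportionally more flow onto it.

```python
import math

def line_flow_mw(v1_pu, v2_pu, reactance_pu, angle_deg, base_mva=100.0):
    """Active power over a lossless line: P = V1 * V2 * sin(delta) / X (per-unit)."""
    p_pu = v1_pu * v2_pu * math.sin(math.radians(angle_deg)) / reactance_pu
    return p_pu * base_mva

# Illustrative values: 1.0 p.u. voltages, 10-degree angle, 100 MVA base.
before = line_flow_mw(1.0, 1.0, reactance_pu=0.50, angle_deg=10)

# Assume a series-compensation FACTS device cuts effective reactance by 30%.
after = line_flow_mw(1.0, 1.0, reactance_pu=0.50 * 0.70, angle_deg=10)

print(f"flow before: {before:.1f} MW")
print(f"flow after:  {after:.1f} MW")  # 1/0.7, i.e. ~43% more flow
```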

FACTS devices are expensive, but not when compared to the cost of constructing new power lines.  FACTS devices are priced based on the increase in capacity of existing lines, and generally range from $150 to $300 per kVA.  New distributed FACTS devices are being developed for significantly less and may soon be deployed.
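A rough back-of-the-envelope comparison using the cost figures quoted in these articles (the 100 MVA capacity increase and the 100-mile line length are illustrative assumptions, not from the source):

```python
# Cost ranges quoted above (USD, low/high).
line_cost_per_mile = (285_000, 1_710_000)  # 2004 primer, per mile of new line
facts_cost_per_kva = (150, 300)            # FACTS pricing, per kVA of added capacity

# Assumed scenario: add 100 MVA of transfer capability.
added_kva = 100_000    # 100 MVA expressed in kVA
new_line_miles = 100   # assumed length of an equivalent new line

facts_cost = tuple(c * added_kva for c in facts_cost_per_kva)
line_cost = tuple(c * new_line_miles for c in line_cost_per_mile)

print(f"FACTS:    ${facts_cost[0]/1e6:.0f}M - ${facts_cost[1]/1e6:.0f}M")
print(f"New line: ${line_cost[0]/1e6:.1f}M - ${line_cost[1]/1e6:.1f}M")
```

Under these assumptions the FACTS route comes in at $15M to $30M against $28.5M to $171M for new construction – and the hardware avoids the siting, zoning and NIMBY delays entirely.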


Software is not the only Smart Grid play.  Developments in power flow control – hardware wedded to power electronics – promise to increase the capacity of the existing electric transmission grid, allowing the system to operate more efficiently at lower infrastructure cost.  Controlling the flow of electrons to improve the existing system can be done – and is being done.  As the Smart Grid is built out, watch for the companies that design and build the hardware that all the software is being designed to control.

David Niebauer is a corporate and transaction attorney, located in San Francisco, whose practice is focused on clean energy and environmental technologies.

The “Smart Grid”: An Overview

By David Niebauer

The electricity transmission and distribution grid in North America is awe-inspiring.  Often called the “world’s largest machine”, the Grid connects huge power generating facilities with end users (both residential and commercial) in a system that would have been considered magic only 150 years ago.

The big news of the 21st Century is that the Machine is getting a significant upgrade.  The electricity grid was designed to distribute power, and power only, in one direction:  from generation to end-user.  This system worked fine when electricity was a novel resource and relatively abundant, but it is rife with waste and inefficiency. Because there is no practical way to store electricity, the Grid was built with a capacity to meet the absolute peak demand.  And because utilities are paid to sell electricity, they have historically had little incentive to find ways to conserve.  Today, the accumulated hit to environmental quality caused by this inefficiency, together with the cost of constructing and operating generation assets, has reached the limits of what is tolerable.

The new “Smart Grid” is being designed to allow information flows, as well as energy, to reach all parts of the system.  The information available to system operators at present is limited.  When and how the power is used, where congestion might occur, how usage might be curtailed at critical times  – the system is essentially blind to these and many other important data points.  A more intelligent system, enhanced by developments in telecommunications and information technology, will allow the system to operate more efficiently, with corresponding benefits to society.

It is estimated that electricity transmission infrastructure investment will exceed $600 billion by 2020.  In addition to spending by utilities, the venture capital community investment in the space is accelerating, and large U.S. companies such as Microsoft, Google and Oracle are beginning to stake claims.  The vision is of a more sentient Power Machine that organizes the flow of energy and information through all of its limbs for the benefit of all who touch it.  There is even a nascent movement calling for the interconnection of a global power grid.  Essentially, the Smart Grid will allow utilities to proactively manage demand, re-route power around disturbances, integrate distributed renewables and electric transportation and continue to offer reliable and affordable electricity into the foreseeable future.

Following the excellent work of David J. Leeds of Greentech Media in his report The Smart Grid in 2010: Market Segments, Applications and Industry Players we will divide this discussion into four segments:  Advanced Metering Infrastructure (AMI), Demand Response (DR), Grid Optimization and Energy Storage.

Advanced Metering Infrastructure (AMI)

Advanced Metering Infrastructure, as its name suggests, is focused on the meter – that is, on the point of consumption.  AMI deployment is replacing mechanical meters with digital meters that allow for two-way communication.  By receiving information as well as energy, the consumer is empowered to shift consumption away from peak-demand periods when prices are high and system reliability is low.  Utilities are also able to collect usage data that can be used to improve efficiency and reduce waste.

The Obama administration famously called for the installation of 40 million Smart Meters in US homes and businesses by 2015 and has backed up this pledge with funding from the American Recovery and Reinvestment Act.  AMI has received the lion’s share of venture investment to date and leads Smart Grid deployment.

“AMI can best be seen as a transformative application since the AMI/FAN [Field Area Network] communication network necessary to run advanced metering applications can also be used to transport data for all kinds of other emerging Smart Grid applications.” Leeds, p.7.

Demand Response (DR)

Because electricity must be used when generated, providing sufficient power for “peak” demand periods is an ongoing problem for utilities.  The problem has been traditionally addressed with so-called “peaker plants” that are brought on-line only when needed – when demand is expected to spike, such as during a hot summer afternoon when air conditioners are sucking energy to keep things cool.  Peaker plants are generally old, inefficient, expensive and dirty to operate.  Demand Response is an alternative solution that is enabled by the Smart Grid.

DR allows a customer to reduce its use of energy during these peak periods, lowering cost for the consumer and allowing the utility to re-route the electricity where it is needed – without having to rely on starting up its peakers.  DR is cheaper, faster, cleaner and more reliable.

To date, most DR solutions have been deployed by large commercial energy users.  But with the widespread integration of Smart Meters, the practice can now begin to be rolled out for residential consumers as well.

DR is implemented by third party aggregators who enter into contracts with consumers that allow the aggregator to reduce the consumers’ energy usage during peak hours (using thermostats and intelligent grid-aware devices).  The aggregated “virtual peak power” is then sold to the utility.
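The aggregator's arithmetic can be sketched simply.  Assuming hypothetical per-customer curtailment contracts (the names and figures below are illustrative, not from the article), the “virtual peak power” offered to the utility is the sum of the contracted reductions:

```python
# Hypothetical curtailment contracts, in kW, between a DR aggregator
# and its customers (illustrative names and values only).
contracted_curtailment_kw = {
    "office_tower": 400.0,
    "cold_storage_warehouse": 250.0,
    "big_box_retailer": 150.0,
    "residential_thermostat_pool": 200.0,  # many small homes, aggregated
}

# The "virtual peak power" the aggregator can sell to the utility --
# capacity freed up without starting a single peaker plant.
virtual_peak_kw = sum(contracted_curtailment_kw.values())
virtual_peak_mw = virtual_peak_kw / 1000.0

print(f"virtual peak power: {virtual_peak_mw:.1f} MW")  # 1.0 MW
```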

Grid Optimization

Grid Optimization is all about making the distribution network more efficient through the use of information management and system controls.  Rather than focusing on changing consumer behavior, which is essentially the goal of AMI, Grid Optimization enables utilities to clean up their side of the street – distribution from the substation to the point of use.

There is a wealth of devices and technologies contributing to Grid Optimization, and more will be developed as the Smart Grid is built out.  Some of the many benefits include monitoring grid assets, decreasing faults and outages, rerouting power to maximize efficiency, minimizing congestion, determining when to bring renewables online and generally allowing proactive management of generation and distribution assets.  (Leeds, pp. 60-61.)  Leeds anticipates that Grid Optimization and its cousin, Distribution Automation, will be the fastest growing market segment over the next five years.

Energy Storage

Anyone working in the renewables field (solar, wind, etc.) can immediately see that a breakthrough in energy storage would revolutionize the industry.  Renewables are referred to as “intermittent” resources because they generate only some of the time – when the sun shines or the wind blows.  If only we had an economical way to store electrons, renewable energy could begin to supply base load, and that would change the game forever.

But this is going to require a true technological breakthrough.  The available options at present are woefully inadequate.  Energy storage options such as pumped hydro, compressed air, thermal storage and flywheels provide the best solutions, but even they have severe limitations (cost, scalability, geography, etc.).  Electricity storage – batteries (lead-acid, sodium-sulfur, lithium-ion, etc.) and supercapacitors – is worse: expensive and inefficient.

What is needed is a distributed storage solution allowing energy to be stored at the point of use and relayed through Smart Grid management when and where it is needed.  Energy storage is getting the attention of investors and major players (such as GE and AEP), but clearly more can and needs to be done.


The Age of the Smart Grid is upon us.  Huge amounts of capital are being, and will be, deployed over the next decade and beyond in upgrading the nation’s power grid.  Both the political and the financial will appear to be behind Smart Grid deployment.  Fortunes will be made in this arena, and our lives will all be changed for the better through the intelligent delivery of more efficient and cleaner energy.

David Niebauer is a corporate and transaction attorney, located in San Francisco, whose practice is focused on clean energy and environmental technologies.