Energy Storage Drivers Play a Big Role in an Evolving Grid

Few industry observers dispute the critical role of energy storage in the scaling of variable renewable generation. The grid is rapidly evolving toward wind and solar, an inevitability driven by the fundamental economics of the two technologies. As solar and wind achieve higher levels of grid penetration, however, the intermittency of these resources can limit their effectiveness. That’s where storage comes into play. With costs falling rapidly, an inflection point is approaching at which the economics of storage-plus-solar will increasingly win out over conventional generation sources. Bloomberg New Energy Finance analysts stated recently that “falling costs will push the price of [U.S.] solar plus storage projects below gas plants without any state subsidies in the next few years.”

That’s a heady prediction: the price of large-scale storage paired with solar will drop enough by the early 2020s, less than a decade from now, to outcompete natural gas-fired power plants on a cost-per-kilowatt-hour basis. Take a look at the graph from Bloomberg New Energy Finance’s latest report on co-located storage in the Southwest U.S., which shows that the federal Investment Tax Credit (ITC) is a serious accelerator.

Estimated Levelized Cost of Energy for Southwest Projects
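
To make that cost-per-kilowatt-hour comparison concrete, here is a minimal back-of-envelope LCOE sketch. The capex, O&M, fuel, and capacity-factor inputs are hypothetical placeholders chosen for illustration (they are not BNEF’s figures); the arithmetic simply shows why falling storage costs and the ITC shift the comparison against gas.

```python
# Back-of-envelope LCOE comparison. All inputs are hypothetical placeholders,
# not BNEF data; only the structure of the calculation is the point.

def annualize(capex_per_kw, rate, years):
    """Spread overnight capex ($/kW) into an annual payment ($/kW-yr)."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)  # capital recovery factor
    return capex_per_kw * crf

def lcoe(capex_per_kw, fom_per_kw_yr, fuel_per_mwh, cap_factor, rate=0.07, years=25):
    """Levelized cost of energy in $/MWh."""
    annual_mwh_per_kw = 8760 * cap_factor / 1000.0  # MWh delivered per kW installed
    annual_cost = annualize(capex_per_kw, rate, years) + fom_per_kw_yr
    return annual_cost / annual_mwh_per_kw + fuel_per_mwh

# Hypothetical plants: solar-plus-storage with and without the 30% ITC vs. a gas plant.
solar_storage          = lcoe(capex_per_kw=1800,        fom_per_kw_yr=25, fuel_per_mwh=0,  cap_factor=0.33)
solar_storage_with_itc = lcoe(capex_per_kw=1800 * 0.70, fom_per_kw_yr=25, fuel_per_mwh=0,  cap_factor=0.33)
gas_plant              = lcoe(capex_per_kw=1000,        fom_per_kw_yr=15, fuel_per_mwh=30, cap_factor=0.55)

print(f"solar + storage (no ITC):  ~${solar_storage:.0f}/MWh")
print(f"solar + storage (30% ITC): ~${solar_storage_with_itc:.0f}/MWh")
print(f"gas plant:                 ~${gas_plant:.0f}/MWh")
```

With these illustrative inputs, the ITC alone moves solar-plus-storage from above the gas plant’s cost to below it, which is exactly the accelerator effect the graph highlights.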

For those pursuing the opportunity offered by the looming energy storage boom, it’s important to have a crisp rationale for your product solution or development pipeline and to take note of the drivers at play in this period of grid evolution. Those drivers include cost, policy mandates, traditional power plant retirements, resource adequacy, and emerging markets, as well as the climate imperative of attaining 100% carbon-free energy. From where I sit, long-duration storage will enable a carbon-free grid that is both stable and affordable.

Right now, state mandates are driving most storage projects. These mandates account for hundreds of megawatts of storage systems (many of them stand-alone) taking shape in 2018 in states like New York, New Jersey, California, and Massachusetts. The graphic below from GTM Research shows several examples of state-level front-of-the-meter storage policies.

Source: GTM Research / ESA / U.S. Energy Storage Monitor Q3 2018

However, many mandate-driven efforts lack a clear purpose and use case, disregarding opportunities to optimize the technology and the grid itself. Melissa Johns, Duke Energy’s VP of business development, noted in a recent GTM webinar: “Our challenge right now is identifying where storage can bring the most value to the grid and our customers.”

Some utility commissions appear to be using completely generic tenders to encourage investor-owned utilities (IOUs) to squeeze storage into their generation mix. Preference is also often given to specific battery technologies, precluding opportunities to optimize system performance through long-duration storage and DC-coupled architectures. To move this forward, incentives are needed that reward dispatchable PV, specifically during evening peak-demand (off-sun) hours.

Of course, there are exceptions, like the Oakland Clean Energy Initiative, which seeks to use storage to defer transmission upgrades. With a definite purpose for storage laid out, the bid-winning projects are far more likely to be properly scoped, built, and optimized, creating long-term value and saving ratepayers money.

The cost of materials, especially lithium and cobalt, plays an important role in driving the market. To unlock the market, Li-ion battery manufacturers are designing cobalt out of their products. When battery prices fall far enough and the supply of competing battery cell technologies stabilizes, the inflection point at which storage competes on its own economics will be within reach. Rapidly increasing battery cell manufacturing capacity serving stationary storage, electric vehicles, and computers is paving the way for this cost reduction, ultimately leading to a more commoditized market. In the meantime, those trying to price their storage developments are watching commodity costs closely; some will pull the trigger on bids if commodity prices reach a certain level, while others continue to keep their powder dry.

Beyond battery cost reductions, system optimization at the battery and inverter level is another key to unleashing the market. One way to optimize the system is to implement DC-coupled architectures, which improve production efficiency while decreasing costs. For now, system-level innovation lies dormant, waiting for a well-integrated class of PV-plus-storage solutions to emerge.
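
As one concrete illustration of where that production-efficiency gain can come from, the short sketch below models inverter clipping on a high DC/AC-ratio array. The array size, inverter rating, battery capacity, and efficiency figures are assumptions chosen for illustration, not measurements from any particular project; the point is simply that energy clipped at the inverter can still charge a battery sitting on the DC bus, whereas an AC-coupled battery never sees it.

```python
# Simplified, assumed numbers: estimate how much clipped PV energy a DC-coupled
# battery could recapture on an idealized clear-sky day.

import numpy as np

hours = np.arange(24)
# Idealized DC output of a 130 MW-dc array (bell-shaped daytime curve).
pv_dc = 130 * np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None)

inverter_ac_limit = 100  # MW-ac rating, i.e. a DC/AC ratio of 1.3

delivered = np.minimum(pv_dc, inverter_ac_limit)  # what the inverter can export
clipped = pv_dc - delivered                       # energy lost in an AC-coupled design

battery_capacity_mwh = 200
dc_charge_efficiency = 0.97  # assumed one-way DC-side charging efficiency
recaptured = min(clipped.sum() * dc_charge_efficiency, battery_capacity_mwh)

print(f"PV energy clipped by the inverter:     {clipped.sum():.0f} MWh/day")
print(f"Recaptured by a DC-coupled battery:    {recaptured:.0f} MWh/day")
```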

Another driver shaping the new storage market stems from the need to retire older generation assets while maintaining resource adequacy. In this case, it’s important to remember that replacement solutions cannot increase rates for users: resource adequacy requires grid operators to keep serving end-consumers, and regulators expect that to happen within a pricing framework similar to today’s. Old gensets cannot be road-mapped away just because smart storage-plus-solar offers the most sustainable choice. Contractual and financial risks (including the stranded-asset problem) are at play in the regulatory sausage-making, so the case for resource-adequacy-driven storage must be made economically too.

In addition, the new storage and renewables technologies offer excellent solutions for replacing dirty and crumbling infrastructure; however, in states like California, utilities must maintain enough capacity to cover forecast load plus a roughly 10% margin during a peak event. Storage will play a major role in the resource adequacy studies used to plan for such peaks.
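
As a quick illustration of that capacity math, the sketch below computes the requirement for a hypothetical utility. The peak-load forecast, the 10% margin, and the storage qualifying-capacity factor are all illustrative assumptions; actual resource adequacy counting rules vary by jurisdiction.

```python
# Illustrative resource-adequacy arithmetic with assumed numbers:
# procured capacity must cover forecast peak load plus a ~10% margin,
# and storage counts toward the requirement at a regulator-set qualifying value.

forecast_peak_mw = 2_000   # hypothetical utility peak-load forecast
planning_margin = 0.10     # the ~10% extra capacity referenced above

required_capacity_mw = forecast_peak_mw * (1 + planning_margin)

storage_nameplate_mw = 300
storage_qualifying_factor = 1.0  # assumption; real qualifying-capacity rules differ
storage_ra_credit_mw = storage_nameplate_mw * storage_qualifying_factor

remaining_need_mw = required_capacity_mw - storage_ra_credit_mw
print(f"Required RA capacity: {required_capacity_mw:.0f} MW")
print(f"Met by storage:       {storage_ra_credit_mw:.0f} MW")
print(f"Remaining need:       {remaining_need_mw:.0f} MW")
```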

What might meeting 110% of forecast load look like in a 100% renewable, carbon-free future? If we consider the goals set forth in legislation like California’s SB 100, or what’s required to hit the climate mitigation targets needed to turn the tide on global warming, the answer is mind-blowing.

Governor Jerry Brown signing SB 100, which effectively establishes the massive goal of 100% clean energy in California.

A National Renewable Energy Laboratory (NREL) study on integrating high levels of variable renewable energy (VRE) into the electric power system estimates that to reach 30% VRE-sourced electricity, 60% of generation capacity must be renewable, with 70-80 GW of 8- to 16-hour storage. At 50% VRE electricity, the amount of long-duration storage needed jumps to about 120 GW, according to the NREL model. Solutions like the NX Flow™ vanadium flow battery system will need to be deployed to avoid rising curtailment and to provide long-duration energy shifting.

NX Flow integrated system deployed in Asia
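
To put those figures in perspective, the rough arithmetic below translates the study’s power numbers into energy capacity by multiplying gigawatts of storage by hours of duration. This is a sketch based only on the figures cited above, not additional data from the NREL report.

```python
# Rough conversion of the cited storage power figures into energy capacity:
# GW of long-duration storage x hours of duration = GWh that must be built.

scenarios = {
    # VRE share: (storage power in GW, assumed duration range in hours)
    "30% VRE": (75, (8, 16)),   # midpoint of the 70-80 GW range cited above
    "50% VRE": (120, (8, 16)),
}

for label, (power_gw, (d_lo, d_hi)) in scenarios.items():
    print(f"{label}: {power_gw} GW of storage "
          f"=> roughly {power_gw * d_lo:,}-{power_gw * d_hi:,} GWh of energy capacity")
```

Even at the low end, that is hundreds of gigawatt-hours of long-duration storage, which is why curtailment and energy shifting dominate the planning conversation at high VRE shares.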

The roadmap for attaining 100% renewables doesn’t just require generation and storage technologies that can provide power around the clock. Advanced grid-level controls are needed to manage the various assets and integrate high penetrations of solar, wind, and other renewables while maintaining grid stability and reliability. System architecture becomes increasingly important at this stage, with power controls and transmission optimization adding another layer of complexity. Some challenges, such as seasonal energy shifting driven by major seasonal differences in production from each asset type, require serious attention if we are to reach our goals. Other than pumped hydro, no storage technology exists today to serve this use case, and alternative solutions such as compressed air have yet to be successfully commercialized.

While these momentous challenges may not be near-term market drivers for storage today, they offer opportunities for visionary companies and organizations to develop the tools needed and to create future drivers for growth in the smart energy sector of tomorrow.

To discuss market drivers for storage and more, come and speak with the Nextracker storage team at Energy Storage North America Booth 1920 next week in Pasadena.