Tag Archives: energy

Risk Dashboard Could Help the Power Grid Manage Renewables

Post Syndicated from Jeremy Hsu original https://spectrum.ieee.org/energywise/energy/the-smarter-grid/risk-dashboard-could-help-the-power-grid-manage-renewables

To fully embrace wind and solar power, grid operators need to be able to predict and manage the variability that comes from changes in the wind or from clouds dimming sunlight.

One solution may come from a $2-million project backed by the U.S. Department of Energy that aims to develop a risk dashboard for handling more complex power grid scenarios.

Grid operators now use dashboards that report the current status of the power grid and show the impacts of large disturbances—such as storms and other weather contingencies—along with regional constraints in flow and generation. The new dashboard being developed by Columbia University researchers and funded by the Advanced Research Projects Agency–Energy (ARPA-E) would improve upon existing dashboards by modeling more complex factors. This could help the grid better incorporate both renewable power sources and demand response programs that encourage consumers to use less electricity during peak periods.

“[Y]ou have to operate the grid in a way that is looking forward in time and that accepts that there will be variability—you have to start talking about what people in finance would call risk,” says Daniel Bienstock, professor of industrial engineering and operations research, and professor of applied physics and applied mathematics at Columbia University.

The new dashboard would not necessarily help grid operators prepare for catastrophic black swan events that might happen only once in 100 years. Instead, Bienstock and his colleagues hope to apply some lessons from financial modeling to measure and manage risk associated with more common events that could strain the capabilities of the U.S. regional power grids managed by independent system operators (ISOs). The team plans to build and test an alpha version of the dashboard within two years, before demonstrating the dashboard for ISOs and electric utilities in the third year of the project.

Variability already poses a challenge to modern power grids that were designed to handle steady power output from conventional power plants to meet an anticipated level of demand from consumers. Power grids usually rely on gas turbine generators to kick in during peak periods of power usage or to provide backup to intermittent wind and solar power.

But such generators may not provide a fast enough response to compensate for the expected variability in power grids that include more renewable power sources and demand response programs driven by fickle human behavior. In the worst cases, grid operators may shut down power to consumers and create deliberate blackouts in order to protect the grid’s physical equipment.

One of the dashboard project’s main goals involves developing mathematical and statistical models that can quantify the risk from having greater uncertainty in the power grid. Such models would aim to simulate different scenarios based on conditions—such as changes in weather or power demand—that could stress the power grid. Repeatedly playing out such scenarios would force grid operators to fine-tune and adapt their operational plans to handle such surprises in real life.

For example, one scenario might involve a solar farm generating 10 percent less power and a wind farm generating 30 percent more power within a short amount of time, Bienstock explains. The combination of those factors might mean too much power begins flowing on a particular power line and the line subsequently starts running hot at the risk of damage.
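As a rough illustration of that scenario-sampling idea, the sketch below repeatedly perturbs the output of a hypothetical solar and wind plant and counts how often an assumed transmission line exceeds its thermal limit. All names, numbers, and the simple line-flow rule are illustrative assumptions, not details of the ARPA-E project.

```python
import random

# Hypothetical baseline outputs and line parameters (illustrative only).
BASE_SOLAR_MW = 100.0
BASE_WIND_MW = 80.0
LINE_LIMIT_MW = 150.0   # assumed thermal limit of the line of interest
LINE_SHARE = 0.9        # assumed fraction of the combined output flowing on that line

def sample_scenario():
    """Draw one random perturbation of solar and wind output over a short interval."""
    solar = BASE_SOLAR_MW * (1 + random.uniform(-0.30, 0.10))  # e.g., up to 30 percent less sun
    wind = BASE_WIND_MW * (1 + random.uniform(-0.20, 0.40))    # e.g., up to 40 percent more wind
    return solar, wind

def line_overloaded(solar_mw, wind_mw):
    """Crude check of whether the assumed line exceeds its thermal limit."""
    return LINE_SHARE * (solar_mw + wind_mw) > LINE_LIMIT_MW

trials = 10_000
overloads = sum(line_overloaded(*sample_scenario()) for _ in range(trials))
print(f"Estimated overload probability: {overloads / trials:.1%}")
```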

Such models would only be as good as the data that trains them. Some ISOs and electric utilities have already been gathering useful data from the power grid for years. Those that already have more experience dealing with the variability of renewable power have been the most proactive. But many of the ISOs are reluctant to share such data with outsiders.

“One of the ISOs has told us that they will let us run our code on their data provided that we actually physically go to their office, but they will not give us the data to play with,” Bienstock says. 

For this project, ARPA-E has been working with one ISO to produce synthetic data covering many different scenarios based on historical data. The team is also using publicly available data on factors such as solar irradiation, cloud cover, wind strength, and the power generation capabilities of solar panels and wind turbines.

“You can look at historical events and then you can design stress that’s somehow compatible with what we observe in the past,” says Agostino Capponi, associate professor of industrial engineering and operations research at Columbia University and external consultant for the U.S. Commodity Futures Trading Commission.

A second big part of the dashboard project involves developing tools that grid operators could use to help manage the risks that come from dealing with greater uncertainty. Capponi is leading the team’s effort to design customized energy volatility contracts that could allow grid operators to buy such contracts for a fixed amount and receive compensation for all the variance that occurs over a historical period of time.
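The payoff structure Capponi describes resembles a variance swap from finance. Below is a minimal sketch of that idea, assuming a buyer who pays a fixed premium and is compensated in proportion to the realized variance of energy prices over a period; the function name, premium, and price series are all made up for illustration.

```python
import statistics

def variance_contract_payout(prices, fixed_premium, payout_per_unit_variance):
    """Toy payoff: the buyer pays a fixed premium up front and receives a payment
    proportional to the realized variance of prices over the period (illustrative only)."""
    realized_variance = statistics.pvariance(prices)
    return payout_per_unit_variance * realized_variance - fixed_premium

# Hypothetical hourly energy prices in $/MWh over a short window.
prices = [32.0, 35.5, 29.8, 41.2, 38.6, 55.0, 30.4]
print(variance_contract_payout(prices, fixed_premium=500.0, payout_per_unit_variance=10.0))
```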

But he acknowledges that financial contracts designed to help offset risk in the stock market won’t apply in a straightforward manner to the realities of the power grid, which include delays in power transmission, physical constraints, and weather events.

“You cannot really directly use existing financial contracts because in finance you don’t have to take into account the physics of the power grid,” Capponi says.

Once the new dashboard is up and running, it could begin to help grid operators deal with both near-term and long-term challenges for the U.S. power grid. One recent example comes from the COVID-19 pandemic: associated behavioral changes—such as more people working from home—have already increased variability in energy consumption across New York City and other parts of the United States. In the future, the risk dashboard might help grid operators quickly identify areas at higher risk of imbalances between supply and demand and act to avoid straining the grid or causing blackouts.

Knowing the long-term risks in specific regions might also drive more investment in additional energy storage technologies and improved transmission lines to help offset such risks. The situation is different for every grid operator’s particular region, but the researchers hope that their dashboard can eventually help level the speed bumps as the U.S. power grid moves toward using more renewable power.

“The ISOs have different levels of renewable penetration, and so they have different exposures and visibility to risk,” Bienstock says. “But this is just the right time to be doing this sort of thing.”

The Privatization of Puerto Rico’s Power Grid Is Mired in Controversy

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/policy/the-privatization-of-puerto-rico-power-grid-mired-in-controversy

When Hurricane Maria razed Puerto Rico in September 2017, the storm laid bare the serious flaws and pervasive neglect of the island’s electricity system. Nearly all 3.4 million residents lost power for weeks, months, or longer—a disaster unto itself that affected hospitals and schools and shut down businesses and factories.

The following January, then-Gov. Ricardo Rosselló signaled plans to sell off parts of the Puerto Rico Electric Power Authority (PREPA), leaving private companies to do what the state-run utility had failed to accomplish. Rosselló, who resigned last year, said it would take about 18 months to complete the transition.

“Our objective is simple: provide better service, one that’s more efficient and that allows us to jump into new energy models,” he said that June, after signing a law to start the process.

Yet privatization to date has been slow, piecemeal, and mired in controversy. Recent efforts seem unlikely to move the U.S. territory toward a cleaner, more resilient system, power experts say.

As the region braces for an “unusually active” 2020 hurricane season, the aging grid remains vulnerable to disruption, despite US $3.2 billion in post-Maria repairs. 

Puerto Rico relies primarily on large fossil fuel power plants and long transmission lines to carry electricity into mountains, coastlines, and urban centers. When storms mow down key power lines, or earthquakes destroy generating units—as was the case in January—outages cascade across the island. Lately, frequent brownouts caused by faulty infrastructure have complicated efforts to confront the COVID-19 outbreak. 

“In most of the emergencies that we’ve had, the centralized grid has failed,” says Lionel Orama Exclusa, an electrical engineering professor at the University of Puerto Rico-Mayagüez and member of Puerto Rico’s National Institute of Energy and Island Sustainability.  

He and many others have called for building smaller regional grids that can operate independently if other parts fail. Giant oil- and gas-fired power plants should similarly give way to renewable energy projects distributed near or within neighborhoods. Last year, Puerto Rico adopted a mandate to get to 100 percent renewable energy by 2050. (Solar, wind, and hydropower supply just 2.3 percent of today’s total generation.)

So far, however, PREPA’s contracts to private companies have mainly focused on retooling existing infrastructure—not reimagining the monolithic system. The companies are also tied to the U.S. natural gas industry, which has targeted Puerto Rico as a place to offload mainland supplies.

In June, Luma Energy signed a 15-year contract to operate and maintain PREPA’s transmission and distribution system. Luma is a newly formed joint venture between infrastructure company Quanta Services and Canadian Utilities Limited. The contract is valued between $70 million and $105 million per year, plus up to $20 million in annual “incentive fees.”

Wayne Stensby, president and CEO of Luma, said his vision for Puerto Rico includes wind, solar, and natural gas and is “somewhere down the middle” between a centralized and decentralized grid, Greentech Media reported. “It makes no sense to abandon the existing grid,” he told the news site in June, adding that Luma’s role is to “effectively optimize that reinvestment.”

Orama Exclusa says he has “mixed feelings” about the contract.

If the private consortium can effectively use federal disaster funding to fix crumbling poles and power lines, that could significantly improve the system’s reliability, he says. But the arrangement still doesn’t address the “fundamental” problem of centralization.

He is also concerned that the Luma deal lacks transparency. Former utility leaders and consumer watchdogs have noted that regulators did not include public stakeholders in the 18-month selection process. They say they’re wary Puerto Rico may be repeating missteps made in the wake of Hurricane Maria.

As millions of Puerto Ricans recovered in the dark, PREPA quietly inked a no-bid, one-year contract for $300 million with Whitefish Energy Holdings, a two-person Montana firm with ties to then-U.S. Interior Secretary Ryan Zinke. Cobra Acquisitions, a fracking company subsidiary, secured $1.8 billion in federal contracts to repair the battered grid. Last September, U.S. prosecutors charged Cobra’s president and two officials in the Federal Emergency Management Agency with bribery and fraud.

A more recent deal with another private U.S. firm is drawing further scrutiny.

In March 2019, New Fortress Energy won a five-year, $1.5 billion contract to supply natural gas to PREPA and convert two units (totaling 440 megawatts) at the utility’s San Juan power plant from diesel to gas. The company, founded by billionaire CEO Wes Edens, completed the project this May, nearly a year behind schedule. It also finished construction of a liquefied natural gas (LNG) import terminal in the capital city’s harbor. 

“This is another step forward in our energy transformation,” Gov. Wanda Vázquez Garced said in May during a tour of the new facilities. Converting the San Juan units “will allow for a cheaper and cleaner fuel” and reduce monthly utility costs for PREPA customers, she said.

Critics have called for canceling the project, which originated after New Fortress submitted an unsolicited proposal to PREPA in late 2017. The ensuing deal gave New Fortress an “unfair advantage,” was full of irregularities, and didn’t undergo sufficient legal review or financial oversight, according to a June report by CAMBIO, a Puerto Rico-based environmental nonprofit, and the Institute for Energy Economics and Financial Analysis.

The project “would continue to lock in fossil fuels on the island and would prevent the aggressive integration of renewable energy,” Ingrid Vila Biaggi, president of CAMBIO, told the independent news program Democracy Now!

The U.S. Federal Energy Regulatory Commission, which oversees the transmission and wholesale sale of electricity and natural gas, also raised questions about the LNG import terminal.

On 18 June, the agency issued a rare show-cause order demanding that New Fortress explain why it didn’t seek prior approval before building the infrastructure at the Port of San Juan. The company has 30 days to respond.

Concerns over contracts are among the many challenges to revitalizing Puerto Rico’s grid. The island has been mired in a recession since 2006, amid a series of budget shortfalls, financial crises, and mismanagement—which contributed to PREPA filing for bankruptcy in 2017, just months before Maria struck. The COVID-19 pandemic is further eroding the economy, with Puerto Ricans facing widespread unemployment and rising poverty.

The coming months—typically those with the most extreme weather—will show if recent efforts to privatize the grid will alleviate, or exacerbate, Puerto Rico’s electricity problems.

The Software-Defined Power Grid Is Here

Post Syndicated from Patrick T. Lee original https://spectrum.ieee.org/energy/the-smarter-grid/the-softwaredefined-power-grid-is-here

My colleagues and I have been spending a lot of time on a project in Onslow, a remote coastal town of 850 in Western Australia, where a wealth of solar, wind power, and battery storage has come on line to complement the region’s traditional forms of power generation. We’re making sure that all of these distributed energy resources work as a balanced and coordinated system. The team has traveled more than 15,000 kilometers from our company headquarters in San Diego, and everyone is excited to help the people of Onslow and Western Australia’s electric utility Horizon Power.

Like other rural utilities around the world, Horizon faces an enormous challenge in providing reliable electricity to hundreds of small communities scattered across a wide area. Actually, calling this a “wide area” is a serious understatement: Horizon’s territory covers some 2.3 million square kilometers—about one and a half times the size of Alaska. You can’t easily traverse all that territory with high-tension power lines and substations, so local power generation is key. And as the country tries to shrink its carbon footprint, Horizon is working with its customers to decrease their reliance on nonrenewable energy. The incentives for deploying renewables such as photovoltaics and wind turbines are compelling.

But adding more solar and wind power here, as elsewhere, brings its own problems. In particular, it challenges the grid’s stability and resilience. The power systems that most people are connected to were designed more than a century ago. They rely on large, centralized generation plants to deliver electricity through transmission and distribution networks that feed into cities, towns, homes, schools, factories, stores, office buildings, and more. Our 100-year-old power system wasn’t intended to handle power generators that produce electricity only when the sun is shining or the wind is blowing. Such intermittency can cause the grid’s voltage and frequency to fluctuate and spike dangerously when power generation isn’t balanced with demand throughout the network. Traditional grids also weren’t designed to handle energy flowing in two directions, with hundreds or thousands of small generators like rooftop solar panels attached to the network.

The problem is being magnified as the use of renewables grows worldwide. According to the United Nations report Global Trends in Renewable Energy Investment 2019, wind and solar power accounted for just 4 percent of generating capacity worldwide in 2010. That figure was expected to more than quadruple within a decade, to 18 percent. And that trend should continue for at least the next five years, according to the International Energy Agency. It anticipates that renewable energy capacity will rise by 50 percent through 2024, with solar photovoltaics and onshore wind making up the lion’s share of that increase.

Rather than viewing this new capacity as a valuable asset, though, many grid operators fear the intermittency of renewable resources. Rather than finding a way to integrate them, they have tried to limit the amount of renewable energy that can connect to their networks, and they routinely curtail the output of these sources.

In late 2018, Horizon Power hired my company, PXiSE Energy Solutions, to better integrate renewables across its vast territory. Many electric utilities around the world are grappling with this same challenge. The chief difference between what others are doing and our approach is that we use a special sensor, called a phasor measurement unit, or PMU. This sensor, first developed in the 1980s, measures voltage and current at various points on the grid and then computes the magnitude and phase of the signals, with each digitized measurement receiving a time stamp accurate to within 1 microsecond of true time. Such measurements reveal moment-by-moment changes in the status of the network.

For many years, utilities have deployed PMUs on their transmission systems, but they haven’t fully exploited the sensors’ real-time data. The PXiSE team developed machine-learning algorithms so that our high-speed controller can act quickly and autonomously to changes in generation and consumption—and also predict likely future conditions on the network. This intelligent system mitigates any grid disturbances while continuously balancing solar generation, battery power, and other available energy resources, making the grid more efficient and reliable. What’s more, our system can be integrated into virtually any type of power grid, regardless of its size, age, or mix of generation and loads. Here’s how it works.

The basic thing a PMU measures is called a phasor. Engineering great Charles Proteus Steinmetz coined this term back in 1893 and described how to calculate it based on the phase and amplitude of an alternating-current waveform. Nearly a century later, Arun G. Phadke and his team at Virginia Tech developed the phasor measurement unit and showed that the PMU could directly measure the magnitude and phase angle of AC sine waves at specific points on the grid.
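To make the phasor idea concrete, here is a minimal sketch of how magnitude and phase can be extracted from one cycle of sampled waveform data using a single-bin discrete Fourier transform. It is a textbook illustration under simplifying assumptions (a clean 60-hertz signal, exactly one cycle of samples), not the algorithm of any commercial PMU.

```python
import numpy as np

def estimate_phasor(samples, fs, f0=60.0):
    """Estimate the phasor (RMS magnitude, phase in radians) of a nominally
    60 Hz waveform from exactly one cycle of evenly spaced samples."""
    n = np.arange(len(samples))
    # Correlate the samples with a complex exponential at the nominal frequency.
    reference = np.exp(-1j * 2 * np.pi * f0 * n / fs)
    phasor = (2.0 / len(samples)) * np.sum(samples * reference)
    return np.abs(phasor) / np.sqrt(2), np.angle(phasor)

# One cycle of a 120 V RMS, 60 Hz wave sampled at 1,920 Hz (32 samples per cycle).
fs = 1920.0
t = np.arange(32) / fs
wave = 120 * np.sqrt(2) * np.cos(2 * np.pi * 60 * t + 0.3)
print(estimate_phasor(wave, fs))  # approximately (120.0, 0.3)
```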

PMUs were commercialized in 1992, at which point utilities began deploying the sensors to help identify outages and other grid “events.” Today, there are tens of thousands of PMUs installed at major substations throughout the United States. (The accuracy of PMU data is dictated by IEEE Standard C37.118 and the newer IEEE/IEC Standard 60255-118-1-2018, which call for more accurate and consistent power measurements than are typically required for other sensors.)

But these devices are valuable for much more than simply tracking blackouts. Today’s high-speed PMUs provide data 60 times per second, which is more than 200 times as fast as the sampling rate of the conventional SCADA (supervisory control and data acquisition) systems used on most electric grids. What’s more, PMU data are time-stamped with great precision using the Global Positioning System (GPS), which synchronizes all measurements worldwide. For that reason, the data are sometimes called synchrophasors.

With that kind of time resolution, PMUs can provide an extremely accurate and detailed snapshot of power quality, indicating how consistently the voltage and current remain within their specified ranges. Wide fluctuations can lead to inefficiency and wasted electricity. If the fluctuations grow too large, they can damage equipment or even trigger a brownout or blackout. The time-stamped data are particularly helpful when your resources are scattered across a wide area, as in Horizon Power’s grid in Western Australia.

The main reason that PMUs haven’t been fully exploited is that most utilities don’t take advantage of modern data communications and advanced control technologies. Even when they have PMUs and data networks in place, they haven’t tied them together to help coordinate their solar, wind, and other energy resources. SCADA systems are designed to send their information every 2 or 3 seconds to a central operations center where human operators are watching and can take action if something goes wrong. But the power electronics used in inverter-based systems like solar PV, wind turbines, and energy storage units operate on the millisecond level, and they require much higher-speed control to maximize their benefits.

I’ve spent my entire 33-year career as a power engineer working for small and large power companies in California, so I’m deeply familiar with the issues that utilities have faced as more distributed-energy resources have come on line. I long had a hunch that PMUs and synchrophasors could play a larger role in grid modernization. In 2015, when I was vice president of infrastructure and technology at Sempra Energy, I began working with Charles Wells, an expert on power system controls who was then at OSIsoft, to figure out whether that hunch was valid and what a synchrophasor-based control system might look like.

Meeting every Friday afternoon for a few hours, we spent close to a year doing calculations and running simulations that confirmed PMU data could be used to both measure and control grid conditions in real time. We then turned to Raymond de Callafon of the University of California, San Diego, and other experts at OSIsoft to figure out how to create a robust commercial system. Our goal was to build a software-based controller that would make decisions autonomously, enabling a transition away from the hands-on adjustments required by traditional SCADA systems.

The controller we envisioned would need to operate with great accuracy and at high speeds. Our system would also need artificial intelligence to be able to forecast grid conditions hourly and daily. Because our platform would be largely software based, it would require a minimum of new hardware, and the integration into an existing network would be quick and inexpensive. In essence, we sought to create a new operating system for the grid.

By early 2017, our system was ready for its first real-world test, which took our team of six engineers to a 21-MW wind farm and battery storage facility on an island in Hawaii. In recent years, Hawaii has dramatically increased solar and wind generation, but this particular island’s grid had reached a limit to integrating more renewable energy, due to the variability of the wind resources. It was less complicated for the grid operator to rely on fossil-fueled generators than to mitigate the intermittent power from the wind farm, which was periodically shut off, or curtailed, at night to ensure grid stability.

We were on a startup’s budget, so we used off-the-shelf equipment, including a standard PC running a digital simulator model that de Callafon created. We rented a house near the wind farm, where we spent several days testing the technology. It was a fantastic moment when we realized we could, in fact, control energy generation at the wind farm from the kitchen table. (We took all due cybersecurity precautions, of course.) With our control system in place, the inherent power fluctuations of the wind farm were smoothed out by real-time dispatch from the battery storage facility. This allowed the wind farm’s generation to be far more reliable.

Soon after completing the Hawaii project, we installed a controller running our PMU-based algorithm to create a microgrid in Sempra Energy’s high-rise office building in downtown San Diego. The controller was designed to optimize the use of the building’s electric-vehicle chargers, solar panels, and storage batteries. Meanwhile, it would also reduce the reliance on grid power in the late afternoon and early evening, when demand and prices typically spike. At such times, the optimization algorithm automatically determines the proper resource mix and schedule, enabling one floor to be served solely by local renewable sources and stored energy rather than the main grid. This shift reduced the building’s utility bill by 20 percent, which we’ve since found to be typical for similar microgrids.

More recently, our team traveled to Jeju Island, in South Korea. Prior to the installation of our grid controller, the island’s electricity primarily came from the mainland via high-voltage underwater cables. The system was designed to interconnect two local battery-storage systems—with capacities of 224 kilowatt-hours and 776 kWh—with a 500-kW solar farm and 600 kW of wind power. To meet the island’s renewable energy goals and save on electricity costs, wind and solar power are stored in the batteries during the day, when power demand and grid prices are low. The stored electricity is then used at night, when power demand and grid prices are high.

The on-site installation of our controller took just four days. In addition to operating the two battery-storage systems throughout the day and night, PXiSE’s controller uses the batteries to provide what’s known as “ramp control” for the wind turbines. Most grids can tolerate only a gradual change in power flow, typically no more than 1 megawatt per minute, so ramp control acts like a shock absorber. It smooths out the turbines’ power output when the wind suddenly changes, which would otherwise cause reliability problems on the grid.

Every 16.7 milliseconds (one cycle of the 60-hertz wave), the controller looks for any change in wind power output. If the wind stops blowing, the system instantly compensates by discharging the batteries to the grid, gradually decreasing the amount of discharged battery power to zero, as power from the main grid gradually increases to match demand. When the wind starts blowing again, the batteries immediately respond by absorbing a portion of the generated wind power, gradually reducing the amount being stored to zero as the controller ramps up the amount feeding into the grid, thereby meeting real-time energy demand.
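A highly simplified sketch of that per-cycle smoothing logic appears below. The ramp limit comes from the 1-megawatt-per-minute figure mentioned above; everything else (the function, the step size, the example numbers) is an illustrative assumption rather than PXiSE’s actual control code.

```python
CYCLE_S = 1 / 60                    # one cycle of the 60 Hz wave, about 16.7 ms
MAX_RAMP_MW_PER_MIN = 1.0           # grid ramp limit cited above
MAX_STEP_MW = MAX_RAMP_MW_PER_MIN * CYCLE_S / 60   # allowed change per cycle

def ramp_control_step(wind_now_mw, grid_feed_mw):
    """Limit the per-cycle change in power fed to the grid; the battery
    covers (or absorbs) the difference between wind output and the grid feed."""
    delta = wind_now_mw - grid_feed_mw
    step = max(-MAX_STEP_MW, min(MAX_STEP_MW, delta))
    grid_feed_mw += step
    battery_mw = grid_feed_mw - wind_now_mw   # positive: battery discharging
    return grid_feed_mw, battery_mw

# Example: wind output suddenly drops from 0.6 MW to 0.2 MW.
grid_feed = 0.6
for cycle in range(3):
    grid_feed, battery = ramp_control_step(0.2, grid_feed)
    print(f"cycle {cycle}: grid sees {grid_feed:.6f} MW, battery supplies {battery:.6f} MW")
```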

The control system also creates its own forecasts of load and generation based on weather forecasts and AI processing of historical load data. The forecasting models are updated in real time to compute the optimal settings for power and energy storage in the system.

At each of our other projects—in Chile, Guam, Hong Kong, Japan, Mexico, and a handful of U.S. sites—the system we provide looks a little different, depending on the customer’s mix of energy resources and what their priorities are. We designed the PXiSE system so it can be deployed in any part of the electric grid, from a rooftop solar array all the way to a centralized power plant connected to a large network. Our customers appreciate that they don’t have to install additional complex equipment beyond the controller, that integrating our technology is fast, and that they see positive results right away.

The work we’re doing is just one part of a far bigger goal: to make the grid smarter and cleaner. This is the great challenge of our time. The power industry isn’t known for being on the cutting edge of technology adoption, but the transition away from a century-old, hardware-dependent analog grid toward a modern software-based digital grid is not just desirable but essential. Our team of enthusiastic entrepreneurs and engineers joins thousands of others around the world who are committed to making a difference in the future of our industry and our planet.

This article appears in the July 2020 print issue as “The Software-Defined Power Grid.”

About the Author

Patrick T. Lee is CEO of PXiSE Energy Solutions and vice president of infrastructure and technology at Sempra Energy, in San Diego.

Savant Systems Buys GE Lighting

Post Syndicated from Susan Hassler original https://spectrum.ieee.org/energy/the-smarter-grid/savant-systems-buys-ge-lighting

If you were trying to identify the handful of companies that created a business out of the emergence of electrical engineering in the Victorian era, certainly the list would have to include General Electric. GE’s lighting business was established by Thomas Edison in 1892, and it was just sold to engineering wunderkind Robert Madonna and his team at Savant Systems, in Hyannis, Mass. Savant builds high-end home-networking systems for the well-heeled.

By buying GE Lighting—and licensing the use of the GE Lighting and GE Smart Home names—Savant hopes to become a force in the emerging market for home networking devices, a business that has been a battleground for some of the biggest names in technology: Amazon, Apple, and Google.

I spoke to Savant president J.C. Murphy about the acquisition and Savant’s plans for the future.

Can you tell me a little about where Savant came from?

Bob [Madonna] tells a funny story about buying a place in [New York City’s] SoHo and getting his first automation system.

And so he bought this system and nothing really worked and it took weeks and weeks of programming to get simple functions up and working and when he would switch out a cable box or he’d switch out a TV, the whole system would come down and it was back to the drawing board.

That was really the beginning. He said, “I can definitely build a company and do this better.” And he took a lot of the thinking that we used previously in the telecommunications space in terms of reliability and redundancy, and fault tolerance, and we’ve imported that into what we build at Savant.

Basically, it’s a combination of hardware and software that ties the way you live in the home together through an elegant user experience. A simple app that’s on a touch pad or on your Android or iOS device, it basically allows the different things in your home to communicate seamlessly together. So your heating, and your lighting systems, and your audio and your video systems, and your security systems are all tied together in this unique app.

Our vision is to be able to expand that same quality look and feel to a broader marketplace. And obviously, we want to maintain the integrity of the Savant brand in this high-end custom home-automation area, but we want to import our technology into the iconic GE Lighting, and soon, the GE Smart Home brand, to bring it to more and more people at a better price point.

So why are lightbulbs central to this vision?

When you look at what you have in GE Lighting, it’s this unbelievable ability to build and manufacture and engineer quality products at scale.

There are some great individual point solutions out there, but there’s no company that has really introduced this at scale across the entire system. Our vision is to take the Savant DNA and technology and really build out an ecosystem of products and solutions that can be deployed very broadly.

This article appears in the July 2020 print issue as “Google Has Nest. Amazon Has Echo and Alexa. Savant Systems Has—GE Lighting!”

How the Pandemic Impacts U.S. Electricity Usage

Post Syndicated from Jeremy Hsu original https://spectrum.ieee.org/energywise/energy/the-smarter-grid/how-the-pandemic-impacts-us-electricity-usage

As the COVID-19 outbreak swept through Manhattan and the surrounding New York City boroughs earlier this year, electricity usage dropped as businesses shuttered and people hunkered down in their homes. Those changes in human behavior became visible from space as the nighttime lights of the city that never sleeps dimmed by 40 percent between February and April.

That striking visualization of the COVID-19 impact on U.S. electricity consumption came from NASA’s “Black Marble” satellite data. U.S. and Chinese researchers are currently using such data sources in what they describe as an unprecedented effort to study how electricity consumption across the United States has been changing in response to the pandemic. One early finding suggests that mobility in the retail sector—defined as daily visits to retail establishments—is an especially significant factor in the reduction of electricity consumption seen across all major U.S. regional markets.

“I was previously not aware that there is such a strong correlation between the mobility in the retail sector and the public health data on the electricity consumption,” says Le Xie, professor of electrical and computer engineering and assistant director of energy digitization at the Texas A&M Energy Institute. “So that is a key finding.”

Xie and his colleagues from Texas A&M, MIT, and Tsinghua University in Beijing, China, are publicly sharing their Coronavirus Disease-Electricity Market Data Aggregation (COVID-EMDA) project and the software code they have used in their analyses in an online GitHub repository. They first uploaded a preprint paper describing their initial analyses to arXiv on 11 May 2020.

Most previous studies that focused on public health and electricity consumption tried to examine whether changes in electricity usage could provide an early warning sign of health issues. But when the U.S. and Chinese researchers first put their heads together on studying COVID-19 impacts, they did not find other prior studies that had examined how a pandemic can affect electricity consumption.

Beyond using the NASA satellite imagery of the nighttime lights, the COVID-EMDA project also taps additional sources of data about the major U.S. electricity markets from regional transmission organizations, weather patterns, COVID-19 cases, and the anonymized GPS locations of cellphone users.

“Before when people study electricity, they look at data on the electricity domain, perhaps the weather, maybe the economy, but you would have never thought about things like your cell phone data or mobility data or the public health data from COVID cases,” Xie says. “These are traditionally totally unrelated data sets, but in these very special circumstances they all suddenly became very relevant.”

The unique compilation of different data sources has already helped the researchers spot some interesting patterns. The most notable finding suggests that the largest portion of the drop in electricity consumption likely comes from the decline in people’s daily visits to retail establishments as individuals began practicing social distancing and staying home early in the pandemic. By comparison, the number of new confirmed COVID-19 cases does not seem to have a strong direct influence on changes in electricity consumption.
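A back-of-the-envelope version of that kind of analysis is sketched below: it simply computes the correlation and least-squares slope between a retail-mobility series and an electric-load series. The numbers are invented for illustration and are not from the COVID-EMDA dataset.

```python
import numpy as np

# Hypothetical weekly changes (percent, relative to pre-pandemic baselines).
retail_mobility_change = np.array([-5.0, -15.0, -30.0, -45.0, -50.0, -48.0])
electric_load_change = np.array([-1.0, -3.0, -6.0, -9.0, -10.0, -9.5])

# Pearson correlation between the two series.
r = np.corrcoef(retail_mobility_change, electric_load_change)[0, 1]

# Least-squares slope: load change per percentage point of mobility change.
slope, intercept = np.polyfit(retail_mobility_change, electric_load_change, 1)

print(f"correlation: {r:.2f}, slope: {slope:.3f}, intercept: {intercept:.2f}")
```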

The Northeastern region of the U.S. electricity sector that includes New York City seems to be experiencing the most volatile changes so far during the pandemic. Xie and his colleagues hypothesize that larger cities with higher population density and commercial activity would likely see bigger COVID-19 impacts on their electricity consumption. But they plan to continue monitoring electricity consumption changes in all the major regions as new COVID-19 hotspots have emerged outside the New York City area.

The biggest limitation of such an analysis comes from the lack of available higher-resolution data on electricity consumption. Each of the major regional transmission organizations publishes power load and price numbers daily for their electricity markets, but this reflects a fairly large geographic area that often covers multiple states. 

“For example, if we could know exactly how much electricity is used in each of the commercial, industrial, and residential categories in a city, we could have a much clearer picture of what is going on,” Xie says.

That could change in the near future. Some Texas utility companies have already approached the COVID-EMDA group about possibly sharing such higher-resolution data on electricity consumption for future analyses. The researchers have also heard from economists curious about analyzing and perhaps predicting near-term economic activities based on electricity consumption changes during the pandemic.

One of the next big steps is to “develop a predictive model with high confidence to estimate the impact to electricity consumption due to social-distancing policies,” Xie says. “This could potentially help the public policy people and [regional transmission organizations] to prepare for similar situations in the future.”

Used EV Batteries Could Power Tomorrow’s Solar Farms

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/energywise/energy/batteries-storage/used-ev-batteries-could-power-tomorrows-solar-farms

As the number of electric vehicles on the world’s roads multiplies, a variety of used EV batteries will inevitably come into the marketplace. This, says a team of MIT researchers, could provide a golden opportunity for solar energy: Grid-scale renewable energy storage. This application, they find, can run efficiently on batteries that aren’t quite up to snuff for your Tesla or Chevy Bolt.

Spherical Solar Cells Soak Up Scattered Sunlight

Post Syndicated from Jeremy Hsu original https://spectrum.ieee.org/energywise/energy/renewables/spherical-solar-cells-soak-up-scattered-sunlight

Flat solar panels still face big limitations when it comes to making the most of the available sunlight each day. A new spherical solar cell design aims to boost solar power harvesting potential from nearly every angle without requiring expensive moving parts to keep tracking the sun’s apparent movement across the sky. 

Why Oil-Spill Tech Is So Primitive and How One Man Is Trying to Fix It

Post Syndicated from Larry Herbst original https://spectrum.ieee.org/energy/fossil-fuels/why-oilspill-tech-is-so-primitive-and-how-one-man-is-trying-to-fix-it

The seas had turned rough as a sudden squall whipped up the winds enough to howl through the rigging. And with those winds came a powerful smell of oil. Soon I could see the characteristic rainbow sheen from my position on the rail of this fishing trawler. It was May of 2016, and we were in the Gulf of Mexico, about 16 kilometers off the southeast coast of Louisiana.

“Skimmer in the water,” bellowed Kevin Kennedy, an Alaskan fisherman turned oil-spill remediation entrepreneur. Ropes groaned as the boat’s winches lowered his prototype oil-recovery system into the heaving seas. As the trawler bobbed up and down, Kennedy’s contraption rode the waves, its open mouth facing into the slick, gulping down a mix of seawater and crude oil.

The stomach of Kennedy’s device, to continue the analogy, was a novel separator, which digested the mixture of seawater and oil. By virtue of its clever engineering, it excreted essentially nothing but water. At the top of the separator’s twin tanks, the collected oil began to swell. When enough had accumulated, the oil was sucked safely into a storage tank. Then the cycle would begin again.

How much oil there was on the water here was a subject of great dispute. But its source was clear enough. In 2004, Hurricane Ivan blasted through the Gulf of Mexico, triggering submarine landslides that toppled a drilling platform erected by New Orleans–based Taylor Energy. The mangled tops of Taylor’s subsea oil wells were then buried under some 20 meters of mud. But all that loose mud didn’t do much to stem the flow of oil and gas from many of these wells.

Efforts to contain the flow from the shattered wells, conducted between 2009 and 2011, saw only partial success. Oil continued to flow from some of these wells and rise to the surface for years to come.

While this oil spill posed a nasty threat to the marine environment, it served as a valuable testing ground for Kennedy’s invention. This former fisherman has spent a small fortune to prove he has created an effective system for cleaning up oil spilled on the water, one that works well in real-world conditions. But for all he can tell so far, it’s a system that nobody wants.

“I thought if I built a better mousetrap, everyone would want one,” Kennedy says. “Instead, the world has decided they’re okay with mice.”

There are countless oil tankers, barges, rigs, and pipelines that operate in, around, and through U.S. coastal waters. Every year, some of them leak some of their contents. In a typical year the leakage amounts to no more than a million gallons or so. But every now and then a monster mishap spills considerably more: In 1989, the Exxon Valdez tanker ran aground on a reef and gushed some 11 million U.S. gallons (42,000 cubic meters) of oil into the pristine waters of Prince William Sound, Alaska. In 2005, Hurricane Katrina unleashed more than 8 million gallons (30,000 cubic meters) from Louisiana storage facilities. And even those incidents pale in comparison with the Deepwater Horizon disaster in 2010, in which a drilling rig leased by BP exploded in the Gulf of Mexico, killing 11 people and ultimately releasing some 210 million gallons (almost 800,000 cubic meters) of oil.

Such disasters not only ravage huge, complex, and delicate marine ecosystems, but they are also economically devastating: The damage to tourism and commercial fisheries is often measured in the hundreds of millions of dollars.

To deal with such fiascoes, engineers, chemists, and inventors have devised, sometimes on the fly, a grab bag of equipment, systems, chemicals, and procedures for collecting the oil and removing it, or for breaking it up or burning it in place to lessen its environmental impacts.

Today, the oil-spill-management market is a roughly US $100-billion-a-year industry with hundreds of companies. But multiple studies of the biggest episodes, including the Deepwater Horizon disaster, have questioned the industry’s motives, methods, track record, and even its utility.

After decades in the industry, Kennedy, a small player in a big business, has unique perspectives on what ails it. His involvement with oil spills stretches back to 1989, when he bought his first fishing boat and paid for a license to trawl for shrimp near Prince William Sound. He couldn’t have chosen a worse time to begin a fishing career. On the first day of the shrimping season, the Exxon Valdez ran aground on Bligh Reef, and Kennedy found himself drafted as a first responder. He spent more than four months corralling oil with his nets and using his fish pumps to transfer the unending gobs of sticky mess into his boat’s hold.

Meanwhile, millions of dollars of oil-skimming equipment was airlifted to the nearby port of Valdez, most of it ultimately proving useless. Kennedy has witnessed something similar every time there’s a spill nearby: There may be lots of cleanup activity, but often, he insists, it’s just to put on a good show for the cameras. In the end, most of the oil winds up on the beach—“Nature’s mop,” he calls it.

In 2004, Kennedy participated in the cleanup that followed the grounding of the cargo ship Selendang Ayu—a tragic accident that cost six sailors their lives and released 336,000 gallons (1,272 cubic meters) of fuel oil into Alaskan waters. After that incident, he became convinced he could design gear himself that could effectively recover the oil spilled on the water before it hit the beach. His design employed fishing nets and fish pumps, normally used to transfer fish from the nets into the holds of fishing vessels. (Fish pumps use vacuum instead of whirling impellers, meaning no chopped-up fish.)

Fast forward to 2010 and the Deepwater Horizon disaster. The amount of oil released into the water by that well blowout seemed limitless—as did the money available to try to clean things up. Desperate for solutions, BP was exploring every avenue, including leasing oil-water separators built by actor Kevin Costner’s oil-cleanup company, Ocean Therapy Solutions, now defunct. BP ultimately spent some $69 billion on its response to the disaster, including legal fees.

In the midst of those frenzied cleanup efforts, Kennedy packed up a hastily assembled oil-recovery system and drove from Anchorage to Louisiana. He presented himself to BP, which worked out a contract with Kennedy. But before he could sign it, the oil well was capped.

Although enormous oil slicks still covered the sea, Kennedy was no longer allowed to participate in the cleanup. Only an emergency, the relevant regulators felt, gave them the flexibility to try out new technology to address a spill. With the emergency now officially over, cleanup would be done strictly by the book, following specific U.S. Coast Guard guidelines.

First and foremost, only equipment from a very short list of certified systems would be used. So Kennedy watched from the shore as others went to work. Less than 3 percent of the oil from the BP spill was ever recovered, despite billions spent on efforts that mostly involved burning the oil in place or applying chemical dispersants—measures for addressing the problem that pose environmental hazards of their own.

In 2011, in the wake of the Deepwater Horizon spill, the XPrize Foundation mounted the Wendy Schmidt Oil Cleanup XChallenge, named for the philanthropist and wife of Eric Schmidt, the former executive chairman of Google. The purpose of the contest was to foster technical advances in the oil-spill cleanup industry. Kennedy submitted his system on something of a lark and was startled to learn he was chosen as one of 10 finalists from among hundreds of entrants, including some of the biggest names in the oil-spill field.

“All the global players were there: Lamor, Elastec, Nofi. Some of these are hundred-million-dollar companies,” says Kennedy. “When I finished packing up the shipping container to go down to the competition, I think I had $123 left in my checking account.” His life savings depleted, Kennedy was forced to ask friends for donations to afford the plane ticket to New Jersey, where the competition was being held.

Located in Leonardo, N.J., the Department of the Interior’s Ohmsett facility houses a giant pool more than 200 meters long and almost 20 meters wide. Researchers there use computerized wave generators to simulate a variety of open-water environments. Whereas the industry standard for an oil skimmer was 1,100 gallons of oil per minute, the organizers of this XPrize competition sought a machine that could recover upwards of 2,500 gallons per minute with no more than 30 percent water in the recovered fluid.

Kennedy had cobbled together his skimmer from used fishing gear, including a powerful 5,000-gallon-per-minute fish pump. In addition, Kennedy’s system used lined fishing nets to capture the oil at the surface. This equipment would be familiar to just about anyone who has worked on fishing boats, which are often the first on the scene of an oil spill. So there would be a minimal learning curve for such first responders.

When the XPrize competition began, Kennedy’s team was the second of the 10 finalists to be tested. Perhaps due to inexperience, perhaps to carelessness, Ohmsett staff left the valves to the collection tank closed. Kennedy’s equipment roared to full power and promptly exploded. The massive fish pump had been trying to force 5,000 gallons a minute through a sealed valve. The pressure ruptured pipes, bent heavy steel drive shafts, and warped various pressure seals.

Replacement parts arrived with just an hour to spare, narrowly allowing Kennedy to finish his test runs. Although his damaged pump could no longer run at full capacity, his skimmer delivered impressive efficiency numbers. “On some of the runs, we got 99 percent oil-to-water ratio,” he says.

Kennedy didn’t win the contest—or the $1 million prize his fledgling company sorely needed. The team that took first place in this XPrize competition, Elastec, fielded a device that could pump much more fluid per minute, but what it collected was only 90 percent oil. The second-prize winner’s equipment, while also pumping prodigious volumes of fluid, collected only 83 percent oil.

Although Kennedy’s system demonstrated the best efficiency at the XPrize competition, buyers were not forthcoming. It wasn’t surprising. “The real problem is you don’t get paid by the gallon for recovered oil,” says Kennedy, who soon discovered that the motivations of the people carrying out oil-spill remediation often aren’t focused on the environment. “It’s a compliance industry: You’re required to show you have enough equipment to handle a spill on paper, regardless of whether the stuff actually works or not.”

The problem, in a nutshell, is this: When there’s an oil spill, responders are typically hired by the day instead of being paid for the amount of oil they collect. So there’s little incentive for them to do a better job or upgrade their equipment to a design that can more efficiently separate oil from water. If anything, there’s a reverse incentive, argues Kennedy: Clean up a spill twice as quickly, and you’ll make only half as much money.

The key term used by regulators in this industry is EDRC, which stands for Effective Daily Recovery Capacity. This is the official estimate of what a skimmer can collect when deployed on an oil spill. According to the regulations of the Bureau of Safety and Environmental Enforcement, EDRC is computed by “multiplying the manufacturer’s rated throughput capacity over a 24-hour period by 20 percent…to determine if you have sufficient recovery capacity to respond to your worst-case discharge scenario.”
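Translated into a quick calculation (using the 1,100-gallon-per-minute skimmer rating cited earlier in this article as a hypothetical input), the rule works out as follows:

```python
def edrc_gallons_per_day(rated_gallons_per_minute, derating=0.20):
    """Effective Daily Recovery Capacity per the BSEE rule quoted above:
    rated throughput over a 24-hour period, multiplied by 20 percent."""
    return rated_gallons_per_minute * 60 * 24 * derating

# A skimmer rated at 1,100 gallons per minute gets credit for about 316,800
# gallons per day on paper, regardless of how it performs on a real slick.
print(f"{edrc_gallons_per_day(1100):,.0f} gallons per day")
```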

Reliance on the equipment’s rated throughput, as determined by tank testing, and assumed effectiveness is at the heart of the agreement hammered out between government and oil companies in the wake of the Exxon Valdez disaster. It’s a calculation of what theoretically would work to clean up a spill. Unfortunately, as Kennedy has seen time and again on actual spills, performance in the field rarely matches those paper estimates, which are based on tests that don’t reflect real-world conditions.

Even though he thought the rules made no sense, Kennedy needed to get his equipment certified according to procedures established by ASTM International (an organization formerly known as American Society for Testing and Materials). So in 2017 he paid to have his equipment tested to establish its official ratings.

Those recovery ratings are determined by placing skimmers in a test tank with a 3-inch-thick (almost 8-centimeter) layer of floating oil. They are powered up for a minimum of 30 seconds, and the amount of oil they transfer is measured. It’s an unrealistic test: Oil spills almost never result in a 3-inch layer of oil. Oil slicks floating on the ocean are usually measured in millimeters. And the thickness of an oil sheen, like that seen at the Taylor spill, is measured in micrometers.

“So many tests are really just a pumping exercise,” says Robert Watkins, a consultant with Anchorage-based ASRC Energy Services who specializes in spill response. “But that isn’t a true demonstration of response.” The value of ASTM ratings, he explains, is allowing a reproducible “apples-to-apples” comparison of oil-spill equipment. He doesn’t argue, however, that apples are the right choice in the first place.

Kennedy knows that it’s not difficult to game the system to get high numbers. According to ASTM’s testing rules, even a monstrous pump that doesn’t separate oil from water at all can still get credit for doing the job. If a company stockpiles enough of those pumps in a warehouse somewhere—or maintains enough barges loaded with oil-absorbent pads—it will be certified as having a compliant spill-response plan, he contends. In the event of an actual spill, Kennedy says, most of that gear is useless: “Good luck cleaning anything up with pumps and diapers!”

In recent years, the Bureau of Safety and Environmental Enforcement (BSEE) and a consultancy called Genwest have worked to develop a better guide, hoping to replace Effective Daily Recovery Capacity with a different metric: Estimated Recovery System Potential (ERSP). This new measure looks at the entire system functionality and delivers far more realistic numbers.

A highly efficient system like Kennedy’s would stack up favorably according to ERSP calculations. But according to Elise DeCola, an oil-spill contingency planning expert with Alaska-based Nuka Research and Planning Group, there has been limited adoption of the ERSP calculator by the industry.

“While BSEE recommends ERSP as an industry best practice, they do not require its use,” says DeCola. “Operators that have deep inventories of low-efficiency skimmers—equipment that is still compliant with current guidelines—could suddenly be forced to invest in new skimmers.” For many, moving the goal posts would simply cost too much.

The current rules, with their lack of emphasis on efficiency, accept pumping a large amount of oily water into your tanks—a mixture that must then be disposed of as hazardous waste. The better goal is to remove only the oil, and Kennedy’s equipment is about as good as you can get in this regard, with its most recent ASTM-certified oil-to-water rating being 99 percent.

What’s more, that “test tank” rating matches Kennedy’s experiences with his equipment under real-world conditions. Whether on the Taylor slick with its micrometer-thick sheen, a Lake Superior Superfund site with spilled creosote as viscous as peanut butter, or a toxic spill in California’s Long Beach Harbor, his efficiency numbers have always been very high, he claims.

Kennedy attributes this performance to his unique separation system. It uses a pair of collection vessels, in which the oil floats to the top of the mixture taken in. A specially designed float valve closes once the oil is drawn off the top. That extraction is done by a vacuum pump, which has the virtue of creating a partial vacuum, causing any water that gets caught up in the oil to boil off. The resultant water vapor is exhausted to the air before it can condense and dilute the recovered oil. The fuel oil his system collects often has even lower moisture content than it did when it came fresh out of the refinery.

Yet even with a skimmer that has remarkable performance, Kennedy has faced an uphill climb to find buyers. In 2016, he offered his equipment to Taylor Energy, only to be turned down. For the next two years, he repeatedly approached the Coast Guard, offering evidence that the Taylor Spill was larger than reported and insisting he had a potential solution. Even on a no-cure-no-pay basis, the Coast Guard wasn’t interested.

“The Coast Guard shines when it tackles major crises, like Hurricane Katrina or the devastation in Puerto Rico,” says retired Coast Guard Capt. Craig Barkley Lloyd, now president and general manager of Alaska Clean Seas. “But this was a slowly boiling frog.”

It wasn’t until 2018 that the Coast Guard was finally goaded to act. The Justice Department had hired Oscar Garcia-Pineda, a consultant who had studied the Taylor Spill, to do an independent assessment, which found the spill to be far more expansive than previously reported. According to the new calculations, the total volume of oil released over time rivaled that of the epic Deepwater Horizon spill. Picking up on that analysis, in October 2018 a Washington Post story labeled it “one of the worst offshore disasters in U.S. history.”

In response to that newspaper article, the Coast Guard began to look for solutions in earnest. It quickly hired the Louisiana-based Couvillion Group to build a giant collection apparatus that could be lowered onto the seafloor to capture the leaking oil before it reached the surface. In the spring of 2019, Couvillion installed this system, which has since been collecting more than 1,000 gallons of oil a day.

For 14 years after the Taylor spill commenced, oil covered large swaths of the sea, and not a single gallon of that oil was recovered until Kennedy demonstrated that it could be done. The incentives just weren’t there. Indeed, there were plenty of disincentives. That’s the situation that some regulators, environmentalists, and spill-cleanup entrepreneurs, including this former fisherman, are trying to change. With the proper equipment, oil recovered from a spill at sea might even be sold at a profit.

During Kennedy’s trial runs at the site of the Taylor spill in 2016, the crew of the shrimp boat he hired began to realize that spilled oil could be worth more than shrimp. With the right technology and a market to support them, those same men might someday be fishing for oil.

This article appears in the June 2020 print issue as “Fishing for Oil.”

About the Author

Larry Herbst is a filmmaker and videographer with Cityline Inc., in Pasadena, Calif.

Dethroned! Renewables Generated More Power than King Coal in April

Post Syndicated from Sandy Ong original https://spectrum.ieee.org/energywise/energy/renewables/renewables-generated-more-power-than-coal-in-april

For the first time ever, renewable energy supplied more power to the U.S. electricity grid than coal-fired plants for 47 days straight. The run is impressive because it trounces the previous record of nine continuous days last June and exceeds the total number of days renewables beat coal in all of 2019 (38 days).

In a recent report, the Institute for Energy Economics and Financial Analysis (IEEFA) details how the streak was first observed on 25 March and continued through to 10 May, the day the data was last analyzed. 

“We’ll probably track it again at the end of May, so the period could actually be longer,” says Dennis Wamsted, an energy analyst at IEEFA. Already, the figures for April speak volumes: wind, hydropower, and utility-scale solar sources produced 58.7 terawatt-hours (TWh) of electricity compared with coal’s 40.6 TWh—or 22.2% and 15.3% of the market respectively.  
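Those figures are internally consistent, as a quick back-of-the-envelope check shows (the only assumption being that both percentages are shares of the same total U.S. utility-scale generation for April):

```python
# Back-of-the-envelope check of the April figures quoted above.
renewables_twh = 58.7     # wind, hydropower, and utility-scale solar
coal_twh = 40.6
renewables_share = 0.222  # 22.2 percent of the market

total_generation_twh = renewables_twh / renewables_share  # ~264 TWh implied
coal_share = coal_twh / total_generation_twh

print(f"Implied total generation: {total_generation_twh:.0f} TWh")
print(f"Implied coal share: {coal_share:.1%}")  # ~15.4%, matching the quoted
                                                # 15.3 percent within rounding
```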

In reality, the gap between the two sources is likely to be much larger, says Wamsted. That’s because the U.S. Energy Information Administration (EIA) database, from which IEEFA obtains its data, excludes power generated by rooftop solar panels, which are themselves a sizable power source.

The news that renewables overtook coal in the month of April isn’t surprising, says Brian Murray, director of the Duke University Energy Initiative. The first time this happened was last year, also in April. The month marks “shoulder season,” he says, “when heating is coming off but air-conditioning hasn’t really kicked in yet.” It’s when electricity demand is typically the lowest, which is why many power plants schedule their yearly maintenance during this time.

Spring is also when wind and hydropower generation peak, says Murray. Seasonal shifts in solar heating stir up stronger winds, while melting snowpack feeds rivers and fills reservoirs.

“Normally you would expect some sort of rebound of coal generation in the summer, but I think there’s a variety of reasons why that’s not going to happen this year,” he says. “One has to do with coronavirus.”

With the pandemic placing most of the country in lockdown and economic activity declining, the EIA estimates that U.S. demand for electric power will fall by 5% in 2020. This, in turn, will drive coal production down by a quarter. In contrast, renewables are still expected to grow by 11%. That difference is partly due to how energy is dispatched to the grid: because they are the cheapest to run, renewables are used first when available, followed by nuclear power, natural gas, and finally coal.
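That ordering is essentially a merit-order dispatch: generators are stacked from cheapest to most expensive marginal cost and called on until demand is met. Here is a minimal sketch of the idea, with invented costs and capacities used purely for illustration:

```python
# Toy merit-order dispatch: the cheapest sources are used first, coal last.
# Marginal costs and available capacities are invented for illustration only.
sources = [                        # (name, $/MWh marginal cost, available GW)
    ("wind/solar/hydro", 0, 45),
    ("nuclear", 10, 20),
    ("natural gas", 30, 60),
    ("coal", 35, 40),
]

def dispatch(demand_gw):
    plan = []
    for name, cost, available_gw in sorted(sources, key=lambda s: s[1]):
        used = min(available_gw, demand_gw)
        if used > 0:
            plan.append((name, used))
        demand_gw -= used
        if demand_gw <= 0:
            break
    return plan

# With low springtime demand, coal may not be called on at all.
print(dispatch(70))  # [('wind/solar/hydro', 45), ('nuclear', 20), ('natural gas', 5)]
```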

Coronavirus aside, the transition has been a long time coming. “Renewables have been on an inexorable rise for the last 10 years, increasingly eating coal’s lunch,” says Mike O’Boyle, director of electricity policy at Energy Innovation, a San Francisco-based think tank. The average coal plant in the U.S. is 40 years old, and these aging, inefficient plants are finding it increasingly difficult to compete against ever-cheaper renewable energy sources.

A decade ago, the average coal plant ran at about 67% of its capacity. Today, that figure has dropped to 48%. And in the next five years, coal capacity is expected to fall to two-thirds of 2014 levels—a decline of 90 gigawatts (GW)—as more and more plants shut down.
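Those percentages are capacity factors, that is, the energy a plant actually generates divided by what it would produce if it ran flat out all year. A worked example, using an assumed 1-gigawatt plant and assumed annual output:

```python
# Capacity factor = energy actually generated / (nameplate power * hours).
# The plant size and annual output below are assumptions for illustration.
nameplate_mw = 1_000        # a hypothetical 1-GW coal plant
hours_per_year = 8_760

annual_output_mwh = {
    "a decade ago": 5_870_000,   # implies roughly 67 percent
    "today": 4_200_000,          # implies roughly 48 percent
}

for label, energy_mwh in annual_output_mwh.items():
    capacity_factor = energy_mwh / (nameplate_mw * hours_per_year)
    print(f"{label}: capacity factor = {capacity_factor:.0%}")
```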

“And that’s without policy changes that we anticipate will strengthen in the U.S., in which more than a third of people are in a state, city, or utility with a 100% clean energy goal,” says O’Boyle. Already, 30 states have renewable portfolio standards, or policies designed to increase electricity generation from renewable resources.

The transition towards renewables is being observed all across the world. Global use of coal-powered electricity fell 3% last year, the biggest drop in nearly four decades of records. In Europe, the figure was 24%. The region has been remarkably progressive in its march towards renewable energy—last month saw both Sweden and Austria closing their last remaining coal plants, while the U.K. went through its longest coal-free stretch (35 days) since the Industrial Revolution more than 230 years ago.

But coal is still king in many parts of the world. For developing countries where electricity can be scarce and unreliable, the fossil fuel is often seen as the best option for power. 

The good news, however, is that the world’s two largest consumers of coal are investing heavily in renewables. Although China is still heavily reliant on coal, it also boasts the largest capacity of wind, solar, and hydropower in the world today. India, with its abundant sunshine, is pursuing an aggressive solar plan. It is building the world’s largest solar park, and Prime Minister Narendra Modi has pledged that the country will produce 100 GW of solar power—five times what the U.S. generates—by 2022.

Today, renewable energy sources offer the cheapest form of power in two-thirds of the world, and they look set to get cheaper. They now meet up to 30% of global electricity demand, a figure that is expected to grow to 50% by 2050. As a recent United Nations report put it: renewables are now “looking all grown up.”

Coronavirus Pandemic Upends Research Plans

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/energy/the-smarter-grid/coronavirus-pandemic-upends-research-plans

IEEE COVID-19 coverage logo, link to landing page

The COVID-19 pandemic has affected virtually every facet of life, including scientific research carried out at companies and universities around the world.

As biomedical researchers scramble to find a treatment or vaccine, other scientists and engineers try to continue their own work in the midst of a pandemic. In some cases, this means writing a paper or grant from home rather than at the office. But in many others, the disruption is more pronounced.

Most academic, government, and corporate labs in the United States have scaled back operations or closed temporarily to comply with stay-at-home orders. The impacts of these changes can vary greatly from one field to the next, depending on the nature of the work.

John Verboncoeur, a director of IEEE and associate dean for research and graduate studies at Michigan State University, says, “Our surveys indicate that theoretical research teams—my own included—are operating at around 80 to 90 percent efficiency, with the main challenge being the ability to explain complicated concepts without our traditional ‘waving hands about’ and interactive work at the white- or blackboard.”

For experimentalists, the pandemic is more disruptive, although some experiments may be completed from home. “The early focus [for experimentalists] was on catching up on the literature, completing manuscripts, analyzing existing data, and so on, which led to a productivity of 50 percent or so,” says Verboncoeur. “However, much of that is coming to completion, and we are seeing productivity drop as the activities narrow down to designing upcoming experiments and protocols.”

Engineers in many fields are looking for new ways to remain innovative and productive. Take, for example, those in the green energy sector. While some climate and energy research may continue from home, other projects are more difficult or impossible to complete remotely.

Sally Benson’s lab at Stanford University does a mix of theory, modeling, and experiments to support the transition to a low-carbon future, including studies related to carbon capture and storage. While the theory and modeling aspects of this research are easy enough to continue, the experimental work involves analyzing rock samples at the extreme temperatures and pressures found in underground reservoirs—tests that aren’t feasible to carry out at home.

Despite this limitation, Benson’s group is still finding ways to continue with some aspects of their experimental work. “The good news is that as experimentalists, we tend to collect way, way more data than we can assimilate,” she says. “We generate these immensely rich data sets, where there’s plenty more we can mine out of those data sets.”

The group is now returning to its old data sets and reanalyzing the data to answer new, unexplored questions, in part by applying machine learning. By doing so, the researchers have uncovered previously unknown ways that carbon dioxide interacts with rock. Benson acknowledges, however, that this reuse of old experimental data can’t go on forever.
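Spectrum’s reporting does not detail the group’s pipeline, but the general pattern of mining an archived experimental data set with machine learning looks something like the hypothetical sketch below; the column names, the stand-in data, and the choice of model are assumptions, not Benson’s actual workflow:

```python
# Hypothetical sketch of mining an archived experimental data set with a
# machine-learning model. Column names, the stand-in data, and the model
# choice are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# In practice this would load archived lab runs from disk; synthetic
# stand-in data is generated here so the sketch runs on its own.
rng = np.random.default_rng(0)
archive = pd.DataFrame({
    "pressure_mpa": rng.uniform(5, 20, 500),
    "temperature_c": rng.uniform(30, 90, 500),
    "porosity": rng.uniform(0.1, 0.3, 500),
})
archive["co2_saturation"] = (
    0.02 * archive["pressure_mpa"] + 0.5 * archive["porosity"]
    + rng.normal(0, 0.01, 500)
)

X = archive[["pressure_mpa", "temperature_c", "porosity"]]
y = archive["co2_saturation"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("Held-out R^2:", round(model.score(X_test, y_test), 3))
print("Feature importances:", dict(zip(X.columns, model.feature_importances_.round(3))))
```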

Further up the coast, at the University of Washington, Brian Johnson is leading two projects funded by the U.S. Department of Energy. Both are designed to facilitate a major shift from electromechanical power grids to grids based on power-electronics systems that will better support renewable energy.

One project involves the design of controllers for these new power grids. The effort launched in April just as the pandemic was taking hold in the United States, but the team was able to get the research started by focusing on pen-and-paper designs and software simulations.

However, the pandemic may prove more problematic for Johnson’s second endeavor. It involves the design of a new breed of high-efficiency power electronics that converts DC power from solar cells into grid-compatible AC power. “For that project, we have a heavy set of milestones coming up in the summer months to actually demonstrate the hardware,” says Johnson. “If we can’t do [tests] in the summer, we’re going to have to start coming up with some contingency plans. Since these experiments necessitate a power lab with specialized equipment, they cannot be done in our homes.”

While the pandemic affects each research project to varying degrees, its overall impact on the broader shift toward green tech—and on the state of engineering research more generally—is still unclear.

Benson says she’s slightly concerned that the pandemic may cause some researchers to shift their focus from climate change to medicine. “To me, the COVID-19 pandemic is sort of a multiyear challenge and a short-term nightmare,” she says. “If we’re not careful, climate change will be a decadal-scale nightmare. So this work needs all of the attention it can get.”

Johnson is less concerned that the pandemic will interfere with the advancement of green tech, saying: “I think that energy is such an integral part of modern life itself and infrastructure that I don’t perceive [the COVID-19 pandemic] fundamentally altering the fact that we all need energy, and cheap energy.”

This article appears in the June 2020 print issue as “COVID-19 Disrupts Research Plans.”

Next-Gen Solar Cells Can Harvest Indoor Lighting for IoT Devices

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/renewables/next-gen-solar-cells-harvest-indoor-lighting-iot-devices

Billions of Internet-connected devices now adorn our walls and ceilings, sensing, monitoring, and transmitting data to smartphones and far-flung servers. As gadgets proliferate, so too does their electricity demand and need for household batteries, most of which wind up in landfills. To combat waste, researchers are devising new types of solar cells that can harvest energy from the indoor lights we’re already using.

The dominant material used in today’s solar cells, crystalline silicon, doesn’t perform as well under lamps as it does beneath the blazing sun. But emerging alternatives—such as perovskite solar cells and dye-sensitized materials—may prove to be significantly more efficient at converting artificial lighting to electrical power. 

A group of researchers from Italy, Germany, and Colombia is developing flexible perovskite solar cells specifically for indoor devices. In recent tests, their thin-film solar cell delivered power conversion efficiencies of more than 20 percent under 200 lux, the typical amount of illuminance in homes. That’s about triple the indoor efficiency of polycrystalline silicon, according to Thomas Brown, a project leader and engineering professor at the University of Rome Tor Vergata.

Slow, Steady Progress for Two U.S. Nuclear Power Projects

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/energy/nuclear/slow-steady-progress-for-two-us-nuclear-power-projects

There are 53 nuclear reactors currently under construction around the world. Only two are in the United States, once the world’s leader in nuclear energy development. And those two reactors represent expansions of a preexisting two-reactor facility, Plant Vogtle in Waynesboro, Ga.

Separately, a company in Portland, Ore., called NuScale Power is now working with the U.S. Nuclear Regulatory Commission to develop a next-generation reactor built around a smaller-scale, modular design.

These two projects together represent the leading edge of commercial U.S. nuclear-fission reactor development today. The fact that there are only two raises questions about the direction of this once-booming energy sector. Is the United States redirecting its focus onto fusion and leaving fission behind? Or could a fission renaissance be yet to come?

Congress upped the U.S. Department of Energy’s nuclear fusion budget from US $564 million to $671 million for fiscal year 2020. And such companies as AGNI Energy in Washington state and Commonwealth Fusion Systems in Massachusetts (alongside Tokamak Energy and General Fusion in the United Kingdom and Canada, respectively) are courting venture capital for their multimillion-dollar visions of fusion’s bright future.

Meanwhile, in March, construction workers at the Vogtle fission plant hoisted a 680,000-kilogram steel-and-concrete structure to cap one of the containment vessels for the new AP1000 reactors. As John Kraft, a spokesperson for Georgia Power, explained, “The shield building is a unique feature of the AP1000 reactor design for Vogtle 3 and 4, providing an additional layer of safety around the containment vessel and nuclear reactor to protect the structure from any potential impacts.”

The AP1000 pressurized-water reactor, designed by Westinghouse, is a 21st-century “new” reactor. It’s been deployed in just two other places, both in China, with two AP1000 reactors at the Sanmen Nuclear Power Station in Zhejiang province and two at the Haiyang Nuclear Power Plant in Shandong province. According to the International Atomic Energy Agency (IAEA), the AP1000 reactors at these locations operate at 1,157 megawatts and 1,126 MW, respectively.

In 2005, the Nuclear Regulatory Commission (NRC) certified the AP1000 design, clearing the way for its sale and installation at these three sites more than a decade later. Last year, Dan Brouillette, the U.S. secretary of energy, wrote in a blog post: “The U.S. Department of Energy (DOE) is all in on new nuclear energy.”

NuScale’s modular design—with 12 smaller reactors, each operating at a projected 60 MW—received NRC Phase 4 approval at the end of last year. According to Diane Hughes, vice president of marketing and communications for NuScale, “This means that the technical review by the NRC is essentially complete and that the final design approval is expected on schedule by September 2020.”

NuScale’s first customer, the Utah Associated Municipal Power Systems, plans to install a power plant with NuScale reactors at the Idaho National Laboratory site in Idaho Falls. The plant, Hughes said, is “slated for operation by the mid-2020s based on the NRC’s approved design.”

The idea of harnessing multiple smaller reactors in a single design is not new, dating back as far as the 1940s. At the time, the economics of the smaller, modular design could not compete with bigger, individual reactors, says M.V. Ramana, a nuclear physicist and professor at the University of British Columbia’s School of Public Policy and Global Affairs.

“Nuclear power is unlike almost any other energy technology, in that it’s the one tech where the costs have gone up, not down, with experience,” he said. “The way to think about it is that the more experience we have with nuclear power, the more we learn about potential vulnerabilities that can lead to catastrophic accidents.”

However, Hughes of NuScale counters that, unlike the 54 competing small modular reactor designs that the IAEA has records of, NuScale is “the first ever small modular reactor technology to undergo…NRC design certification review.”

And in 2018, an interdisciplinary MIT report on nuclear energy found that NuScale’s reactor is “quite innovative in its design. It has virtually eliminated the need for active systems to accomplish safety functions, relying instead on a combination of passive systems and the inherent features of its geometry and materials.”

Of course, while the number of catastrophic nuclear accidents (such as Three Mile Island, Chernobyl, and Fukushima) is small for the amount of energy that nuclear power has generated over the past 70 years, Ramana adds, the cost of each accident is astronomical—claiming human lives and uprooting untold numbers more from disaster zones, as well as requiring cleanups that cost hundreds of billions of dollars. “One every other decade is not good enough,” Ramana said.

This article appears in the June 2020 print issue as “Limited Progress for U.S. Nuclear.”

Team Sonnenwagen Prep for Race Across the Outback

Post Syndicated from Harwin original https://spectrum.ieee.org/energy/environment/team_sonnenwage_prep_for_race_across_the_outback

Harwin’s Interconnect Guru caught up with the team from RWTH Aachen University as they prepared to embark on a journey across the globe, travelling from northwest Germany to Darwin, Australia, for the Bridgestone World Solar Challenge.

What is their motivation, and what lessons have they learned from last year that will help them gain pole position?

“Climate change and resource depletion are threatening our civilization and emphasize the importance of developing renewable energy alternatives. Our intention with Sonnenwagen is not only to bring these two issues to light, but also to show the potential of efficient solar technology. If you speak to any team member, they’ll say they want to be a part of a real-world application that promotes an environmentally friendly, renewable approach.

All of us are looking to make the most of our time in university, and being involved in a project combating climate change and helping protect the planet is very rewarding.”

“The aerodynamics are vital. We spent 18 months performing computational fluid dynamics simulations and carried out multiple wind tunnel tests to determine the optimal design, while still considering chassis structure. Simulations were also done on various carbon fiber-reinforced composites and geometries. Data from all of these activities was then compiled to create a final digital prototype.”

Physical Modeling of Supercapacitors and Lithium-Ion Capacitors

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/physical_modeling_of_supercapacitors_and_lithium-ion_capacitors

While lithium-ion batteries are the dominant rechargeable devices visible to the public and in the headlines, other electrochemical energy storage technologies will have a complementary role in industrial decarbonization. Supercapacitors, also called electrochemical double-layer capacitors, typically store less energy than lithium-ion batteries, but allow rapid delivery of very high power with high cycling stability. Researchers are increasingly exploring new hybrid devices, such as lithium-ion capacitors (LIC), which can exploit features of both traditional supercapacitors and lithium-ion batteries.
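The tradeoff the webinar explores shows up in a simple comparison of stored energy, using the textbook relation E = ½CV² for a capacitor. The component values below are typical, assumed figures rather than specifications for any particular product:

```python
# Energy stored in a capacitor: E = 1/2 * C * V^2.
# The cell parameters below are typical, assumed figures, not data for any
# specific commercial product.
supercap_capacitance_f = 3_000   # a large supercapacitor cell
supercap_voltage_v = 2.7

supercap_energy_j = 0.5 * supercap_capacitance_f * supercap_voltage_v ** 2
supercap_energy_wh = supercap_energy_j / 3_600   # joules to watt-hours

liion_18650_wh = 3.6 * 3.0   # ~3.6 V nominal, ~3 Ah: about 11 Wh

print(f"Supercapacitor cell: ~{supercap_energy_wh:.1f} Wh")
print(f"Li-ion 18650 cell:   ~{liion_18650_wh:.1f} Wh")
# The supercapacitor stores only a few watt-hours, but it can deliver that
# energy at far higher power and survive many more charge/discharge cycles.
```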

Can Electric Cars on the Highway Emulate Air-to-Air Refueling?

Post Syndicated from Philip E. Ross original https://spectrum.ieee.org/cars-that-think/energy/batteries-storage/will-electric-cars-on-the-highway-emulate-airtoair-refueling

Jet fighters can’t carry a huge tank of fuel because it would slow them down. Instead they have recourse to air-to-air refueling, using massive tanker planes as their gas stations. 

What if electric cars could do the same thing, while zooming down the highway? One car with charge to spare could get in front of another that was short of juice, and the two could extend telescopic charging booms until they linked together magnetically. The charge-rich car would then share some of its largesse. And to complete the aerial analogy, just add a few big “tanker trucks” bearing enormous batteries to beef up the power reserves of an entire flotilla of EVs.

The advantages of the concept are clear: It uses a lot of battery capacity that would otherwise go untapped, and it allows cars to save time in transit by recharging on the go, without taking detours or sitting still while topping off.

Yeah, and the tooth fairy leaves presents under your pillow. We’re too far into the month for this kind of story. Right?

Maybe it’s no April Fools joke. Maybe sharing charge is the way forward, not just for electric cars and trucks on the highways but for other mobile vehicles. That’s the brief of professor Swarup Bhunia and his colleagues in the department of electrical and computer engineering at the University of Florida, in Gainesville.

Bhunia is no mere enthusiast: He has written three features so far for IEEE Spectrum. And his group has published its new proposal on arXiv, an online forum for preprints that have been vetted, if not yet fully peer-reviewed. The researchers call the concept peer-to-peer car charging.

The point is to make a given amount of battery go further and thus to solve the two main problems of electric vehicles—high cost and range anxiety. In 2019 batteries made up about a third of the cost of a mid-size electric car, down from half just a few years ago, but still a huge expense. And though most drivers typically cover only short distances, they usually want to be able to go very far if need be.

Mobile charging works by dividing a car’s battery pack into independent banks of cells. One bank runs the motor; the other accepts charge. If the power source is a battery-bearing truck, you can move a great deal of power—enough for “an extra 20 miles of range,” Bhunia suggests. True, even a monster truck can charge only one car at a time, but each newly topped-off car will then be in a position to spare a few watt-hours for other cars it meets down the road.
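The arXiv paper works out the scheme in far more detail, but the basic bookkeeping of a split pack can be sketched in a few lines. Every number below, from pack sizes to transfer power to energy consumption per mile, is an illustrative assumption rather than a figure from the paper:

```python
# Rough sketch of peer-to-peer charge transfer between two moving EVs.
# Pack sizes, transfer power, and per-mile consumption are illustrative
# assumptions, not figures from the University of Florida proposal.
WH_PER_MILE = 250   # assumed EV energy consumption

class EV:
    def __init__(self, name, drive_bank_kwh, receive_bank_kwh=0.0):
        self.name = name
        self.drive_bank_kwh = drive_bank_kwh      # bank that runs the motor
        self.receive_bank_kwh = receive_bank_kwh  # bank that accepts charge

    def range_miles(self):
        total_wh = (self.drive_bank_kwh + self.receive_bank_kwh) * 1_000
        return total_wh / WH_PER_MILE

def transfer(donor, receiver, power_kw, minutes):
    energy_kwh = power_kw * minutes / 60
    donor.drive_bank_kwh -= energy_kwh
    receiver.receive_bank_kwh += energy_kwh
    return energy_kwh

rich = EV("charge-rich car", drive_bank_kwh=60)
low = EV("low-charge car", drive_bank_kwh=8)

moved_kwh = transfer(rich, low, power_kw=50, minutes=6)   # 5 kWh in six minutes
print(f"Transferred {moved_kwh:.0f} kWh; the low car gains "
      f"~{moved_kwh * 1_000 / WH_PER_MILE:.0f} miles of range")
```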

We already have the semblance of such a battery truck. We recently wrote about a concept from Volkswagen to use mobile robots to haul batteries to stranded EVs. And even now you can buy a car-to-car charge-sharing system from the Italian company Andromeda, although so far no one seems to have tried using it while in motion.

If all the cars participated (yes, a big “if”), then you’d get huge gains. In computer modeling, done with the traffic simulator SUMO, the researchers found that EVs had to stop to recharge only about a third as often. What’s more, they could manage things with nearly a quarter less battery capacity.

A few disadvantages suggest themselves. First off, how do two EVs dock while barreling down the freeway? Bhunia says it would be no strain at all for a self-driving car, when that much-awaited creature finally comes. Nothing can hold cars in tandem more precisely than a robot. But even a human being, assisted by an automatic system, might be able to pull off the feat, just as pilots do during in-flight refueling.

Then there is the question of reciprocity. How many people are likely to lend their battery to a perfect stranger? 

“They wouldn’t donate power,” explains Bhunia. “They’d be getting the credit back when they are in need. If you’re receiving charge within a network”—like ride-sharing drivers for companies like Uber and Lyft—“one central management system can do it. But if you want to share between networks, that transaction can be stored in a bank as a credit and paid back later, in kind or in cash.”

In their proposal, the researchers envisage a central management system that operates in the cloud.
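Such a clearing system could, at its simplest, amount to a ledger of energy credits that nets out over time. The sketch below is a hypothetical illustration of that bookkeeping, not the design described in the proposal:

```python
# Hypothetical energy-credit ledger for peer-to-peer charge sharing.
from collections import defaultdict

class ChargeLedger:
    def __init__(self):
        self.balance_kwh = defaultdict(float)   # positive = owed energy back

    def record_transfer(self, donor, receiver, kwh):
        self.balance_kwh[donor] += kwh
        self.balance_kwh[receiver] -= kwh

ledger = ChargeLedger()
ledger.record_transfer("car_A", "car_B", 5.0)   # A lends B 5 kWh on the road
ledger.record_transfer("car_B", "car_C", 3.0)   # later, B lends C 3 kWh

print(dict(ledger.balance_kwh))
# {'car_A': 5.0, 'car_B': -2.0, 'car_C': -3.0} -- balances could be settled
# in kind on a later trip or, across networks, paid out in cash.
```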

Any mobile device that moves about on its own can benefit from a share-and-share-alike charging scheme. Delivery bots would make a lot more sense if they didn’t have to spend half their time searching for a wall socket. And being able to recharge a UAV from a moving truck would greatly benefit any operator of a fleet of cargo drones, as Amazon appears to want to do.

Sure, road-safety regulators would pop their corks at the very mention of high-speed energy swapping. And, yes, one big advance in battery technology would send the idea out the window, as it were. Still, if aviators could share fuel as early as 1923, drivers might well try their hand at it a century later.

Visiting the City That Built the Hanford Nuclear Site

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/nuclear/visiting-the-city-that-build-hanford-nuclear-site

As my plane approaches the Tri-Cities Airport in south-central Washington state, the sandy expanse outside my window gives way to hundreds of bright green circles and squares. From this semi-arid terrain springs an irrigated oasis of potatoes, hops, peaches, and sweet corn. Just beyond our view is one of the most contaminated places in the world: the Hanford Site, home to 177 aging tanks of radioactive waste.

A Bright Spot for Solar Windows Powered By Perovskites

Post Syndicated from Sandy Ong original https://spectrum.ieee.org/energywise/energy/renewables/solar-windows-powered-perovskites

To most of us, windows are little more than glass panes that let light in and keep bad weather out. But to some scientists, windows represent possibility—the chance to take passive parts of a building and transform them into active power generators.

Anthony Chesman is one researcher working to develop such solar windows. “There are a lot of windows in the world that aren’t being used for any other purpose than to allow lighting in and for people to see through,” says Chesman, who is from Australia’s national science agency CSIRO. “But really, there’s a huge opportunity there in turning those windows into a space that can also generate electricity,” he says.

Here Are the U.S. Regions Most Vulnerable to Solar Storms

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/energywise/energy/the-smarter-grid/us-regions-most-vulnerable-solar-storms

A new study about solar-induced power outages in the U.S. electric grid finds that a few key regions—a portion of the American Midwest and the Eastern Seaboard—appear to be more vulnerable than others.

The good news is that a few preventative measures could drastically reduce the damage done when a solar storm hits Earth. Those include stockpiling electrical transformers in national strategic reserves.

Jeffrey Love is a research geophysicist at the U.S. Geological Survey (USGS) in Golden, Colorado and co-author of the new USGS solar geoelectric hazard study. He’s one of many voices in the worldwide geophysical community warning that geoelectric “perfect storms” will happen—it’s not a question of if, but when. Such storms can last between one and three days. 

Love explains that solar flares and coronal mass ejections traveling through space can slam into Earth’s magnetic field and generate powerful electric and magnetic fields at ground level. These magnetic storms can occasionally be intense enough to interfere with the operation of high-voltage electricity lines.

Depending on the geology of a given region, the currents a geomagnetic storm induces in the power lines can destabilize the power grid’s operation and cause damage to (or even destroy) transformers. 
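To put rough numbers on that mechanism, consider a single long transmission line. The geoelectric field strength, line length, and circuit resistance below are illustrative assumptions (real assessments, including the USGS study, model the full network and local ground conductivity):

```python
# Back-of-the-envelope geomagnetically induced current (GIC) estimate for a
# single long line. All numbers are illustrative assumptions, not values
# taken from the USGS hazard study.
geoelectric_field_v_per_km = 5.0   # strong-storm-level geoelectric field
line_length_km = 200               # long high-voltage transmission line
loop_resistance_ohm = 5.0          # line plus transformer windings and grounds

driving_voltage_v = geoelectric_field_v_per_km * line_length_km
gic_amps = driving_voltage_v / loop_resistance_ohm

print(f"Induced driving voltage: {driving_voltage_v:.0f} V")
print(f"Quasi-DC current through transformer neutrals: {gic_amps:.0f} A")
# Quasi-DC currents of this order can push transformer cores into saturation,
# causing heating and harmonics and, in the worst case, permanent damage.
```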

Peak Oil, a Specimen Case of Apocalypic Thinking

Post Syndicated from Vaclav Smil original https://spectrum.ieee.org/energy/fossil-fuels/peak-oil-specimen-case-apocalypic-thinking

Predictions of when we would run out of oil have been around for a century, but the idea that peak production was rapidly approaching and that it would be followed by ruinous decline gained wider acceptance thanks to the work of M. King Hubbert, an American geologist who worked for Shell in Houston.

In 1956, he predicted that U.S. oil output would top out during the late 1960s; in 1969 he placed it in the first half of the 1970s. In 1970, when the actual peak came—or appeared to come—it was nearly 20 percent above Hubbert’s prediction. But few paid attention to the miss—the timing was enough to make his name as a prophet and give credence to his notion that the production curve of a mineral resource was symmetrical, with the decline being a mirror image of the ascent.
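Hubbert’s curve is simply the derivative of a logistic: cumulative extraction follows an S-shape toward an assumed ultimately recoverable total, so annual production traces a symmetric bell around the peak year. A minimal sketch of the model, with invented parameters:

```python
# Hubbert's symmetric production curve: the derivative of a logistic.
# The recoverable total, growth rate, and peak year are invented for
# illustration only.
import numpy as np

urr = 200.0      # ultimately recoverable resource (arbitrary units)
k = 0.08         # growth rate
t_peak = 1970    # assumed peak year

years = np.arange(1900, 2051)
cumulative = urr / (1 + np.exp(-k * (years - t_peak)))    # logistic S-curve
production = k * cumulative * (1 - cumulative / urr)      # symmetric bell

print("Model peak year:", int(years[np.argmax(production)]))   # 1970 by construction
# In this idealized model the decline mirrors the ascent exactly; as noted
# below, real-world output has refused to cooperate.
```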

But reality does not follow perfect models, and by the year 2000, actual U.S. oil output was 2.3 times as much as indicated by Hubbert’s symmetrically declining forecast. Similarly, his forecasts of global peak oil (either in 1990 or in 2000) soon unraveled. But that made no difference to a group of geologists, most notably Colin Campbell, Kenneth Deffeyes, L.F. Ivanhoe, and Jean Laherrère, who saw global oil peaking in 2000 or 2003. Deffeyes set it, with ridiculous accuracy, on Thanksgiving Day, 24 November 2005.

These analysts then predicted unprecedented economic disruptions. Ivanhoe went so far as to speak of “the inevitable doomsday” followed by “economic implosion” that would make “many of the world’s developed societies look more like today’s Russia than the United States.” Richard C. Duncan, an electrical engineer, proffered his “Olduvai theory,” which held that declining oil extraction would plunge humanity into a life comparable to that of the hunter-gatherers who lived near the famous Tanzanian gorge some 2.5 million years ago.

In 2006, I reacted [PDF] to these prophecies by noting that “the recent obsession with an imminent peak of oil extraction has all the marks of a catastrophist apocalyptic cult.” I concluded that “conventional oil will become relatively a less important part of the world’s primary energy supply. But this spells no imminent end of the oil era as very large volumes of the fuel, both from traditional and nonconventional sources, will remain on the world market during the first half of the 21st century.”

And so they have. With the exception of a tiny (0.2 percent) dip in 2007 and a larger (2.5 percent) decline in 2009 (following the economic downturn), global crude oil extraction has set new records every year. In 2018, at nearly 4.5 billion metric tons, it was nearly 14 percent higher than in 2005.

A large part of this gain has been due to expansion in the United States, where the combination of horizontal drilling and hydraulic fracturing made the country, once again, the world’s largest producer of crude oil, about 16 percent ahead of Saudi Arabia and 19 percent ahead of Russia. Instead of following a perfect bell-shaped curve, since 2009 the trajectory of U.S. crude oil extraction has been on the rise, and it is now surpassing the record set in 1970.

As for the global economic product, in 2019 it was 82 percent higher, in current prices, than in 2005, a rise enabled by the adequate supply of energy in general and crude oil in particular. I’d like to think that there are many lessons to be learned from this peak oil–mongering, but I have no illusions: Those who put models ahead of reality are bound to make the same false calls again and again.

This article appears in the May 2020 print issue as “Peak Oil: A Retrospective.”

How Engineers Kept the Power On in India

Post Syndicated from Edd Gent original https://spectrum.ieee.org/energywise/energy/the-smarter-grid/how-engineers-kept-power-india

Earlier this month, Indian Prime Minister Narendra Modi asked the entire country to simultaneously switch off the lights. The gesture was meant to be a show of solidarity during the coronavirus pandemic, but Modi’s request left power engineers scrambling to prevent a nationwide blackout. 

In a televised address on the afternoon of Friday, 3 April, Modi called on Indians to switch off their lights for nine minutes beginning at 9PM on Sunday, 5 April. Within hours, experts raised concerns that the massive drop in electricity demand, followed by a sudden surge nine minutes later, could debilitate the grid and trigger widespread blackouts.
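The worry has a simple physical basis: any sudden mismatch between generation and load makes grid frequency ramp at a rate set by the size of the imbalance and the inertia of the spinning machines online. A rough estimate using the standard swing-equation relation, with hypothetical numbers that are not drawn from POSOCO’s actual planning:

```python
# Rate of change of frequency (RoCoF) after a sudden load drop:
#   df/dt = (delta_P / S_system) * f0 / (2 * H)
# All numbers below are hypothetical and chosen only to show the scale.
f0_hz = 50.0        # Indian grid nominal frequency
inertia_h_s = 5.0   # assumed average inertia constant of online generation
system_gva = 150.0  # assumed online generating capacity
load_drop_gw = 10.0 # assumed lighting load switched off near-simultaneously

rocof_hz_per_s = (load_drop_gw / system_gva) * f0_hz / (2 * inertia_h_s)
print(f"Initial frequency rise: ~{rocof_hz_per_s:.2f} Hz per second")
# Sustained for even a few seconds, a swing like this can trip protection
# relays unless generation is ramped down (and back up) in careful lockstep.
```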

Despite the warnings, the government stood by its request. So India’s power sector had just two days to come up with a strategy to protect the grid. “It was a very challenging situation,” says Sushil Kumar Soonee, a former CEO of grid operator Power System Operation Corporation (POSOCO) who still advises the organization and was involved in planning the response. (India’s Ministry of Power didn’t respond to a request for comment.)