Tag Archives: energy

How an Australian State Fought Back Against Grid-Sparked Wildfires

Post Syndicated from Peter Fairley original https://spectrum.ieee.org/energywise/energy/the-smarter-grid/how-an-australian-state-faced-devastation-from-gridsparked-wildfires

Nine years before Paradise, California burned to the ground, a similar tragedy unfolded in Australia. On a searing, windy day in 2009 that came to be known as “Black Saturday,” hundreds of fires erupted in the state of Victoria. One of the worst razed the bucolic mountain town of Marysville, northeast of Melbourne. And just as sparks from a Pacific Gas & Electric (PG&E) power line launched the Camp Fire that destroyed Paradise, Marysville’s undoing began with high-voltage current.

In all, the Black Saturday fires killed 173 people and caused an estimated AU $4 billion (US $2.75 billion) in damage. Fires started by power lines caused 159 of the deaths.

California’s wildfires have “brought it all back,” says Tony Marxsen, an electrical engineering professor at Monash University in Australia. His parents honeymooned in Marysville. “It was a lovely little town nestled up in the hills. To see it destroyed was just wrenching,” he recalls.

Marxsen says faded memories increased Marysville’s death toll. “It had been 26 years since Australia’s last major suite of deadly fires,” he says. “People had come to believe that they could defend their house against a firestorm. Some stayed, and they all died.”

While they go by different names, California’s wildfires and Victoria’s bushfires are driven by the same combination of electrical networks and extreme weather, stoked by climate change. How Victoria responded after the Black Saturday fires—work that continues today—differs significantly from what is now happening in California, especially in PG&E’s territory.

How Tech From Australia Could Prevent California Wildfires and PG&E Blackouts

Post Syndicated from Peter Fairley original https://spectrum.ieee.org/energywise/energy/the-smarter-grid/breaking-pges-cycle-of-blackouts-and-wildfires

California utility Pacific Gas & Electric (PG&E) delivered a bitter pill last month when it said that deliberate blackouts to keep its lines from sparking wildfires could be the new normal for millions of customers for the next decade—a dangerous disruption to power-dependent communities that California governor Gavin Newsom says “no state in the 21st Century should experience.” Grid experts say Newsom is right, because technology available today can slash the risk of grid-induced fires, reducing or eliminating the need for PG&E’s “public safety power shutoffs.”

Equipment to slash grid-related fire risk isn’t cheap or problem-free, but could be preferable to the most commonly-advanced solutions: putting lines underground or equipping California with thousands of “microgrids” to reduce reliance on big lines. Widespread undergrounding and microgrids will be costly. And the latter could create inequalities and weaken investment in the big grids as communities with means isolate themselves from power shutoffs with solar systems and batteries.

Some of the most innovative fire-beating grid technologies are the products of an R&D program funded by the state of Victoria in Australia, prompted by deadly grid-sparked bushfires there 10 years ago. Early this year, utilities in Victoria began a massive rollout of one solution: power diverters that are expected to protect all of the substations serving the state’s high fire risk areas by 2024.

A Retired JPL Engineer’s Journey: From Space Probes to Carbon-Neutral Farming

Post Syndicated from Jay Schmuecker original https://spectrum.ieee.org/energy/environment/a-retired-jpl-engineers-journey-from-space-probes-to-carbonneutral-farming

You could say that farming is in my blood: My grandparents on both sides ran large, prosperous farms in Iowa. One of my fondest childhood memories is of visiting my maternal grandparents’ farm and watching the intricate moving mechanisms of the threshing machine. I guess it’s not surprising that I eventually decided to study mechanical engineering at MIT. I never really considered a career in farming.

Shortly after I graduated in 1957 and took a job with the California Institute of Technology’s Jet Propulsion Lab, the Soviets launched Sputnik. I was at the right place at the right time. JPL was soon transferred to the newly formed NASA. And for more than 50 years, I worked with some of the brightest engineers in the world to send unmanned spacecraft—including Mariner, Viking, and Voyager—to all the other planets in the solar system.

But my love of farms and farming never went away, and in 1999, I purchased my paternal grandfather’s 130-hectare (320-acre) property, Pinehurst Farm, which had been out of the family for 55 years. I wasn’t exactly sure what I’d do with the place, but by the time I retired in 2007, there was more and more talk about climate change due to human-caused carbon emissions. I knew that agriculture has a large carbon footprint, and I wondered if there was a way to make farming more sustainable. After all, the most recent numbers are alarming: The World Meteorological Organization reports that the planet is on course for a rise in temperature of 3 to 5 °C by 2100. The U.S. Environmental Protection Agency estimates that agriculture and forestry accounted for almost 10 percent of U.S. greenhouse gas emissions in 2016. While a significant share of those emissions comes from livestock (that is, belches and flatulence), much of the rest comes from burning fuel to grow, harvest, and transport food, as well as from fertilizer production.

I recalled a conversation I’d had with my dad and his friend, Roy McAlister, right after I acquired the farm. Roy was the president of the American Hydrogen Association, and he owned a hydrogen-powered Nissan pickup truck. Both men were vocal advocates for replacing fossil fuels with hydrogen to reduce the United States’ dependence on oil imports. The same transition would also have a big impact on carbon emissions.

And so, in 2008, I decided to create a solar-hydrogen system for Pinehurst Farm as a memorial to my father. I’d use solar power to run the equipment that would generate fuel for a hydrogen-burning tractor. Several years into the project, I decided to also make ammonia (nitrogen trihydride, or NH3) to use as tractor fuel and crop fertilizer.

My aim is to make the public—especially farmers—aware that we will need to develop such alternative fuels and fertilizers as fossil fuels become depleted and more expensive, and as climate change worsens. Developing local manufacturing processes to generate carbon-free fuel and fertilizer and powering those processes with renewable energy sources like solar and wind will eliminate farmers’ reliance on fossil fuels. And doing this all locally will remove much of the cost of transporting large amounts of fuel and fertilizers as well. At our demonstration project at Pinehurst, my colleague David Toyne, an engineer based in Tujunga, Calif., and I have shown that sustainable farming is possible. But much like designing spacecraft, the effort has taken a little longer and presented many more challenges than we initially expected.

The system that we now have in place includes several main components: a retrofitted tractor that can use either hydrogen or ammonia as fuel; generators to create pure hydrogen and pure nitrogen, plus a reactor to combine the two into ammonia; tanks to store the various gases; and a grid-tied solar array to power the equipment. When I started, there were no other solar-hydrogen farms on which I could model my farm, so every aspect had to be painstakingly engineered from scratch, with plenty of revisions, mishaps, and discoveries along the way.

The work began in earnest in 2009. Before actually starting to build anything, I crunched the numbers to see what would be needed to pull off the project. I found that a 112-kilowatt (150-horsepower) tractor burns about 47 liters per hectare (5 gallons per acre) if you’re raising corn and about two-thirds that amount for soybeans. The same area would require 5 kilograms of hydrogen fuel. That meant we needed roughly 1,400 kg of hydrogen to fuel the tractor and other farm vehicles from planting to harvest. Dennis Crow, who farms the Pinehurst land, told me about half the fuel would go toward spring planting and half for fall harvesting. The growing season in Iowa is about 150 days, so we’d need to make about 4.5 kg of hydrogen per day to have 700 kg of hydrogen for the harvest. Spring planting would be easier—we would have 215 days of the year to make the remaining fuel.
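
To make that fuel budget concrete, here is a minimal back-of-the-envelope sketch using only the figures quoted above; the constant names are mine, not part of any real planning tool.

```python
# Rough hydrogen fuel budget for the full-size farm, using the figures in the text.
SEASON_TOTAL_KG = 1400.0       # hydrogen for all vehicles, planting through harvest
HARVEST_SHARE = 0.5            # roughly half the fuel goes to the fall harvest
GROWING_SEASON_DAYS = 150      # Iowa growing season
OFF_SEASON_DAYS = 365 - GROWING_SEASON_DAYS

harvest_kg = SEASON_TOTAL_KG * HARVEST_SHARE                       # ~700 kg
harvest_rate = harvest_kg / GROWING_SEASON_DAYS                    # ~4.7 kg/day ("about 4.5" in the text)
planting_rate = (SEASON_TOTAL_KG - harvest_kg) / OFF_SEASON_DAYS   # ~3.3 kg/day

print(f"Harvest: {harvest_kg:.0f} kg of hydrogen -> {harvest_rate:.1f} kg/day over the growing season")
print(f"Planting: {planting_rate:.1f} kg/day over the remaining {OFF_SEASON_DAYS} days")
```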

To generate the hydrogen, we would split water into hydrogen and oxygen. By my calculations, running the hydrogen generator and related equipment would require about 80 kW of solar power. I decided to use two-axis solar arrays, which track the sun to boost the collection capacity by 30 percent. Based on the efficiency of commodity photovoltaic panels in 2008, we’d need 30 solar arrays, with each array holding 12 solar panels.
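
The array count follows from the generator’s power draw and the panel wattage of the era. The 225-watt module rating below is my assumption for a 2008-era panel (the text gives only the 80-kilowatt target and the 12-panel arrays), so treat this as an illustrative sizing sketch.

```python
import math

# Sketch of the solar-array sizing. The 225 W module rating is an assumed value
# for a circa-2008 panel; the text specifies only the 80 kW load and 12 panels
# per two-axis tracking array.
LOAD_KW = 80.0            # hydrogen generator plus related equipment
PANEL_W = 225.0           # assumed nameplate rating per panel
PANELS_PER_ARRAY = 12

arrays_needed = math.ceil(LOAD_KW * 1000 / (PANEL_W * PANELS_PER_ARRAY))
print(f"Arrays needed to supply {LOAD_KW:.0f} kW: {arrays_needed}")  # ~30 arrays

# Two-axis tracking boosts daily energy collection by roughly 30 percent,
# letting the generator run closer to its rating for more hours per day.
```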

That’s a lot of solar panels to install, operate, and maintain, and a lot of hydrogen to generate and store. I soon realized I could not afford to build a complete operational system. Instead, I focused on creating a demonstration system at one-tenth scale, with three solar arrays instead of 30. While the tractor would be full size, we would make only 10 percent of the hydrogen needed to fuel it. I decided that even a limited demonstration would be a worthwhile proof of concept. Now we had to figure out how to make it happen, starting with the tractor.

As it turns out, I wasn’t the first to think of using hydrogen as a tractor fuel. Back in 1959, machinery manufacturer Allis-Chalmers demonstrated a tractor powered by hydrogen fuel cells. Fifty-two years later, New Holland Agriculture did the same. Unfortunately, neither company produced a commercial model. After some further research, I decided that fuel cells were (and still are) far too expensive. Instead, I would have to buy a regular diesel tractor and convert it to run on hydrogen.

Tom Hurd, an architect in Mason City, Iowa, who specializes in renewable-energy installations, assisted with the farm’s overall design. At his suggestion, I contacted the Hydrogen Engine Center in nearby Algona, Iowa. The company’s specialty was modifying internal combustion engines to burn hydrogen, natural gas, or propane. Ted Hollinger, the center’s president, agreed to provide a hydrogen-fueled engine for the tractor.

Hollinger’s design started with a gasoline-fueled Ford 460 V-8 engine block. He suggested that we include a small propane tank as backup in case the tractor ran out of hydrogen out in the field. Several months later, though, he recommended that we use ammonia instead of propane, to avoid fossil fuels completely. Since the idea was to reduce the farm’s carbon footprint, I liked the ammonia idea.

Scott McMains, who looks after the old cars that I store on the farm, located a used 7810 John Deere tractor as well as a Ford 460 engine. The work of installing the Ford engine into the tractor was done by Russ Hughes, who lives in Monticello, Iowa, and was already restoring my 1947 Buick Roadmaster sedan.

The tractor would need to carry several large, heavy fuel tanks for the hydrogen and ammonia. Bob Bamford, a retired JPL structural-design analyst, took a look at my plans for the fuel tanks’ support structure and redesigned it. In my original design, the support structure was bolted together, but Bamford’s design used welds for increased strength. I had the new and improved design fabricated in California.

The completed tractor was delivered to the farm in late 2014. With the flick of a switch in the cab, our tractor can toggle between burning pure hydrogen and burning a mixture of hydrogen and ammonia gas. Pure ammonia won’t burn in an internal combustion engine; you first need to mix it with about 10 percent hydrogen. (The energy content of a gallon of ammonia is about 35 percent that of diesel.) The fuel is mixed with the intake air and injected into the tractor’s computer-controlled, spark-ignited engine cylinders. The tractor can run for 6 hours at full power before it needs to be refueled.
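
A quick energy-equivalence check, using the 35 percent figure above, shows why the tractor needs sizable ammonia tanks; this is only an illustration, not a measurement from the machine.

```python
# Volume of ammonia needed to match the energy in a given volume of diesel,
# using the ~35 percent energy-content figure quoted in the text.
AMMONIA_TO_DIESEL_ENERGY = 0.35

def ammonia_gallons_for(diesel_gallons: float) -> float:
    """Gallons of ammonia carrying the same energy as the given diesel volume."""
    return diesel_gallons / AMMONIA_TO_DIESEL_ENERGY

print(f"1 gallon of diesel ~ {ammonia_gallons_for(1):.1f} gallons of ammonia")  # ~2.9
```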

While work on the tractor proceeded, we were also figuring out how to generate the hydrogen and ammonia it would burn.

Ramsey Creek Woodworks of Kalona, Iowa, modified the farm’s old hog shed to house the hydrogen generators, control equipment, and the tractor itself. The company also installed the solar trackers and the solar arrays.

We constructed a smaller building to house the pumps that would compress the hydrogen for high-pressure storage. Hydrogen is of course incredibly flammable. For safety, I designed low slots in the walls on two sides so that air could enter and vent out the top, taking with it any leaked hydrogen.

So how does the system actually produce hydrogen? The generator I purchased, from a Connecticut company called Proton OnSite, creates hydrogen and oxygen by splitting water that we pipe in from an on-site well. It is rated to make 90 grams (3 ounces) of hydrogen per hour. With the amount of sunlight Iowa receives, I can make an average of 450 grams of hydrogen per day. We can make more on a summer day, when we have more daylight, than we can in winter.

The generator was designed to operate continuously. But we’d be relying on solar power, which is intermittent, so David Toyne, who specializes in factory automation and customized systems, worked with Proton to modify it. Now the generator makes less hydrogen on overcast days and enters standby when the solar arrays’ output is too low. At the end of each day, the generator automatically turns off after being on standby for 20 minutes.
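
The modified behavior might be sketched like the snippet below. The thresholds and structure here are hypothetical, meant only to illustrate the throttle-and-standby logic described, not Proton’s actual control firmware.

```python
# Hypothetical sketch of the solar-following logic described above: throttle
# production on overcast days, go to standby when solar output is too low, and
# shut down after 20 minutes of continuous standby. Thresholds are invented.
# (The text's 450 g/day average implies roughly five hours of full-rate,
# 90 g/h production on a typical Iowa day.)
STANDBY_THRESHOLD_KW = 10.0     # assumed minimum solar output to keep producing
FULL_OUTPUT_KW = 25.0           # assumed solar output for full-rate production
STANDBY_LIMIT_MIN = 20          # from the text: auto-off after 20 minutes on standby

def generator_command(solar_kw: float, minutes_on_standby: int) -> str:
    if solar_kw >= FULL_OUTPUT_KW:
        return "run at full rate"
    if solar_kw >= STANDBY_THRESHOLD_KW:
        # Overcast: scale production to the available solar power.
        return f"run at {solar_kw / FULL_OUTPUT_KW:.0%} rate"
    if minutes_on_standby >= STANDBY_LIMIT_MIN:
        return "shut down for the day"
    return "standby"

print(generator_command(solar_kw=30.0, minutes_on_standby=0))   # full rate
print(generator_command(solar_kw=15.0, minutes_on_standby=0))   # reduced rate
print(generator_command(solar_kw=2.0, minutes_on_standby=25))   # shut down
```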

Generating ammonia posed some other challenges. I wanted to make the ammonia on-site, so that I could show it was possible for a farm to produce its fuel and fertilizer with no carbon emissions.

A substantial percentage of the world’s population depends on food grown using nitrogen-based fertilizers, including ammonia. It’s hard to beat for boosting crop yields. For example, Adam Sylvester, Pinehurst’s farm manager, told me that if we did not use nitrogen-based fertilizers on our cornfields, the yield would be about 250 bushels per hectare (100 bushels per acre), instead of the 500 bushels per hectare we get now. Clearly, the advantages of producing ammonia on location extend beyond just fuel.

But ammonia production also accounts for about 1 percent of all greenhouse gas emissions, largely from the fossil fuels powering most reactors. And just like hydrogen, ammonia comes with safety concerns. Ammonia is an irritant to the eyes, respiratory tract, mucous membranes, and skin.

Even so, ammonia has been used for years in refrigeration as well as fertilizer. It’s also an attractive carbon-free fuel. A ruptured ammonia tank won’t explode or catch fire as a propane tank will, and the liquid is stored at a much lower pressure than is hydrogen gas (1 megapascal for ammonia versus 70 MPa for hydrogen).

While attending the NH3 Fuel Conference in Sacramento in 2013, I had dinner with Bill Ayres, a director for the NH3 Fuel Association, and we discussed my interest in making ammonia in a self-contained system. Ayres pointed me to Doug Carpenter, who had developed a way to make ammonia on a small scale—provided you already have the hydrogen. Which I did. Carpenter delivered the reactor in 2016, several months before his untimely passing.

We turned again to Ramsey Creek to construct the ammonia-generation building. The 9-square-meter building, similar in design to the hydrogen shed, houses the pumps, valves, controls, ammonia reactor, collector tanks, and 10 high-pressure storage tanks. We make nitrogen by flowing compressed air through a nitrogen generator and removing the atmospheric oxygen. Before entering the reactor, the hydrogen and nitrogen are compressed to 24 MPa (3,500 pounds per square inch).

It’s been a process of trial and error to get the system right. When we first started making ammonia, we found it took too long for the reactor’s preheater to heat the hydrogen and nitrogen, so we added electrical band heaters around the outside of the unit. Unfortunately, the additional heat weakened the outer steel shell, and the next time we attempted to make ammonia, the outer shell split open. The mixed gases, which were under pressure at 24 MPa, caught fire. Toyne was in the equipment room at the time and noticed the pressure dropping. He made it out to the ammonia building in time to take pictures of the flames. After a few minutes, the gas had all vented through the top of the building. Luckily, only the reactor was damaged, and no one was hurt.

After that incident, we redesigned the ammonia reactor to add internal electrical heaters, which warm the apparatus before the gases are introduced. We also insulated the outer pressure shell from the heated inside components. Once started, the reaction forming the ammonia needs no additional heat.

Our ammonia system, like our hydrogen and nitrogen systems, is hooked up to the solar panels, so we cannot run it round the clock. Also, because of the limited amount of solar power we have, we can make either hydrogen or nitrogen on any given day. Once we have enough of both, we can produce a batch of ammonia. At first, we had difficulty producing nitrogen pure enough for ammonia production, but we solved that problem by mixing in a bit of hydrogen. The hydrogen bonds with the oxygen to create water vapor, which is far easier to remove than atmospheric oxygen.

We’ve estimated that our system uses a total of 14 kilowatt-hours to make a liter of ammonia, which contains 3.8 kWh of energy. This may seem inefficient, but it’s comparable to the amount of usable energy we could get from a diesel-powered tractor. About two-thirds of the electrical energy is used to make the hydrogen, one-quarter is used to make the nitrogen, and the remainder is for the ammonia.
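
The energy accounting in that paragraph can be checked in a few lines; the split below simply restates the fractions given in the text.

```python
# Energy accounting per liter of ammonia, using the figures in the text.
TOTAL_KWH_PER_L = 14.0          # electricity consumed per liter of ammonia
ENERGY_CONTENT_KWH_PER_L = 3.8  # energy stored in that liter of fuel

hydrogen_kwh = TOTAL_KWH_PER_L * 2 / 3                        # ~9.3 kWh to make the hydrogen
nitrogen_kwh = TOTAL_KWH_PER_L * 1 / 4                        # ~3.5 kWh to make the nitrogen
synthesis_kwh = TOTAL_KWH_PER_L - hydrogen_kwh - nitrogen_kwh # ~1.2 kWh for the ammonia reactor

print(f"H2: {hydrogen_kwh:.1f} kWh, N2: {nitrogen_kwh:.1f} kWh, NH3 synthesis: {synthesis_kwh:.1f} kWh")
print(f"About {ENERGY_CONTENT_KWH_PER_L / TOTAL_KWH_PER_L:.0%} of the input electricity ends up as fuel energy")
```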

Each batch of ammonia is about 38 liters (10 gallons). It takes 10 batches to make enough ammonia to fertilize 1.2 hectares (3 acres) of the farm’s nearly 61 hectares (150 acres) of corn. Thankfully, we can use the same ammonia for either application—it has to be liquid regardless of whether we’re using it for fertilizer or fuel.

We now have the basis of an on-site carbon-emission-free system for fueling a tractor and generating fertilizer, but there’s still plenty to improve. The solar arrays were sized to generate only hydrogen. We need additional solar panels or perhaps wind turbines to make more hydrogen, nitrogen, and ammonia. In order to make these improvements, we’ve created the Schmuecker Renewable Energy System, a nonprofit organization that accepts donations.

Toyne compares our system to the Wright brothers’ airplane: It is the initial demonstration of what is possible. Hydrogen and ammonia fuels will become more viable as the equipment costs decrease and more people gain experience working with them. I’ve spent more than US $2 million of my retirement savings on the effort. But much of the expense was due to the custom nature of the work: We estimate that to replicate the farm’s current setup would cost a third to half as much and would be more efficient with today’s improved equipment.

We’ve gotten a lot of interest in what we’ve installed so far. Our tractor has drawn attention from other farmers in Iowa. We’ve received inquiries from Europe, South Africa, Saudi Arabia, and Australia about making ammonia with no carbon emissions. In May 2018, we were showing our system to two employees of the U.S. Department of Energy, and they were so intrigued they invited us to present at an Advanced Research Projects Agency–Energy (ARPA-E) program on renewable, carbon-free energy generation that July.

Humankind needs to develop renewable, carbon-emission-free systems like the one we’ve demonstrated. If we do not harness other energy sources to address climate change and replace fossil fuels, future farmers will find it harder and harder to feed everyone. Our warming world will become one in which famine is an everyday occurrence.

This article appears in the November 2019 print issue as “The Carbon-Free Farm.”

About the Author

Jay Schmuecker worked for more than 50 years building planetary spacecraft at NASA’s Jet Propulsion Laboratory. Since retiring, he has been developing a solar-powered hydrogen fueling and fertilization system at Pinehurst Farm in eastern Iowa.

For Two Power Grid Experts, Hurricane Maria Became a Huge Experiment

Post Syndicated from Jean Kumagai original https://spectrum.ieee.org/energywise/energy/renewables/for-two-power-grid-experts-hurricane-maria-became-a-huge-experiment

In research, sometimes the investigator becomes part of the experiment. That’s exactly what happened to Efraín O’Neill-Carrillo and Agustín Irizarry-Rivera, both professors of electrical engineering at the University of Puerto Rico Mayagüez, when Hurricane Maria hit Puerto Rico on 20 September 2017. Along with every other resident of the island, they lost power in an islandwide blackout that lasted for months.

The two have studied Puerto Rico’s fragile electricity infrastructure for nearly two decades and, considering the island’s location in a hurricane zone, had been proposing ways to make it more resilient.

They also practice what they preach. Back in 2008, O’Neill-Carrillo outfitted his home with a 1.1-kilowatt rooftop photovoltaic system and a 5.4-kilowatt-hour battery bank that could operate independently of the main grid. He was on a business trip when Maria struck, but he worried a bit less knowing that his family would have power.

Irizarry-Rivera wasn’t so lucky. His home in San Germán also had solar panels. “But it was a grid-tied system,” he says, “so of course it wasn’t working.” It didn’t have storage or the necessary control electronics to allow his household to draw electricity directly from the solar panels, he explains.

“I estimated I wouldn’t get [grid] power until March,” Irizarry-Rivera says. “It came back in February, so I wasn’t too far off.” In the meantime, he spent more than a month acquiring and installing batteries, charge controllers, and a new stand-alone inverter. His family then relied exclusively on solar power for 101 days, until grid power was restored.

In “How to Harden Puerto Rico’s Grid Against Hurricanes,” the two engineers describe how Puerto Rico could benefit from community microgrids made up of similar small PV systems. The amount of power they produce wouldn’t meet the average Puerto Rican household’s typical demand. But, Irizarry-Rivera points out, you quickly learn to get by with less.

“We got a lot of things done with 4 kilowatt-hours a day,” he says of his own household. “We had lighting and our personal electronics working, we could wash our clothes, run our refrigerator. Everything else is just luxuries and conveniences.”

This article appears in the November 2019 print issue as “After Maria.”

How to Harden Puerto Rico’s Grid Against Hurricanes

Post Syndicated from Efraín O’Neill-Carrillo original https://spectrum.ieee.org/energy/policy/how-to-harden-puerto-ricos-grid-against-hurricanes

Another devastating hurricane season winds down in the Caribbean. As in previous years, we are left with haunting images of entire neighborhoods flattened, flooded streets, and ruined communities. This time it was the Bahamas, where damage was estimated at US $7 billion and at least 50 people were confirmed dead, with the possibility of many more fatalities yet to be discovered.

A little over two years ago, even greater devastation was wreaked upon Puerto Rico. The back-to-back calamity of Hurricanes Irma and Maria killed nearly 3,000 people and triggered the longest blackout in U.S. history. All 1.5 million customers of the Puerto Rico Electric Power Authority lost power. Thanks to heroic efforts by emergency utility crews, about 95 percent of customers had their service restored after about 6 months. But the remaining 5 percent—representing some 250,000 people—had to wait nearly a year.

After the hurricanes, many observers were stunned by the ravages to Puerto Rico’s centralized power grid: Twenty-five percent of the island’s electric transmission towers were severely damaged, as were 40 percent of the 334 substations. Power lines all over the island were downed, including the critical north-south transmission lines that cross the island’s mountainous interior and move electricity generated by large power plants on Puerto Rico’s south shore to the more populated north.

In the weeks and months following the hurricane, many of the 3.3 million inhabitants of Puerto Rico, who are all U.S. citizens, were forced to rely on noisy, noxious diesel- or gasoline-fired generators. The generators were expensive to operate, and people had to wait in long lines just to get enough fuel to last a few hours. Government emergency services were slow to reach people, and many residents found assistance instead from within their own communities, from family and friends.

The two of us weren’t surprised that the hurricane caused such intense and long-lasting havoc. For more than 20 years, our group at the University of Puerto Rico Mayagüez has studied Puerto Rico’s vulnerable electricity network and considered alternatives that would better serve the island’s communities.

Hurricanes are a fact of life in the Caribbean. Preparing for natural disaster is what any responsible government should do. And yet, even before the storm, we had become increasingly concerned at how the Puerto Rico Electric Power Authority, or PREPA, had bowed to partisan politics and allowed the island’s electrical infrastructure to fall into disrepair. Worse, PREPA, a once well-regarded public power company, chose not to invest in new technology and organizational innovations that would have made the grid more durable, efficient, and sustainable.

In our research, we’ve tried to answer such questions as these: What would it take to make the island’s electricity network more resilient in the face of a natural disaster? Would a more decentralized system provide better service than the single central grid and large fossil-fuel power plants that Puerto Rico now relies on? Hurricane Maria turned our academic questions into a huge, open-air experiment that included millions of unwilling subjects—ourselves included. [For more on our experiences during the storm, see “For Two Power Grid Experts, Hurricane Maria Became a Huge Experiment.”]

As Puerto Rico rebuilds, there is an extraordinary opportunity to rethink the island’s power grid and move toward a flexible, robust system capable of withstanding punishing storms. Based on our years of study and analysis, we have devised a comprehensive plan for such a grid, one that would be much better suited to the conditions and risks faced by island populations. This grid would rely heavily on microgrids, distributed solar photovoltaics, and battery storage to give utilities and residents much greater resilience than could ever be achieved with a conventional grid. We are confident our ideas could benefit island communities in any part of the world marked by powerful storms and other unpredictable threats.

As is typical throughout the world, Puerto Rico designed its electricity infrastructure around large power plants that feed into an interconnected network of high-voltage transmission lines and lower-voltage distribution lines. When this system was built, large-scale energy storage was very limited. So then, as now, the grid’s control systems had to constantly match generation with demand while maintaining the desired voltage and frequency across the network. About 70 percent of Puerto Rico’s fossil-fuel generation is located along the island’s south coast, while 70 percent of the demand is concentrated in the north, which necessitated building transmission lines across the tropical mountainous interior.

The hurricane vividly exposed the system’s vulnerability. Officials finally acknowledged that it made no sense for a heavily populated island sitting squarely in the Caribbean’s hurricane zone to rely on a centralized infrastructure that was developed for continent-wide systems, and based on technology, assumptions, and economics from the last century. After Maria, many electricity experts called for Puerto Rico to move toward a more decentralized grid.

It was a bittersweet moment for us, because we’d been saying the same thing for more than a decade. Back in 2008, for instance, our group at the university assessed the potential for renewable energy [PDF] on the island. We looked at biomass, microhydropower, ocean, photovoltaics (PV), solar thermal, wind, and fuel cells. Of these, rooftop PV stood out. We estimated that equipping about two-thirds of residential roofs with photovoltaics would be enough to meet the total daytime peak demand—about 3 gigawatts—for the entire island.
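
As a rough illustration of what that estimate implies per rooftop: the residential-rooftop count below is my assumption for the sketch, not a figure from our study.

```python
# Rough per-rooftop implication of the 2008 estimate. The number of residential
# rooftops is an assumed round figure; the study used actual housing data.
DAYTIME_PEAK_GW = 3.0
ASSUMED_RESIDENTIAL_ROOFTOPS = 1_200_000
COVERAGE = 2 / 3                     # two-thirds of roofs equipped with PV

equipped_roofs = ASSUMED_RESIDENTIAL_ROOFTOPS * COVERAGE
kw_per_roof = DAYTIME_PEAK_GW * 1e6 / equipped_roofs
print(f"{equipped_roofs:,.0f} equipped roofs -> about {kw_per_roof:.1f} kW of PV each")
```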

To be sure, interconnecting so much distributed energy generation to the power grid would be an enormous challenge, as we stated in the report. However, in the 11 years since that study, PV technology—as well as energy storage, PV inverters, and control software—has gotten much better and less costly. Now, more than ever, distributed-solar PV is the way to go for Puerto Rico.

Sadly, though, renewable energy did not take off in Puerto Rico. Right before Maria, renewable sources were supplying just 2.4 percent of the island’s electricity, from a combination of rooftop PV, several onshore wind and solar-power farms, and a few small outdated hydropower plants.

Progress has been hamstrung by PREPA. The utility was founded as a government corporation in 1941 to interconnect the existing isolated electric systems and achieve islandwide electrification at a reasonable cost. By the early 1970s, it had succeeded.

Meanwhile, generous tax incentives had induced many large companies to locate their factories and other facilities in Puerto Rico. The utility relied heavily on those large customers, which paid on time and helped finance PREPA’s infrastructure improvements. But in the late 1990s, a change in U.S. tax code led to the departure of nearly 60 percent of PREPA’s industrial clients. To close the gap between its revenues and operating costs, PREPA periodically issued new municipal bonds. It wasn’t enough. The utility’s operating and management practices failed to adapt to the new reality of more environmental controls, the rise of renewable energy, and demands for better customer service. Having accumulated $9 billion in debt, PREPA filed for bankruptcy in July 2017.

Then the hurricane struck. After the debris was cleared came the recognition—finally—that the technological options for supplying electricity have multiplied. For starters, distributed energy resources like rooftop PV and battery storage are now economically competitive with grid power in Puerto Rico. Over the last 10 years, the residential retail price of electricity has fluctuated between 20 and 27 U.S. cents per kilowatt-hour; for comparison, the average price in the rest of the United States is about 13 cents per kWh. When you factor in the additional rate increases that will be needed to service PREPA’s debt, the price will eventually exceed 30 cents per kWh. That’s more than the levelized cost of electricity (LCOE) from a rooftop PV system plus battery storage, at 24 to 29 cents per kWh, depending on financing and battery type. And if these solar-plus-storage systems were purchased in bulk, the LCOE would be even less.
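
To show how figures in that range arise, here is a simplified levelized-cost sketch. The installed cost, yield, lifetime, and discount rate are illustrative assumptions of mine rather than the inputs behind the quoted estimates, and the sketch ignores maintenance and battery replacement, which would push the real number somewhat higher.

```python
# Simplified LCOE sketch for a small rooftop PV-plus-battery system.
# All inputs are illustrative assumptions.
SYSTEM_COST_USD = 9_000.0        # assumed installed cost for ~2 kW of PV plus storage
ANNUAL_OUTPUT_KWH = 2 * 1_500.0  # assumed 1,500 kWh per kW-year in Puerto Rico
LIFETIME_YEARS = 20
DISCOUNT_RATE = 0.05

# Levelized cost = upfront cost divided by the discounted lifetime energy.
discounted_kwh = sum(ANNUAL_OUTPUT_KWH / (1 + DISCOUNT_RATE) ** year
                     for year in range(1, LIFETIME_YEARS + 1))
lcoe_cents = 100 * SYSTEM_COST_USD / discounted_kwh
print(f"LCOE: about {lcoe_cents:.0f} cents per kWh")  # ~24 cents with these inputs
```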

Also, the technology now exists to match supply and demand locally, by using energy storage and by selectively lowering demand through improved efficiency, conservation, and demand-response actions. We have new control and communications systems that allow these distributed energy resources to be interconnected into a community network capable of meeting the electricity needs of a village or neighborhood.

Such a system is called a community microgrid. It is basically a small electrical network that connects electricity consumers—for example, dozens or hundreds of homes—with one or more sources of electricity, such as solar panels, along with inverters, control electronics, and some energy storage. In the event of an outage, disconnect switches enable this small grid to be quickly isolated from the larger grid that surrounds it or from neighboring microgrids, as the case may be.

Here’s how Puerto Rico’s grid could be refashioned from the bottom up. In each community microgrid, users would collectively install enough solar panels to satisfy local demand. These distributed resources and the related loads would be connected to one another and also tied to the main grid.

Over time, community microgrids could interconnect to form a regional grid. Eventually, Puerto Rico’s single centralized power grid could even be replaced by interconnecting regional grids and community microgrids. If a storm or some other calamity threatens one or more microgrids, neighboring ones could disconnect and operate independently. Studies of how grids are affected by storms have repeatedly shown that a large percentage of power outages are caused by relatively tiny areas of grid damage. So the ability to quickly isolate the areas of damage, as a system of microgrids is able to do, can be enormously beneficial in coping with storms. The upshot is that an interconnection of microgrids would be far more resilient and reliable than Puerto Rico’s current grid and also more sustainable and economical.

Could such a model actually work in Puerto Rico? It certainly could. Starting in 2009, our research group developed a model for a microgrid that would serve a typical community in Puerto Rico. In the latest version, the overall microgrid serves 700 houses, divided into 70 groups of 10 houses. Each of these groups is connected to its own distribution transformer, which serves as the connection point to the rest of the community microgrid. All of the transformers are connected by 4.16-kilovolt lines in a radial network. [See diagram, “A Grid of Microgrids.”]

Each group within the community microgrid would be equipped with solar panels, inverters, batteries, control and communications systems, and protective devices. For the 10 homes in each group, there would be an aggregate PV supply of 10 to 20 kW, or 1 to 2 kW per house. The aggregate battery storage per group is 128 kWh, which is enough to get the homes through most nights without requiring power from the larger grid. (The amounts of storage and supply in our model are based on measurements of energy demand and variations in solar irradiance in an actual Puerto Rican town; obviously, they could be scaled up or down, according to local needs.)
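
As a rough sanity check on those numbers, the sketch below compares one 10-house group’s storage and PV against the kind of frugal 4-kilowatt-hour-per-day household consumption described in the companion article; the five equivalent sun-hours per day is my assumption.

```python
# Back-of-the-envelope check of one 10-house group in the model microgrid.
HOUSES_PER_GROUP = 10
GROUP_BATTERY_KWH = 128
DAILY_USE_PER_HOUSE_KWH = 4.0    # frugal post-Maria consumption level
SUN_HOURS_PER_DAY = 5.0          # assumed equivalent full-sun hours in Puerto Rico

group_daily_use = HOUSES_PER_GROUP * DAILY_USE_PER_HOUSE_KWH   # 40 kWh/day
days_of_autonomy = GROUP_BATTERY_KWH / group_daily_use         # ~3 days with no sun

for pv_kw in (10, 20):  # low and high aggregate PV cases from the text
    daily_pv_kwh = pv_kw * SUN_HOURS_PER_DAY
    print(f"{pv_kw} kW of PV -> ~{daily_pv_kwh:.0f} kWh/day vs "
          f"{group_daily_use:.0f} kWh/day of frugal demand")

print(f"The {GROUP_BATTERY_KWH} kWh battery covers roughly {days_of_autonomy:.1f} days of such use")
```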

In our tests, we assume that each community microgrid remains connected to the central grid (or rather, a new and improved version of Puerto Rico’s central grid) under normal conditions but also manages its own energy resources. We also assume that individual households and businesses have taken significant steps to improve their energy conservation and efficiency—through the use of higher-efficiency appliances, for instance. Electricity demand must still be balanced with generation, but that balancing is made easier due to the presence of battery storage.

That capability means the microgrids in our model can make use of demand response, a technique that enables customers to cut their electricity consumption by a predefined amount during times of peak usage or crisis. In exchange for cutting demand, the customer receives preferential rates, and the central grid benefits by limiting its peak demand. Many utilities around the world now use some form of demand response to reduce their reliance on fast-starting generating facilities, typically fired by natural gas, that provide additional capacity at times of peak demand. PREPA’s antiquated grid, however, isn’t yet set up for demand response.

During any disruption that knocks out all or part of the central grid, our model’s community microgrids would disconnect from the main grid. In this “islanded” mode, the local community would continue to receive electricity from the batteries and solar panels for essential loads, such as refrigeration. Like demand response, this capability would be built into and managed by the communications and control systems. Such technology exists, but not yet in Puerto Rico.

Besides the modeling and simulation, our research group has been working with several communities in Puerto Rico that are interested in developing local microgrids and distributed-energy resources. We have helped one community secure funding to install ten 2-kW rooftop PV systems, which they eventually hope to connect into a community microgrid based on our design.

Other communities in central Puerto Rico have installed similar systems since the hurricane. The largest of these consists of 28 small PV systems in Toro Negro, a town in the municipality of Ciales. Most are rooftop PV systems serving a single household, but a few serve two or three houses, which share the resources.

Another project at the University of Puerto Rico Mayagüez built five stand-alone PV kiosks, which were deployed in rural locations that had no electricity for months after Maria. University staff, students, and faculty all contributed to this effort. The kiosks address the simple fact that rural and otherwise isolated communities are usually the last to be reconnected to the power grid after blackouts caused by natural disasters.

Taking this idea one step further, a member of our group, Marcel J. Castro-Sitiriche, recently proposed that the 200,000 households that were the last to be reconnected to the grid following the hurricane should receive rooftop PV and battery storage systems, to be paid for out of grid-reconstruction funds. If those households had had such systems and thus been able to weather the storm with no interruption in service, the blackout would have lasted for 6 months instead of a year. The cost of materials and installation for a 2-kW PV system with 10 kWh of batteries comes to about $7,000, assuming $3 per watt for the PV systems and $100/kWh for lead-acid batteries. Many households and small businesses spent nearly that much on diesel fuel to power generators during the months they had no grid connection.

To outfit all 200,000 of those households would come to $1.4 billion, a sizable sum. But it’s just a fraction of what the Puerto Rico government has proposed spending on an enhanced central grid. Rather than merely rebuilding PREPA’s grid, Castro-Sitiriche argues, the government should focus its attention on protecting those most vulnerable to any future natural disaster.
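
The arithmetic behind those figures is straightforward; the sketch below simply restates the stated assumptions.

```python
# Cost estimate for equipping the 200,000 last-reconnected households,
# using the per-watt and per-kilowatt-hour assumptions stated in the text.
PV_KW_PER_HOME = 2.0
PV_COST_PER_W = 3.0              # dollars per installed watt
BATTERY_KWH_PER_HOME = 10.0
BATTERY_COST_PER_KWH = 100.0     # lead-acid batteries
HOUSEHOLDS = 200_000

per_home = (PV_KW_PER_HOME * 1_000 * PV_COST_PER_W
            + BATTERY_KWH_PER_HOME * BATTERY_COST_PER_KWH)
total = per_home * HOUSEHOLDS
print(f"Per household: ${per_home:,.0f}")              # $7,000
print(f"All households: ${total / 1e9:.1f} billion")   # $1.4 billion
```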

As engineers, we’re of course interested in the details of distributed-energy resources and microgrid technology. But our fieldwork has taught us the importance of considering the social implications and the end users.

One big advantage of the distributed-microgrid approach is that it’s centered on Puerto Rico’s most reliable social structures: families, friends, and local community. When all else failed after Hurricane Maria, those were the networks that rose to the many challenges Puerto Ricans faced. We think it makes sense to build a resilient electricity grid around this key resource. With proper training, local residents and businesspeople could learn to operate and maintain their community microgrid.

A move toward community microgrids would be more than a technical solution—it would be a socioeconomic development strategy. That’s because a greater reliance on distributed energy would favor small and medium-size businesses, which tend to invest in their communities, pay taxes locally, and generate jobs.

There is a precedent for this model: Over 200 communities in Puerto Rico extract and treat their own potable water, through arrangements known as acueductos comunitarios, or community aqueducts. A key component to this arrangement is having a solid governance agreement among community members. Our social-science colleagues at the university have studied how community aqueducts are managed, and from them we have learned some best practices that have influenced the design of our community microgrid concept. Perhaps most important is that the community agrees to manage electricity demand in a flexible way. This can help minimize the amount of battery storage needed and therefore the overall cost of the microgrid.

During outages and emergencies, for instance, when the microgrid is running in islanded mode, users would be expected to be conservative and flexible about their electricity usage. They might have to agree to run their washing machines only on sunny days. For less conscientious users, sensors monitoring their energy usage could trigger a signal to their cellphones, reminding them to curtail their consumption. That strategy has already been successfully implemented as part of demand-response programs elsewhere in the world.

Readers living in the mainland United States or Western Europe, accustomed to reliable, round-the-clock electricity, might consider such measures highly inconvenient. But the residents of Puerto Rico, we believe, would be more accepting. Overnight, we went from being a fully electrified, modern society to having no electricity at all. The memory is still raw. A community microgrid that compels people to occasionally cut their electricity consumption and to take greater responsibility over the local electricity infrastructure would be far more preferable.

This model is applicable beyond Puerto Rico—it could benefit other islands in the tropics and subtropics, as well as polar regions and other areas that have weak or no grid connections. For those locales, it no longer makes sense to invest millions or billions of dollars to extend and maintain a centralized electric system. Thanks to the advance of solar, power electronics, control, and energy-storage technologies, community-based, distributed-energy initiatives are already challenging the dominant centralized energy model in many parts of the world. More than two years after Hurricane Maria, it’s finally time for Puerto Rico to see the light.

About the Authors

Efraín O’Neill-Carrillo and Agustín Irizarry-Rivera are professors of electrical engineering at the University of Puerto Rico Mayagüez.

Alphabet’s Makani Tests Wind Energy Kites in the North Sea

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/energywise/energy/renewables/alphabets-makani-tests-wind-energy-kites-in-the-north-sea

The idea is simple: Send kites or tethered drones hundreds of meters up in the sky to generate electricity from the persistent winds aloft. With such technologies, it might even be possible to produce wind energy around the clock. However, the engineering required to realize this vision is still very much a work in progress.

Dozens of companies and researchers devoted to developing technologies that produce wind power while aloft high in the sky gathered at a conference in Glasgow, Scotland, last week. They presented studies, experiments, field tests, and simulations describing the efficiency and cost-effectiveness of various approaches, collectively known as airborne wind energy (AWE).

In August, Alameda, Calif.-based Makani Technologies ran demonstration flights of its airborne wind turbines—which the company calls energy kites—in the North Sea, some 10 kilometers off the coast of Norway. According to Makani CEO Fort Felker, the North Sea tests consisted of a launch and “landing” test for the flyer followed by a flight test, in which the kite stayed aloft for an hour in “robust crosswind(s).” The flights were the first offshore tests of the company’s kite-and-buoy setup. The company has, however, been conducting onshore flights of various incarnations of their energy kites in California and Hawaii.

Wind Turbines Just Keep Getting Bigger, But There’s a Limit

Post Syndicated from Vaclav Smil original https://spectrum.ieee.org/energy/renewables/wind-turbines-just-keep-getting-bigger-but-theres-a-limit

Wind turbines have certainly grown up. When the Danish firm Vestas began the trend toward gigantism, in 1981, its three-blade machines were capable of a mere 55 kilowatts. That figure rose to 500 kW in 1995, reached 2 MW in 1999, and today stands at 5.6 MW. In 2021, MHI Vestas Offshore Wind’s V164 will rise 105 meters high at the hub, swing 80-meter blades, and generate up to 10 MW, making it the first commercially available double-digit-megawatt turbine ever. Not to be left behind, GE Renewable Energy is developing a 12-MW machine with a 260-meter tower and 107-meter blades, also rolling out by 2021.

That is clearly pushing the envelope, although it must be noted that still larger designs have been considered. In 2011, the UpWind project released what it called a predesign of a 20-MW offshore machine with a rotor diameter of 252 meters (three times the wingspan of an Airbus A380) and a hub diameter of 6 meters. So far, the limit of the largest conceptual designs stands at 50 MW, with height exceeding 300 meters and with 200-meter blades that could flex (much like palm fronds) in furious winds.

To imply, as an enthusiastic promoter did, that building such a structure would pose no fundamental technical problems because it stands no higher than the Eiffel Tower, constructed 130 years ago, is to choose an inappropriate comparison. If the constructible height of an artifact were the determinant of wind-turbine design, then we might as well refer to the Burj Khalifa in Dubai, a skyscraper that topped 800 meters in 2010, or to the Jeddah Tower, which will reach 1,000 meters in 2021. Erecting a tall tower is no great problem; it’s quite another proposition, however, to engineer a tall tower that can support a massive nacelle and rotating blades for many years of safe operation.

Larger turbines must face the inescapable effects of scaling. Turbine power increases with the square of the radius swept by its blades: A turbine with blades twice as long would, theoretically, be four times as powerful. But the expansion of the surface swept by the rotor puts a greater strain on the entire assembly, and because blade mass should (at first glance) increase as a cube of blade length, larger designs should be extraordinarily heavy. In reality, designs using lightweight synthetic materials and balsa can keep the actual exponent to as little as 2.3.
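
In numbers, the scaling argument looks like this (a simple illustration of the exponents quoted above):

```python
# Doubling blade length multiplies swept area (and hence theoretical power) by 4,
# while blade mass grows with length cubed in the naive case and roughly with
# length to the 2.3rd power using lightweight composites and balsa.
scale = 2.0                        # blades twice as long

power_factor = scale ** 2          # 4x the power
naive_mass_factor = scale ** 3     # 8x the mass if blades scaled isometrically
actual_mass_factor = scale ** 2.3  # ~4.9x with modern materials

print(f"Power: x{power_factor:.0f}, naive mass: x{naive_mass_factor:.0f}, "
      f"achievable mass: x{actual_mass_factor:.1f}")
```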

Even so, the mass (and hence the cost) adds up. Each of the three blades of Vestas’s 10-MW machine will weigh 35 metric tons, and the nacelle will come to nearly 400 tons. GE’s record-breaking design will have blades of 55 tons, a nacelle of 600 tons, and a tower of 2,550 tons. Merely transporting such long and massive blades is an unusual challenge, although it could be made easier by using a segmented design.

Exploring likely limits of commercial capacity is more useful than forecasting specific maxima for given dates. Available wind turbine power [PDF] is equal to half the density of the air (which is 1.23 kilograms per cubic meter) times the area swept by the blades (pi times the radius squared) times the cube of wind velocity. Assuming a wind velocity of 12 meters per second and an energy-conversion coefficient of 0.4, then a 100-MW turbine would require rotors nearly 550 meters in diameter.
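
Plugging the quoted numbers into that formula reproduces the figure (a minimal sketch):

```python
import math

# Rotor size needed for a 100-MW turbine, from P = 0.5 * rho * A * v^3 * Cp,
# with the wind speed and conversion coefficient given in the text.
RHO = 1.23          # air density, kg/m^3
V = 12.0            # wind speed, m/s
CP = 0.4            # energy-conversion coefficient
TARGET_W = 100e6    # 100 MW

area = TARGET_W / (0.5 * RHO * CP * V ** 3)    # swept area, m^2
diameter = 2 * math.sqrt(area / math.pi)
print(f"Swept area: {area:,.0f} m^2, rotor diameter: {diameter:.0f} m")  # ~550 m
```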

To predict when we’ll get such a machine, just answer this question: When will we be able to produce 275-meter blades of plastic composites and balsa, figure out their transport and their coupling to nacelles hanging 300 meters above the ground, ensure their survival in cyclonic winds, and guarantee their reliable operation for at least 15 or 20 years? Not soon.

This article appears in the November 2019 print issue as “Wind Turbines: How Big?”

Is the World Ready for Floating Nuclear Power Stations?

Post Syndicated from John Boyd original https://spectrum.ieee.org/energywise/energy/nuclear/is-the-world-ready-for-floating-nuclear-power-stations

The world’s first floating nuclear power plant (FNPP) docked at Pevek, Chukotka, in Russia’s remote Far East on 14 September. It completed a journey of some 9,000 kilometers from where it was constructed in a St. Petersburg shipyard. First, it was towed to the city of Murmansk, where its nuclear fuel was loaded, and from there it took the Northern Sea Route to the far end of Russia’s Arctic coast.

The FNPP will replace an aging land-based nuclear plant and a brown coal-fired plant, reducing some 50,000 tons of CO2 emissions a year, according to Rosatom, the project’s creator and Russia’s state nuclear corporation. The reactor is slated to begin operations this December.

The co-generation plant, named the Akademik Lomonosov, consists of a non-motorized barge, two pressurized-water KLT-40S reactors similar to those powering Russian nuclear icebreakers, and two steam turbine plants.

The FNPP can generate up to 70 megawatts (MW) of electricity and 50 gigacalories of heat an hour. That is sufficient to power the electric grids of the resource-rich region—where some 50,000 people live and work—and also deliver steam heat to the supply lines of Pevek city. The plant will manage this second feat by using steam extracted from the turbines to heat its intermediate circuit water system, which circulates between the reactor units and the coastal facilities, from 70 to 130 degrees C.
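
For readers more used to watts than gigacalories, the heat rating converts as follows (a quick sketch):

```python
# Converting the plant's heat rating from gigacalories per hour to megawatts.
GCAL_PER_HOUR = 50.0
JOULES_PER_GCAL = 4.1868e9      # 1 calorie = 4.1868 joules

heat_mw = GCAL_PER_HOUR * JOULES_PER_GCAL / 3600 / 1e6
print(f"{GCAL_PER_HOUR:.0f} Gcal/h ~ {heat_mw:.0f} MW of thermal output")  # ~58 MW
```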

Construction of the floating reactor began in 2007 and had to overcome a messy financial situation, including the threat of bankruptcy in 2011. The venture is based on the small modular reactor (SMR) design: a type of nuclear fission reactor that is smaller than conventional reactors. Such reactors can be built from start to finish at a plant and then shipped—fully assembled, tested, and ready to operate—to remote sites where normal construction would be difficult to manage.

Andrey Zolotkov, head of the Murmansk, Russia, office of the Bellona Foundation, an environmental organization based in Oslo, Norway, acknowledges the practicability of the SMR design. But he is one of many who question its necessity in this particular case.

“The same plant could be built on the ground there (in Chukotka) without resorting to creating a floating structure,” says Zolotkov. “After all, the [nuclear power plant] presently in use was built on land there and has been operating for decades.” 

The floating design has raised both environmental and safety concerns, given that the plant will operate in the pristine Arctic and must endure its harsh winters and choppy seas. Greenpeace has dubbed it a “floating Chernobyl,” and “a nuclear Titanic.”

Rosatom rejects such criticism, saying the plant meets safety standards put forth by Russia and the International Atomic Energy Agency. The company notes the same kind of reactors have been used in icebreakers and submarines for decades. And, Rosatom states on its website, the “FNPP will be moored and secured to a special pier,” and operate without the need for “motor or propeller functions.”

Coastal structures, dams, and breakwaters have also been built to protect the vessel against tsunamis and icebergs.

The plant employs a number of active and passive safety systems, including an electrically driven automated system and a passive system that uses gravity to insert control rods into the reactor core, ensuring the reactor remains at subcritical levels in emergencies. The reactors also use low-enriched uranium, with a uranium-235 concentration below 20 percent. This makes the fuel unsuitable for producing nuclear weapons.

Given such safety measures, Rosatom says on its site that a peer-reviewed probabilistic safety assessment of possible damage to the FNPP finds that the chances of a serious accident “are less than one hundred thousandth of a percent.”

Zolotkov, who worked in various capacities—including radiation safety officer—for 35 years in Russia’s civilian nuclear fleet, also notes that there have been no serious incidents on such ships since 1975. “In the event of an accident in the FNPP, the consequences, I believe, would be localized within its structure, so the release of radioactive substances will be minimal,” he says. 

The plant’s nuclear fuel has to be replaced every three years. The unloaded fuel is held in onboard storage pools, and later in dry containers also kept on board. Every 10 to 12 years during its 40-year life cycle (possibly extendable to 50 years), the FNPP will be towed to a special facility for maintenance.

After decommissioning, the plant will be towed to a deconstruction and recycling facility. Rosatom says on its site, “No spent nuclear fuel or radioactive waste is planned to be left in the Arctic—spent fuel will be taken to the special storage facilities in mainland Russia.” 

Rosatom has not disclosed the cost of the venture, calling it a pilot project. It is currently working on a next-generation version that will use two RITM-200M reactors, each rated at 50 MW. Improvement targets include a more compact design, longer periods between refueling, flexible load-following capabilities, and multipurpose uses that include water desalination and district heating. 

Provided Rosatom receives sufficient orders, it says it aims to compete in price with plants based on fossil fuels and renewable energy.

The company, however, may face challenges other than marketing and operating its novel design. “These FNPPs will eventually carry spent nuclear fuel and are not yet recognized by international maritime law,” says Zolotkov. “So Rosatom may face problems obtaining permits and insurance when it comes to towing them along certain sea routes.”

U.S. Battery Producer Celgard Files Lawsuit for IP Theft

Post Syndicated from Dexter Johnson original https://spectrum.ieee.org/energywise/energy/renewables/battery-producer-files-massive-lawsuit-for-ip-theft

In a lawsuit filed in California, U.S. battery manufacturer Celgard, a subsidiary of Polypore International, has sued Shenzhen Senior Technology Material Co., Ltd. (Senior) for patent infringement and for misappropriating Celgard’s trade secrets and confidential information.

The suit has a bit of a spy novel twist in that Celgard alleges in its complaint that one of its senior scientists left the company in October 2016 and moved to China to join Senior, after which he changed his name to cover up his identity. This scientist is alleged to be the source through which Senior acquired Celgard’s intellectual property.

The Ultimate Optimization Problem: How to Best Use Every Square Meter of the Earth’s Surface

Post Syndicated from Eliza Strickland original https://spectrum.ieee.org/tech-talk/energy/environment/the-ultimate-optimization-problem-how-to-best-use-every-square-meter-of-the-earths-surface

Lucas Joppa thinks big. Even while gazing down into his cup of tea in his modest office on Microsoft’s campus in Redmond, Washington, he seems to see the entire planet bobbing in there like a spherical tea bag. 

As Microsoft’s first chief environmental officer, Joppa came up with the company’s AI for Earth program, a five-year effort that’s spending US $50 million on AI-powered solutions to global environmental challenges.

The program is not just about specific deliverables, though. It’s also about mindset, Joppa told IEEE Spectrum in an interview in July. “It’s a plea for people to think about the Earth in the same way they think about the technologies they’re developing,” he says. “You start with an objective. So what’s our objective function for Earth?” (In computer science, an objective function describes the parameter or parameters you are trying to maximize or minimize for optimal results.)

AI for Earth launched in December 2017, and Joppa’s team has since given grants to more than 400 organizations around the world. In addition to receiving funding, some grantees get help from Microsoft’s data scientists and access to the company’s computing resources. 

In a wide-ranging interview about the program, Joppa described his vision of the “ultimate optimization problem”—figuring out which parts of the planet should be used for farming, cities, wilderness reserves, energy production, and so on. 

Every square meter of land and water on Earth has an infinite number of possible utility functions. It’s the job of Homo sapiens to describe our overall objective for the Earth. Then it’s the job of computers to produce optimization results that are aligned with the human-defined objective.

I don’t think we’re close at all to being able to do this. I think we’re closer from a technology perspective—being able to run the model—than we are from a social perspective—being able to make decisions about what the objective should be. What do we want to do with the Earth’s surface?
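
To make the idea of an objective function for the Earth’s surface slightly less abstract, here is a toy land-allocation sketch. The land-use categories, per-unit values, and constraints are invented purely for illustration and have nothing to do with AI for Earth’s actual tools.

```python
# Toy version of the land-allocation idea: divide a fixed area among a few uses
# to maximize a human-defined objective. All numbers are invented for illustration.
from scipy.optimize import linprog

uses = ["farming", "cities", "wilderness", "energy"]
value = [3.0, 5.0, 4.0, 2.0]       # made-up value per unit area of each use

total_area = 100.0                 # arbitrary units
c = [-v for v in value]            # linprog minimizes, so negate to maximize
A_eq = [[1, 1, 1, 1]]              # allocations must sum to the total area
b_eq = [total_area]
# Per-use bounds, e.g., keep at least 20 units wild, farm between 10 and 40.
bounds = [(10, 40), (5, 30), (20, None), (5, None)]

result = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
for use, area in zip(uses, result.x):
    print(f"{use}: {area:.0f}")
```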

Carbon Emissions Rising Despite European and North American Efforts to Reduce Them

Post Syndicated from Vaclav Smil original https://spectrum.ieee.org/energy/fossil-fuels/carbon-emissions-rising-despite-european-and-north-american-efforts-to-reduce-them

In 1896, Svante Arrhenius, of Sweden, became the first scientist to quantify the effects [PDF] of man-made carbon dioxide on global temperatures. He calculated that doubling the atmospheric level of the gas from its concentration in his time would raise the average midlatitude temperature by 5 to 6 degrees Celsius. That’s not too far from the latest results, obtained by computer models running more than 200,000 lines of code.

The United Nations adopted its Framework Convention on Climate Change in 1992, and it was followed by a series of meetings and climate treaties. But global emissions of carbon dioxide have been rising steadily just the same.

At the beginning of the 19th century, when the United Kingdom was the only major coal producer, global emissions of carbon from fossil fuel combustion were minuscule, at less than 10 million metric tons a year. (To express them in terms of carbon dioxide, just multiply by 3.66.) By century’s end, emissions surpassed half a billion metric tons of carbon. By 1950, they had topped 1.5 billion metric tons. The postwar economic expansion in Europe, North America, the U.S.S.R., and Japan, along with the post-1980 economic rise of China, quadrupled emissions thereafter, to about 6.5 billion metric tons of carbon by the year 2000.
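The 3.66 factor is simply the ratio of the molar mass of CO2 (about 44 grams per mole) to that of carbon (about 12 grams per mole). A one-line check, applied to the article's year-2000 figure:

```python
# Convert a mass of carbon to the equivalent mass of CO2.
# The factor is the molar-mass ratio 44.01 / 12.01, roughly 3.66.
def carbon_to_co2(tonnes_carbon):
    return tonnes_carbon * (44.01 / 12.01)

# ~6.5 billion tonnes of carbon in 2000 corresponds to roughly 24 billion tonnes of CO2.
print(round(carbon_to_co2(6.5e9) / 1e9, 1), "billion tonnes CO2")
```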

The new century has seen a significant divergence. By 2017, emissions had declined by about 15 percent in the European Union, with its slower economic growth and aging population, and also in the United States, thanks largely to the increasing use of natural gas instead of coal. However, all these gains were outbalanced by Chinese carbon emissions, which rose from about 1 billion to about 3 billion metric tons—enough to increase the worldwide total by nearly 45 percent, to 10.1 billion metric tons.

By burning huge stocks of carbon that fossilized ages ago, human beings have pushed carbon dioxide concentrations to levels not seen for about 3 million years. The sampling of air locked in tiny bubbles in cores drilled into Antarctic and Greenland ice has enabled us to reconstruct carbon dioxide concentrations going back some 800,000 years. Back then the atmospheric levels of the gas fluctuated between 180 and 280 parts per million (that is, from 0.018 to 0.028 percent). During the past millennium, the concentrations remained fairly stable, ranging from 275 ppm in the early 1600s to about 285 ppm before the end of the 19th century. Continuous measurements of the gas began near the top of Mauna Loa, in Hawaii, in 1958: The 1959 mean was 316 ppm, the 2015 average reached 400 ppm, and 415 ppm was first recorded in May 2019.

Emissions will continue to decline in affluent countries, and the rate at which they grow in China has begun to slow. However, emissions growth is speeding up in India and Africa, so it is unlikely that we will see any substantial global declines anytime soon.

The Paris agreement of 2015 was lauded as the first accord containing specific national commitments to reduce future emissions, but even if all its targets were met by 2030, carbon emissions would still rise to nearly 50 percent above the 2017 level. According to a 2018 study by the Intergovernmental Panel on Climate Change, the only way to keep the average world temperature rise to no more than 1.5 °C would be to put emissions almost immediately into a decline steep enough to bring them to zero by 2050.

That is not impossible—but it is very unlikely. The contrast between the expressed concerns about global warming and the continued releases of record volumes of carbon could not be starker.

This article appears in the October 2019 print issue as “The Carbon Century.”

Heat Pumps Could Shrink the Carbon Footprint of Buildings

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/environment/heat-pumps-could-shrink-the-carbon-footprint-of-buildings

Buildings use more than one-third of the world’s energy, most of it for heating spaces and water. Most of this heat is generated by burning natural gas, oil, or propane. And where these fossil fuels are consumed, greenhouse gas emissions are a given.

Electric heat pumps, first widely used in Europe in the 1970s, could be the best way to cut that fossil fuel use. They could slash the carbon emissions of buildings by half. And if the pumps run on renewable electricity, emissions could potentially fall to zero.
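To see where a roughly 50 percent cut can come from, here is a back-of-envelope comparison of a gas furnace and an electric heat pump. The heat demand, efficiencies, and emission factors below are illustrative assumptions, not figures from the article, and the result varies widely with the local grid mix.

```python
# Back-of-envelope comparison: annual heating emissions for one home,
# gas furnace vs. electric heat pump. Every input is an illustrative assumption.

HEAT_DEMAND_KWH = 12_000       # assumed useful heat delivered per year, kWh (thermal)

FURNACE_EFFICIENCY = 0.90      # assumed gas furnace efficiency
GAS_EMISSION_FACTOR = 0.20     # assumed kg CO2 per kWh of gas burned

HEAT_PUMP_COP = 3.0            # assumed coefficient of performance (heat out / electricity in)
GRID_EMISSION_FACTOR = 0.35    # assumed kg CO2 per kWh of grid electricity

furnace_kg = HEAT_DEMAND_KWH / FURNACE_EFFICIENCY * GAS_EMISSION_FACTOR
heat_pump_kg = HEAT_DEMAND_KWH / HEAT_PUMP_COP * GRID_EMISSION_FACTOR

print(f"Gas furnace: {furnace_kg:,.0f} kg CO2 per year")
print(f"Heat pump:   {heat_pump_kg:,.0f} kg CO2 per year")
print(f"Reduction:   {1 - heat_pump_kg / furnace_kg:.0%}")
```

With these assumptions the heat pump cuts emissions by a little under half; on a grid dominated by renewables, its line drops toward zero, which is the zero-emissions case mentioned above.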

Cutting carbon emissions from heating and cooling will be critical to keep global average temperatures from rising by more than 1.5 degrees Celsius above preindustrial levels. Already, anthropogenic climate change has caused average global temperatures to rise by approximately 1 degree C, according to the Intergovernmental Panel on Climate Change. At the United Nations Climate Action Summit next week in New York, world leaders will discuss concrete steps to meet climate targets.

Liquid Air Could Store Renewable Energy and Reduce Emissions

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/renewables/liquid-air-could-store-renewable-energy-and-cut-food-industry-emissions

Keeping food cold is an energy-gobbling endeavor. Refrigerated food warehouses and factories consume immense amounts of energy, and this cooling demand is expected to increase as the climate warms while global incomes and food consumption rise. A team of researchers and companies in Europe are now developing a cryogenic energy storage system that could reduce carbon emissions from the food sector while providing a convenient way to store wind and solar power.

The CryoHub project will use surplus wind and solar electricity to chill air to cryogenic temperatures, at which point it becomes liquid and shrinks roughly 700-fold in volume. The liquid air is stored in insulated low-pressure tanks similar to those used for liquid nitrogen and liquefied natural gas.

When the grid needs electricity, the subzero liquid is pumped into an evaporator where it expands back into a gas that can spin a turbine for electricity. As it expands, the liquid also sucks heat from surrounding air. “So you can basically provide free cooling for food storage,” says Judith Evans, a professor of air conditioning and refrigeration engineering at London South Bank University who is coordinating the CryoHub project.
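A rough sizing sketch, using the roughly 700:1 volume ratio quoted above, suggests how much ambient air and how much recoverable electricity a single tank might represent. The liquid-air density and the net work recovered per kilogram are assumed illustrative values, not CryoHub specifications.

```python
# Rough sizing sketch for a liquid-air store. The 700:1 volume ratio is the
# figure cited in the article; the density and recoverable-work values are assumptions.

TANK_VOLUME_M3 = 100                 # hypothetical insulated tank volume
VOLUME_RATIO = 700                   # ambient-gas volume : liquid volume (from the article)
LIQUID_AIR_DENSITY = 870             # kg per cubic meter, approximate
RECOVERABLE_WORK_KWH_PER_KG = 0.1    # assumed net electricity recovered per kg of liquid air

ambient_air_volume = TANK_VOLUME_M3 * VOLUME_RATIO            # m^3 of air at ambient conditions
liquid_air_mass = TANK_VOLUME_M3 * LIQUID_AIR_DENSITY         # kg of liquid air stored
stored_electricity_mwh = liquid_air_mass * RECOVERABLE_WORK_KWH_PER_KG / 1000

print(f"{ambient_air_volume:,.0f} m^3 of ambient air fits in the tank as liquid")
print(f"~{stored_electricity_mwh:.1f} MWh of electricity recoverable (assumed figures)")
```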

Green Data: The Next Step to Zero-Emissions Data Centers

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/energywise/energy/environment/green-data-the-next-step-to-zeroemissions-data-centers

Data centers account for just two to three percent of the world's electricity use. So reducing their climate footprint may not seem, at first blush, to be a high priority as world leaders gather in New York next week to consider practical climate change solutions at the UN Climate Action Summit.

However, there are at least two reasons why data centers will likely play a key role in any attempt to curb global emissions. First, as cloud computing becomes more energy-efficient and increasingly relies on renewable sources, other sectors such as manufacturing, transportation, and buildings could turn to green data centers to reduce their own emissions. For example, a car manufacturer might outsource all of its in-house computing to zero-emission data centers.

Even without such partnerships, though, data centers will likely play an important part in the climate's future. The rise of AI, machine learning, big data, and the Internet of Things means that data centers' global electricity consumption will continue to increase. By one estimate, consumption could jump to as much as 13 percent of the world's total electricity demand by 2030.

For these reasons, says Johan Falk, senior innovation fellow at the Stockholm Resilience Center in Sweden, data centers will have outsized importance in climate change mitigation efforts. And the more progress society makes in the near term, the sooner the benefits will begin to multiply.

Egypt’s Massive 1.8-Gigawatt Benban Solar Park Nears Completion

Post Syndicated from Amy Nordrum original https://spectrum.ieee.org/energywise/energy/renewables/egypts-massive-18gw-benban-solar-park-nears-completion

Amid the sand dunes of Egypt's Western Desert, workers are putting the finishing touches on one of the world's largest solar installations. There, as many as 7.2 million photovoltaic panels will make up Benban Solar Park—a renewable energy project so massive, it will be visible from space.

The 1.8-gigawatt installation is the first utility-scale PV plant in Egypt, a nation blessed with some of the best solar resources on the planet. The ambitious project is part of Egypt’s efforts to increase its generation capacity and incorporate more renewable sources into the mix. 
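Dividing the plant's headline capacity by the panel count gives a rough implied rating per module, using the figures reported above:

```python
# Implied average per-panel rating at Benban, from the figures quoted in the article.
plant_capacity_w = 1.8e9      # 1.8 GW
panel_count = 7.2e6           # up to 7.2 million panels

print(f"{plant_capacity_w / panel_count:.0f} W per panel on average")  # ~250 W
```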

“I think Benban Solar Park is the first real step to put Egypt on the solar production world map,” says Mohamed Orabi, a professor of power electronics at Aswan University.

How Inexpensive Must Energy Storage Be for Utilities to Switch to 100 Percent Renewables?

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/renewables/what-energy-storage-would-have-to-cost-for-a-renewable-grid

Last week, the city of Los Angeles inked a deal for a solar-plus-storage system at a record-low price. The 400-MW Eland solar power project will be capable of storing 1,200 megawatt-hours of energy in lithium-ion batteries to meet demand at night. The project is a part of the city’s climate commitment to reach 100 percent renewable energy by 2045.
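Dividing the quoted storage capacity by the plant's output gives the storage duration at full power:

```python
# Storage duration implied by the Eland project figures quoted above.
solar_capacity_mw = 400
storage_mwh = 1_200

print(f"{storage_mwh / solar_capacity_mw:.0f} hours of storage at full output")  # 3 hours
```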

Electricity and heat production are the largest sources of greenhouse gas emissions in the world. Carbon-free electricity will be critical for keeping the average global temperature rise within the United Nations' target of 1.5 degrees Celsius and avoiding the worst effects of climate change. As world leaders meet at the United Nations Climate Action Summit next week, boosting renewable energy and energy storage will be major priorities.

Satellites Spot Carbon Pollution From Oil and Gas Wells

Post Syndicated from Peter Fairley original https://spectrum.ieee.org/energywise/energy/fossil-fuels/eyes-high-in-the-sky-track-carbon-pollution

Berkeley, California banned the installation of natural gas pipes in new residential construction last month. The city is committed to slashing its carbon footprint, and natural gas is a carbon double-whammy: when burned, it releases carbon dioxide, and when leaked, its main ingredient, methane, is a far more potent greenhouse gas than CO2.
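To put "far more potent" in numbers, a rough CO2-equivalent conversion is shown below. The global warming potentials used (roughly 28 times CO2 over 100 years and about 80 times over 20 years) are approximate IPCC values, and the leak size is hypothetical.

```python
# Rough CO2-equivalent of leaked methane. Global warming potentials are approximate
# IPCC estimates: ~28x CO2 over 100 years, ~80x over 20 years.
GWP_100YR = 28
GWP_20YR = 80

def methane_to_co2e(tonnes_ch4, gwp=GWP_100YR):
    return tonnes_ch4 * gwp

leak = 1_000  # hypothetical tonnes of methane leaked
print(f"{methane_to_co2e(leak):,} t CO2e over 100 years")
print(f"{methane_to_co2e(leak, GWP_20YR):,} t CO2e over 20 years")
```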

Several dozen California cities appear set to follow Berkeley's lead, which helps explain why the U.S. government's plan to scrap mandatory monitoring for methane leaks is getting few cheers from the oil and gas industry. Paying attention to gas leaks is critical to defending their product's social license.

Those leaks, meanwhile, may soon have nowhere to hide thanks to a growing wave of private, methane-detecting satellites being placed in orbit. Canada’s GHGSat led the charge in 2016 with its carbon-tracking Claire microsatellite, and the company now has a second-generation microsat ready to launch. Several more methane-detecting satellites are coming, including one from the Environmental Defense Fund. If gas producers don’t find and squelch their own pollution, this proliferation of remote observers will make it increasingly likely that others will shine a spotlight on it.

China’s Grid Architect Proposes a “Made in China” Upgrade to North America’s Power System

Post Syndicated from Peter Fairley original https://spectrum.ieee.org/energywise/energy/renewables/chinas-grid-architect-offers-made-in-china-upgrade-to-north-americas-power-system

Transmission lines in the United States and Canada require approval from every state and province traversed, and that political fragmentation hinders deployment of long power links of the type connecting vast swaths of territory in regions such as China, India, and Brazil. As a result, few studies detail how technologies that efficiently move power over thousands of kilometers, such as ultrahigh-voltage direct current (UHV DC) systems, might perform in North America. Earlier this week, the Beijing-based Global Energy Interconnection Development and Cooperation Organization (GEIDCO) stepped in to fill that gap, outlining an ambitious upgrade for North America’s grids.

GEIDCO’s plan promises to greatly shrink North America’s carbon footprint, but its boldest prescriptions represent technical and economic optimizations that run counter to political interests and recent trends. “Thinking out of the box is how you solve complicated, difficult problems,” said former Southern California Edison CEO Ted Craver in response to the plan. But GEIDCO’s approach, he said, raises concerns about energy sovereignty that could prove difficult to settle. As Craver put it: “There’s theory and then there’s practice.”

The proposed North American transmission scheme was unveiled on Tuesday at an international transmission forum in Vancouver, Canada, by Liu Zhenya, the former State Grid Corp. of China chairman who launched GEIDCO in 2016. While at State Grid, Liu championed the development of the world’s first 800- and 1,100-kilovolt UHV DC lines and the first 1,000-kV, UHV AC transmission. State Grid has deployed them to create a brawny hybrid AC-DC electricity system that taps far-flung energy resources to power China’s densely-populated and industrialized seaboard.

Through GEIDCO, Liu is proselytizing for UHV deployment worldwide. At the Vancouver meeting, Liu warned of “unimaginable damage to mankind” if greenhouse gas emissions continued at their current pace. He argued that beefy grids moving power across and between continents are a prerequisite for accessing and sharing the world’s best wind, solar, and hydropower resources, and thus dialing down fossil fuel consumption.

GEIDCO’s plan for North America mirrors the combination of UHV AC and DC transmission deployed by State Grid in China. In that scheme, a series of 800-kV UHV DC lines running east to west across the U.S. would share wind and solar power widely, while north-south lines would provide continent-wide access to Canada’s giant hydropower plants. One more UHV DC line—a 5,200-kilometer stretch from Mexico to Peru—would enable power exchanges with South America.

As in China, UHV AC lines would be added atop North America’s five existing AC grids, strengthening them so they could safely absorb the 8-gigawatt output of each DC line. In two cases, UHV AC would eventually cross boundaries between distinct AC grids to create larger and more stable synchronous zones: Mexico’s grid would fold into the larger grid that currently covers the Western U.S. and Canada; and Texas’ grid would merge with North America’s big Eastern grid.
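For a sense of scale, a back-of-envelope calculation gives the current on one of those 8-gigawatt DC links. The bipolar ±800-kV configuration assumed here is typical of UHV DC projects, but it is my assumption rather than something spelled out in the plan.

```python
# Back-of-envelope current on an 8-GW, 800-kV DC line, assuming a bipolar
# configuration (one pole at +800 kV, one at -800 kV). The bipole is an assumption.
power_w = 8e9                 # 8 GW per DC line (from the article)
pole_voltage_v = 800e3        # 800 kV per pole
pole_to_pole_v = 2 * pole_voltage_v

current_a = power_w / pole_to_pole_v
print(f"{current_a:,.0f} A per pole")   # ~5,000 A
```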

Consolidating grids helps explain the benefits of GEIDCO’s plan. Texas’ grid is only weakly linked to its neighbors at present, which limits its ability to share resources such as extra wind power that may go to waste for lack of in-state demand, or to import renewable power when its wind farms are still. GEIDCO’s UHV build-out unlocks such resources by enabling each power plant to serve a larger area.

The payoffs are numerous. Electrification to decarbonize vehicles, home heating, and industries would proceed 60 percent faster than the most ambitious U.S. electrification scenarios projected by the Electric Power Research Institute. Renewable energy installation would also accelerate, pushing the share of zero-carbon energy on the continent from 40 percent of the power supply in 2017, to 64 percent in 2035, and 74 percent in 2050. Nuclear energy’s contribution would fall by nearly half to just 11 percent of generation in 2050—mostly due to its higher cost, according to GEIDCO director of energy planning Gao Yi. Overall, energy-related greenhouse gas emissions would drop 80 percent by 2050, relative to a business-as-usual scenario—even as average power generating costs drop.

However, consolidating grids is also where the political sensitivity begins. Texas fiercely guards its independent AC grid, which shields its power industry from oversight by federal regulators. Craver’s take on the prospects for bringing Texas into the fold: “Good luck on that.”

Broader political concerns, meanwhile, could hold up international UHV links. Deeper integration of power supplies across borders implies a high level of trust between nations. That cuts against the recent trend toward populist governments advocating more nationalist agendas, as exemplified by the U.K.’s ongoing effort to leave the European Union.

A populist Mexican President elected last year has blocked international investment in renewable energy and could undo recent efforts to expand the country’s grid interconnections. U.S. President Trump has decreased trust in the United States as an international partner with his America-first trade policies and plans to withdraw the U.S. from global pacts such as the Paris Agreement on Climate Change.

At the Vancouver forum, organized by the Edison Electric Institute, a Washington-based trade group, most grid experts identified politics and social acceptance as the greatest challenges to power network expansion. In the U.S., climate and energy policy has made it more difficult for government researchers to even study North American grid integration.

Last year, political appointees at the U.S. Department of Energy blocked publication of a modeling study exploring integration of the Eastern and Western grids via DC lines. The study, led by the grid modeling group at the National Renewable Energy Lab (NREL) in Golden, Colo., projected that long DC lines linking the grids would reduce power costs and accelerate renewable energy development. According to the study’s leader, it showed that “building national-scale transmission makes sense.”

NREL’s grid modelers are now wrapping up a larger continent-wide study of the challenges and opportunities of integrating very high levels of wind, solar, and hydropower into the grid. That North American Renewable Integration Study is a collaborative effort by the U.S., Mexican, and Canadian governments. It was supposed to be completed this month but remains under review by the three governments.