All posts by Maria Gallucci

Hanford Has a Radioactive Capsule Problem

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/aerospace/military/hanford-has-a-radioactive-capsule-problem

At the vast reservation known as the Hanford Site in south-central Washington state, much of the activity these days concerns its 212 million liters (56 million gallons) of radioactive sludge. From World War II through the Cold War, the site produced plutonium for more than 60,000 nuclear weapons, creating enough toxic by-products to fill 177 giant underground tanks. The U.S. Department of Energy (DOE), which controls Hanford, is pushing to start “vitrifying,” or glassifying, some of that waste within two years. The monumental undertaking is the nation’s—and possibly the world’s—largest environmental cleanup effort. It has been going on for decades and will take decades more to complete.

But the tanks are not the only outsize radioactive hazard at Hanford. The site also houses nearly 2,000 capsules of highly radioactive cesium and strontium. Each of the double-walled, stainless-steel capsules weighs 11 kilograms and is roughly the size of a rolled-up yoga mat. Together, they contain over a third of the total radioactivity at Hanford.

For decades, the capsules have resided in a two-story building called the Waste Encapsulation and Storage Facility (WESF). Inside, the capsules sit beneath 4 meters of cooling water in concrete cells lined with stainless steel. The water surrounding the capsules glows neon blue as the cesium and strontium decay, a phenomenon known as Cherenkov radiation.

Built in 1973, the facility is well beyond its 30-year design life. In 2013, nuclear specialists in neighboring Oregon warned that the concrete walls of the pools had lost structural integrity due to gamma radiation emitted by the capsules. Hanford is located just 56 kilometers (35 miles) from Oregon’s border and sits beside the Columbia River. After leaving the site, the river flows through Oregon farms and fisheries and eventually through Portland, the state’s biggest city.

In 2014, the DOE’s Office of the Inspector General concluded that the WESF poses the “greatest risk” for serious accident of any DOE facility that’s beyond its design life. In the event of a severe earthquake, for instance, the degraded basins would likely collapse, draining the cooling water. In a worst-case scenario, the capsules would then overheat and break, releasing radioactivity that would contaminate the ground and air and render parts of the Hanford Site inaccessible for years and potentially reach nearby cities.

“If it’s bad enough, it means all cleanup essentially stops,” says Dirk Dunning, an engineer and retired Hanford expert who worked for the Oregon Department of Energy and who helped flag initial concerns about the concrete. “We can’t fix it, we can’t stop it. It just becomes a horrible, intractable problem.”

To avoid such a catastrophe, in 2015 the DOE began taking steps to transfer capsules out of the basins and into dry casks on an outdoor storage pad. The plan is to place six capsules inside a cylindrical metal sleeve; air inside the cylinder is displaced with helium to dissipate heat from the capsules. The sleeves are then fitted inside a series of shielded canisters, like a nuclear nesting doll. The final vessel is a 3.3-meter-tall cylindrical cask made of a special steel alloy and reinforced concrete. A passive cooling system draws cool air into the cask and expels warm air, without the need for fans or pools of water. The cask will sit vertically on the concrete pad. Eventually, there will be 16 to 20 casks. Similar systems are used to store spent nuclear fuel at commercial power plants, including the Columbia Generating Station at Hanford. The agency has until 31 August 2025 to complete the work, according to a legal agreement between the DOE, the state of Washington, and the U.S. Environmental Protection Agency.
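
For a sense of how those numbers fit together, here is a rough back-of-envelope check using only the figures above; the sleeves-per-cask ratio is an inference for illustration, not a DOE specification.

```python
import math

# Figures quoted above: ~1,936 capsules, six capsules per sleeve, 16 to 20 casks.
capsules = 1936
capsules_per_sleeve = 6
sleeves = math.ceil(capsules / capsules_per_sleeve)   # 323 sleeves in all

for casks in (16, 20):
    print(f"{casks} casks -> roughly {round(sleeves / casks)} sleeves per cask")
# 16 casks -> roughly 20 sleeves per cask
# 20 casks -> roughly 16 sleeves per cask
```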

Once the transfer is complete, the DOE estimates, the new facility will save more than US $6 million per year in operating costs. But it’s intended only as a temporary fix. After 50 years in dry storage—around 2075, in other words—the capsules’ contents could be vitrified as well, or else buried in an unspecified deep geologic repository.

Even that timeline may be too ambitious. At a congressional hearing in March, DOE officials said that treatment of the tank waste was the “highest priority” and sought to defer the capsule-transfer work and other cleanup efforts at Hanford. They also proposed slashing Hanford’s annual budget by $700 million in fiscal year 2021. The DOE Office of Environmental Management’s “strategic vision” for 2020–2030 [PDF] noted only that the agency “will continue to evaluate” the transfer of capsules currently stored at the WESF.

And the COVID-19 pandemic has further complicated the department’s plans. The DOE now says it “will be assessing potential impacts on all projects” resulting from reduced operations due to the pandemic. The department’s FY2021 budget proposal calls for “safely” deferring work on the WESF capsule transfers for one year, while supporting “continued maintenance, monitoring, and assessment activities at WESF,” according to a written response sent to IEEE Spectrum.

Unsurprisingly, community leaders and state policymakers oppose the potential slowdowns and budget cuts. They argue that Hanford’s cleanup—now over three decades in the making—cannot be delayed further. David Reeploeg of the Tri-City Development Council (TRIDEC) says the DOE’s strategic vision and proposed budget cuts add to the “collective frustration” at “this pattern of kicking the can down the road.” TRIDEC advocates for Hanford-related priorities in the adjacent communities of Richland, Kennewick, and Pasco, Wash. Reeploeg adds that congressional support over the years has been key to increasing Hanford cleanup funding beyond the DOE’s request levels.

How did Hanford end up with 1,936 capsules of radioactive waste?

The cesium and strontium inside the capsules were once part of the toxic mix stored in Hanford’s giant underground tanks. The heat given off by these elements as they decayed was causing the high-level radioactive waste to overheat, to the point of boiling. And so from 1967 to 1985, technicians extracted the elements from the tanks and put them in capsules.

Initially, the DOE believed that such materials, especially cesium-137, could be put to useful work: in thermoelectric power supplies, for calibrating industrial instruments, or for extending the shelf life of pork, wheat, and spices (though consumers are generally wary of irradiated foods). The department leased hundreds of capsules to private companies around the United States.

One of those companies was Radiation Sterilizers, which used Hanford’s cesium capsules to sterilize medical supplies at its facilities in Decatur, Ga., and Westerville, Ohio. In 1988, a capsule in Decatur developed a pinhole leak, and 0.02 percent of its contents escaped—a mess that took the DOE four years and $47 million to clean up. Federal investigators concluded that moving the capsules in and out of water more than 7,000 times caused temperature changes that damaged the steel. Radiation Sterilizers had removed temperature-measuring systems in its facility, among other failures cited by the DOE. The company, though, blamed the government for shipping a damaged capsule. Whatever the cause, the DOE recalled all capsules and returned them to the WESF.

The WESF now contains 1,335 capsules of cesium, in the form of cesium chloride. Most of that waste consists of nonradioactive isotopes of cesium; of the radioactive isotopes, cesium-137 dominates, with lesser amounts of cesium-135. Another 601 capsules contain strontium, in the form of strontium fluoride, with the main radioactive isotope being strontium-90.

Cesium-137 and strontium-90 have half-lives of 30 years and 29 years, respectively—relatively short periods compared with the half-lives of other materials in the nation’s nuclear inventory, such as uranium and plutonium. However, the present radioactivity of the capsules “is so great” that it will take more than 800 years for the strontium capsules to decay enough to be classified as low-level waste, according to a 2003 report by the U.S. National Research Council. And while the radioactivity of the cesium-137 will diminish significantly after several hundred years, cesium-135 has a half-life of 2.3 million years, which means that the isotope will eventually become the dominant source of radioactivity in the cesium capsules, the report said.
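
Those timescales follow from simple exponential decay: the fraction of activity remaining after t years is 0.5 raised to (t divided by the half-life). Here is a minimal sketch of that arithmetic; the 800-year figure also depends on regulatory thresholds for low-level waste that aren't reproduced here.

```python
# Fraction of a radionuclide's activity remaining after t years, given its half-life.
def fraction_remaining(t_years, half_life_years):
    return 0.5 ** (t_years / half_life_years)

for nuclide, half_life in [("cesium-137", 30.0), ("strontium-90", 29.0)]:
    for t in (100, 300, 800):
        frac = fraction_remaining(t, half_life)
        print(f"{nuclide}: {frac:.1e} of today's activity left after {t} years")
# After 800 years, strontium-90 has passed through roughly 27 half-lives, leaving only
# a few billionths of its original activity; cesium-135 (half-life 2.3 million years)
# barely decays at all on that timescale, which is why it eventually dominates.
```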

Workers at Hanford continue to monitor the condition of the capsules by periodically shaking the containers using a long metal gripping tool. If they hear a “clunk,” it means the inner stainless-steel pipe is moving freely and is thus considered to be in good condition. Some capsules, though, fail the clunk test, which indicates the inner pipe is damaged, rusty, or swollen, and thus can’t move. About two dozen of the failed capsules have been “overpacked”—that is, sealed in a larger stainless-steel container and held separately.

Moving the capsules from wet storage to dry is only temporary

The DOE has made substantial progress on the capsule-transfer work in recent years. In August 2019, CH2M Hill Plateau Remediation Company, one of the main environmental cleanup contractors at Hanford, completed designs to modify the WESF for removal of the capsules. In the weeks before COVID-19 temporarily shut down the site in late March, crews had started fabricating equipment to load capsules into sleeves, transfer them into casks, and move them outside. A team cleaned and painted part of the WESF to make way for the loading crane. At the nearby Maintenance and Storage Facility, workers were building a mock-up system to allow people to train and test equipment.

During the lockdown, employees working remotely continued with technical and design reviews and nuclear-safety assessments. With Hanford now in a phased reopening, CH2M Hill workers recently broke ground on the site of the future dry cask storage pad and have resumed construction at the mock-up facility. Last October, the DOE awarded Intermech, a construction firm owned by Emcor, a nearly $5.6 million contract to build a reinforced-concrete pad surrounded by two chain-link fences, along with utility infrastructure and a heavy-duty road connecting the WESF to the pad.

However, plans for fiscal year 2021, which starts in October, are less certain. In its budget request to Congress in February, the DOE proposed shrinking Hanford’s annual cleanup budget from $2.5 billion to about $1.8 billion. Officials sought no funding for WESF modification and storage work, eliminating $11 million from the current budget. Meanwhile, the agency sought to boost funding for tank-waste vitrification from $15 million to $50 million. Under its legal agreements, the DOE is required to start glassifying Hanford’s low-activity waste by 2023.

Reeploeg of the Tri-City Development Council says the budget cuts, if approved, would make it harder for the capsule-transfer project to stay on track.

Along with vitrification, he told Spectrum, “we think WESF is a top priority, too. Considering that the potential consequences of an event there are so significant, we want those capsules out of the pool and into dry-cask storage as quickly as possible.”

Reeploeg said the failure of another aging Hanford facility should have been a wake-up call. In 2017, a tunnel that runs into the Plutonium Uranium Extraction Plant partially collapsed, exposing highly radioactive materials. Officials had been aware of the tunnel’s structural problems since the 1970s. Ultimately, no airborne radiation leaks were detected, and no workers were hurt. But in a February 2020 report, the Government Accountability Office said the DOE hadn’t done enough to prevent such an event.

Hanford experts at Washington state’s Department of Ecology said a short-term delay on the WESF mission won’t significantly increase the threat to the environment or workers.

“We don’t believe that there’s an immediate health risk from a slowdown of work,” says Alex Smith, the department’s nuclear waste program manager. So long as conditions are properly maintained in the pool cells, the capsules shouldn’t see any noticeable aging or decay in the near-term, she says, but it still makes sense to transfer the capsules to reduce the risk of a worst-case disaster.

In an email to Spectrum, the DOE noted that routine daily inspections of the WESF pool walls haven’t revealed any visible degradation or spalling—flaking that occurs due to moisture in the concrete.

Still, for Hanford watchdogs, the possibility of any new delays compounds the seemingly endless nature of the environmental cleanup mission. Ever since Hanford shuttered its last nuclear reactor in 1987, efforts to extract, treat, contain, and demolish radioactive waste and buildings have proceeded in fits and starts, marked by a few successes—such as the recent removal of 27 cubic meters of radioactive sludge near the Columbia River—but also budgeting issues, technical hurdles, and the occasional accident.

“There are all these competing [cleanup projects], but the clock is running on all of them,” says Dunning, the Oregon nuclear expert. “And you don’t know when it’s going to run out.”

Autonomous Robots Could Mine the Deep Seafloor

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/automaton/robotics/robotics-software/autonomous-robots-could-mine-the-deep-seafloor

A battle is brewing over the fate of the deep ocean. Huge swaths of seafloor are rich in metals—nickel, copper, cobalt, zinc—that are key to making electric vehicle batteries, solar panels, and smartphones. Mining companies have proposed scraping and vacuuming the dark expanse to provide supplies for metal-intensive technologies. Marine scientists and environmentalists oppose such plans, warning of huge and potentially permanent damage to fragile ecosystems.

Pietro Filardo is among the technology developers who are working to find common ground.

His company, Pliant Energy Systems, has built what looks like a black mechanical stingray. Its soft, rippling fins use hyperbolic geometry to move in a traveling wave pattern, propelling the skateboard-sized device through water. From an airy waterfront lab in Brooklyn, New York, Filardo’s team is developing tools and algorithms to transform the robot into an autonomous device equipped with grippers. Their goal is to pluck polymetallic nodules—potato-sized deposits of precious ores—off the seafloor without disrupting precious habitats.

“On the one hand, we need these metals to electrify and decarbonize. On the other hand, people worry we’re going to destroy deep ocean ecosystems that we know very little about,” Filardo said. He described deep sea mining as the “killer app” for Pliant’s robot—a potentially lucrative use for the startup’s minimally invasive design.

How deep seas will be mined, and where, is ultimately up to the International Seabed Authority (ISA), a group of 168 member countries. In October, the intergovernmental body is expected to adopt a sweeping set of technical and environmental standards, known as the Mining Code, that could pave the way for private companies to access large tracts of seafloor. 

The ISA has already awarded 30 exploratory permits to contractors in sections of the Atlantic, Pacific, and Indian Oceans. Over half the permits are for prospecting polymetallic nodules, primarily in the Clarion-Clipperton Zone, a hotspot south of Hawaii and west of Mexico.

Researchers have tested nodule mining technology since the 1970s, mainly in national waters. Existing approaches include sweeping the seafloor with hydraulic suction dredges to pump up sediment, filter out minerals, and dump the resulting slurry in the ocean or tailing ponds. In India, the National Institute of Ocean Technology is building a tracked “crawler” vehicle with a large scoop to collect, crush, and pump nodules up to a mother ship.

Mining proponents say such techniques are better for people and the environment than dangerous, exploitative land-based mining practices. Yet ocean experts warn that stirring up sediment and displacing organisms that live on nodules could destroy deep sea habitats that took millions of years to develop. 

“One thing I often talk about is, ‘How do we fix it if we break it? How are we going to know we broke it?’” said Cindy Lee Van Dover, a deep sea biologist and professor at Duke University’s Nicholas School of the Environment. She said much more research is required to understand the potential effects on ocean ecosystems, which foster fisheries, absorb carbon dioxide, and produce most of the Earth’s oxygen.

Significant work is also needed to transform robots into metal collectors that can operate some 6,000 meters below the ocean surface.

Pliant’s first prototype, called Velox, can navigate the depths of a swimming pool and the shallow ocean “surf zone” where waves crash into the sand. Inside Velox, an onboard CPU distributes power to actuators that drive the undulating motions in the flexible fins. Unlike a propeller thruster, which uses a rapidly rotating blade to move small jets of water at high velocity, Pliant’s undulating fins move large volumes of water at low velocity. By pushing against the water with the fins’ large surface area, the robot can make rapid local maneuvers on relatively little battery power, allowing it to operate longer before needing to recharge, Filardo said.

The design also stirs up less sediment on the seafloor, a potential advantage in sensitive deep sea environments, he added.
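
The efficiency argument tracks with basic momentum theory: thrust is the mass of water moved per second times the velocity imparted to it, while the kinetic energy left behind in the wake grows with the square of that velocity. The sketch below is an idealized illustration of that trade-off, not a model of Pliant's actual fins; drag, fin dynamics, and other losses are ignored.

```python
# Idealized comparison: thrust F = mdot * dv, wake power P = 0.5 * mdot * dv**2.
# For the same thrust, moving more water (larger mdot) at lower added velocity (dv)
# wastes less kinetic energy in the wake.
def wake_power(thrust_n, dv_m_per_s):
    mdot = thrust_n / dv_m_per_s            # mass flow rate needed, kg/s
    return 0.5 * mdot * dv_m_per_s ** 2     # power lost to the wake, W

thrust = 10.0  # newtons; arbitrary illustrative value
print(wake_power(thrust, dv_m_per_s=4.0))   # narrow fast jet  -> 20.0 W in the wake
print(wake_power(thrust, dv_m_per_s=0.5))   # broad slow push  ->  2.5 W in the wake
```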

The Brooklyn company is partnering with the Massachusetts Institute of Technology to develop a larger next-generation robot, called C-Ray. The highly maneuverable device will twist and roll like a sea otter. Using metal detectors and a mix of camera hardware and computer algorithms, C-Ray will likely be used to surveil the surf zone for potential hazards on behalf of the U.S. Navy, which is sponsoring the research program.

The partners ultimately aim to deploy “swarms” of autonomous C-Rays that communicate via a “hive mind”—applications that would also serve to mine polymetallic nodules. Pliant envisions launching hundreds of gripper-equipped robots that roam the seafloor and place nodules in cages that float to the surface on gas-filled lift bags. Filardo suggested that C-Ray could also swap nodules with lower-value stones, allowing organisms to regrow on the seafloor.

A separate project in Italy may also yield new tools for plucking the metal-rich orbs.

SILVER2 is a six-legged robot that can feel its way around the dark and turbid seafloor, without the aid of cameras or lasers, by pushing its legs in repeated, frequent cycles.

“We started by looking at what crabs did underwater,” said Marcello Calisti, an assistant professor at the BioRobotics Institute, in the Sant’Anna School of Advanced Studies. He likened the movements to people walking waist-deep in water and using the sand as leverage, or the “punter” on a flat-bottomed river boat who uses a long wooden pole to propel the vessel forward.

Calisti and colleagues spent most of July at a seaside lab in Livorno, Italy, testing the 20-kilogram prototype in shallow water. SILVER2 is equipped with a soft elastic gripper that gently envelops objects, as if cupping them in the palm of a hand. Researchers used the crab-like robot to collect plastic litter on the seabed and deposit the debris in a central collection bin.

Although SILVER2 isn’t intended for deep sea mining, Calisti said he could foresee potential applications in the sector if his team can scale the technology.

For developers like Pliant, the ability to raise funding and field their mining robots will largely depend on the International Seabed Authority’s next meeting. Opponents of ocean mining are pushing to pause discussions on the Mining Code to give scientists more time to evaluate risks, and to allow companies like Tesla or Apple to devise technologies that require fewer or different metal parts. Such regulatory uncertainty could dissuade investors from backing new mining approaches that might never be used.

The biologist Van Dover said she doesn’t outright oppose the Mining Code; rather, rules should include stringent stipulations, such as requirements to monitor environmental impacts and immediately stop operations once damage is detected. “I don’t see why the code couldn’t be so well-written that it would not allow the ISA to make a mistake,” she said.

Turning Bricks Into Supercapacitors

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/batteries-storage/turning-red-bricks-into-energy-storage-devices

As solar panels and wind turbines multiply, the big problem is how to store all the excess electricity produced when the sun is shining or the wind is blowing, so it can be used at other times. Potential solutions have been suggested in many forms, including massive battery banks, fast-spinning flywheels, and underground vaults of air. Now a team of researchers say a classic construction material—the red fired brick—could be a contender in the quest for energy storage.

The common brick is porous like a sponge, and its red color comes from pigmentation that is rich in iron oxide. Both features provide ideal conditions for growing and hosting conductive polymers, Julio D’Arcy and colleagues have found. The team at Washington University in St. Louis transformed ordinary bricks into supercapacitors that can illuminate a light-emitting diode.

Supercapacitors are of interest because, unlike batteries, they can deliver blindingly fast bursts of power and they recharge quickly. The downside is that, kilogram for kilogram, they store relatively little energy compared to batteries. In an electric vehicle, a supercapacitor supports acceleration, but the lithium-ion module is what provides power for hundreds of miles. Yet many scientists and technology developers are hoping supercapacitors can replace conventional batteries in many applications, owing to the steep environmental toll of mining and disposing of metals. 

The building brick proof-of-concept project presents new possibilities for the world’s many brick walls and structures, said D’Arcy, an assistant professor of chemistry at Washington University. Rooftop solar panels connected by wires could charge the bricks, which in turn could provide in-house backup power for emergency lighting or other applications.

“If we’re successful [in scaling up], you’d no longer need batteries in your house,” he said by phone. “The brick itself would be the battery.”

The novel device, described in Nature Communications on Tuesday, is a far cry from the megawatt-scale storage projects underway in places like California’s desert and China’s countryside. But D’Arcy said the paper shows, for the first time, that bricks can store electrical energy. It offers “food for thought” in a sector that’s searching for ideas, he noted. 

Researchers began by buying armfuls of 65-cent red bricks at a big-box hardware store. At the lab, they studied the material’s microstructure and filled the bricks’ many pores with vapors. Next, the bricks went into an oven heated to 160 °C. The iron oxide triggered a chemical reaction, coating the bricks’ cavities with thin layers of PEDOT, the polymer known as poly(3,4-ethylenedioxythiophene).

Bricks emerged from the oven with a blackish-blue hue—and the ability to conduct electricity.

D’Arcy’s team then attached copper leads to two coated bricks. To stop the blocks from shorting out while stacked together, the researchers separated them with a thin polypropylene sheet. A sulfuric-acid-based solution served as the liquid electrolyte, and the bricks were connected via the copper leads to a AAA battery for about one minute. Once charged, the bricks could power a white LED for 11 minutes.

If applied to 50 bricks, the supercapacitor could power 3 watts’ worth of lights for about 50 minutes, D’Arcy said. The current set-up can be recharged 10,000 times and still retain about 90 percent of its original capacitance. Researchers are developing the polymer’s chemistry further in an effort to reach 100,000 recharges. 
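
Those figures imply a modest amount of stored energy per brick. A rough back-of-envelope calculation, using only the numbers D'Arcy cites:

```python
# 50 coated bricks powering 3 watts of lighting for about 50 minutes.
power_w = 3.0
runtime_s = 50 * 60
bricks = 50

total_energy_j = power_w * runtime_s        # 9,000 joules in total
per_brick_j = total_energy_j / bricks       # 180 joules per brick
per_brick_wh = per_brick_j / 3600           # ~0.05 watt-hours per brick

print(f"{total_energy_j:.0f} J total, ~{per_brick_j:.0f} J (~{per_brick_wh:.2f} Wh) per brick")
# For scale, a single AA alkaline cell holds on the order of 10,000 J,
# which is why the team is focused on making the polymer store more energy.
```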

However, the St. Louis researchers are not alone in the quest to use everyday (if unusual) materials to make supercapacitors.

In Scotland, a team at the University of Glasgow has developed a flexible device that can be fully charged with human sweat. Researchers applied a thin layer of PEDOT to a piece of polyester cellulose cloth that absorbs the wearer’s perspiration, creating an electrochemical reaction and generating electricity. The idea is that these coated cloths could power wearable electronics, using a tiny amount of sweat to keep running.

The Indian Institute of Technology-Hyderabad is exploring the use of corn husks in high-voltage supercapacitors. India’s corn-producing states generate substantial amounts of husk waste, which researchers say can be converted into activated carbon electrodes. The biomass offers a potentially cheaper and simpler alternative to electrodes derived from polymers and similar materials, according to a recent study in the Journal of Power Sources.

However, to make real inroads into the dominance of batteries, in which a chemical reaction generates the voltage, supercapacitors will need to significantly increase their energy density. D’Arcy said his electrically charged bricks are “two orders of magnitude away” from lithium-ion batteries, in terms of the amount of energy they can store.

“That’s another thing we’re trying to do—make our polymer store more energy,” he said. “A lot of groups are trying to do this,” he added, “but they didn’t do it in bricks.”

Zimbabwe Hopes Rural Electrification Can Stop Deforestation. Here’s Why It Might Not Work

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/environment/zimbabwe-hopes-rural-electrification-stop-deforestation-it-might-not-work

In Zimbabwe, where access to the electrical grid is sparse and unreliable, millions of people still burn wood to cook food and heat their homes. The practice is partly to blame for worsening deforestation in the landlocked country. In recent years, government officials have proposed a seemingly straightforward solution: Extend the electric grid into rural villages, and reduce the use of wood for fuel.

But Ellen Fungisai Chipango, a Zimbabwe-born researcher, says that rural electrification isn’t likely to provide any quick fixes. That’s because adding poles, wires, and even off-grid solar systems will do little to alleviate the crushing poverty that leads people to cut large swaths of trees. In her field work, she found that initiatives to expand energy access in Zimbabwe often overlook the larger political and economic forces at play.

Chipango is among researchers worldwide who are closely examining long-held assumptions that electrifying rural homes can boost family incomes, help children study, reduce indoor air pollution, or protect the environment. Stakeholders including scrappy solar startups, major oil and gas companies, and the United Nations have all pledged to work toward improving energy access for one or more of those reasons. But recent studies suggest that, in order to deliver real benefits, programs must be more comprehensive.

“Without addressing these underlying factors, just extending the grid to rural people will be tantamount to an empty gesture of goodwill,” Chipango said by Skype.

Chipango is a postdoctoral research fellow at the University of Johannesburg in South Africa. She recently discussed her findings in an article in The Conversation. Her piece, published in July, updates a case study she conducted in late 2016 and early 2017 in a district of Zimbabwe’s southeastern Manicaland province. 

Over two-thirds of Zimbabwe’s 16.2 million people live in rural areas, and roughly 80 percent of those residents can’t access electricity. In 2002, Zimbabwe created the Rural Electrification Agency to rapidly electrify the countryside. So far, the agency has connected thousands of schools and rural health centers. Yet few power lines extend beyond institutional buildings, and only about 10 percent of villages are electrified.
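
Taken together, those percentages describe an enormous number of households. A rough tally from the figures above:

```python
# Rough tally: ~16.2 million people, over two-thirds rural,
# and roughly 80 percent of rural residents without electricity access.
population_m = 16.2
rural_share = 2 / 3          # "over two-thirds", so this is a lower bound
rural_without_power = 0.80

rural_m = population_m * rural_share
unelectrified_m = rural_m * rural_without_power
print(f"~{rural_m:.1f} million rural residents, ~{unelectrified_m:.1f} million without electricity")
# -> roughly 10.8 million rural residents, about 8.6 million of them without power
```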

In Chipango’s study, most participants said they burned wood for cooking and heating. But personal energy needs were only part of the reason why they chopped down trees. Many people are clear-cutting forests out of sheer desperation. Zimbabwe’s economy is on the brink of collapse, and a prolonged drought—made worse by climate change—has brought about the worst hunger crisis in a decade. Rural residents stave off grinding poverty by selling wood. Their customers include city dwellers, who, because they can’t afford diesel generators or battery backup systems, burn the fuel during frequent and prolonged power outages.

When Chipango interviewed participants again earlier this year, she found their economic situations had worsened since 2017. Meanwhile, urban demand for wood has surged. Interviewees said they’d still keep cutting trees even if power lines finally arrived in their villages. One resident, when asked if off-grid solar would help instead, told Chipango: “It will give us light, but light does not put food on the table.”

Poverty also limits the potential of rural energy initiatives in other ways. If residents can’t afford to buy electric stoves, heaters, or other appliances, they can’t take full advantage of the electrons flowing into their homes. Certainly, enterprising people with a little cash on hand can grow their income by setting up neighborhood phone-charging shops or converting their yards into makeshift movie theaters. But many people won’t likely see their living conditions improve so easily.

“For almost all the traditional outcomes that people talk about when expanding energy access, there are two or three other things that people need for that outcome to actually be realized,” said Ken Lee, who leads the India division of the Energy Policy Institute at the University of Chicago (EPIC). “You can’t eat electricity,” he added.

Lee and colleagues led an experiment in Western Kenya comparing the experiences of rural “under grid” households, meaning homes that are located next to, but not connected to, utility infrastructure. Kenya’s Rural Electrification Authority connected randomly selected households, at a cost of more than US $1,000 each; the rest remained disconnected. After 18 months, researchers found no obvious differences between the socioeconomic living standards of both groups. (Aspects of the Kenya experiment are ongoing.)

The initial results, published in March, surprised the team, which expected to see more tangible gains among the electrified households. Not only did budget constraints keep many participants from buying appliances and using the new electricity supply, researchers also found little improvement in children’s test scores. Even if kids could study under a lightbulb at night, they still attended underfunded schools in the morning. 

Utility mismanagement further undermined electrification efforts. During the rural grid expansion, nearly one-fourth of all utility poles were apparently stolen, possibly leaving the connected residents with a less reliable power supply in the long run.

Globally, the problem of faulty equipment and lack of maintenance plagues many rural initiatives, including efforts to replace polluting indoor cookstoves with cleaner or electric models. In northern India, studies found that households revert to traditional stoves when new stoves break down. Sometimes, they use both at once, in a process known as “stacking” that undermines the health benefits of alternative models.

Lee and fellow researchers Catherine Wolfram and Edward Miguel, both of the University of California at Berkeley, also reviewed studies from other locations globally and reached a similar conclusion: Access to electricity alone isn’t enough to improve economic and noneconomic outcomes in a meaningful way. 

Still, Lee stressed, that doesn’t mean utilities, philanthropists, and companies should stop pursuing programs to bring grid power or off-grid technologies into rural and impoverished places. But it does clarify the need to design initiatives that do more than simply install infrastructure and assume the rest—rising incomes, better education—will naturally follow.

Chipango, reflecting on her Zimbabwe study, put it this way: “Energy access is not the mere presence of a grid. It’s the ability to use that energy.”


The Privatization of Puerto Rico’s Power Grid Is Mired in Controversy

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/policy/the-privatization-of-puerto-rico-power-grid-mired-in-controversy

When Hurricane Maria razed Puerto Rico in September 2017, the storm laid bare the serious flaws and pervasive neglect of the island’s electricity system. Nearly all 3.4 million residents lost power for weeks, months, or longer—a disaster unto itself that affected hospitals and schools and shut down businesses and factories.

The following January, then-Gov. Ricardo Rosselló signaled plans to sell off parts of the Puerto Rico Electric Power Authority (PREPA), leaving private companies to do what the state-run utility had failed to accomplish. Rosselló, who resigned last year, said it would take about 18 months to complete the transition.

“Our objective is simple: provide better service, one that’s more efficient and that allows us to jump into new energy models,” he said that June, after signing a law to start the process.

Yet privatization to date has been slow, piecemeal, and mired in controversy. Recent efforts seem unlikely to move the U.S. territory toward a cleaner, more resilient system, power experts say.

As the region braces for an “unusually active” 2020 hurricane season, the aging grid remains vulnerable to disruption, despite US $3.2 billion in post-Maria repairs. 

Puerto Rico relies primarily on large fossil fuel power plants and long transmission lines to carry electricity into mountains, coastlines, and urban centers. When storms mow down key power lines, or earthquakes destroy generating units—as was the case in January—outages cascade across the island. Lately, frequent brownouts caused by faulty infrastructure have complicated efforts to confront the COVID-19 outbreak. 

“In most of the emergencies that we’ve had, the centralized grid has failed,” says Lionel Orama Exclusa, an electrical engineering professor at the University of Puerto Rico-Mayagüez and member of Puerto Rico’s National Institute of Energy and Island Sustainability.  

He and many others have called for building smaller regional grids that can operate independently if other parts fail. Giant oil- and gas-fired power plants should similarly give way to renewable energy projects distributed near or within neighborhoods. Last year, Puerto Rico adopted a mandate to get to 100 percent renewable energy by 2050. (Solar, wind, and hydropower supply just 2.3 percent of today’s total generation.)
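
To gauge the scale of that mandate, consider the compound growth it implies for renewables' share of generation. The sketch below assumes a constant annual growth rate over roughly 30 years and ignores changes in total demand.

```python
# Implied compound annual growth in the renewable share of Puerto Rico's generation,
# from 2.3 percent today to 100 percent by 2050.
current_share = 0.023
target_share = 1.00
years = 30

annual_growth = (target_share / current_share) ** (1 / years) - 1
print(f"~{annual_growth * 100:.0f}% compound annual growth in renewables' share")
# -> roughly 13 percent per year, every year, for three decades
```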

So far, however, PREPA’s contracts to private companies have mainly focused on retooling existing infrastructure—not reimagining the monolithic system. The companies are also tied to the U.S. natural gas industry, which has targeted Puerto Rico as a place to offload mainland supplies.

In June, Luma Energy signed a 15-year contract to operate and maintain PREPA’s transmission and distribution system. Luma is a newly formed joint venture between infrastructure company Quanta Services and Canadian Utilities Limited. The contract is valued between $70 million and $105 million per year, plus up to $20 million in annual “incentive fees.”

Wayne Stensby, president and CEO of Luma, said his vision for Puerto Rico includes wind, solar, and natural gas and is “somewhere down the middle” between a centralized and decentralized grid, Greentech Media reported. “It makes no sense to abandon the existing grid,” he told the news site in June, adding that Luma’s role is to “effectively optimize that reinvestment.”

Orama Exclusa says he has “mixed feelings” about the contract.

If the private consortium can effectively use federal disaster funding to fix crumbling poles and power lines, that could significantly improve the system’s reliability, he says. But the arrangement still doesn’t address the “fundamental” problem of centralization.

He is also concerned that the Luma deal lacks transparency. Former utility leaders and consumer watchdogs have noted that regulators did not include public stakeholders in the 18-month selection process. They say they’re wary Puerto Rico may be repeating missteps made in the wake of Hurricane Maria.

As millions of Puerto Ricans recovered in the dark, PREPA quietly inked a no-bid, one-year contract for $300 million with Whitefish Energy Holdings, a two-person Montana firm with ties to then-U.S. Interior Secretary Ryan Zinke. Cobra Acquisitions, a fracking company subsidiary, secured $1.8 billion in federal contracts to repair the battered grid. Last September, U.S. prosecutors charged Cobra’s president and two officials at the Federal Emergency Management Agency with bribery and fraud.

A more recent deal with another private U.S. firm is drawing further scrutiny.

In March 2019, New Fortress Energy won a five-year, $1.5 billion contract to supply natural gas to PREPA and convert two units (totaling 440 megawatts) at the utility’s San Juan power plant from diesel to gas. The company, founded by billionaire CEO Wes Edens, completed the project this May, nearly a year behind schedule. It also finished construction of a liquefied natural gas (LNG) import terminal in the capital city’s harbor. 

“This is another step forward in our energy transformation,” Gov. Wanda Vázquez Garced said in May during a tour of the new facilities. Converting the San Juan units “will allow for a cheaper and cleaner fuel” and reduce monthly utility costs for PREPA customers, she said.

Critics have called for canceling the project, which originated after New Fortress submitted an unsolicited proposal to PREPA in late 2017. The ensuing deal gave New Fortress an “unfair advantage,” was full of irregularities, and didn’t undergo sufficient legal review or financial oversight, according to a June report by CAMBIO, a Puerto Rico-based environmental nonprofit, and the Institute for Energy Economics and Financial Analysis.

The project “would continue to lock in fossil fuels on the island and would prevent the aggressive integration of renewable energy,” Ingrid Vila Biaggi, president of CAMBIO, told the independent news program Democracy Now!

The U.S. Federal Energy Regulatory Commission, which oversees the transmission and wholesale sale of electricity and natural gas, also raised questions about the LNG import terminal.

On 18 June, the agency issued a rare show-cause order demanding that New Fortress explain why it didn’t seek prior approval before building the infrastructure at the Port of San Juan. The company has 30 days to respond.

Concerns over contracts are among the many challenges to revitalizing Puerto Rico’s grid. The island has been mired in a recession since 2006, amid a series of budget shortfalls, financial crises, and mismanagement—which contributed to PREPA filing for bankruptcy in 2017, just months before Maria struck. The COVID-19 pandemic is further eroding the economy, with Puerto Ricans facing widespread unemployment and rising poverty.

The coming months—typically those with the most extreme weather—will show if recent efforts to privatize the grid will alleviate, or exacerbate, Puerto Rico’s electricity problems.

New Microsatellite Will Focus on Industrial Methane Emissions

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/aerospace/satellites/microsatellite-industrial-methane-emissions

Claire, a microsatellite, was monitoring a mud volcano in Central Asia when a mysterious plume appeared in its peripheral view. The 15-kilogram spacecraft had spotted a massive leak of methane—a powerful climate pollutant—erupting from an oil and gas facility in western Turkmenistan. The sighting in January 2019 eventually spurred the operator to fix its equipment, plugging one of the world’s largest reported methane leaks to date.

Canadian startup GHGSat launched Claire four years ago to begin tracking greenhouse gas emissions. Now the company is ready to send its second satellite into orbit. On 20 June, the next-generation Iris satellite is expected to hitch a ride on Arianespace’s Vega 16 rocket from a site in French Guiana. The launch follows back-to-back delays due to a rocket failure last year and the COVID-19 outbreak.

GHGSat is part of a larger global effort by startups, energy companies, and environmental groups to develop new technologies for spotting and quantifying methane emissions. 

Although the phrase “greenhouse gas emissions” is almost synonymous with carbon dioxide, it refers to a collection of gases, including methane.  Methane traps significantly more heat in the atmosphere than carbon dioxide, and it’s responsible for about one-fourth of total atmospheric warming to date. While mud volcanoes, bogs, and permafrost are natural methane emitters, a rising share is linked to human activities, including cattle operations, landfills, and the production, storage, and transportation of natural gas. In February, a scientific study found that human-caused methane emissions might be 25 to 40 percent higher than previously estimated.

Iris’s launch also comes as the Trump administration works to ease regulations on U.S. fossil fuel companies. The U.S. Environmental Protection Agency in May sought to expedite a rollback of federal methane rules on oil and gas sites. The move could lead to an extra 5 million tons of methane emissions every year, according to the Environmental Defense Fund.

Stéphane Germain, president of Montreal-based GHGSat, said the much-improved Iris satellite will enhance the startup’s ability to document methane in North America and beyond.

“We’re expecting 10 times the performance relative to Claire, in terms of detection,” he said ahead of the planned launch date.

The older satellite is designed to spot light absorption patterns for both carbon dioxide and methane. But, as Germain explained, the broader spectral detection range requires some compromise on the precision and quality of measurements. Iris’s spectrometer, by contrast, is optimized for only methane plumes, which allows it to spot smaller emission sources in fewer measurements.

Claire’s detector also picks up stray light from outside its field of view, which accounts for about 25 percent of the light it collects. It also experiences “ghosting,” or internal light reflections within the camera and lens that lead to spots or mirror images. And space radiation has caused more damage to the microsat’s detector than developers initially expected.

With Iris, GHGSat has tweaked the optical equipment and added radiation shielding to minimize such issues on the new satellite, Germain said.

Other technology upgrades include a calibration feature that corrects for any dead or defective pixels that might mar the observational data. Iris will test an experimental computing system with 10 times the memory and four times the processing power of Claire. The new satellite will also test optical communications downlink, allowing the satellite to bypass shared radio frequencies. The laser-based, 1-gigabit-per-second downlink promises to be more than a thousand times faster than current radio transmission.
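
The practical payoff of the laser downlink is easiest to see as transfer time for a fixed batch of data. In the rough comparison below, the 5-gigabyte batch size is an illustrative assumption, not a GHGSat specification, and the radio rate simply takes the quoted "thousand times" figure at face value.

```python
# Time to move a hypothetical 5 GB batch of spectrometer data over each link.
data_bits = 5 * 8e9                     # 5 gigabytes expressed in bits
optical_bps = 1e9                       # quoted ~1 Gb/s laser downlink
radio_bps = optical_bps / 1000          # "a thousand times" slower radio link

print(f"optical: {data_bits / optical_bps:.0f} s")       # ~40 seconds
print(f"radio:   {data_bits / radio_bps / 3600:.1f} h")  # ~11 hours
```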

GHGSat is one of several ventures aiming to monitor methane from orbit. Silicon Valley startup Bluefield Technologies plans to launch a backpack-sized microsatellite in 2020, following a high-altitude balloon test of its methane sensors at nearly 31,000 meters. MethaneSAT, an independent subsidiary of the Environmental Defense Fund, expects to complete its satellite by 2022. 

The satellites could become a “big game changer” for methane-monitoring, said Arvind Ravikumar, an assistant professor of energy engineering at the Harrisburg University of Science and Technology in Pennsylvania. 

“The advantage of something like satellites is that it can be done remotely,” he said. “You don’t need to go and ask permission from an operator — you can just ask a satellite to point to a site and see what its emissions are. We’re not relying on the industry to report what their emissions are.”

Such transparency “puts a lot of public pressure on companies that are not managing their methane emissions well,” he added.

Ravikumar recently participated in two research initiatives to test methane-monitoring equipment on trucks, drones, and airplanes. The Mobile Monitoring Challenge, led by Stanford University’s Natural Gas Initiative and the Environmental Defense Fund, studied 10 technologies at controlled test sites in Colorado and California. The Alberta Methane Field Challenge, an industry-backed effort, studied similar equipment at active oil-and-gas production sites in Alberta, Canada.

Both studies suggest that a combination of technologies is needed to effectively identify leaks from wellheads, pipelines, tanks, and other equipment. A plane can quickly spot methane plumes during a flyover, but more precise equipment, such as a handheld optical-gas-imaging camera, might be necessary to further clarify the data.

GHGSat’s technology could play a similarly complementary role with government-led research missions, Germain said. 

Climate-monitoring satellites run by space agencies tend to have “very coarse resolutions, because they’re designed to monitor the whole planet all the time to inform climate change models. Whereas ours are designed to monitor individual facilities,” he said. The larger satellites can spot large leaks faster, while Iris or Claire could help pinpoint the exact point source.

After Iris, GHGSat plans to launch a third satellite in December, and it’s working to add eight more spacecraft — the first in a “constellation” of pollution-monitoring satellites. “The goal ultimately is to track every single source of carbon dioxide and methane in the world, routinely,” Germain said.

Next-Gen Solar Cells Can Harvest Indoor Lighting for IoT Devices

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/renewables/next-gen-solar-cells-harvest-indoor-lighting-iot-devices

Billions of Internet-connected devices now adorn our walls and ceilings, sensing, monitoring, and transmitting data to smartphones and far-flung servers. As gadgets proliferate, so too do their electricity demand and their appetite for household batteries, most of which wind up in landfills. To combat waste, researchers are devising new types of solar cells that can harvest energy from the indoor lights we’re already using.

The dominant material used in today’s solar cells, crystalline silicon, doesn’t perform as well under lamps as it does beneath the blazing sun. But emerging alternatives—such as perovskite solar cells and dye-sensitized materials—may prove to be significantly more efficient at converting artificial lighting to electrical power. 

A group of researchers from Italy, Germany, and Colombia is developing flexible perovskite solar cells specifically for indoor devices. In recent tests, their thin-film solar cell delivered power conversion efficiencies of more than 20 percent under 200 lux, the typical amount of illuminance in homes. That’s about triple the indoor efficiency of polycrystalline silicon, according to Thomas Brown, a project leader and engineering professor at the University of Rome Tor Vergata.
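
For a rough sense of how much power that represents: converting illuminance (lux) to irradiance depends on the light source's spectrum, so the sketch below assumes roughly 300 lumens per optical watt, typical of white LED light, along with a hypothetical 10-square-centimeter cell.

```python
# Rough harvested-power estimate for an indoor cell at 200 lux and 20 percent efficiency.
# The lux-to-irradiance conversion factor and the cell area are assumptions for illustration.
illuminance_lux = 200
lumens_per_optical_watt = 300          # assumed factor for a white LED spectrum
cell_area_m2 = 10e-4                   # 10 cm^2, hypothetical small IoT cell
efficiency = 0.20

irradiance_w_m2 = illuminance_lux / lumens_per_optical_watt   # ~0.67 W/m^2
harvested_w = irradiance_w_m2 * cell_area_m2 * efficiency
print(f"~{harvested_w * 1e6:.0f} microwatts")   # on the order of 100 uW, enough for duty-cycled sensors
```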

‘Hydrogen-On-Tap’ Device Turns Trucks Into Fuel-Efficient Vehicles

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/transportation/alternative-transportation/hydrogen-on-tap-device-trucks-fuel-efficient-vehicles

The city of Carmel, Ind., has trucks for plowing snow, salting streets, and carrying landscaping equipment. But one cherry-red pickup can do something no other vehicle can: produce its own hydrogen.

A 45-kilogram metal box sits in the bed of the work truck. When a driver starts the engine, the device automatically begins concocting the colorless, odorless gas, which feeds into the engine’s intake manifold. This prevents the truck from guzzling gasoline until the hydrogen supply runs out. The pickup has no fuel cell module, a standard component in most hydrogen vehicles. No high-pressure storage tanks or refueling pumps are needed, either.

Visiting the City That Built the Hanford Nuclear Site

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/nuclear/visiting-the-city-that-build-hanford-nuclear-site

As my plane approaches the Tri-Cities Airport in south-central Washington state, the sandy expanse outside my window gives way to hundreds of bright green circles and squares. From this semi-arid terrain springs an irrigated oasis of potatoes, hops, peaches, and sweet corn. Just beyond our view is one of the most contaminated places in the world: the Hanford Site, home to 177 aging tanks of radioactive waste.

A Glass Nightmare: Cleaning Up the Cold War’s Nuclear Legacy at Hanford

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/aerospace/military/a-glass-nightmare-cleaning-up-the-cold-wars-nuclear-legacy-at-hanford

It’s a place of superlatives. Reporters have called it the most polluted place in the Western Hemisphere. It’s also the location of one of the largest construction projects in the world.

At the Hanford Site in south-central Washington state, 177 giant tanks sit below the sandy soil, brimming with the radioactive remnants of 44 years of nuclear-materials production. From World War II through the Cold War, Hanford churned out plutonium for more than 60,000 nuclear weapons, including the atomic bomb that razed Nagasaki, Japan, in August 1945. The sprawling enterprise eventually contaminated the soil and groundwater and left behind 212 million liters of toxic waste—enough to fill 85 Olympic-size swimming pools. Decades after the site stopped producing plutonium, the U.S. government is still grappling with how to clean it all up.

Today the 1,500-square-kilometer site, roughly half the size of Rhode Island, is a quiet expanse of sagebrush and wispy grasses outside Richland, Wash. The underground steel-and-reinforced-concrete tanks are grouped in “farms” beneath a central plateau, while shuttered nuclear reactors stand like sentinels on the periphery. Scientists have identified some 1,800 contaminants inside the tanks, including plutonium, uranium, cesium, aluminum, iodine, and mercury. Watery liquids rest atop goop as thick as peanut butter and salt cakes resembling wet beach sand.

The waste is what’s left of an intense period in wartime and Cold War innovation. Starting in 1943, Hanford experts pioneered industrial-scale methods for chemically separating plutonium from irradiated uranium, and doing so safely. Their original bismuth-phosphate process yielded hockey-puck-size “buttons” of plutonium, which were then formed into spherical cores and used in the 1945 Trinity atomic bomb test in New Mexico and then the Nagasaki bomb. Over the years, five more processes followed, culminating with plutonium uranium extraction (PUREX), which became the global standard for processing nuclear fuels.

Each of these methods produced its own distinct waste streams, which were stored on-site and then pumped into underground storage tanks. When some of the older single-shell tanks started leaking years later, workers pumped the liquids into newer, sturdier double-shell tanks. Chemical reactions ensued as the different waste products mixed together, leaving each tank filled with its own complex aggregation of liquids, solids, and sludges.

The upshot is that by 1987, when Hanford stopped producing plutonium, the tank farms contained a deadly brew of chemicals, metals, and long-lasting radionuclides. No two of the 177 tanks contain exactly the same concoction, but they all pose a significant public risk. The site borders the Columbia River, which nourishes the region’s potato crops and vineyards, serves as a breeding ground for salmon, and provides drinking water for millions of people. So far, the aging, corroding vessels have leaked roughly 4 million liters. Some experts have said it’s only a matter of time before more waste seeps through the cracks.

The U.S. Department of Energy (DOE), which controls Hanford, has for decades had a goal of treating and “vitrifying,” or glassifying, the tank waste for safer disposal. Vitrification is a time-tested method for immobilizing radioactive waste by turning it into glass blocks. With the waste thus encased, the harmful radionuclides cannot leach into rivers or underground water tables. To enhance the isolation, the most radioactive blocks are put in steel containers, which can then be deposited in a dry and geologically stable underground vault. Vitrification plants have been built and successfully operated in Belgium, France, Germany, Japan, Russia, the United Kingdom, and the United States.

But Hanford’s waste is unique among the world’s nuclear leftovers, in both composition and volume. Before they can turn it into glass, workers must first figure out exactly what is inside each tank and then develop glassmaking formulas for each batch.

It is a monumental task, and it’s just one facet of one of the biggest engineering projects in the world. The centerpiece of the work is a series of vast facilities called the Waste Treatment and Immobilization Plant, also known as the Hanford Vit Plant, sprawled over some 25 hectares (65 acres). The DOE currently estimates that it will cost US $16.8 billion to finish the plant, which is being built by Bechtel National and a host of subcontractors. Even as scientists continue to puzzle over Hanford’s tank waste, and as contractors flip the lights on in shiny new buildings, concerns about massive cost overruns, contractor lapses, and missed deadlines weigh heavily on the project. Hanford, born and built feverishly in the heat of World War II, now seems to be in a slow, meandering slog toward an unseen finish line.

“Hanford is unique,” says Will Eaton, who leads the vitrification task force at the DOE’s Pacific Northwest National Laboratory (PNNL) in Richland. “There’s been lots of work done on the details, to make sure we have the highest likelihood of real, efficient success when we get going. Because it’s a long mission.” Eaton, who is 53 years old, adds, “My goal is that the plant actually starts up before I retire.”

I visited Hanford in July 2019 to get a better understanding of the many challenges facing the beleaguered vitrification project. I met Eaton on a blindingly sunny afternoon on the PNNL campus, which sits in an oasis of green trees amid the desert scrub. Hanford begins directly across the street, stretching out toward the flat ridge of Rattlesnake Mountain.

Eaton held up a clear plexiglass vessel, about 13 centimeters in diameter. In May 2018, his team used containers like this to glassify 11 liters of waste from two of the Hanford tanks. As a safety precaution, the experiment was conducted beneath a radioisotope fume hood. Those vessels contain the largest volume of Hanford waste that’s been vitrified so far, after three decades and billions of dollars. Just 211,999,989 more liters to go.

After I met with Eaton, I set off to visit Hanford. The DOE wasn’t letting individual journalists visit the Vit Plant, so I opted for the next best thing: I joined a public tour of the Hanford cleanup site. About a dozen passengers and I rode in an air-conditioned bus through the reservation, most of which resembles arid parkland. Tall bluffs stand off in the distance, carved by ancient rivers. Herds of elk sought shade among spindly trees near an abandoned schoolhouse.

It’s an incongruous but resonant sight. In 1943, as part of the Manhattan Project, the U.S. government seized a vast swath of land, including the towns of White Bluffs and Hanford, to build a nuclear manufacturing complex. The government ordered 1,500 homesteaders to leave their farms and towns, and Native American tribes were barred from visiting sacred fishing, hunting, and ceremonial grounds. To the west, members of the Wanapum tribe still live in a community that overlooks Hanford.

As the bus ascends the central plateau, sweeping vistas give way to rumbling forklifts, workers in hard hats, and buildings wrapped in scaffolding. Our tour guide notes that his great-nephew works here as a welder, a member of the 2,800-person construction crew.

The Vit Plant was born out of a comprehensive 1989 cleanup agreement among the DOE, the U.S. Environmental Protection Agency, and the state of Washington’s Department of Ecology. Construction began in 2002 and was supposed to wrap up by 2011, at a cost of $4.3 billion. But a series of major unforeseen problems soon cropped up, including dangerous hydrogen accumulation in piping and ancillary vessels, and inadequate ventilation for managing radon and other gases that are produced as the radioactive waste material breaks down. Cost estimates soared, and timelines stretched.

Today, the Vit Plant is a complex of buildings the size of a small town. Its 56 systems require an electric power grid that could light up 2,250 houses and a chilled-water system that could air-condition 23,500 more. A 1.3-million-liter storage tank can hold enough diesel fuel to fill the tanks of 19,000 cars at once.

Even after the Vit Plant is completed, the actual cleanup will take decades more. In its 2019 Hanford Lifecycle Scope, Schedule and Cost Report [PDF], the DOE estimated that the process of vitrifying and disposing of Hanford’s waste could cost as much as $550 billion and last 60 years.

The plan calls for tank waste to flow via underground pipes to a massive pretreatment facility. This facility will eventually rise 12 stories, although during my tour it’s still just an outline of metal frames, above which hovers a motionless yellow crane. Inside sealed tanks, pulse-jet mixers, working like turkey basters, will suck up the waste and eject it at high velocity, to keep the whole tank mixed and prevent solid particles from settling. Ion exchangers will remove highly radioactive isotopes, dividing the waste stream into two groups. High-level radioactive waste makes up only about 10 percent of the total waste by volume but accounts for 90 percent of the radioactivity, Eaton says. The remaining waste is considered low-activity waste, containing very small amounts of radionuclides.

The appropriate streams will flow to separate high-level and low-activity vitrification facilities. In both, technicians will mix the waste with silica and other glass-forming materials and then pour the lot into a ceramic-lined melter. Immersed electrodes will heat the melter’s tank to nearly 1,150 °C, turning the mixture into a red-hot goop of molten glass. Molten low-activity glass will be poured into stainless-steel containers, where it will cool and harden into 2.3-meter-tall, 1.2-meter-diameter logs. High-level waste glass will go into longer, skinnier stainless-steel canisters, 4.4 meters tall and 0.6 meter in diameter.
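
For a sense of scale, those dimensions imply that each low-activity log holds roughly twice the glass of a high-level canister. The back-of-the-envelope Python below treats each container as a simple cylinder and ignores wall thickness and headspace; the results are rough estimates derived from the dimensions above, not DOE specifications.

```python
import math

def cylinder_volume_m3(height_m, diameter_m):
    """Volume of a simple cylinder, ignoring wall thickness and headspace."""
    return math.pi * (diameter_m / 2) ** 2 * height_m

# Low-activity waste container: 2.3 m tall, 1.2 m diameter
law_volume = cylinder_volume_m3(2.3, 1.2)
# High-level waste canister: 4.4 m tall, 0.6 m diameter
hlw_volume = cylinder_volume_m3(4.4, 0.6)

print(f"Low-activity log:    ~{law_volume:.1f} m^3")  # roughly 2.6 m^3
print(f"High-level canister: ~{hlw_volume:.1f} m^3")  # roughly 1.2 m^3
```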

Off-gases, including steam and nitrogen oxides, will exit through a nozzle in the melter’s roof, to be collected and treated to remove radioactive isotopes and keep pollutants out of the environment.

Up to 1,000 steel-encased logs of low-activity waste will be produced each year and then buried in nearby trenches. The Vit Plant complex also includes an analytical laboratory, which will test some 3,000 glass samples of low-activity waste each year, ensuring that the vitrified waste meets regulatory requirements.

Once completed, the high-level waste plant is slated to produce some 640 canisters per year. The vitrified high-level waste is considered too dangerous to keep on-site, even inside the steel canisters. Instead, that waste will be sent to an as-yet-unidentified off-site location. The original plan called for the high-level waste to be buried in a deep geologic repository such as the proposed and long-delayed Yucca Mountain site in Nevada. Construction on Yucca Mountain began in 1994 but was halted during the Obama administration amid fierce resistance from Nevada politicians, Native American groups, environmentalists, and others. President Trump, who called for the revival of the project early in his administration, recently reversed his stance on the matter. At present, there are no plans to build a deep repository anywhere in the United States.

Meanwhile, Hanford cleanup experts are figuring out ways to dramatically reduce the number of vitrified logs they’ll need to produce and store. When workers began building the Vit Plant 18 years ago, for instance, researchers were designing glasses that contained no more than 10 percent waste, the rest being materials necessary for glass forming. By modeling different formulas, a team at PNNL found they could double the waste portion to 20 percent, in part by finding ways to accommodate more aluminum, chromium, and other chemicals. That could halve the number of glass logs that Hanford has to produce and store.
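
The arithmetic behind that halving is simple: if each log holds a fixed mass of glass, doubling the fraction of that glass that is waste doubles the waste per log and therefore halves the number of logs. A minimal sketch, using placeholder tonnages that are not DOE figures:

```python
def logs_needed(total_waste_tonnes, waste_fraction, glass_per_log_tonnes):
    """Number of glass logs needed at a given waste loading.

    Each log holds glass_per_log_tonnes of glass, of which waste_fraction
    is actual waste oxide and the rest is glass-forming additives.
    """
    waste_per_log = glass_per_log_tonnes * waste_fraction
    return total_waste_tonnes / waste_per_log

# Placeholder figures, for illustration only.
TOTAL_WASTE = 10_000    # tonnes of waste oxides
GLASS_PER_LOG = 6.0     # tonnes of glass per log

at_10_percent = logs_needed(TOTAL_WASTE, 0.10, GLASS_PER_LOG)
at_20_percent = logs_needed(TOTAL_WASTE, 0.20, GLASS_PER_LOG)

print(f"10% loading: {at_10_percent:,.0f} logs")
print(f"20% loading: {at_20_percent:,.0f} logs")  # exactly half as many
```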

As the tour bus winds its way through the Hanford Site, empty dirt patches mark the footprints of demolished buildings from the plutonium-production period. Their scraps are now interred in a massive landfill, which holds more than 16 million metric tons of low-level radioactive, hazardous, and mixed wastes. A Hanford employee on the bus points to black pipes snaking along the road; these carry contaminated groundwater away from the Columbia River and toward a central treatment plant, we’re told.

During Hanford’s plutonium-production heyday, workers discharged some 1.7 trillion liters of waste liquids into soil disposal sites, which developed into vast underground plumes of toxic chemicals, including the carcinogens hexavalent chromium and carbon tetrachloride [PDF], that infiltrated aquifers. Today six underground pump-and-treat systems hydraulically push contaminants toward the 200 West Groundwater Treatment Plant, a cavernous space filled with silver tubes and tall gray bioreactors. The plant’s operator, CH2M Hill (now part of Jacobs Engineering Group), says it treats some 7.6 billion liters of groundwater every year. In September 2019, workers removed the last of the highly radioactive sludge that was being stored in underwater containers near the river.

Our tour complete, the bus heads back down the dusty plateau, past taco trucks and wisecracking signs: “Got Sludge? Yes We Do!”

The construction is “essentially complete,” the DOE says, on the Vit Plant’s low-activity vitrification facility, analytical laboratory, and most of the smaller support buildings. But work on the pretreatment facility has been “deferred,” as Hanford experts try to resolve technical questions regarding the separation and processing of waste and the design life of the facility’s equipment. In late 2016, officials also decided to halt construction on the high-level vitrification facility so they could focus on treating the low-activity waste.

To make progress on the low-activity waste, the DOE’s latest strategy calls for bypassing the pretreatment facility. Instead, the liquid waste will be pumped into a smaller system, near the tanks where the waste is being stored. This system will filter out large solids and remove radioactive cesium, which has a relatively short half-life but emits high amounts of tissue-damaging gamma radiation and is thus considered the most immediately dangerous of the radionuclides in the waste. The liquid will then flow directly to the low-activity waste vitrification plant to be glassified. An effluent-management facility will handle the liquid waste produced by the glass melters and off-gas treatment system.

The DOE’s Office of River Protection, which oversees the tank cleanup mission, says it is on track to start processing low-activity waste this way as soon as 2022. As part of the preparations, in May 2019, Hanford workers began installing two towering, 145-metric-ton vessels that will hold effluent.

Last August, officials from the DOE and Bechtel National celebrated the opening of a 1,860-square-meter annex to the low-activity waste facility. The building houses the control room and operations center, where workers will perform startup and testing activities.

At the ribbon-cutting ceremony, the Vit Plant’s project director, Valerie McCain, said, “We are getting closer to making low-activity-waste glass.”

It’s anybody’s guess when Hanford will start vitrifying the high-level waste. The DOE says the technical issues that stalled construction have mostly been resolved but that it “cannot project with certainty” when the pretreatment and high-level waste vitrification facilities will be completed and put into service. The answer depends on many variables, including federal funding, the efficiency of contractors, and the pace of technological advances. In September, the department warned regulators in the state of Washington that it is at “serious” risk of missing deadlines to start treating high-level waste by 2033 and have the plant fully operational by 2036. The deadlines are specified in legal agreements among the DOE, the state of Washington, and other interested parties.

Meanwhile, the DOE is also studying alternative methods for treating some of the waste, including filling the tanks with a concrete-like grout, to in effect immobilize the waste in place. Officials had considered such a strategy earlier in the cleanup mission, but they ultimately ruled that vitrification was the safest, surest path for treatment.

Regulators as well as activists say they are frustrated to be revisiting the glass-versus-grout debate, particularly given how much work is still left on the Vit Plant. “It can be hard on folks to feel like they’re beating their heads against a wall and not actually accomplishing the stuff they set out to accomplish,” says Alex Smith, the nuclear waste program manager for Washington state’s Department of Ecology.

Adding to the sense of inertia is the somber fact that most people working on Hanford cleanup today won’t be alive to see the end results. A person in her 40s now would be a centenarian in 2078, the year the DOE expects to conclude its cleanup work.

“It’s easy to say, ‘Well, what do you care? You’re not going to be here when the consequences of this decision hit,’ ” Smith adds. “It’s really a challenge for our workforce, for the DOE workforce, and for people who have been working at Hanford for a long time.”

To keep people aware of the Hanford mission, Smith’s department is increasing community outreach, through social media and school talks. She says public awareness is key to ensuring lawmakers continue to fund the cleanup—even if most U.S. taxpayers have never even heard of it. The waste may be buried in Washington state, but it’s the product of federal actions meant to safeguard the entire country, through nuclear weapons production.

“We feel that this is a national cleanup,” agrees Susan Leckband, who chairs the Hanford Advisory Board. The board offers policy advice to the DOE and regulators, and it includes local experts, current and former Hanford workers, representatives from neighboring Oregon, and members of three tribal governments: Nez Perce Tribe, Yakama Nation, and the Confederated Tribes of the Umatilla Indian Reservation.

Leckband acknowledges that people outside of Washington state don’t necessarily share the board’s perspective. “They have their own problems,” she says. “I get that. There are not unlimited funds.” She worries about a growing push for “faster and less expensive” solutions to the cleanup mission, rather than a “better and more permanent” approach.

John Vienna, a materials scientist at Pacific Northwest National Laboratory, hands me a shiny rectangular glass slab. The rusty red and orange stripes are iron, he says, of which there’s an abundance in Hanford’s high-level waste. Vienna’s team analyzes myriad materials to observe how they behave in glass. Inside the lab, cross-sections of metal canisters reveal glasslike obsidian, made from simulants of high-level tank waste. Chunks as green as emeralds contain low-activity waste simulants.

Vienna explains that the contaminants don’t sit inside the glass, like beer swishing in a bottle. Rather, they become part of the “bottle” itself, atomically bound in place until the glass dissolves—which won’t be for “upwards of a million years,” he says. By then, the troubling radionuclides will all have decayed to relatively benign levels.

The two waste types are challenging to treat for different reasons. High-level waste contains higher levels of the “cold chemicals,” such as aluminum, that were used in the less efficient stages of plutonium production and that don’t dissolve easily in glass. Low-activity waste is mostly made of sodium salts, which can make glass less durable. Glass formulations must account for these distinct complications.

Scientists at PNNL’s sprawling campus have worked on waste vitrification for more than half a century. In the 1970s, for example, the lab developed the technology for the ceramic melters at the heart of Hanford’s high-level waste and low-activity waste facilities. Other U.S. locations as well as sites in Japan and Europe have used the technology to glassify their nuclear waste. Glassification began in 1996 at the DOE’s Savannah River Site in South Carolina, the United States’ other plutonium-production site, where some 133 million liters of radioactive liquid waste were stored. To date, a little over half of the waste has been processed. At the West Valley Demonstration Project, near Buffalo, N.Y., the DOE vitrified all 2.3 million liters of waste before demolishing the facility.

Compared to Hanford, those sites had less waste, and it was far more uniform in composition. For West Valley, scientists spent years developing one general formula that could be used to treat all of the waste, says Vienna, who worked on that project and several others. Given the sheer volume and complexity of Hanford’s 212 million liters of tank waste, experts have to take a different approach.

Researchers at PNNL are creating computational models based on the behavior of actual tank waste, chemically similar simulants, and lab tests. Inside clear cabinets, they study how glass samples are affected by extremely high and low temperatures and by water, so that they can verify the glass will dissolve slowly enough to outlive the radioactive hazard. To understand the effects of time, they’ve examined the structures of ancient glasses, including a 2- to 4-million-year-old piece of Icelandic basalt glass and a 1,800-year-old bowl handle recovered from a shipwreck in the Adriatic Sea. The idea is that when the Vit Plant becomes operational, experts will be able to refine the glass compositions on the fly, right up until the mixtures hit the melter. Vienna’s group is responsible for the modeling that will enable Hanford to double the amount of waste per glass log, for instance.
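
PNNL’s actual models are far more sophisticated than what follows, but many glass-property models take the general form of a linear “mixture model,” in which a property is estimated as a weighted sum of the component fractions in the melt. The sketch below, with invented coefficients and limits, is meant only to illustrate how such a model could flag an out-of-spec batch before it reaches the melter; it is not PNNL’s code.

```python
# Illustrative linear "mixture model" for a glass property (say, a viscosity
# or durability index). Coefficients are invented for demonstration; real
# models are fit to thousands of measured glasses.
COEFFICIENTS = {
    "SiO2": 2.1,
    "B2O3": -1.4,
    "Na2O": -3.0,
    "Al2O3": 1.8,
    "waste_oxides": -0.5,
}

def predict_property(mass_fractions):
    """Weighted sum of component mass fractions (fractions should sum to ~1)."""
    return sum(COEFFICIENTS[c] * f for c, f in mass_fractions.items())

# A hypothetical batch: 20 percent waste loading, 80 percent glass formers.
batch = {"SiO2": 0.45, "B2O3": 0.10, "Na2O": 0.15, "Al2O3": 0.10,
         "waste_oxides": 0.20}

value = predict_property(batch)
# Hypothetical processing window; real limits come from melter constraints
# and durability requirements.
if 0.2 <= value <= 1.5:
    print(f"Predicted property {value:.2f}: batch within processing window")
else:
    print(f"Predicted property {value:.2f}: adjust glass formers before melting")
```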

“Part of what our group does is understand how we can push the limits,” says Charmayne Lonergan, a PNNL materials scientist. “As you start doing that, you start cutting back on the number of years that processing all the waste may take. You start cutting back on costs, time, labor, facilities, and resources.”

Meanwhile, the clock is ticking, and an air of uncertainty still surrounds the Vit Plant. The DOE is moving to reclassify some of the nation’s nuclear waste as less dangerous, which could allow it to sidestep vitrification for some of Hanford’s tank waste.

In particular, the department said in June 2019 that it was changing the way it interpreted the definition of “high-level radioactive waste” at Hanford, Savannah River, and the Idaho National Laboratory. Traditionally, any by-products from processing highly radioactive nuclear fuels have been considered high-level waste, which must be buried in deep geological repositories. All of Hanford’s waste (before pretreatment) falls into this category. The department wants instead to categorize waste based not on how it was produced but on its radiological characteristics.

Under the revised definition, waste from fuel processing could be considered “low-level radioactive waste” if it doesn’t exceed certain radioactive concentration limits. The limit for cesium-137, for instance, is 4,600 curies (about 1.7 × 10¹⁴ becquerels) per cubic meter.
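
That becquerel figure is simply a unit conversion: one curie is defined as 3.7 × 10¹⁰ becquerels. A one-line check:

```python
CURIES_PER_CUBIC_METER = 4_600
BQ_PER_CURIE = 3.7e10  # definition of the curie

limit_bq = CURIES_PER_CUBIC_METER * BQ_PER_CURIE
print(f"Cs-137 limit: {limit_bq:.2e} Bq per cubic meter")  # ~1.7e14
```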

Under the new interpretation, low-level waste wouldn’t necessarily have to move through Hanford’s pretreatment and vitrification facilities. Some of it could potentially be turned into a groutlike form and trucked to a private waste repository in Texas. In other cases, Hanford workers could pour grout directly into tanks, as was done with seven underground vessels at Savannah River.

Federal officials and other proponents of this strategy say these steps could dramatically cut the time and cost required to treat Hanford’s tank waste. PNNL and five other DOE national laboratories have voiced “strong support” for the technical merits of the new interpretation.

Paul M. Dabbar, the DOE’s Under Secretary for Science, told reporters that the department will “analyze each waste stream and manage it in accordance with Nuclear Regulatory Commission standards, with the goal of getting the lower-level waste out of these states without sacrificing public safety.” He said that each tank considered for classification as low-level waste would require an environmental study, under the National Environmental Policy Act.

But critics, including Washington governor Jay Inslee and the state’s Department of Ecology, say that reclassifying Hanford’s waste will jeopardize environmental safety and give the DOE unilateral control over the cleanup mission. In a letter to the DOE, leaders of the Yakama Nation expressed their concern that the changes would lead to more contamination at the site and “a lower standard of clean-up.”

This latest controversy highlights the constant calculations that officials, regulators, activists, and citizens must make in confronting Hanford’s toxic legacy. Policy changes designed to accelerate cleanup have to be weighed against the safety and well-being of people who won’t be born for tens of thousands of years. Waste treatment methods are viewed through the prism of limited, and often dwindling, congressional funding. Scientific results don’t exist in a vacuum—they are interpreted according to political motives, public opinions, and business interests.

Leckband, the Hanford Advisory Board chair, says it’s important to take the long view. “Our mantra is, we want the best cleanup possible—for the public, the people who are paying for it, the people who will be drinking the water, breathing the air, and eating the vegetables in the entire Pacific Northwest as well as the country,” Leckband says. “It needs to be done not just for us, but also for future generations.”

This article appears in the May 2020 print issue as “What to Do With 177 Giant Tanks of Radioactive Sludge.”

EnergySails Aim to Harness Wind and Sun To Clean Up Cargo Ships

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/renewables/energysails-harness-wind-sun-clean-up-cargo-ships

The global shipping industry is experiencing a wind-powered revival. Metal cylinders now spin from the decks of a half-dozen cargo ships, easing the burden on diesel engines and curbing fuel consumption. Development of devices like giant towing kites, vertical suction wings, and telescoping masts is well underway, while canvas sails flutter once more on smaller vessels.

The latest development in “wind-assisted propulsion” comes from Japan. Eco Marine Power (EMP) recently unveiled a full-scale version of its EnergySail system at the Onomichi Marine Tech Test Center in Hiroshima Prefecture. The rigid, rectangular device is slightly curved and can be positioned into the wind to create lift, helping propel vessels forward. Marine-grade solar panels along the face can supply electricity for onboard lighting and equipment.

Greg Atkinson, EMP’s chief technology officer, says the 4-meter-tall sail will undergo shore-based testing this year, in preparation for sea trials. The device will deliver 1 kilowatt of peak solar power (kWp), though the startup is still evaluating which type of photovoltaic panel to use. How much propulsive power the sail itself will provide is yet to be determined, he says.

The EnergySail is one piece of EMP’s larger technology platform. The Fukuoka-based firm is also developing an integrated system that includes deck-mounted solar panels; recyclable marine batteries; charging systems; and computer programs that automatically rotate sails to capture optimal amounts of wind, or lower the devices when not in use or during bad weather. Atkinson notes that moving an EnergySail (mainly to optimize its wind collection) may affect how much sunlight it receives, though the panels can still collect solar power when lying flat.
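
EMP hasn’t published its control software, but the behavior described above, trimming the sail when the wind is useful and stowing it in bad weather, maps onto a simple rule-based loop. The sketch below is a hypothetical illustration with made-up thresholds and a placeholder trim rule; it is not EMP’s code.

```python
# Hypothetical rule-based control loop for a rigid sail. Thresholds and the
# trim rule are invented for illustration only.
STOW_WIND_SPEED_MS = 25.0   # stow the sail above this wind speed (placeholder)
MIN_USEFUL_WIND_MS = 3.0    # below this, lie flat and just collect solar

def control_step(wind_speed_ms, wind_angle_deg):
    """Return the commanded sail state for one control cycle."""
    if wind_speed_ms >= STOW_WIND_SPEED_MS:
        return "STOW"                    # lower the sail in bad weather
    if wind_speed_ms < MIN_USEFUL_WIND_MS:
        return "FLAT"                    # no useful lift; maximize solar collection
    # Otherwise trim the sail at some angle of attack to the apparent wind.
    target_angle = (wind_angle_deg - 20) % 360   # placeholder trim rule
    return f"TRIM to {target_angle:.0f} deg"

# Example: 12 m/s apparent wind, 60 degrees off the bow
print(control_step(12.0, 60.0))
```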

The startup’s ultimate goal is to hoist about a dozen EnergySails on a tanker or freighter that has the available deck space. An array of that size could deliver power savings of up to 15 percent, depending on wind conditions and the vessel’s size, models show.

Gavin Allwright, secretary of the International Windship Association, says that figure is in line with projections for other wind-assisted technologies, which can help watercraft achieve between 5 and 20 percent fuel savings compared to typical ships. (EMP is not a member of the association.) For instance, the Finnish company Norsepower recently outfitted a Maersk oil tanker with two spinning rotor sails. The devices lowered the vessel’s fuel use by 8.2 percent on average during a 12-month trial period.

Shipping companies are increasingly investing in clean energy as international regulators move to slash global greenhouse gas emissions. Nearly all commercial cargo ships use oil or gas to carry goods across the globe; together, they contribute up to 3 percent of the world’s total annual fossil fuel emissions. Zero-emission alternatives like hydrogen fuel cells and ammonia-burning engines are still years from commercialization. But wind-assisted propulsion represents a more immediate, if partial, solution. 

For its EnergySail unit, EMP partnered with Teramoto Iron Works, which built the first rigid sails in the 1980s. Those devices, called JAMDA sails after the Japan Marine Machinery Development Association, were shown to reduce fuel use by 10 to 30 percent on smaller coastal vessels, despite some technical issues. The experiment was short-lived, however: plunging oil prices eroded the business case for efficiency upgrades, and shipowners eventually took the sails down.

EMP is currently talking with several shipowners to start installing its full energy system, potentially later this year. For the sea trial, the startup plans to install a deck-mounted solar array with up to 25 kWp; battery packs; computer systems; and one or two EnergySails. Atkinson says it may take two to three years of testing to verify whether the equipment can weather harsh conditions, including fierce winds and corrosive saltwater. 

Separately, EMP has started testing the non-sail portion of its platform. In May 2019, the company installed a 1.2-kWp solar array on a large crane vessel owned by Singaporean carrier Masterbulk. The setup also includes a 3.6-kilowatt-hour VRLA (valve-regulated lead-acid) battery pack made by Furukawa Battery Co. An onboard monitoring system automatically reports and logs fuel-consumption data in real time and calculates daily emissions of carbon dioxide and sulfur dioxide.
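
The company hasn’t detailed how the monitoring system converts fuel use into emissions, but the standard approach is to multiply fuel burned by a fuel-specific emission factor; a commonly cited value for marine diesel is roughly 3.2 kilograms of CO2 per kilogram of fuel. The sketch below illustrates that calculation with an assumed daily fuel burn; it is not EMP’s or Furukawa’s software.

```python
# Assumed emission factor for marine diesel; heavy fuel oil is somewhat lower.
CO2_KG_PER_KG_FUEL = 3.2

def daily_co2_tonnes(fuel_burned_tonnes_per_day):
    """Estimate daily CO2 emissions from daily fuel consumption."""
    return fuel_burned_tonnes_per_day * CO2_KG_PER_KG_FUEL

# Example: a vessel burning 20 tonnes of fuel per day (placeholder figure)
print(f"~{daily_co2_tonnes(20):.0f} tonnes of CO2 per day")  # ~64 t/day
```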

EMP previously tested Furukawa’s batteries on a vessel in Greece. During the day, solar panels recharged the batteries, which kept the voltage stable and could directly power the vessel’s lighting load; excess solar power was stored to keep the lights on at night. It took the partners about five years of testing to ensure the system was stable.

Atkinson says that, so far, the COVID-19 pandemic hasn’t disrupted the company’s work or halted its plans for the year.

“We can do much of the design work remotely and by using cloud-based applications,” he says. “Also, we can use virtual wind tunnels and [Computer Aided Design] applications for much of the initial design work for the sea trials phase.”

Across the industry, however, the coronavirus outbreak is wreaking economic havoc. Allwright says that shipowner interest in wind-assisted propulsion was “absolutely crazy” until a few weeks ago. “Now, shipping companies are saying, ‘Look, we can’t invest in new technology right now because we’re trying to survive,’” he says. 

Still, some technology developers are accelerating their design work, in the hope of launching projects as soon as the industry bounces back. “This pause gives the providers an extra 12 months to get these things tested and ready for action,” Allwright says.

The Coronavirus Outbreak Is Curbing China’s CO2 Emissions

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/environment/coronavirus-outbreak-curbing-china-co2-emissions

The coronavirus outbreak has sent the global economy reeling as businesses shutter and billions of people hunker down. Air travel, vehicle traffic, and industrial production have swiftly declined in recent weeks, with much of the world frozen in place until the virus—which has killed more than 39,000 people globally—can be safely contained. One consequence of the crisis may be a sizable, if temporary, decline in heat-trapping emissions this year.

Global carbon dioxide emissions could fall by 0.3 percent to 1.2 percent in 2020, says Glen Peters, research director of the Center for International Climate Research in Norway. He based his estimates on new projections for slower economic growth in 2020. In Europe, CO2 emissions from large sources could plunge by more than 24 percent this year. That’s according to an early assessment of the Emissions Trading Scheme, which sets a cap on the European Union’s emissions. In Italy, France, and other nations under quarantine, power demand has dropped considerably since early March.

As experts look to the future, Lauri Myllyvirta is tracking how the new coronavirus is already affecting China—the world’s largest carbon emitter, where more than a dozen cities were on lockdown for nearly two months. Myllyvirta is an analyst at the Centre for Research on Energy and Clean Air, an independent organization. Previously based in Beijing, he now lives in Helsinki, where I recently reached him by phone. Our conversation is edited and condensed for clarity.

South Sudan Is Building Its Electric Grid Virtually From Scratch

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/policy/south-sudan-rebuilding-grid-from-scratch

As South Sudan emerges from the wreckage of civil war, its leaders are beginning to build the nation’s electric sector from the ground up. With only a handful of oil-fired power plants and crumbling poles and wires in place, the country is striving for a system that runs primarily on renewable energy and reaches more homes and businesses. 

Today, only about 1 percent of South Sudan’s 12.5 million people can access the electric grid, according to the state-run utility. Many people use rooftop solar arrays or noisy, polluting diesel generators to keep the lights on; many more are left in the dark entirely. Those who can access the grid pay some of the highest electricity rates in the world for spotty, unreliable service.

Prototype Offers High Hopes for High-Efficiency Solar Cells

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/renewables/prototype-high-efficiency-solar-cells-news

Scientists continue to tinker with recipes for turning sunlight into electricity. By testing new materials and components, in varying sizes and combinations, their goal is to produce solar cells that are more efficient and less expensive to manufacture, allowing for wider adoption of renewable energy. 

The latest development in that effort comes from researchers in St. Petersburg, Russia. The group recently created a tiny prototype of a high-efficiency solar cell using gallium phosphide and nitrogen. If successful, the cells could nearly double today’s efficiency rates—that is, the degree to which incoming solar energy is converted into electrical power.

The new approach could theoretically achieve efficiencies of up to 45 percent, the scientists said. By contrast, conventional silicon cells are typically less than 20 percent efficient. 

AltaRock Energy Melts Rock With Millimeter Waves for Geothermal Wells

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energy/renewables/altarock-energy-melts-rock-with-millimeter-waves-for-geothermal-wells

A vast supply of heat lies beneath our feet. Yet today’s drilling methods can barely push through dense rocks and high-pressure conditions to reach it. A new generation of “enhanced” drilling systems aims to obliterate those barriers and unlock unprecedented supplies of geothermal energy.

AltaRock Energy is leading an effort to melt and vaporize rocks with millimeter waves. Instead of grinding away with mechanical drills, scientists use a gyrotron—a specialized high-frequency microwave-beam generator—to open holes in slabs of hard rock. The goal is to penetrate rock at faster speeds, to greater depths, and at a lower cost than conventional drills do.

The Seattle-based company recently received a US $3.9 million grant from the U.S. Department of Energy’s Advanced Research Projects Agency–Energy (ARPA-E). The three-year initiative will enable scientists to demonstrate the technology at increasingly larger scales, from burning through hand-size samples to room-size slabs. Project partners say they hope to start drilling in real-world test sites before the grant period ends in September 2022.

AltaRock estimates that just 0.1 percent of the planet’s heat content could supply humanity’s total energy needs for 2 million years. Earth’s core, at a scorching 6,000 °C, radiates heat through layers of magma, continental crust, and sedimentary rock. At extreme depths, that heat is available in constant supply anywhere on the planet. But most geothermal projects don’t reach deeper than 3 kilometers, owing to technical or financial restrictions. Many wells tap heat from geysers or hot springs close to the surface.

That’s one reason why, despite its potential, geothermal energy accounts for only about 0.2 percent of global power capacity, according to the International Renewable Energy Agency.

“Today we have an access problem,” says Carlos Araque, CEO of Quaise, an affiliate of AltaRock. “The promise is that, if we could drill 10 to 20 km deep, we’d basically have access to an infinite source of energy.”

The ARPA-E initiative uses technology first developed by Paul Woskov, a senior research engineer at MIT’s Plasma Science and Fusion Center. Since 2008, Woskov and his colleagues have used a 10-kilowatt gyrotron to produce millimeter waves at frequencies between 30 and 300 gigahertz. Elsewhere, millimeter waves are used for many purposes, including 5G wireless networks, airport security, and astronomy. Those applications require only milliwatts of power; drilling through rock takes several megawatts.

To start, MIT researchers place a piece of rock in a test chamber, then blast it with high-powered, high-frequency beams. A metallic waveguide directs the beams to form holes. Compressed gas is injected to keep plasma from forming and flames from erupting, which would hamper the process. In trials, millimeter waves have bored holes through granite, basalt, sandstone, and limestone.

The ARPA-E grant will allow the MIT team to develop their process using megawatt-size gyrotrons at Oak Ridge National Laboratory, in Tennessee. “We’re trying to bring forward a disruption in technology to open up the way for deep geothermal energy,” Araque says.

Other enhanced geothermal systems now under way use mechanical methods to extract energy from deeper wells and hotter sources. In Iceland, engineers are drilling 5 km deep into magma reservoirs, boring down between two tectonic plates. Demonstration projects in Australia, Japan, Mexico, and the U.S. West—including one by AltaRock—involve drilling into continental rocks and opening artificial fractures. Engineers then inject water or liquid biomass into the fractures and pump it to the surface. When the liquid surpasses 374 °C and 22,100 kilopascals of pressure, it becomes a “supercritical” fluid, meaning it can transfer energy more efficiently and flow more easily than water from a typical well.

However, such efforts can trigger seismic activity, and projects in Switzerland and South Korea were shut down after earthquakes rattled surrounding cities. Such risks aren’t expected for millimeter-wave drilling. Araque says that while beams could spill outside their boreholes, any damage would be confined deep below ground.

Maria Richards, coordinator at Southern Methodist University’s Geothermal Laboratory, in Dallas, says that one advantage of using millimeter waves is that the drilling can occur almost anywhere—including alongside existing power plants. At shuttered coal facilities, deep geothermal wells could produce steam to drive the existing turbines.

The Texas laboratory previously explored using geothermal power to help natural-gas plants operate more efficiently. “In the end, it was too expensive. But if we could have drilled deeper and gotten higher temperatures, a project like ours would’ve been more profitable,” Richards says. She notes that millimeter-wave beams could also reach high-pressure offshore oil and gas reservoirs that are too dangerous for mechanical drills to tap.

This article appears in the March 2020 print issue as “AltaRock Melts Rock For Geothermal Wells.”

Capture Carbon in Concrete Made With CO2

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/fossil-fuels/carbon-capture-power-plant-co2-concrete

On a vast grassy field in northern Wyoming, a coal-fired power plant will soon do more than generate electricity. The hulking facility will also help create construction materials by supplying scientists with carbon dioxide from its exhaust stream.

A team from the University of California, Los Angeles, has developed a system that transforms “waste CO2” into gray blocks of concrete. In March, the researchers will relocate to the Wyoming Integrated Test Center, part of the Dry Fork power plant near the town of Gillette. During a three-month demonstration, the UCLA team plans to siphon half a ton of CO2 per day from the plant’s flue gas and produce 10 tons of concrete daily. 

“We’re building a first-of-a-kind system that will show how to do this at scale,” said Gaurav Sant, a civil engineering professor who leads the team. 

Carbon Upcycling UCLA is one of 10 teams competing in the final round of the NRG COSIA Carbon XPrize. The global competition aims to develop breakthrough technologies for converting carbon emissions into valuable products. Four more finalists are demonstrating projects in Wyoming, including CarbonCure, a Canadian startup making greener concrete, and Carbon Capture Machine, a Scottish venture focused on building materials. (Five other teams are competing at a natural gas plant in Alberta, Canada.)

Worldwide, hundreds of companies and research groups are working to keep CO2 out of the atmosphere and store it someplace else—including in deep geologic formations, soils, soda bubbles, and concrete blocks. By making waste CO2 into something marketable, entrepreneurs can begin raising revenues needed to scale their technologies, said Giana Amador, managing director of Carbon180, a nonprofit based in Oakland, California.

The potential global market for waste-CO2 products could be $5.9 trillion a year, of which $1.3 trillion includes cements, concretes, asphalts, and aggregates, according to Carbon180 [PDF]. Amador noted the constant and growing worldwide demand for building materials, and a rising movement within U.S. states and other countries to reduce construction-related emissions.

Cement, a key ingredient in concrete, has a particularly big footprint. It’s made by heating limestone with other materials, and the resulting chemical reactions can produce significant CO2 emissions. Scorching, energy-intensive kilns add even more. The world produces 4 billion tons of cement every year, and as a result, the industry generates about 8 percent of global CO2 emissions, according to think tank Chatham House. 

“The cement industry is one that’s really difficult to decarbonize, and we don’t have a lot of cost-effective solutions today,” Amador said. Carbon “utilization” projects, she added, can start to fill that gap.

The UCLA initiative began about six years ago, as researchers contemplated the chemistry of Hadrian’s Wall—the nearly 1,900-year-old Roman structure in northern England. Masons built the wall by mixing calcium oxide with water, then letting it absorb CO2 from the atmosphere. The resulting reactions produced calcium carbonate, or limestone. But that cementation process can take years or decades to complete, an unimaginably long wait by today’s standards. “We wanted to know, ‘How do you make these reactions go faster?’” Sant recalled. 

The answer was portlandite, or calcium hydroxide. The compound is combined with aggregates and other ingredients to create the initial building element. That element then goes into a reactor, where it comes in contact with the flue gas coming directly out of a power plant’s smokestack. The resulting carbonation reaction forms a solid building component akin to concrete. 
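
The underlying chemistry is carbonation: calcium hydroxide reacts with CO2 to form calcium carbonate and water (Ca(OH)2 + CO2 → CaCO3 + H2O). That one-to-one molar ratio caps how much CO2 a portlandite-based block can absorb, as the back-of-the-envelope below shows; the inputs are textbook molar masses, not measurements from the UCLA system.

```python
# Molar masses in g/mol
M_CA_OH_2 = 40.08 + 2 * (16.00 + 1.008)  # calcium hydroxide (portlandite)
M_CO2 = 12.01 + 2 * 16.00                # carbon dioxide

# Stoichiometric uptake: Ca(OH)2 + CO2 -> CaCO3 + H2O (1:1 molar ratio)
co2_per_kg_portlandite = M_CO2 / M_CA_OH_2

print(f"Up to ~{co2_per_kg_portlandite:.2f} kg of CO2 per kg of portlandite")
# Roughly 0.59 kg of CO2 per kg of calcium hydroxide, before accounting for
# incomplete carbonation in a real block.
```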

Sant likened the process to baking cookies. By tinkering with the ingredients, curing temperatures, and the flow of CO2, they found a way to, essentially, transform the wet dough into baked goods. “You stick it in a convection oven, and when they come out they’re ready to eat. This is exactly the same,” he said. 

The UCLA system is unique among green concrete technologies because it doesn’t require the expensive step of capturing and purifying CO2 emissions from power plants. Sant said his team’s approach is the only one so far that directly uses the flue gas stream. The group has formed a company, CO2Concrete, to commercialize their technology with construction companies and other industrial partners. 

After Wyoming, Sant and colleagues will dismantle the system and haul it to Wilsonville, Alabama. Starting in July, they’ll repeat the three-month pilot at the National Carbon Capture Center, a research facility sponsored by the U.S. Department of Energy. 

The UCLA team will learn in September if they’ve won a $7.5 million Carbon XPrize, though Sant said he’s not fretting about the outcome. “Winning is great, but what we’re really focused on is making a difference and [achieving] commercialization,” he said.

Puerto Rico Goes Dark (Again) as Earthquakes Rattle Island

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/environment/puerto-rico-earthquake-power-outages-prepa-news

A series of earthquakes left Puerto Rico in the dark this week as power outages swept nearly the entire island. About 80 percent of utility customers had power restored by Friday afternoon, yet authorities warned it could take weeks to stabilize the overall system. 

A 6.4-magnitude earthquake rocked the U.S. territory on 7 January following days of seismic activity. Temblors and aftershocks leveled buildings, split streets, and severely damaged the island’s largest power plant, Costa Sur. The blackouts hit a system still reeling from 2017’s Hurricane Maria—which knocked out the entire grid and required $3.2 billion in repairs.

Time-of-Use Electricity Rates May Hit Vulnerable Groups Harder, Study Finds

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/energywise/energy/the-smarter-grid/time-of-use-rates-may-hit-vulnerable-groups-harder

Electric utilities routinely adjust power supplies to match the peaks and troughs in demand. But more utilities are working to tweak customers’ habits, too, so that we don’t all gobble energy at the same time and strain the grid.

Measures like “time-of-use” tariffs are proliferating in the United States and globally, with utilities charging higher electricity rates during peak demand periods. In places like sunny California, the idea is to shift more energy usage to the afternoon—when solar power is abundant and cheap—and away from evenings, when utilities rely more on fossil fuel-fired power plants. 

Yet such initiatives may have unintended consequences. A new study in the journal Nature Energy found that one utility pilot hit some participants harder than others. Vulnerable groups, including elderly people and those with disabilities, saw disproportionately negative financial and health impacts as a result of paying time-of-use rates.

“You have this potentially really useful tool, but you need to make sure you’re not unintentionally making a worse situation for parts of the population,” said Lee White, the study’s lead author and a research fellow at Australian National University in Canberra.

About 14 percent of U.S. utilities offer residential time-of-use rates, according to the consulting firm Brattle Group. Rate designs can vary from place to place, as do climate conditions and consumer habits, so the study’s findings might not hold true everywhere. Still, the research highlights concerns worth heeding as utilities and regulators design such programs. 

“We need to be very careful about how we implement these rates,” White said. 

White and Nicole Sintov, an assistant professor at Ohio State University, analyzed data from 7,500 households that voluntarily joined a utility’s 2016 pilot in the southwestern United States. (The company asked to go unnamed.)

Participants were randomly assigned to a control group, or one of two time-of-use rates. The first group paid an extra 0.3451 cents per kilowatt-hour from 2 to 8 p.m. on weekdays. The second group saw tariffs of 0.5326 cents per kilowatt-hour from 5 to 8 p.m. on weekdays. 

Researchers studied results from July to September, a sweltering season. All participants paying time-of-use rates saw their bills increase. But households with elderly members or people with disabilities saw even greater bill increases relative to the rest. Elderly folks reported turning off their air-conditioning less than other groups; in general, older adults are especially vulnerable to heat-related illnesses. 

Participants with disabilities were more likely to seek medical attention for heat-related reasons when assigned to one of the time-of-use rates—as were customers identified as Hispanic. But researchers found that people within the disability, Hispanic, or low-income groups were more likely to report adverse health outcomes regardless of rates, even in the control group.

White said a “somewhat encouraging” finding is that low-income households and Hispanic participants saw lower bill increases compared to other groups. Yet any extra costs “could still cause additional tensions in the household budget,” she added. According to the U.S. Census, low-income households on average spend 8.2 percent of their income on energy bills—about three times as much as higher-earning households.

The study highlights gaps in “flexibility capital” among electricity users, said Michael Fell, a research associate at the UCL Energy Institute. For example, wealthier households might avoid higher rates by installing energy storage devices or smart appliances with sensors and timers. Healthier individuals can cope with using less AC or heating. But many people can’t spare the expense to their wallets or wellbeing. 

“There is already recognition amongst regulators that the transition to a flexible future may come with risks to those in vulnerable situations,” Fell wrote in Nature Energy. “White and Sintov’s study lends nuance to this concern.” 

Ryan Hledik, a principal at Brattle Group, said residential time-of-use rates are gaining momentum as smart meters become the norm in households nationwide. Many utilities now use the tariffs to integrate more wind and solar power into the electricity mix; in coming years, such programs could also keep electric-vehicle owners from charging their batteries all at once and overtaxing local infrastructure.

“That’s definitely something utilities are going to need to confront, and time-of-use rates are one way to deal with that,” Hledik said.

Wave Energy Tech Is Ready to Plug Into a Real Grid

Post Syndicated from Maria Gallucci original https://spectrum.ieee.org/green-tech/geothermal-and-tidal/wave-energy-tech-is-ready-to-plug-into-a-real-grid

Ocean waves are powerful and perpetually replenished. But unlike the wind and sun, waves remain a largely untapped source of renewable energy, despite their enormous potential. A slew of projects is starting to change that, with large prototypes launching near coastlines worldwide.
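
How powerful? A textbook rule of thumb for deep water puts the power flowing through each meter of wave crest at ρg²H²T/(64π), where H is the significant wave height and T the wave energy period. The sketch below applies that formula to an assumed 3-meter, 8-second sea state; the numbers are illustrative, not measurements from any particular site.

```python
import math

RHO_SEAWATER = 1025.0  # kg/m^3
G = 9.81               # m/s^2

def wave_power_kw_per_m(significant_height_m, energy_period_s):
    """Deep-water wave energy flux per meter of wave crest (textbook formula)."""
    watts_per_m = (RHO_SEAWATER * G**2 * significant_height_m**2 *
                   energy_period_s) / (64 * math.pi)
    return watts_per_m / 1000.0

# Assumed sea state: 3 m significant wave height, 8 s energy period
print(f"~{wave_power_kw_per_m(3.0, 8.0):.0f} kW per meter of wave crest")  # ~35 kW/m
```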

In Hawaii, the OceanEnergy Buoy is slated to connect to the island of Oahu’s electric grid next month. The 749-metric-ton device was recently towed from Portland, Ore., to the U.S. Navy’s Wave Energy Test Site, where the bright yellow buoy will undergo a year of performance tests. The project builds on a decade of research and several smaller iterations, including a quarter-scale model that was tested for three years in Ireland’s Galway Bay.

“The difficulty has been in developing a technology that actually survives in the marine environment, which can be very harsh,” said John McCarthy, CEO of the Irish buoy maker OceanEnergy.

To limit seawater effects, McCarthy’s team designed a device that puts mechanical parts above the surface. The “oscillating water column” system features a semi-submerged chamber, inside of which an air pocket is trapped above a column of water. When waves crest and water enters the chamber, it forces the air upward, spinning a Siemens subsidiary’s turbine system to generate electricity. As water recedes, it creates a vacuum that sucks in outside air and continues driving the turbine.

The 1.25-megawatt buoy will be moored to a 60-meter-deep berth and should withstand gale-force winds and extreme waves. A subsea cable will link it to Hawaiian Electric’s grid, which still runs primarily on imported oil.

About 100 people built the buoy over 14 months at the Swan Island shipyard in Portland, said Tom Hickman of U.S. shipbuilder Vigor Industrial. Workers cut, formed, and welded steel plates into three massive sections to form the L-shaped hull, then installed mechanical and electrical components. On a crisp October morning, company leaders and dignitaries held a completion ceremony, days before a tugboat dragged the buoy down the Columbia River and across the Pacific Ocean.

Tyler Gaunt, a project manager for Vigor, said he was proud to have successfully finished the project but happy to see the device leave. Constructing a first-of-its-kind prototype at a large scale meant constantly solving problems under a relatively tight deadline. For instance, the supportive steel “stiffeners” that are typically applied inside ship hulls went on the buoy’s exterior, to avoid creating drag within the air chamber.

“It was essentially the opposite of how we would normally construct a ship,” he said from the shipyard.

Globally, about 19 megawatts of “wave energy converters” were deployed from 2010 to 2018, though some devices were decommissioned after pilot tests, according to Ocean Energy Europe [PDF] (an industry organization not connected with OceanEnergy). The bulk of projects have been in the United Kingdom and Western Europe, with other devices deployed in China, Australia, New Zealand, and the United States.

Wave energy is one of several technologies that harness the ocean’s natural features—tides, winds, water temperatures, salinity—and could provide significant amounts of clean electricity. Waves off U.S. shores represent some 2.64 trillion kilowatt-hours in theoretical annual energy potential—equivalent to about two-thirds of the nation’s electricity generation in 2018, according to an estimate by the U.S. Department of Energy. This resource is abundant at higher latitudes, where colder temperatures and weak sunlight make it harder to operate other renewables during certain months.
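
That two-thirds comparison is easy to sanity-check: U.S. utility-scale generation in 2018 was roughly 4.2 trillion kilowatt-hours (an approximate figure, used here only for the ratio).

```python
WAVE_POTENTIAL_TWH = 2_640      # DOE's theoretical annual wave-energy potential, in TWh
US_GENERATION_2018_TWH = 4_170  # approximate U.S. generation in 2018, in TWh (assumption)

fraction = WAVE_POTENTIAL_TWH / US_GENERATION_2018_TWH
print(f"Wave potential is ~{fraction:.0%} of 2018 U.S. generation")  # roughly two-thirds
```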

“Wind and solar are really cheap and ubiquitous on land, but there are challenges with those technologies,” said Bryson Robertson, codirector of the Pacific Marine Energy Center and an Oregon State University associate professor. “In places with very aggressive decarbonization agendas, we’re going to need all renewable resources to really start to mitigate our impact on the climate.”

Marine technologies still face significant hurdles to achieving commercial scale. It’s not yet clear how spinning turbines and rotating blades will affect wildlife. Supporting infrastructure, such as offshore grid connections, isn’t widely available. Licensing and permitting processes must first ensure that devices don’t obstruct commercial fishing, whale watching, or other activities.

Such issues have stifled investment, so public agencies and research institutions are leading the way, with over a dozen testing hubs worldwide. In Scotland’s Orkney Islands, the European Marine Energy Center has 13 grid-connected berths for wave and tidal devices. New sites are under way in Western Australia and Jeju Island, South Korea. At the U.S. Navy hub in Hawaii, three other developers—Columbia Power Technologies, Northwest Energy Innovations, and Oscilla Power—are also expected to test wave energy converters starting in 2021.

Robertson said OceanEnergy’s yellow buoy represents a valuable “data point” in the broader effort to improve performance and drastically reduce electricity costs from marine technologies. “We need to start putting these devices in the water so we can start to learn lessons,” he said.

This article appears in the December 2019 print issue as “At Last, Wave Energy Tech Plugs Into the Grid.”