All posts by Prachi Patel

Co-designing electronics and microfluidics for a cooling boost

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/tech-talk/computing/hardware/codesigning-electronics-and-microfluidics-for-a-cooling-boost

The heat generated by today’s densely packed electronics is a costly resource drain. To keep systems at the right temperature for optimal computational performance, data center cooling in the United States consumes as much energy and water as all the residents of the city of Philadelphia. Now, by integrating liquid cooling channels directly into semiconductor chips, researchers hope to reduce that drain, at least in power electronics devices, making them smaller, cheaper, and less energy-intensive.

Traditionally, the electronics and the heat management system are designed and made separately, says Elison Matioli, an electrical engineering professor at École Polytechnique Fédérale de Lausanne in Switzerland. That introduces a fundamental obstacle to improving cooling efficiency, since heat has to propagate relatively long distances through multiple materials for removal. In today’s processors, for instance, thermal materials siphon heat away from the chip to a bulky, air-cooled copper heat sink.

For a more energy-efficient solution, Matioli and his colleagues have developed a low-cost process to put a 3D network of microfluidic cooling channels directly into a semiconductor chip. Liquids remove heat better than air, and the idea is to put coolant micrometers away from chip hot spots.

But unlike previously reported microfluidic cooling techniques, he says, “we design the electronics and the cooling together from the beginning.” So the microchannels are right underneath the active region of each transistor device, where it heats up the most, which increases cooling performance by a factor of 50. They reported their co-design concept in the journal Nature today.
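
To see why distance matters so much, a crude one-dimensional estimate helps: a hot spot’s temperature rise is roughly the heat flux times the sum of a conduction term (path length over thermal conductivity) and a convection term. The sketch below uses illustrative numbers of my own choosing, not values from the Nature paper.

```python
# Crude 1-D thermal estimate: dT = q * (L/k + 1/h).
# All numbers below are illustrative assumptions, not from the paper.
q = 1e6       # hot-spot heat flux, W/m^2 (100 W/cm^2, typical for power devices)
k = 150.0     # thermal conductivity of the semiconductor, W/(m*K)

def temp_rise(path_length_m: float, h_conv: float) -> float:
    """Temperature rise for a heat path of given length into a coolant."""
    return q * (path_length_m / k + 1.0 / h_conv)

# Conventional stack: ~1 mm path to an air-cooled sink (h ~ 1e3 W/m^2/K)
print(temp_rise(1e-3, 1e3))   # ~1,007 K; workable only if heat also spreads laterally
# Embedded microchannels: ~10 um path to liquid coolant (h ~ 1e5 W/m^2/K)
print(temp_rise(10e-6, 1e5))  # ~10 K
```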

Researchers first proposed microchannel cooling back in 1981, and startups such as Cooligy have pursued the idea for processors. But the semiconductor industry is moving from planar devices to 3D ones and towards future chips with stacked multi-layer architectures, which makes embedded cooling channels impractical. “This type of embedded cooling solution is not meant for modern processors and chips, like the CPU,” says Tiwei Wei, who studies electronic cooling solutions at Interuniversity Microelectronics Centre and KU Leuven in Belgium. Instead, this cooling technology makes the most sense for power electronics, he says.

Power electronics circuits manage and convert electrical energy, and are used widely in computers, data centers, solar panels, and electric vehicles, among other things. They use large-area discrete devices made from wide-bandgap semiconductors like gallium nitride. The power density of these devices has gone up dramatically over the years, which means they have to be “hooked to a massive heat sink,” Matioli says.

Superhigh-voltage Gallium Oxide Transistors Could Transform Power Electronics

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/tech-talk/semiconductors/materials/gallium-oxide-transistors-can-handle-over-8000-volts

A new gallium oxide transistor can withstand voltages of more than 8,000 volts, the highest ever reported for a device of comparable size. The advance opens up exciting possibilities for compact, energy-efficient power electronics systems based on a technology that is only eight years old: the first gallium oxide transistors were reported in 2012.

“Those are extraordinary numbers compared to what’s reported,” says Uttam Singisetti, a professor of electrical engineering at the University at Buffalo who led the new device research published in IEEE Electron Device Letters. “Reaching 8 kV in eight years is a big achievement.”

Coronavirus’s Economic Blow Forces Universities To Adapt

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/tech-talk/at-work/education/coronaviruss-economic-blow-universities-adapt

The economic slowdown from the coronavirus pandemic presents daunting financial challenges for public and private universities as they face their biggest crisis in decades.

The University of Kentucky is dealing with a more than $70 million shortfall in funds. The university’s engineering school faces a 10 percent budget cut, about the average for other schools at the university. Rudolph Buchheit, dean of the College of Engineering, says that while the state budget appropriation is expected to be the same as last year, academic colleges including the engineering school have “picked up expenses that are above and beyond normal, leading to a budget deficit.”

Increased expenses for colleges include the cost of technologies needed for distance learning, facilities upkeep and sanitization, and returning students’ room and board fees, among others.

“A lot of public universities are in similar sort of situations,” Buchheit says, facing increased expenses in addition to reduced funding due to state budget cuts. “We want to see if the federal stimulus package will include support for states to protect higher education.” The $14 billion that higher education institutions are receiving so far under the coronavirus relief bill is nowhere close to meeting their needs.

Another big hit could come from lower tuition revenue, given the uncertainty about fall enrollment numbers. “Economic circumstances have changed for some families and there’s uncertainty with health,” he says. The University of Kentucky is planning for a 20 percent reduction in first-year class enrollment.

Even private schools with large endowments will reel from the tuition loss. And this is especially acute for science and engineering schools, since a large part of the student body is international, and those students typically pay higher tuition.

“Undergraduate tuition is the bread and butter,” says Karen Panetta, an IEEE Fellow and dean of graduate education for the school of engineering at Tufts University. “And now you’ve got students saying I think I might defer a year, which is sending shockwaves through research institutions. Right now schools are panicking over this huge loss of revenue.”

Being a Research 1 institution, Tufts also depends on federal research funding, and pandemic-related laboratory closures will affect those research dollars, she says.

Meanwhile, costs keep ratcheting up. Tufts is planning for an anticipated opening in the fall in which it would have to implement social distancing. That means the way everything is done on an academic campus has to change: dormitories, libraries, classrooms, common spaces. “So the big thing is not just financial loss because that’s global,” Panetta says, “but also how much is it going to cost us for face masks and sanitization.”

Plus, she adds, “I took definitive action and made a conscious decision that even if we are open we’re going to have classes available online.” That’s because international students might not be able to get into the country in October. So all the Tufts engineering departments have already started working on making their courses available online, which comes at a cost.

Long-term impact on finances might depend on how long the pandemic and its after-effects last. For now, says Buchheit, “we have reserves we can use to help get us through what we hope will be a one or two year fiscal problem.” That means they won’t have to suspend or cancel any programs, or merge smaller departments. In fact, they plan to continue with the launch of a new undergraduate biomedical engineering program this coming fall, something that had been in the works for two years.

Waste Natural Gas Powers Computers Seeking Coronavirus Cure

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/fossil-fuels/waste-natural-gas-is-powering-computers-looking-for-a-coronavirus-cure

In a partnership that seems par for the course in these strange pandemic times, waste natural gas is powering a computing project that’s searching for a COVID-19 therapy.

The natural gas, a byproduct of oil drilling, would otherwise be burned in air, a wasteful practice called flaring. It’s instead being converted to electricity that helps drive computationally intensive protein-folding simulations of the new coronavirus at Stanford University, thanks to Denver-based Crusoe Energy Systems, a company that “bridges the gap between the energy world and the high-performance computing world,” says CEO Chase Lochmiller.

Crusoe’s Digital Flare Mitigation technology is a fancy name for rugged, modified shipping containers that house temperature-controlled racks of computers and data servers. The company launched in 2018 to mine cryptocurrency, which requires a tremendous amount of computing power. But when the novel coronavirus started spreading around the world, Lochmiller and his childhood friend Cully Cavness, who is the company’s president and co-founder, knew it was a chance to help.

Coronaviruses get their name from their crown of spiky proteins that attach to receptors on human cells. Proteins are complicated beasts that undergo convoluted twists and turns to take on unique structures. A recent Nature study showed that the new coronavirus the world is now battling, known as SARS-CoV-2, has a narrow ridge at its tip that helps it bind more strongly to human cells than previous similar viruses.

Understanding how spike proteins fold will help scientists find drugs that can block them. Stanford University’s Folding@home project is simulating these protein-folding dynamics. Studying the countless folding permutations and protein shapes requires enormous amounts of computation, so the project relies on crowd-sourced computing.

Enevate’s Silicon Anodes Could Yield EV Batteries That Run 400 km on a 5-Minute Charge

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/batteries-storage/enevates-silicon-anodes-could-give-batteries-that-run-400-km-on-a-5minute-charge

Battery makers have for years been trying to replace the graphite anode in lithium-ion batteries with a version made of silicon, which would give electric vehicles a much longer range. Some batteries with silicon anodes are getting close to market for wearables and electronics. The recipes for these silicon-rich anodes that a handful of companies are developing typically use silicon oxide or a mix of silicon and carbon.

But Irvine, Calif.-based Enevate is using an engineered porous film made mainly of pure silicon. In addition to being inexpensive, the new anode material, which founder and chief technology officer Benjamin Park has spent more than 10 years developing, could lead to an electric vehicle (EV) with 30 percent more range on a single charge than today’s EVs. What’s more, the battery Enevate envisions could be charged up enough in five minutes to deliver 400 km of driving range.

Big names in the battery and automotive business are listening. Carmakers Renault, Nissan, and Mitsubishi, as well as battery-makers LG Chem and Samsung, are investors. And lithium battery pioneer and 2019 Chemistry Nobel Prize winner John Goodenough is on the company’s Advisory Board.

When lithium-ion batteries are charged, lithium ions move from the cathode to the anode. The more ions the anode can hold, the higher its energy capacity, and the longer the battery can run. Silicon can in theory hold ten times the energy of graphite. But it also expands and contracts dramatically, falling apart after a few charge cycles.

To get around that, battery makers such as Tesla today add just a tiny bit of silicon to graphite powder. The powder is mixed with a glue-like plastic called a binder and is coated on a thin copper foil to make the anode. But, says Park, lithium ions react with silicon first, before graphite. “The silicon still expands quite a bit, and that plastic binder is weak,” he says, explaining that the whole electrode is more likely to degrade as the amount of silicon is ramped up.

Enevate does not use plastic binders. Instead, its patented process creates the porous 10- to 60-µm-thick silicon film directly on a copper foil. The cherry on top is a nanometers-thick protective coating, which, says Park, “prevents the silicon from reacting with the electrolyte.” That type of reaction can also damage a battery.

The process does not require high-quality silicon, so anodes of this type cost less than their graphite counterparts of the same capacity. And because the material is mostly silicon, lithium ions can slip in and out very quickly, charging the battery to 75 percent of its capacity in five minutes, without causing much expansion. Park likens it to a high-capacity movie theater. “If you have a full movie theater it takes a long time to find the one empty seat. We have a theater with ten times more capacity. Even if we fill that theater halfway, [it still doesn’t take long] to find empty seats.”
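
Taking the article’s numbers at face value, the implied average charging rate is straightforward to work out (a rough figure; real charging profiles taper off near full):

```python
# Average C-rate implied by "75 percent of capacity in five minutes".
fraction_charged = 0.75
time_hours = 5.0 / 60.0
avg_c_rate = fraction_charged / time_hours
print(avg_c_rate)  # 9.0, i.e., roughly a 9C average rate, well above
                   # the rates typical of today's EV fast charging
```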

The company’s roll-to-roll processing techniques can make silicon anodes quickly enough for high-volume manufacturing, says Park. By coupling the silicon anode with conventional cathode materials such as nickel-manganese-cobalt, they have made battery cells with energy densities as high as 350 watt-hours per kilogram, which is about 30 percent more than the specific energy of today’s lithium-ion batteries. Enevate says it is now working with multiple major automotive companies to develop standard-size battery cells for 2024-25 model year EVs.

It’s Still Early, but Potassium Batteries Are Showing Promise for Grid Storage

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energy/environment/its-still-early-but-potassium-batteries-are-showing-promise-for-grid-storage

Renewables are poised to expand by 50 percent in the next five years, according to the International Energy Agency. Much of that wind and solar power will need to be stored. But a growing electric-vehicle market might not leave enough lithium and cobalt for lithium-ion grid batteries.

Some battery researchers are taking a fresh look at lithium’s long-ignored cousin, potassium, for grid storage. Potassium is abundant, inexpensive, and could in ­theory enable a higher-power battery. However, efforts have lagged behind research on lithium and sodium batteries.

But potassium could catch up quickly, says Shinichi Komaba, who leads potassium-ion battery research at the Tokyo University of Science: “Although ­potassium-battery development has just been going on for five years, I believe that it is already competitive with sodium-ion batteries and expect it to be comparable and superior to lithium-ion.”

People have historically shied away from potassium because the metal is highly reactive and dangerous to handle. What’s more, finding electrode materials to hold the much heftier potassium ions is difficult.

Yet a flurry of reports in the past five years detail promising candidates for the cathode. Among the leaders are iron-based compounds with a crystalline structure similar to Prussian blue particles, which have wide open spaces for potassium ions to fill. A group from the University of Texas at Austin led by John Goodenough, coinventor of the lithium-ion battery and a winner of the 2019 Nobel Prize in Chemistry, has reported Prussian blue cathodes with an exceptionally high energy density of 510 watt-hours per kilogram, comparable to that of today’s lithium batteries.

But Prussian blue isn’t perfect. “The problem is, we don’t know how water content in the material affects energy density,” says Haegyeom Kim, a materials scientist at Lawrence Berkeley National Laboratory. “Another issue is that it’s difficult to control its chemical composition.”

Kim is placing bets on polyanionic compounds, which are made by combining potassium with any number of elements plucked from the periodic table. Potassium vanadium fluorophosphate seems to hold special promise. Kim and his colleagues have developed a cathode based on the compound with an energy density of 450 Wh/kg.

Other researchers are looking at organic compounds for cathodes. These cost less than inorganic compounds, and their chemical bonds can stretch to take up potassium ions more easily.

While Goodenough is giving potassium a chance, his fellow ­lithium-battery inventor and Nobel Prize winner ­M. ­Stanley Whittingham, professor of chemistry at Binghamton University, in New York, isn’t sold. “It’s a scientific curiosity,” he says. “There’s no startup looking at potassium batteries.”

Potassium, says Whittingham, is not a practical technology because of its heft and volatility. Potassium also melts at a lower temperature than lithium or sodium, which can trigger reactions that lead to thermal runaway.

Those are valid concerns, says Vilas Pol, a professor of chemical engineering at Purdue University, in West Lafayette, Ind. But he points out that in a battery, potassium ions shuttle back and forth, not reactive potassium metal. Special binders on the electrode can tame the heat-producing reactions.

Developing the right electrolyte will be key to battery life and safety, says Komaba, of the Tokyo University of Science. Conventional electrolytes contain flammable solvents that, when combined with potassium’s reactivity, could be dangerous. Selecting the right solvents, potassium salts, salt concentration, and additives can prevent fires.

Komaba’s group has made electrolytes using potassium-fluoride salts, superconcentrated electrolytes that have fewer solvents than traditional mixes, and ionic liquid electrolytes that don’t use solvents. In January, materials scientist Zaiping Guo and her team from the University of Wollongong, Australia, reported a nonflammable electrolyte for potassium batteries. They added a flame retardant to the solvent.

Potassium enthusiasts point out that the technology is still at an early stage. It’s never going to match the high energy density of lithium, or be suitable for electric cars. Yet for immense grid batteries, cheap potassium might have an upper hand. “Potassium-ion [batteries] could have worked earlier, but there was no need for [them],” says Pol. “Lithium isn’t enough now.”

In the end, the sum will have to be as good as its parts. Most research has focused on the materials that go into the electrodes and the electrolyte. Put it all together in a battery cell and the energy density drops after just 100 charging cycles or so; practical batteries will need to withstand several hundred.

“It will take time to figure out the exact combination of electrolyte, cathode, and anode,” Pol says. “It might take another 15 years from now to get to the market.”

This article appears in the March 2020 print issue as “Potassium Batteries Show Promise.”

Ion Storage Systems Says Its Ceramic Electrolyte Could Be a Gamechanger for Solid-State Batteries

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/batteries-storage/ion-storage-systems-ceramic-electrolyte-news-solid-state-batteries

For years, experts have predicted that solid-state batteries will be the next-generation technology for electric vehicles (EVs). These batteries promise to be safer by relying on a solid electrolyte instead of the flammable liquids used in today’s lithium-ion batteries. They could also last longer and weigh less, with a 10 times higher energy density, because they use a lithium metal anode instead of graphite.

Ford, Hyundai, Nissan, Toyota, and Volkswagen are all investing in solid-state battery research. And startups in the space abound.

But Eric Wachsman says his company, Ion Storage Systems, stands out for a few reasons. The company’s strong, dense ceramic electrolyte is only about 10 micrometers thick, the same thickness as the plastic separators used in today’s lithium-ion batteries, and it conducts lithium ions as well as current liquid electrolytes do. And according to Wachsman, it overcomes two key issues with solid-state batteries: high electrolyte resistance and low current capability.

The Battery Design Smarts Behind Rolls Royce’s Ultrafast Electric Airplane

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/batteries-storage/the-battery-innovations-behind-rolls-royces-ultrafast-electric-airplane

Dozens of electric general aviation projects are underway around the world, not counting the urban air taxis that dominate the electric propulsion R&D scene. The first all-electric commercial aircraft, a seaplane intended for short flights, completed a 15-minute test flight in December.

Shortly after, luxury icon Rolls Royce unveiled what it hopes will be the world’s fastest electric aircraft. The current speed record for an electric plane is 335 kilometers per hour (210 mph). The new one-seater craft, slated to fly this spring, will top out at 480 km/h (300 mph). It should also be able to fly from London to Paris, about 320 km (200 miles), on a single charge.

That’s thanks to “the world’s most energy-dense flying battery pack,” according to Rolls Royce. The aircraft has three batteries powering three motors that will deliver 750 kW to spin the propellers. Each 72 kilowatt-hour battery pack weighs 450 kg and has 6,000 densely packed lithium-ion cells.
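
Those pack figures imply a specific energy that lines up with the number Rolls Royce quotes later in this story (a quick check using only the quantities above):

```python
# Pack-level specific energy implied by the quoted figures.
pack_energy_wh = 72_000.0  # 72 kWh per pack
pack_mass_kg = 450.0
print(pack_energy_wh / pack_mass_kg)  # 160 Wh/kg, close to the 165 Wh/kg cited below
print(3 * 72)                         # 216 kWh of total onboard energy
```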

Getting all this power on board wasn’t easy, says Matheu Parr, project manager for the ACCEL project, short for Accelerating the Electrification of Flight. Careful thought and engineering went into each step, right from selecting the type of battery cell. Lithium-ion cells come in many forms, including pouches as well as prismatic and cylindrical cells. Cylindrical ones turn out to be best for holding a lot of energy and discharging it quickly at high power, he says.

Next came the critical task of assembling the cells into a pack. Rolls Royce’s partner, Electroflight, a startup specializing in aviation batteries, began that effort by analyzing innovations in the relatively new all-electric auto-racing space.

“Really, the challenge for electric aviation is one of packaging,” Parr says. “So we’ve looked at how Formula E [electric auto racing] tackles packaging and then taken it a step further.” By using lightweight materials—and only the bare minimum of those—the team managed to cut the plane’s packaging-to-battery-cell weight ratio to half the amount of battery packaging an electric car has to carry around for each kilogram of battery cell.

The high-power, closely packed cells get pretty hot. So, designing an advanced active-cooling system was important. Instead of the air-cooling used in car batteries, Rolls Royce engineers chose a liquid-cooling system. All the cells directly contact a cooling plate through which a water-and-glycol mixture is piped.

Finally, the engineers built in safety features such as an ultra-strong outside case and continual monitoring of each battery’s temperature and voltage. Should something go wrong with one of the batteries, it would automatically be shut off. Better still, the airplane can land even if two of its three batteries are turned off.

The ACCEL battery comes out to a specific energy of 165 watt-hours per kilogram, which puts it on par with the battery pack powering the Tesla Model 3. That’s still a long way from the 500 Wh/kg needed to compete with traditional jet-propulsion aircraft for commercial flights (aviation batteries are not expected to store that much energy per unit mass until 2030). For now, Rolls Royce and others believe all-electric propulsion will power smaller aircraft while larger planes will have hybrid fuel-electric systems. The company has teamed up with Airbus and Siemens to develop a hybrid airplane.

With its high-speed racing aircraft, Rolls Royce wants to pioneer the transition to the “third age of aviation, from propeller aircraft to jet aircraft to electric,” says Parr. The project will also provide know-how that will shape future designs. “We’re learning an awful lot that we want to see packed into a future aircraft. Innovations in the battery and system integration, packaging and management will all help us shape any future electric product, be it all-electric or hybrid.”

Long-lasting Lithium-Sulfur Battery Promises to Double EV Range

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/batteries-storage/lithium-sulfur-battery-news-ev-electric-vehicle-range

Lithium-sulfur batteries seem to be ideal successors to good old lithium-ion. They could in theory hold up to five times the energy per unit weight. Their low weight makes them ideal for electric airplanes: firms such as Sion Power and Oxis Energy are starting to test their lithium-sulfur batteries in aircraft. And they would be cheaper, given their use of sulfur instead of the costly metals, such as cobalt, used in cathodes today.

But the technology isn’t yet commercial mainly because of its short life span. The cathode starts falling apart after just 40 to 50 charge cycles.

By designing a novel robust cathode structure, researchers have now made a lithium-sulfur battery that can be recharged several hundred times. The cells have an energy capacity four times that of lithium-ion, which typically holds 150 to 200 watt-hours per kilogram (Wh/kg). If translatable to commercial devices, it could mean a battery that powers a phone for five days without needing to recharge, or quadruples the range of electric cars.

That’s unlikely to happen, since energy capacity drops when cells are strung together into battery packs. But the team still expects a “twofold increase at battery pack level when [the new battery is] introduced to the market,” says Mahdokht Shaibani, a mechanical and aerospace engineer at Australia’s Monash University who led the work published recently in the journal Science Advances.
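
Running the quoted figures through the arithmetic (a rough sketch; the study reports cell-level results):

```python
# Cell-level and projected pack-level energy from the quoted figures.
li_ion_cell = (150, 200)                       # Wh/kg, typical lithium-ion cells
li_s_cell = tuple(4 * x for x in li_ion_cell)  # "four times that of lithium-ion"
print(li_s_cell)                               # (600, 800) Wh/kg at the cell level
# Packaging, wiring, and thermal management add mass, which is why the
# team projects only a twofold gain at the full battery-pack level.
```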

Shaibani likens the sulfur cathode in a lithium-sulfur battery to a hard-working, overtaxed office worker. It can take on a lot, but the job demands cause stress and hurt productivity. In battery terms, during discharge the cathode soaks up a large amount of lithium ions, forming lithium sulfide. But in the process, it swells enormously, and then contracts when the ions leave during battery charging. This repeated volume change breaks down the cathode.

4 Products That Make Sense to Manufacture in Orbit

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/aerospace/space-flight/4-products-that-make-sense-to-manufacture-in-orbit

Space is open for business, and some entrepreneurs plan to make the final frontier into a manufacturing hub. There’s plenty of real estate. But it takes a few thousand dollars to launch a kilogram of stuff into space.

“The key question is: What is it that justifies the expense of doing these things in low Earth orbit?” says William Wagner, director of the University of Pittsburgh’s McGowan Institute for Regenerative Medicine, which will conduct biomedical research on the International Space Station (ISS).

Here are some technologies that might merit the “made in space” label.

  • Fiber-optic Cable

    Made from fluoride glass, a kind of fiber-optic cable called ZBLAN could have as little as one-tenth the signal loss of silica-based optical fibers.

    But quality ZBLAN fibers are hard to make on Earth. As the molten glass is stretched into fibers as thin as fishing line and then cooled, tiny crystals sometimes form, which can weaken signals. Microgravity suppresses the formation of these crystals, so fibers made in space would carry more data over longer distances.

    More data plus the need for fewer repeaters under the ocean would justify a higher price, says Austin Jordan of Made in Space, which plans to produce such fibers in space for terrestrial clients. “The math works. It would pay for itself and drive a profit,” he says.

    Two other companies, Fiber Optic Manufacturing in Space and Physical Optics Corp., also plan to make ZBLAN fibers in low Earth orbit.

  • Organs

    There are 120,000 people waiting for an organ transplant in the United States alone. “Most will never see one, there is such a shortage,” says Eugene Boland, chief scientist at Techshot, which wants to print human hearts in space.

    The heart, with its four empty chambers and highly organized muscle tissue made of different types of cells, is virtually impossible to print on the ground. On Earth, tissues printed with runny bioinks made of gel and human stem cells collapse under their own weight. Scientists must add toxic chemicals or a scaffold.

    Printing hearts and other organs in microgravity could be done using pure bioinks. “The cylindrical shape extruded from the nozzle is maintained, so you can build a more fragile 3D structure that would allow cells in the gels to secrete their own matrix and strengthen up,” says Wagner. And the printed layers fuse together without forming the striations seen in constructs printed on the ground, Boland says.

    Techshot, which is based in Greenville, Ind., is partnering with 3D-bioprinter manufacturer nScrypt. Their first bioprinter went to the ISS in July, but the small patch of heart muscle it printed didn’t survive reentry. The next mission, which launched in November, should result in thicker tissue that can be tested on Earth when it returns in January.

  • Metal Alloys

    Outer space is the perfect place to make metal alloys. Microgravity allows the metals and other elements to mix together more evenly.

    Magnesium alloys for medical implants have especially high potential. At half the weight of titanium alloys, magnesium alloys more closely match the density and strength of bone, and they harmlessly biodegrade in the body, says University of Pittsburgh bioengineering professor Prashant Kumta, who is working with Techshot to produce his patented alloys in a high-temperature furnace on the ISS.

    Making these alloys involves melting highly reactive magnesium with other elements such as calcium and zinc, keeping the melted materials in a vacuum for a long time so the elements mix evenly, and then cooling it all down.

    On Earth, impurities settle to the bottom, and the upper layer oxidizes to form an unusable skin. Both have to be thrown out. Even the usable middle layer has pores and pockets of unmixed elements and must be further processed to make a quality material. None of these problems occur when alloys are manufactured in microgravity.

  • Meat

    What Techshot and nScrypt want to do with human organs, Israeli food-tech startup Aleph Farms plans to do with meat. The two-year-old Rehovot-based company grows cultured beefsteaks that look and taste like the real thing. “While other companies use only muscle cell, we also grow connective tissue, blood vessels, and fat cells, which lets us make beefsteaks instead of patties,” says Yoav Reisler, external relations manager at the company.

    In September, the company teamed up with Russian company 3D Bioprinting Solutions to create the first tiny piece of meat on the ISS. It isn’t a huge technical advance, but it could feed astronauts on long-term crewed missions, as well as future space settlers as they set up a permanent base.

This article appears in the December 2019 print issue as “4 Products To Manufacture In Orbit.”

Heat Pumps Could Shrink the Carbon Footprint of Buildings

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/environment/heat-pumps-could-shrink-the-carbon-footprint-of-buildings

Buildings use more than one-third of the world’s energy, most of it for heating spaces and water. Most of this heat is generated by burning natural gas, oil, or propane. And where these fossil fuels are consumed, greenhouse gas emissions are a given.

Electric heat pumps, first widely used in the 1970s in Europe, could be the best solution for cutting that fossil fuel use. They could slash the carbon emissions of buildings by half. And if the pumps are powered by renewables, emissions could potentially fall to zero.

Cutting carbon emissions from heating and cooling will be critical to keep global average temperatures from rising by more than 1.5 degrees Celsius above preindustrial levels. Already, anthropogenic climate change has caused average global temperatures to rise by approximately 1 degree C, according to the Intergovernmental Panel on Climate Change. At the United Nations Climate Action Summit next week in New York, world leaders will discuss concrete steps to meet climate targets.

Liquid Air Could Store Renewable Energy and Reduce Emissions

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/renewables/liquid-air-could-store-renewable-energy-and-cut-food-industry-emissions

Keeping food cold is an energy-gobbling endeavor. Refrigerated food warehouses and factories consume immense amounts of energy, and this cooling demand is expected to increase as the climate warms while global incomes and food consumption rise. A team of researchers and companies in Europe are now developing a cryogenic energy storage system that could reduce carbon emissions from the food sector while providing a convenient way to store wind and solar power.

The CryoHub project will use surplus wind and solar electricity to chill air to cryogenic temperatures, at which point it becomes a liquid, shrinking roughly 700-fold in volume. The liquid air is stored in insulated low-pressure tanks similar to those used for liquid nitrogen and natural gas.

When the grid needs electricity, the subzero liquid is pumped into an evaporator where it expands back into a gas that can spin a turbine for electricity. As it expands, the liquid also sucks heat from surrounding air. “So you can basically provide free cooling for food storage,” says Judith Evans, a professor of air conditioning and refrigeration engineering at London South Bank University who is coordinating the CryoHub project.
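
The roughly 700-fold volume change follows from the density difference between gaseous and liquid air (textbook ballpark densities, not CryoHub’s own figures):

```python
# Ballpark check of the ~700x volume reduction on liquefaction.
rho_air_gas = 1.2       # kg/m^3, ambient air (approximate)
rho_air_liquid = 875.0  # kg/m^3, liquid air near -194 C (approximate)
print(rho_air_liquid / rho_air_gas)  # ~729, consistent with the ~700x figure
```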

How Inexpensive Must Energy Storage Be for Utilities to Switch to 100 Percent Renewables?

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/renewables/what-energy-storage-would-have-to-cost-for-a-renewable-grid

Last week, the city of Los Angeles inked a deal for a solar-plus-storage system at a record-low price. The 400-MW Eland solar power project will be capable of storing 1,200 megawatt-hours of energy in lithium-ion batteries to meet demand at night. The project is a part of the city’s climate commitment to reach 100 percent renewable energy by 2045.

Electricity and heat production are the largest sources of greenhouse gas emissions in the world. Carbon-free electricity will be critical for keeping the average global temperature rise to within the United Nations’ target of 1.5 degrees Celsius and avoid the worst effects of climate change. As world leaders meet at the United Nations Climate Action Summit next week, boosting renewable energy and energy storage will be major priorities.

Robots Will Navigate the Moon With Maps They Make Themselves

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/aerospace/robotic-exploration/robots-will-navigate-the-moon-with-maps-they-make-themselves

Astrobotic’s autonomous navigation will help lunar landers, rovers, and drones find their way on the moon

Neil Armstrong made it sound easy. “Houston, Tranquility Base here. The Eagle has landed,” he said calmly, as if he had just pulled into a parking lot. In fact, the descent of the Apollo 11 lander was nerve-racking. As the Eagle headed to the moon’s surface, Armstrong and his colleague Buzz Aldrin realized it would touch down well past the planned landing site and was heading straight for a field of boulders. Armstrong started looking for a better place to park. Finally, at 150 meters, he leveled off and steered to a smooth spot with about 45 seconds of fuel to spare.

“If he hadn’t been there, who knows what would have happened?” says Andrew Horchler, throwing his hands up. He’s sitting in a glass-walled conference room in a repurposed brick warehouse, part of Pittsburgh’s Robotics Row, a hub for tech startups. This is the headquarters of space robotics company Astrobotic Technology. In the coming decades, human forays to the moon will rely heavily on robotic landers, rovers, and drones. Horchler leads a team whose aim is ensuring those robotic vessels—including Astrobotic’s own Peregrine lander—can perform at least as well as Armstrong did.

Astrobotic’s precision-navigation technology will let both uncrewed and crewed landers touch down exactly where they should, so a future Armstrong won’t have to strong-arm her landing vessel’s controls. Once they’re safely on the surface, robots like Astrobotic’s will explore the moon’s geology, scout out sites for future lunar bases, and carry equipment and material destined for those bases, Horchler says. Eventually, rovers will help mine for minerals and water frozen deep in craters and at the poles.

Astrobotic was founded in 2007 by roboticists at Carnegie Mellon University to compete for the Google Lunar X Prize, which challenged teams to put a robotic spacecraft on the moon. The company pulled out of the competition in 2016, but its mission has continued to evolve. It now has a 20-person staff and contracts with a dozen organizations to deliver payloads to the moon, at US $1.2 million per kilogram, which the company says is the lowest in the industry. Late last year, Astrobotic was one of nine companies that NASA chose to carry payloads to the moon for its 10-year, $2.6 billion Commercial Lunar Payload Services (CLPS) program. The space agency announced the first round of CLPS contracts in late May, with Astrobotic receiving $79.5 million to deliver its payloads by July 2021.

Meanwhile, China, India, and Israel have all launched uncrewed lunar landers or plan to do so soon. The moon will probably be a much busier place by the 60th anniversary of Apollo 11, in 2029.

The moon’s allure is universal, says John Horack, an aerospace engineer at Ohio State University. “The moon is just hanging in the sky, beckoning to us. That beckoning doesn’t know language or culture barriers. It’s not surprising to see so many thinking about how to get to the moon.”

On the moon, there is no GPS, compass-enabling magnetic field, or high-resolution maps for a lunar craft to use to figure out where it is and where it’s going. Any craft will also be limited in the computing, power, and sensors it can carry. Navigating on the moon is more like the wayfinding of the ancient Polynesians, who studied the stars and ocean currents to track their boats’ trajectory, location, and direction.

A spacecraft’s wayfinders are inertial measurement units that use gyroscopes and accelerometers to calculate attitude, velocity, and direction from a fixed starting point. These systems extrapolate from previous estimates, so errors accumulate over time. “Your knowledge of where you are gets fuzzier and fuzzier as you fly forward,” Horchler says. “Our system collapses that fuzziness down to a known point.”
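
A toy simulation makes the “fuzzier and fuzzier” point concrete: small random accelerometer errors are integrated twice, so position uncertainty grows quickly until an absolute fix, such as a terrain-relative match, collapses it. This is a minimal sketch with an invented noise level, not Astrobotic’s estimator.

```python
import random

# Toy 1-D dead-reckoning drift: integrate small random accelerometer
# errors twice and watch the position error grow with time.
dt = 0.1            # seconds per step (assumed)
accel_noise = 0.01  # m/s^2 error per sample (assumed)
vel_err, pos_err = 0.0, 0.0
for step in range(1, 601):  # one minute of flight
    vel_err += random.gauss(0.0, accel_noise) * dt
    pos_err += vel_err * dt
    if step % 150 == 0:
        print(f"t = {step * dt:4.1f} s, position error ~ {abs(pos_err):.2f} m")
# A terrain-relative fix effectively resets pos_err to the accuracy of
# the map match, after which the drift starts accumulating again.
```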

A conventional guidance system can put a vessel down within an ellipse that’s several kilometers long, but Astrobotic’s system will land a craft within 100 meters of its target. This could allow touchdowns near minable craters, at the heavily shadowed icy poles, or on a landing pad next to a moon base. “It’s one thing to land once at a site, a whole other thing to land repeatedly with precision,” says Horchler.

Astrobotic’s terrain-relative navigation (TRN) sensor contains all the hardware and software needed for smart navigation. It uses 32-bit processors that have worked well on other missions and FPGA hardware acceleration for low-level computer-vision processing. The processors and FPGAs are all radiation hardened. The brick-size unit can be bolted to any spacecraft. The sensor will take a several-megapixel image of the lunar surface every second or so as the lander approaches. Algorithms akin to those for facial recognition will spot unique features in the images, comparing them with stored maps to calculate lunar coordinates and orientation.
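
The matching step resembles classic feature-based localization in computer vision. Here is a minimal sketch using OpenCV’s ORB features (my illustration only; Astrobotic’s flight code runs on radiation-hardened processors and FPGAs, and the filenames below are placeholders):

```python
import cv2
import numpy as np

# Minimal terrain-relative matching sketch (illustrative, not flight code).
descent = cv2.imread("descent_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder
lunar_map = cv2.imread("stored_map.png", cv2.IMREAD_GRAYSCALE)   # placeholder

orb = cv2.ORB_create(nfeatures=2000)  # fast binary corner/edge features
kp_img, des_img = orb.detectAndCompute(descent, None)
kp_map, des_map = orb.detectAndCompute(lunar_map, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_img, des_map), key=lambda m: m.distance)[:200]

src = np.float32([kp_img[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_map[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC discards bad matches; the homography H maps camera pixels to map
# pixels, from which position and orientation over the terrain follow.
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print(H)
```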

Those stored maps are a computing marvel. Images taken by NASA’s Lunar Reconnaissance Orbiter (LRO), which has been mapping the moon since 2009, have very different perspectives and shadows from what the lander will see as it descends. This is especially true at the poles, where the angle of the sun changes the lighting dramatically.

So software wizards at Astrobotic are creating synthetic maps. Their software starts with elevation models based on LRO data. It fuses those terrain models with data on the relative positions of the sun, moon, and Earth; the approximate location of the lander; and the texture and reflectiveness of the lunar soil. Finally, a physics-based ray-tracing system, similar to what’s used in animated films to create synthetic imagery, puts everything together.

Horchler pulls up two images of a 50-by-200-kilometer patch near the moon’s south pole. One is a photo taken by the LRO. The other is a digitally rendered version created by the Astrobotic software. I can’t tell them apart. Future TRN systems may be able to build high-fidelity maps on the fly as the lander descends, but that’s impossible with current onboard computing power, Horchler says.

To validate the TRN algorithms, Astrobotic has run tests in the Mojave Desert. A 2014 video shows the TRN sensor mounted on a vertical-takeoff-and-landing vehicle made by Masten Space Systems, another company chosen for NASA’s CLPS program. Astrobotic engineers had mapped the scrubby area beforehand, including a potential landing site littered with sandbags to mimic large rocks. In the video, the vehicle takes off without a programmed destination. The navigation sensor scans the ground, matching what it sees to the stored maps. The hazard-detection sensor uses lidar and stereo cameras to map shapes and elevation on the rocky terrain and track the lander’s distance to the ground. The craft lands safely, avoiding the sandbags.

Astrobotic expects its first CLPS mission to launch in July 2021, aboard a United Launch Alliance Atlas V rocket. The 28 payloads aboard the stout Peregrine lander will include NASA scientific instruments, another scientific instrument from the Mexican Space Agency, rovers from startups in Chile and Japan, and personal mementos from paying customers.

In a space that Astrobotic employees call the Tiger’s Den, a large plush tiger keeps an eye on aerospace engineer Jeremy Hardy, who looks like he’s having too much fun. He’s flying a virtual drone onscreen through a landscape of trees and rocks. When he switches to a drone’s-eye view, the landscape fills with green dots, each a unique feature that the drone is tracking, like a corner or an edge.

The program Hardy is using is called AstroNav, which will guide propulsion-powered drones as they fly through the moon’s immense lava tubes. These temperature-stable tunnels are believed to be tens of kilometers long and “could fit whole cities within them,” Horchler says. The drones will map the tunnels as they fly, coming back out to recharge and send images to a lunar station or to Earth.

Hardy’s drone is flying in uncharted territory. AstroNav uses a simultaneous localization and mapping (SLAM) algorithm, a heavyweight technology also used by self-driving cars and office delivery robots to build a map of their surroundings and compute their own location within that map. AstroNav blends data from the drone’s inertial measurement units, stereo-vision cameras, and lidar. The software tracks the green-dotted features across many frames to calculate where the drone is.
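
In outline, a SLAM system alternates between predicting motion from inertial data and correcting both the vehicle pose and the landmark map against what the cameras see. The skeleton below shows that structure only; the `imu`, `camera`, and `estimator` interfaces are hypothetical stand-ins, not AstroNav’s API.

```python
# Simplified SLAM loop skeleton. The imu/camera/estimator objects are
# hypothetical placeholders that illustrate the structure, nothing more.
def slam_loop(imu, camera, estimator):
    pose = estimator.initial_pose()
    landmarks = {}  # feature id -> estimated 3-D position
    while True:
        # 1. Predict: integrate IMU readings to propagate the pose (drifts).
        pose = estimator.predict(pose, imu.read())
        # 2. Track: re-find known features and detect new ones in the frame.
        observations = camera.track_features()
        # 3. Update: jointly adjust the pose AND the landmark estimates so
        #    predicted feature positions agree with the observations.
        pose, landmarks = estimator.update(pose, landmarks, observations)
```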

The company has tested AstroNav-guided hexacopters in West Virginian caves, craters in New Mexico, and the Lofthellir lava tube of Iceland. Similar SLAM techniques could guide autonomous lunar rovers as they explore permanently shadowed regions at the poles.

Astrobotic has plenty of competition. Another CLPS contractor is Draper Laboratory, which helped guide Apollo missions. The lab’s navigation system, also built around image processing and recognition, will take Japanese startup Ispace’s lander to the moon.

Draper’s “special sauce” is software developed for the U.S. Army’s Joint Precision Airdrop System, which delivers supplies via parachute in war zones, says space systems program manager Alan Campbell. Within a box called an aerial guidance unit is a downward-facing camera, motors, and a small computer running Draper’s software. The software determines the parachute’s location by comparing terrain features in the camera’s images with commercial satellite images to land the parachute within 50 meters of its target.

The unit also uses Doppler lidar, which detects hazards and measures relative velocity. “When you’re higher up, you can compare images to maps,” says Campbell. At lower altitudes, a different method tracks features and how they’re moving. “Lidar will give you a finer-grain map of hazards.”

Draper’s long experience dating back to Apollo gives the lab an edge, Campbell adds. “We’ve landed on the moon before, and I don’t think our competitors can say that.”

Other nations with lunar aspirations are also relying on autonomous navigation. China’s Chang’e 4, for example, became the first craft to land on the far side of the moon, in early January. In its landing video, the craft hovers for a few seconds above the surface. “That indicates it has lidar or [a] camera and is taking an image of the field to make sure it’s landing on a safe spot,” says Campbell. “It’s definitely an autonomous system.”

Israel’s lunar spacecraft Beresheet was also expected to make a fully automated touchdown in April. It relied on image-processing software run on a computer about as powerful as a smartphone, according to reports. However, just moments before it was to land, it crashed on the lunar surface due to an apparent engine failure.

In the race to the moon, there will be no one winner, Ohio State’s Horack says. “We need a fair number of successful organizations from around the world working on this.”

Astrobotic is also looking further out. Its AstroNav could be used on other cosmic bodies for which there are no high-resolution maps, like the moons of Jupiter and Saturn. The challenge will be scaling back the software’s appetite for computing power. Computing in space lags far behind computing on Earth, Horchler notes. Everything needs to be radiation tolerant and designed for a thermally challenging environment. “It tends to be very custom,” he says. “You don’t have a new family of processors every two years. An Apple Watch has more computing power than a lot of spacecraft out there.”

The moon will be a crucial test-bed for precision landing and navigation. “A lot of the technology that it takes to land on the moon is similar to what it takes to land on Mars or icy moons like Europa,” Horchler says. “It’s much easier to prove things out at our nearest neighbor than at bodies halfway across the solar system.”

This article appears in the July 2019 print issue as “Turn Left at Tranquility Base.”

Atom Power Is Launching the Era of Digital Circuit Breakers

Post Syndicated from Prachi Patel original https://spectrum.ieee.org/energywise/energy/the-smarter-grid/atom-power-is-launching-the-era-of-digital-circuit-breakers

New digital circuit breakers that combine computing power with wireless connectivity may finally replace a 140-year-old electromechanical technology

In the dark, dank depths of your home basement hangs a drab gray box that guards the building’s electrical circuits. The circuit breakers inside switch off current flow when there is risk of an overload or short circuit, keeping you safe from fires or electrocution. It’s a critical job, and one that breakers have been doing with a fairly simple, 140-year-old electromechanical technology.

But circuit breakers are about to get a digital overhaul. New semiconductor breakers that combine computing power and wireless connectivity could become the hub of smart, energy-efficient buildings of the future.

“It’s like going from a telephone that just makes calls to a smartphone with capabilities we’d never imagined before,” says Ryan Kennedy, CEO and co-founder of Atom Power in Charlotte, North Carolina. “This is a platform that changes everything in power systems.”

Digital circuit breakers have been a holy grail in power engineering circles. Atom Power has now become the first to earn certification from Underwriters Laboratories (UL) for its product, which combines breakers based on silicon carbide transistors with software. While UL approval isn’t legally required, it’s the industry safety standard for commercial use.
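
Because a solid-state breaker’s behavior lives in software, its trip curve is just code acting on fast current samples. Below is a minimal, hypothetical sketch of inverse-time overcurrent logic of the kind such firmware might implement; the thresholds are invented and this is not Atom Power’s algorithm:

```python
# Hypothetical solid-state breaker trip logic (illustrative thresholds).
RATED_AMPS = 100.0
INSTANT_TRIP_AMPS = 10 * RATED_AMPS  # fault level: open the switch at once
I2T_LIMIT = 5000.0                   # amp^2-seconds overload budget (assumed)

i2t_used = 0.0

def should_trip(amps: float, dt_seconds: float) -> bool:
    """Decide, for one current sample, whether to open the switch."""
    global i2t_used
    if amps >= INSTANT_TRIP_AMPS:    # short circuit: trip in microseconds
        return True
    if amps > RATED_AMPS:            # overload: accumulate heating (I^2 * t)
        i2t_used += amps * amps * dt_seconds
    else:                            # below rating: let the budget recover
        i2t_used = max(0.0, i2t_used - 0.01 * I2T_LIMIT * dt_seconds)
    return i2t_used >= I2T_LIMIT

# A sustained 200 A (2x rating) load trips after 5000 / 200^2 = 0.125 s.
print(should_trip(1500.0, 1e-6))     # True: instantaneous fault trip
```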