Tag Archives: Semiconductors/Nanotechnology

IBM’s $3-Billion Research Project Has Kept Computing Moving Forward

Post Syndicated from Dexter Johnson original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/ibms-3billion-research-project-has-kept-computing-moving-forward

Back in 2014, under the looming shadow of the end of Moore’s Law, IBM embarked on an ambitious, US $3 billion project dubbed “7-nm and Beyond”. The bold aim of that five-year research project was to see how computing would continue into the future as the physics of decreasing chip dimensions conspired against it.

Six years later, Moore’s Law isn’t much of a law anymore. The observation by Gordon Moore (and later the industry-wide adherence to that observation) that the number of transistors on a chip doubled roughly every two years seems now almost to be a quaint vestige of days gone by. But innovation in computing is still required, and the “7-nm and Beyond” project has helped meet that continuing need.

“The search for new device architectures to enable the scaling of devices, and the search for new materials for performance differentiation will never end,” says Huiming Bu, Director at IBM’s Advanced Logic & Memory Technology Research, Semiconductor, and AI Hardware Group.

Although the chip industry may not feel as constrained by Moore’s Law as it has in the past, the “7-nm and Beyond” project has delivered important innovations even while some chip manufacturers have seemingly thrown up their hands in frustration at various points in recent years. 

One example of this frustration was the decision two years ago by GlobalFoundries to suspend its 7-nanometer chip development.

Back in 2015, one year into its “7-nm and Beyond” project, IBM announced its first 7-nm test chip, for which extreme-ultraviolet lithography (EUV), supplied by ASML, was a key enabling technology. While there have been growing pains in the use of EUV—leaving the richest chip manufacturers as the only ones continuing the scaling-down it enables—the technique has since proven essential not only for 7-nm nodes, but also for 5-nm nodes and beyond, according to Bu.

“Back in the 2014-2015 time window, the whole industry had a big question about the practical feasibility of EUV technology,” says Bu. “Now it’s not a question. Now, EUV has become the mainstream enabler. The first-of-its-kind 7-nm work we delivered based on EUV back then helped to build the confidence and momentum towards EUV manufacturing in our industry.”

Of course, EUV has enabled 7-nm nodes, but the aim of IBM was to look beyond that. IBM believes that the foundational element of chips to enable the scaling beyond FinFET will be the nanosheet transistor, which some have suggested may even be the last step in Moore’s Law.

The nanosheet looks to be the replacement for the FinFET architecture, and is expected to make possible the transition from the 7-nm and 5-nm nodes to the 3-nm node. In a nanosheet field-effect transistor, current flows through multiple stacked sheets of silicon that are completely surrounded by the transistor gate. This design greatly reduces the amount of current that can leak in the off state, allowing more current to drive the device when the switch is turned on.

“In 2017, the industry had a question about what will be the new device structure beyond FinFET,” says Bu. “At this point, three years later, the whole industry is getting behind nanosheet technology as the next device structure after FinFET.”

Transistors have seen some key developments, but the “7-nm and Beyond” project also yielded significant insights into how the wiring above all those transistors will be made going into the future.

“Part of our innovation has been to extend copper as far as possible,” says Daniel Edelstein, an IBM Fellow who works on silicon technology research and MRAM/BEOL process strategy. “The hard part, as always,” he says, “has been simply patterning these extremely tiny and tall trenches and filling them without defects with copper.”

Despite the challenges with using copper, Edelstein doesn’t see the industry migrating away from it to more exotic materials in the near future. “Copper is certainly not at the end of its rope for what’s being manufactured today,” he says.

He adds: “Several companies have indicated that they intend to continue using it. So I can’t tell you exactly when it breaks. But we have seen that the so-called resistance crossover point keeps getting pushed farther into the future.”

While chip dimensions, architectures and materials have driven much of the innovations of the “7-nm and Beyond” project, both Edelstein and Bu note that artificial intelligence (AI) is also playing a key role in how they are approaching the future of computing.

“With the onset of AI-type, brain-inspired computing and other kinds of non-digital computing, we’re starting to develop, at the research level, additional devices—especially emerging memory devices,” says Edelstein.

Edelstein is referring to devices such as phase-change memory (or “memristors,” as some refer to them), which are thought of as analog computing devices.

The emergence of these new memory devices has prompted a resurgence in thinking about potential applications over and above conventional data storage. Researchers are imagining new roles for the thirty-year-old magnetoresistive random-access memory (MRAM), which IBM has been working on since MRAM’s debut.

“MRAM has finally had enough breakthroughs where it’s now not only manufacturable, but also approaching the kinds of requirements that it needs to achieve to be competitive with SRAM for system cache, which is kind of the holy grail in the end,” says Edelstein.

The evidence of this embedding of MRAM and other nonvolatile memories—including RRAM and phase-change memory—directly into the processor is seen in the move last year by chip equipment manufacturer Applied Materials to give its customers the tools for enabling this change.

The pursuit of new devices, new materials, and new computing architectures for better power-performance will continue, according to Bu. He also believes that the demand to integrate various components into a holistic computing system is starting to drive a whole new world of heterogeneous integration.

Bu adds: “Building these heterogeneous architecture systems is going to become a key in future computing. It is a new innovation strategy driven by the demands of AI.”

Researchers Create Shiny Rainbows of Nanotech Chocolate

Post Syndicated from Tekla S. Perry original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/researchers-create-shiny-rainbows-of-nanotech-chocolate

While so many of us are working at home during the coronavirus pandemic, we do worry that serendipitous hallway conversations aren’t happening.

Last year, before the pandemic, it was one of those conversations that led researchers at ETH Zurich to develop a way of making chocolates shimmer with color—without any coloring agents or other additives.

The project, announced in December, involves what the scientists call “structural color.” The technique creates colors in a way similar to what a chameleon does—that is, by using the structure of its skin to scatter particular wavelengths of light. The researchers have yet to release details, but Alissa M. Fitzgerald, founder of MEMS product development firm AMFitzgerald, has a pretty good guess.

She explains that iridescence in nature (like that inside oyster shells and on the wings of butterflies) involves nanoscale patterns in the form of lines, plates, or holes. To make iridescent chocolate, she surmises, the researchers likely created a nanotech chocolate mold, using e-beam lithography to etch lines about 100 nm wide into a glass or silicon wafer.

The ETH researchers hope to get their technique for coloring chocolate out of the lab and into the mass market. Meanwhile, during the pandemic shutdown, some tech professionals have been playing with rainbow chocolates of their own, like software engineer and startup founder Samy Kamkar, recently profiled in the New York Times. (You can only bake so much bread, after all.)

Chocolate is only the beginning for nanocolors, Fitzgerald says: “The combination of nano- and micro-technology fabrication techniques with atypical materials like food, fabric, paper and plastic is going to lead to some really exciting new products as well as improve or enhance existing products. For example, Teijin Fiber Japan uses structural color methods to make ‘Morphotex’ fabric, named after the iridescent Morpho butterfly, recently demonstrated in the concept Morphotex Dress. Everyday objects are poised to benefit from advances in nanotechnology.”

TSMC’s Geopolitical Dance

Post Syndicated from Samuel K. Moore original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/tsmcs-geopolitical-dance

The global semiconductor supply chain is having an interesting year. Having adjusted to the potential and realities of a U.S.-China trade war, it is now faced with an economy-halting pandemic. Friday’s news seemed a microcosm of what is emerging from this moment: a combination of less concentrated advanced manufacturing and attempts to pressure companies to bend to geopolitical objectives.

On 15 May, the world’s largest semiconductor foundry, TSMC, announced that it planned to build a $12-billion fab in Arizona, which would begin production in 2024. (The $12 billion will be invested between 2021 and 2029.) That same day, the Trump Administration said it would now require TSMC and other non-U.S. chip makers to get a license from the U.S. Commerce Department if they want to ship chips to Huawei that were made using U.S. software and technology.

First, the new fab: According to VLSI Research’s Dan Hutcheson, a U.S. fab is partly a ploy to keep Apple happy. Apple CEO Tim Cook has been pushing for such a move for some time to ensure supply continuity for the processors that go into the company’s products. These processors have historically used leading-edge chipmaking technology. Currently that’s TSMC’s 7-nanometer process, but the company says the next-generation process, 5 nanometers, is in production now.

TSMC, of course, has other important customers for its leading-edge technologies. AMD, Xilinx, Qualcomm, and Nvidia are among them; and more recently, so are cloud giants such as Google, Microsoft, Facebook, and Amazon, which have been developing their own server and AI designs.

To keep them happy, the Arizona fab will have to operate at the most advanced process available. TSMC is promising a 5-nanometer fab there, but by 2024, when production is set to begin, TSMC may be moving to another process generation, 3 nanometers. But fabs are built to be upgraded, Hutcheson points out. They aren’t built around a particular technology, and it seems assured that whatever 3-nanometer and more advanced processes entail, they will still rely mainly on extreme-ultraviolet lithography, the same tech central to the 7-nanometer and 5-nanometer processes.

However, transferring a manufacturing process to a new location and getting it to the point that it yields a profitable proportion of wafers is never easy. Hutcheson notes that TSMC struggled with that when it first built fabs in Tainan, which is little more than an hour away by high-speed rail from its headquarters in Hsinchu. However, depending on where in Arizona the fab is located, the company may benefit from infrastructure and experienced employees related to Intel’s advanced fabs in Chandler.

The plant’s projected 20,000-wafers-per-month capacity figure is actually quite low compared to other facilities. It matches the company’s recently built 16-nanometer Fab 16 in Nanjing, China. But it’s not in the same league as the company’s planned 5-nanometer Fab 18 in southern Taiwan, which will have a nameplate capacity of 120,000 wafers per month. Still, 20,000 wafers per month is in line with the first phase of other new fabs, says Joanne Itow, managing director at Semico Research. And that capacity could translate to 144 million applications processors per year, according to Itow. That’s enough to partly supply several customers and generate about $1.44 billion in revenue for TSMC.
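Itow's capacity and revenue figures imply a per-wafer yield and an average selling price that the article doesn't state directly. A quick back-of-the-envelope sketch (the inputs are the article's estimates; the derived numbers are implied, not reported):

```python
# Back out the figures implied by Semico Research's estimates.
wafers_per_month = 20_000
chips_per_year = 144e6   # Itow's estimate of application processors per year
revenue = 1.44e9         # Itow's revenue estimate for TSMC, in USD

wafers_per_year = wafers_per_month * 12
dies_per_wafer = chips_per_year / wafers_per_year  # implied good dies per wafer
revenue_per_chip = revenue / chips_per_year        # implied average selling price

print(f"{dies_per_wafer:.0f} good dies per wafer")  # 600
print(f"${revenue_per_chip:.2f} per chip")          # $10.00
```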

That’s all assuming this fab actually happens. “Right now, it’s a PowerPoint fab,” says Hutcheson. TSMC’s own press release strikes a very conditional tone: TSMC “announced its intention to build and operate an advanced semiconductor fab in the United States with the mutual understanding and commitment to support from the U.S. federal government and the State of Arizona.”

“Technically, it probably doesn’t matter where the chips are manufactured; however, in today’s tense trade arena the optics of having a fab in the United States provide a more positive partnership atmosphere,” says Itow.

The other TSMC news is much less of a win-win. The U.S. government has sought to starve Huawei of advanced semiconductors. In 2019, its Bureau of Industry and Security (BIS) added Huawei and its affiliates, particularly its semiconductor arm HiSilicon, to its list of entities that U.S. firms can’t sell to without a license. Huawei got around this by stepping up its own chip design capabilities, though it relies on foundries, especially TSMC, to manufacture its advanced chips. BIS is now seeking to tighten the screws by extending the licensing to foundries using U.S. software and tools to make Huawei’s chips.

In effect, the rule boils down to one country specifying which tools can be used in a factory in another country to produce goods for a customer in a third. TSMC is among the largest customers of U.S. chip-tool makers, and those toolmakers have reason to worry, according to the Semiconductor Industry Association. “We are concerned this rule may create uncertainty and disruption for the global semiconductor supply chain, but it seems to be less damaging to the U.S. semiconductor industry than the very broad approaches previously considered,” the organization’s CEO John Neuffer said in a statement.

The new rule will likely accelerate Huawei’s ongoing shift away from U.S. technology, says Nelson Dong, a senior partner in charge of national security at the international law firm Dorsey & Whitney and a board member at the non-profit advocacy group the National Committee on US-China Relations. Indirectly, “this move may well force the global semiconductor industry to look away from U.S. suppliers of semiconductor design tools and semiconductor production equipment and even to create new rival companies in other countries, including China itself,” he says. He cites the example of export restrictions in the satellite industry, which ultimately led to the growth of competing businesses outside the United States and higher prices for U.S. satellite makers due to their suppliers’ smaller market.

It is difficult to imagine how the U.S. could enforce such a rule in an advanced fab. “Fabs are an extreme version of ‘What happens in Vegas, stays in Vegas,’” quips Hutcheson. Manufacturing processes are proprietary and very closely guarded. “How would they even know it was going on?”

Experiment Reveals the Peculiar Way Light Travels in a Photonic Crystal

Post Syndicated from Charles Q. Choi original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/topological-photonic-crystal-light

In novel materials known as photonic topological insulators, wavelengths of light can flow around sharp corners with virtually no losses. Now scientists have witnessed key details of what the light does inside these structures, which could help them to better engineer these materials for real-world applications.

Topology is the branch of mathematics that explores what features of shapes withstand deformation. For instance, an object shaped like a doughnut can get pushed and pulled into the shape of a mug, with the doughnut’s hole forming the hole in the cup’s handle, but it could not get deformed into a shape that lacked a hole.

Using insights from topology, researchers developed the first electronic topological insulators in 2007. Electrons traveling along the edges or surfaces of these materials strongly resist any disturbances that might hinder their flow, much as a doughnut might resist any change that would remove its hole.

New Math Makes Scientists More Certain About Quantum Uncertainties

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/new-math-makes-scientists-more-certain-about-quantum-uncertainties

Quantum measurements, at the core of next-generation technologies including quantum computing, quantum cryptography, and ultra-sensitive electronics, may face a new hurdle as system sensitivities brush up against Heisenberg’s Uncertainty Principle.

The practical Heisenberg limits in measuring some quantities up to the ultimate quantum sensitivity may be larger than expected—by a factor of pi. This new finding would, according to physicist Wojciech Górecki of the University of Warsaw in Poland, represent “an impediment compared to previous expectations.”

Górecki said he and his collaborators arrived at this theoretical limit by applying a branch of math known as Bayesian statistics to familiar quantum measurement problems.

The standard problem posed in many Intro to Quantum classes involves the push-and-pull conflict between measuring a particle’s position with high precision versus knowing that same particle’s momentum with high precision as well.

As Werner Heisenberg famously theorized in 1927, the product of uncertainties of these two observables can never dip below a very small number equal to Planck’s constant divided by four times pi (h/4π).
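In symbols, with position uncertainty Δx and momentum uncertainty Δp, the relation reads:

```latex
\Delta x \, \Delta p \;\ge\; \frac{h}{4\pi} \;=\; \frac{\hbar}{2}
```

where ħ = h/2π is the reduced Planck constant.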

So, down at the quantum scale, there are always tradeoffs. Measuring a particle’s position with very high precision calls for sacrificing how precisely you can determine the speed and direction of its travel.

Yet, said Górecki, plenty of quantum scale measurements involve neither position nor momentum. For instance, some photonics instruments measure quantities like the phase of a wavefront versus the number of photons counted in a given energy range.

Górecki notes that canonical Heisenberg isn’t as much help here as is a related concept called the “Heisenberg limit.” The Heisenberg Limit, he says, delineates the smallest possible uncertainty in a measurement, given a set number of times a system is probed. “It is a natural consequence of Heisenberg’s uncertainty principle, interpreted in a slightly broader context,” says Górecki.

It was long believed that, with a hypothetical technology trying to discover phase as precisely as possible using only n photons, the Heisenberg Limit to the uncertainty in phase was 1/n. But no technology had been devised to prove that 1/n was the ultimate universal “Heisenberg Limit.”

There’s a good reason why. Górecki and colleagues report in a new paper in the journal Physical Review Letters that the Heisenberg Limit in this case scales as π/n instead of 1/n. In other words, the smallest measurable uncertainty is more than three times as much as previously believed. And so now we know that our observations of the universe are a little bit fuzzier than we imagined.

(To be clear, “n” here is not necessarily just the number of photons used in a measurement. It could also represent a number of other limits on the amount of resources expended in making a precision observation. The variable “n” here could also be, Górecki notes, the number of quantum gates in a measurement or the total time spent interrogating the system.)
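The practical difference between the two bounds can be sketched numerically (a toy illustration; the values of n below are arbitrary):

```python
import math

# Compare the long-assumed phase-uncertainty bound (1/n) with the revised
# Heisenberg Limit (pi/n) reported by Gorecki and colleagues.
for n in (10, 100, 1000):
    old_bound = 1 / n
    new_bound = math.pi / n
    # The ratio is always pi, i.e. a little more than three times looser.
    print(f"n={n}: old={old_bound:.4f}, new={new_bound:.4f}, "
          f"ratio={new_bound / old_bound:.2f}")
```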

Górecki says the new finding may not remain purely theoretical for too much longer. A 2007 experiment in precision phase measurement came within 56 percent of the new Heisenberg Limit.

“Our paper has attracted the interest of eminent researchers in the field of statistics, who find this idea worth spreading,” says Górecki. “Perhaps it would be possible to construct a simpler proof that could be included in standard textbooks.”

Graphene Made in a Flash From Trash

Post Syndicated from Charles Q. Choi original https://spectrum.ieee.org/tech-talk/semiconductors/nanotechnology/graphene-flash

Graphene can literally be made in a flash by using electricity to zap nearly anything that contains carbon, including discarded food and plastic, a new study finds.

Graphene is made of flexible, transparent sheets each just one carbon atom thick. It’s 200 times stronger than steel, lighter than paper, and more electrically and thermally conductive than copper. Currently the most common way to make graphene in bulk is via exfoliation. It works a bit like how you might exfoliate your skin, and involves sloughing layers of graphene off a block of graphite.

However, chemical exfoliation uses lots of acid and is very expensive, while exfoliation using sound energy or fast-flowing fluid pries off platelets of graphene that are often more than 20 layers thick. Scientists can also produce graphene by depositing it from a vapor onto a surface, but this only makes tiny amounts.

Ultrasensitive Microscope Reveals How Charging Changes Molecular Structures

Post Syndicated from Dexter Johnson original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/structural-changes-of-molecules-during-charging-revealed

New ability to image molecules under charging promises big changes for molecular electronics and organic photovoltaics

All living systems depend on the charging and discharging of molecules to convert and transport energy. While science has revealed many of the fundamental mechanisms of how this occurs, one area has remained shrouded in mystery: How does a molecule’s structure change while charging? The answer could have implications for a range of applications, including molecular electronics and organic photovoltaics.

Now a team of researchers from IBM Research in Zurich, the University of Santiago de Compostela and ExxonMobil has reported in the journal Science the ability to image, with unprecedented resolution, the structural changes that occur to individual molecules upon charging.

This ability to peer into this previously unobserved phenomenon should reveal the molecular charge-function relationships and how they relate to biological systems converting and transporting energy. This understanding could play a critical role in the development of both organic electronic and photovoltaic devices.

“Molecular charge transition is at the heart of many important phenomena, such as photoconversion, energy and molecular transport, catalysis, chemical synthesis, molecular electronics, to name some,” said Leo Gross, research staff member at IBM Zurich and co-author of the research. “Improving our understanding of how the charging affects the structure and function of molecules will improve our understanding of these fundamental phenomena.”

This latest breakthrough is based on research going back 10 years when Gross and his colleagues developed a technique to resolve the structure of molecules with an atomic force microscope. AFMs map the surface of a material by recording the vertical displacement necessary to maintain a constant force on the cantilevered probe tip as it scans a sample’s surface.

Over the years, Gross and his colleagues refined the technique so it could see the charge distribution inside a molecule, and then were able to get it to distinguish between individual bonds of a molecule.

The trick to these techniques was to functionalize the tip of the AFM probe with a single carbon monoxide (CO) molecule. Last year, Gross and his colleague Shadi Fatayer at IBM Zurich believed that the ultra-high resolution possible with the CO tips could be combined with controlling the charge of the molecule being imaged.

“The main hurdle was in combining two capabilities, the control and manipulation of the charge states of molecules and the imaging of molecules with atomic resolution,” said Fatayer.

The concern was that the functionalization of the tip would not be able to withstand the applied bias voltages used in the experiment. Despite these concerns, Fatayer explained that they were able to overcome the challenges in combining these two capabilities by using multi-layer insulating films, which avoid charge leakage and allow charge state control of molecules.

The researchers were able to control the charge state by transferring single electrons from the AFM tip to the molecule, or vice versa. This was achieved by applying a voltage between the tip and the molecule. “We know when an electron is attached or removed from the molecule by observing changes in the force signal,” said Fatayer.

The IBM researchers expect that this research could have an impact in the fundamental understanding of single-electron based and molecular devices. This field of molecular electronics promises a day when individual molecules become the building blocks of electronics.

Another important prospect of the research, according to Fatayer and Gross, would be its impact on organic photovoltaic devices. Organic photovoltaics have been a tantalizing solution for solar power because they are cheap to manufacture. However, organic solar cells have been notoriously poor compared to silicon solar cells at converting sunlight to energy efficiently.

The hope is that by revealing how the structural changes of molecules under charge impact the charge transition of molecules, engineers will be able to further optimize organic photovoltaics.

Is Graphene by Any Other Name Still Graphene?

Post Syndicated from Dexter Johnson original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/is-graphene-by-any-other-name-still-graphene

Consumers may finally have a way to know if their graphene-enabled products actually get any benefit from the wonder material

Last year, the graphene community was rocked by a series of critical articles that appeared in some high-profile journals. First there was an Advanced Materials article with the rather innocuous title “The Worldwide Graphene Flake Production.” It was perhaps the follow-up article that appeared in the journal Nature that really shook things up, with its incendiary title: “The war on fake graphene.”

In these two articles it was revealed that material that had been claimed to be high-quality (and high-priced) graphene was little more than graphite powder. Boosted by their appearance in high-impact journals, these articles threatened the foundations of the graphene marketplace.

But while these articles triggered a lot of hand wringing among the buyers and sellers of graphene, it’s not clear that their impact extended much beyond the supply chain of graphene. Whether or not graphene has aggregated back to being graphite is one question. An even bigger one is whether or not consumers are actually being sold a better product on the basis that it incorporates graphene. 

Consumer products featuring graphene today include everything from headphones to light bulbs. Consequently, there is already confusion among buyers about the tangible benefits graphene is supposed to provide. And of course the situation becomes even worse if the graphene sold to make products may not even be graphene: how are consumers supposed to determine whether graphene infuses their products with anything other than a buzzword?

Another source of confusion arises because graphene incorporated into a product is effectively a different animal from graphene in isolation. There is ample scientific evidence that graphene, when included in a material matrix like a polymer or even paper, can impart new properties to the material. “You can transfer some very useful properties of graphene into other materials by adding graphene, but just because the resultant material contains graphene it does not mean it will behave like free-standing graphene,” explains Tom Eldridge of UK-based Fullerex, a consultancy that advises companies on how to include graphene in a material matrix.

Eldridge added: “This is why it is often misleading to talk about the superlative properties of free-standing graphene for benefiting applications, because almost always graphene is being combined with other materials. For instance, if I combine graphene with concrete I will not get concrete which is 200 times stronger than steel.”

This is what leaves consumers a bit lost at sea: Graphene can provide performance improvements to a product, but what kind and by how much?

The Graphene Council (Disclosure: The author of this story has also worked for The Graphene Council) recognized this knowledge gap in the market and has just launched a “Verified Graphene Product” program in addition to its “Verified Graphene Producer” program. The Verified Graphene Producer program takes raw samples of graphene and characterizes them to verify what type of graphene they are, while the Verified Graphene Product program addresses the issue of what graphene is actually doing in products that claim to use it.

Companies that are marketing products that claim to be enhanced by graphene can use this service, and the verification can be applied to their product to give buyers confidence that graphene is actually doing something. (It’s not known if there are any clients taking advantage of it yet.)

“Consumers want to know that the products they purchase are genuine and will perform as advertised,” said Terrance Barkan, executive director of The Graphene Council. “This applies equally to purchasers of graphene enhanced materials and applications. This is why independent, third-party verification is needed.”

Magnet Sets World Record at 45.5 Teslas

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/tech-talk/semiconductors/nanotechnology/a-beachhead-to-superstrong-magnetic-fields

It’s the strongest continuous DC magnetic field ever recorded and could help scientists study nuclear fusion and exotic states of matter

A new multicomponent, partially superconducting electromagnet—currently the world’s strongest DC magnet of any kind—is poised to reveal a path to substantially stronger magnets still. The new magnet technology could help scientists study many other phenomena, including nuclear fusion, exotic states of matter, “shape-shifting” molecules, and interplanetary rockets, to name a few.

The National High Magnetic Field Laboratory in Tallahassee, Florida, is home to four types of advanced, ultra-strong magnets. One supports magnetic resonance studies. Another is configured for mass spectrometry. And a different type produces the strongest magnetic fields in the world. (Sister MagLab campuses at the University of Florida and Los Alamos National Laboratory provide three more high-capacity magnets for other fields of study.)

It’s that last category on the Tallahassee campus—world’s strongest magnet—that the latest research is attempting to complement. The so-called MagLab DC Field Facility, in operation since 1999, is nearing a limit in the strength of magnetic fields it can produce with its current materials and technology.

The MagLab’s DC magnet maintains a steady 45 Tesla of field strength, which until very recently was the strongest continuous magnetic field produced in the world. (Not to be confused with the electric car brand of the same name, Tesla is also a unit of magnetic field strength. The higher its Tesla rating, the stronger the magnet. For comparison, a typical MRI machine is built around a superconducting magnet with approximately 3 Tesla of field strength. The Earth’s magnetic field, felt at the planet’s surface, is 0.00005 T.)
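For scale, the field strengths mentioned here can be compared directly (a small sketch using the article's numbers):

```python
# Field strengths from the article, in teslas.
fields_T = {
    "MagLab steady-state DC magnet": 45.0,
    "new record magnet": 45.5,
    "typical MRI scanner": 3.0,
    "Earth's field at the surface": 5e-5,
}

earth = fields_T["Earth's field at the surface"]
for name, strength in fields_T.items():
    print(f"{name}: {strength} T ({strength / earth:,.0f}x Earth's field)")
```

Even an everyday MRI magnet is tens of thousands of times stronger than the field we feel at Earth's surface; the MagLab record magnet is nearly a million times stronger.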

A Faster Way to Rearrange Atoms Could Lead to Powerful Quantum Sensors

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/a-faster-way-to-rearrange-atoms

The technique is also more accurate than the traditional method of poking atoms with the tip of a scanning tunneling microscope

The fine art of adding impurities to silicon wafers lies at the heart of semiconductor engineering and, with it, much of the computer industry. But this fine art isn’t yet so finely tuned that engineers can manipulate impurities down to the level of individual atoms.

As technology scales down to nanometer sizes and smaller, though, the placement of individual impurities will become increasingly significant. That’s what makes last month’s announcement so interesting: scientists can now rearrange individual impurities (in this case, single phosphorus atoms) in a sheet of graphene by using electron beams to knock them around like croquet balls on a field of grass.

The finding suggests a new vanguard of single-atom electronic engineering. According to research team member Ju Li, professor of nuclear science and engineering at MIT, gone are the days when individual atoms could only be moved around mechanically, and often clumsily, on the tip of a scanning tunneling microscope.

Europe Has Invested €1 Billion Into Graphene—But For What?

Post Syndicated from Dexter Johnson original https://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/europe-has-invested-1-billion-into-graphenebut-for-what

Six years into an ambitious 10-year research project, experts weigh in on whether the Graphene Flagship can help the “wonder material” make it through the Valley of Death

Six years ago, the European Union (EU) embarked on an ambitious project to create a kind of Silicon Valley for the “wonder material” of the last decade: graphene. The project—called the Graphene Flagship—would leverage €1 billion over 10 years to push graphene into commercial markets. The project would bring together academic and industrial research institutes to not only ensure graphene research would be commercialized, but to also make Europe an economic powerhouse for graphene-based technologies.

To this day, the EU’s investment in the Graphene Flagship represents the single largest project in graphene research and development (though some speculate that graphene-related projects in China may have surpassed it). In the past six years, the Graphene Flagship has spawned nine companies and 46 new graphene-based products. Despite these achievements, there remains a sense among critics that the wonder material has not lived up to expectations and the Flagship’s efforts have not done much to change that perception.

Graphene’s unique properties have engendered high expectations in a host of areas, including for advanced composites and new types of electronic devices. While graphene can come in many forms, its purest form is that of a one-atom-thick layer of graphite. This structure has provided the highest thermal conductivity ever recorded—10 times higher than copper. It also has one of the highest intrinsic electron mobilities of any material (the speed at which electrons can travel through a material), which is approximately 100 times greater than silicon—a tantalizing property for electronic applications.
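The two ratios quoted above can be checked against representative room-temperature literature values (the absolute numbers below are assumptions for illustration; only the ratios appear in the article):

```python
# Approximate room-temperature values from the literature (assumed here).
thermal_conductivity = {"graphene": 4000.0, "copper": 400.0}     # W/(m*K)
electron_mobility = {"graphene": 150_000.0, "silicon": 1_400.0}  # cm^2/(V*s)

k_ratio = thermal_conductivity["graphene"] / thermal_conductivity["copper"]
mu_ratio = electron_mobility["graphene"] / electron_mobility["silicon"]

print(f"thermal conductivity: ~{k_ratio:.0f}x copper")
print(f"electron mobility:    ~{mu_ratio:.0f}x silicon")
```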

The Graphene Flagship is now more than halfway through its 10-year funding cycle. To many observers, the project’s achievements—or lack thereof—are a barometer for the commercial status of graphene, which was first synthesized at the UK’s University of Manchester in 2004, earning its discoverers the Nobel Prize in 2010. When it was founded, the Flagship wrestled with a key question that it still faces today: Was the Flagship set up to support “fundamental” research or “applied” research in its quest to make Europe the “Graphene Valley” of the world?