All posts by Payal Dhar

Spinach Gives Fuel Cells a Power Up

Post Syndicated from Payal Dhar original

When Shouzhong Zou and his team at the Department of Chemistry, American University, decided to try spinach as a way to improve the performance of fuel cells, even they were a little surprised at how well it worked. In their proof-of-concept experiments, they used spinach—bought from local supermarkets—to make a carbon-rich catalyst that can be used in fuel cells and metal-air batteries.

The spinach was used as a precursor for the high-performance catalysts required for the oxygen reduction reactions (ORRs) in fuel cells. Traditionally, fuel cells have used platinum-based catalysts, but not only is platinum expensive and difficult to obtain, it can be vulnerable to chemical poisoning under certain conditions. Consequently, researchers have looked into biomass-derived, carbon-based catalysts to replace platinum, but there have been bottlenecks in preparing the materials in a cost-effective and non-toxic way. “We were a little bit lucky to pick up spinach,” says Zou, because of its high iron and nitrogen content. “At this point [our method] does require us to add a little bit more nitrogen into the starting material, because even though [spinach] has a lot of nitrogen to begin with, during the preparation process, some of this nitrogen gets lost.”

Zou and his team weren’t the first to discover the electrochemical wonders of spinach, of course, though earlier studies used the leafy greens for other purposes. For example, a 2014 study harvested activated carbon from spinach to create capacitor electrodes, while a more recent paper explored spinach-based nanocomposites as photocatalysts. Spinach, apart from being abundant in iron and nitrogen (both essential in ORRs), is easy to cultivate, and “definitely cheaper than platinum,” Zou adds.

The preparation of the spinach-based catalyst sounds at first suspiciously like a smoothie recipe—wash fresh leaves, pulverize them into a juice, and freeze-dry. This freeze-dried juice is then ground into a powder, to which melamine is added as a nitrogen promoter. Salts like sodium chloride and potassium chloride—“pretty much like the table salt that we use in our kitchen,” says Zou—are also added; they are necessary for creating pores that increase the surface area available for reactions. Nanosheets are produced from the spinach–melamine–salt composites by pyrolyzing them at 900 °C a couple of times. “Obviously…we can optimize how we prepare this material [to make it more efficient].”

An efficient catalyst means a faster, more efficient reaction; in the case of fuel cells, this can increase the device’s energy output. This is where the porosity of the nanosheets helps. “Even though we call them nanosheets,” Zou says, “when they are stacked together, it’s not like a stack of paper that is very solid.” The added salts create tiny holes that allow oxygen to penetrate the material rather than reach only its outer surfaces. “We need to make it porous enough that…all the active sites can be used.”

The other factor that favorably disposed the American University team towards spinach was that it is a renewable source of biomass. “Sustainability is a very important factor in our consideration,” says Zou. The big question to explore, he adds, is how to avoid competition “with the dinner table.” (Biofuel production has already raised concerns about food crops being diverted away from hungry mouths.) “And the second is, how do we keep the carbon footprint down in terms of [this] catalyst preparation…because currently we do use high temperatures in our preparation procedure… If we can find different ways to do these to achieve the same type of material, that will cut back the energy consumption and reduce significantly the carbon footprint.”

Even though the results are promising, there is still a long way to go. Zou cautions that the study so far is only a proof of principle. “We need to be very careful when we talk about practical applications because something that shows excellent performance in [lab] conditions could become more challenging when we implement them in the real device.” Another aspect that needs further study, he adds, is that while the spinach-derived catalyst outperforms platinum-based catalysts in alkaline conditions, its performance in an acidic medium is not as efficient. “So obviously, there is still some tuning we need to do to see if they can work through a range of pH.”

A complete prototype is obviously a next step—testing the catalyst derived from spinach in a fuel cell. “That’s the kind of expertise I don’t have in my lab at this point,” Zou admits. “We are thinking about collaborating with other groups, or we can build up our expertise in this area, because it’s a necessary step.”

ITER Celebrates Milestone, Still at Least a Decade Away From Fusing Atoms


It was a twinkle in U.S. President Ronald Reagan’s eye, an enthusiasm he shared with General Secretary Mikhail Gorbachev of the Soviet Union: boundless stores of clean energy from nuclear fusion.

That was 35 years ago. 

On July 28, 2020, the product of these Cold Warriors’ mutual infatuation with fusion, the International Thermonuclear Experimental Reactor (ITER) in Saint-Paul-lès-Durance, France, inaugurated the start of the machine assembly phase of this industrial-scale tokamak nuclear fusion reactor.

An experiment to demonstrate the feasibility of nuclear fusion as a virtually inexhaustible, waste-free, and non-polluting source of energy, ITER has already been 30-plus years in the planning, with tens of billions of dollars invested. And any new fusion reactors designed on the basis of research conducted here won’t be powering anything until the latter half of this century.

Speaking from the Elysée Palace in Paris via an internet link during last month’s launch ceremony, President Emmanuel Macron said, “[ITER] is proof that what brings together people and nations is stronger than what pulls them apart. [It is] a promise of progress, and of confidence in science.” Indeed, as the COVID-19 pandemic continues to baffle modern science around the world, ITER is a welcome beacon of hope.

ITER comprises 35 collaborating countries—including members of the European Union, China, India, Japan, Russia, South Korea, and the United States—which contribute directly to the project either in cash or in kind with components and services. The EU has contributed about 45 percent, while the other members pitch in about 9 percent each. The total cost of the project could be anywhere between $22 billion and $65 billion, though the latter figure has been disputed.

The idea for ITER was sparked back in 1985, at the Geneva Superpower Summit, where President Ronald Reagan of the United States and General Secretary Mikhail Gorbachev of the Soviet Union spoke of an international collaboration to develop fusion energy. A year later, at the US–USSR Summit in Reykjavik, an agreement was reached among the European Union’s Euratom, Japan, the Soviet Union, and the United States to jointly begin work on the design of a fusion reactor. At that time, controlled release of fusion power had not yet been demonstrated—that was first achieved in 1991, by the Joint European Torus (JET) in the UK.

The first big component to be installed at ITER was the 1,250-metric-ton cryostat base, which was lowered into the tokamak pit in late May 2020. The cryostat is India’s contribution to the reactor; it was maneuvered into place using specialized tools procured for ITER by the Korean Domestic Agency, capable of positioning components weighing hundreds of metric tons to tolerances of a few millimeters. Machine assembly is scheduled to finish by the end of 2024, and by mid-2025 we are likely to see first plasma production.

Anil Bhardwaj, group leader of the cryostat team, tells IEEE Spectrum, “First plasma will only verify [various] compliances for initial preparation of the plasma. That does not mean that we are achieving fusion.”

That will come another decade or so down the line.

If everything goes to plan, the first deuterium–tritium fusion experiments will be conducted by 2035, in essence replicating the fusion reactions that take place in the sun. ITER estimates that for 50 MW of power injected into the tokamak to heat the plasma (up to 150 million degrees Celsius), the machine will output 500 MW of thermal power for 400- to 600-second periods—a tenfold return (expressed as Q ≥ 10). The current record is Q = 0.67, held by the JET tokamak.
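The fusion gain factor Q is simply the ratio of fusion power produced to heating power injected. A quick sketch of the arithmetic behind the figures above (JET's roughly 16 MW from about 24 MW of heating is the widely reported source of the Q = 0.67 record):

```python
# Fusion gain factor: Q = fusion power out / heating power in.
def q_factor(p_fusion_mw: float, p_heating_mw: float) -> float:
    return p_fusion_mw / p_heating_mw

# ITER design target: 500 MW of thermal power from 50 MW of plasma heating.
assert q_factor(500, 50) == 10.0

# JET's record shot produced roughly 16 MW from about 24 MW of heating,
# which is where the oft-quoted Q = 0.67 comes from.
assert round(q_factor(16, 24), 2) == 0.67
```

Note that Q ≥ 10 still measures only thermal output against plasma heating; converting that heat to electricity, and powering the rest of the plant, are separate hurdles.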

Despite recent progress, there is still a lot of uncertainty around ITER. Critics decry the hyperbole around it, especially of its being a magic-bullet solution to the world’s energy problems, in the words of Daniel Jassby, a former researcher at the Princeton Plasma Physics Lab. His 2017 article explains why “scaling down the sun” may not be the ideal fallback plan.

“In the most feasible terrestrial fusion reaction [using deuterium–tritium fuel], 80% of the fusion output is in the form of barrages of neutron bullets, whose conversion to electrical energy is a dubious endeavor,” he said in an interview. Switching to a different type of reactor based on much weaker fusion reactions might result in less neutron production, but such reactors are also unlikely to produce net energy of any type.

Delays and mismanagement have also plagued ITER, something that Jassby contends was a result of poor leadership. “There are only a few people in the world who have the technological, administrative and political expertise that allow them to make continuous progress in directing and completing a multinational project,” he said. Bernard Bigot, who took over as director-general five years ago, possesses the requisite skillset, in Jassby’s opinion. At present, ITER is running about six years behind schedule.

Critics of ITER are also concerned about diverting resources from developing existing renewable energy sources. “The greatest energy issue of our time is not supply, but how to choose among the plethora of existing energy sources for wide-scale deployment,” Jassby said. ITER’s value, however, may lie in dispelling the fantasy of electricity from fusion energy, he said, thus saving hundreds of billions of dollars in the long run.

Jassby thinks that, if successful, ITER will allow physicists to study long-lived, high-temperature fusioning plasmas and the development of neutron sources. There are practical applications for fusion neutrons, he says, such as isotope production, radiography, and activation analysis. He adds that ITER can have significant benefits if new technologies emerge that find application in other fields, such as superconducting magnets, new materials, and novel fabrication techniques.

Philippa Browning, professor of astrophysics at the University of Manchester, believes that only something of the scale of ITER can test how things work in fusion reactors. “It may well be that in future alternative devices turn out to be better, but those advantages could be incorporated into the successor to ITER which will be a demonstration fusion power station… The route to fusion power is slow, [so] we can hope that it will be ready when it is really needed in the second half of this century.” Meanwhile, she added, “it is important that other approaches to fusion are explored in parallel, [in] smaller and more agile projects.”

One of the most impressive things about ITER, Browning said, is a truly international cooperation pushing at the frontiers in many ways. “Understanding how plasmas interact with magnetic fields is a hugely challenging scientific problem… There are all sorts of scientific and technological spin-offs, as well as the direct contribution to achieving, hopefully, a fusion power station.”

Indian Mobile Service Providers Suspected of Providing Discriminatory Services


India’s Telecom Disputes Settlement and Appellate Tribunal (TDSAT) has granted interim relief to telecom companies Bharti Airtel and Vodafone Idea, allowing them to continue with their premium-service plans. The TDSAT order came on 18 July, exactly a week after the country’s telecom regulatory authority had blocked the two companies from offering better speeds to higher-paying customers, citing net neutrality violations.

“This is not a final determination by the TDSAT,” says Apar Gupta of the Internet Freedom Foundation, a digital liberties organization that has been at the forefront of the fight for online freedom, privacy, and innovation in India. While the Telecom Regulatory Authority of India (TRAI) continues its inquiry, the two providers will not be prevented from rolling out their plans.

The matter was brought to TRAI’s notice on 8 July by a rival mobile service provider, Reliance Jio, which wrote to the regulatory body asking about Airtel’s and Vodafone Idea’s Platinum and RedX plans, respectively. “Before offering any such plans ourselves…we would like to seek the Authority’s views on whether [these] tariff offerings…are in compliance with the extant regulatory framework,” the letter said.

Three days later, TRAI asked for the respective Airtel and Vodafone Idea plans to be blocked while these claims were investigated. It also sent both telcos a 10-point questionnaire related to various elements of their services, seeking clarification on how they defined “priority 4G network” and “faster speeds,” among other things. Following the blocking of the plans, Vodafone Idea approached TDSAT, arguing that TRAI’s order was illegal and arbitrary, considering that their RedX plan had been rolled out over eight months earlier. When contacted for comment on the matter, Vodafone declined, “as the matter is in TDSAT court.” Airtel, meanwhile, has agreed to comply with TRAI’s directive and not take new customers for its Platinum plan until the matter has been fully investigated.

Although it is being framed as such by media coverage and in the court of public opinion, strictly speaking, the offering of new tariffs by Airtel and Vodafone Idea is not a net neutrality concern, says Nikhil Pahwa, co-founder of Save the Internet, the campaign that played a key role in framing India’s net neutrality rules. “In India, net neutrality regulation covers…whether specific internet services or apps are either being priced differentially or being offered at speeds different from the rest of the Internet.” However, from a consumer perspective, he adds, “I think it is important for the TRAI to investigate these plans because…it is impossible for telecom operators to guarantee speeds for customers. What needs to be investigated is whether speeds are effectively deprecated for a particular set of consumers, because the throughput from a mobile base station is limited.”

Since July 2018, India has had stringent net neutrality regulations in place—possibly among the strongest in the world—at least on paper. Any form of data discrimination is banned; blocking, degrading, slowing down, or granting preferential speeds or treatment by providers is prohibited; and Internet service providers stand to lose their licenses if found in violation. This was the result of a massive, public, volunteer-driven campaign that began in 2015. Save the Internet estimates that over 1 million citizens took part in the campaign at one point or another.

The concept of net neutrality captured public imagination when, in 2014, Airtel decided it would charge extra for VoIP services. The company pulled its plan after public outcry, but the wheels of differential pricing were set in motion. This resulted in TRAI prohibiting discriminatory tariffs for data services in 2016—a precursor to the net neutrality principles adopted two years later. These developments also forced Facebook to withdraw its zero-rated Free Basics service in India.

“We have not seen net neutrality enforcement in India till now in a very clear manner,” says Gupta, adding that TRAI is in the process of coming up with an enforcement mechanism. “They opened a consultation on it, and invited views from people… Right now they’re in the process of making…recommendations to the Department of Telecom, which can then frame them under the Telegraph Act.” The telecom department exercises wider powers under this Act, even though TRAI also has specific powers in administering certain licensing conditions, including quality of service and interconnection.

“[The] internet is built around the idea that all users have equal right to create websites, applications, and services for the rest of the world, and enables innovation because it is a space with infinite competition,” Pahwa says. And net neutrality is at the core of that freedom.

Peer Review of Scholarly Research Gets an AI Boost


In the world of academics, peer review is considered the only credible validation of scholarly work. Although the process has its detractors, evaluation of academic research by a cohort of contemporaries has endured for over 350 years, with “relatively minor changes.” However, peer review may be set to undergo its biggest revolution ever—the integration of artificial intelligence.

Open-access publisher Frontiers has debuted an AI tool called the Artificial Intelligence Review Assistant (AIRA), which purports to eliminate much of the grunt work associated with peer review. Since the beginning of June 2020, every one of the 11,000-plus submissions Frontiers received has been run through AIRA, which is integrated into its collaborative peer-review platform. This also makes it accessible to external users, accounting for some 100,000 editors, authors, and reviewers. Altogether, this helps “maximize the efficiency of the publishing process and make peer-review more objective,” says Kamila Markram, founder and CEO of Frontiers.

AIRA’s interactive online platform, a first of its kind in the industry, has been in development for three years. It performs three broad functions, explains Daniel Petrariu, director of project management: assessing the quality of the manuscript, assessing the quality of peer review, and recommending editors and reviewers. At the initial validation stage, the AI can make up to 20 recommendations and flag potential issues, including language quality, plagiarism, integrity of images, conflicts of interest, and so on. “This happens almost instantly and with [high] accuracy, far beyond the rate at which a human could be expected to complete a similar task,” Markram says.

“We have used a wide variety of machine-learning models for a diverse set of applications, including computer vision, natural language processing, and recommender systems,” says Markram. This includes simple bag-of-words models, as well as more sophisticated deep-learning ones. AIRA also leverages a large knowledge base of publications and authors.
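As a toy illustration of the simplest model family Markram mentions—not Frontiers' actual code, vocabulary, or pipeline—a bag-of-words model reduces a document to word counts over a fixed vocabulary, which a downstream classifier (for language quality, say) can then score:

```python
from collections import Counter

def bag_of_words(text: str, vocabulary: list) -> list:
    """Map a document to word counts over a fixed vocabulary.

    A hypothetical sketch for illustration only; real systems add
    tokenization, normalization, and far larger vocabularies.
    """
    counts = Counter(text.lower().split())
    return [counts[word] for word in vocabulary]

vocab = ["results", "novel", "significant"]
print(bag_of_words("Novel results show significant novel gains", vocab))
# -> [1, 2, 1]
```

Vectors like these feed a classifier; the deep-learning models Markram also cites learn their representations directly from text instead of relying on a hand-fixed vocabulary.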

Markram notes that, to address issues of possible AI bias, “We…[build] our own datasets and [design] our own algorithms. We make sure no statistical biases appear in the sampling of training and testing data. For example, when building a model to assess language quality, scientific fields are equally represented so the model isn’t biased toward any specific topic.” Feedback from domain experts—including the errors the models make—is captured and used as additional training data for the machine- and deep-learning approaches. “By regularly re-training, we make sure our models improve in terms of accuracy and stay up-to-date.”

The AI’s job is to flag concerns; humans make the final decisions, says Petrariu. As an example, he cites image-manipulation detection—something AI is extremely efficient at, but which is nearly impossible for a human to perform with the same accuracy. “About 10 percent of our flagged images have some sort of problem,” he adds. “[In academic publishing] nobody has done this kind of comprehensive check [using AI] before,” says Petrariu. AIRA, he adds, furthers Frontiers’ mission to make science open and knowledge accessible to all.

Crowdsourced Protein Modeling Efforts Focus on COVID-19



Researchers have been banking on millions of citizen-scientists around the world to help identify new treatments for COVID-19. Much of that work is being done through distributed computing projects that utilize the surplus processing power of PCs to carry out various compute-intensive tasks. One such project is Folding@home, which helped model how the spike protein of SARS-CoV-2 binds with the ACE2 receptor of human cells to cause infection. Started at Stanford University in 2000, Folding@home is currently based at the Washington University School of Medicine in St. Louis; it undertakes research into various cancers, and neurological and infectious diseases, by studying the movement of proteins.

Proteins are made up of a sequence of amino acids that fold into specific structural forms. A protein’s shape is critical in its ability to undertake its specific function. Viruses have proteins that enable them to suppress a host’s immune system, invade cells, and replicate.

Greg Bowman, director of Folding@home, says, “We’re basically building maps of what these viral proteins can do… [The distributed computing network] is like having people around the globe jump in their cars and drive around their local neighborhoods and send us back their GPS coordinates at regular intervals. If we can develop detailed maps of these important viral proteins, we can identify the best drug compounds or antibodies to interfere with the virus and its ability to infect and spread.”

After COVID-19 was declared a global pandemic, Folding@home prioritized research related to the new virus. The number of devices running its software shot up from some 30,000 to over 4 million as a result. Tech behemoths such as Microsoft, Amazon, AMD, Cisco, and others have loaned computing power to Folding@home. The European Organization for Nuclear Research (CERN) has freed up 10,000 CPU cores for the project, and Spain’s premier soccer league, La Liga, has chipped in with the supercomputer it otherwise dedicates to fighting piracy.

While Folding@home models how proteins fold, another distributed computing project called Rosetta@home—this one at the University of Washington’s Institute for Protein Design (IPD)—predicts the final folded shape of a protein. Though the projects are quite different, they are complementary.

“A big difference…is that the Rosetta@home distributed computing is…directly contributing to the design of new proteins… These calculations are trying to craft brand-new proteins with new functions,” says Ian C. Haydon, science communications manager and former researcher at IPD. He adds that the Rosetta@home community, which comprises about 3.3 million instances of the software, has helped the research team come up with more than 2 million candidate antiviral proteins that recognize the coronavirus’s spike protein and bind very tightly to it. When that happens, the spike is no longer able to recognize or infect a human cell.

“At this point, we’ve tested more than 100,000 of what we think are the most promising options,” Haydon says. “We’re working with collaborators who were able to show that the best of these antiviral proteins…do keep the coronavirus from being able to infect human cells…. [What’s more,] they have a potency that looks at least as good if not better than the best known antibodies.”

There are many possible outcomes for this line of research, Haydon says. “Probably the fastest thing that could emerge… [is a] diagnostic…tool that would let you detect whether or not the virus is present.” Since this doesn’t have to go into a human body, the testing and approval process is likely to be quicker. “These proteins could [also] become a therapy that…slows down or blocks the virus from being able to replicate once it’s already in the human body… They may even be useful as [a] prophylactic.”

The Search for Extraterrestrial Intelligence Gets a Major Upgrade


We’ve all wondered at one point or another if intelligent life exists elsewhere in the universe. “I think it’s very unlikely that we are alone,” says Eric Korpela, an astronomer at the University of California Berkeley’s Search for ExtraTerrestrial Intelligence (SETI) Research Center. “They aren’t right next door, but they may be within a thousand light years or so.”

Korpela is project director of the SETI@home project. For more than two decades, that project harnessed the surplus computing power of over 1.8 million computers around the globe to analyze data collected by radio telescopes for narrow-band radio signals from space that could indicate the existence of extraterrestrial technology. On 31 March 2020, SETI@home stopped putting new data in the queue for volunteers’ computers to process, but it’s not the end of the road for the project.

Now begins the group’s next phase. “We need to sift through the billions of potential extraterrestrial signals that our volunteers have found and find any that show signs of really being extraterrestrial,” says Korpela. That task is difficult, he adds, because humans “make lots of signals that look like what we would expect to see from E.T.”

Could Airships Rise Again?


Transportation produces about one-fourth of global anthropogenic carbon emissions. Of this, maritime shipping accounts for 3 percent, and this figure is expected to increase for the next three decades even though the shipping industry is actively seeking greener alternatives, and developing near-zero-emission vessels.

Researchers with the International Institute for Applied Systems Analysis (IIASA) in Austria recently explored another potential solution: the return of airships to the skies. Airships rely on jet stream winds to propel them forward to their destinations. They offer clear advantages over cargo ships in terms of both efficiency and avoided emissions. Returning to airships, says Julian Hunt, a researcher at the IIASA and lead author of the new study, could “ultimately [increase] the feasibility of a 100 percent renewable world.”

Today, world leaders are meeting in New York for the UN Climate Action Summit to present plans to address climate change. Already, average land and sea surface temperatures have risen to approximately 1 degree C above pre-industrial levels. If the current rate of emissions remains unchecked, the Intergovernmental Panel on Climate Change estimates that by 2052, temperatures could rise by up to 2 degrees C. At that point, as much as 30 percent of Earth’s flora and fauna could disappear, wheat production could fall by 16 percent, and water would become more scarce.

According to Hunt and his collaborators, airships could play a role in cutting future anthropogenic emissions from the shipping sector. Jet streams flow in a westerly direction with an average wind speed of 165 kilometers per hour (km/h). On these winds, a lighter-than-air vessel could travel around the world in about two weeks (while a ship would take 60 days) and require just 4 percent of the fuel consumed by the ship, Hunt says.
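A back-of-the-envelope check of those figures, assuming an idealized great-circle circumnavigation of roughly 40,000 km (an assumption for illustration, not a figure from the study; a real jet-stream route meanders, which is consistent with the study's roughly two-week estimate coming out longer than this straight-line bound):

```python
# Rough circumnavigation time at the study's quoted average
# jet-stream speed of 165 km/h.
EARTH_CIRCUMFERENCE_KM = 40_000  # assumed great-circle distance
JET_STREAM_SPEED_KMH = 165       # average westerly speed from the study

hours = EARTH_CIRCUMFERENCE_KM / JET_STREAM_SPEED_KMH
days = hours / 24
print(f"{days:.1f} days")  # ~10 days on an idealized straight path
```

Even at twice this duration, the airship handily beats the 60-day cargo-ship passage the study cites, and the fuel saving comes from letting the wind do the propulsion.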

Rooftop Solar Refinery Produces Carbon-Neutral Fuels


Scientists in Switzerland have demonstrated a technology that can produce kerosene and methanol from solar energy and air.

Scientists have searched for a sustainable aviation fuel for decades. Now, with emissions from air traffic increasing faster than carbon-offset technologies can mitigate them, environmentalists worry that even with new fuel-efficient technologies and operations, emissions from the aviation sector could double by 2050.

But what if, by 2050, all fossil-derived jet fuel could be replaced by a carbon-neutral one made from sunlight and air?

In June, researchers at the Swiss Federal Institute of Technology (ETH) in Zurich demonstrated a new technology that creates liquid hydrocarbon fuels from thin air—literally. A solar mini-refinery—in this case, installed on the roof of ETH’s Machine Laboratory—concentrates sunlight to create a high-temperature (1,500 degrees C) environment inside the solar thermochemical reactor.

Delhi Rolls Out a Massive Network of Surveillance Cameras


The state government says closed-circuit TVs will help fight crime, but digital liberties activists are concerned about the project’s lack of transparency

In India, the government of Delhi is rolling out an ambitious video surveillance program as a crime-prevention measure. Technicians will install more than a quarter million closed-circuit TV (CCTV) cameras near residential and commercial properties across the city, and in schools. A central monitoring system is expected to take care of behind-the-scenes logistics, though authorities have not shared details on how the feeds will be monitored.

After delays due to political and legal wrangles, the installations began on 7 and 8 July. The first cameras to go up in a residential area were installed in Laxmi Bai Nagar, at a housing society for government employees, and at the upmarket Pandara Road in New Delhi. When the rollout is complete, there will be an average of 4,000 cameras in each of Delhi’s 70 assembly constituencies, for a total of around 280,000 cameras.

In early 2020, the National Capital Territory of Delhi (usually just called ‘Delhi’), which includes New Delhi, the capital of India, will vote to elect a new state assembly. Lowering the crime rate is a key election issue for the incumbent Aam Aadmi Party (literally, Common Man’s [sic] Party). The party has promised that the CCTV cameras will deter premeditated crime and foster a semblance of order among the general public.

Cyberespionage Collective Platinum Targets South Asian Governments


Kaspersky says the group used an HTML-based exploit that’s almost impossible to detect

Following a trail of suspicious digital crumbs left in cloud-based systems across South Asia, Kaspersky Lab’s security researchers have uncovered a steganography-based attack carried out by a cyberespionage group called Platinum. The attack targeted government, military, and diplomatic entities in the region.

Platinum was active years ago but had since been believed to be inactive. Kaspersky’s cyber-sleuths, however, now suspect that Platinum has been operating covertly since 2012, through an “elaborate and thoroughly crafted” campaign that allowed it to go undetected for a long time.

The group’s latest campaign harnessed a classic hacking technique known as steganography. “Steganography is the art of concealing a file of any format or communication in another file in order to deceive unwanted people from discovering the existence of [the hidden] initial file or message,” says Somdip Dey, a U.K.-based computer scientist at the University of Essex and the Samsung R&D Institute with a special interest in steganography.
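Platinum's campaign reportedly hid its data in HTML, but the core idea Dey describes can be illustrated with the textbook least-significant-bit (LSB) scheme, in which message bits replace the lowest bit of each pixel value, leaving the image visually unchanged. This is a generic sketch of the concept, not the group's actual method:

```python
def embed(pixels, message):
    """Hide message bytes in the least significant bits of pixel values."""
    # Unpack each byte into 8 bits, least significant first.
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        # Clear the lowest bit, then set it to the message bit.
        stego[i] = (stego[i] & ~1) | bit
    return stego

def extract(pixels, length):
    """Recover `length` bytes from the least significant bits."""
    out = bytearray()
    for byte_index in range(length):
        value = 0
        for i in range(8):
            value |= (pixels[byte_index * 8 + i] & 1) << i
        out.append(value)
    return bytes(out)

cover = list(range(100, 164))      # stand-in for grayscale pixel values
stego = embed(cover, b"hi")
assert extract(stego, 2) == b"hi"  # message recovered intact
```

Because each pixel value changes by at most 1, the carrier looks untouched to casual inspection, which is precisely what makes steganographic channels so hard to detect.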

Digital Doppelgängers Fool Advanced Anti-Fraud Tech


With traces of a user’s browsing history and online behavior, hackers can build a fake virtual “twin” and use it to log in to a victim’s accounts

As new security technologies shield us from cybercrime, a slew of adversarial technologies match them, step for step. The latest such advance is the rise of digital doppelgängers—virtual entities that mimic real user behaviors authentic enough to fool advanced anti-fraud algorithms.

In February, Kaspersky Lab’s fraud-detection teams busted a darknet marketplace called Genesis that was selling digital identities starting from US $5 and going up to US $200. The price depended on the value of the purchased profile—for example, a digital mask that included a full user profile with bank login information would cost more than just a browser fingerprint.

The masks purchased at Genesis could be used through a browser and proxy connection to mimic a real user’s activity. Coupled with stolen (legitimate) user accounts, the attacker was then free to make new, trusted transactions in their name—including with credit cards.