All posts by John Boyd

Japan’s Fugaku Supercomputer Completes First-Ever Sweep of High-Performance Benchmarks

Post Syndicated from John Boyd original

The public-private partnership of Fujitsu and national research institute RIKEN put Japan on top of the world supercomputer rankings nine long years ago with the K computer. They’ve done it again, and in spades, with their jointly developed Fugaku supercomputer.

Fugaku, another name for Mount Fuji, sits at the summit of the TOP500 list announced on 22 June. It earned the top spot with an extraordinary performance of 415 Linpack petaflops. This is nearly triple that of the runner-up and previous No. 1, Oak Ridge National Lab’s Summit supercomputer in Tennessee, built by IBM. Fugaku achieved this using 396 racks employing 152,064 A64FX Arm nodes. The Arm components comprise approximately 95 percent of the computer’s almost 159,000 nodes.

In addition to demonstrating world-beating speed, Fugaku beat the competition in: the High Performance Conjugate Gradients (HPCG) benchmark used to test real-world application performance; the Graph500, a rating for data-intensive loads; and HPL-AI, a benchmark for rating artificial intelligence workloads. A Fugaku prototype also took the top spot for the most energy-efficient system on the Green500 list last November, achieving an outstanding 16.9 gigaflops per watt of power efficiency during a 2.0-petaflops Linpack performance run.

Driving Fugaku’s success is Fujitsu’s 48-core Arm v8.2-A A64FX CPU, which the company is billing as the world’s first CPU to adopt Scalable Vector Extension—an instruction-set extension of the Arm v8-A architecture for supercomputers. The 512-bit-wide, 2.2 GHz CPU employs 3D-stacked memory with 1,024 Gbytes/s of bandwidth, and can handle half-precision arithmetic and multiply-add operations that reduce memory loads in AI and deep-learning applications where lower precision is admissible. The CPUs are directly linked by the 6.8 Gbytes/s Tofu D network interconnect, which uses a six-dimensional mesh torus connection.
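
The payoff of half precision is easy to see in miniature: the same vector occupies a quarter of the memory in FP16 that it does in FP64, so far less data has to move between memory and the CPU. A generic numpy sketch (illustrative only, not Fugaku-specific code):

```python
import numpy as np

# Compare the memory footprint of the same vector stored in
# double precision (FP64) versus half precision (FP16).
n = 1_000_000
fp64 = np.ones(n, dtype=np.float64)
fp16 = np.ones(n, dtype=np.float16)

print(fp64.nbytes)  # 8000000 bytes
print(fp16.nbytes)  # 2000000 bytes -> 4x less data to move

# A multiply-add (a*b + c) carried out entirely in half precision:
a = np.float16(1.5)
b = np.float16(2.0)
c = np.float16(0.25)
print(a * b + c)  # 3.25
```

Deep-learning workloads tolerate this reduced precision because training is itself a noisy, approximate process; the bandwidth saved translates directly into throughput.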

Planning for the computer began in 2011, and over the following three years a number of designs and architectures were considered. “Our guiding strategy was to build a science-driven, low-powered machine that was easy to use and could run science and engineering applications efficiently,” says Toshiyuki Shimizu, Principal Engineer of Fujitsu’s Platform Development Unit.

Independent observers say they succeeded in every element of the goal. “Fugaku is very impressive with over 7 million cores,” says Jack Dongarra, director of the Innovative Computing Lab, University of Tennessee, Knoxville. “The machine was designed for doing computational science problems from the ground up. It’s a first.”

As for the choice of Arm architecture, Shimizu notes the large number of application developers supporting Arm. “Fugaku also supports Red Hat Enterprise Linux 8.x, a de facto standard operating system widely used by commercial servers,” he points out. 

Another plus for Fugaku is that it follows the K computer by maintaining an all-CPU design. Shimizu says this makes memory access and CPU interconnectivity more efficient. Most other supercomputers rely on graphic processing units (GPUs) to accelerate performance. 

Dongarra points out an additional benefit here. “A CPU-only system simplifies the programming. Just one program is needed, not two: one for the CPU and one for the GPU.”

Designing and building a computer that, from the ground up, was intended to be Japan’s national flagship didn’t come cheap, of course. The government’s estimated budget for the project’s R&D, acquisitions, and application development is 110 billion yen (roughly US $1 billion). 

Fujitsu dispatched the first units of Fugaku to the RIKEN Center for Computational Science (R-CCS) in Kobe last December and shipments were completed last month. 

Speaking at the ISC 2020 conference in June, Satoshi Matsuoka, Director of R-CCS, said that although Fugaku was scheduled to start up next year, Japan’s government decided it should be deployed now to help combat COVID-19. He noted that it was already being used to study how the virus behaves, what existing drugs might be repurposed to counter it, and how a vaccine could be made.

Other government-targeted application areas given high priority include: disaster-prevention simulations of earthquakes and tsunami; development of fundamental technologies for energy creation, conversion, and storage; creation of new materials to support next-generation industries; and development of new design and production processes for the manufacturing industry. 

Fugaku will also be used to realize the creation of a smarter society—dubbed Society 5.0—“that balances economic advancement with the resolution of social problems by a system that highly integrates cyberspace and physical space.” 

But the supercomputer industry is nothing if not a game of technology leapfrog, with one country or enterprise providing machines with the highest performance only to be outpaced a short time later. Just how long will Fugaku stay No. 1? 

Shimizu doesn’t claim to know, but he says there is room for further improvement of Fugaku’s performance. “The TOP500 result was only 81 percent of peak performance, whereas the efficiency of silicon is higher. We believe we can improve the performance in all the categories.”

But even that might not be enough to keep it on top for long. As Dongarra says, “The U.S. will have exascale machines in 2021.”  

Novel Error Correction Code Opens a New Approach to Universal Quantum Computing

Government agencies and universities around the world—not to mention tech giants like IBM and Google—are vying to be the first to answer a trillion-dollar quantum question: How can quantum computers reach their vast potential when they are still unable to consistently produce results that are reliable and free of errors? 

Every aspect of these exotic machines—their fragility and engineering complexity; their preposterously sterile, low-temperature operating environment; their complicated mathematics; and their notoriously shy quantum bits (qubits), which flip if an operator so much as winks at them—is a potential source of errors. It says much for the ingenuity of scientists and engineers that they have found ways to detect and correct these errors and have quantum computers working to the extent that they do: at least long enough to produce limited results before errors accumulate and quantum decoherence of the qubits kicks in.

When it comes to correcting errors arising during quantum operations, an error-correction method known as the surface code has drawn a lot of research attention. That’s because of its robustness and the fact that it’s well suited to being set out on a two-dimensional plane (which makes it amenable to being laid down on a chip). The surface code uses the phenomenon known as entanglement (quantum connectivity) to enable single qubits to share information with other qubits on a lattice layout. The benefit: When qubits are measured, they reveal errors in neighboring qubits.
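
The idea that measuring checks reveals errors in neighboring qubits has a simple classical analogy. The sketch below is a toy parity check on a hypothetical 5-bit repetition code, not the surface code itself, but it shows the same principle: the checks locate a flipped bit without reading the data directly.

```python
# Toy analogy of syndrome measurement: parity checks on neighboring
# bits reveal the location of a flipped bit (hypothetical 5-bit code).
data = [0, 0, 0, 0, 0]
data[2] ^= 1  # an "error" flips bit 2

# Each check measures the parity of two adjacent bits.
syndrome = [data[i] ^ data[i + 1] for i in range(4)]
print(syndrome)  # [0, 1, 1, 0] -> checks 1 and 2 fire, locating bit 2
```

In the surface code the checks are entangled ancilla qubits measured on a 2D lattice, and the pattern of fired checks is decoded to infer which physical qubits erred.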

For a quantum computer to tackle complicated tasks, error-correction codes need to be able to perform quantum gate operations; these are small logic operations carried out on qubit information that, when combined, can run algorithms. Classical computing analogs would be AND gates, XOR gates, and the like. 

Physicists describe two types of quantum gate operations (distinguished by their different mathematical approaches) that are necessary to achieve universal computing. One of these, the Clifford gate set, must work in combination with magic-state distillation—a purification protocol that uses multiple noisy quantum states to perform non-Clifford gate operations.
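
The Clifford/non-Clifford distinction can be checked directly with matrices: Clifford gates map Pauli operators to Pauli operators under conjugation, while the non-Clifford T gate does not. A minimal numpy sketch:

```python
import numpy as np

# Clifford gates map Pauli operators to Pauli operators under
# conjugation; the T gate does not, which is why it is "non-Clifford".
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
S = np.diag([1, 1j])                      # Clifford phase gate
T = np.diag([1, np.exp(1j * np.pi / 4)])  # non-Clifford T gate

# S X S-dagger = Y: still a Pauli operator, so S is Clifford.
print(np.allclose(S @ X @ S.conj().T, Y))  # True

# T X T-dagger = (X + Y)/sqrt(2): a superposition of Paulis, not a Pauli.
print(np.allclose(T @ X @ T.conj().T, (X + Y) / np.sqrt(2)))  # True
```

Clifford circuits alone can be simulated efficiently on classical computers; adding a non-Clifford gate such as T is what unlocks universal quantum computation, and that is the gate magic-state distillation exists to supply fault-tolerantly.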

“Without magic-state distillation or its equivalent, quantum computers are like electronic calculators without the division button; they have limited functionality,” says Benjamin Brown, an EQUS researcher at the University of Sydney’s School of Physics. “However, the combination of Clifford and non-Clifford gates can be prohibitive because it eats up so much of a quantum computer’s resources that there’s little left to deal with the problem at hand.”

To overcome this problem, Brown has developed a new type of non-Clifford-gate error-correcting method that removes the need for overhead-heavy distillation. A paper he published on this development appeared in Science Advances on 22 May. 

“Given it is understood to be impossible to use two-dimensional code like the surface code to do the work of a non-Clifford gate, I have used a three-dimensional code and applied it to the physical two-dimensional surface code scheme using time as the third dimension,” explains Brown. “This has opened up possibilities we didn’t have before.”

The non-Clifford gate uses three overlapping copies of the surface code that locally interact over a period of time. This is carried out by taking thin slices of the 3D surface code and collapsing them down into a 2D space. The process is repeated over and over on the fly with the help of just-in-time gauge fixing, a procedure for stacking together the two-dimensional slices onto a chip, as well as dealing with any occurring errors. Over a period of time, the three surface codes replicate the three-dimensional code that can perform the non-Clifford gate function(s).

“I’ve shown this to work theoretically, mathematically,” says Brown. “The next step is to simulate the code and see how well it works in practice.”

Michael Beverland, a senior researcher at Microsoft Quantum commented on the research: “Brown’s paper explores an exciting, exotic approach to perform fault-tolerant quantum computation. It points the way towards potentially achieving universal quantum computation in two spatial dimensions without the need for distillation—something many researchers thought was impossible.”

Brown notes that reducing errors in quantum computing is one of the biggest challenges facing scientists before machines capable of solving useful problems can be built. “My approach to suppressing errors could free up a lot of the hardware from error correction and will allow the computer to get on with doing useful stuff.”

Australia’s Contact-Tracing COVIDSafe App Off to a Fast Start

The Australian government launched its home-grown COVIDSafe contact-tracing app for the new coronavirus on 26 April. And despite the government’s history of technology failures and misuse of personal data, smartphone users have been eager to download the opt-in software on Apple’s App Store and on Google Play. But if the government is to achieve its target of 10 million downloads, there’s still a ways to go.

Novel Annealing Processor Is the Best Ever at Solving Combinatorial Optimization Problems

During the past two years, IEEE Spectrum has spotlighted several new approaches to solving combinatorial optimization problems, particularly Fujitsu’s Digital Annealer and more recently Toshiba’s Simulated Bifurcation Algorithm. Now, researchers at the Tokyo Institute of Technology, with help from colleagues at Hitachi, Hokkaido University, and the University of Tokyo, have engineered a new annealer architecture for this kind of task, which has proven too taxing for conventional computers.

Dubbed STATICA (Stochastic Cellular Automata Annealer Architecture), the processor is designed to take on challenges such as portfolio, logistic, and traffic flow optimization when they are expressed in the form of Ising models.

Originally used to describe the spins of interacting magnets, Ising models can also be used to solve optimization problems. That’s because the evolving magnetic interactions in a system progress towards the lowest-energy state, which conveniently mirrors how an optimization algorithm searches for the best—i.e. ground state—solution. In other words, the answer to a particular optimization question becomes the equivalent of searching for the lowest energy state of the Ising model.
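
The mapping can be made concrete with a toy model: brute-force the lowest-energy spin assignment of a tiny, hypothetical Ising instance. Real annealers search this same energy landscape heuristically, since enumeration is hopeless beyond a few dozen spins.

```python
import itertools

# Toy Ising model: find the ground state (lowest-energy spin
# assignment) by brute force. The couplings J are made up for
# illustration; each spin takes the value +1 or -1.
J = {(0, 1): 1.0, (1, 2): -1.0, (0, 2): 0.5}  # spin-spin couplings

def energy(spins):
    # E = -sum over pairs of J_ij * s_i * s_j (standard Ising energy)
    return -sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())

best = min(itertools.product([-1, 1], repeat=3), key=energy)
print(best, energy(best))  # the ground state has energy -1.5
```

An optimization problem is "solved" by an Ising machine when its cost function has been encoded into couplings like `J` above and the machine settles into this lowest-energy configuration.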

Current annealers such as D-Wave’s quantum annealing computer and Fujitsu’s Digital Annealer calculate spin evolutions serially, points out Professor Masato Motomura of Tokyo Tech’s Institute of Innovative Research, leader of the STATICA project. Because one spin affects all the other spins in a given iteration, spin switchings are calculated one by one, making it a serial process. But in STATICA, he notes, that updating is performed in parallel using stochastic cellular automata (SCA)—a means of simulating complex systems through the interactions of a large number of neighboring “cells” (spins in STATICA) with simple updating rules and some stochasticity (randomness).

In conventional annealing systems, if one spin flips, it affects all of the connected spins and therefore all the spins must be processed in the next iteration. But in STATICA, SCA introduces copies (replicas) of the original spins into the process. All original spin-spin interactions are redirected to their individual replica spins.

“In this method, all the replica spins are updated in parallel using these spin-spin interactions,” explains Motomura. “If one original spin flips, it affects its replica spin but not any of the other original spins because there is no interaction between them, unlike conventional annealing. And in the next iteration, the replica spins are interpreted as original spins and the parallel spin-update is repeated.”
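
The replica scheme can be sketched in a few lines. This is a zero-temperature toy with made-up couplings: STATICA’s actual SCA update also includes a stochastic acceptance rule and a penalty term that keeps replicas close to the originals, both omitted here. The key point the sketch preserves is that every replica spin in a sweep depends only on the original spins, so the whole sweep is parallelizable.

```python
import random

# Toy sketch of the replica idea behind SCA annealing.
random.seed(1)
n = 8
J = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        J[i][j] = J[j][i] = random.choice([-1.0, 1.0])

spins = [random.choice([-1, 1]) for _ in range(n)]

for step in range(20):
    # The local field on each replica spin depends only on the
    # *original* spins, so every update in this sweep is independent.
    fields = [sum(J[i][j] * spins[j] for j in range(n)) for i in range(n)]
    replicas = [1 if h > 0 else -1 if h < 0 else spins[i]
                for i, h in enumerate(fields)]
    spins = replicas  # replicas are reinterpreted as originals

energy = -0.5 * sum(J[i][j] * spins[i] * spins[j]
                    for i in range(n) for j in range(n))
print(spins, energy)
```

In hardware, "independent" sweeps like this map directly onto wide parallel datapaths, which is where STATICA gets its speed.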

As well as enabling parallel processing, STATICA also uses pre-computed results to reduce computation. “So if there is no spin-flip, there is nothing to compute,” says Motomura. “And if the influence of a flipped spin has already been computed, that result is reused.”

For proof of concept, the researchers had a 3-by-4-mm STATICA chip fabricated in a 65-nm CMOS process; it operates at a frequency of 320 megahertz and runs on 649 milliwatts. Memory comprises a 1.3-megabit SRAM. This enabled an Ising model of 512 spins—equivalent to 262,000 connections—to be tested.

“Scaling by at least two orders of magnitude is possible,” notes Motomura. And the chip can be fabricated using the same process as standard processors and can easily be added to a PC as a co-processor, for instance, or integrated onto its motherboard.

“At the ISSCC Conference in February, where we presented a paper on STATICA, we mounted the chip on a circuit board with a USB connection,” he says, “and demonstrated it connected to a laptop PC as proof of concept.”

To compare STATICA’s performance against existing annealing technologies (using results given in published papers), the researchers employed a Maxcut benchmark test of 2,000 connections. STATICA came out on top in processing speed, accuracy, and energy efficiency. Compared with its nearest competitor, Toshiba’s Simulated Bifurcation Algorithm, STATICA took 0.13 milliseconds to complete the test, versus 0.5 ms for SBA. In energy efficiency, STATICA ran on an estimated 2 watts of power, far below the 40 watts for SBA. And in histogram comparisons of accuracy, STATICA also came out ahead, according to Motomura.

For the next step, he says the team will scale up the processor and test it out using realistic problems. 

Other than that, there are no more technology hurdles to overcome. 

“STATICA is ready,” states Motomura. “The only question is whether there is sufficient market demand for such an annealing processor. We hope to see interest, for instance, from ride-sharing companies like Uber, and product distributors such as Amazon. Local governments wanting to control problems such as traffic congestion might also be interested. These are just a few examples of how STATICA might be used besides more obvious applications like portfolio optimization and drug discovery.”

Graphene Solar Thermal Film Could Be a New Way to Harvest Renewable Energy

Researchers at the Center for Translational Atomaterials (CTAM) at Swinburne University of Technology in Melbourne, Australia, have developed a new graphene-based film that can absorb sunlight with an efficiency of over 90 percent, while simultaneously eliminating most IR thermal emission loss—the first time such a feat has been reported.

The result is an efficient solar heating metamaterial that can heat up rapidly to 83 degrees C (181 degrees F) in an open environment with minimal heat loss. Proposed applications for the film include thermal energy harvesting and storage, thermoelectricity generation, and seawater desalination.

Unmanned Solar Aircraft Aims to Compete Commercially With Satellites and Drones

At 35 meters, the wingspan of the new BAE Systems aircraft equals that of a Boeing 737, yet the plane weighs in at just 150 kilograms, including a 15 kg payload. The unmanned plane, dubbed the PHASA-35 (Persistent High-Altitude Solar Aircraft), made its maiden flight on 10 February at the Royal Australian Air Force Woomera Test Range in South Australia.

“It flew for just under an hour—enough time to successfully test its aerodynamics, autopilot system, and maneuverability,” says Phil Varty, business development leader of emerging products at BAE Systems. “We’d previously tested other sub-systems such as the flight control system in smaller models of the plane in the U.K. and Australia, so we’d taken much of the risk out of the craft before the test flight.”

The prototype aircraft uses gallium arsenide–based triple-junction solar cell panels manufactured by MicroLink Devices in Niles, Ill. MicroLink claims an energy conversion efficiency of 31 percent for these specialist panels.

“For test purposes, the solar panels—which are as thin as paper—covered just part of the wingspan to generate 4 kilowatts of power,” says Varty. “For the production version, we’ll use all that space to produce 12 kilowatts.”

The energy is used to drive two brushless, direct-drive electric motors modified for the aircraft, and to charge a lithium-ion battery system comprising over 400 batteries that delivers the energy needed to fly the plane at night. The batteries are supplied by Amprius Technologies in Fremont, Calif.

Varty says that unlike the solar panels, which have been chosen for their high efficiency, the batteries—similar to the kind powering smartphones—are not massively efficient. Instead, they are a proven, reliable technology that can easily be replaced when a more efficient version becomes available.

“Although the test flight took place in Australia’s summer, the aircraft is designed for flying during the worst time of the year—the winter solstice,” says Varty. “That’s why it has the potential to stay up for a whole year in the stratosphere, around 65,000 feet [20,000 meters], where there’s little wind and no clouds or turbulence.”

He describes the aircraft’s unmanned flight control system as a relatively simple design similar to that used in drones and model airplanes. A controller on the ground, known as the pilot, guides the plane, though the aircraft can also run on autopilot using a preprogrammed route. Other technicians may also be involved to operate specialist applications such as cameras, surveillance, or communications equipment.

The aircraft was originally developed by Prismatic Ltd., a designer and manufacturer of unmanned aerial vehicles in Southwest England that was acquired last year by BAE.

Further test flights are planned for this year and engineers from Prismatic and BAE are using the results of the trial to improve the various subsystems.

“There were no surprises found during the test flight so perhaps the biggest challenge now is educating the market that what we are offering is different,” says Varty. “Different from satellites or drones.”

Its special feature is that it can sit in the stratosphere over a particular point by flying into the wind or doing maneuvers like a figure eight, using cameras or surveillance equipment mounted on gimbals to provide constant monitoring. By comparison, Varty says, even the best military drones can stay airborne for a maximum of only three days; satellites are limited by virtue of the fact that they have to maintain a speed of 7 kilometers per second or more in order to stay in orbit.

“Satellites provide merely a date-time snapshot of what is going on below,” Varty points out. “Whereas we can monitor a spot for as long as necessary to draw conclusions about what is likely to happen next: where forest fires are going to spread, for instance, or where disaster relief is most needed.”

After reviewing the BAE announcement, Saburo Matsunaga, head of the Laboratory for Space Systems at the Tokyo Institute of Technology, sees both pros and cons with the aircraft.

Compared to satellites, its “lower altitude means it should realize higher resolution monitoring, low power communications, dedicated local area services, and so on.” But performance, he adds, “will be strongly dependent on the equipment constraints such as mass, volume, and power budget” because the aircraft’s lightweight design limits its payload.

As for possible drawbacks, Matsunaga suspects long-duration operations—which assume no maintenance and no failures—may be more difficult to achieve and more expensive to deal with than BAE expects. “The flights must always be controlled, and the available time for [issue-free] flights will be short. Communications with the plane’s tracking control and operation systems may also be a concern over long flights. And there’s the possibility that external disturbances may affect sensor resolution.”

BAE plans to offer several different services to customers based on the aircraft’s ability to constantly monitor a particular point or to provide it with communications via onboard transceiver equipment. “A customer could use it to monitor how crops are growing or how minute by minute, hour by hour changes to weather affect areas of interest on the ground,” says Varty. “Now we’re working out the pricing for such services. Direct sales of the aircraft are also possible.”

Commercialization will soon follow completion of the flight trials, which could mean the launch of services as early as next year.

Toshiba’s Optimization Algorithm Sets Speed Record for Solving Combinatorial Problems

Toshiba has come up with a new way of solving combinatorial optimization problems. A classic example of such problems is the traveling salesman dilemma, in which a salesman must find the shortest route between many cities.

Such problems are found aplenty in science, engineering, and business. For instance, how should a utility select the optimal route for electric transmission lines, considering construction costs, safety, time, and the impact on people and the environment? Even the brute force of supercomputers is impractical when new variables increase the complexity of a question exponentially. 
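
The exponential blow-up is easy to quantify: a symmetric traveling-salesman instance with n cities has (n−1)!/2 distinct tours, so each added city multiplies the search space. A small illustrative sketch (city coordinates are random, for demonstration only):

```python
import itertools
import math
import random

# How the number of distinct tours, (n-1)!/2, explodes with n:
for n in (5, 10, 15, 20):
    print(n, math.factorial(n - 1) // 2)

# Exhaustive search is still fine for a tiny random instance.
random.seed(0)
cities = [(random.random(), random.random()) for _ in range(7)]

def tour_length(order):
    # Total length of the closed tour visiting cities in this order.
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

best = min(itertools.permutations(range(7)), key=tour_length)
print(round(tour_length(best), 3))
```

At 7 cities there are only 360 distinct tours; at 20 there are roughly 6 × 10^16, which is why heuristic hardware like Ising machines is attractive.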

But it turns out that many of these problems can be mapped to ground-state searches made by Ising machines. These specialized computers use mathematical models to describe the up-and-down spins of magnetic materials interacting with each other. Those spins can be used to represent a combinatorial problem. The optimal solution, then, becomes the equivalent of finding the ground state of the model.

With New Tech, Panasonic Aims to Revive Interest in Delivering Broadband Over Power Lines

Using radio frequencies to transmit data over existing power lines both inside and outside of homes has long promised to turn legacy cabling into a more attractive asset by delivering two essential services on a single wire. But broadband over power lines (BPL) has never achieved its potential, due in part to initial low speeds and unreliability, and concerns about radio interference and electromagnetic radiation.

One company that has continued to invest in and improve BPL since 2000 is Panasonic, a multinational electronics and appliance manufacturer with headquarters in Osaka, Japan. In March of this year, the IEEE Standards Association approved the IEEE 1901a standard for BPL that covers IoT applications, and which is based on Panasonic’s upgraded HD-PLC technology.

HD-PLC (high-definition power line communications) is backward compatible with IEEE’s 1901 standard for BPL, ratified in 2010. The 1901a standard implements new functions based on Panasonic’s Wavelet orthogonal frequency-division multiplexing (OFDM) technology, which is also incorporated in the 2010 standard.
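
The multicarrier principle behind OFDM can be sketched with the common FFT-based construction. Note this is a simplified illustration: 1901a specifies wavelet OFDM, whose filter bank differs from the plain FFT used here, and the parameters below are made up.

```python
import numpy as np

# Minimal FFT-based OFDM round trip: many low-rate subcarriers are
# superposed into one time-domain signal, then separated again.
rng = np.random.default_rng(0)
n_carriers = 64

# Map random bits onto QPSK symbols, one per subcarrier.
bits = rng.integers(0, 2, size=2 * n_carriers)
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

tx = np.fft.ifft(symbols)  # transmit: superpose all subcarriers
rx = np.fft.fft(tx)        # receive (ideal channel): separate them

print(np.allclose(rx, symbols))  # True -- symbols recovered exactly
```

Splitting the data across many narrow subcarriers is what lets power-line modems work around the noisy, frequency-selective channel that mains wiring presents.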

Is the World Ready for Floating Nuclear Power Stations?

The world’s first floating nuclear power plant (FNPP) docked at Pevek, Chukotka, in Russia’s remote Far East on 14 September. It completed a journey of some 9,000 kilometers from where it was constructed in a St. Petersburg shipyard. First, it was towed to the city of Murmansk, where its nuclear fuel was loaded, and from there it took the North Sea Route to the other side of Russia’s Arctic coast.

The FNPP will replace an aging land-based nuclear plant and a brown coal-fired plant, reducing some 50,000 tons of CO2 emissions a year, according to Rosatom, the project’s creator and Russia’s state nuclear corporation. The reactor is slated to begin operations this December.

The co-generation plant, named the Akademik Lomonosov, consists of a non-motorized barge, two pressurized-water KLT-40S reactors similar to those powering Russian nuclear icebreakers, and two steam turbine plants.

The FNPP can generate up to 70 megawatts (MW) of electricity and 50 gigacalories of heat an hour. That is sufficient to power the electric grids of the resource-rich region—where some 50,000 people live and work—and also deliver steam heat to the supply lines of Pevek city. The plant will manage this second feat by using steam extracted from the turbines to heat its intermediate circuit water system, which circulates between the reactor units and the coastal facilities, from 70 to 130 degrees C.
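
A back-of-envelope conversion (assuming 1 cal = 4.1868 J) shows what 50 gigacalories of heat per hour means in more familiar units:

```python
# Convert 50 Gcal of heat per hour into thermal megawatts,
# assuming the IT calorie of 4.1868 joules.
gcal_per_hour = 50
joules_per_hour = gcal_per_hour * 1e9 * 4.1868
megawatts_thermal = joules_per_hour / 3600 / 1e6
print(megawatts_thermal)  # 58.15 -> about 58 MW of heat
```

So the plant’s heat output is of the same order as its 70 MW of electrical capacity, which is why co-generation makes sense for a town the size of Pevek.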

Construction of the floating reactor began in 2007 and had to overcome a messy financial situation including the threat of bankruptcy in 2011. The venture is based on the small modular reactor (SMR) design: a type of nuclear fission reactor that is smaller than conventional reactors. Such reactors can be built from start to finish at a plant and then shipped—fully-assembled, tested, and ready to operate—to remote sites where normal construction would be difficult to manage.

Andrey Zolotkov, head of the Murmansk, Russia office of Bellona Foundation, an environmental organization based in Oslo, Norway, acknowledges the practicability of the SMR design. But he is one of many who question its necessity in this particular case.

“The same plant could be built on the ground there (in Chukotka) without resorting to creating a floating structure,” says Zolotkov. “After all, the [nuclear power plant] presently in use was built on land there and has been operating for decades.” 

The floating design has raised both environmental and safety concerns, given that the plant will operate in the pristine Arctic and must endure its harsh winters and choppy seas. Greenpeace has dubbed it a “floating Chernobyl,” and “a nuclear Titanic.”

Rosatom rejects such criticism, saying the plant meets safety standards put forth by Russia and the International Atomic Energy Agency. The company notes the same kind of reactors have been used in icebreakers and submarines for decades. And, Rosatom states on its website, the “FNPP will be moored and secured to a special pier,” and operate without the need for “motor or propeller functions.”

Coastal structures, dams, and breakwaters have also been built to protect the vessel against tsunamis and icebergs.

The plant employs a number of active and passive safety systems, including an electrically driven automated system and a passive system that uses gravity to insert control rods into the reactor core to ensure the reactor remains at subcritical levels in emergencies. The reactors also use low-enriched uranium with a U-235 concentration below 20 percent, which makes the fuel unsuitable for producing nuclear weapons.

Given such safety measures, Rosatom says on its site that a peer-reviewed probabilistic safety assessment modeling of possible damage to the FNPP finds the chances of a serious accident happening at the FNPP “are less than one hundred thousandth of a percent.”

Zolotkov, who worked in various capacities—including radiation safety officer—for 35 years in Russia’s civilian nuclear fleet, also notes that there have been no serious incidents on such ships since 1975. “In the event of an accident in the FNPP, the consequences, I believe, would be localized within its structure, so the release of radioactive substances will be minimal,” he says. 

The plant’s nuclear fuel has to be replaced every three years. The unloaded fuel is held in onboard storage pools, and later in dry containers also kept on board. Every 10 to 12 years during its 40-year life cycle (possibly extendable to 50 years), the FNPP will be towed to a special facility for maintenance.

After decommissioning, the plant will be towed to a deconstruction and recycling facility. Rosatom says on its site, “No spent nuclear fuel or radioactive waste is planned to be left in the Arctic—spent fuel will be taken to the special storage facilities in mainland Russia.” 

Rosatom has not disclosed the cost of the venture, calling it a pilot project. It is currently working on a next-generation version that will use two RITM-200M reactors, each rated at 50 MW. Improvement targets include a more compact design, longer periods between refueling, flexible load-following capabilities, and multipurpose uses that include water desalination and district heating. 

Provided Rosatom receives sufficient orders, it says it aims to compete in price with plants based on fossil fuels and renewable energy.

The company, however, may face challenges other than marketing and operating its novel design. “These FNPPs will eventually carry spent nuclear fuel and are not yet recognized by international maritime law,” says Zolotkov. “So Rosatom may face problems obtaining permits and insurance when it comes to towing them along certain sea routes.”

A Novel Thermal Battery Promises Green Power Around the Clock

Japanese scientists have developed a thermal battery that converts heat into electricity when buried in a geothermal zone

You can fry an egg on the ground in Las Vegas in August, but try that in Iceland or Alaska and you’ll just end up with the stuff on your face—unless you know how to tap into the Earth’s vast reservoirs of geothermal energy. 

Researchers at the Tokyo Institute of Technology have developed a new kind of battery that can reliably generate electric power from heat in environments with temperatures ranging from 60 degrees C to 100 degrees C—which is low enough to mimic geothermal heat.

In an earlier experiment, the researchers developed sensitized thermal cells (STCs) that employed dye-sensitized solar cells to convert light into electric power. In their latest advance, team leader Sachiko Matsushita, an associate professor at Tokyo Tech, explains that they replaced the dye with a semiconductor to enable the cells to operate using heat instead of light.

Open-Source Tool Lets Anyone Experiment With Cryptocurrency Blockchains

SimBlock, a new blockchain simulator, lets users play around with the parameters of Bitcoin, Litecoin, and Dogecoin

Blockchain technology records information to a ledger shared between thousands of nodes. In the technology’s purest form, those nodes are not controlled by any central authority, and information cannot be changed once written to the ledger. Because of the security and autonomy this technology offers (in theory at least), blockchains now underpin many popular cryptocurrencies such as Bitcoin.
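
The "cannot be changed once written" property comes from hash chaining, which a toy ledger makes concrete. This is illustrative Python, not any real cryptocurrency's block format: each block commits to the previous block's hash, so editing any entry breaks every link after it.

```python
import hashlib
import json

# Toy hash-chained ledger: tampering with history is detectable.
def block_hash(block):
    return hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"prev": "0" * 64, "data": "genesis"}]
for data in ("alice pays bob 5", "bob pays carol 2"):
    chain.append({"prev": block_hash(chain[-1]), "data": data})

def valid(chain):
    # Every block must commit to the hash of its predecessor.
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(valid(chain))                      # True
chain[1]["data"] = "alice pays bob 500"  # tamper with history
print(valid(chain))                      # False -- links no longer match
```

Real blockchains add proof-of-work or other consensus on top so that no single node can rewrite the chain and re-hash it faster than the honest majority.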

But as Kazuyuki Shudo, an associate professor at the Tokyo Institute of Technology, points out, “It has been nearly impossible to test improvements on real-world blockchain networks, because that would mean having to update the software of all the thousands of nodes on a network.”

In researching blockchains, Shudo and his colleagues searched for a simulator that would help them experiment with and improve the technology. But existing simulators were too hard to use and lacked the features the team wanted. Moreover, these simulators had apparently been created for specific research projects and abandoned soon after that work was completed; many of the tools the group found were no longer being updated. 
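SimBlock itself is a full network simulator, but the kind of question it answers can be sketched with a back-of-the-envelope Monte Carlo model (my own illustration, not SimBlock code): how often does a freshly mined block go "stale" because a rival block appears before it finishes propagating? That rate depends on two tunable parameters, the mean block interval and the network propagation delay:

```python
import random

def stale_block_rate(block_interval, propagation_delay, n_blocks=100_000, seed=1):
    # Block discoveries form a Poisson process with the given mean interval.
    # A block is counted stale if the next block is found before the first
    # one has finished propagating across the network.
    rng = random.Random(seed)
    stale = 0
    for _ in range(n_blocks):
        gap = rng.expovariate(1.0 / block_interval)  # time to next block, seconds
        if gap < propagation_delay:
            stale += 1
    return stale / n_blocks

# Bitcoin-like parameters: 600-second interval, ~10-second worldwide propagation.
# Analytic value: 1 - exp(-10/600), about 1.7 percent.
print(stale_block_rate(600, 10))

# A chain ten times faster with the same delay forks far more often,
# around 1 - exp(-10/60), or 15 percent.
print(stale_block_rate(60, 10))
```

Trying parameter changes like this on a live network would require updating thousands of nodes, which is precisely the problem Shudo describes.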

AI, Drones Survey Great Barrier Reef in Last-Ditch Effort to Avoid Catastrophe

Post Syndicated from John Boyd original

An Australian research team is using tech to monitor global climate change’s assault on the world’s largest living structure

The stats are daunting. The Great Barrier Reef is 2,300 kilometers long, comprises 2,900 individual coral reefs, and covers an area greater than 344,000 square kilometers, making it the world’s largest living structure and a UNESCO World Heritage Site. 

A team of researchers from Queensland University of Technology (QUT) in Brisbane is monitoring the reef, located off the coast of northeastern Australia, for signs of degradation such as the bleaching caused by a variety of environmental pressures, including industrial activity and global warming. 

The team, led by Felipe Gonzalez, an associate professor at QUT, is collaborating with the Australian Institute of Marine Science (AIMS), an organization that has been monitoring the health of the reef for many years. AIMS employs aircraft, in-water surveys, and NASA satellite imagery to collect data on a particular reef’s condition. But these methods have drawbacks, including the relatively low resolution of satellite images and high cost of operating fixed-wing aircraft and helicopters.

So Gonzalez is using an off-the-shelf drone modified to carry both a high-resolution digital camera and a hyperspectral camera. The monitoring is conducted from a boat patrolling the waters 15 to 70 km from the coast. The drone flies 60 meters above the reef, and the hyperspectral camera captures reef data up to three meters below the water’s surface. This has greatly expanded the area of coverage and is helping to verify AIMS’s findings.

The digital camera is used to build up a conventional 3D model of an individual reef under study, explains Gonzalez. But this conventional camera captures light in only three spectral channels: the red, green, and blue bands covering the 380-to-740-nanometer visible portion of the electromagnetic spectrum. The hyperspectral camera, by contrast, collects reflected light in 270 spectral bands.

“Hyperspectral imaging greatly improves our ability to monitor the reef’s condition based on its spectral properties,” says Gonzalez. “That’s because each component making up a reef’s environment—water, sand, algae, etc.—has its own spectral signature, as do bleached and unbleached coral.”
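The article doesn't detail the team's classification pipeline, but a standard way to compare a pixel's spectrum against reference signatures like those Gonzalez describes is the spectral angle mapper, sketched below with made-up four-band toy signatures (real reef spectra would span all 270 bands):

```python
import math

def spectral_angle(a, b):
    # Angle between two spectra treated as vectors; smaller means more similar.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def classify(pixel, signatures):
    # Label the pixel with the reference signature at the smallest angle.
    return min(signatures, key=lambda name: spectral_angle(pixel, signatures[name]))

# Hypothetical reflectance signatures, invented for illustration.
signatures = {
    "healthy coral":  [0.10, 0.25, 0.30, 0.20],
    "bleached coral": [0.40, 0.45, 0.50, 0.48],
    "sand":           [0.55, 0.60, 0.62, 0.65],
}

pixel = [0.12, 0.27, 0.29, 0.21]
print(classify(pixel, signatures))  # -> healthy coral
```

Because the angle ignores overall brightness, this measure is relatively robust to the varying illumination a camera sees through a few meters of water.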

But this expansion in reef coverage and richness of gathered data presented the team with a new challenge. Whereas AIMS divers can gather information on 40 distinct points on a reef in an underwater session, just one hyperspectral image presents more than 4,000 data points. Consequently, a single drone flight can amass a thousand gigabytes of raw data that has to be processed and analyzed. 

In processing the data initially, the team used a PC, custom software tools, and QUT’s high-performance computer, a process that took weeks and drew heavily on the machine’s run time.

So the team applied for and received a Microsoft AI for Earth grant, which makes software tools, cloud computing services, and AI deep learning resources available to researchers working on global environmental challenges. 

“Now we can use Microsoft’s AI tools in the cloud to supplement our own tools and quickly label the different spectral signatures,” says Gonzalez. “So, where processing previous drone sweeps used to take three or four weeks, depending on the data, it now takes two or three days.”

This speedup in data processing is critical. If it took a year or more for the team to tell AIMS that a certain part of the reef was degrading rapidly, it might be too late to save it. 

“And by being informed early, the government can then take quicker action to protect an endangered area of the reef,” Gonzalez adds.

He notes that hyperspectral imaging is now a growing area of remote sensing in a variety of fields, including agriculture, mineral surveying, mapping, and locating water resources.

For example, he and colleagues at QUT are also using the technology to monitor forests, wheat crops, and vineyards that can be affected by pathogens, fungi, or aphids.

Meanwhile, over the next two months, Gonzalez will continue processing the spectral data collected from the reef so far. Then, in September, he will begin a second round of drone flights. 

“We aim to return to the four reefs AIMS has already studied to monitor any changes,” he says, “then extend the monitoring to new reefs.”