All posts by Michael Koziol

A Machine Learning Classifier Can Spot Serial Hijackers Before They Strike

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/tech-talk/telecom/internet/mit-and-caida-researchers-want-to-use-machine-learning-to-plug-one-of-the-internets-biggest-holes

How would you feel if, every time you had to send sensitive information somewhere, you relied on a chain of people playing the telephone game to get that information to where it needs to go? Sounds like a terrible idea, right? Well, too bad, because that’s how the Internet works.

Data is routed through the Internet’s various metaphorical tubes using what’s called the Border Gateway Protocol (BGP). Any data moving over the Internet needs a physical path of networks and routers to make it from A to B. BGP is the protocol that moves information through those paths—though the downside, like a person in a game of telephone, is that each junction in the path knows only what it’s been told by its immediate neighbors.
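To make the telephone-game analogy concrete, here is a minimal sketch, in Python, of how BGP-style announcements propagate: each network learns a route only from what a neighbor tells it, and prepends itself before passing the announcement along. The three-network topology and the prefix are made up for illustration; real BGP involves far more policy and state.

```python
# Minimal sketch of BGP-style path propagation (hypothetical topology).
# Each autonomous system (AS) learns a route only from its neighbors'
# announcements and prepends its own number before re-announcing it.

neighbors = {
    "AS100": ["AS200"],            # AS100 peers only with AS200
    "AS200": ["AS100", "AS300"],   # AS200 sits in the middle
    "AS300": ["AS200"],            # AS300 peers only with AS200
}

# routes[asn][prefix] = the AS path that network currently believes in
routes = {asn: {} for asn in neighbors}

def announce(origin, prefix):
    """Origin AS announces a prefix; the news spreads neighbor to neighbor."""
    routes[origin][prefix] = [origin]
    pending = [(origin, neighbor) for neighbor in neighbors[origin]]
    while pending:
        sender, receiver = pending.pop()
        path = [receiver] + routes[sender][prefix]
        # A receiver only knows what its neighbor told it -- it has no
        # built-in way to verify the rest of the path. That blind trust
        # is exactly what BGP hijackers exploit.
        if prefix not in routes[receiver] or len(path) < len(routes[receiver][prefix]):
            routes[receiver][prefix] = path
            pending.extend((receiver, n) for n in neighbors[receiver] if n != sender)

announce("AS100", "203.0.113.0/24")
print(routes["AS300"])  # {'203.0.113.0/24': ['AS300', 'AS200', 'AS100']}
```

The key point for hijacking sits in the comment: a network receiving an announcement cannot verify the path it was handed, only repeat it.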

5G’s Waveform Is a Battery Vampire

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/telecom/wireless/5gs-waveform-is-a-battery-vampire

As carriers roll out 5G, industry group 3GPP is considering other ways to modulate radio signals


In 2017, members of the mobile telephony industry group 3GPP were bickering over whether to speed the development of 5G standards. One proposal, originally put forward by Vodafone and ultimately agreed to by the rest of the group, promised to deliver 5G networks sooner by developing more aspects of 5G technology simultaneously.

Adopting that proposal may have also meant pushing some decisions down the road. One such decision concerned how 5G networks should encode wireless signals. 3GPP’s Release 15, which laid the foundation for 5G, ultimately selected orthogonal frequency-division multiplexing (OFDM), a holdover from 4G, as the encoding option.

But Release 16, expected by year’s end, will include the findings of a study group assigned to explore alternatives. Wireless standards are frequently updated, and in the next 5G release, the industry could address concerns that OFDM may draw too much power in 5G devices and base stations. That’s a problem, because 5G is expected to require far more base stations to deliver service and connect billions of mobile and IoT devices.

“I don’t think the carriers really understood the impact on the mobile phone, and what it’s going to do to battery life,” says James Kimery, the director of marketing for RF and software-defined radio research at National Instruments Corp. “5G is going to come with a price, and that price is battery consumption.”

And Kimery notes that these concerns apply beyond 5G handsets. China Mobile has “been vocal about the power consumption of their base stations,” he says. A 5G base station is generally expected to consume roughly three times as much power as a 4G base station. And more 5G base stations are needed to cover the same area.

So how did 5G get into a potentially power-guzzling mess? OFDM plays a large part. OFDM transmits data by chopping it into portions and sending those portions simultaneously on closely spaced frequencies chosen so that the portions are “orthogonal” (meaning they do not interfere with each other).

The trade-off is that OFDM has a high peak-to-average power ratio (PAPR). Generally speaking, the orthogonal portions of an OFDM signal add together constructively—the very quality that keeps them from canceling each other out also means their energy can momentarily pile up in phase. That means any receiver needs to be able to take in a lot of energy at once, and any transmitter needs to be able to put out a lot of energy at once. Those high-energy peaks cause OFDM’s high PAPR and make the method less energy efficient than other encoding schemes.
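To put a rough number on that, here is a minimal sketch using NumPy; the subcarrier count and QPSK modulation are arbitrary choices for illustration, not parameters from any 5G specification.

```python
import numpy as np

# Rough sketch: estimate the peak-to-average power ratio (PAPR) of one
# OFDM symbol. Subcarrier count and modulation are arbitrary choices.
rng = np.random.default_rng(0)
n_subcarriers = 1024

# QPSK symbols on each subcarrier (one of four phases, unit amplitude).
bits = rng.integers(0, 4, n_subcarriers)
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))

# The inverse FFT sums all subcarriers into the time-domain OFDM symbol.
time_signal = np.fft.ifft(symbols) * np.sqrt(n_subcarriers)  # normalize power

power = np.abs(time_signal) ** 2
papr_db = 10 * np.log10(power.max() / power.mean())
print(f"PAPR of this OFDM symbol: {papr_db:.1f} dB")
# Typically on the order of 10 dB -- the amplifier must handle brief peaks
# roughly ten times the average power, which hurts efficiency.
```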

Yifei Yuan, ZTE Corp.’s chief engineer of wireless standards, says there are a few emerging applications for 5G that make a high PAPR undesirable. In particular, Yuan, who is also the rapporteur for 3GPP’s study group on nonorthogonal multiple-access possibilities for 5G, points to massive machine-type communications, such as large-scale IoT deployments.

Typically, when multiple users, such as a cluster of IoT devices, communicate using OFDM, their communications are organized using orthogonal frequency-division multiple access (OFDMA), which allocates a chunk of spectrum to each user. (To avoid confusion, remember that OFDM is how each device’s signals are encoded, and OFDMA is the method that ensures that, overall, one device’s signals don’t interfere with any others’.) The logistics of using distinct spectrum for each device could quickly spiral out of control for large IoT networks, but Release 15 established OFDMA for 5G-connected machines, largely because it’s what was used on 4G.

One promising alternative that Yuan’s group is considering, non-orthogonal multiple access (NOMA), could deliver the advantages of OFDM while also overlapping users on the same spectrum.

For now, Yuan believes OFDM and OFDMA will suit 5G’s early needs. He sees 5G first being used by smartphones, with applications like massive machine-type communications not arriving for at least another year or two, after the completion of Release 16, currently scheduled for December 2019.

But if network providers want to update their equipment to provide NOMA down the line, there could very well be a cost. “This would not come for free,” says Yuan. “Especially for the base station sites.” At the very least, base stations would need software updates to handle NOMA, but they might also require more advanced receivers, more processing power, or other hardware upgrades.

Kimery, for one, isn’t optimistic that the industry will adopt any non-OFDMA options. “It is possible there will be an alternative,” he says. “The probability isn’t great. Once something gets implemented, it’s hard to shift.”

Google’s Equiano Cable Will Extend to the Remote Island of Saint Helena, Flooding It With Data

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/tech-talk/telecom/internet/googles-equiano-cable-will-extend-to-the-remote-island-of-saint-helena-flooding-it-with-data

The tiny island will need to turn itself into a data hub to make use of the expected bandwidth

If you know anything about the South Atlantic island of Saint Helena, that’s probably because it was the island where the British government exiled Napoleon until he died in 1821. It was actually the second time Britain attempted to exile Napoleon, and the island was chosen for a very specific reason: It’s incredibly remote.

Napoleon is long gone, but the island’s remoteness continues to pose challenges for its 4,500-odd residents. They used to be able to reach St. Helena only by boat, once every three weeks, though it’s now possible to catch the occasional flight from Johannesburg, South Africa, to what’s been called “the world’s most useless airport.” Residents’ Internet prospects are even worse—the island’s entire population shares a single 50-megabit-per-second satellite link.

That’s about to change, however, as the St. Helena government has shared a letter of intent describing a plan to connect the island to Google’s recently announced Equiano cable. The cable will be capable of delivering orders of magnitude more data than anything the island has experienced. It will create so much capacity, in fact, that St. Helena could use the opportunity to transform itself from an almost unconnected island to a South Atlantic data hub.

Lunar Pioneers Will Use Lasers to Phone Home

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/telecom/wireless/lunar-pioneers-will-use-lasers-to-phone-home

NASA’s Orion and Gateway will try out optical communications gear for a high-speed connection to Earth


With NASA making serious moves toward a permanent return to the moon, it’s natural to wonder whether human settlers—accustomed to high-speed, ubiquitous Internet access—will have to deal with mind-numbingly slow connections once they arrive on the lunar surface. The vast majority of today’s satellites and spacecraft have data rates measured in kilobits per second. But long-term lunar residents might not be satisfied with the skinny bandwidth that, say, the Apollo astronauts contended with.

To meet the demands of high-definition video and data-intensive scientific research, NASA and other space agencies are pushing the radio bands traditionally allocated for space research to their limits. For example, the Orion spacecraft, which will carry astronauts around the moon during NASA’s Artemis 2 mission in 2022, will transmit mission-critical information to Earth via an S-band radio at 50 megabits per second. “It’s the most complex flight-management system ever flown on a spacecraft,” says Jim Schier, the chief architect for NASA’s Space Communications and Navigation program. Still, barely 1 Mb/s will be allocated for streaming video from the mission. That’s about one-fifth the speed needed to stream a high-definition movie from Netflix.

To boost data rates even higher means moving beyond radio and developing optical communications systems that use lasers to beam data across space. In addition to its S-band radio, Orion will carry a laser communications system for sending ultrahigh-definition 4K video back to Earth. And further out, NASA’s Gateway will create a long-term laser communications hub linking our planet and its satellite.

Laser communications are a tricky proposition. The slightest jolt to a spacecraft could send a laser beam wildly off course, while a passing cloud could interrupt it. But if they work, robust optical communications will allow future missions to receive software updates in minutes, not days. Astronauts will be sheltered from the loneliness of working in space. And the scientific community will have access to an unprecedented flow of data between Earth and the moon.

Today, space agencies prefer to use radios in the S band (2 to 4 gigahertz) and Ka band (26.5 to 40 GHz) for communications between spacecraft and mission control, with onboard radios transmitting course information, environmental conditions, and data from dozens of spaceflight systems back to Earth. The Ka band is particularly prized—Don Cornwell, who oversees radio and optical technology development at NASA, calls it “the Cadillac of radio frequencies”—because it can transmit up to gigabits per second and propagates well in space.

Any spacecraft’s ability to transmit data is constrained by some unavoidable physical truths of the electromagnetic spectrum. For one, radio spectrum is finite, and the prized bands for space communications are equally prized by commercial applications. Bluetooth and Wi-Fi use the S band, and 5G cellular networks use the Ka band.

The second big problem is that radio signals disperse in the vacuum of space. By the time a Ka-band signal from the moon reaches Earth, it will have spread out to cover an area about 2,000 kilometers in diameter—roughly the size of India. By then, the signal is a lot weaker, so you’ll need either a sensitive receiver on Earth or a powerful transmitter on the moon.

Laser communications systems also have dispersion issues, and beams that intersect can muddle up the data. But a laser beam sent from the moon would cover an area only 6 km across by the time it arrives on Earth. That means it’s much less likely for any two beams to intersect. Plus, they won’t have to contend with an already crowded chunk of spectrum. You can transmit a virtually limitless quantity of data using lasers, says Cornwell. “The spectrum for optical is unconstrained. Laser beams are so narrow, it’s almost impossible [for them] to interfere with one another.”
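Those spot sizes follow from diffraction: a beam’s divergence angle scales roughly as the wavelength divided by the transmitting aperture, and the spot on Earth is that angle times the Earth–moon distance. Here is a back-of-the-envelope check; the dish and telescope diameters below are plausible assumptions chosen for illustration, not figures from NASA.

```python
# Back-of-the-envelope diffraction-limited beam spread from the moon.
# Divergence angle ~ 1.22 * wavelength / aperture diameter (radians);
# spot diameter on Earth ~ divergence * Earth-moon distance.
# The aperture sizes below are assumptions chosen for illustration.

EARTH_MOON_M = 384_000e3  # average Earth-moon distance, meters

def spot_diameter_km(wavelength_m, aperture_m):
    divergence = 1.22 * wavelength_m / aperture_m
    return divergence * EARTH_MOON_M / 1e3

# Ka-band signal (~1 cm wavelength) from an assumed ~2-meter dish:
print(f"Ka band: ~{spot_diameter_km(1e-2, 2.0):,.0f} km across")      # ~2,300 km

# 1,550-nm laser from an assumed ~10-centimeter telescope:
print(f"Laser:   ~{spot_diameter_km(1550e-9, 0.10):,.0f} km across")  # ~7 km
```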

Higher frequencies also mean shorter wavelengths, which bring more benefits. Ka-band signals have wavelengths from 7.5 millimeters to 1 centimeter, but NASA plans to use lasers that have a 1,550-nanometer wavelength, the same wavelength used for terrestrial optical-fiber networks. Indeed, much of the development of laser communications for space builds on existing optical-fiber engineering. Shorter wavelengths (and higher frequencies) mean that more data can be packed into every second.

The advantages of laser communications have been known for many years, but it’s only recently that engineers have been able to build systems that outperform radio. In 2013, for example, NASA’s Lunar Laser Communications Demonstration proved that optical signals can reliably send information from lunar orbit back to Earth. The month-long experiment used a transmitter on the Lunar Atmosphere and Dust Environment Explorer to beam data back to Earth at speeds of 622 Mb/s, more than 10 times the data rate Orion’s S-band radio will provide.

“I was shocked to learn [Orion was] going back to the moon with an S-band radio,” says Bryan Robinson, an optical communications expert at MIT Lincoln Laboratory in Lexington, Mass. Lincoln Lab has played an important role in developing many of the laser communications systems on NASA missions, starting with the early optical demonstrations of the classified GeoLITE satellite in 2001. “Humans have gotten used to so much more, here on Earth and in low Earth orbit. I was glad they came around and put laser comm back on the mission.”

As a complement to its S-band radio, during the Artemis 2 mission Orion will carry a laser system called Optical to Orion, or O2O. NASA doesn’t plan to use O2O for any mission-critical communications. Its main task will be to stream 4K ultrahigh-definition video from the moon to a curious public back home. O2O will receive data at 80 Mb/s and transmit at 20 Mb/s while in lunar orbit. If you’re wondering why O2O will transmit at 20 Mb/s when a demonstration project six years ago was able to transmit at 622 Mb/s, it’s simply because the Orion developers “never asked us to do 622,” says Farzana Khatri, a senior staff member in Lincoln Lab’s optical communications group. Cornwell confirms that the link from Earth to O2O will deliver a minimum of 80 Mb/s, though the system is capable of higher data rates.

If successful, O2O will open the door for data-heavy communications on future crewed missions, allowing for video chats with family, private consultations with doctors, or even just watching a live sports event during downtime. The more time people spend on the moon, the more important all of these connections will be to their mental well-being. And eventually, video will become mission critical for crews on board deep-space missions.

Before O2O can even be tested in space, it first has to survive the journey. Laser systems mounted on spacecraft use telescopes to send and receive signals. Those telescopes rely on a fiddly arrangement of mirrors and other moving parts. O2O’s telescope will use an off-axis Cassegrain design, a type of telescope with two mirrors to focus the captured light, mounted on a rotating gimbal. Lincoln Lab researchers selected the design because it will allow them to separate the telescope from the optical transceiver, making the entire system more modular. The engineers must ensure that the Space Launch System rocket carrying Orion won’t shake the whole delicate arrangement apart. The researchers at Lincoln Lab have developed clasps and mounts that they hope will reduce vibrations and keep everything intact during the tumultuous launch.

Once O2O is in space, it will have to be precisely aimed. It’s hard to miss a receiver when your radio signal has a cross section the size of a large country. A 6-km-diameter signal, on the other hand, could miss Earth entirely with just a slight bump from the spacecraft. “If you [use] a laser pointer when you’re nervous and your hand is shaking, it’s going to go all over the place,” says Cornwell.

Orion’s onboard equipment will also generate constant minuscule vibrations, any one of which would be enough to throw off an optical signal. So engineers at NASA and Lincoln Lab will place the optical system on an antijitter platform. The platform measures the jitters from the spacecraft and produces an opposite pattern of vibrations to cancel them out—“like noise-canceling headphones,” Cornwell says.
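In principle, the cancellation works like the toy sketch below, where a measured disturbance is fed back with its sign flipped. The vibration data are synthetic, and the real platform runs a continuous control loop rather than a one-shot subtraction.

```python
import numpy as np

# Toy active-cancellation sketch: measure the disturbance, drive the platform
# with its inverse, and only the measurement error is left over. The vibration
# here is synthetic; the real platform runs a continuous feedback loop.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1000)
vibration = 0.5 * np.sin(2 * np.pi * 40 * t) + 0.2 * np.sin(2 * np.pi * 90 * t)

measured = vibration + 0.02 * rng.standard_normal(t.size)  # imperfect sensors
counter_motion = -measured                                  # the "opposite pattern"
residual = vibration + counter_motion                       # what the laser feels

print(f"RMS jitter before: {np.sqrt(np.mean(vibration**2)):.3f}")
print(f"RMS jitter after:  {np.sqrt(np.mean(residual**2)):.3f}")
```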

One final hurdle for O2O will be dealing with any cloud cover back on Earth. Infrared wavelengths, like O2O’s 1,550 nm, are easily absorbed by clouds. A laser beam might travel the nearly 400,000 km from the moon without incident, only to be blocked just above Earth’s surface. Today, the best defense against losing a signal to a passing stratocumulus is to beam transmissions to multiple receivers. O2O, for example, will use ground stations at Table Mountain, Calif., and White Sands, N.M.

The Gateway, scheduled to be built in the 2020s, will present a far bigger opportunity for high-speed laser communications in space. NASA, with help from its Canadian, European, Japanese, and Russian counterparts, will place this space station in orbit around the moon; the station will serve as a staging area and communications relay for lunar research.

NASA’s Schier suspects that research and technology demonstrations on the Gateway could generate 5 to 8 Gb/s of data that will need to be sent back to Earth. That data rate would dwarf the transmission speed of anything in space right now—the International Space Station (ISS) sends data to Earth at 25 Mb/s. “[Five to 8 Gb/s is] the kind of thing that if you turned everything on in the [ISS], you’d be able to run it for 2 seconds before you overran the buffers,” Schier says.

The Gateway offers an opportunity to build a permanent optical trunk line between Earth and the moon. One thing NASA would like to use the Gateway for is transmitting positioning, navigation, and timing information to vehicles on the lunar surface. “A cellphone in your pocket needs to see four GPS satellites,” says Schier. “We’re not going to have that around the moon.” Instead, a single beam from the Gateway could provide a lunar rover with accurate distance, azimuth, and timing to find its exact position on the surface.

What’s more, using optical communications could free up radio spectrum for scientific research. Robinson points out that the far side of the moon is an optimal spot to build a radio telescope, because it would be shielded from the chatter coming from Earth. (In fact, radio astronomers are already planning such an observatory: Our article “Rovers Will Unroll a Telescope on the Moon’s Far Side” explains their scheme.) If all the communication systems around the moon were optical, he says, there’d be nothing to corrupt the observations.

Beyond that, scientists and engineers still aren’t sure what else they’ll do with the Gateway’s potential data speeds. “A lot of this, we’re still studying,” says Cornwell.

In the coming years, other missions will test whether laser communications work well in deep space. NASA’s mission to the asteroid Psyche, for instance, will help determine how precisely an optical communications system can be pointed and how powerful the lasers can be before they start damaging the telescopes used to transmit the signals. But closer to home, the communications needed to work and live on the moon can be provided only by lasers. Fortunately, the future of those lasers looks bright.

This article appears in the July 2019 print issue as “Phoning Home, With Lasers.”

Ossia’s Wireless Charging Tech Could Be Available By Next Year

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/energywise/consumer-electronics/gadgets/ossias-wireless-charging-tech-may-be-available-by-2020

The Cota wireless power system delivers 1 watt up to 1 meter away

Wireless power company Ossia has received authorization from the U.S. Federal Communications Commission (FCC) for its Cota wireless power system. The FCC authorization is a crucial step toward Ossia’s goal of seeing devices that incorporate Cota on the market in 2020.

While supplying power wirelessly seems like a promising idea in theory, the technology has been explored for years and its reality has never quite lived up to the hype. In all likelihood, the most sophisticated example of wireless charging you’ve seen is something like the charging pads on coffee shop tables. Such pads require you to leave your device on them, making it difficult or impossible to use your gadget at the same time. That’s a far cry from being able to set down your device anywhere in the room and have it charge, or charge while you’re using it.

The physics of making such a thing possible are, it turns out, tricky. There’s been no shortage of ideas for wireless charging methods; startup uBeam, for example, has promised ultrasonic power transmitters but struggled to deliver for years. Energous, another startup, sells a charger that emits a pocket of 5.8 GHz radio waves to provide power to nearby devices.

Melting Arctic Ice Opens a New Fiber Optic Cable Route

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/tech-talk/telecom/internet/melting-sea-ice-opens-the-floodgate-for-a-new-fiber-optic-cable-route

Cinia and MegaFon’s proposed Arctic cable would bring lower latencies and geographical diversity

The most reliable way to reduce latency is to make sure your signal travels over the shortest physical distance possible. Nowhere is that more evident than the fiber optic cables that cross oceans to connect continents. There will always be latency between Europe and North America, for example, but where you lay the cable connecting the two continents affects that latency a great deal.
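The arithmetic behind that is simple: light in silica fiber travels at roughly c/1.47, so every 1,000 kilometers of cable adds about 5 milliseconds each way. Here is a quick sketch with illustrative route lengths, not Cinia’s actual figures.

```python
# Rough latency arithmetic for a fiber route. Light in silica fiber travels
# at roughly c / 1.47. Route lengths here are illustrative only.
C = 299_792.458          # speed of light in vacuum, km/s
FIBER_INDEX = 1.47       # typical refractive index of silica fiber

def one_way_latency_ms(route_km):
    return route_km / (C / FIBER_INDEX) * 1e3

existing_route_km = 14_000   # hypothetical Europe-to-Asia path today
arctic_route_km = 10_000     # hypothetical shorter Arctic path

for name, km in [("existing", existing_route_km), ("arctic", arctic_route_km)]:
    print(f"{name}: {one_way_latency_ms(km):.1f} ms one way")
# A roughly 30 percent shorter route cuts about 20 ms off one-way latency here.
```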

To that end, Helsinki-based Cinia, which owns and operates about 15,000 kilometers of fiber optic cable, and MegaFon, a Russian telecommunications operator, signed a memorandum of understanding to lay a fiber optic cable across the Arctic Ocean. The cable, if built, would not only reduce latency between users in Europe, Asia, and North America, but provide some much-needed geographical diversity to the world’s undersea cable infrastructure.

The vast majority of undersea cable encircles the world along a relatively similar path: From the east coast of the United States, cables stretch across the northern Atlantic to Europe, through the Mediterranean and Red Seas, across the Indian Ocean before swinging up through the South China Sea and finally spanning the Pacific to arrive at the west coast of the U.S. Other cable routes exist, but none have anywhere near the data throughput that this world-girding trunk line has.

Ari-Jussi Knaapila, the CEO of Cinia, estimates that the planned Arctic cable, which would stretch from London to Alaska, would shorten the physical cable distance between Europe and the western coast of North America by 20 to 30 percent. Additional cable will extend the route down to China and Japan, for a planned total of 10,000 kilometers of new cable.

Knaapila also says that the cable is an important effort to increase geographic diversity of the undersea infrastructure. Because many cables run along the same route, various events—earthquakes, tsunamis, seabed landslides, or even an emergency anchoring by a ship—can damage several cables at once. On December 19, 2008, 14 countries lost their connections to the Internet after ship anchors cut five major undersea cables in the Mediterranean Sea and Red Sea.

“Submarine cables are much more reliable than terrestrial cables with the same length,” Knaapila says. “But when a fault occurs in the submarine cable, the repair time is much longer.” The best way to avoid data crunches when a cable is damaged is to already have cables elsewhere that were unaffected by whatever event broke the original cable in the first place.

Stringing a cable across the Arctic Ocean is not a new idea, though other proposed projects, including the semi-built Arctic Fibre project, have never been completed. In the past, the navigational season in the Arctic was too short to easily build undersea cables. Now, melting sea ice due to climate change is expanding that window and making construction more feasible. (The shipping industry is coming to similar realizations as new routes open up.)

The first step for the firms under the memorandum of understanding is to establish a company by the end of the year tasked with the development of the cable. Then come route surveys of the seabed, construction permits (for both on-shore and off-shore components), and finally, the laying of the cable. According to Knaapila, 700 million euros have been budgeted for investment in the route.

The technical specifics of the cable itself have yet to be determined. Most fiber optic cables use wavelengths of light around 850, 1300, or 1550 nanometers, but for now, the goal is to remain as flexible as possible. For the same reason, the data throughput of the proposed cable remains undecided.

Of course, not all cable projects succeed. The South Atlantic Express (SAex) would be one of the first direct links between Africa and South America, connecting remote islands like St. Helena along the way. But SAex has struggled with funding and currently sits in limbo. Cinia and MegaFon hope to avoid a similar fate.

Loon’s Balloons Deliver Emergency Internet Service to Peru Following 8.0 Earthquake

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/tech-talk/telecom/wireless/loons-balloons-deliver-emergency-service-to-peru-following-80-earthquake

The company was able to respond quickly because it had already begun tests in the country

When a magnitude 8.0 earthquake struck Peru on Sunday, it wreaked havoc on the country’s communications infrastructure. Within 48 hours, though, people in affected regions could use their mobile phones again. Loon, the Alphabet company, was there delivering wireless service by balloon.

Such a rapid response was possible because Loon happened to be in the country, testing its equipment while working out a deal with provider Telefonica. Both terrestrial infrastructure and the balloons themselves were already in place, and Loon simply had to reorganize its balloons to deliver service on short notice.

Creating a Better Online Experience for Billions Is Just a Fix Away

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/at-work/innovation/creating-a-better-online-experience-for-billions-is-just-a-fix-away

Code that makes network applications compliant with universal-acceptance standards is an easy way to build a more inclusive Internet

Because much of the earliest work in computing and networking occurred in the United States and Europe, the Latin alphabet and its conventions—such as a left-to-right ordering of characters—got baked into software and hardware. But after spending years as the general manager of a domain registry for the Asia Pacific region, Don Hollander became convinced that Internet applications should support as many languages and writing systems as possible.

Which is why Hollander is now the secretary general of the Universal Acceptance Steering Group (UASG), a group that champions the idea that all valid top-level domains (TLDs), such as .com, .tech, or .信息, should function in any Web or email application. In the process, not only would the Web become more globally accessible, but companies would also be able to make sales or capture customer information that they currently lose, with the UASG estimating that the economic benefits could be US $9.8 billion per annum.

“The domain name space has changed a lot in the last few years,” Hollander says. Originally, TLDs were either three letters long, such as .edu, or two letters long, for country codes like .de. But around 2010, things started changing. People were clamoring for more diversity in what could be used for a TLD.

That led to two big changes. First was the creation of extended gTLDs—generic TLDs that can be three letters or longer—which is why .law and .info are now valid options (the UASG website itself uses .tech). Second, TLDs could be set up in languages that don’t use the Latin alphabet, allowing general Unicode characters in email addresses and TLDs. By 2013, over 2,000 new TLDs had been established.

By 2015, Hollander says, the ability to handle these new and various TLDs had been largely sorted out at the Domain Name System (DNS) level—that is, at the level of the directories that manage TLDs and associate them with specific numeric Internet addresses. (There are still some problems, however. Emojis are fickle because, from a code perspective, the same emoji can be composed in multiple ways. That’s why emoji-based URLs, while they do exist, are difficult to work with.)
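A tiny Python example, using the standard unicodedata module, shows why multiple compositions of the “same” string trip up naive code; the domain name here is made up. Emoji are hit even harder, because many are sequences of several code points (base character, variation selectors, zero-width joiners) that may not compare equal even after normalization.

```python
import unicodedata

# The same visible string can be built from different code-point sequences.
# "café.example" with a precomposed é versus an e plus a combining accent:
precomposed = "caf\u00e9.example"   # é as a single code point
decomposed = "cafe\u0301.example"   # e followed by COMBINING ACUTE ACCENT

print(precomposed == decomposed)    # False -- naive comparison fails
print(unicodedata.normalize("NFC", precomposed) ==
      unicodedata.normalize("NFC", decomposed))   # True after normalization
```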

The remaining challenge, according to Hollander, is spreading the word, because it doesn’t matter if everything works at the network level if the code driving specific applications still supports only two- or three-letter TLDs and Latin-character email addresses. And unfortunately, many application developers have not kept up with the times.

Creating a software routine to check if an email address or TLD was valid used to be pretty straightforward. Ten years ago, if an application asked a user for an email address, for example, the developer could check if the response was valid by testing it in a few simple ways: It should have the symbol @, and it should end in a period followed by two or three letters. If it didn’t pass those tests, it was garbage.

When longer domain names and Unicode came along, those developers’ tests got more convoluted. “Now, I need to look for 2, 3, 4, 6, or 7 characters,” Hollander says. Nevertheless, it’s a largely solved problem: “It’s not a hard fix,” he says, adding that there is plenty of code available on GitHub and Stack Overflow for developers looking to make sure their applications are universal-acceptance compliant. For those looking to dig deeper into the issue, the UASG’s website offers documentation and links to relevant standards. UASG also has information about various languages and code libraries and which ones are up to date. (Hollander says, for example, that Python is currently not up to date.)
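As a concrete before-and-after, here is a minimal Python sketch (not code from UASG or any particular repository, just an illustration): the old-style check caps TLDs at three Latin letters, while a more universal-acceptance-friendly version accepts any domain whose labels can be converted to their ASCII form. Note that Python’s built-in “idna” codec implements the older IDNA 2003 rules; a production system would want a library implementing IDNA 2008/UTS #46.

```python
import re

def old_style_is_valid(address):
    """The kind of check many apps still ship: ASCII only, 2-3 letter TLD."""
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[A-Za-z]{2,3}", address) is not None

def ua_friendly_is_valid(address):
    """A more universal-acceptance-friendly sketch: split on the last '@',
    require a dot in the domain, and accept any domain whose labels can be
    converted to their ASCII (Punycode) form."""
    local, sep, domain = address.rpartition("@")
    if not sep or not local or "." not in domain:
        return False
    try:
        domain.encode("idna")   # IDNA 2003 rules; use an IDNA 2008 library in production
    except UnicodeError:
        return False
    return True

tests = ["user@example.com", "user@example.technology", "user@example.信息"]
for addr in tests:
    print(addr, old_style_is_valid(addr), ua_friendly_is_valid(addr))
# The old check rejects the longer and internationalized TLDs that are now valid.
```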

Ultimately, universal acceptance is an easy way to make the Internet more accessible for the billions of people whose first language is not written in Latin characters. Hollander wants developers to be mindful of that. “The world changed, and they should bring their systems up to date,” he says.

This article appears in the June 2019 print issue as “The Universal Internet.”

For Specialized Optimizing Machines, It’s All in the Connections

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/tech-talk/computing/hardware/for-specialized-optimizing-machines-its-all-in-the-connections

Whether it’s an Ising machine or a quantum annealer, the machine seems to matter less than how the parts connect

Suppose you want to build a special machine, one that can plot a traveling salesperson’s route or schedule all the flights at an international airport. These are the sorts of problems that are incredibly complex, with a staggering number of variables. For now, the best way to crunch the numbers for these optimization problems remains a powerful computer.

But research into developing analog optimizers—machines that manipulate physical components to determine the optimized solution—is providing insight into what is required to make them competitive with traditional computers.

To that end, a paper published today in Science Advances provides the first experimental evidence that high connectivity, or the ability for each physical component to directly interact with the others, is vital for these novel optimization machines. “Connectivity is very, very important, it’s not something one should ignore,” says Peter McMahon, a postdoctoral researcher at Stanford, who participated in the research.

Terahertz Waves Could Push 5G to 6G

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/tech-talk/telecom/wireless/at-the-6th-annual-brooklyn-5g-summit-some-eyes-are-on-6g

At the Brooklyn 5G summit, experts said terahertz waves could fix some of the problems that may arise with millimeter-wave networks


It may be the sixth year for the Brooklyn 5G Summit, but in the minds of several speakers, 2019 is also Year Zero for 6G. The annual summit, hosted by Nokia and NYU Wireless, is a four-day event that covers all things 5G, including deployments, lessons learned, and what comes next.

This year, that meant preliminary research into terahertz waves, the frequencies that some researchers believe will make up a key component of the next next generation of wireless. In back-to-back talks, Gerhard Fettweis, a professor at TU Dresden, and Ted Rappaport, the founder and director of NYU Wireless, talked up the potential of terahertz waves.

Australia’s Troubled National Broadband Network Delivers a Fraction of What Was Promised

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/telecom/internet/australias-troubled-national-broadband-network-delivers-a-fraction-of-what-was-promised

The newly elected government will inherit a floundering AUD $51 billion broadband network that’s providing slower service to fewer properties than planned

On 18 May, voters in Australia’s federal election will determine whether the Liberal-National Coalition will remain in control or the Australian Labor Party will form the government. Either way, the new leaders will have to contend with the National Broadband Network (NBN), a lumbering disaster that began as an ambitious effort by the Australian government to construct a countrywide broadband network.

When the NBN was first proposed in April 2009, the government aimed to build a fiber-optic network that would deliver connections of up to 100 megabits per second to 90 percent of Australian homes, schools, and workplaces within eight years. A decade later, however, the NBN has failed to deliver on that promise. NBN Co., the government-owned company created to construct and manage the network, now expects to deliver 50 Mb/s connections to 90 percent of the country by the end of 2020.

“None of the promises have ever been met,” says Rod Tucker, a telecommunications engineer and professor emeritus at the University of Melbourne. The watershed moment for the network was the 2013 federal election. That year, the Labor government that had championed the network was replaced by a conservative coalition government. The new government promptly reevaluated the NBN plan, taking issue with its projected cost.

The original plan, estimated to cost AU $43 billion, was a fiber-to-the-premises (or FTTP) plan. FTTP, as the name implies, requires threading fiber-optic cable to each and every building. The coalition government balked at the price tag, fired NBN Co.’s CEO, restructured the company, and proposed an alternative fiber-to-the-node (or FTTN) strategy, which was estimated to cost no more than AU $29.5 billion. The cost has now ballooned to more than AU $51 billion.

The reason an FTTN network theoretically costs less is that there’s less fiber to install. Rather than run fiber to every building, FTTN runs fiber to centralized “nodes,” from which any number of technologies can then connect to individual premises. Fiber could be used, but more typically these last-mile connections are made with copper cabling.

The rationale behind the FTTN pivot was that there’s no need to lay fiber to homes and offices because copper landlines already connect those buildings. Unfortunately, copper doesn’t last forever. “A lot of the copper is very old,” says Mark Gregory, an associate professor of engineering at RMIT University, in Melbourne. “Some of it is just not suitable for fiber to the node.” Gregory says that NBN Co. has run into delays more than once as it encounters copper cabling near the end of its usable life and must replace it.

NBN Co. has purchased roughly 26,000 kilometers of copper so far to construct the network, according to Gregory—enough to wrap more than halfway around the Earth. Buying that copper added roughly AU $10 billion to the FTTN price tag, says Tucker. On top of that, NBN Co. is paying AU $1 billion a year to Telstra, the Australian communications company, for the right to use the last-mile copper that Telstra owns.

But perhaps the worst part is that even after the cost blowouts, the lackluster connections, and the outdated copper technology, there doesn’t seem to be a good path forward for the network, which will be obsolete upon arrival. “In terms of infrastructure,” says Gregory, “it’s pretty well the only place I know of that’s spent [AU] $50 billion and built an obsolete network.”

Upgrading the network to deliver the original connection speeds will require yet another huge investment. That’s because copper cables can’t compete with fiber-optic cables. To realize 100 Mb/s, NBN Co. will eventually have to lay fiber from all the nodes to every building anyway. Gregory, for one, estimates that could cost NBN Co. an additional AU $16 billion, a hard number to swallow for a project that’s already massively over budget. “Fiber to the node is a dead-end technology in that it’s expensive to upgrade,” says Tucker, who also wrote about the challenges the NBN would face in the December 2013 issue of IEEE Spectrum.

After the federal election, the incoming government will have to figure out what to do with this bloated project. NBN Co. is far from profitable, and even if it were, it still owes billions of dollars to the Australian government. If the government does decide to bite the bullet and upgrade to FTTP, it will have to contend with other commercial networks now delivering equivalent speeds.

Had Australia delivered the NBN as originally promised, it would have been one of the fastest, most robust national networks in the world at the time. Instead, the country has watched its place in rankings of broadband speeds around the world continue to drop, says Tucker, while places like New Zealand, Australia’s neighbor “across the ditch,” have invested in and largely completed robust FTTP networks.

“It was an opportunity lost,” says Gregory.

This article appears in the May 2019 print issue as “Australia’s Fiber-Optic Mess.”