Tag Archives: computing

5G Terms and Acronyms Defined

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/5g-terms-and-acronyms-defined

Want to sound like a 5G expert?

With so many 5G terms talked about these days, it’s easy to get confused. Even more terms and acronyms are on the way. We’ve got you covered with our up-to-date, publication-quality list of over 90 terms, ranging from AM Distortion to Xn Interface. This handy resource is sure to make you sound like a 5G expert and is perfect for sharing with your colleagues or students.


University of Southampton Uses the USRP and LabVIEW to Change the Way It Teaches Wireless Communications

Post Syndicated from National Instruments original https://spectrum.ieee.org/computing/software/university-of-southampton-uses-the-usrp-and-labview-to-change-the-way-it-teaches-wireless-communications

The University of Southampton has been looking at new and innovative ways to teach the principles of wireless communication at a time when there is significant interest in wireless technologies

Demonstrating the Practical Challenges of Wireless Communications

Most electronics education worldwide teaches wireless communications with a typical focus on communications theory. At the University of Southampton, educators have taken a different approach, teaching students the practical aspects of communication technology to better prepare them for their careers in industry. Students focus on the rapid prototyping of a wireless communications system with live radio frequency (RF) signal streaming for a practical approach to communications education. With this approach, students gain valuable experience in manipulating live signals for a greater understanding of wireless communication and the associated practical challenges.

A Real Communications System to Demonstrate Practical Concepts

The University of Southampton has accomplished this demonstration of the practical concepts of wireless communication as part of its master’s course in wireless communications. The focus was on creating a wireless communications system to demonstrate the concept of differential quadrature phase-shift keying (DQPSK) and how it is used within wireless communications. The students were given a USRP™ (Universal Software Radio Peripheral) and tasked with building a DPSK transceiver in a practical session. Before this, they attended a one-hour lecture on the USRP and how to use it to achieve their learning outcomes. Additionally, they were given a pre-session assignment that familiarised them with LabVIEW and its environment.

Practical Challenges of Wireless Communication

Southampton students were tasked with building one half of a wireless communications system. The setup consisted of an incomplete DQPSK demodulator, which needed to be completed so that a modulated signal sent by a separate USRP device could be decoded. Completing this task required a number of steps covering different concepts, with the end result being a fully working communications system.

The students first applied a filter to the received and down-converted signal and compared this to the input of the filter in the transmitter of the system. They then downsampled the data to detect, synchronize, and extract the DPSK symbols from the waveform and compared them to those in the transmitter. Finally, students demodulated and decoded these DPSK symbols to recover the message bits, which were again compared with those in the transmitter.
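For readers who want to see roughly what these receive-side steps look like in code, below is a minimal NumPy sketch of the same chain: filter, downsample to the symbol rate, and differentially demodulate. It is an illustrative approximation written for this article, not the students’ LabVIEW implementation; the filter, the samples-per-symbol value, and the bit mapping are all assumptions.

import numpy as np

def dqpsk_demodulate(rx, samples_per_symbol=8):
    # 1. Receive filter (a simple moving average stands in for the
    #    root-raised-cosine matched filter a real receiver would use).
    taps = np.ones(samples_per_symbol) / samples_per_symbol
    filtered = np.convolve(rx, taps, mode="same")

    # 2. Downsample to one complex sample per symbol (ideal timing is
    #    assumed; a real receiver would first run symbol synchronization).
    symbols = filtered[samples_per_symbol // 2::samples_per_symbol]

    # 3. Differential demodulation: the information is carried in the phase
    #    change between consecutive symbols, so no absolute carrier phase
    #    reference is needed.
    dphi = np.angle(symbols[1:] * np.conj(symbols[:-1]))

    # Map each phase change to one of four values and then to two bits
    # (natural binary mapping, chosen here for simplicity).
    decisions = np.round(dphi / (np.pi / 2)).astype(int) % 4
    return np.column_stack(((decisions >> 1) & 1, decisions & 1)).ravel()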

After these three features were implemented in the demodulator, students rigorously tested their system by comparing their constellation graph and signal eye diagram to those of the transmitter, as shown below.

The constellation diagram gives a visual overview of how the different phases in the phase-shift keying modulation scheme map to symbols and how those symbols are represented within the signal envelope. It is important because it gives a visual indication of how much interference or distortion is in a signal or channel and offers a quick way of seeing whether everything is functioning normally. The eye diagram gives a similar visual reference: it superimposes all of the symbols within a channel over each other so that the characteristics of the system can be seen. From this, students could infer characteristics such as whether the symbols were too long, too short, noisy, or poorly synchronized. If the eye is “open”, as it is in the diagram above, this indicates minimal distortion in the signal. If the signal were distorted, the eye pattern would begin to close, shrinking the open spaces in the pattern.
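As a companion to the sketch above, the following Matplotlib snippet shows one way such a constellation graph and eye diagram can be drawn from a filtered receive waveform. The two-symbol eye span and other plotting parameters are illustrative choices, not the LabVIEW front panel used in the course.

import numpy as np
import matplotlib.pyplot as plt

def plot_diagnostics(filtered, samples_per_symbol=8):
    # Constellation: one point per symbol-rate sample.
    symbols = filtered[samples_per_symbol // 2::samples_per_symbol]

    fig, (ax_const, ax_eye) = plt.subplots(1, 2, figsize=(10, 4))
    ax_const.scatter(symbols.real, symbols.imag, s=8)
    ax_const.set_title("Constellation")
    ax_const.set_xlabel("In-phase")
    ax_const.set_ylabel("Quadrature")

    # Eye diagram: overlay two-symbol-long slices of the waveform so that
    # timing error, noise and distortion show up as a "closing" eye.
    span = 2 * samples_per_symbol
    for start in range(0, len(filtered) - span, samples_per_symbol):
        ax_eye.plot(filtered[start:start + span].real, color="C0", alpha=0.1)
    ax_eye.set_title("Eye diagram")
    ax_eye.set_xlabel("Sample within two symbol periods")

    plt.tight_layout()
    plt.show()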

Four Out of Five Students Would Like to Make More Use of USRPs

After the conclusion of the module on communications systems, students completed questionnaires about their satisfaction and provided feedback on the practical session.

More than four out of five students, 82 percent, said that in the future they would like to make use of the USRP in the taught aspects of their course. In addition, 75 percent of students said that they would like to make use of the USRP in their MSc research projects—showing its great potential in all aspects of wireless communications education and research.

One student said that “The USRP gives an avenue for exploration. It is a good tool to bridge the gap between practical and theory.” Another said that “The USRP vividly helps me understand the theory that I learned in class.” This shows that Southampton has created a strong benchmark in practical communications education.

Next Steps

See Other Academic Applications

Learn More About NI USRP

How AI is Starting to Influence Wireless Communications

Post Syndicated from National Instruments original https://spectrum.ieee.org/computing/software/how-ai-is-starting-to-influence-wireless-communications

Machine learning and deep learning technologies are promising an end-to-end optimization of wireless networks while they commoditize PHY and signal-processing designs and help overcome RF complexities

What happens when artificial intelligence (AI) technology arrives on wireless channels? For a start, AI promises to address the design complexity of radio frequency (RF) systems by employing powerful machine learning algorithms and significantly improving RF parameters such as channel bandwidth, antenna sensitivity and spectrum monitoring.

So far, engineering efforts have gone into making individual components in wireless networks smarter via technologies like cognitive radio. However, these piecemeal optimizations, targeted at applications such as spectrum monitoring, have been labor intensive, and they entail hand-engineering of feature extraction and selection that often takes months to design and deploy.

On the other hand, AI approaches like machine learning and deep learning can use data-driven training to learn radio signal types in a few hours. For instance, a trained deep neural network takes a few milliseconds to perform signal detection and classification, compared with traditional methodologies based on iterative, algorithmic signal search, detection, and classification.

It is important to note that such gains also significantly reduce power consumption and computational requirements. Moreover, a learned communication system allows wireless designers to prioritize key design parameters such as throughput, latency, range and power consumption.

More importantly, deep learning-based training models facilitate a better awareness of the operational environment and promise to offer end-to-end learning for creating an optimal radio system. Case in point: a training model that can jointly learn an encoder and decoder for a radio transmitter and receiver while encompassing RF components, antennas and data converters.

Additionally, what technologies like deep learning promise in the wireless realm is the commoditization of the physical layer (PHY) and signal processing design. Combining deep learning-based sensing with active radio waveforms creates a new class of use cases that can intelligently operate in a variety of radio environments.
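To make the idea concrete before the case studies, here is a small Keras sketch of the kind of deep classifier being described: a 1-D convolutional network that maps a window of raw IQ samples to a signal class. The architecture, window length, and class list are illustrative assumptions made for this article, not DeepSig’s or any vendor’s actual model.

import numpy as np
import tensorflow as tf

N_SAMPLES = 1024                                       # IQ samples per window
CLASSES = ["wifi", "bluetooth", "cellular", "noise"]   # hypothetical labels

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_SAMPLES, 2)),       # I and Q as two channels
    tf.keras.layers.Conv1D(32, 8, activation="relu"),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(64, 8, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# After training on labeled captures, a single forward pass over one window
# takes milliseconds, which is the latency advantage described above.
window = np.random.randn(1, N_SAMPLES, 2).astype("float32")   # stand-in IQ data
probabilities = model(window).numpy()[0]
print(CLASSES[int(np.argmax(probabilities))])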

The following section will present a couple of design case studies that demonstrate the potential of AI technologies in wireless communications.

Two design case studies

First, the OmniSIG software development kit (SDK) from DeepSig Inc. is based on deep learning technology and employs real-time signal processing to allow users to train signal detection and classification sensors.

DeepSig claims that its OmniSIG sensor can detect Wi-Fi, Bluetooth, cellular and other radio signals up to 1,000 times faster than existing wireless technologies. Furthermore, it enables users to understand the spectrum environment and thus facilitate contextual analysis and decision making.

ENSCO, a U.S. government and defense supplier, is training the OmniSIG sensor to detect and classify wireless and radar signals. Here, ENSCO is aiming to deploy AI-based capabilities to overcome the performance limitations of conventionally designed RF systems for signal intelligence.

DeepSig’s OmniPHY software allows users to learn the communication system and subsequently optimize for channel conditions, hostile spectrum environments and hardware performance limitations. The applications include anti-jam capabilities, non-line-of-sight communications, multi-user systems in contested spectrums and mitigation of the effects of hardware distortion.

Another design case study showing how AI technologies like deep learning can impact future hardware architectures and designs is the passive Wi-Fi sensing system for monitoring health, activity and well-being in nursing homes (Figure 2). The continuous surveillance system developed at Coventry University employs gesture recognition libraries and machine learning systems for signal classification and creates a detailed analysis of the Wi-Fi signals that reflect off a patient, revealing patterns of body movements and vital signs.

Residential healthcare systems usually employ wearable devices, camera-based vision systems and ambient sensors, but they entail drawbacks such as physical discomfort, privacy concerns and limited detection accuracy. On the other hand, a passive Wi-Fi sensing system, based on activity recognition and through-wall respiration sensing, is contactless, accurate and minimally invasive.

The passive Wi-Fi sensing for nursing homes has its roots in a research project on passive Wi-Fi radar carried out at University College London. The passive Wi-Fi radar prototype —based on software-defined radio (SDR) solutions from National Instruments (NI) — is completely undetectable and can be used in military and counterterrorism applications.

USRP transceiver plus LabVIEW

A passive Wi-Fi sensing system is a receive-only system that measures the dynamic Wi-Fi signal changes caused by indoor objects moving across multipath propagation. Here, AI technologies like machine learning allow engineers to use the rate of phase change over the measurement duration, as well as the Doppler shift, to identify movements.

Machine learning algorithms can establish the link between physical activities and the Doppler-time spectral map associated with gestures such as picking things up or sitting down. The phase of the data batches is accurate enough to discern the small body movements caused by respiration.
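The following SciPy sketch shows, in simplified form, how such a Doppler-time map and batch phase can be computed from a reference channel (the direct Wi-Fi signal) and a surveillance channel (the reflections off the person). The batch length and the simple conjugate-mixing step are assumptions made for illustration, not Coventry University’s actual processing chain.

import numpy as np
from scipy.signal import stft

def doppler_time_map(reference, surveillance, sample_rate, batch_len=4096):
    # Mixing the surveillance channel against the conjugate of the reference
    # removes the transmitted Wi-Fi waveform: static reflectors land near
    # 0 Hz, while moving body parts appear as small Doppler offsets.
    mixed = surveillance * np.conj(reference)

    # A short-time Fourier transform over successive batches produces the
    # Doppler-time spectral map that gesture classifiers are trained on.
    freqs, times, spec = stft(mixed, fs=sample_rate,
                              nperseg=batch_len, noverlap=batch_len // 2)
    return freqs, times, np.abs(spec)

def batch_phase(reference, surveillance, batch_len=4096):
    # The per-batch phase of the mixed signal is the quantity examined for
    # the slow, periodic variation caused by respiration.
    mixed = surveillance * np.conj(reference)
    n_batches = len(mixed) // batch_len
    batches = mixed[:n_batches * batch_len].reshape(n_batches, batch_len)
    return np.angle(batches.sum(axis=1))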

Coventry University built a prototype of a passive Wi-Fi sensing system using Universal Software Radio Peripheral (USRP) and LabVIEW software to capture, process and interpret the raw RF signal samples. LabVIEW, an intuitive graphical programming tool for both processors and FPGAs, enables engineers to manage complex system configurations and adjust signal processing parameters to meet the exact requirements.

On the other hand, USRP is an SDR-based tunable transceiver that works in tandem with LabVIEW for prototyping wireless communication systems. It has already been used in prototyping wireless applications such as FM radio, direction finding, RF record and playback, passive radar and GPS simulation.

Engineers at Coventry University have used USRP to capture the raw RF samples and deliver them to the LabVIEW application for speedy signal processing. They have also dynamically changed the data arrays and batch size of analysis routines to adapt the system to slow and fast movements.

Engineers were able to interpret some captured signals and directly link the periodic change of batch phase with gestures and respiration rate. Next, they examined whether the phase of the data batches was accurate enough to discern the small body movements caused by respiration.

AI: The next wireless frontier

The above design examples show the potential of AI technologies like machine learning and deep learning to revolutionize RF design, addressing a broad array of RF design areas and creating new wireless use cases.

These are still the early days of implementing AI in wireless networks. But the availability of commercial products such as USRP suggests that the AI revolution has reached the wireless doorstep.

For more information on the role of AI technologies in wireless communications, go to Ettus Research, which provides SDR platforms like the USRP and has been a National Instruments brand since 2010.

Free Download: EMI step-by-step guide from Rohde & Schwarz

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/free-download-emi-stepbystep-guide-from-rohde-schwarz

Solve your EMI problems more efficiently with solutions from Rohde & Schwarz.

With this guide, you will be able to discover and analyze EMI in a more systematic and methodical way to solve your problems.


New Zealand Startup Seeks to Automate (Most) Code Review

Post Syndicated from Rina Diane Caballar original https://spectrum.ieee.org/tech-talk/computing/software/how-codelingo-helps-teams-develop-better-software

CodeLingo has developed a tool that it says can help developers with code reviews, refactoring, and documentation

Software developers are force multipliers. Yet instead of spending their time building new products or services, they are wasting too much of it on maintaining existing code.

CodeLingo, a New Zealand-based startup founded in 2016, aims to change that. The company has developed an automated code review tool that catches new and existing issues in code. CodeLingo’s search engine and query language finds patterns across the code base and uses those patterns to automate code reviews, code refactoring (restructuring existing code to optimize it), and contributor documentation.

According to a 2018 study by Stripe, developers could increase the global GDP by US $3 trillion over the next 10 years. But developers spend almost half of their working hours—that’s 17 hours in an average 41-hour work week—on code maintenance. This includes finding and repairing bugs, fixing bad code, and refactoring. This equates to an estimated $300 billion loss in productivity each year.

CodeLingo hopes to recapture some of that loss so it could be spent on what matters. “CodeLingo is, in essence, an analysis platform,” says founder Jesse Meek. “It treats the whole software stack as data, then looks for patterns in that data and ways to automate common development workflows, such as finding and fixing bugs, automatically refactoring the code base, automating reviews of pull requests as they come into a repository, and automating the generation of contributor documentation.”

For Specialized Optimizing Machines, It’s All in the Connections

Post Syndicated from Michael Koziol original https://spectrum.ieee.org/tech-talk/computing/hardware/for-specialized-optimizing-machines-its-all-in-the-connections

Whether it’s an Ising machine or a quantum annealer, the machine seems to matter less than how the parts connect

Suppose you want to build a special machine, one that can plot a traveling salesperson’s route or schedule all the flights at an international airport. That is, the sorts of problems that are incredibly complex, with a staggering number of variables. For now, the best way to crunch the numbers for these optimization problems remains a powerful computer.

But research into developing analog optimizers—machines that manipulate physical components to determine the optimized solution—is providing insight into what is required to make them competitive with traditional computers.

To that end, a paper published today in Science Advances provides the first experimental evidence that high connectivity, or the ability for each physical component to directly interact with the others, is a vital component for these novel optimization machines. “Connectivity is very, very important, it’s not something one should ignore,” says Peter McMahon, a postdoctoral researcher at Stanford, who participated in the research.

Get tips to develop your DAQ test systems

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/get-tips-to-develop-your-daq-test-systems

Reduce your test development time, increase throughput and improve the accuracy of your test systems

There is a growing trend across all industries to design feature-rich products. You need to thoroughly test your product while meeting market windows and project deadlines. Learn how a data acquisition system could help you achieve all of these goals in this e-book, Four Things to Consider When Using a DAQ as a Data Logger.


Rohde & Schwarz Presents: Smart Jammer / DRFM Testing – Test and Measurement Solutions for the Next Level

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/rohde-schwarz-presents-smart-jammer-drfm-testing-test-and-measurement-solutions-for-the-next-level

The webinar introduces the concept of Digital RF Memory Jammers, describes their technology and the respective test and measurement challenges and solutions from Rohde & Schwarz.

The DRFM jammer has become a highly complex key element of the EA suite. It has evolved from a simple repeater with some fading capabilities to a complex electronic attack asset. Some of the more critical tests are verifying proper operation and timing of the deception techniques at the system level, qualifying the individual components, submodules and modules at the RF/IF level, and, last but not least, making sure that clock jitter and power integrity are addressed early at the design stage. For all of these requirements, Rohde & Schwarz offers cutting-edge test and measurement solutions.

Please note: By downloading a webinar, your contact information will be shared with the sponsoring company, Rohde & Schwarz GmbH & Co. KG and the Rohde & Schwarz entity or subsidiary company mentioned in the imprint of www.rohde-schwarz.com, and you may be contacted by them directly via email or phone for marketing or advertising purposes.

Using Ethernet-Based Synchronization on the USRP™ N3xx Devices

Post Syndicated from National Instruments original https://spectrum.ieee.org/computing/it/using-ethernetbased-synchronization-on-the-usrp-n3xx-devices

The USRP N3xx product family supports three different methods of baseband synchronization: external clock and time reference, GPSDO module, and Ethernet-based timing protocol

USRP N3xx Synchronization Options

The USRP N3xx product family supports three different methods of baseband synchronization: external clock and time reference, GPSDO module, and Ethernet-based timing protocol. Using an external clock and time reference source, such as the CDA-2990 accessory, offers a precise and convenient method of baseband synchronization for high channel count systems where devices are located near each other, such as in a rackmount configuration. Using the GPSDO module enables synchronization when the devices are physically separated by large distances such as in small cell, RF sensor, TDOA, and distributed testbed applications. However, the GPSDO method typically has more skew than the other two methods and requires line of sight to satellites. Therefore, indoor, urban, or hostile environments restrict the use of GPSDO. Ethernet-based synchronization enables precise baseband synchronization over large distances in GPS-denied environments. However, this method consumes one of the SFP+ ports of the USRP N3xx devices and therefore reduces the number of connectors available for IQ streaming. This application note provides instructions for synchronizing multiple USRP N3xx devices using the Ethernet-based method.

Ethernet-Based Synchronization Overview

The USRP N3xx product family supports Ethernet-based synchronization using an open source protocol known as White Rabbit. White Rabbit is a fully deterministic Ethernet-based network protocol for general purpose data transfer and synchronization[1]. This project is supported by a collaboration of academic and industry experts such as CERN and GSI Helmholtz Centre for Heavy Ion Research.

White Rabbit is an extension of the IEEE 1588 Precision Time Protocol (PTP) standard, which distributes time references over Ethernet networks. In addition, White Rabbit uses Synchronous Ethernet (SyncE) to distribute a common clock reference over the network across the Ethernet physical layer to ensure frequency syntonization between all nodes. This combination of SyncE and PTP, in addition to further measurements, provides sub-nanosecond synchronization over distances of up to 10 km. The White Rabbit extension of the IEEE 1588-2008 standard is in the final stages of becoming generalized as the IEEE 1588 High Accuracy profile[2].

The USRP N3xx product family implements the White Rabbit protocol using a combination of the FPGA and dedicated clocking resources. The USRP N3xx operates as a slave node, so a White Rabbit master node is required in the network. Seven Solutions provides White Rabbit hardware that works with the USRP N3xx devices to create synchronous clock and time references that are precisely aligned across all devices in the network. See the “Required Accessories” section for details on the required external hardware. The USRP N3xx devices do not support IQ sample streaming over this protocol. Therefore, only one of the SFP+ ports is available for streaming when using White Rabbit synchronization.

For more information on the White Rabbit project, visit the links below:

White Rabbit documentation:

Standardization as IEEE1588 High Accuracy:

Required Accessories

White Rabbit synchronization utilizes specific optical SFP transceivers and single mode fiber optic cables to achieve precise time alignment, as documented on the project website. The USRP N3xx was tested to work as a White Rabbit slave using the AXGE-1254-0531 SFP transceiver marked in blue, the AXGE-3454-0531 SFP transceiver marked in purple, and a G652 type single mode fiber optic cable.

Seven Solutions is a provider of White Rabbit equipment, including the WR-LEN and the White Rabbit Switch (WRS). The USRP N3xx was tested to work with both the WR-LEN and the WRS products. All accessories required for White Rabbit operation can be purchased directly from the Seven Solutions website. The AXGE SFP transceivers and fiber optic cables are only listed on the website as part of the “KIT WR-LEN” product, but they can also be purchased individually by contacting Seven Solutions.

For more information on White Rabbit accessories, visit the links below:

White Rabbit SFP wiki:

Seven Solutions WR-LEN:

Seven Solutions KIT WR-LEN:

Seven Solutions WRS:

System Configuration

The White Rabbit feature of the USRP N3xx product family is based on standard networking technology, so many system topologies are possible. However, the USRP N3xx device only works as a downstream slave node and must receive its synchronization reference from an upstream master node. This section shows examples of typical configurations used to synchronize a network of multiple USRP N3xx devices.

Figure 1 shows a WRS operating as the master node connected to several USRP N3xx devices. Note that a master SFP port requires the purple SFP transceiver mentioned in the previous section, and a slave SFP port requires the blue SFP transceiver. The USRP N3xx devices use the SFP+ 0 port for White Rabbit and the SFP+ 1 port for IQ streaming. This port configuration requires the White Rabbit “WX” FPGA bitfile.

Download all FPGA images for the version of the USRP Hardware Driver (UHD) installed on the host PC by running the following command in a terminal:

uhd_images_downloader

Load the WX bitfile by running:

uhd_image_loader --args "type=n3xx,addr=ni-n3xx-<DEVICE_SERIAL>,fpga-path=<UHD_INSTALL_DIRECTORY>/share/uhd/images/usrp_n310_fpga_WX.bit"

Using the UHD API, configure the USRP application to use “internal” clock source and “sfp0” time source:

usrp->set_clock_source("internal")

usrp->set_time_source("sfp0")

The White Rabbit IP running on the FPGA disciplines the internal VCXO of the USRP N3xx to the clock reference from the upstream master node in the network. See the USRP N3xx block diagram for reference.

The WRS/WR-LEN device needs to be configured as a master on the ports connected to the USRP N3xx modules. Users can make this configuration with the WR-GUI application provided by Seven Solutions, or with a serial console connection to the WRS/WR-LEN device. See the WRS/WR-LEN manual for detailed instructions. After White Rabbit lock is achieved, the standard USRP N3xx synchronization process completes and the devices are ready for use.
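For reference, the same configuration can also be expressed with UHD’s Python API. The sketch below additionally waits for reference lock and aligns the device timestamp on the next PPS edge; the device address is a placeholder, and the sensor name and calls should be checked against the installed UHD version.

import time
import uhd

# Placeholder address; use the address or resolvable hostname of your device.
usrp = uhd.usrp.MultiUSRP("addr=192.168.10.2")

usrp.set_clock_source("internal")   # VCXO disciplined by the White Rabbit IP
usrp.set_time_source("sfp0")        # time reference arrives over SFP+ 0

# Wait for the reference to lock before aligning timestamps.
while not usrp.get_mboard_sensor("ref_locked").to_bool():
    time.sleep(0.5)

# Zero the device time on the next PPS edge; doing this on every
# synchronized device gives a common time base for IQ streaming.
usrp.set_time_next_pps(uhd.types.TimeSpec(0.0))
time.sleep(1.0)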

In addition to operating as a master, the WRS and WR-LEN devices can operate as a grandmaster by receiving clock and time references from an external source. This feature is useful for situations where the entire White Rabbit network needs to be disciplined to GPS or other high accuracy synchronization equipment such as a rubidium source. See the WRS/WR-LEN documentation for more information on grandmaster mode.

Synchronization Example

This section provides an example measurement of the timing alignment between multiple USRP N3xx devices synchronized using White Rabbit, with varying fiber cable lengths. As shown in Figure 3, a White Rabbit Switch in master mode is connected to one USRP N3xx device using a 5 km spool of fiber, and to another USRP N3xx device using 1 m of fiber. The synchronization performance was measured by probing the exported PPS signal, which is in the sample clock domain on both USRP N3xx devices, thereby demonstrating sample clock and timestamp alignment. The time difference between each PPS edge was measured with an oscilloscope at room temperature in a laboratory environment. As shown in Figure 4, the resulting measurement shows about 222 ps of skew between the two USRP N3xx devices, demonstrating the sub-nanosecond synchronization of White Rabbit over long distances.

The frequency accuracy of the internal oscillator of each USRP N3xx slave node is derived from the frequency accuracy of the upstream master node, in a manner similar to disciplining to an external clock reference source connected to the REF IN port. By connecting a high accuracy frequency source such as a rubidium reference to the master White Rabbit device in grandmaster mode, all USRP N3xx devices in the White Rabbit network would inherit this frequency accuracy.

References

NI Announces Test UE Offering for 5G Lab and Field Trials

Post Syndicated from National Instruments original https://spectrum.ieee.org/computing/it/ni-announces-test-ue-offering-for-5g-lab-and-field-trials

Release 15 compliant 5G New Radio non-standalone system can help customers deliver 5G commercial wireless to market faster


NI (Nasdaq: NATI), the provider of platform-based systems that help engineers and scientists solve the world’s greatest engineering challenges, today announced a real-time 5G New Radio (NR) test UE offering. The NI Test UE offering features a fully 3GPP Release 15 non-standalone (NSA) compliant system capable of emulating the full operation of end-user devices or user equipment (UE).

With the 5G commercial rollout this year, engineers must validate the design and functionality of 5G NR infrastructure equipment before productization and release. Based on the rugged PXI Express platform, the NI Test UE offering helps customers test prototypes in the lab and in the field to evaluate them on service operators’ networks. In addition, customers can perform InterOperability Device Testing (IoDT), which is a critical part of the commercialization process to ensure that network equipment works with UE from any vendor and vice versa. The NI Test UE offering can also be used to perform benchmark testing to evaluate the full capabilities of commercial and precommercial micro-cell, small-cell and macro-cell 5G NR gNodeB equipment.

Spirent has worked with NI to add 5G NR support to its existing portfolio of products. “As 5G was picking up steam, we looked to find a world-class 5G NR platform that would outperform the market today and continue to do so as the 5G market matures,” said Clarke Ryan, senior director of Product Development at Spirent. “As a leader in SDR-based radios since 2011, NI was the natural choice to ensure we have the best radio with the best testing capabilities to stay ahead of the curve for our customers.”

The NI Test UE offering provides a flexible system for evaluating 5G technology. Customers can use the SDR front ends to select the sub-6 GHz frequency of their choice. The system scales up to one 100 MHz bandwidth component carrier and can be configured for up to 4×2 MIMO to achieve a maximum throughput of 2.3 Gb/s. The 5G NR Release 15 software includes complete protocol stack software that can connect with a 5G gNodeB while providing real-time diagnostic information. Customers can log diagnostic information to a disk for post-test analysis and debugging and can view it on the software front panel for a real-time visualization of the link’s performance.

“The industry is on the cusp of 5G commercial deployments and mobile operators need to ensure that their infrastructure is 5G enabled in a virtualized, programmable, open and cost-efficient way,” said Neeraj Patel, Vice President and General Manager, Software and Services, Radisys. “NI is leveraging our first-to-market 5G Software Suite as the engine for its Test UE offering. Our complete 5G source code solution for UE, gNB and 5GCN represents a disruptive end-to-end enabling technology for customers to build 5G NR solutions. By powering such first to market test applications together with NI and Spirent, we are accelerating 5G commercialization that will change how the world connects.”

Find more information about the NI Test UE offering for 5G NR at ni.com/5g-test-ue.

About NI

NI (ni.com) develops high-performance automated test and automated measurement systems to help you solve your engineering challenges now and into the future. Our open, software-defined platform uses modular hardware and an expansive ecosystem to help you turn powerful possibilities into real solutions.

Broadband Chokes for Bias Tee Applications

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/broadband-chokes-for-bias-tee-applications

Broadband chokes for Bias Tee applications: how to successfully apply a DC bias onto an RF line

Numerical modeling allows for better design and optimization of batteries and battery systems.

This white paper illustrates how simulation is a necessary tool to understand, optimize, control, and design batteries and battery systems.

Download for FREE: EMI step-by-step guide from Rohde & Schwarz

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/download-for-free-emi-stepbystep-guide-from-rohde-schwarz

Discover and analyze EMI in a more systematic and methodical way to solve your problems.

In our free step-by-step guide, we break down the whole EMI design test process into “Locate”, “Capture”, and “Analyze”. Download & learn more.


For Better Computing, Liberate CPUs From Garbage Collection

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/tech-talk/computing/hardware/this-little-device-relieves-a-cpu-from-its-garbage-collection-duties

An accelerator unit improves both the performance and efficiency of a system by taking over one simple task


Without you noticing (much), your computer is working hard in the background to organize its memory system. On top of its many tasks, a CPU must do something called “garbage collection,” whereby it identifies and deletes redundant or irrelevant data from applications to free up additional memory space.

Garbage collection is meant to spare programmers from having to manually address this unnecessary data, but the automated process that CPUs are tasked with consumes a lot of computational power—up to 10 percent or more of the total time a CPU spends on an application.
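The article concerns hardware support for managed-language runtimes, but the cost itself is easy to see from software. The toy Python snippet below, written for this article, builds a heap full of reference cycles and times a full collection pass; the sizes and the resulting pause are illustrative and machine-dependent.

import gc
import time

objs = []
for _ in range(200_000):
    a, b = {}, {}
    a["other"], b["other"] = b, a   # reference cycles only the collector can reclaim
    objs.append(a)

objs.clear()                        # make all of the cycles unreachable
start = time.perf_counter()
collected = gc.collect()            # force a full collection pass
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"collected {collected} objects in {elapsed_ms:.1f} ms")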

While completing his PhD at the University of California, Berkeley, Martin Maas, who now works at Google, designed a new type of device that relieves the CPU from its garbage collection duties. The design is described in a paper published 23 April in IEEE Micro.

Get connected: Key considerations when selecting a connector

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/get-connected-key-considerations-when-selecting-a-connector

This whitepaper takes you back to basics to look at key factors to be considered when selecting a connector solution

This whitepaper explains the key factors to consider when specifying the best connector solution for the application. Topics include: electrical properties, mechanical and environmental considerations, physical space issues, designing for manufacture and servicing, and standards and certifications. It also includes a checklist to assist in the shortlisting and justification process.

Early Warning System Predicts Risk of Online Students Dropping Out

Post Syndicated from Michelle Hampson original https://spectrum.ieee.org/tech-talk/computing/software/a-predictive-modeling-tool-for-identifying-postsecondary-students-at-risk-of-dropping-out

With the new system, every student is scored based on how likely they are to finish their courses


It’s easy enough for students to sign up for online university courses, but getting them to finish is much harder. Dropout rates for online courses can be as high as 80 percent. Researchers have tried to help by developing early warning systems that predict which students are more likely to drop out. Administrators could then use these predictions to target at-risk students with extra retention efforts. And as these early warning systems become more sophisticated, they also reveal which variables are most closely correlated with dropout risk.

In a paper published 16 April in IEEE Transactions on Learning Technologies, a team of researchers in Spain describe an early warning system that uses machine learning to provide tailored predictions for both new and recurrent students. The system, called SPA (a Spanish acronym for Dropout Prevention System), was developed using data from more than 11,000 students who were enrolled in online programs at Madrid Open University (UDIMA) over the course of five years.
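The paper describes SPA’s own models and feature set; as a generic illustration of how such per-student scoring works, the scikit-learn sketch below trains a classifier on synthetic engagement features and ranks students by predicted dropout probability. The features, labels, and model choice are assumptions made for this article, not those used by the UDIMA system.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical engagement features: logins per week, assignments submitted,
# forum posts, and days since last activity (all scaled to 0-1).
X = rng.random((2000, 4))
# Synthetic labels: long inactivity mostly predicts dropout, plus some noise.
y = ((X[:, 3] > 0.7) ^ (rng.random(2000) < 0.1)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Every student receives a dropout-risk score that administrators can use
# to rank and target retention efforts.
risk = model.predict_proba(X_test)[:, 1]
print("highest-risk students:", np.argsort(risk)[::-1][:5])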

Amateurs’ AI Tells Real Rembrandts From Fakes

Post Syndicated from Mark Anderson original https://spectrum.ieee.org/tech-talk/computing/software/the-rembrandt-school-of-ai-an-algorithm-that-detects-art-forgery

In their spare time, a Massachusetts couple programmed a system that they say accurately identifies Rembrandts 90 percent of the time

A new AI algorithm may crack previously inaccessible image-recognition and analysis problems—especially those stymied by AI training sets that are too small, or whose individual sample images are too big and full of high-resolution detail that AI algorithms cannot process. Already, the new algorithm can detect forgeries of one famous artist’s work, and its creators are actively searching for other areas where it could potentially improve our ability to transform small data sets into ones large enough to train an AI neural network.

According to two amateur AI researchers, whose study is now under peer review at IEEE Transactions on Neural Networks and Learning Systems, the concept of entropy, borrowed from thermodynamics and information theory, may help AI systems uncover fake works of art.

In physical systems such as boiling pots of water and black holes, entropy concerns the amount of disorder contained within a given volume. In an image file, entropy is defined as the amount of useful, nonredundant information the file contains.
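As a concrete version of that definition, the short NumPy snippet below computes the Shannon entropy of a grayscale image’s intensity histogram: a flat, redundant image scores near zero bits per pixel, while a detail-rich (here, random) one approaches eight. Using a plain intensity histogram is a simplification made for illustration, not the measure used in the study.

import numpy as np

def image_entropy(pixels):
    # pixels: 2-D array of 8-bit grayscale values.
    counts = np.bincount(pixels.ravel(), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]                             # skip empty bins (0 log 0 = 0)
    return float(-(p * np.log2(p)).sum())    # entropy in bits per pixel

flat = np.full((256, 256), 128, dtype=np.uint8)                 # uniform image
noisy = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # detail-rich image
print(image_entropy(flat), image_entropy(noisy))                # ~0 vs ~8 bits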

For Better AI, Turn Up the Contrast on Reality

Post Syndicated from Robert W. Lucky original https://spectrum.ieee.org/computing/software/for-better-ai-turn-up-the-contrast-on-reality

Humans are sometimes criticized for seeing the world in black and white, but maybe AI should learn the same trick

Many years ago, I was touring a working orchard with a friend. His son, who was the orchard’s manager, was describing his work. His father and I, being engineers, got into a discussion about how a robot might be instructed to pick the fruit.

The son stopped and stared at us in consternation. “What are you guys talking about!? It’s simple—you see it, you pick it.”

Not so simple: It’s only now, decades later, that commercial fruit-picking robots are on the radar. There are many everyday tasks that seem trivial yet are difficult to describe and structure for automation. Humans have the advantage of common-sense reasoning, which is much deeper and more profound than most people would believe.

In my February column, I wrote about our success in creating computer programs that can master games like chess and poker. By their descriptions, these games are extraordinarily simple—a small number of immutable rules involving a few elements, whether they be chess pieces or playing cards. But there is a paradox, because underneath this simplicity is an enormous complexity. Nonetheless, that complexity is precisely defined, and that’s what we engineers are good at.

However, life is fuzzy and often ill defined. (If only real-life tasks could be modeled as board games, we’d be in business.) I love the idea of fuzzy logic, but on reflection, I actually do want my computer to be precise.

But maybe there could be some mechanism that would take fuzziness as an input and hand off well-defined output to a computational unit. This unit could then bring to bear the kind of techniques we’ve used to master games.

In real life we have such mechanisms. Consider American football, for example. There are rules about what happens following a forward pass that depend on whether it is caught or not caught. But “catch” is a fuzzy concept. So we have a device that digitizes the analog “catch.” It is called a referee. In law we have the equivalent in the courts, where judges and juries use various subjective standards such as “reasonableness” to determine whether or not an action falls on one side of the law or the other. And if we don’t like the court’s digitization, we treat it as an analog result and send it through another court.

As a manager, I was the digital arbiter on many personnel decisions—who got raises and in what dollar amounts, who got promoted, who got fired, and so forth. People always asked me what criteria I used for these decisions. What is the algorithm you use? they wanted to know. In truth, I wanted an algorithm too. There was a lot of fuzziness involved.

Besides being fuzzy, much of life is influenced by luck. The best team doesn’t always win, and the best person doesn’t always get the promotion. Life isn’t always fair, but this does shake up the pieces, so I’m not sure if this is a bug or a feature. Many board games, like Monopoly, do combine luck with skill. Maybe the fuzziness converter could add a bit of randomness.

But I’m just dreaming about this. Whatever real-life task we’re trying to automate, someone will ask why it’s taking so long. It’s simple, they’ll say. But it really isn’t.

This article appears in the May 2019 print issue as “AI’s Achilles’ Heel: Ambiguity.”

[$] Advanced computing with IPython

Post Syndicated from jake original https://lwn.net/Articles/756192/rss

If you use Python, there’s a good chance you have heard of IPython, which provides an enhanced read-eval-print loop (REPL) for Python. But there is more to IPython than just a more convenient REPL. Today’s IPython comes with integrated libraries that turn it into an assistant for several advanced computing tasks. We will look at two of those tasks, using multiple languages and distributed computing, in this article.
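As a taste of the distributed-computing side the article goes on to cover, the snippet below uses ipyparallel to spread work across a set of IPython engines. It assumes a local cluster has already been started (for example with "ipcluster start -n 4"), and the function being mapped is just a placeholder; the multi-language side relies on IPython’s cell magics (such as %%bash) from within the shell itself.

import ipyparallel as ipp

rc = ipp.Client()        # connect to the already-running IPython cluster
view = rc[:]             # a direct view over all engines

# Distribute a pure-Python function across the engines and gather results.
squares = view.map_sync(lambda x: x * x, range(16))
print(squares)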