Tag Archives: Sponsored

Important asphere specifications and their impact on optical performance

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/important-asphere-specifications-and-their-impact-on-optical-performance

Aspheres are key optical components and true “enablers” in the field of optics and photonics, especially for applications that require light weight and small size. This whitepaper gives an overview of important asphere specifications and the impact they can have on optical performance.

Learn about aspheres and their specifications, and understand how best to use them to optimize the performance of your optical system.

McMaster Engineering Grows Its Premier Program with Global Faculty Recruitment

Post Syndicated from McMaster Faculty of Engineering original https://spectrum.ieee.org/at-work/education/mcmaster-engineering-grows-its-premier-program-with-global-faculty-recruitment

The Faculty of Engineering at McMaster University in Hamilton, Ont., Canada, is aiming to build on its ranking as one of the world’s top engineering schools by expanding its recruitment for both tenure-track and teaching-track positions across multiple departments. This broad initiative is expected to continue McMaster’s growth as a leading destination for innovative teaching and research.

To support this growth and further develop McMaster Engineering’s longstanding strengths in research, innovation, and graduate training, the positions on offer will include two Tier II Canada Research Chair (CRC) positions and tenure-track positions, with specializations in the fields of micro-nano technology, smart systems, and bio-innovation.

“The rapid growth in the reputation of the Faculty of Engineering reflects our continuing focus on innovative research designed for impact and educating agile learners to become equipped to tackle our world’s greatest challenges,” says Ishwar K. Puri, McMaster’s dean of engineering.

In addition to teaching both undergraduate- and graduate-level courses, successful applicants will be expected to establish a strong externally funded research program, supervise graduate students, and foster existing or new collaborations with other departments and faculties.

“A range of perspectives leads to better insights and innovation, and our diverse and inclusive community is a key factor in our success. We welcome experts from around the world to be part of this next generation of growth and innovation in the Faculty of Engineering,” adds John Preston, McMaster Engineering’s associate dean, research and external relations.

McMaster Engineering’s strength has been its focus on interdisciplinary collaboration and its emphasis on research with impact. This focus on R&D with real-world impact is demonstrated by how its research has scored against the United Nations Sustainable Development Goals in categories such as good health and well-being; quality education; gender equality; industry, innovation and infrastructure; and climate action.

Earlier this year, McMaster ranked 17th in the world in the Times Higher Education Impact Rankings and number one in Canada for good health and well-being and for decent work and economic growth. The rankings recognize the important contributions universities make to their communities, their countries, and the world.

Combining its commitment to impactful research with its commitment to collaboration, McMaster Engineering also aims to provide a supportive and inclusive environment that celebrates big ideas and commercialization while working with industry partners around the world to solve the world’s most pressing challenges.

The Faculty’s mission to push the boundaries of discovery and innovation plays a significant role in helping McMaster University earn its reputation as one of Canada’s most innovative universities.

As Canada’s most research-intensive university, McMaster continues to see its commitment to research reflected in its rankings. Most recently, McMaster was named one of the world’s top 70 universities in the 2021 Times Higher Education rankings. As well, 14 academic disciplines at McMaster Engineering are ranked among the best in Canada by Shanghai Ranking.

Innovation extends to McMaster Engineering’s approach to education. In September 2020, after a two-year pilot, McMaster Engineering formally launched The Pivot, a historic $15 million initiative marking the largest transformation of the school’s curriculum, experiential learning, and classroom experience in the Faculty’s 62-year history.

This year, as part of The Pivot initiative, more than 1,100 first-year engineering students are experiencing the school’s new interactive course called Integrated Cornerstone Design Projects in Engineering. This novel course integrates concepts previously taught in four different courses into a single, seamless, project-based learning experience, allowing students to work in teams, design prototypes and solve real-world problems.

By transforming the engineering curriculum, reimagining the learning environment and amplifying experiential learning, The Pivot takes a project-based and experiential learning approach to developing future-ready graduates with design-thinking and entrepreneurial mindsets.

For more information on current opportunities within the Faculty of Engineering, view the postings here.

Increasing test coverage in hard switching half bridge configurations

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/increasing-test-coverage-in-hard-switching-half-bridge-configurations

Do you have to pay particular attention to proper switching operations to prevent shoot-through events? Learn more.

Setting up complex real-time trigger conditions on R&S oscilloscopes increases the test coverage and robustness of converter and inverter systems.
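The whitepaper focuses on the oscilloscope’s real-time hardware trigger, but the condition it guards against is easy to illustrate offline: shoot-through risk exists whenever the high-side and low-side gate drives of a half bridge are on at the same time. The sketch below is a minimal, hypothetical Python check on captured gate waveforms; the threshold, minimum duration, and synthetic waveforms are assumptions for illustration, not an R&S interface.

```python
# Conceptual sketch: flag potential shoot-through in captured half-bridge gate
# waveforms. Thresholds and waveforms are illustrative assumptions only.
import numpy as np

def find_shoot_through(high_gate_v, low_gate_v, threshold_v=2.5, min_samples=3):
    """Return (start, end) index pairs where both gate drives are above the
    'on' threshold for at least min_samples consecutive samples."""
    both_on = (np.asarray(high_gate_v) > threshold_v) & (np.asarray(low_gate_v) > threshold_v)
    events, run_start = [], None
    for i, flag in enumerate(both_on):
        if flag and run_start is None:
            run_start = i
        elif not flag and run_start is not None:
            if i - run_start >= min_samples:
                events.append((run_start, i))
            run_start = None
    if run_start is not None and len(both_on) - run_start >= min_samples:
        events.append((run_start, len(both_on)))
    return events

# Synthetic gate waveforms: the low-side drive turns on 4 samples before the
# high-side drive turns off, so an overlap is flagged once per switching period.
t = np.arange(1000)
high_side = np.where((t % 200) < 100, 5.0, 0.0)
low_side = np.where((t % 200) >= 96, 5.0, 0.0)
print(find_shoot_through(high_side, low_side))   # e.g. [(96, 100), (296, 300), ...]
```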

The Right Robotic Solution for Your Unique Operation

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/the-right-robotic-solution-for-your-unique-operation

Robotic solutions can help your operation keep up with the demands of today’s changing e-commerce market. Honeywell Robotics is helping distribution centers (DCs) evaluate solutions with powerful physics-based simulation tools to ensure that everything works together in an integrated ecosystem.

Put more than a quarter-century of automation expertise to work for you.

Download White Paper

Meet the next generation of quantum analyzers at this Zurich Instruments launch event

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/meet-the-next-generation-of-quantum-analyzers-at-this-zurich-instruments-launch-event

Would you like to improve the readout of your superconducting qubits, increase the fidelity of your quantum algorithm, or scale up your qubit system size? These are the goals that motivated Zurich Instruments to bring the SHFQA Quantum Analyzer to market. This virtual launch event will provide a technical overview of the instrument’s capabilities, including the strengths of the SHFQA’s integrated and mixer-calibration-free frequency conversion scheme. Practical instrument demonstrations will then show you how to measure a resonator at 8 GHz with only two microwave cables, and how to perform the readout of 16 qubits in parallel.

Register Now

Prevent and Solve Common Test & Measurement Issues

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/prevent-and-solve-common-test-measurement-issues

With distance learning, students may not have a professor nearby to help them set up and perform their labs. This leaves the student, the instruments, and the device under test at risk. Share this troubleshooting flyer with your EE students to help them navigate some common issues.

Download Now

Simulation-Driven Design of a Hyperloop Capsule Motor

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/simulationdriven-design-of-a-hyperloop-capsule-motor

The Hyperloop transportation system consists of a constrained, low-pressure environment, usually a tube or tunnel, that houses a dedicated rail which mechanically constrains energy-autonomous vehicles (called capsules or pods) carrying a given payload. Hyperloop capsules are expected to be self-propelled and can use the tube’s rail for guidance, magnetic levitation, and propulsion. With an average speed on the order of two to three times that of high-speed electric trains and a maximum speed on the order of the speed of sound, the Hyperloop is expected to achieve average energy consumption in the range of 30–90 Wh/passenger/km and CO2 emissions in the range of 5–20 g CO2/passenger/km. A key aspect of achieving this performance is the optimal design of the capsule propulsion. A promising solution is the double-sided linear induction motor (DSLIM), whose performance at high speed is affected by material properties and geometrical factors.
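To put the per-passenger figures above into perspective, the short calculation below applies them to a single capsule trip; the 600 km route length and 28-passenger capsule are assumptions chosen purely for illustration and are not taken from the webinar.

```python
# Illustrative arithmetic using the per-passenger figures quoted above
# (30-90 Wh/passenger/km and 5-20 g CO2/passenger/km). The 600 km route and
# the 28-passenger capsule are assumed values for this example only.
route_km = 600
passengers = 28

for label, wh_per_pkm, g_co2_per_pkm in [("optimistic", 30, 5), ("conservative", 90, 20)]:
    trip_kwh = wh_per_pkm * route_km * passengers / 1000.0        # total energy per capsule trip, kWh
    trip_kg_co2 = g_co2_per_pkm * route_km * passengers / 1000.0  # total emissions per capsule trip, kg
    print(f"{label}: about {trip_kwh:.0f} kWh and {trip_kg_co2:.0f} kg of CO2 per capsule trip")
```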

In this webinar, we describe how to model a DSLIM using the COMSOL Multiphysics® software to provide an accurate estimate of the thrust exerted by the motor. Furthermore, we illustrate how to carry out a simulation-driven optimization to find the best motor configuration in terms of maximum speed. The results of the simulations are compared with measurements carried out on an experimental test bench developed at the Swiss Federal Institute of Technology, Lausanne (EPFL), in the context of the EPFLoop team’s participation in the 2019 SpaceX Hyperloop pod competition.

Discover how AWS Marketplace seller solutions can help you scale and bring productivity to your SOC

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/discover-how-aws-marketplace-seller-solutions-can-help-you-scale-and-bring-productivity-to-your-soc

You’re Invited!

Join this webinar to learn how AWS customers are using automation and integrated threat intelligence to increase efficiency and scale their cloud security operations center (SOC).

In this webinar, SANS and AWS Marketplace will explore real-world examples and offer practical guidance to help equip you with the visibility and efficiencies needed to scale. You will learn how to limit alert fatigue while enhancing SOC productivity by automating actionable insights and removing repetitive manual tasks.

Attendees of this webinar will learn how to:

  • Structure a cloud SOC to scale through technology
  • Integrate threat intelligence into security workflows
  • Utilize automated triaging and action playbooks
  • Leverage AWS services and seller solutions in AWS Marketplace to help achieve these goals

Register Now

Nanosilica Filled Optically Clear Epoxy Adhesive

Post Syndicated from Jane Trager original https://spectrum.ieee.org/semiconductors/materials/nanosilica-filled-optically-clear-epoxy-adhesive

Master Bond EP30NS is a two-component epoxy system with moderate viscosity and good flow. It contains a nanosilica filler that contributes to its specific property profile, providing much higher abrasion resistance than a typical epoxy and much lower linear shrinkage upon cure. EP30NS passes ASTM E595 for NASA low outgassing.

To develop its optical properties, cure overnight at room temperature, followed by 2-3 hours at 150-200°F. The epoxy has a moderate mixed viscosity ranging from 25,000 to 45,000 cps. It is optically clear, with a refractive index of 1.56. EP30NS has been independently tested for abrasion resistance per ASTM D4060-14 for 1,000 cycles and exhibited a weight loss of only 18.3 mg. It is thus able to withstand exposure to scuffing, gouging, scraping, scratching, and wear.

This system has excellent electrical insulation properties, making it well suited for small potting applications. It forms dimensionally stable, rigid bonds and bonds well to metals, glass, ceramics, composites, rubbers, and plastics. It is chemically resistant to water, fuels, oils, acids, and solvents. The service temperature range is -60°F to +300°F. This system is recommended for high-tech applications in the aerospace, electronic, optical, opto-electronic, and specialty OEM industries. It is available in both standard packaging and specialty gun dispenser packaging.

For more information on EP30NS and to request a technical datasheet please visit https://www.masterbond.com/tds/ep30ns.

NYU Wireless Picks Up Its Own Baton to Lead the Development of 6G

Post Syndicated from NYU Wireless original https://spectrum.ieee.org/telecom/wireless/nyu_wireless_picks_up_its_own_baton_to_lead_the_development_of_6g

The fundamental technologies that have made 5G possible are unequivocally massive MIMO (multiple-input multiple-output) and millimeter wave (mmWave) technologies. Without these two technologies there would be no 5G network as we now know it.

The two men who were the key architects behind these fundamental technologies for 5G have been leading one of the premier research institutes in mobile telephony since 2012: NYU Wireless, a part of NYU’s Tandon School of Engineering.

Ted Rappaport is the founding director of NYU Wireless and one of the key researchers in the development of mmWave technology. Rappaport also served as the key thought leader for 5G, planting a flag in the ground nearly a decade ago by arguing that mmWave would be a key enabling technology for the next generation of wireless. His earlier work at two other wireless centers that he founded, at Virginia Tech and The University of Texas at Austin, laid the groundwork that helped NYU Wireless catapult into one of the premier wireless institutions in the world.

Thomas Marzetta, who now serves as the director of NYU Wireless, is the scientist who led the development of massive MIMO while at Bell Labs and has championed its use in 5G, where it has become a key enabling technology.

These two researchers, who were so instrumental in developing the technologies that enabled 5G, are now turning their attention to the next generation of mobile communications, and, according to both of them, we face some steep technical challenges in realizing it.

“Ten years ago, Ted was already pushing mobile mmWave, and I at Bell Labs was pushing massive MIMO,” said Marzetta.  “So we had two very promising concepts ready for 5G. The research concepts that the wireless community is working on for 6G are not as mature at this time, making our focus on 6G even more important.”

This sense of urgency is shared by both men, who are pushing against any complacency about starting the development of 6G technologies as soon as possible. With this aim in mind, Rappaport, just as he did 10 years ago, has planted a new flag in the world of mobile communications with the publication last year of an IEEE article entitled “Wireless Communications and Applications Above 100 GHz: Opportunities and Challenges for 6G and Beyond.”

“In this paper, we said for the first time that 6G is going to be in the sub-terahertz frequencies,” said Rappaport. “We also suggested the idea of wireless cognition, where human thought and brain computation could be sent over wireless in real time. It’s a very visionary look at something. Our phones, which right now are flashlights, emails, TV browsers, calendars, are going to become much more.”

While Rappaport feels confident that they have the right vision for 6G, he is worried about the lack of awareness of how critical it is for US government funding agencies and companies to develop the enabling technologies for its realization. In particular, both Rappaport and Marzetta are concerned about the economic competitiveness of the US and the funding challenges that will persist if this is not properly recognized as a priority.

“These issues of funding and awareness are critical for research centers, like NYU Wireless,” said Rappaport. “The US needs to get behind NYU Wireless to foster these ideas and create these cutting-edge technologies.”

With this funding support, Rappaport argues, teaching research institutes like NYU Wireless can create the engineers that end up going to companies and making technologies like 6G become a reality. “There are very few schools in the world that are even thinking this far ahead in wireless; we have the foundations to make it happen,” he added.

Both Rappaport and Marzetta also believe that making national centers of excellence in wireless could help to create an environment in which students could be exposed constantly to a culture and knowledge base for realizing the visionary ideas for the next generation of wireless.

“The Federal government in the US needs to pick a few winners for university centers of excellence to be melting pots, to be places where things are brought together,” said Rappaport. “The Federal government has to get together and put money into these centers to allow them to hire talent, attract more faculty, and become comparable to what we see in other countries, where huge amounts of funding are going in to pick winners.”

While research centers like NYU Wireless get support from industry to conduct their research, Rappaport and Marzetta see that a bump in Federal funding could both amplify and leverage the contributions of industrial affiliates. NYU Wireless currently has 15 industrial affiliates, a large number of them from outside the US, according to Rappaport.

“Government funding could get more companies involved by incentivizing them through a financial multiplier,” added Rappaport.

Of course, 6G is not simply about setting out a vision and attracting funding, but also tackling some pretty big technical challenges.

Both men believe that we will need to see the development of new forms of MIMO, such as holographic MIMO, to enable more efficient use of the sub-6 GHz spectrum. Solutions will also need to be developed to overcome the blockage problems that occur at mmWave and higher frequencies.

Fundamental to these technology challenges is access to new frequency spectrum so that a 6G network operating at sub-terahertz frequencies can be realized. Both Rappaport and Marzetta are confident that technology will enable us to access even more challenging frequencies.

“There’s nothing technologically stopping us right now from 30, and 40, and 50 gigahertz millimeter wave, even up to 700 gigahertz,” said Rappaport. “I see the fundamentals of physics and devices allowing us to take us easily over the next 20 years up to 700 or 800 gigahertz.”

Marzetta added that there is much more that can be done in the scarce and valuable sub-6GHz spectrum. While massive MIMO is the most spectrally efficient wireless scheme ever devised, it is based on extremely simplified models of how antennas create electromagnetic signals that propagate to another location, according to Marzetta, adding, “No existing wireless system or scheme is operating close at all to limits imposed by nature.”

While expanding the spectrum of usable frequencies and making even better use of the sub-6 GHz spectrum are the foundation for the realization of future networks, Rappaport and Marzetta also expect increasing use of AI and machine learning. This will enable intelligent networks that can manage themselves with much greater efficiency than today’s mobile networks.

“Future wireless networks are going to evolve with greater intelligence,” said Rappaport. An example of this intelligence, according to Rappaport, is the new way in which the Citizens Broadband Radio Service (CBRS) spectrum is going to be used in a spectrum access server (SAS) for the first time ever.

“It’s going to be a nationwide mobile system that uses these spectrum access servers that mobile devices talk to in the 3.6 gigahertz band,” said Rappaport. “This is going to allow enterprise networks to be a cross of old licensed cellular and old unlicensed Wi-Fi. It’s going to be kind of somewhere in the middle. This serves as an early indication of how mobile communications will evolve over the next decade.”

These intelligent networks will become increasingly important when 6G moves towards so-called cell-less (“cell-free”) networks.

Currently, mobile network coverage is provided through hundreds of roughly circular cells spread out across an area. Now with 5G networks, each of these cells will be equipped with a massive MIMO array to serve the users within the cell. But with a cell-less 6G network the aim would be to have hundreds of thousands, or even millions, of access points, spread out more or less randomly, but with all the networks operating cooperatively together.

“With this system, there are no cell boundaries, so as a user moves across the city, there’s no handover or handoff from cell to cell because the whole city essentially constitutes one cell,” explained Marzetta. “All of the people receiving mobile services in a city get it through these access points, which in principle, every user is served by every access point all at once.”
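To make the contrast with conventional cells concrete, here is a rough conceptual sketch in Python: a user’s received power when served only by the strongest access point versus the idealized cell-free case in which every access point contributes. The geometry, transmit power, and path-loss model are assumptions for illustration only and do not describe any specific 6G proposal.

```python
# Conceptual sketch of the cell-free idea described above: a user's received
# power is aggregated over many access points rather than coming from a single
# serving cell. Path-loss exponent, transmit power, and geometry are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_aps = 200                                   # access points scattered over a 1 km x 1 km area
ap_xy = rng.uniform(0, 1000, size=(n_aps, 2))
user_xy = np.array([500.0, 500.0])

tx_power_mw = 100.0                           # per-AP transmit power (assumed)
path_loss_exp = 3.5                           # urban-ish path-loss exponent (assumed)

d = np.linalg.norm(ap_xy - user_xy, axis=1) + 1.0   # distances in metres (avoid divide-by-zero)
rx_mw = tx_power_mw * d ** (-path_loss_exp)

nearest_cell_mw = rx_mw.max()    # classic cellular: served by the strongest AP only
cell_free_mw = rx_mw.sum()       # cell-free: contributions from all APs aggregated (idealized power sum)

print(f"nearest-AP power: {10 * np.log10(nearest_cell_mw):.1f} dBm")
print(f"aggregate cell-free power: {10 * np.log10(cell_free_mw):.1f} dBm")
```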

One of the obvious challenges of this cell-less architecture is simply the economics of installing so many access points all over a city. All of the signals must be carried between each access point and some sort of central point that does all the computing and number crunching.

While this all sounds daunting when thought of in the terms of traditional mobile networks, it conceptually sounds far more approachable when you consider that the Internet of Things (IoT) will create this cell-less network.

“We’re going to go from 10 or 20 devices today to hundreds of devices around us that we’re communicating with, and that local connectivity is what will drive this cell-less world to evolve,” said Rappaport. “This is how I think a lot of 5G and 6G use cases in this wide swath of spectrum are going to allow these low-power local devices to live and breathe.”

To realize all of these technologies, including intelligent networks, cell-less networks, expanded radio frequencies, and wireless cognition, the key factor will be training future engineers. 

On this issue, Marzetta noted: “Wireless communications is a growing and dynamic field that is a real opportunity for the next generation of young engineers.”

Introducing Calibre nmLVS-Recon – A new paradigm for circuit verification

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/introducing-calibre-nmlvsrecon-a-new-paradigm-for-circuit-verification

Each year, at least 50% of tapeouts are late, with physical and circuit verification closure a significant contributing factor. Mentor combined the know-how of our industry-leading Calibre nmLVS sign-off tool with lessons learned from customers to create an innovative smart engine specifically engineered to help design teams find and fix high-impact systemic errors early in the design flow. As part of our growing suite of early-stage design verification technologies, the Calibre nmLVS-Recon tool enables designers to accelerate early-stage design analysis and debug cycles and reduce the time needed to reach tapeout. We explain the concept behind our innovative technology and introduce the first Calibre nmLVS-Recon use model: short isolation analysis.

Protect Electronic Components with Dam & Fill

Post Syndicated from MasterBond original https://spectrum.ieee.org/computing/hardware/protect-electronic-components-with-dam-and-fill

Watch now to see the dam and fill method.

Sensitive electronic components mounted on a circuit board often require protection from exposure to harsh environments. While there are several ways to accomplish this, the dam-and-fill method offers many benefits. Dam-and-filling entails dispensing a damming material around the area to be encapsulated, thereby preventing the fill from spreading to other parts of the board.

By using a damming compound such as Supreme 3HTND-2DM-1, you can create a barrier around the component. Then, with a flowable filler material like EP3UF-1, the component can be completely covered for protection.

To start, apply the damming compound, Supreme 3HTND-2DM-1, around the component. Supreme 3HTND-2DM-1 is readily dispensed to create a structurally sound barrier. This material will not run and will cure in place in 5-10 minutes at 300°F, in essence forming a dam.

After the damming compound has been applied and cured, a filling compound such as EP3UF-1 is dispensed to fill the area inside the dam and cover the component to be protected. EP3UF-1 is a specialized, low-viscosity, one-part system with an ultra-small particle size filler, which enables it to flow even into tiny spaces. This system cures in 10-15 minutes at 300°F and features low shrinkage and high dimensional stability once cured.

Both Supreme 3HTND-2DM-1 and EP3UF-1 are thermally conductive, electrically insulating compounds and are available for use in syringes for automated or manual dispensing.

Despite being a two-step process, dam and fill offers the following advantages over glob topping:

  • Flow of the filling compound is controlled and restricted
  • It can be applied to larger sections of the board
  • Filling compound flows better than a glob top, allowing better protection underneath and around the component

The Devil is in the Data: Overhauling the Educational Approach to AI’s Ethical Challenge

Post Syndicated from NYU Tandon School of Engineering original https://spectrum.ieee.org/computing/software/the-devil-is-in-the-data-overhauling-the-educational-approach-to-ais-ethical-challenge

The evolution and wider use of artificial intelligence (AI) in our society is creating an ethical crisis in computer science like nothing the field has ever faced before. 

“This crisis is in large part the product of our misplaced trust in AI, in which we hope that whatever technology we denote by this term will solve the kinds of societal problems that an engineering artifact simply cannot solve,” says Julia Stoyanovich, an assistant professor in the Department of Computer Science and Engineering at the NYU Tandon School of Engineering and the Center for Data Science at New York University. “These problems require human discretion and judgement, and a human must be held accountable for any mistakes.”

Stoyanovich believes the strikingly good performance of machine learning (ML) algorithms on tasks ranging from game playing to perception to medical diagnosis, together with the fact that it is often hard to understand why these algorithms do so well and why they sometimes fail, is surely part of the issue. But she is also concerned that even simple rule-based algorithms, such as score-based rankers (which compute a score for each job applicant, sort applicants by their score, and then suggest interviewing the top-scoring three), can have discriminatory results. “The devil is in the data,” says Stoyanovich.
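A minimal sketch of such a score-based ranker shows how quickly a skew that is already in the data flows into the shortlist; the applicant records and the “years of access to training” proxy below are hypothetical, invented only for illustration.

```python
# Minimal sketch of the score-based ranker described above: compute a score,
# sort, and shortlist the top three. The data and the biased proxy feature are
# hypothetical, purely to show how skew in the data flows into the shortlist.
applicants = [
    {"name": "A", "test": 88, "years_of_access_to_training": 6},
    {"name": "B", "test": 91, "years_of_access_to_training": 1},
    {"name": "C", "test": 84, "years_of_access_to_training": 7},
    {"name": "D", "test": 90, "years_of_access_to_training": 2},
    {"name": "E", "test": 86, "years_of_access_to_training": 8},
]

def score(a):
    # The rule looks neutral, but the second feature encodes unequal access to
    # preparation, so the ranking imports that inequity.
    return 0.7 * a["test"] + 3.0 * a["years_of_access_to_training"]

shortlist = sorted(applicants, key=score, reverse=True)[:3]
print([a["name"] for a in shortlist])   # ['E', 'C', 'A']: tracks access, not just test performance
```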

As an illustration of this point, a comic book that Stoyanovich produced with Falaah Arif Khan, entitled “Mirror, Mirror,” makes clear that when we ask AI to move beyond games like chess or Go, in which the rules are the same irrespective of a player’s gender, race, or disability status, and instead ask it to perform tasks that allocate resources or predict social outcomes, such as deciding who gets a job or a loan, or which sidewalks in a city should be fixed first, we quickly discover that embedded in the data are social, political, and cultural biases that distort results.

In addition to societal bias in the data, technical systems can introduce additional skew as a result of their design or operation. Stoyanovich explains that if, for example, a job application form has two options for sex, ‘male’ and ‘female,’ a female applicant may choose to leave this field blank for fear of discrimination. An applicant who identifies as non-binary will also probably leave the field blank. But if the system works under the assumption that sex is binary and post-processes the data, then the missing values will be filled in. The most common method for this is to set the field to the value that occurs most frequently in the data, which will likely be ‘male’. This introduces systematic skew in the data distribution, and will make errors more likely for these individuals.
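A short sketch of that post-processing step, using hypothetical records, shows how mode imputation silently converts every blank entry into the majority value:

```python
# Sketch of the post-processing step described above: blank 'sex' fields are
# filled with the most frequent value, silently reclassifying applicants who
# left the field empty. The records are hypothetical.
from collections import Counter

records = [
    {"id": 1, "sex": "male"},
    {"id": 2, "sex": "male"},
    {"id": 3, "sex": "female"},
    {"id": 4, "sex": None},   # applicant who declined to answer or identifies as non-binary
    {"id": 5, "sex": None},
]

observed = [r["sex"] for r in records if r["sex"] is not None]
mode_value = Counter(observed).most_common(1)[0][0]   # 'male' in this sample

for r in records:
    if r["sex"] is None:
        r["sex"] = mode_value   # systematic skew: every missing value becomes 'male'

print(Counter(r["sex"] for r in records))   # Counter({'male': 4, 'female': 1})
```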

This example illustrates that technical bias can arise from an incomplete or incorrect choice of data representation. “It’s been documented that data quality issues often disproportionately affect members of historically disadvantaged groups, and we risk compounding technical bias due to data representation with pre-existing societal bias for such groups,” adds Stoyanovich.

This raises a host of questions, according to Stoyanovich, such as: How do we identify ethical issues in our technical systems? What types of “bias bugs” can be resolved with the help of technology? And what are some cases where a technical solution simply won’t do? As challenging as these questions are, Stoyanovich maintains we must find a way to reflect them in how we teach computer science and data science to the next generation of practitioners.

“Virtually all of the departments or centers at Tandon do research and collaborations involving AI in some way, whether artificial neural networks, various other kinds of machine learning, computer vision and other sensors, data modeling, AI-driven hardware, etc.,” says Jelena Kovačević, Dean of the NYU Tandon School of Engineering. “As we rely more and more on AI in everyday life, our curricula are embracing not only the stunning possibilities in technology, but the serious responsibilities and social consequences of its applications.”

As she looked at this issue as a pedagogical problem, Stoyanovich quickly realized that the professors teaching ethics courses for computer science students were not computer scientists themselves but instead came from humanities backgrounds. There were also very few people with expertise in both computer science and the humanities, a fact exacerbated by the “publish or perish” motto that keeps professors siloed in their own areas of expertise.

“While it is important to incentivize technical students to do more writing and critical thinking, we should also keep in mind that computer scientists are engineers.  We want to take conceptual ideas and build them into systems,” says Stoyanovich.  “Thoughtfully, carefully, and responsibly, but build we must!”

But if computer scientists need to take on this educational responsibility, Stoyanovich believes that they will have to come to terms with the reality that computer science is in fact limited by the constraints of the real world, like any other engineering discipline.

“My generation of computer scientists was always led to think that we were only limited by the speed of light. Whatever we can imagine, we can create,” she explains. “These days we are coming to better understand how what we do impacts society and we have to impart that understanding to our students.”

Kovačević echoes this cultural shift in how we must start to approach the teaching of AI. She notes that computer science education at the collegiate level typically keeps the tiller set on skill development and exploration of the technological scope of computer science, along with an unspoken cultural norm in the field that since anything is possible, anything is acceptable. “While exploration is critical, awareness of consequences must be, as well,” she adds.

Once the first hurdle of understanding that computer science has constraints in the real world is cleared, Stoyanovich argues, we will next have to confront the specious idea that AI is the tool that will lead humanity into some kind of utopia.

“We need to better understand that whatever an AI program tells us is not true by default,” says Stoyanovich. “Companies claim they are fixing bias in the data they present into these AI programs, but it’s not that easy to fix thousands of years of injustice embedded in this data.”

In order to include these fundamentally different approaches to AI and how it is taught, Stoyanovich has created a new course at NYU Tandon entitled Responsible Data Science. This course has now become a requirement for students getting a BA degree in data science at NYU. Later, she would like to see the course become a requirement for graduate degrees as well. In the course, students are taught both “what we can do with data” and, at the same time, “what we shouldn’t do.”

Stoyanovich has also found it exciting to engage students in conversations surrounding AI regulation.  “Right now, for computer science students there are a lot of opportunities to engage with policy makers on these issues and to get involved in some really interesting research,” says Stoyanovich. “It’s becoming clear that the pathway to seeing results in this area is not limited to engaging industry but also extends to working with policy makers, who will appreciate your input.”

In these efforts towards engagement, Stoyanovich and NYU are establishing the Center for Responsible AI, to which IEEE-USA offered its full support last year. One of the projects the Center for Responsible AI is currently engaged in is a new law in New York City to amend its administrative code in relation to the sale of automated employment decision tools.

“It is important to emphasize that the purpose of the Center for Responsible AI is to serve as more than a colloquium for critical analysis of AI and its interface with society, but as an active change agent,” says Kovačević. “What that means for pedagogy is that we teach students to think not just about their skill sets, but their roles in shaping how artificial intelligence amplifies human nature, and that may include bias.”

Stoyanovich notes: “I encourage the students taking Responsible Data Science to go to the hearings of the NYC Committee on Technology.  This keeps the students more engaged with the material, and also gives them a chance to offer their technical expertise.”

Innovative YUJIN 3D LiDAR, Now Shipping!

Post Syndicated from YujinRobot original https://spectrum.ieee.org/robotics/home-robots/innovative_yujin_3d_lidar_now_shipping


Yujin Robot recently launched a new 3D LiDAR for indoor service robots, AGVs/AMRs, and smart factories. The YRL3 series is a line of precise laser sensors for vertical and horizontal scanning to detect environments or objects. Designed for indoor applications, the YRL3 series uses an innovative 3D scanning LiDAR to provide a 270° (horizontal) x 90° (vertical) dynamic field of view as a single channel. The fundamental principle is direct ToF (time of flight), and the sensor is designed to measure distances to its surroundings. The YRL3 collects useful data including ranges, angles, intensities, and Cartesian coordinates (x, y, z). Real-time adjustment of the vertical scanning angle is possible, and a powerful software package for autonomous driving devices is supported.
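As a rough illustration of the kind of data the YRL3 reports, the sketch below converts a single direct-ToF range sample at a given horizontal and vertical angle into Cartesian (x, y, z); the axis convention and angle origins are assumptions, not Yujin’s documented sensor frame.

```python
# Sketch of how a direct-ToF range sample at a horizontal and vertical angle
# maps to Cartesian (x, y, z). The axis convention and angle origins here are
# assumptions for illustration, not Yujin's documented sensor frame.
import math

def polar_to_cartesian(range_m, azimuth_deg, elevation_deg):
    az = math.radians(azimuth_deg)     # horizontal angle, within the 270 deg field of view
    el = math.radians(elevation_deg)   # vertical angle, within the 90 deg field of view
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# A 5 m return at 30 deg horizontal and 10 deg vertical:
print(polar_to_cartesian(5.0, 30.0, 10.0))   # approximately (4.26, 2.46, 0.87)
```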

“In recent years, our product lineup expanded to include models for the Fourth Industrial Revolution,” shares the marketing team of Yujin Robot. These models are Kobuki, the ROS reference research robot platform used by robotics research labs around the world; the Yujin LiDAR, a range-finding scanning sensor for LiDAR-based autonomous driving; and the AMS (Autonomous Mobility Solution) for customized autonomous driving. The company continues to push the boundaries of robotics and artificial intelligence, developing game-changing autonomous solutions that give companies around the world an edge over the competition.

Designing Thermal Management Systems for Electronics Through Simulations

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/designing_thermal_management_systems_for_electronics_through_simulations

The impact of increasingly powerful electronics on our society cannot be overstated. These more powerful electronics produce significant heat that must be dissipated to prevent premature component failure, so engineers who design electronics face a major thermal management challenge. Electrical engineers frequently seek to increase the power of critical components, and keeping those components cool is a significant design task. It becomes even more challenging when the cooling system relies on natural convection instead of forced convection from fans, due to the relatively short life expectancy of fans.

One solution to this engineering challenge is to use multiphysics software tools to improve the accuracy of the engineer’s calculations in comparison to analytic and single-physics simulation solutions. These simulations include heat generated by the component, airflow around the component, and radiative heat transfer between the component and the surroundings. Heat generation due to resistive heating in the board can be included with heat generated from components to determine the heat generated within the system. Airflow through the system due to either forced or natural convection can also be analyzed. For many systems, radiation must be considered for accurate temperature predictions due to the large amount of heat transfer that occurs via this mechanism in many electronic designs.
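As a back-of-the-envelope companion to the full 3D treatment described above, the sketch below balances a component’s dissipated power against natural convection and radiation from its exposed surface; the convection coefficient, surface area, emissivity, and power level are assumed values, not simulation results.

```python
# Lumped heat balance: dissipated power = natural convection + radiation.
# All parameter values are illustrative assumptions, not multiphysics results.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def surface_temperature(power_w, area_m2, h_conv, emissivity, t_ambient_k):
    """Solve h*A*(T - Ta) + eps*sigma*A*(T^4 - Ta^4) = P for T by bisection."""
    def residual(t_k):
        convection = h_conv * area_m2 * (t_k - t_ambient_k)
        radiation = emissivity * SIGMA * area_m2 * (t_k**4 - t_ambient_k**4)
        return convection + radiation - power_w

    lo, hi = t_ambient_k, t_ambient_k + 1000.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# 2 W component with 10 cm^2 of exposed surface in still air (h ~ 8 W/m^2K),
# emissivity 0.9, 25 degC ambient: the estimate lands near 140 degC, exactly
# the kind of result that motivates a more detailed cooling design.
t_k = surface_temperature(2.0, 10e-4, 8.0, 0.9, 298.15)
print(f"estimated surface temperature: {t_k - 273.15:.0f} degC")
```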

In this presentation, guest speakers Kyle Koppenhoefer and Joshua Thomas from AltaSim Technologies will discuss the development of an electronics cooling problem subjected to a complex thermal environment. The webinar will also include a live demo in the COMSOL Multiphysics® software and a Q&A session.

Satellite Mission Planning – the R&S®SLP Satellite Link Planner and the R&S®CSM Communication System Monitoring

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/satellite_mission_planning_the_r_s

Planning satellite communication links, or even whole networks, is a very demanding task. In the first part of this webinar, we will present our software for satellite link planning, which supports the user in a convenient way while taking into account all relevant sources of impact. In the second part of the webinar, we will demonstrate our solution for monitoring satellite networks, either at one site or distributed worldwide. In addition, we will focus on identifying interference from unwanted satellite signals or terrestrial sources. We will also show how to make interfering signals that lie underneath the wanted satellite carrier visible.

Attendees of the webinar will learn about:

  • The sources of impact affecting satellite links
  • How to plan satellite links or whole satellite networks
  • How to monitor your satellite connections automatically and reliably
  • How to detect and identify interference

NASA’s Mars Rover Required a Special Touch for Its Robotic Arms

Post Syndicated from ATI Industrial Automation original https://spectrum.ieee.org/aerospace/robotic-exploration/nasa_mars_rover_required_a_special_touch_for_its_robotic_arms

In July, NASA launched the most sophisticated rover the agency has ever built: Perseverance (https://mars.nasa.gov/mars2020/). Scheduled to land on Mars in February 2021, Perseverance will be able to perform unique research into the history of microbial life on Mars in large part due to its robotic arms. To achieve this robotic capability, NASA needed to call upon innovation-driven contractors to make such an engineering feat a reality.

One of the companies that NASA enlisted to help develop Perseverance was ATI Industrial Automation (https://www.ati-ia.com/). NASA looked to ATI to adapt the company’s own Force/Torque Sensor to enable the robotic arm of Perseverance to operate in the environment of space. ATI Force/Torque sensors were initially developed to enable robots and automation systems to sense the forces applied while interacting with their environment in operating rooms or on factory floors.

However, the environment of space presented unique engineering challenges for ATI’s Force/Torque Sensor. The extreme environment and the need for redundancy to ensure that no single failure would compromise the sensor’s function were the key challenges the ATI engineers faced, according to Ian Stern, Force/Torque Sensor Product Manager at ATI (https://www.linkedin.com/in/ianhstern/).

“ATI’s biggest technical challenge was developing the process and equipment needed to perform the testing at the environmental limits,” said Stern. “The challenges start when you consider the loads that the sensor sees during the launch of the Atlas 5 rocket from earth. The large G forces cause the tooling on the end of the sensor to generate some of the highest loads that the sensor sees over its life.”

Once on Mars, the sensor must be able to accurately and reliably measure forces and torques at temperatures ranging from -110°C to +70°C. This presents several challenges because of how acutely temperature influences the accuracy of force measurement devices. To meet these demands, ATI developed the capability to calibrate the sensors at -110°C. “This required a lot of specialized equipment for achieving these temperatures while making it safe for our engineers to perform the calibration process,” added Stern.
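A generic way to picture temperature compensation, not ATI’s actual calibration procedure, is to interpolate gain and offset corrections measured at a few calibration temperatures spanning the operating range; all of the numbers below are invented for illustration.

```python
# Generic illustration of temperature compensation for a force reading, not
# ATI's calibration procedure. Gain and offset corrections measured at a few
# calibration temperatures are interpolated to the operating temperature.
# All calibration numbers here are made up.
import numpy as np

cal_temps_c  = np.array([-110.0, -40.0, 25.0, 70.0])   # calibration temperatures, degC
cal_gain     = np.array([1.032, 1.011, 1.000, 0.994])  # multiplicative correction per temperature
cal_offset_n = np.array([-4.1, -1.3, 0.0, 0.9])        # additive correction, newtons

def compensate(raw_force_n, temp_c):
    gain = np.interp(temp_c, cal_temps_c, cal_gain)
    offset = np.interp(temp_c, cal_temps_c, cal_offset_n)
    return gain * raw_force_n + offset

print(compensate(250.0, -75.0))   # a raw 250 N reading corrected at -75 degC
```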

In addition to the harsh environment, redundancy strategies are critical for a sensor technology on a space mission. While downtime on the factory floor can be costly, a component failure on Mars can render the entire mission worthless since there are no opportunities for repairs.

This need for a completely reliable product meant that ATI engineers had to develop their sensor so that it was capable of detecting failures in its measurements as well as accurately measuring forces and torques should there be multiple failures on the measurement signals. ATI developed a patented process for achieving this mechanical and electrical redundancy.
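As a purely generic illustration of measurement redundancy, and emphatically not ATI’s patented scheme, the sketch below fuses three nominally identical force channels and flags any channel that strays too far from the median:

```python
# Generic sketch of redundant measurement channels with a consistency check,
# for illustration only; it is not ATI's patented redundancy scheme. Channels
# that stray too far from the median are flagged and excluded from the estimate.
import statistics

def fuse_redundant(channels_n, tolerance_n=5.0):
    median = statistics.median(channels_n)
    healthy = [c for c in channels_n if abs(c - median) <= tolerance_n]
    failed = [i for i, c in enumerate(channels_n) if abs(c - median) > tolerance_n]
    return sum(healthy) / len(healthy), failed

estimate, failed = fuse_redundant([251.2, 250.8, 198.4])   # third channel has drifted
print(estimate, failed)   # approximately 251.0, [2]
```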

All of this effort to engineer a sensor for NASA’s Mars mission may enable a whole new generation of space exploration, but it’s also paying immediate dividends for ATI’s more terrestrial efforts in robotic sensors.

“The development of a sensor for the temperatures on Mars has helped us to develop and refine our process of temperature compensation,” said Stern. “This has benefits on the factory floor in compensating for thermal effects from tooling or the environment.”

As an example of these new temperature compensation strategies, Stern points to a solution developed to address the heat produced by a motor mounted to a tool changer. This heat flux can cause undesirable output in the Force/Torque data, according to Stern.

“As a result of the Mars Rover project we now have several different processes to apply on our standard industrial sensors to mitigate the effects of temperature change,” said Stern.

The redundancy requirements translated into a prototype of a Standalone Safety Rated Force/Torque sensor capable of meeting Performance Level d (PL-d) safety requirements.

This type of sensor can actively check its own health and provide extremely high-resolution data, allowing a large robot with a 500-kilogram payload handling automotive body parts to safely detect whether a human finger has been pinched.

ATI is also leveraging the work it did for Perseverance to inform some of its ongoing space projects. One particular project is a NASA tech demo targeting a moon rover for 2023, future Mars rovers, and a potential mission to Europa that would use sensors for drilling into ice.

Stern added: “The fundamental capability that we developed for the Perseverance Rover is scalable to different environments and different payloads for nearly any space application.”

For more information on ATI Industrial Automation, please visit https://www.ati-ia.com/.

Unlock Wireless Test Capabilities On Your RF Gear

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/unlock_wireless_test_capabilities_on_your_rf_gear

Discover how easy it is to update your instrument to test the latest wireless standards.

We’re offering 30-day software trials that evolve test capabilities on your signal analyzers and signal generators. Automatically generate or analyze signals for many wireless applications.

Choose from our most popular applications:

  • Bluetooth®
  • WLAN 802.11
  • Vector Modulation Analysis
  • And more

Get More From Your Automotive Electronics Test Investment

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/whitepaper/get_more_from_your_automotive_electronics_test_investment

Keysight Automotive

Explore the powerful software behind Keysight’s high-precision hardware and discover how to meet emerging automotive electronics test requirements while driving higher ROI from your existing hardware. Let Keysight help you cross the finish line ahead of your competition. Jump-start your automotive innovation today with a complimentary software trial.

How to Improve Threat Detection and Hunting in the AWS Cloud Using the MITRE ATT&CK Matrix

Post Syndicated from IEEE Spectrum Recent Content full text original https://spectrum.ieee.org/webinar/how_to_improve_threat_detection_and_hunting_in_the_aws_cloud

SANS and AWS Marketplace will discuss the exercise of applying MITRE’s ATT&CK Matrix to the AWS Cloud. They will also explore how to enhance threat detection and hunting in an AWS environment to maintain a strong security posture.