With distance learning, students may not have a professor nearby to help them set up and perform their labs. This leaves the student, the instruments, and the device under test at risk. Share this troubleshooting flyer with your EE students to help them navigate some common issues.
The Hyperloop transportation system consists of a constrained, low-pressure environment, usually realized as a tube or tunnel, that houses a dedicated rail which mechanically constrains energy-autonomous vehicles (called capsules or pods) carrying a given payload. Hyperloop capsules are expected to be self-propelled and can use the tube’s rail for guidance, magnetic levitation, and propulsion. With an average speed two to three times that of high-speed electric trains and a maximum speed approaching the speed of sound, the Hyperloop is expected to achieve average energy consumption in the range of 30–90 Wh/passenger/km and CO2 emissions in the range of 5–20 g CO2/passenger/km. A key aspect of achieving this performance is the optimal design of the capsule propulsion. A promising solution is the double-sided linear induction motor (DSLIM), whose high-speed performance is affected by material properties and geometrical factors.
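The quoted consumption band can be turned into a quick back-of-the-envelope trip estimate. The sketch below assumes a hypothetical 500 km route and a 28-passenger pod; neither figure comes from the text above.

```python
# Back-of-the-envelope check of the Hyperloop energy figures quoted above.
# The route length and pod capacity are illustrative assumptions.

def trip_energy_kwh(distance_km, passengers, wh_per_pax_km):
    """Total electrical energy for one trip, in kWh."""
    return distance_km * passengers * wh_per_pax_km / 1000.0

# Assumed 500 km route with a 28-passenger pod, using the 30-90 Wh band above.
low = trip_energy_kwh(500, 28, 30)   # optimistic end of the quoted range
high = trip_energy_kwh(500, 28, 90)  # pessimistic end

print(f"Energy per trip: {low:.0f}-{high:.0f} kWh")
```

At 420–1260 kWh per trip, the quoted band is of the same order as a single long-haul truck journey, which is what makes the efficiency claim notable.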
In this webinar, we describe how to model a DSLIM using the COMSOL Multiphysics® software to provide an accurate estimation of the thrust exerted by the motor. Furthermore, we illustrate how to carry out a simulation-driven optimization to find the best motor configuration in terms of maximum speed. The results of the simulations are compared with measurements carried out on an experimental test bench developed at the Swiss Federal Institute of Technology, Lausanne, within the context of the EPFLoop team’s participation in the 2019 SpaceX Hyperloop pod competition.
Join this webinar to learn how AWS customers are using automation and integrated threat intelligence to increase efficiency and scale their cloud security operations center (SOC).
In this webinar:
SANS and AWS Marketplace will explore real-world examples and offer practical guidance to help equip you with the needed visibility and efficiencies to scale. You will learn how to limit alert fatigue while enhancing SOC productivity through automating actionable insights and removing repetitive manual tasks.
Attendees of this webinar will learn how to:
Structure a cloud SOC to scale through technology
Integrate threat intelligence into security workflows
Utilize automated triaging and action playbooks
Leverage AWS services and seller solutions in AWS Marketplace to help achieve these goals
Master Bond EP30NS is a two-component epoxy system with moderate viscosity and good flow. It contains a nanosilica filler that gives it a distinctive property profile: much higher abrasion resistance than a typical epoxy and much lower linear shrinkage upon cure. EP30NS passes ASTM E595 for NASA low outgassing.
To obtain optimal properties, cure overnight at room temperature, followed by 2-3 hours at 150-200°F. The epoxy has a moderate mixed viscosity ranging from 25,000 to 45,000 cps. It is optically clear, with a refractive index of 1.56. EP30NS has been independently tested per ASTM D4060-14 for abrasion resistance over 1,000 cycles and exhibited a weight loss of only 18.3 mg. It is thus able to withstand exposure to scuffing, gouging, scraping, scratching, and wear.
This system has excellent electrical insulation, making it well suited for small potting applications. It forms dimensionally stable, rigid bonds. It bonds well to metals, glass, ceramics, composites, rubbers, and plastics. It is chemically resistant to water, fuels, oils, acids and solvents. The service temperature range is from -60°F to +300°F. This system is recommended for high tech applications in the aerospace, electronic, optical, opto-electronic and specialty OEM industries. It is available in both standard packaging and specialty gun dispenser packaging.
The two men, who were the key architects behind these fundamental technologies for 5G, have been leading one of the premier research institutes in mobile telephony since 2012: NYU Wireless, a part of NYU’s Tandon School of Engineering.
Ted Rappaport is the founding director of NYU Wireless, and one of the key researchers in the development of mmWave technology. Rappaport also served as a key thought leader for 5G, planting a flag nearly a decade ago by arguing that mmWave would be a key enabling technology for the next generation of wireless. His earlier work at two other wireless centers he founded, at Virginia Tech and The University of Texas at Austin, laid the early groundwork that helped NYU Wireless catapult into one of the premier wireless institutions in the world.
These two researchers, who were so instrumental in developing the technologies that have enabled 5G, are now turning their attention to the next generation of mobile communications, and, according to them both, we are facing some pretty steep technical challenges to realizing a next generation of mobile communications.
“Ten years ago, Ted was already pushing mobile mmWave, and I at Bell Labs was pushing massive MIMO,” said Marzetta. “So we had two very promising concepts ready for 5G. The research concepts that the wireless community is working on for 6G are not as mature at this time, making our focus on 6G even more important.”
“In this paper, we said for the first time that 6G is going to be in the sub-terahertz frequencies,” said Rappaport. “We also suggested the idea of wireless cognition where human thought and brain computation could be sent over wireless in real time. It’s a very visionary look at something. Our phones, which right now are flashlights, emails, TV browsers, calendars, are going to become much more.”
While Rappaport feels confident that they have the right vision for 6G, he is worried about the lack of awareness of how critical it is for the US Government funding agencies and companies to develop the enabling technologies for its realization. In particular, both Rappaport and Marzetta are concerned about the economic competitiveness of the US and the funding challenges that will persist if it is not properly recognized as a priority.
“These issues of funding and awareness are critical for research centers, like NYU Wireless,” said Rappaport. “The US needs to get behind NYU Wireless to foster these ideas and create these cutting-edge technologies.”
With this funding support, Rappaport argues, teaching research institutes like NYU Wireless can create the engineers that end up going to companies and making technologies like 6G become a reality. “There are very few schools in the world that are even thinking this far ahead in wireless; we have the foundations to make it happen,” he added.
Both Rappaport and Marzetta also believe that making national centers of excellence in wireless could help to create an environment in which students could be exposed constantly to a culture and knowledge base for realizing the visionary ideas for the next generation of wireless.
“The Federal government in the US needs to pick a few winners for university centers of excellence to be melting pots, to be places where things are brought together,” said Rappaport. “The Federal government has to get together and put money into these centers to allow them to hire talent, attract more faculty, and become comparable to what we see in other countries where huge amounts of funding are going in to pick winners.”
While research centers, like NYU Wireless, get support from industry to conduct their research, Rappaport and Marzetta see that a bump in Federal funding could serve as both amplification and a leverage effect for the contribution of industrial affiliates. NYU Wireless currently has 15 industrial affiliates with a large number coming from outside the US, according to Rappaport.
“Government funding could get more companies involved by incentivizing them through a financial multiplier,” added Rappaport.
Of course, 6G is not simply about setting out a vision and attracting funding, but also tackling some pretty big technical challenges.
Both men believe that we will need to see the development of new forms of MIMO, such as holographic MIMO, to enable more efficient use of the sub 6 GHz spectrum. Also, solutions will need to be developed to overcome the blockage problems that occur with mmWave and higher frequencies.
Fundamental to these technology challenges is accessing new frequency spectrums so that a 6G network operating in the sub-terahertz frequencies can be achieved. Both Rappaport and Marzetta are confident that technology will enable us to access even more challenging frequencies.
“There’s nothing technologically stopping us right now from 30, and 40, and 50 gigahertz millimeter wave, even up to 700 gigahertz,” said Rappaport. “I see the fundamentals of physics and devices allowing us to take us easily over the next 20 years up to 700 or 800 gigahertz.”
Marzetta added that there is much more that can be done in the scarce and valuable sub-6GHz spectrum. While massive MIMO is the most spectrally efficient wireless scheme ever devised, it is based on extremely simplified models of how antennas create electromagnetic signals that propagate to another location, according to Marzetta, adding, “No existing wireless system or scheme is operating close at all to limits imposed by nature.”
While expanding the spectrum of frequencies and making even better use of the sub-6GHz spectrum are the foundation for the realization of future networks, Rappaport and Marzetta also expect that we will see increased leveraging of AI and machine learning. This will enable the creation of intelligent networks that can manage themselves with much greater efficiency than today’s mobile networks.
“Future wireless networks are going to evolve with greater intelligence,” said Rappaport. An example of this intelligence, according to Rappaport, is the new way in which the Citizens Broadband Radio Service (CBRS) spectrum is going to be used in a spectrum access server (SAS) for the first time ever.
“It’s going to be a nationwide mobile system that uses these spectrum access servers that mobile devices talk to in the 3.6 gigahertz band,” said Rappaport. “This is going to allow enterprise networks to be a cross of old licensed cellular and old unlicensed Wi-Fi. It’s going to be kind of somewhere in the middle. This serves as an early indication of how mobile communications will evolve over the next decade.”
These intelligent networks will become increasingly important when 6G moves towards so-called cell-less (“cell-free”) networks.
Currently, mobile network coverage is provided through hundreds of roughly circular cells spread out across an area. Now with 5G networks, each of these cells will be equipped with a massive MIMO array to serve the users within the cell. But with a cell-less 6G network the aim would be to have hundreds of thousands, or even millions, of access points, spread out more or less randomly, but with all the networks operating cooperatively together.
“With this system, there are no cell boundaries, so as a user moves across the city, there’s no handover or handoff from cell to cell because the whole city essentially constitutes one cell,” explained Marzetta. “All of the people receiving mobile services in a city get it through these access points, and, in principle, every user is served by every access point all at once.”
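A toy calculation can make the contrast concrete. The sketch below compares a user served by one central macro site against a user served by many scattered access points; the area size, AP count, path-loss exponent, and the idealisation of cooperation as a simple power sum are all made-up assumptions, not 6G design values.

```python
import math
import random

random.seed(0)

def rx_power(d, tx_mw=100.0, exponent=3.5):
    """Simple power-law path loss; returns received power in mW."""
    return tx_mw / (d ** exponent)

city = 1000.0          # 1 km x 1 km toy area
user = (300.0, 700.0)  # arbitrary user position

# Cellular: one macro site at the centre of the area.
d_macro = math.dist(user, (city / 2, city / 2))
p_cellular = rx_power(d_macro)

# Cell-free: 200 randomly placed APs, each with 1/200 of the macro power.
# Cooperative service is idealised here as summing the received powers.
aps = [(random.uniform(0, city), random.uniform(0, city)) for _ in range(200)]
p_cellfree = sum(rx_power(math.dist(user, ap), tx_mw=100.0 / 200) for ap in aps)

print(f"cellular: {p_cellular:.3e} mW, cell-free: {p_cellfree:.3e} mW")
```

Because a few of the scattered access points are inevitably close to the user, the aggregate signal in the cell-free case is dominated by short links, which is the intuition behind the architecture.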
One of the obvious challenges of this cell-less architecture is simply the economics of installing so many access points all over a city. All of the signals must be carried between each access point and some central point that does the computing and number crunching.
While this all sounds daunting in terms of traditional mobile networks, it becomes conceptually far more approachable when you consider that the Internet of Things (IoT) will drive the creation of this cell-less network.
“We’re going to go from 10 or 20 devices today to hundreds of devices around us that we’re communicating with, and that local connectivity is what will drive this cell-less world to evolve,” said Rappaport. “This is how I think a lot of 5G and 6G use cases in this wide swath of spectrum are going to allow these low-power local devices to live and breathe.”
To realize all of these technologies, including intelligent networks, cell-less networks, expanded radio frequencies, and wireless cognition, the key factor will be training future engineers.
To this issue, Marzetta noted: “Wireless communications is a growing and dynamic field that is a real opportunity for the next generation of young engineers.”
Each year, at least 50% of tapeouts are late, with physical and circuit verification closure a significant contributing factor. Mentor incorporated the know-how of our industry-leading Calibre nmLVS sign-off tool with lessons learned from customers to create an innovative smart engine specifically engineered to help design teams find and fix high-impact systemic errors early in the design flow. As part of our growing suite of early-stage design verification technologies, the Calibre nmLVS-Recon tool enables designers to accelerate early-stage design analysis and debug cycles, and reduce the time needed to reach tapeout. We explain the concept behind our innovative technology, and introduce the first Calibre nmLVS-Recon use model – short isolation analysis.
Sensitive electronic components mounted on a circuit board often require protection from exposure to harsh environments. While there are several ways to accomplish this, the dam and fill method offers many benefits. Dam-and-filling entails dispensing the damming material around the area to be encapsulated, thereby restricting the flow of the fill from spreading to other parts of the board.
By using a damming compound such as Supreme 3HTND-2DM-1, you can create a barrier around the component. Then, with a flowable filler material like EP3UF-1, the component can be completely covered for protection.
To start, apply the damming compound, Supreme 3HTND-2DM-1, around the component. Supreme 3HTND-2DM-1 is readily dispensed to create a structurally sound barrier. This material will not run and will cure in place in 5-10 minutes at 300°F, in essence forming a dam.
After the damming compound has been applied and cured, a filling compound such as EP3UF-1 is dispensed to fill the area inside the dam and cover the component to be protected. EP3UF-1 is a specialized, low viscosity one part system with a filler that has ultra small particle sizes, which enables it to flow even in tiny spaces. This system cures in 10-15 minutes at 300°F and features low shrinkage and high dimensional stability once cured.
Both Supreme 3HTND-2DM-1 and EP3UF-1 are thermally conductive, electrically insulating compounds and are available for use in syringes for automated or manual dispensing.
Despite being a two step process, dam and fill offers the following advantages over glob topping:
Flow of the filling compound is controlled and restricted
It can be applied to larger sections of the board
Filling compound flows better than a glob top, allowing better protection underneath and around the component
Stoyanovich believes that part of the issue is the strikingly good performance of machine learning (ML) algorithms on tasks ranging from game playing to perception to medical diagnosis, combined with the fact that it is often hard to understand why these algorithms do so well and why they sometimes fail. But she is concerned that even simple rule-based algorithms, such as score-based rankers — which compute a score for each job applicant, sort applicants by score, and then suggest interviewing the top-scoring three — can have discriminatory results. “The devil is in the data,” says Stoyanovich.
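A score-based ranker of the kind described above fits in a few lines, which is exactly the point: the algorithm itself is trivially simple and transparent, so any discrimination must come from the scores it is fed. The field names and score values below are hypothetical.

```python
# Minimal score-based ranker: score each applicant, sort descending,
# shortlist the top k. If the scores encode historical bias, the ranker
# reproduces that bias faithfully -- "the devil is in the data."

def shortlist(applicants, k=3):
    """Rank applicants by score (descending) and return the top k."""
    ranked = sorted(applicants, key=lambda a: a["score"], reverse=True)
    return ranked[:k]

applicants = [
    {"name": "A", "score": 91},
    {"name": "B", "score": 88},
    {"name": "C", "score": 95},
    {"name": "D", "score": 77},
]

top = shortlist(applicants)
print([a["name"] for a in top])  # ['C', 'A', 'B']
```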
As an illustration of this point, in a comic book that Stoyanovich produced with Falaah Arif Khan entitled “Mirror, Mirror”, it is made clear that when we ask AI to move beyond games, like chess or Go, in which the rules are the same irrespective of a player’s gender, race, or disability status, and look for it to perform tasks that allocate resources or predict social outcomes — such as deciding who gets a job or a loan, or which sidewalks in a city should be fixed first — we quickly discover that embedded in the data are social, political, and cultural biases that distort results.
In addition to societal bias in the data, technical systems can introduce additional skew as a result of their design or operation. Stoyanovich explains that if, for example, a job application form has two options for sex, ‘male’ and ‘female,’ a female applicant may choose to leave this field blank for fear of discrimination. An applicant who identifies as non-binary will also probably leave the field blank. But if the system works under the assumption that sex is binary and post-processes the data, then the missing values will be filled in. The most common method for this is to set the field to the value that occurs most frequently in the data, which will likely be ‘male’. This introduces systematic skew in the data distribution, and will make errors more likely for these individuals.
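The post-processing step Stoyanovich describes is commonly implemented as mode imputation: every missing value is replaced with the most frequent observed value. A small sketch, with fabricated records, shows how all blanks silently become ‘male’:

```python
from collections import Counter

# Mode imputation on a binary 'sex' field, as described above.
# The records are made up for illustration.

records = [
    {"id": 1, "sex": "male"},
    {"id": 2, "sex": "male"},
    {"id": 3, "sex": "female"},
    {"id": 4, "sex": None},   # left blank by the applicant
    {"id": 5, "sex": None},   # e.g. a non-binary applicant
]

observed = [r["sex"] for r in records if r["sex"] is not None]
mode = Counter(observed).most_common(1)[0][0]  # 'male' in this sample

for r in records:
    if r["sex"] is None:
        r["sex"] = mode  # every blank becomes 'male' -> systematic skew
```

After imputation, the applicants who deliberately left the field blank are all recorded as ‘male’, which is exactly the systematic skew the passage warns about.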
This example illustrates that technical bias can arise from an incomplete or incorrect choice of data representation. “It’s been documented that data quality issues often disproportionately affect members of historically disadvantaged groups, and we risk compounding technical bias due to data representation with pre-existing societal bias for such groups,” adds Stoyanovich.
This raises a host of questions, according to Stoyanovich, such as: How do we identify ethical issues in our technical systems? What types of “bias bugs” can be resolved with the help of technology? And what are some cases where a technical solution simply won’t do? As challenging as these questions are, Stoyanovich maintains we must find a way to reflect them in how we teach computer science and data science to the next generation of practitioners.
“Virtually all of the departments or centers at Tandon do research and collaborations involving AI in some way, whether artificial neural networks, various other kinds of machine learning, computer vision and other sensors, data modeling, AI-driven hardware, etc.,” says Jelena Kovačević, Dean of the NYU Tandon School of Engineering. “As we rely more and more on AI in everyday life, our curricula are embracing not only the stunning possibilities in technology, but the serious responsibilities and social consequences of its applications.”
Stoyanovich quickly realized as she looked at this issue as a pedagogical problem that professors who were teaching the ethics courses for computer science students were not computer scientists themselves, but instead came from humanities backgrounds. There were also very few people who had expertise in both computer science and the humanities, a fact that is exacerbated by the “publish or perish” motto that keeps professors siloed in their own areas of expertise.
“While it is important to incentivize technical students to do more writing and critical thinking, we should also keep in mind that computer scientists are engineers. We want to take conceptual ideas and build them into systems,” says Stoyanovich. “Thoughtfully, carefully, and responsibly, but build we must!”
But if computer scientists need to take on this educational responsibility, Stoyanovich believes that they will have to come to terms with the reality that computer science is in fact limited by the constraints of the real world, like any other engineering discipline.
“My generation of computer scientists was always led to think that we were only limited by the speed of light. Whatever we can imagine, we can create,” she explains. “These days we are coming to better understand how what we do impacts society and we have to impart that understanding to our students.”
Kovačević echoes this cultural shift in how we must start to approach the teaching of AI. Kovačević notes that computer science education at the collegiate level typically keeps the tiller set on skill development and exploration of the technological scope of computer science — and on an unspoken cultural norm in the field that since anything is possible, anything is acceptable. “While exploration is critical, awareness of consequences must be, as well,” she adds.
Once the first hurdle — understanding that computer science has constraints in the real world — is cleared, Stoyanovich argues that we will next have to confront the specious idea that AI is the tool that will lead humanity into some kind of utopia.
“We need to better understand that whatever an AI program tells us is not true by default,” says Stoyanovich. “Companies claim they are fixing bias in the data they present into these AI programs, but it’s not that easy to fix thousands of years of injustice embedded in this data.”
In order to include these fundamentally different approaches to AI and how it is taught, Stoyanovich has created a new course at NYU Tandon entitled Responsible Data Science. This course has now become a requirement for students getting a BA degree in data science at NYU. Later, she would like to see the course become a requirement for graduate degrees as well. In the course, students are taught both “what we can do with data” and, at the same time, “what we shouldn’t do.”
Stoyanovich has also found it exciting to engage students in conversations surrounding AI regulation. “Right now, for computer science students there are a lot of opportunities to engage with policy makers on these issues and to get involved in some really interesting research,” says Stoyanovich. “It’s becoming clear that the pathway to seeing results in this area is not limited to engaging industry but also extends to working with policy makers, who will appreciate your input.”
In these efforts towards engagement, Stoyanovich and NYU are establishing the Center for Responsible AI, to which IEEE-USA offered its full support last year. One of the projects the Center for Responsible AI is currently engaged in is a new law in New York City to amend its administrative code in relation to the sale of automated employment decision tools.
“It is important to emphasize that the purpose of the Center for Responsible AI is to serve as more than a colloquium for critical analysis of AI and its interface with society, but as an active change agent,” says Kovačević. “What that means for pedagogy is that we teach students to think not just about their skill sets, but their roles in shaping how artificial intelligence amplifies human nature, and that may include bias.”
Stoyanovich notes: “I encourage the students taking Responsible Data Science to go to the hearings of the NYC Committee on Technology. This keeps the students more engaged with the material, and also gives them a chance to offer their technical expertise.”
Yujin Robot recently launched a new 3D LiDAR for indoor service robots, AGVs/AMRs, and smart factories. The YRL3 series is a line of precise laser sensors for vertical and horizontal scanning to detect environments or objects. Designed for indoor applications, the Yujin Robot YRL3 series uses an innovative 3D scanning LiDAR with a 270° (horizontal) x 90° (vertical) dynamic field of view as a single channel. Its fundamental principle is direct ToF (time of flight), measuring distances to the surroundings. The YRL3 collects useful data including ranges, angles, intensities, and Cartesian coordinates (x, y, z). It supports real-time vertical right-angle adjustment and comes with a powerful software package for autonomous driving devices.
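The direct time-of-flight principle reduces to one formula: the sensor emits a pulse, times the round trip, and the distance is c·t/2. A minimal sketch, with an illustrative timing value:

```python
# Direct time-of-flight ranging as described above: distance = c * t / 2,
# where t is the measured round-trip time of the laser pulse.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s):
    """Distance to a target from a round-trip time-of-flight measurement."""
    return C * round_trip_s / 2.0

# A ~66.7 ns round trip corresponds to a target roughly 10 m away.
print(f"{tof_distance_m(66.7e-9):.2f} m")
```

The nanosecond-scale timings involved are why ToF LiDARs need very fast detection electronics: 1 ns of timing error is about 15 cm of range error.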
“In recent years, our product lineup expanded to include models for the Fourth Industrial Revolution,” shares the marketing team of Yujin Robot. These models are Kobuki, the ROS reference research robot platform used by robotics research labs around the world; the Yujin LiDAR range-finding scanning sensor for LiDAR-based autonomous driving; and the AMS (Autonomous Mobility Solution) for customized autonomous driving. The company continues to push the boundaries of robotics and artificial intelligence, developing game-changing autonomous solutions that give companies around the world an edge over the competition.
The impact of increasingly powerful electronics on our society cannot be overstated, but these electronics produce significant heat that must be dissipated to prevent premature component failure. Electrical engineers frequently seek to increase the power of critical components, and keeping those components cool represents a significant thermal-management challenge for the engineers who design the electronics. The task becomes even more difficult when the cooling system relies on natural convection instead of forced convection from fans, given the relatively short life expectancy of fans.
One solution to this engineering challenge is to use multiphysics software tools to improve the accuracy of the engineer’s calculations in comparison to analytic and single-physics simulation solutions. These simulations include heat generated by the component, airflow around the component, and radiative heat transfer between the component and the surroundings. Heat generation due to resistive heating in the board can be included with heat generated from components to determine the heat generated within the system. Airflow through the system due to either forced or natural convection can also be analyzed. For many systems, radiation must be considered for accurate temperature predictions due to the large amount of heat transfer that occurs via this mechanism in many electronic designs.
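A rough hand calculation shows why radiation cannot be neglected in natural-convection designs. The sketch below compares convective and radiative dissipation from a small plate; the heat-transfer coefficient, emissivity, and dimensions are illustrative assumptions, not values from the webinar.

```python
# Compare natural-convection and radiative heat dissipation from a small
# plate. All parameter values are illustrative assumptions.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def convective_w(h, area_m2, t_surface_k, t_ambient_k):
    """Newton's law of cooling: q = h * A * (Ts - Ta)."""
    return h * area_m2 * (t_surface_k - t_ambient_k)

def radiative_w(emissivity, area_m2, t_surface_k, t_ambient_k):
    """Net radiative exchange with large isothermal surroundings."""
    return emissivity * SIGMA * area_m2 * (t_surface_k**4 - t_ambient_k**4)

area = 0.01                 # 10 cm x 10 cm plate
t_s, t_a = 353.15, 293.15   # 80 C surface, 20 C ambient

q_conv = convective_w(5.0, area, t_s, t_a)  # h ~ 5 W/m^2K, natural convection
q_rad = radiative_w(0.9, area, t_s, t_a)    # painted/anodised surface

print(f"convection: {q_conv:.1f} W, radiation: {q_rad:.1f} W")
```

Under these assumptions the radiative term (about 4.2 W) actually exceeds the convective term (3.0 W), which is why an accurate simulation must include both mechanisms.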
In this presentation, guest speakers Kyle Koppenhoefer and Joshua Thomas from AltaSim Technologies will discuss the development of an electronics cooling problem subjected to a complex thermal environment. The webinar will also include a live demo in the COMSOL Multiphysics® software and a Q&A session.
Planning of satellite communication links or even whole networks is a very demanding task. In the first part of this webinar, we will present our software for satellite link planning that supports the user in a convenient way but takes into the account all relevant sources of impact. In the second part of the webinar, we will demonstrate our solution for monitoring satellite networks, either at one site or distributed worldwide. In addition, we will put focus on the identification of interference coming from unwanted satellite signals or terrestrial sources. We will also show how to make interfering signals that lie underneath the wanted satellite carrier visible.
Attendees of the webinar will learn about:
• The sources of impact affecting satellite links
• How to plan satellite links or whole satellite networks
• The best way to monitor your satellite connections in an automated and reliable way
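The core of any link-planning calculation is the link budget, where gains and losses are summed in decibels. The sketch below evaluates a simplified downlink budget; the EIRP, antenna gain, frequency, and GEO slant range are illustrative assumptions, not values from the webinar.

```python
import math

# Simplified satellite downlink budget: received power in dBW is the
# satellite EIRP minus free-space path loss plus receive antenna gain.
# All parameter values are illustrative assumptions.

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

eirp_dbw = 52.0        # assumed satellite EIRP
rx_gain_dbi = 39.0     # assumed ground-station antenna gain
d = 38_000_000.0       # approximate GEO slant range, m
f = 12e9               # Ku-band downlink, Hz

loss = fspl_db(d, f)
rx_power_dbw = eirp_dbw - loss + rx_gain_dbi
print(f"FSPL: {loss:.1f} dB, received power: {rx_power_dbw:.1f} dBW")
```

A real planning tool adds atmospheric, rain, and pointing losses on top of the free-space term, which is why the webinar stresses accounting for all relevant sources of impact.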
In July, NASA launched the most sophisticated rover the agency has ever built: Perseverance. https://mars.nasa.gov/mars2020/ Scheduled to land on Mars in February 2021, Perseverance will be able to perform unique research into the history of microbial life on Mars in large part due to its robotic arms. To achieve this robotic capability, NASA needed to call upon innovation-driven contractors to make such an engineering feat a reality.
One of the companies that NASA enlisted to help develop Perseverance was ATI Industrial Automation. https://www.ati-ia.com/ NASA looked to have ATI adapt the company’s own Force/Torque Sensor to enable the robotic arm of Perseverance to operate in the environment of space. ATI Force/Torque sensors were initially developed to enable robots and automation systems to sense the forces applied while interacting with their environment in operating rooms or factory floors.
However, the environment of space presented unique engineering challenges for ATI’s Force/Torque Sensor. The extreme environment and the need for redundancy to ensure that any single failure wouldn’t compromise the sensor function were the key challenges the ATI engineers faced, according to Ian Stern, Force/Torque Sensor Product Manager at ATI. https://www.linkedin.com/in/ianhstern/
“ATI’s biggest technical challenge was developing the process and equipment needed to perform the testing at the environmental limits,” said Stern. “The challenges start when you consider the loads that the sensor sees during the launch of the Atlas 5 rocket from Earth. The large G forces cause the tooling on the end of the sensor to generate some of the highest loads that the sensor sees over its life.”
Once on Mars the sensor must be able to accurately and reliably measure force/torques in temperatures ranging from -110° to +70° Celsius (C). This presents several challenges because of how acutely temperature influences the accuracy of force measurement devices. To meet these demands, ATI developed the capability to calibrate the sensors at -110°C. “This required a lot of specialized equipment for achieving these temperatures while making it safe for our engineers to perform the calibration process,” added Stern.
In addition to the harsh environment, redundancy strategies are critical for a sensor technology on a space mission. While downtime on the factory floor can be costly, a component failure on Mars can render the entire mission worthless since there are no opportunities for repairs.
This need for a completely reliable product meant that ATI engineers had to develop their sensor so that it was capable of detecting failures in its measurements, as well as accurately measuring forces and torques should there be multiple failures on the measurement signals. ATI developed a patented process for achieving this mechanical and electrical redundancy.
All of this effort to engineer a sensor for NASA’s Mars mission may enable a whole new generation of space exploration, but it’s also paying immediate dividends for ATI’s more terrestrial efforts in robotic sensors.
“The development of a sensor for the temperatures on Mars has helped us to develop and refine our process of temperature compensation,” said Stern. “This has benefits on the factory floor in compensating for thermal effects from tooling or the environment.”
As an example of these new temperature compensation strategies, Stern points to a solution developed to address the heat produced by a motor mounted to a tool changer. This heat flux can cause undesirable output on the Force/Torque data, according to Stern.
“As a result of the Mars Rover project we now have several different processes to apply on our standard industrial sensors to mitigate the effects of temperature change,” said Stern.
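In its simplest form, temperature compensation subtracts an estimated thermal drift from the raw reading. The sketch below shows a linear correction about a calibration temperature; the drift coefficient and values are made up for illustration and are not ATI’s actual (patented) method.

```python
# Simple linear thermal-drift compensation for a force reading: subtract
# a drift term proportional to the temperature offset from the calibration
# point. The drift coefficient is an illustrative, made-up value.

def compensate(raw_force_n, temp_c, cal_temp_c=20.0, drift_n_per_c=0.05):
    """Remove the estimated thermal drift from a raw force reading."""
    return raw_force_n - drift_n_per_c * (temp_c - cal_temp_c)

# A 100 N reading taken at 60 C is corrected down by 2 N.
print(compensate(100.0, 60.0))  # 98.0
```

Real sensors typically measure temperature at several points and apply per-axis, possibly nonlinear, corrections, but the principle is the same.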
The redundancy requirements translated into a prototype of a Standalone Safety Rated Force/Torque sensor capable of meeting Performance Level d (PL-d) safety requirements.
This type of sensor can actively check its health and provide extremely high-resolution data allowing a large, 500 kilogram payload robot handling automotive body parts to safely detect if a human finger was pinched.
ATI is also leveraging the work it did for Perseverance to inform some of its ongoing space projects, including a NASA tech demo targeting a moon rover for 2023, future Mars rovers, and a potential mission to Europa that would use sensors for drilling into ice.
Stern added: “The fundamental capability that we developed for the Perseverance Rover is scalable to different environments and different payloads for nearly any space application.”
For more information, visit ATI Industrial Automation at https://www.ati-ia.com/.
Explore the powerful software behind Keysight’s high-precision hardware and discover how to meet emerging automotive electronics test requirements, while driving higher ROI from your existing hardware. Let Keysight help you cross the finish line ahead of your competition. Jump-start your automotive innovation today with a complimentary software trial
SANS and AWS Marketplace will discuss the exercise of applying MITRE’s ATT&CK Matrix to the AWS Cloud. They will also explore how to enhance threat detection and hunting in an AWS environment to maintain a strong security posture.
The Tower of Pisa is indeed a famous monument. Yet it is also a monumental error of civil engineering. Built in 1173 on a flood plain with no proper foundations, the white marble tower began tipping towards its southern side even before it was completed. Its peculiar inclination stands as a spectacular warning to builders around the world.
Yet people have studied the ground beneath their feet since long before the 12th century, ever since they started extracting rock, building houses and bridges, and digging irrigation systems. At first purely empirical, soil investigation has been rationalised since the 17th century, giving rise to geotechnics, a technoscience combining geology and geomechanics.
Today, the most frequently used measurement instrument in geotechnics is the penetrometer. “Imagine it as a giant hydraulic press that drives a measurement cone into the ground…” explains Paolo Bruzzi, Pagani Geotechnical’s sales manager. The Italian company, whose factory is based in Piacenza, near Milan, has become a global leader in the field of geotechnical equipment.
Penetrometers deliver high-fidelity soil profiles: “Our equipment detects layers – sand, clay or other – as thin as 10-15 cm.” Enough to make reliable estimates of soil behaviour when building a road or a bridge, digging foundations or simply setting up the pillar of a ski lift.
As with all measurement instruments, the quality of penetrometers depends on their reliability. “The system verifies its own accuracy after every measurement,” explains Paolo Bruzzi. “Incoherent data would immediately signal that the cone had been damaged. So, we can be sure that our measurements are always absolutely precise.” Furthermore, the cones require mandatory calibration every year, a further guarantee of correct measurement. Materials and processes are standardised de facto on an international level. The cone sizes, the forces applied, the penetration speed… everything is defined to enable traceability, repeatability and data sharing.
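The kind of post-measurement self-check Bruzzi describes can be sketched in a few lines. This is an illustrative assumption, not Pagani’s actual firmware logic; the function name and threshold values below are hypothetical, chosen only to show how out-of-range readings could flag a damaged cone:

```python
# Hedged sketch of a CPT data plausibility check (not Pagani's real system).
# Readings outside physically plausible ranges suggest cone damage.
# All thresholds are illustrative assumptions, not values from the article.
def cone_readings_plausible(tip_resistance_mpa, sleeve_friction_kpa):
    """Return True if the cone readings look physically coherent."""
    checks = [
        0.0 <= tip_resistance_mpa <= 120.0,    # tip resistance qc beyond ~120 MPa is suspect
        0.0 <= sleeve_friction_kpa <= 1000.0,  # sleeve friction fs outside range hints at damage
    ]
    # A friction ratio (fs/qc) above ~10 % is rarely physical in natural soils
    if tip_resistance_mpa > 0:
        ratio_percent = (sleeve_friction_kpa / 1000.0) / tip_resistance_mpa * 100.0
        checks.append(ratio_percent <= 10.0)
    return all(checks)

# Typical sand reading passes; a negative tip resistance is flagged
ok = cone_readings_plausible(10.0, 100.0)    # True
bad = cone_readings_plausible(-1.0, 50.0)    # False
```

Such a check costs nothing to run after every push, which is why a self-verifying acquisition system can catch a damaged cone immediately rather than after a day of bad data.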
Penetrometer tests can be used for other types of measurements as well, in particular seismic measurements. “In such cases, we stop penetrating after every metre and create a seismic wave from the surface,” explains Bruzzi. “Its amplitude and propagation speed are measured by a sensor on the cone, which makes it possible to evaluate the soil’s behaviour in case of earthquakes.”
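The calculation behind the seismic test Bruzzi describes is simple: the shear wave’s velocity between two test depths is the extra ray-path length divided by the difference in arrival times. A minimal sketch, assuming a straight ray path and an illustrative source offset; the function name, offset and example times are assumptions, not data from the article:

```python
# Sketch of a seismic CPT (SCPT) pseudo-interval velocity calculation.
# A wave is generated at the surface after each metre of penetration and
# its arrival time is recorded by the sensor on the cone.
import math

def shear_wave_velocity(depth1_m, time1_s, depth2_m, time2_s, source_offset_m=0.5):
    """Pseudo-interval Vs between two depths, assuming straight ray paths."""
    path1 = math.hypot(depth1_m, source_offset_m)  # slant distance, source to cone at depth 1
    path2 = math.hypot(depth2_m, source_offset_m)  # slant distance, source to cone at depth 2
    return (path2 - path1) / (time2_s - time1_s)   # velocity in m/s

# Illustrative arrivals at 5 m and 6 m depth give a soft-soil Vs of roughly 190 m/s
vs = shear_wave_velocity(5.0, 0.0262, 6.0, 0.0315)
```

Shear wave velocity is the key input for estimating how the soil column will amplify or damp ground motion, which is exactly the earthquake behaviour Bruzzi mentions.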
Anecdotally, the soft, “elastic” soil that caused the Tower of Pisa to tip has also isolated the structure from seismic shocks, protecting it through several earthquakes.
The instant results obtained by the penetrometers have greatly contributed to the popularity of these instruments. Carried out in situ, the tests do not require any soil sampling, nor waiting for laboratory analysis results. “They disturb the soil much less than core drilling, so they are less likely to influence the results,” says Paolo Bruzzi.
Whether disturbed or not, ground is not easy to deal with. The equipment must exert enormous force to drive the cone in. “In the past, the only solution was to use heavy-duty trucks of up to 20 tons,” recalls Bruzzi. “Such trucks are still used in certain cases; they usually cost in excess of 400,000 euros and require a heavy-vehicle driver, an entire team and, since the measurements need to be carried out vertically, a large, flat piece of land…”
In a nutshell, a costly and constraining solution. The idea of developing an alternative is how the story of Pagani Geotechnical began.
It all started back in the seventies in Italy. As building requirements were being strengthened, Ermanno Pagani created his geotechnical consulting company. Testing became widespread, and the entrepreneur realised that engineers were increasingly using heavy trucks for projects far smaller than bridges or blocks of flats, such as family homes. He wanted equipment better proportioned to the job. Wouldn’t it be possible to build a penetrometer capable of precisely analysing the first 20-25 m of soil (deep enough for a large number of projects), yet more compact, easier to use and far less costly than geotechnical trucks? Finding nothing on the market, he developed his own equipment. It attracted his customers’ attention, he saw the potential market, and he launched the business: Pagani Geotechnical stopped consulting and became a manufacturer. Its first penetrometers were sold in 1983.
A year later, the company launched its TG 73-200 model, a modular and mobile device. Its mast can be tilted forward and backward, enabling measurement even on sloping terrain. It anchors itself automatically into the ground so that it can exert the necessary thrust, in spite of its modest 3 tons. Handling, anchoring and measurements are automated to such an extent that only a single operator is needed to carry out the tests.
Pagani has put a particular accent on the robustness of the product. “The TG 73-200 was built to be indestructible” laughs Bruzzi. “It withstands all types of “abuse” – very difficult terrain or heavy-handed, clumsy operators!”
Thanks to these “over the top” characteristics, the 73-200 has remained Pagani’s high-end model, selling around five units a year. “Its customers are large companies that require no-compromise performance for some highly demanding applications.” For other applications, Pagani Geotechnical has taken another step forward.
The TG 63-150, even easier to use, was launched in 1989. Slightly bigger than 1 m by 2 m and weighing only a ton, it can be transported in a van by the engineer himself (no more need for a truck and a truck driver), who can then carry out the measurements on his own. A first in its field, it simplified the tests and cut costs considerably. Its price (44,000 euros, half the price of a 73-200 and close to one tenth of a truck) broadened the client base: medium-sized companies, consultancy firms, universities, laboratories…
“The 63-150 was the first of its kind,” says Paolo Bruzzi. “It was an immediate success. With 800 units sold in over 70 countries, it has even become the best-selling compact penetrometer in the world.” It remains Pagani’s best-seller, with over sixty units sold every year.
The TG 30-20 and 63-100 completed Pagani’s range of penetrometers. The Italian company, still managed by its founder, employs 25 people. Its factory produces between 70 and 80 machines a year and its 800 customers come from almost 90 countries.
Apart from the engines and hydraulic systems, everything is developed and produced “in-house”: accessories, electronic cones, seismic modules, power units… Even its data acquisition systems, including the new CPT AS, launched this spring, fully fitted with LEMO connectors. “This watertight system needs to operate on all terrain, from snow-covered northern countries to the Amazonian rainforest” explains Bruzzi. “We have chosen IP65 certified LEMO connectors for their resistance and compactness, as well as for aesthetic reasons – the excellence of our solutions also derives from design!”
Pagani’s equipment is robust (its penetrometers are used for “an average of more than 20 years”). The core technical components remain stable (“nothing better has been found for exploring the soil!”). Improvements are made essentially to the electronics and the accessories: two or three upgrades a year optimise measurement precision and ease of use, safety is reinforced to follow the continuous evolution of regulations, and applications have gone mobile.
“Many innovations arise from our partnerships with universities and research centres in Italy, Brazil, England or other countries, and, obviously, from feedback from our 800 customers from almost 90 countries, who use our technologies regularly in all possible conditions: in jungles, frozen soil, deserts …”
Pagani, proudly claiming “Made in Italy”, is happy to be associated with high quality. The durability of its machines hasn’t prevented steady sales growth over the last few years. Why? Demand for geotechnical testing keeps rising. “The quality of infrastructure has been improving, requirements have become stricter and additional countries, in particular in emerging economies, have started performing tests.” In short, everything is being done to ensure that the Tower of Pisa remains without rival.
It’s a new world for businesses everywhere. For small and medium-sized companies that thrive on in-person relationships and customer interaction, the transition to an all-remote, stay-at-home workforce is a difficult one. That’s on top of the question of whether your business model can sustain some level of operation, revenue generation and customer retention.
For those companies that can continue to operate at some level, the current environment creates numerous opportunities for a bad actor to exploit your employees, or for a bad employee to steal from your company. Join us in this webinar to learn the indicators professionals look for in potential problem employees and to review the motivations that drive employees to cross the line. You will come away with strategies, and with questions you should ask your leadership, to better protect your company in these unprecedented times.
Threat actors closely monitor public events happening around the world and quickly employ those themes in attack vectors to take advantage of the opportunity. Indeed, various Advanced Persistent Threat (APT) groups are using the coronavirus pandemic as a theme in several malicious campaigns.
By using social engineering tactics such as spam and spear phishing with COVID-19 as a lure, cybercriminals and threat actors increase the likelihood of a successful attack. In this paper, we:
Provide an overview of several different APT groups using coronavirus as a lure.
Categorize APT groups according to techniques used to spam or send phishing emails.
Describe various attack vectors, timeline of campaigns, and malicious payloads deployed.
Analyze use of COVID-19 lure and code execution.
Get ready to dig into the details of each APT group, their origins, what they’re known for and their latest strike.