Explore the future of
miniaturization in technology
Explore the engineering topic of miniaturization along with featured products from Molex through a series of articles, videos, and infographics that delve into cutting-edge trends shaping our connected future. In the article "Smaller Connectors for Bigger Tech Advances," uncover the meaning behind the drive to make things smaller, trace its history, and explore the various applications propelling component miniaturization. But how small is too small, and what does the future hold for miniaturized connector solutions? In the video on miniaturization trends, get a quick but insightful overview of the latest developments.
Shifting gears, dive into the realm of augmented and virtual reality in the article, "Miniaturization and Power Management," focusing on the future of AR and VR hardware. Then, examine the intersection of technology and healthcare in the article on medical and consumer wearables—from healthcare trends driving remote patient monitoring to next-gen wearables like patches, rings, and clothing. Uncover the secrets of miniaturization technologies in an infographic detailing how tiny connectors are made and the challenges of assembling miniature components.
Next, switch gears to the high-frequency world of mmWave applications in a two-part article. Review the current landscape of 5G deployment and its impact on next-gen mobile devices, then home in on the intricate realm of mmWave sensors in various applications, including ADAS systems. Conclude our tech journey with an insightful IoT article, which will unravel the current state of IoT, delve into sensor node miniaturization, and showcase products like the Zero-Hachi 0.80mm-pitch connector system and modular OneBlade 1.00mm-pitch wire-to-board connectors that drive IoT application development forward.
Smaller Connectors for Bigger Tech Advances
By Jon Gabay for Mouser Electronics
Miniaturization has been an ongoing process for millennia. Ancient humans learned to create and optimize tools, mainly as they migrated for survival's sake. Reducing size and weight without sacrificing functionality or reliability became important early on. This process continues today.
Modern electronic devices have undergone an amazing miniaturization process within a single lifetime. From bulky tubes and transformers to solid-state circuitry and switching power supplies, the ability to shrink a device while maintaining—or improving—performance has been a key marketing bullet for almost every commercial, consumer, aerospace, and military product.
The functional density of integrated circuits is a prime example: shrinking semiconductor geometries have increased circuit density a millionfold. Old static RAMs, for example, used DIP packaging to house 256 x 4 bits of memory. Modern RAM packs gigabits of memory into far smaller surface-mount packages. This opens the door for smaller devices with far more functionality and interconnectivity.
Connection technology has had to evolve as well. Older, bulkier devices had plenty of room for big, sloppy connectors; that is no longer the case. Modern-day devices use multi-point connections in micro-sized connectors that handle relatively high power levels and signal speeds in ever-smaller packages. This trend is not expected to stop.
Watching the trend
One of the first major electronic connection miniaturizations occurred when digital watches replaced mechanical watches. Not only did miniature buttons have to shrink, but so did batteries and their miniaturized holders. What used to be metallic planes of contacts became wires in precision housings that used the elasticity of materials for spring and electrical contact action.
These older miniature connections and switches were unreliable, and users sometimes had to jiggle and strike them to make connections. Material science was not as evolved as today, so oxidation and contaminants were also an issue. High-reliability applications like miniaturized military equipment had the luxury of using non-tarnishing gold contacts. Still, most consumer products were stuck with copper, aluminum, and iron-based contact materials, all tarnishing and oxidizing with humidity and time.
As tower computers moved to laptops, tablets, and smartphones, switches and connectors shrank and improved. Better metals and higher precision made these connectors much more reliable. The shrinking of USB to mini- and micro-USB is a case in point. The adoption of USB 3.1 and the Type-C connector turned what was once a fairly large, polarized four-pin connector into a reversible 24-pin connector capable of delivering 100 watts or more of power, and throughput that was once 12Mbits/sec has been upped to 10Gbits/sec. This accomplishment was no small task, and connector manufacturers like Molex led the way for many connection-related technologies.
A key example is medical device technology. What used to be a watch is now a fitness tracker, health monitor, and wearable medical device. Connectivity between a processing module and a medical sensor could be a single-point connection (like a wire going to an implanted or subcutaneous glucose sensor) or a multi-point connector (like the leads of an EKG heart rate monitor that detects events before they become fatal).
Not every wearable or implantable medical device component can connect to a PCB or substrate. For instance, medical devices that dispense drugs may need ultra-small connections to a smart auto-injector and an audible device to alert the patient. Devices, connectors, and their associated parts need to be small enough so as not to discourage the patient from wearing them.
Another key example is hearing aids. Microminiature circuitry needs to connect batteries or recharge ports, as well as connect to a transducer that stimulates the tympanic membrane. Patients often don't want anyone to know they are even wearing a hearing aid, so ultra-small size and comfort must be in play here. Even the materials for connectors and wires need to be bio-safe and nonallergenic.
Wireless technology impacts designs
The explosive use of wireless technology has dramatically impacted the design of many electronic systems. With wireless links, the need for external wires and connectors has decreased in some areas but presented new challenges in others.
Take antennas, for example. Antennas used to need large BNC-style connections with large coax cables. RF signal wavelengths have shrunk to millimeter sizes, and as a result, many PCB-mountable antennas have emerged. Still, achieving the best performance may rule out a PCB-mounted antenna. To address this, microminiature RF connectors like the Molex SMP-MAX and SMP-MAX EVO 50Ω RF Connectors provide a small, SMT-mounted 50Ω plug detent connection to an external antenna that can be placed for optimal performance (Figure 1).
In addition, these connectors are often used in applications with more than one RF link active at a time. Passive intermodulation, often called the rusty bolt effect, occurs when multiple active carriers mix at imperfect metal junctions and create interference at undesirable frequencies. More miniature connectors give designers more flexibility to place the various RF sources so they do not interfere with one another.
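To see why intermodulation is a layout concern, consider a quick back-of-the-envelope sketch (Python, with carrier frequencies and a receive band chosen purely for illustration) that computes the classic third-order products of two transmit carriers and checks whether either one lands inside a receiver's band:

```python
# Illustrative sketch: third-order intermodulation products of two transmit
# carriers. The frequencies and receive band are arbitrary example values.
def third_order_products(f1_mhz, f2_mhz):
    """Return the classic 2*f1 - f2 and 2*f2 - f1 mixing products."""
    return [2 * f1_mhz - f2_mhz, 2 * f2_mhz - f1_mhz]

f1, f2 = 1930.0, 1990.0          # two hypothetical carriers, MHz
rx_band = (1850.0, 1910.0)       # hypothetical receive band, MHz

for product in third_order_products(f1, f2):
    inside = rx_band[0] <= product <= rx_band[1]
    print(f"{product:7.1f} MHz -> {'lands inside' if inside else 'outside'} the RX band")
```

In this example, one third-order product falls squarely inside the receive band, which is exactly the kind of self-interference that careful physical placement of the RF sources helps avoid.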
MIM's the word
Manufacturing technologies have had to adapt to provide smaller connection technologies that perform as desired. Older cast and formed connection housings with sloppy insert pins won't do anymore. Companies like Molex are pioneering Metal Injection Molding (MIM) technology, providing better surface finishing and corrosion resistance. The new single-mold process supports complex contours with precision holes and geometries and even allows the use of high-temperature conductors. This is important for aerospace and even automotive applications.
Modern automotive applications are benefiting greatly from miniaturized connectivity. Electronics are everywhere, and drive-by-wire networking technology has evolved from CAN-style networks to 100Mbit/sec Ethernet networks with time-sensitive networking (IEEE 802.1 TSN). Wired Ethernet using established RJ-45 connectors just can't cut it anymore: the connectors are too large, vibration-sensitive, and not environmentally sealed. As a result, two-wire, twisted-pair automotive Ethernet has emerged as the standard for new vehicles.
Automobile subsystems will need precision PCB-mounted two-wire Ethernet connections, and the cables must be tough, rugged, and environmentally robust. For example, the Molex HSAutoLink I low-profile connector system optimizes device-side space savings while providing a fully protected shield case around the signals (Figure 2). These compact connectors have multiple polarization and color-coding options to ensure correct assembly; anyone who has worked on cars will appreciate this feature.
The next generation of automotive signaling will involve many design challenges, including current-carrying capacity in small spaces. Consider this: what used to be a mechanical lever to control airflow is now replaced by networked processors, sensors, and actuators with connectivity to the car's supercomputer. Molex has reduced the size of wiring harness terminals carrying heavy currents from the standard 1.5mm² to just 0.13mm². Lower-current terminals have been reduced from 0.64mm² to 0.5mm².
The center of it all
One area where connectivity miniaturization is especially important is data centers. Data centers can house thousands of server racks with hundreds of fiber optic and copper cables. The demands of a data center facility can be daunting, including energy usage, maintenance, traffic flow, cooling, and more. Cable densities must be very high, and fiber optic modules must run cool and reliably.
Service technicians need to quickly find and identify simplex, duplex, single-mode, and multimode connectors and be able to unplug them and plug them back in easily when a piece of equipment needs to be replaced.
Molex is leading the way here with next-generation multi-position high-density fiber optics interface cassette-style connectors. The Molex Rack-Mount LGX Fiber Enclosures and Cassettes are single-mode 24-position fiber optics connectors that convert LC style to MTP/APC style in compact form factors (Figure 3).
In a compact 123.4 x 105 x 28.8mm form factor, the enclosure houses the 24 input connections with two MPO adapters on the back end; the 2x LC-to-12 MPO fan-out assemblies are housed inside. These allow equipment manufacturers to feature highly dense fiber connectivity, providing more ports in the same space.
In addition to fiber optics connectors and adaptors, Molex provides cable assemblies, fiber guides, optical amplifiers, wavelength-selective modules, and optical transceivers.
Conclusion
Devices are increasing in functionality while decreasing in size. This presents challenges for device designers as well as connectivity designers. Machines are getting more complex, and while wireless connectivity is helping reduce some interconnectivity issues, applications like automotive, factory automation (with numerous embedded sensors and actuators), medical, and telecommunications are challenging companies to offer better solutions. Companies like Molex are rising to the challenge and providing exciting new products to tackle these issues.
Photo/imagery credits (in order of display)
Gorodenkoff - stock.adobe.com, Myst - stock.adobe.com
Top 5 Applications Where Miniaturization Meets High Performance
Miniaturization and Power Management: Building the Next Generation of VR Hardware
By Cassiano Ferro Moraes for Mouser Electronics
Virtual Reality (VR) has transformed how we interact with digital content by providing immersive and captivating experiences within virtual environments. However, behind these awe-inspiring experiences lies a complex engineering stack, which includes high-resolution displays, accurate tracking, powerful rendering techniques, graphics processing units (GPUs), efficient data transfer, heat dissipation technologies, miniaturization, and more.
This article steps into the fascinating but challenging world of VR hardware engineering. It explores the technologies used to create highly efficient, lightweight, and miniaturized headsets.
Optics and display systems
VR headsets use high-resolution displays and optics to create immersive visual experiences. These devices are designed to transport users into virtual worlds by presenting stereoscopic images mimicking real-life depth perception.
In the context of miniaturizing VR hardware, optimizing optics and display systems becomes crucial. Maintaining high-resolution displays while reducing the headset's size, weight, and power consumption is complex. This requires advancements in display technologies, such as miniaturized OLED or LCD panels with high pixel density. Nanotechnology plays a crucial role in advancing display technology for VR applications, particularly in achieving higher resolutions. It enables the fabrication of smaller pixels, resulting in higher pixel density and enhanced image sharpness. In addition, the demanding data transfer protocols necessary for VR, such as 5G, heavily rely on advancements in nanotechnology.
To address size constraints, engineers have focused on compacting the optical systems while maintaining wide fields of view. Advanced lens design techniques, such as aspherical lenses, can help achieve a wider field of view (FOV) while minimizing the physical size of the optics. Additionally, distortion correction algorithms play a significant role in compensating for geometric distortions and ensuring image quality across the entire display area.
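As a rough illustration of how a renderer can pre-compensate for lens distortion, the following minimal sketch applies a polynomial radial pre-distortion to normalized image coordinates; the coefficients are assumed values, not a profile from any real headset lens:

```python
# Minimal sketch of a polynomial radial pre-distortion applied to normalized
# image coordinates. The coefficients k1 and k2 are invented for illustration
# and are not taken from any real headset lens profile.
def radial_predistort(x, y, k1=-0.25, k2=0.05):
    """Scale a point toward the optical center by 1 + k1*r^2 + k2*r^4."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Points near the edge of the image are pulled inward more strongly than points
# near the center, which is what cancels a lens's pincushion distortion.
for point in [(0.1, 0.0), (0.5, 0.0), (0.9, 0.0)]:
    print(point, "->", radial_predistort(*point))
```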
Reducing motion blur is another challenge in VR hardware. Increasing the display's refresh rate and optimizing response times of the display technology help minimize motion blur during head movements. Advanced motion smoothing techniques like motion vector prediction and frame interpolation can further enhance the visual experience.
By considering display technologies with higher pixel density and improved sub-pixel arrangement, miniaturized VR headsets can address the pixelated mesh appearance known as the screen door effect (SDE). Additionally, incorporating optical filters like diffusers or anti-reflective coatings can help reduce the visibility of pixel gaps and enhance the overall visual quality.
Tracking and positional accuracy
Another vital piece of the VR hardware stack is the tracking system. However, capturing user movements accurately while minimizing the size and power requirements of the tracking components remains a major challenge.
Traditional tracking systems employ active infrared (IR) and laser-based systems for tracking. Active IR tracking involves emitting infrared light from a source and using sensors to detect reflections from markers on the user's head or peripherals. On the other hand, laser-based tracking systems use lasers and sensors to measure how long the beams take to return after hitting markers, allowing for accurate motion tracking. However, adapting these systems for miniaturized hardware requires advancements in sensor miniaturization and energy-efficient designs.
Also, integrating sensors and combining data from multiple sensors is crucial in improving accuracy and reducing latency. By integrating sensor functionality into a compact form factor, the overall footprint of the tracking system can be minimized. Additionally, leveraging prediction and filtering algorithms helps smooth out movements and ensures real-time responsiveness.
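One widely used form of such filtering is a complementary filter, which blends a fast but drifting gyroscope estimate with a noisy but drift-free accelerometer estimate. The single-axis sketch below is only an illustration of the idea, with made-up sample values rather than any headset's actual tracking code:

```python
# Generic single-axis complementary filter: fuse the integrated gyroscope
# angle (responsive but drifting) with the accelerometer-derived angle
# (noisy but stable). The blend factor and samples are illustrative only.
def complementary_filter(prev_angle_deg, gyro_rate_dps, accel_angle_deg, dt, alpha=0.98):
    gyro_angle = prev_angle_deg + gyro_rate_dps * dt   # integrate the gyro rate
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle_deg

angle = 0.0
samples = [(10.0, 0.5), (10.0, 1.2), (9.5, 1.9)]   # (gyro deg/s, accel angle deg)
for gyro_dps, accel_deg in samples:
    angle = complementary_filter(angle, gyro_dps, accel_deg, dt=0.01)
print(f"fused angle estimate: {angle:.3f} deg")
```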
Occlusion is another challenge in tracking systems when objects obstruct the line of sight between sensors and markers, leading to inaccurate or lost tracking. In order to minimize these problems, techniques like sensor integration, inverse kinematics, and sensor fusion can be employed.
Achieving precise positional accuracy in VR can be challenging due to sensor limitations, calibration requirements, environmental factors, and system latency. Calibration techniques can align sensors and markers for accuracy, while sensor resolution and precision provide detailed and precise tracking. Furthermore, sensor fusion can also optimize positional tracking by combining data from multiple sensors.
Graphics Processing Units and rendering techniques
VR systems have been taken to a higher level due to the advancements in GPU technology. Some of these improvements are higher frame rates and lower latency, which can be achieved by dividing complex tasks into smaller subtasks and executing them simultaneously across multiple parallel processing units. The performance of GPUs can be further improved through optimization techniques, such as fine-tuning algorithms and system configurations, to maximize efficiency and reduce processing overhead.
Asynchronous Timewarp (ATW) and Asynchronous Spacewarp (ASW) are techniques that have been used to ensure a smooth, immersive VR experience. ATW predicts head movements and warps previously rendered frames accordingly, reducing visual lag. ASW extrapolates intermediate frames based on head and hand motion, reducing latency and improving system responsiveness.
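A toy way to picture what timewarp does is to shift the most recently rendered frame in proportion to the yaw rotation that occurred after it was rendered. The sketch below does exactly that; real ATW performs a full 3D reprojection, and the field of view, stand-in frame, and rotation here are assumptions for illustration only:

```python
import numpy as np

# Crude, rotation-only approximation of the timewarp idea: slide the last
# rendered frame sideways in proportion to the post-render yaw change.
def timewarp_yaw(frame, delta_yaw_deg, fov_deg=90.0):
    width = frame.shape[1]
    shift_px = int(round(delta_yaw_deg / fov_deg * width))
    return np.roll(frame, -shift_px, axis=1)   # head turns right, image slides left

frame = np.arange(24).reshape(3, 8)            # stand-in for a rendered frame
print(timewarp_yaw(frame, delta_yaw_deg=22.5)) # shifts the image by 2 columns
```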
VR is also enhanced by real-time rendering algorithms that create immersive visuals. These algorithms utilize techniques like Level of Detail (LOD) management, which adjusts the level of detail in objects based on their distance from the viewer for efficient use of computational resources. Culling and occlusion techniques are also used to eliminate the need to render objects that are not visible, thereby reducing unnecessary computations. Furthermore, dynamic lighting and shading techniques simulate realistic lighting conditions, creating more visually natural virtual environments.
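A minimal sketch of LOD management might look like the following, where a mesh variant is selected from the camera distance; the thresholds and mesh names are placeholders rather than values from any particular engine:

```python
# Tiny sketch of distance-based Level of Detail selection. The distance
# thresholds and mesh names are illustrative placeholders, not engine defaults.
LOD_TABLE = [
    (5.0, "mesh_high"),      # closer than 5 m: full-detail mesh
    (20.0, "mesh_medium"),
    (60.0, "mesh_low"),
]

def select_lod(distance_m):
    for max_distance, mesh in LOD_TABLE:
        if distance_m <= max_distance:
            return mesh
    return "mesh_billboard"  # farthest objects collapse to a flat impostor

for d in (2.0, 15.0, 45.0, 120.0):
    print(f"{d:6.1f} m -> {select_lod(d)}")
```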
Connectivity and data transfer
When transmitting data without physical connections, VR systems rely on wireless communication protocols like Bluetooth® and Wi-Fi®. For a seamless experience, low latency and high bandwidth are crucial. Furthermore, interference mitigation and Quality of Service (QoS) techniques help maintain a robust wireless communication link by prioritizing essential data and minimizing the impact of external signals.
Good signal range and stable connections are necessary to prevent disruptions or dropouts during usage. Here, the signal range determines the maximum distance between the transmitter and receiver, while stability refers to the reliability of the connection.
Nanotechnology plays a pivotal role in establishing seamless and responsive communication between various virtual reality elements by bridging the gaps between processors and input environments. Moreover, data compression techniques can be used to minimize the amount of data transmitted over wireless channels. Lossless compression preserves all original data, while lossy compression sacrifices some detail to achieve higher compression ratios. Adaptive compression, meanwhile, adjusts the compression level based on the available bandwidth and quality requirements.
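The adaptive idea can be sketched in a few lines using Python's built-in zlib, with invented bandwidth thresholds; production VR links rely on purpose-built video codecs, so this only illustrates the decision logic, not a real pipeline:

```python
import zlib

# Sketch of adaptive compression: pick a compression level from the link's
# currently available bandwidth. The thresholds are invented examples.
def compress_adaptive(payload, bandwidth_mbps):
    if bandwidth_mbps > 400:
        level = 1      # plenty of bandwidth: spend almost no CPU compressing
    elif bandwidth_mbps > 100:
        level = 5
    else:
        level = 9      # constrained link: trade CPU time for smaller payloads
    return zlib.compress(payload, level)

data = b"sensor frame " * 1000
for bandwidth in (600, 200, 50):
    compressed = compress_adaptive(data, bandwidth)
    print(f"{bandwidth:4d} Mbps -> {len(compressed)} bytes (from {len(data)})")
```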
Minimizing delays during data compression and decompression processes to maintain real-time interactions in VR is also complex. To mitigate this issue, synchronization methods like time stamping enable precise timing coordination between devices, while buffering and interpolation techniques smooth out variations in data transmission rates.
Power management and heat dissipation
Ensuring maximum battery life and reducing power stress in VR headsets depends on power efficiency techniques, which can be enhanced by leveraging nanotechnology. This involves carefully selecting energy-efficient components and implementing power optimization methods like voltage scaling, clock gating, and dynamic power management. Nanoscale materials and structures can improve power efficiency by enabling better control over power consumption and minimizing energy loss.
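A back-of-the-envelope calculation shows why voltage and frequency scaling pay off: dynamic power roughly follows P = C x V^2 x f, so lowering both the supply voltage and the clock cuts power disproportionately. The sketch below evaluates that relation for a few hypothetical operating points; the switched capacitance and voltage/frequency pairs are invented, not figures from a real SoC:

```python
# Back-of-the-envelope look at dynamic power, P = C * V^2 * f.
# The capacitance and operating points are invented illustrative values.
def dynamic_power_mw(c_nf, volts, f_mhz):
    return (c_nf * 1e-9) * (volts ** 2) * (f_mhz * 1e6) * 1e3   # result in mW

operating_points = [
    ("high performance", 1.10, 800.0),
    ("balanced",         0.95, 500.0),
    ("low power",        0.80, 200.0),
]
for name, volts, f_mhz in operating_points:
    print(f"{name:17s}: {dynamic_power_mw(2.0, volts, f_mhz):7.1f} mW")
```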
Moreover, thermal management is crucial during the operation of VR headsets, as they generate substantial heat. Nanotechnology can play a role in thermal management by offering advanced materials and designs that facilitate efficient heat distribution and dissipation. Heat spreading techniques, ventilation systems, and airflow mechanisms can be enhanced by nanoscale features, allowing for better heat dissipation and maintaining optimal performance.
Nanotechnology also enables the development of thermal interface materials that enhance heat transfer between different components, thus optimizing the overall cooling process. Additionally, nanosensors can be integrated into VR headsets for precise temperature monitoring and control, ensuring that cooling mechanisms operate within safe temperature limits.
Hardware miniaturization and portability
Creating high-performing and portable VR devices is a complex task that involves reducing components to nanoscale and improving ergonomics. Significant advancements in this field have made VR experiences more accessible and comfortable for a broader range of users.
Miniaturization
The miniaturization of components, especially with the advancements in microelectromechanical systems (MEMS) technology, has played a crucial role in VR hardware engineering. Smaller, more power-efficient sensors like accelerometers, gyroscopes, and magnetometers enable accurate tracking within compact VR devices. Additionally, battery miniaturization has been essential in ensuring that VR hardware is portable and has longer usage times. Optics and lens systems have also improved, resulting in smaller, lighter lenses without sacrificing quality.
Molex Quad-Row Board-to-Board Connectors
When making VR devices smaller, it's crucial to use connectors that help compact the electronic circuitry. Molex Quad-Row Board-to-Board Connectors satisfy this requirement by providing a space-saving solution for various space-constrained applications (Figure 1). They offer 30 percent space savings over conventional connectors. These connectors are 0.60mm high and 2.0mm wide, and they feature an insert-molded, shielded design that provides robustness and internal protection.
Regarding electrical capabilities, the Molex Quad-Row Board-to-Board Connectors can support 3.0A of current in an extremely compact package, ensuring robust power delivery. They can handle voltages up to 50V, withstand 250V dielectric voltage, and have an insulation resistance of 100MΩ. Furthermore, these quad-row connectors are made of LCP UL 94V-0 material, which provides excellent strength and thermal resilience, and they operate within a wide temperature range from -40°C to +85°C.
Overall, Molex Quad-Row Board-to-Board Connectors offer an adaptable and flexible solution for VR hardware. When it comes to engineering VR with the advantages of nanotechnology, this option saves space and maintains the required reliability and performance.
Integrated system-on-chip
Integrated system-on-chip (SoC) circuits have also been employed to combine processors, GPUs, memory, and sensors on a single chip, allowing power consumption and performance to be optimized together.
Lightweight materials
Lightweight materials, including advanced polymers, fabrics, and textiles, have also helped minimize weight and enhance the comfort of VR headsets, freeing engineers to focus on ergonomics. Considering weight distribution, adjustable straps and interfaces, ventilation, heat dissipation, and user interface design ensures a comfortable and immersive user experience.
Overall, the progress made in miniaturization, lightweight materials, and ergonomic design has significantly improved VR hardware engineering. Apple's Vision Pro is an excellent demonstration of how VR technology can be made smaller and more portable. It lets users blend digital content with their surroundings, making it easy to navigate using only their eyes, hands, and voice. Furthermore, it has a sturdy design that includes 3D glass and an aluminum frame that fits comfortably on the face.
Conclusion
VR hardware engineering encompasses a multitude of complex challenges, from optics and display systems to tracking accuracy, GPU performance, connectivity, power management, and heat dissipation. Nanotechnology plays a pivotal role in addressing several of these challenges, particularly in advancing display technologies, data transfer protocols, and power consumption minimization. By leveraging nanoscale materials and structures, VR headsets can achieve higher resolutions, improved power efficiency, and enhanced thermal management. As VR technology continues to evolve, further advancements in nanotechnology and engineering practices will continue to push the boundaries of what is possible in the realm of virtual reality.
Photo/imagery credits (in order of display)
Finn - stock.adobe.com, Olly - stock.adobe.com, Gorodenkoff - stock.adobe.com
Transforming Health Care with Big Data
By Alex Pluemer for Mouser Electronics
Medical wearable technology isn't new—physicians have used remote heart monitors as diagnostic tools to monitor irregular cardiac activity and arrhythmia for decades. Data collected over time as patients go about their daily lives have more diagnostic value than what can be gleaned from an hour in a doctor's office. As a result, prescribing remote heart monitors with wireless transmission capabilities has become standard practice.
The technology has improved since its inception; data is now transmitted to mobile apps that send it to the cloud for monitoring and analysis by algorithms that can help predict future cardiac episodes before they occur. The real revolution in wearable medical tech has been in the commercial market with devices like Apple Watches, Fitbits, and Oura Rings—monitors in the form of fashion accessories that can measure heart rate, body temperature, blood pressure, glucose levels, and other vital health statistics. Other popular functions in commercial medical wearables include exercise tracking and sleep monitoring, which require movement and location tracking functionality. The next wave of wearable tech will be lighter and more flexible and will be able to be woven into specially designed clothing—or maybe even your favorite old sweatshirt.
Data collection
Commercially available medical wearables increasingly offer enhanced functionality in a similar (or even smaller) form factor; in addition to monitoring heart rate, blood pressure, and body temperature, they can detect and quantify motion (e.g., step counters), determine the length and depth of sleep cycles, and even detect spikes in blood sugar (Figure 1). Measuring these variable and disparate health factors requires an array of sensors. For example, measuring body temperature and heart rate is relatively straightforward, but more complex functions require more advanced technology.
Fitness trackers measure the number of steps a person takes as they're out for a stroll or just as they go about their day, but counting their steps is more complicated than just tracking their location. Step counters must differentiate between self-propelled motion and being propelled by another force, like a car or a plane.
Fitness trackers usually employ accelerometers to measure the speed changes that naturally occur as a person walks while ignoring the small, incidental accelerations that come from stomping a foot or waving an arm. Other step counters alternatively or additionally use motion sensors to detect the motion of the hips as people put one foot in front of the other.
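A simplified sketch of that logic appears below: compute the acceleration magnitude from the three axes, then count threshold crossings that are separated by a minimum interval so brief, incidental movements are ignored. The threshold and timing values are assumptions for illustration, not figures from any commercial tracker:

```python
import math

# Simplified step-counting sketch: count acceleration-magnitude peaks that
# exceed a threshold and are spaced by a minimum interval.
def count_steps(samples, rate_hz=50, threshold_g=1.15, min_step_s=0.3):
    steps, last_step_index = 0, -10**9
    min_gap = int(min_step_s * rate_hz)
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)   # in g
        if magnitude > threshold_g and (i - last_step_index) >= min_gap:
            steps += 1
            last_step_index = i
    return steps

# Synthetic trace: mostly ~1 g (standing still) with three brief impact peaks.
trace = [(0.0, 0.0, 1.0)] * 200
for idx in (40, 80, 120):
    trace[idx] = (0.2, 0.1, 1.3)
print(count_steps(trace))   # -> 3
```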
Monitoring sleep patterns is also a function of measuring movement, but these movements are more subtle. Smart devices employ actigraph units (which measure changes in movement like accelerometers do) in conjunction with heart rate monitors and skin temperature and conductance sensors to determine what stage of sleep the user is in as well as the duration and depth of their sleep. Traditional sleep studies have been performed in controlled laboratory-like settings, limiting the amount of available data to what can be gleaned while a patient is sleeping under observation. With remote monitoring, doctors and researchers can now find out how well a patient has been sleeping by consulting their smartwatch app.
Data analysis
Sleep monitoring would be much more difficult if researchers were responsible for organizing and drawing conclusions from these disparate data points; fortunately, modern medical wearables employ algorithms to process raw data and alert users or medical professionals when a health problem may be occurring (or about to). Medical wearables don't typically feature onboard processors, meaning data must be transmitted to a central processor before it can be collated and analyzed. Modern medical wearables usually interface with the cloud via mobile apps that receive the data from the device directly and then relay it to the cloud for processing.
In sleep monitoring, for example, an algorithm would incorporate the raw data from the various sensors to create a whole picture of the user's sleep patterns. People's tiny body movements, heart rate, and body temperature tend to increase during REM sleep in a demonstrably greater way than during less restful periods of the night. Incorporating all these data points into a single, clear picture wouldn't be possible on such a large scale without data analytics to do most of the work. This kind of data analysis doesn't just inform the user's present state of health; it can also help predict future events. Monitoring changes in heart rhythm over time can help predict heart attacks and strokes, while measuring changes in body temperature can help prevent potential dehydration and heat stroke. The more data the algorithms have to work with, from individual users and across the user base, the better they can predict potential health issues in the future.
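As a toy illustration of how such disparate signals might be fused, the sketch below assigns a sleep-stage label to each epoch from a movement count and mean heart rate. Real wearables rely on trained statistical models, and every threshold here is invented purely to show the idea:

```python
# Toy rule-based sketch of fusing actigraphy and heart rate into a per-epoch
# sleep-stage label. The thresholds are invented for illustration only.
def classify_epoch(movement_count, heart_rate_bpm, resting_hr_bpm):
    if movement_count > 20:
        return "awake"
    if heart_rate_bpm >= resting_hr_bpm * 1.05 and movement_count <= 5:
        return "REM"          # heart rate rises while the body stays still
    if movement_count <= 2:
        return "deep sleep"
    return "light sleep"

epochs = [(30, 72), (4, 64), (1, 58), (6, 61)]   # (movements per epoch, mean HR)
for movements, hr in epochs:
    print(movements, hr, "->", classify_epoch(movements, hr, resting_hr_bpm=60))
```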
The medical wearables market
The market for commercially sold medical wearables has exploded in the past five years, with smartwatches and fitness trackers making up the bulk of those sold in the US. Although estimates vary, somewhere between 80 and 100 million Americans wear a smartwatch or fitness tracker while they exercise or for everyday use. Home health care devices like heart rate monitors, blood pressure cuffs, and electrocardiograms have also experienced an increase in popularity as the baby boomer population continues aging and remote health care options for older patients become more standard practice (Figure 2).
The future of the medical wearables market may lie in smart textiles and the broader clothing and accessory space. Sportswear companies like Nike and Adidas are reportedly developing t-shirts and pullovers that monitor heart rate, blood oxygenation, calorie expenditure, and other health metrics. Just like mobile phones and MP3 players, medical wearables will shrink in form factor and become less intrusive as the technology evolves. Future medical wearables may be indistinguishable from regular clothing or accessories like jewelry, belts, or eyeglasses.
According to Future Market Insights, the market for medical wearables in the US grew to about $59 billion in 2023 and is expected to grow almost six times larger, to nearly $370 billion, in the next decade. The bulk of that growth will likely come from younger users employing smart devices and fitness trackers as preventative health care tools. As daily activities become more remote, actual visits to the doctor's office will be fewer and farther between. The more preventative health care people can do themselves at home or work, the healthier they'll be. It's not hard to imagine a world in which medical wearable devices supplant much of the diagnostic work typically performed in clinical settings.
Conclusion
The potential applications of medical wearable technology almost sound like the stuff of science fiction: devices you can wear that can detect diseases and medical problems before they happen. In 2019, researchers at the University of Michigan created a wrist-worn device to detect cancer cells in the bloodstream by taking small blood samples and screening them for circulating tumor cells. Researchers can use millions of data points collected from fitness monitors over time to develop algorithms that detect changes in a patient's walking pace, gait, and motion to predict the onset of Alzheimer's, Parkinson's, and other degenerative neurological diseases. The ability to discover and identify underlying conditions that can result in severe health ramifications is also a potential boon to health care professionals.
Medical researchers have developed noninvasive diagnostic machine-learning tools that implement wearable biosensors to detect potentially serious conditions that can lead to heart attacks, strokes, or even death. Certain underlying cardiac conditions can be challenging to see in clinical settings because the symptoms can be extremely sporadic and often go undiagnosed until a patient develops more severe problems as a result. Early detection methods used in medical wearables that provide constant monitoring could detect conditions in patients that otherwise may never have been discovered at all.
1. Future Market Insights, "Smart Wearables Market," Future Market Insights, accessed October 4, 2023, https://www.futuremarketinsights.com/reports/smart-wearables-market.
2. Tae Hyun Kim et al., "A Temporary Indwelling Intravascular Aphaeretic System for in Vivo Enrichment of Circulating Tumor Cells," Nature Communications 10, no. 1478 (April 1, 2019), https://doi.org/10.1038/s41467-019-09439-9.
Photo/imagery credits
ipopba - stock.adobe.com
Manufacturing microminiature connectors from digital twins to precision assembly
How are microminiature connectors made?
Design
Digital twins streamline connector design, reducing design cycles by enabling rapid testing and troubleshooting before prototypes
Manufacturing
Molex designs the machines that build their connectors, enabling precision manufacturing of tiny connectors
Shipping
Tiny connector pins need to be protected during shipping and manufacturing
What are the technological and manufacturing challenges of creating smaller and smaller connectors?
Shrinking pitches
The typical pitch has shrunk from 2.54mm to 0.35mm, requiring precision with less tolerance for error
RF Signal
Power
Multiple functions
Connectors perform multiple functions, such as carrying both RF signals and power
Smaller connectors
Smaller connectors face signal loss, higher resistance & heat challenges that necessitate careful design considerations
Signal loss
High resistance
More heat
How are miniature components assembled at the component manufacturer?
Custom assembly lines
Manufacturers need custom assembly lines that can handle small connectors in hard-to-reach places
Small as a grain of rice
Connectors as small as a grain of rice are fragile and require specialized assembly lines with finely tuned mating force
Unique geometries
Smaller devices may have unique geometries, creating challenges when mating hard-to-reach connectors inside the device
Smart watch
Earbud headphone
What are the key market factors driving demand?
Automotive wiring
Shrinking heavy-current wiring terminals from 1.5mm² to 0.13mm², which saves space & weight in wiring harnesses weighing over 150lbs
5G connectivity
62% of cell phones shipped in 2023 are 5G-ready. To prevent signal loss in millimeter-wave signals, connectors with pitches of 0.5mm or less are needed
Industrial Single Pair Ethernet (SPE)
Transforming industrial connectivity by achieving gigabit data rates with just one pair of copper wires, replacing legacy solutions requiring four pairs
Wearable medical devices
Nearly 30% of US adults embrace wearable medical devices; miniaturized connectors are an essential component
Source: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7600024
mmWave Radar Beyond Automotive ADAS
By JJ DeLisle for Mouser Electronics
Introduction
Millimeter-wave (mmWave) radar technology has been instrumental in developing advanced driver assistance system (ADAS) features and is being integrated into new automotive technology. Due to the success of mmWave radar ADAS systems, mmWave radar technology has become much more accessible, and this accessibility has led to the development and use of mmWave radar technology in a variety of other applications. Industrial automation is one of the largest emerging applications for mmWave radar, as the high resolution and accuracy of mmWave radar have proven extremely useful for person and object detection. mmWave radar technology is even sensitive enough to detect changes in human blood pressure and other biological signatures indicative of fatigue and other conditions.
With the growth in applications of mmWave radar and the advantages that high-frequency radar brings to antenna miniaturization, there is a push to develop more compact and portable mmWave radar systems. These systems are intended for various applications, from medical, in-building automation, industrial automation, and robotics to future automotive use cases. Achieving these higher thresholds of performance and miniaturization requires substantial development efforts in miniaturizing the mmWave radar modules themselves, as well as the interconnect that bridges the sensor module output to advanced electronic control units (ECUs) and artificial intelligence/machine learning (AI/ML) cores. Naturally, interconnect miniaturization and increased throughput demands come with their own challenges in ensuring signal integrity.
This article aims to educate readers on the nuances and state of mmWave radar technology used in automotive and other applications. It covers mmWave radar frequencies, considerations associated with miniaturizing mmWave antennas and modules, and the signal integrity challenges that emerge at such significant levels of miniaturization.
mmWave radar frequencies
Automotive radar uses several frequency bands; of these, four main mmWave frequency bands are currently in use. However, the major developers of automotive radar technologies and automotive manufacturers have somewhat settled on the 77GHz band (76GHz-81GHz). The 24GHz band (21GHz-26GHz) is still in use for ultra-wideband (UWB) radar and communication applications but is now no longer preferred for automotive radar applications for ADAS. There are also 24GHz narrow-band (NB) applications in the ISM band, the initial automotive radar frequency band of choice due to its international availability. However, the miniaturization potential and substantially wider bandwidth availability of 77GHz radar make using that mmWave frequency band far more attractive. For instance, the 24GHz ISM band offers only 250MHz bandwidth, while the 77GHz band offers 5GHz bandwidth.
Moreover, antenna size scales with wavelength and is therefore inversely proportional to frequency. Hence, 77GHz antenna technology is roughly a third of the size of 24GHz antenna technology, making it much easier to integrate a 77GHz mmWave radar module than a larger 24GHz automotive antenna.
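The arithmetic behind that size ratio is simple enough to verify directly; the short sketch below computes the wavelength and half-wave element length at both frequencies:

```python
# Quick check of the size argument: a resonant antenna element scales with
# wavelength, and wavelength = c / f, so moving from 24GHz to 77GHz shrinks
# a half-wave element by roughly the ratio of the two frequencies.
C = 299_792_458.0   # speed of light, m/s

for f_ghz in (24.0, 77.0):
    wavelength_mm = C / (f_ghz * 1e9) * 1e3
    print(f"{f_ghz:4.0f} GHz: wavelength ~{wavelength_mm:5.2f} mm, "
          f"half-wave element ~{wavelength_mm / 2:4.2f} mm")

print(f"size ratio ~{24.0 / 77.0:.2f} (about one-third)")
```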
Two main ranges exist in the 77GHz mmWave radar frequency band: 76GHz-77GHz and 77GHz-81GHz. The 1GHz bandwidth section from 76GHz to 77GHz is predominantly used for automotive long-range radar (LRR), and the 4GHz bandwidth section from 77GHz to 81GHz is primarily used for automotive short-range radar (SRR). However, for nonautomotive applications, mmWave radar frequencies are being developed (and likely to be deployed) in the 24GHz band, the 60GHz V-band (57GHz-71GHz), and E-band (71GHz-76GHz, 81GHz-86GHz, and 92GHz-95GHz).
mmWave radar antenna miniaturization considerations
The relative size of antennas is a function of frequency/wavelength. The smaller the wavelength, the smaller an antenna can be to efficiently transduce electromagnetic fields into conducted electric energy.
However, there is a price to this natural miniaturization. Higher-frequency antenna elements also tend to have lower gain than antenna structures optimized for lower frequencies, because a physically smaller antenna presents a smaller effective aperture and captures less of the incident energy.
Attenuation through the atmosphere and most materials also increases with frequency, which is why atmospheric attenuation is higher at mmWave frequencies than at microwave frequencies. Moreover, conductive and radiative losses also increase with frequency. This means that individual mmWave antennas tend to be far less efficient than lower-frequency antennas for transmitting and receiving over the same distance. However, the higher mmWave frequencies have substantial amounts of available bandwidth, resulting in higher-precision radar functionality and higher-throughput communications.
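The distance-related part of that penalty can be quantified with the Friis free-space path loss formula, FSPL(dB) = 20 log10(4 pi d f / c). The sketch below evaluates it at a fixed 100m range for several bands mentioned in this article; note that it deliberately ignores atmospheric absorption, which adds further loss at mmWave frequencies:

```python
import math

# Friis free-space path loss at a fixed range, to show how propagation loss
# climbs with frequency. Atmospheric absorption is intentionally ignored here.
C = 299_792_458.0

def fspl_db(distance_m, freq_hz):
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / C)

for f_ghz in (24.0, 60.0, 77.0):
    print(f"{f_ghz:4.0f} GHz at 100 m: {fspl_db(100.0, f_ghz * 1e9):6.1f} dB")
```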
To overcome this limitation, mmWave antennas for radar applications are most often advanced/active antenna systems (AAS), commonly phased array antennas. Using AAS array antenna technology allows for enhanced gain and greater control of the antenna pattern at the cost of greater complexity and additional hardware components. Traditionally, mmWave technology has been the domain of militaries, aerospace, and satellite communications. There still isn't the same level of product availability for the RF components at mmWave frequencies, especially upper mmWave frequencies such as the 77GHz band.
The individual cost of mmWave components is also generally much higher, as volumes are still small compared to most lower-frequency components. The overall cost of fabricating mmWave circuits is higher as well: the laminates needed as circuit substrates to efficiently carry mmWave signals in planar transmission lines and waveguides are more expensive, and processing costs more because of the greater precision and smaller geometric features required. There are also fewer PCB fabrication facilities set up to work with mmWave laminates at the necessary levels of precision.
This is why there are generally fewer engineering service companies equipped to tackle mmWave projects and fewer engineers with the background and expertise to assist in developing mmWave radar circuits. Developing this type of technology requires care, and there are few turnkey mmWave radar solutions, so much of the work needs to be done in-house.
Maintaining good signal integrity in dense mmWave radar interconnect
The additional antenna structures and hardware components needed for mmWave radar antenna systems require more complex and higher-density mmWave and digital-signal routing within the mmWave radar module, as well as between the radar module and the control system electronics for the platform.
In many cases, the electronics used for mmWave radar modules are highly integrated on specialized high-frequency laminates and with specialized mmWave board-to-board interconnect. Given the precision necessary to ensure signal integrity and good performance with mmWave electronics, these systems need to be environmentally sealed and ruggedized to avoid any form of debris or moisture ingress and damage. Therefore, the interconnect to the mmWave radar modules must also be sealed and rugged.
The main benefit of miniaturizing mmWave radar is to be able to integrate the technology into mobile and even portable systems. Mobile and portable systems are often subjected to harsh environments, shock, and vibration, especially in automotive, industrial, and military/aerospace applications. The connectors and cabling for the interconnect used in these applications need to provide high signal integrity in all these extremes. This is a daunting design challenge, as connectors for mmWave radar also need to be extremely compact, with low-profile, fine-pitch design features. For automotive, industrial, and military/aerospace applications, standards and quality control systems must be implemented for the connectors to be used in specific applications and environments. Few connectors available on the market meet all these requirements.
Conclusion
mmWave antenna system and circuit design is fraught with layers of complex considerations where every aspect of the design needs to be optimized extensively and precisely manufactured. Ensuring a high level of signal integrity within the mmWave radar module circuits and between the mmWave radar module and control electronics is essential in realizing reliable mmWave radar systems. This requires ruggedized connectors that can handle high digital throughput even in extreme conditions. Moreover, mmWave radar modules are being increasingly miniaturized to enable use in additional applications beyond automotive. This miniaturization adds higher density and low-profile interconnect design requirements on top of the rugged and high-speed interconnect requirements.
Photo/imagery credits (in order of display)
Robert - stock.adobe.com, Blue Planet Studio - stock.adobe.com
Shrinking IoT Sensor Nodes
By Steve Taranovich for Mouser Electronics
Automated factories provide unprecedented productivity and efficiency, but they lack flexibility and intelligence. Industry 4.0 and 5.0 aim to make factories smarter by taking advantage of wireless and computing advances to bring real-time insights to factory equipment. This data-driven approach transforms the factory from a passive automated machine to a network of distributed intelligent devices that can make real-time and predictive decisions.
The information that powers the connected industrial world comes from an increasing number of wireless sensor nodes. A key advantage of a wireless sensor node is the capability to continuously monitor many parameters. For example, these types of sensors can be deployed in pipes to monitor flow and other key parameters of fluids. Wireless sensor nodes, which can vary in size from that of a shoebox to nearly imperceptibly small, usually communicate via wireless radio modules.
Widely used in environmental monitoring, industrial control, infrastructure security, and other fields, wireless sensor nodes typically comprise a microcontroller (MCU), transceiver, memory unit, power source, and one or more sensors that perceive the ambient environment (Figure 1).
In industrial settings, the deployment of wired networks of sensor nodes can be challenging due to the complex and dynamic nature of manufacturing environments. Wireless sensor networks address these challenges by eliminating the need for extensive cabling, providing greater mobility, and enabling rapid deployment and reconfiguration of monitoring systems. Wireless sensor networks may contain thousands of sensors connected over a mesh network such as Zigbee or WirelessHART.
Wireless sensor networks are employed for a variety of purposes, including condition monitoring, predictive maintenance, asset tracking, and environmental monitoring. These networks consist of small, battery-powered sensors distributed throughout the facility, communicating with each other and, in some cases, with a central control system. The sensors measure parameters such as temperature, humidity, pressure, vibration, and more, depending on the specific requirements of the application.
Smaller, more accurate sensors
New sensors must fit into the existing factory space without interfering with the workflow. Miniaturization helps sensors fit seamlessly into industrial environments without space or power concerns. For example, a sensor on a robot arm must be small and light enough to not interfere with the arm's movement. In predictive maintenance scenarios, small wireless sensors can be strategically placed on machinery components to monitor conditions and detect anomalies without interfering with normal operations. These sensors can be embedded directly into machinery or production lines or even worn by workers, providing a high degree of spatial coverage without disrupting the operational workflow.
Despite their smaller form factors, sensors must perform increasingly accurate and complex operations to power real-time, data-driven decisions in smart factories. The accuracy of these sensors is imperative to ensure that the data collected reflects the true state of the industrial environment. Inaccurate sensor readings can lead to misguided decisions, potentially causing operational inefficiencies, increased downtime, and compromised product quality. For example, when sensors are used in predictive maintenance, accurate sensor data ensures that maintenance downtime is scheduled only when it is truly necessary.
Achieving high accuracy in wireless sensors involves careful calibration, regular maintenance, and adherence to stringent quality standards. Calibration ensures that sensors deliver precise and consistent measurements over time, accounting for factors like environmental changes and sensor drift. Sensors can also be calibrated to self-detect anomalies: if a reading falls outside the calibrated range, the sensor may determine that the reading is erroneous and indicate that it needs to be serviced.
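A minimal sketch of that self-checking behavior is shown below: each raw reading is corrected with a stored gain and offset, and anything that falls outside the calibrated range is flagged for service rather than trusted. The calibration constants are invented for illustration:

```python
from dataclasses import dataclass

# Sketch of per-sensor calibration plus a plausibility self-check.
# All numeric values here are invented for illustration.
@dataclass
class Calibration:
    gain: float
    offset: float
    valid_min: float
    valid_max: float

def read_sensor(raw_counts, cal):
    value = raw_counts * cal.gain + cal.offset
    if not (cal.valid_min <= value <= cal.valid_max):
        return None, "out of calibrated range - flag for service"
    return value, "ok"

cal = Calibration(gain=0.125, offset=-2.0, valid_min=-20.0, valid_max=120.0)  # e.g., degC
for raw in (400.0, 1500.0):
    print(read_sensor(raw, cal))
```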
Smart industrial environments must automatically perform predictive or corrective actions in real time. For smart factories to be trusted to make these decisions, they need to run on trusted data. Calibrated sensors provide accurate readings, and redundancy provides reliable results. Measuring a variety of parameters gives deeper insights into what's happening on the factory floor and enables smart machines to make more informed decisions.
Powering wireless sensors
One of the critical challenges of wireless sensor node design is the power supply. The challenge lies in balancing the need for long operational lifetimes with the constraints of small form factors and limited energy storage capacities in sensor nodes. Sensors are also often deployed in remote places where it is not possible to run a power cable. Low-power design principles, such as optimizing communication protocols and using energy-efficient components, play a crucial role in minimizing energy consumption.
Batteries are a common power source for wireless sensor nodes due to their convenience and reliability. However, the finite energy capacity of batteries necessitates careful energy management strategies to extend the network's lifespan. Additionally, recharging or replacing batteries may be impractical or impossible for nodes deployed in remote or inaccessible locations.
Advanced energy storage technologies, such as rechargeable batteries and supercapacitors, are employed to enhance the sustainability of wireless sensor nodes. Supercapacitors store electrical energy via the separation of charge rather than through chemical reactions. Unlike batteries, they can deliver rapid bursts of power and have a longer cycle life. They are particularly well-suited for applications where quick energy discharge is essential, such as situations that demand sudden bursts of sensor activity. However, supercapacitors typically have lower energy density than batteries, which means they may not sustain sensor nodes for extended periods without recharging or energy harvesting support.
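Some rough arithmetic makes that energy density gap concrete: a capacitor stores E = 1/2 C V^2, so even a sizable supercapacitor holds far less energy than a small primary cell. The values below are generic examples rather than specific parts:

```python
# Rough comparison of stored energy in a supercapacitor versus a small coin
# cell; for the capacitor, E = 1/2 * C * V^2. Values are generic examples.
def supercap_energy_j(capacitance_f, voltage_v):
    return 0.5 * capacitance_f * voltage_v ** 2

supercap_j = supercap_energy_j(10.0, 2.7)      # a 10 F cell charged to 2.7 V
coin_cell_j = 0.225 * 3.0 * 3600               # ~225 mAh at 3 V, converted to joules

print(f"supercapacitor: ~{supercap_j:.0f} J")
print(f"coin cell:      ~{coin_cell_j:.0f} J   (far more stored energy)")
```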
One approach to powering wireless sensor nodes is to add energy-harvesting capabilities to the node. Energy harvesters can scavenge the available energy—be it mechanical, thermal, or even photovoltaic—from the local environment and convert it into electrical energy.
Solar energy harvesting is one of the most common energy-harvesting methods, utilizing photovoltaic cells to convert sunlight into electrical power. In outdoor and well-lit environments, solar energy can provide a continuous and reliable source of energy, enabling long-term operation of wireless sensor nodes without the need for battery replacement. However, the effectiveness of solar harvesting depends on factors like geographical location, weather conditions, and the energy requirements of the sensor nodes.
Other forms of energy harvesting include piezoelectric materials that generate electrical power from mechanical vibrations, and thermoelectric generators that convert temperature differences into electricity. These methods are particularly useful in industrial settings where vibrations or temperature differentials are prevalent. While energy harvesting offers the potential for indefinite operation without the need for battery replacement, it is highly dependent on environmental conditions. The intermittent and variable nature of ambient energy sources requires efficient energy storage and management systems to ensure continuous and reliable operation.
Harsh sensor environments
Wireless sensors often operate in harsh environments, ranging from industrial settings to remote outdoor locations. These environments pose unique challenges that necessitate robust design considerations for the sensor nodes to ensure reliable operation. In industrial contexts, sensors may be exposed to extreme temperatures, high levels of humidity, corrosive substances, and mechanical vibrations. For example, in manufacturing plants, sensors may be subjected to rapid temperature fluctuations or exposure to chemicals during certain processes. In these scenarios, sensors need to be ruggedized and equipped with protective enclosures to withstand harsh conditions, ensuring their functionality is not compromised.
Outdoor applications, such as environmental monitoring or agriculture, can expose sensor nodes to varying weather conditions, including rain, snow, and extreme temperatures. In these cases, the sensors must be designed to be waterproof, dustproof, and resistant to temperature extremes to maintain accurate data collection. In addition to environmental factors, wireless sensor nodes in harsh environments often contend with electromagnetic interference (EMI), which can arise from heavy machinery, power equipment, or other wireless devices operating in the vicinity. To mitigate interference, robust communication protocols and signal processing techniques are employed to ensure data integrity and network reliability.
Advances in materials science, sensor technology, and communication protocols continue to drive innovations in creating resilient wireless sensor nodes and networks tailored for diverse harsh environments and applications. For example, the Molex Contrinex series of inductive sensors (Figure 2) has a small form factor ideal for limited-space applications and an IP67- or IP69K-rated enclosed construction that protects the sensor from harsh environments.
Conclusion
Wireless sensor nodes empower smart industrial automation systems to implement predictive maintenance strategies, optimize energy consumption, and enhance overall operational efficiency. Their small form factors, often coupled with energy-efficient designs, make them suitable for integration into machinery, equipment, and even hazardous or hard-to-reach locations, providing a scalable and flexible solution for modernizing industrial processes.
Photo/imagery credits (in order of display)
Damian Sobczyk - stock.adobe.com, EwaStudio - stock.adobe.com