WT | News

Discover our incredible news!

AI Robot Dog Tackles Complex Obstacle Courses

An AI robot has now been trained to navigate challenging, never-before-seen obstacle courses.

Acrobatic robot displays make for wonderful marketing, but they are usually carefully planned and expertly rehearsed. Researchers have now trained a four-legged AI robot to navigate challenging, never-before-seen obstacle courses in practical settings.

Building nimble robots is difficult because of the real world's inherent complexity, the limited information robots can gather about it, and the speed at which decisions must be made to perform dynamic motions.

Organizations such as Boston Dynamics have frequently published videos of their robots performing a variety of feats, including parkour and dancing. While these achievements are astounding, they usually require people to laboriously program each step or to rehearse repeatedly in extremely controlled situations, reports Edd Gent in Singularity Hub.

This approach severely limits robots' ability to apply their skills in the real world. Using machine learning, however, researchers from ETH Zurich in Switzerland have taught their robot dog, ANYmal, a set of fundamental locomotion skills. With these skills learned, the dog can now navigate a wide range of difficult obstacle courses, both indoors and outdoors, at up to 4.5 miles per hour.

To develop a system that was both adaptable and capable, the researchers segmented the problem into three parts and assigned a neural network to each. First, they developed a perception module that builds a picture of the terrain and any obstructions in it using data from lidar and cameras.

Related Robot Completes Surgery in Space

They combined this with a locomotion module that has learned a wide range of skills, such as jumping, climbing up and down, and crouching, to help the robot get over various barriers. Finally, these modules were joined by a navigation module that determines which skills to use to overcome each obstacle and plots a path through a sequence of them.

Instead of using human examples throughout the training process, the researchers exclusively used reinforcement learning, or trial and error. This allowed them to train the AI model on a huge number of randomized scenarios without having to manually label each one.
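
For readers who want a feel for how such a modular design fits together, here is a minimal, hypothetical sketch in Python. The class names, the 12-joint output, and the placeholder heuristics are illustrative assumptions; the actual ETH Zurich system uses trained neural networks for each module and is not reproduced here.

```python
# Minimal illustration of the three-module idea described above -- a sketch,
# not ETH Zurich's actual ANYmal controller. Module internals are placeholders.
import numpy as np

class PerceptionModule:
    """Fuses (hypothetical) lidar and camera data into a local terrain map."""
    def build_terrain_map(self, lidar: np.ndarray, camera: np.ndarray) -> np.ndarray:
        # Placeholder fusion: in the real system a neural network does this.
        return (lidar + camera) / 2.0

class LocomotionModule:
    """Executes one of several learned skills for a single control step."""
    SKILLS = ("walk", "jump", "climb_up", "climb_down", "crouch")
    def execute(self, skill: str, terrain_map: np.ndarray) -> np.ndarray:
        assert skill in self.SKILLS
        # Placeholder: return joint targets; the real module is an RL policy.
        return np.zeros(12)  # e.g. 12 actuated joints on a quadruped

class NavigationModule:
    """Picks which skill to use next, given the terrain and a goal."""
    def plan(self, terrain_map: np.ndarray, goal: np.ndarray) -> str:
        # Placeholder heuristic: jump over tall obstacles, otherwise walk.
        return "jump" if terrain_map.max() > 0.5 else "walk"

# One (simplified) control cycle.
perception, navigation, locomotion = PerceptionModule(), NavigationModule(), LocomotionModule()
terrain = perception.build_terrain_map(lidar=np.random.rand(64), camera=np.random.rand(64))
skill = navigation.plan(terrain, goal=np.array([5.0, 0.0]))
joint_targets = locomotion.execute(skill, terrain)
print(skill, joint_targets.shape)
```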

Another notable aspect is that the robot relies on chips embedded within it rather than on external computers. The researchers also demonstrated that, in addition to handling a wide range of conditions, ANYmal could recover from falls or slides to finish the obstacle course.

Overall, the research shows that robots are getting better at functioning in challenging real-world settings. That implies they might soon be a considerably more visible presence everywhere.

Printed and Flexible Sensor Market Poised to Grow

Printed and flexible sensor technology sectors are expected to experience growth.

In the modern world, sensors, some of which are printed and flexible, are essential. They serve as the link between the real and virtual worlds, measuring an enormous variety of physical properties. Printed sensors, as the name suggests, are sensors printed onto rigid or flexible substrates using solution-processable functional inks. As a result, printed sensors can be made in huge quantities at drastically lower cost using proven manufacturing processes.

Printed and flexible sensors can measure a plethora of physical interactions, including touch, force, pressure, displacement, temperature, and electrical signals, as well as gases. One of the earliest, and now most ubiquitous, printed sensor technologies is the printed force sensor, found in cars for seat belt occupancy detection. Printed sensors find applications in commercial sectors such as automotive, healthcare, wearables, consumer electronics, industry, and logistics.

IDTechEx is in a unique position on this subject. The analyst team draws on many years of following emerging technology markets, among them printed electronics, a crucial component of printed and flexible sensors. IDTechEx has long supported this field by hosting the leading industry conferences and exhibitions on printed, flexible, and wearable electronics. The network built up in these subject areas underpins the analysis in this report.

This report critically evaluates eight printed sensor technologies, covering printed piezoresistive sensors and force sensors (FSRs), piezoelectric sensors, photodetectors, temperature sensors, strain sensors, gas sensors, capacitive touch sensors, and wearable electrodes. The report also discusses areas of innovation in the manufacturing of printed sensors, including a focus on emerging material options as well as the technology underlying the manufacturing process. It characterizes each application of printed sensors, discussing the relevant technology, product types, competitive landscape, industry players, and pricing, as well as key meta-trends and drivers for each sector. The report also contains detailed 10-year market forecasts for each of the key printed sensor technology areas.

The research behind the report has been compiled over many years by IDTechEx analysts. It builds on existing expertise in areas such as sensors, wearable technology, flexible electronics, stretchable and conformal electronics, smart packaging, conductive inks, nanotechnology, future mobility and electronic textiles. The methodology involved a mixture of primary and secondary research, with a key focus on speaking to executives, engineers, and scientists from companies developing printed and flexible sensors. As such, the report analyses all known major companies and projects, including over 35 profiles.

Related The Potential of 3D Printed Electronic Skin

This report provides critical market intelligence about the eight printed sensor technology areas involved. This includes:

A review of the context and technology behind printed and flexible sensors:

• History and context for each technology area
• General overview of important technologies and materials
• Overall look at printed and flexible sensor trends and themes within each technology area
• Benchmarking and analysis of different players throughout

Fraunhofer FEP’s Microdisplays and Sensors Business Unit

Fraunhofer FEP’s microdisplays and sensors business unit has been integrated into Fraunhofer IPMS.

The Microdisplays and Sensors business unit at the Fraunhofer Institute for Organic Electronics, Electron Beam and Plasma Technology FEP will be integrated into the Fraunhofer Institute for Photonic Microsystems IPMS with retroactive effect from January 1, 2024. Both institutes are closely connected, particularly within this business unit, and share infrastructure at the Dresden site. By pooling expertise and streamlining structures, the institutes anticipate synergies that will strengthen the research field, expedite development, and thus benefit customers and partners.

There is rapid development in the market for the microdisplays used in augmented reality (AR), virtual reality (VR) and mixed reality (MR) applications (often collectively referred to as XR) and this will be an important growth market of the future. The integration of OLED and µLED frontplane technologies in CMOS backplanes is not only the key to success in this sector but also the technological basis for near-to-eye visualization of information. Fraunhofer IPMS and Fraunhofer FEP have now decided, in consultation with the Fraunhofer-Gesellschaft, to integrate the Fraunhofer FEP Microdisplays and Sensors business unit into Fraunhofer IPMS. Their goal is to leverage synergies in the area of infrastructure, pool expertise and establish a unique profile for both institutes. Fraunhofer IPMS has long been one of the leading institutes in microelectronics and microsystems engineering.

Over the past ten years, the Microdisplays and Sensors business unit has developed into a globally successful and established player under the umbrella of Fraunhofer FEP. At the current stage of development, the transfer to an institute specialized in microelectronics offers suitable conditions to further develop the business unit. This will also allow Fraunhofer FEP, as a process-oriented institute, to focus even more on its expertise in electron beam and plasma technology. This transfer provides technological solutions for the growing demand in the fields of energy, sustainability, life sciences and environmental technologies for industry and society — now and in the future.

“By integrating the Fraunhofer FEP Microdisplays and Sensors business unit into Fraunhofer IPMS, we are pooling our expertise and ensuring the best possible use of our infrastructure. This will also increase our chances to win projects with the Microelectronics group. The transfer is a good example of the strategic development of a research field and the leveraging of synergies across institutes,” says Prof. Holger Hanselka, President of the Fraunhofer-Gesellschaft. “This will strengthen the research field and pave the way for new technological capabilities in the field of microdisplays by leveraging the synergies of the existing microelectronics infrastructure. The close relationship of the institutes at the Dresden site will ensure seamless and continuous advancement in this field. My special thanks go to all those involved for their contributions.”

Related Fraunhofer ISE Develops World's Most Efficient Solar Cell

Harald Schenk, Director of Fraunhofer IPMS, added: “In the future, Fraunhofer IPMS will increase its activities in this area and focus more on the heterogeneous integration of various chiplet technologies in conjunction with CMOS microelectronics. This future-oriented technology includes the integration of organic semiconductors (e.g., OLEDs) and novel emitter technologies (e.g., µLEDs), which will open up new avenues in micro/optoelectronics and microsystems engineering.”

Elizabeth von Hauff, Director of Fraunhofer FEP, said: “The Microdisplays and Sensors business unit has played a significant role in Fraunhofer FEP’s dynamic development. We are proud of this and would like to thank our employees and managers for their dedication. Transferring to Fraunhofer IPMS will open up additional development potential for the business unit and enable Fraunhofer FEP to focus on strategic topics in the field of electron beam and plasma technologies.”

About Fraunhofer IPMS

The Fraunhofer Institute for Photonic Microsystems IPMS is one of the leading research and development service providers in the application fields of intelligent industrial solutions and manufacturing, medical technology and health, and mobility. Research focuses on miniaturized sensors and actuators, integrated circuits, wireless and wired data communication and customized MEMS systems. In state-of-the-art clean rooms, the institute researches and develops solutions on 200 mm and 300 mm wafers. Services range from consulting and process development to pilot production.

About Fraunhofer FEP

The Fraunhofer Institute for Organic Electronics, Electron Beam and Plasma Technology FEP focuses on the development of innovative solutions, technologies and processes for surface finishing. This work is based on the institute’s expertise in the fields of electron beam technology, plasma-assisted large-area and precision coating, roll-to-roll technologies and the development of key technological components.

Fraunhofer FEP thus offers a broad spectrum of research, development and pilot production options, especially for the treatment, sterilization, structuring and refinement of surfaces but also liquids and gases.

Biocompatible Sticker Detects Post Surgical Leaks

A sticker that allows medical professionals to check on the condition of a patient's deep tissues.

Patients recovering from gastrointestinal surgery may soon find their lives saved by a small, simple sticker.

Researchers led by Northwestern University and Washington University School of Medicine in St. Louis have developed a new, first-of-its-kind sticker that enables clinicians to monitor the health of patients' organs and deep tissues with a simple ultrasound device.

BioSUM, an acronym for "Bioresorbable, Shape-adaptive, Ultrasound-readable Materials," was created by Northwestern University's Prof. John A. Rogers and postdoctoral fellow Jiaqi Liu. Dr. Hammill initiated the study and led the evaluation of the prototype.

The BioSUM takes the form of a thin, flexible, biocompatible sticker, made up of several spaced-apart metal discs embedded in a pH-responsive hydrogel base. When attached to an organ, the soft, tiny sticker changes shape in response to the body's changing pH levels, which can serve as an early warning sign of post-surgery complications such as anastomotic leaks. Clinicians can then view these shape changes in real time through ultrasound imaging. As long as no leaks occur, the BioSUM stays in its default state, reports New Atlas.

Currently, no existing methods can reliably and non-invasively detect anastomotic leaks, a life-threatening condition that occurs when gastrointestinal fluids escape the digestive system. By revealing the leakage of these fluids with high sensitivity and specificity, the non-invasive sticker can enable earlier interventions than previously possible. Then, when the patient has fully recovered, the biocompatible, bioresorbable sticker simply dissolves away, bypassing the need for surgical extraction.

"These leaks can arise from subtle perforations in the tissue, often as imperceptible gaps between two sides of a surgical incision," said Northwestern's John A. Rogers, who led device development with postdoctoral fellow Jiaqi Liu. "These types of defects cannot be seen directly with ultrasound imaging tools. They also escape detection by even the most sophisticated CT and MRI scans. We developed an engineering approach and a set of advanced materials to address this unmet need in patient monitoring. The technology has the potential to eliminate risks, reduce costs and expand accessibility to rapid, non-invasive assessments for improved patient outcomes."

Related Johnson & Johnson Partners With Microsoft For Digital Surgery Solutions

"Right now, there is no good way whatsoever to detect these kinds of leaks," said gastrointestinal surgeon Dr. Chet Hammill, who led the clinical evaluation and animal model studies at Washington University with collaborator Dr. Matthew MacEwan, an assistant professor of neurosurgery. "The majority of operations in the abdomen; when you have to remove something and sew it back together; carry a risk of leaking. We can't fully prevent those complications, but maybe we can catch them earlier to minimize harm. Even if we could detect a leak 24- or 48-hours earlier, we could catch complications before the patient becomes really sick. This new technology has potential to completely change the way we monitor patients after surgery."

To evaluate the efficacy of the new sticker, Hammill's team tested it in both small and large animal models. In the studies, ultrasound imaging consistently detected changes in the shape-shifting sticker, even when it was 10 centimeters deep inside tissue. When exposed to fluids with abnormally high or low pH levels, the sticker altered its shape within minutes.

Rogers and Hammill imagine that the device could be implanted at the end of a surgical procedure. Or, because it's small and flexible, the device also fits (rolled up) inside a syringe, which clinicians can use to inject the tag into the body.

Solar Panels in Your Eyeballs to Restore Vision

Scientists are working on implanting small solar panels inside people's eyes to restore vision.

While it may sound like science fiction, a group of Australian scientists is actually working on implanting small solar panels inside people's eyes. Patients with irreversible eye diseases could enjoy a significantly better quality of life thanks to the novel technology.

UNSW researcher Dr Udo Roemer is an engineer who specializes in photovoltaics, known more commonly as solar panel technology. He is in the early stages of researching how solar technology can be used to convert light entering the eye into electricity, bypassing the damaged photoreceptors to transmit visual information to the brain.

“People with certain diseases like retinitis pigmentosa and age-related macular degeneration slowly lose their eyesight as photoreceptors at the center of the eye degenerate,” Dr Roemer says.

“It has long been thought that biomedical implants in the retina could stand in for the damaged photoreceptors. One way to do it is to use electrodes to create voltage pulses that may enable people to see a tiny spot.

“There have already been trials with this technology. But the problem with this is they require wires going into the eye, which is a complicated procedure.”

An alternative idea, however, is to have a tiny solar panel attached to the eyeball that converts light into the electrical impulses the brain uses to create our visual fields. The panel would be naturally self-powered and portable, doing away with the need for cables and wires going into the eye, reports Lachlan Gilbert in UNSW News.

Dr Roemer isn’t the first to investigate the use of solar cells to help restore sight. But rather than focus on silicon-based devices, he has turned his attention to other semiconductor materials such as gallium arsenide and gallium indium phosphide, mainly because it’s easier to tune their properties. Gallium arsenide is also used in the wider solar industry to make much more efficient solar panels, although it’s not as cheap as all-purpose silicon.

“In order to stimulate neurons, you need a higher voltage than what you get from one solar cell,” Dr Roemer says.

“If you imagine photoreceptors being pixels, then we really need three solar cells to create enough voltage to send to the brain. So we’re looking at how we can stack them, one on top of the other, to achieve this.

“With silicon this would have been difficult, that’s why we swapped to gallium arsenide where it’s much easier.”
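
As a rough illustration of the series-stacking logic in the quote above, the sketch below counts how many cells would be needed to clear a stimulation threshold. Both voltage figures are assumptions chosen for illustration, not measurements from the UNSW devices.

```python
# Back-of-the-envelope illustration of stacking solar cells in series.
# The numbers below are assumptions for illustration only.
SINGLE_CELL_VOLTAGE = 1.0      # volts; rough output of one GaAs cell (assumed)
STIMULATION_THRESHOLD = 2.5    # volts; assumed voltage needed to trigger retinal neurons

cells_needed = 1
while cells_needed * SINGLE_CELL_VOLTAGE < STIMULATION_THRESHOLD:
    cells_needed += 1          # stacking cells in series adds their voltages

print(f"stacked cells per pixel: {cells_needed}")   # -> 3 with these assumptions
```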

Related Spiral Lens Gives you Clearer Vision

So how far along is this research?

Dr Roemer says it’s in the proof-of-concept stage.

“So far we’ve successfully put two solar cells on top of each other in the lab on a large area – about 1 cm² – which has given some good results.”

The next step will be to make them into the tiny pixels required for sight and to etch the grooves that separate them. It will then be a small step to increase the stack to three solar cells.

Dr Roemer envisages that by the time this technology is ready to be tested in humans – after extensive testing in the lab, followed by testing in animal models – the device will be about 2 mm² in size, with pixels measuring about 50 micrometers (five hundredths of a millimeter). He stresses that it’s still a way down the track before this technology will be implantable in the retinas of people with degenerative eye diseases.

“One thing to note is that even with the efficiencies of stacked solar cells, sunlight alone may not be strong enough to work with these solar cells implanted in the retina,” he says.

“People may have to wear some sort of goggles or smart glasses that work in tandem with the solar cells that are able to amplify the sun signal into the required intensity needed to reliably stimulate neurons in the eye.”

FDA Clears Dexcom Stelo Glucose Biosensor

FDA has cleared Stelo by Dexcom – the first glucose biosensor that doesn’t require a prescription.

DexCom, the global leader in real-time continuous glucose monitoring for people with diabetes, announced today that the FDA has cleared Stelo by Dexcom – the first glucose biosensor that doesn’t require a prescription. There are approximately 25 million people in the U.S. living with Type 2 diabetes who do not use insulin and who can benefit from continuous glucose monitoring (CGM) technology. Today, Dexcom G7 is available for them with a prescription. Stelo, cleared for use without a prescription, will make it even easier for this population to access leading CGM technology, and will provide an option for those who do not have insurance coverage for CGM.

Related DarioHealth to Integrate Dexcom CGM Data

“Dexcom continues to lead innovation in the CGM market, with a long list of first-in-market advances. Dexcom was the first to connect CGM to multiple insulin delivery devices, the first to connect CGM to a smartphone, the first to replace fingersticks for treatment decisions, and now is creating a new category by bringing the first glucose biosensor cleared for use over-the-counter,” said Jake Leach, executive vice president and chief operating officer at Dexcom. “Based on our experience serving people with Type 2 diabetes not using insulin, we have developed Stelo with their unique needs in mind.”

Continuous glucose monitoring plays an integral role in the management of Type 2 diabetes, and its benefits are proven when used alone or alongside other diabetes and weight management medications. Studies show that the use of Dexcom continuous glucose monitoring by people with Type 2 diabetes is associated with clinically meaningful improvements in time in range, A1c, and quality of life, reports Business Wire.

“Use of CGM can help empower people with diabetes to understand the impact of different foods and activity on their glucose values,” said Dr. Tamara Oser, MD, Family Physician. “For people newly diagnosed with Type 2 diabetes or not taking insulin, these devices are often not covered by insurance and Stelo presents an opportunity to provide valuable information that can impact their diabetes management.”

Stelo will be available for purchase online without a prescription starting summer 2024.

About DexCom

DexCom, Inc. empowers people to take real-time control of health through innovative continuous glucose monitoring (CGM) systems. Headquartered in San Diego, Calif., and with operations across Europe and select parts of Asia/Oceania, Dexcom has emerged as a leader of diabetes care technology. By listening to the needs of users, caregivers, and providers, Dexcom works to simplify and improve diabetes management around the world.

Smart Glasses Use Eye Tracking Via Sonar

Researchers have developed prototypes of a technology that uses sonar for tracking eye movements.

Prototypes of a sonar-like device have been created at Cornell University in New York, and it may eventually replace cameras for tracking eye movements. It uses tiny speakers, one set for each eye, that emit sound at frequencies above 18 kHz, which most people cannot hear.

The sound is aimed at the wearer's face, reflects off it, and is picked up by four microphones on either side of the frame. These sound waves are then interpreted by an algorithm known as GazeTrak, which helps the researchers discern the direction of the wearer's gaze, reports Mixed.
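
To make the signal flow more concrete, here is a minimal, hypothetical sketch of one way echo features could be turned into a gaze estimate: measure the energy of each microphone's signal in the ultrasonic band and feed those features to a trained regressor. The sampling rate, band limits, and the use of ridge regression are all assumptions made for illustration; GazeTrak's actual algorithm is not described in detail here and may work quite differently.

```python
# Illustrative sketch only -- NOT GazeTrak's published pipeline.
# Assumption: the glasses stream audio from 8 microphones at 48 kHz, and a
# regression model has already been trained on labelled gaze data.
import numpy as np
from sklearn.linear_model import Ridge

SAMPLE_RATE = 48_000          # Hz, assumed microphone sampling rate
BAND = (18_000, 22_000)       # ultrasonic band emitted by the speakers (assumed)

def band_energy(frame: np.ndarray) -> float:
    """Energy of one microphone frame inside the ultrasonic band."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    mask = (freqs >= BAND[0]) & (freqs <= BAND[1])
    return float(np.sum(spectrum[mask] ** 2))

def features(frames: list) -> np.ndarray:
    """One feature vector per snapshot: band energy of each of the 8 mics."""
    return np.array([band_energy(f) for f in frames])

# Stand-in training data: echo features plus gaze angles (yaw, pitch) that a
# reference eye tracker would supply during a calibration session.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 8))
y_train = rng.normal(size=(200, 2))
model = Ridge(alpha=1.0).fit(X_train, y_train)

# At run time: grab the latest frame from each microphone and predict gaze.
live_frames = [rng.normal(size=1024) for _ in range(8)]
yaw, pitch = model.predict(features(live_frames).reshape(1, -1))[0]
print(f"estimated gaze: yaw={yaw:.2f}, pitch={pitch:.2f} (arbitrary units)")
```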

The research team believes that sonar technology should provide a number of benefits. Compared with camera-based systems, it uses less electricity and gives users greater privacy, because no cameras are continuously recording. Additionally, it might reduce the weight and production costs of VR headsets.

Sonar-based eye tracking demonstrated an accuracy of up to 3.6 degrees in tests involving 20 individuals. That is less accurate than modern high-end devices like the Apple Vision Pro, but according to the researchers, this performance should be adequate for the majority of virtual reality applications.

Related These AR Glasses Can Translate Languages and Detect Images

As of now, the technology has one significant limitation: each user's eyes are unique, so the AI model employed by GazeTrak needs to be trained for each individual. Bringing the eye-tracking sonar to market would require collecting enough data to produce a universal model.

Virtual reality relies heavily on eye tracking, which lets you focus on certain spots in menus to navigate, or make eye contact with other avatars in virtual surroundings. Right now, the Apple Vision Pro is showing how accurate eye tracking can enhance the user experience.

Additionally, eye tracking allows for rendering, as on the PlayStation VR2, that takes the user's focus into account by displaying a precise depiction of the area being watched and a less detailed picture in the periphery. Ingenious control techniques are also made possible by the technology; one example is the VR game Before Your Eyes, which is controlled entirely with the eyes.

New AI Algorithm Receives FDA Clearance

A technology that enables AI-powered sleep diagnosis using pulse oximetry devices.

EnsoData, a pioneer in healthcare AI, has achieved FDA 510(k) clearance for groundbreaking technology that enables AI-powered sleep diagnosis using FDA-cleared pulse oximetry devices. Powered by EnsoSleep PPG scoring, widely available, wearable pulse ox technology can be deployed as a high-quality, accessible, and cost-effective approach to diagnosing sleep disorders, including sleep apnea.

Sleep apnea is a highly prevalent but often undiagnosed condition that exacerbates cardiovascular diseases like high blood pressure and congestive heart failure, neurodegenerative diseases like Alzheimer's, metabolic disorders including diabetes, stroke, and more. It is estimated that over 29.4 million Americans have sleep apnea, with more than 80% of cases still undiagnosed.

With early and accurate diagnosis of sleep apnea, clinicians can help prevent complications and reduce healthcare expenses, not only saving lives but also having a profound impact on healthcare economics.

EnsoSleep, EnsoData's previously FDA-cleared diagnostic AI analysis and sleep scoring solution, uses machine learning to analyze data from traditional sleep studies to aid physicians in diagnosing sleep disorders. With this new clearance, EnsoSleep PPG will provide more opportunities for clinicians to effectively reach an undiagnosed patient population by enabling AI-driven analysis using the photoplethysmogram (PPG) signals recorded by pulse oximeters.

Related Sleep Disorder Diagnosis Software Receives FDA Approval

"Expanding EnsoData's capability to collect and analyze PPG signals from simple, wearable pulse ox devices will accelerate the identification, diagnosis and treatment of sleep disordered breathing events, including sleep apnea," said Justin Mortara, President and CEO of EnsoData. "With this latest FDA clearance, we expect to build upon and diversify our partner ecosystem to reach more patients with our leading AI solutions."

Compared to earlier generations of sleep diagnostic equipment, pulse ox devices are smaller and less expensive. They can be as simple to wear as a ring or watch and record physiological data related to sleep and breathing, such as a patient's oxygen saturation levels and heart rate.

Using this data, EnsoSleep PPG's deep learning models automatically detect respiratory events, including sleep disordered breathing events such as apneas and hypopneas, as well as sleep stages (REM, deep sleep, light sleep, and wake) and other sleep measures. These results can be displayed and edited by a qualified healthcare professional and then exported into a final sleep report for the patient.

By extending accurate sleep testing to more widely available pulse ox devices, clinicians can expedite the diagnostic process and provide patients with answers to their health problems more quickly.

"Our interoperable AI tools are democratizing the ability to accurately measure sleep and aid in diagnosis of sleep disorders broadly – for the existing category of FDA-cleared pulse oximetry devices and sensors that are already widely deployed, in-use clinically, and growing in their adoption daily. With PPGs among the most commonplace of medical waveforms collected across healthcare settings, from diagnostic tests to bedside monitors and consumer wearables, this will be transformative for patient access and outcomes to achieve better sleep and overall health," said Chris Fernandez, Co-founder and CRO of EnsoData. "This FDA clearance marks a pivotal moment in sleep diagnostics, and also long-term therapy monitoring and management, where enhanced accessibility and affordability can create a new normal in sleep care."

About EnsoData

EnsoData is a healthcare technology company that uses artificial intelligence (AI) and machine learning (ML) technology to perform complex and time-consuming data interpretation and analysis to help diagnose and treat health conditions.

Smart Prosthetic Lets Man Feel Hot and Cold

A new device that makes it possible for a person with an amputation to sense temperature.

The history of prosthetics begins in ancient Egypt. There are indications that the "Cairo toe" was more than just a cosmetic replacement: the wood-and-leather digit was made to be flexible even 3,000 years ago, indicating that it served a purpose in addition to being aesthetically pleasing.

Since then, a lot of effort has gone into finding creative ways to improve the lives of those who have had amputations. Engineers started adding wires, gears, and springs to prostheses as early as the 15th century to allow users to grab objects and bend joints, albeit in a restricted way. Prosthetics were designed to carry out particular functions, such as playing the piano or holding a shield. They also increasingly incorporated newly available materials, such as rubber, polymers, and thin metals.

Now, a team of researchers in Italy and Switzerland has developed a new device that makes it possible for a person with an amputation to sense temperature with a prosthetic hand. The technology is a step toward prosthetic limbs that fully restore a person's senses, increasing their utility and acceptance by the wearer, reports Science News.

The device, dubbed "MiniTouch," was affixed to the prosthetic hand of a 57-year-old man named Fabrizio Fidati, who had his arm amputated above the wrist. In experiments, he demonstrated perfect accuracy in identifying cold, cool, and hot bottles of liquid; much greater accuracy than chance in distinguishing between plastic, glass, and copper; and approximately 75% accuracy in sorting steel blocks according to temperature.

The prosthetic works by applying heat or cold to the skin on the upper arm in specific locations that trigger a thermal sensation in the phantom hand.

Related New Exoskeleton Helping Disabled Walk, Stand

“In a previous study, we have shown the existence of these spots in the majority of amputee patients that we have treated,” says Solaiman Shokur at the Swiss Federal Institute of Technology in Lausanne.

Fidati was also able to accurately identify glass, copper, and plastic by touch when wearing the prosthetic, with an accuracy of slightly over two-thirds, matching what he could do with his intact left hand.

In a different, recently published study, Shokur and his associates demonstrated that individuals with amputations who use prosthetics that sense temperature can distinguish between moist and dry objects.

“We could provide a wetness sensation to amputees and… they were as good at detecting different levels of moisture as with their intact hands,” says Shokur.

Robot Startup Secures $675M, Inks Partnership With OpenAI

Figure has raised $675M in Series B funding and entered into a collaboration agreement with OpenAI.

Figure, an AI robotics company developing general purpose humanoid robots, announced that it has raised $675M in Series B funding at a $2.6B valuation with investments from Microsoft, OpenAI Startup Fund, NVIDIA, Jeff Bezos (through Bezos Expeditions), Parkway Venture Capital, Intel Capital, Align Ventures, and ARK Invest. This investment will accelerate Figure's timeline for humanoid commercial deployment.

In conjunction with this investment, Figure and OpenAI have entered into a collaboration agreement to develop next generation AI models for humanoid robots, combining OpenAI's research with Figure's deep understanding of robotics hardware and software. The collaboration aims to help accelerate Figure's commercial timeline by enhancing the capabilities of humanoid robots to process and reason from language.

"We've always planned to come back to robotics and we see a path with Figure to explore what humanoid robots can achieve when powered by highly capable multimodal models. We're blown away by Figure's progress to date and we look forward to working together to open up new possibilities for how robots can help in everyday life," said Peter Welinder, VP of Product and Partnerships at OpenAI.

The Figure team, made up of top AI robotics experts from Boston Dynamics, Tesla, Google DeepMind, and Archer Aviation, has made remarkable progress in the past few months in the key areas of AI, robot development, robot testing, and commercialization (Figure recently announced its first commercial agreement with BMW Manufacturing to bring humanoids into automotive production). This new capital will be used strategically for scaling up AI training, robot manufacturing, expanding engineering headcount, and advancing commercial deployment efforts.

Figure will leverage Microsoft Azure for AI infrastructure, training, and storage. "We are excited to collaborate with Figure and work towards accelerating AI breakthroughs. Through our work together, Figure will have access to Microsoft's AI infrastructure and services to support the deployment of humanoid robots to assist people with real world applications," said Jon Tinter, Corporate Vice President of Business Development at Microsoft.

Related Robot Completes Surgery in Space

"Our vision at Figure is to bring humanoid robots into commercial operations as soon as possible. This investment, combined with our partnership with OpenAI and Microsoft, ensures that we are well-prepared to bring embodied AI into the world to make a transformative impact on humanity," said Brett Adcock, Founder and CEO of Figure. "AI and robotics are the future, and I am grateful to have the support of investors and partners who believe in being at the forefront."

About Figure

Figure is an AI Robotics company developing autonomous general purpose humanoid robots. Our Humanoid is designed for initial deployment into the workforce to address labor shortages, jobs that are undesirable or unsafe, and to support supply chain on a global scale. Figure is a Sunnyvale, California-based company with a team of 80 employees, founded 21 months ago.

Dubai Startup Launches Iron Man Inspired Contact Lens

Xpanceo has released prototypes of smart contact lenses, which were inspired by the Iron Man movies.

Dubai-based start-up Xpanceo has unveiled prototypes of its smart contact lens, which aims to mimic the technology featured in the Iron Man films.

Xpanceo was founded by Roman Axelrod, a former engineer at Google and Facebook, who was inspired by the fictional character Tony Stark, also known as Iron Man, and his advanced artificial intelligence system, JARVIS. Axelrod set out to design a wearable that would provide a smooth and engaging interface between people and machines without requiring cumbersome screens, headphones, or glasses.

Axelrod said that the business, which raised $40 million in a seed funding round in October, intends to put the device in stores by 2027, following human studies that are expected to be completed within two years, reports ITC.

"It usually takes from half a year to several years [to conduct human trials]. After that, they will be on shelves, I would say [by] 2027 or 2028, in optical stores," he said.

The four Xpanceo smart contact lenses with different functions shown at Mobile World Congress 2024 will be combined into one universal device in the future.

The sensor lens is designed to gauge eye pressure and notify the user of any signs of glaucoma. The "super vision" lens uses nanoparticles to enlarge images or enhance visibility in low light. Visitors did not, however, put the lenses into their eyes during the show; instead, their capabilities were demonstrated on specialized platforms.

Related Smart Contact Lens Powered by Salt Water

Xpanceo intends to incorporate a neuro-interface into the device, enabling direct mental control of the lens. Glucose sensors are also planned, along with monitoring of cortisol levels, blood pressure, and more. If your blood pressure is elevated, the lens will be able to alert you that it might be wise to skip another cup of coffee.

In addition to giving vision superpowers, the nanoparticles will treat vision-related disorders like myopia and strabismus. The lens will dynamically alter its own properties as needed to provide consistently good vision.

Because it offers a customized, interactive, and augmented reality experience, the smart contact lens has the potential to transform a number of industries, including communication, education, entertainment, health, and security. People with diseases, disabilities, or visual impairments may see improvements in their quality of life and capabilities.

Ultrahuman’s CGM Hits US Market

Ultrahuman stated that it has now launched the M1 CGM in the United States.

Ultrahuman is a health-tech brand best known for its wearable devices, particularly its smart rings. Now, the company has launched its continuous glucose monitor in the US. The Ultrahuman M1 CGM (continuous glucose monitor), formerly known as Ultrahuman Cyborg, was released in the UK, the Netherlands, and India back in June 2021. It proved to be a huge success, with many consumers raving about its insights.

The Ultrahuman M1 CGM is worn on the back of the arm and measures the user's blood sugar continuously for a period of 14 days, pairing with the Ultrahuman app over Bluetooth to show the gathered data in real time. The app is designed to give users suggestions on how to improve their metabolism by showing how their diet and fitness levels affect it. The sensor can easily fit into any daily routine because it is water-resistant for 30 minutes at depths of up to 2 meters.

A New Era In Metabolic Health

The primary attraction of the M1 is its capacity to offer prompt feedback on an individual's glucose levels, a crucial indicator of metabolic well-being.

Related South Korean CGM Receives Regulatory Approval

By pairing the device with the Abbott FreeStyle Libre sensor, customers can use a smartphone app to track their blood sugar levels in real time. This feature is more than just a convenience; it is a game-changer for anyone trying to learn how different foods affect their body. The M1 is easy to use without compromising accuracy or dependability, with the adhesive sensor patch replaced every two weeks and a simple NFC scan used to establish the Bluetooth connection. Additionally, its 30-minute, 30-meter water resistance means it blends in with its users' varied and active lifestyles, reports BNN.

You can order the Ultrahuman M1 CGM now in a variety of bundles (as spotted by T3). It costs $239 for a month (with two sensors), $559 for three months (six sensors), or $1,919 for a year (a total of 26 sensors). That's with the current 20% offer applied, which will only last for a limited time.

Archinisis Unveils Rowing Performance Analysis System

Archinisis’ new rowing analysis system offers real-time feedback and detailed performance insights.

Archinisis is redefining rowing performance analysis with its latest sensor technology. This innovation captures real-time data, offering athletes and coaches immediate insights into stroke efficiency and rowing dynamics. Designed for simplicity and effectiveness, it seamlessly integrates into daily training routines to enhance training and competitive performance.

To enhance the rowing experience, Archinisis' technology is not only about capturing data but also about understanding it in the context of rowing dynamics. The tool's design focuses on easy setup and operation, allowing coaches to concentrate on observing athletes and providing feedback rather than technical configurations and cumbersome data analysis. By providing detailed yet easily understandable feedback on stroke rate, boat speed, and other crucial metrics, it allows for a modern, data-driven training approach.

The real-time feedback offers coaches immediate insights into a boat’s performance, allowing for on-the-spot adjustments. By attaching a sensor to the boat, it captures and streams motion data to a server that calculates key metrics like stroke rate, speed, and pace. This information is readily available on any device, anywhere, with no distance limitations affecting access.
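
As an illustration of how a metric like stroke rate could be derived from boat motion data, here is a minimal sketch using a synthetic acceleration signal and simple peak detection. The sampling rate, thresholds, and signal model are assumptions made for the example; Archinisis' actual processing pipeline is not published here and will differ.

```python
# Illustrative only -- not Archinisis' algorithm. A sketch of estimating stroke
# rate from a boat-mounted accelerometer: each drive phase shows up as an
# acceleration peak, and the spacing between peaks gives strokes per minute.
import numpy as np
from scipy.signal import find_peaks

FS = 100.0                      # Hz, assumed sensor sampling rate
t = np.arange(0, 60, 1 / FS)    # one minute of synthetic data
true_rate = 32                  # strokes per minute used to synthesize the signal
accel = np.sin(2 * np.pi * true_rate / 60 * t) + 0.1 * np.random.randn(t.size)

# Peaks must be at least ~1.2 s apart (stroke rates rarely exceed ~50 spm).
peaks, _ = find_peaks(accel, distance=FS * 60 / 50, height=0.5)
intervals = np.diff(peaks) / FS            # seconds between strokes
stroke_rate = 60.0 / intervals.mean()      # strokes per minute

print(f"estimated stroke rate: {stroke_rate:.1f} spm")
```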

Selected national and high-performance university rowing teams are already utilizing this technology for various purposes, including monitoring training sessions, tracking progress, optimizing team composition, and analyzing race performance.

For more information visit Archinisis’ website: https://www.archinisis.ch/sports/rowing.html

About Archinisis:

Archinisis, founded in 2018, is at the forefront of developing sophisticated performance analysis technology, aimed at enhancing athletic performance across various sports disciplines. Their mission revolves around providing precise and actionable insights through advanced yet easy-to-use sensor technology and analytics, enabling athletes and coaches to refine strategies and optimize performance.

New Exoskeleton Helping Disabled Walk, Stand

A new robotic exoskeleton could allow people to stand up and even walk.

Those who have lost the ability to move their legs may be able to stand and perhaps walk again thanks to a new robotic exoskeleton. By holding them up and guiding their motions while they participate in rehabilitative therapy, it may also help them learn to walk unassisted once more.

The lower-body exoskeleton, called Twin, was shown on Friday at a press conference at the Museum of Science and Technology in Milan. It is the product of Italian design, reports Ben Coxworth in New Atlas.

Scientists from the Istituto Italiano di Tecnologia (the Italian Institute of Technology) and the Istituto Nazionale Assicurazione Infortuni sul Lavoro (the National Institute for Insurance against Accidents at Work) are developing it; it is now only in prototype form.

It is designed for individuals whose lower-body motor function is limited or nonexistent. Motors move the patient's legs at the knee and hip joints, and the onboard battery that powers them is said to provide four hours of operation from a single one-hour charge.

Twin can be utilized in three different operational modes.

In Walk mode, designed for individuals who have no use of their legs at all, the exoskeleton moves the user's legs and also assists with sitting and standing. As with previous assistive exoskeletons, such as those manufactured by ReWalk, the user still requires crutches for balance.

Related Exoskeleton Designed to Prevent Overextension of Finger

Retrain mode lets patients walk as independently as possible while providing an adjustable level of help as necessary. It is designed for those who still have some lower-limb motor function, and the exoskeleton guides them toward a preset ideal leg-movement trajectory as they walk.

Lastly, there is the TwinCare mode, designed for people who can fully use one leg but not the other. In this instance, the exoskeleton increases the afflicted leg's range of motion to match that of the healthy leg. Using a wirelessly connected Android tablet, a physiotherapist or the user themselves can adjust gait parameters including stride length/type and walking speed in all three modes.

Twin's modular design, which permits components to be removed for transportation or updating, and its use of lightweight materials—aluminum alloy rather than steel, for example—are two qualities that, according to its designers, set it apart from other exoskeletons of a similar type.

The device has been in development since the end of 2013, and production is expected to begin soon.

Ultra-Thin Minimally Invasive Pacemaker

A group of scientists from the University of Chicago has created a wireless, light-powered device.

Sometimes our bodies need a boost. Millions of Americans rely on pacemakers—small devices that regulate the electrical impulses of the heart in order to keep it beating smoothly. But to reduce complications, researchers would like to make these devices even smaller and less intrusive.

A team of researchers with the University of Chicago has developed a wireless device, powered by light, that can be implanted to regulate cardiovascular or neural activity in the body. The featherlight membranes, thinner than a human hair, can be inserted with minimally invasive surgery and contain no moving parts, reports Louise Lerner in UChicago News.

Published Feb. 21 in Nature, the results could help reduce complications in heart surgery and offer new horizons for future devices.

“The early experiments have been very successful, and we’re really hopeful about the future for this translational technology,” said Pengju Li, a graduate student at the University of Chicago Pritzker School of Molecular Engineering and first author on the paper.

A new frontier

The laboratory of Prof. Bozhi Tian has been developing devices for years that can use technology similar to solar cells to stimulate the body. Photovoltaics are attractive for this purpose because they do not have moving parts or wires that can break down or become intrusive—especially useful in delicate tissues like the heart. And instead of a battery, researchers simply implant a tiny optic fiber alongside to provide power.

But for the best results, the scientists had to tweak the system to work for biological purposes, rather than how solar cells are usually designed.

“In a solar cell, you want to collect as much sunlight as possible and move that energy along the cell no matter what part of the panel is struck,” explained Li. “But for this application, you want to be able to shine a light at a very localized area and activate only that one area.”

For example, a common heart therapy is known as cardiac resynchronization therapy, where different parts of the heart are brought back into sync with precisely timed charges. In current therapies, that’s achieved with wires, which can have their own complications.

Related Novel Cable System for Heart Pumps Doesn’t Cause Infections

Li and the team set out to create a photovoltaic material that would only activate exactly where the light struck.

The eventual design they settled on has two layers of a silicon material known as P-type, which respond to light by creating electrical charge. The top layer has many tiny holes—a condition known as nanoporosity—which boost the electrical performance and concentrate electricity without allowing it to spread.

The result is a miniscule, flexible membrane, which can be inserted into the body via a tiny tube along with an optic fiber—a minimally invasive surgery. The optic fiber lights up in a precise pattern, which the membrane picks up and turns into electrical impulses.

The membrane is just one micrometer thick, about 100 times thinner than the finest human hair, and a few centimeters square. It weighs less than one fiftieth of a gram, significantly less than current state-of-the-art pacemakers, which weigh at least five grams. “The more lightweight a device is, the more comfortable it typically is for patients,” said Li.

“This advancement is a game-changer in cardiac resynchronization therapy,” said Narutoshi Hibino, professor of surgery at the University of Chicago Medicine and co-corresponding author on the study. “We're at the cusp of a new frontier where bioelectronics can seamlessly integrate with the body's natural functions.”

Light use

Though the first trials were conducted with heart tissue, the team said the approach could be used for neuromodulation as well—stimulating nerves in movement disorders like Parkinson’s, for example, or to treat chronic pain or other disorders. Li coined the term ‘photoelectroceuticals’ for the field.

Tian said the day when they first tried the pacemaker in trials with pig hearts, which are very similar to those of humans, remains vivid in his memory. “I remember that day because it worked in the very first trial,” he said. “It's both a miraculous achievement and a reward for our extensive efforts.”

The research team is currently working with the UChicago Polsky Center for Entrepreneurship and Innovation to commercialize the device.

March 2024: Revolution in Diabetes, Painless Smartpatch

Medicsen revolutionizes diabetes treatment with a painless Smartpatch.

In the ever-evolving field of healthcare technology, Medicsen emerges as a bold innovator, shaping the future for people living with diabetes. Driven by the desire to eliminate the pain and inconvenience associated with traditional diabetes management, Medicsen has developed a revolutionary solution: the Medicsen Smartpatch.

Imagine a world where managing diabetes is painless, discreet, and personalized. The Medicsen Smartpatch makes this vision a reality. This needle-free, wearable device acts as a smart companion, simulating the functions of the pancreas and delivering essential medications directly through the skin.

Utilizing harmless waves, the Smartpatch painlessly opens natural skin pores, allowing macromolecules like insulin and heparin to enter the body. This innovative technology ensures maximum comfort and discretion, tucked away in a small, wearable device.

Image credits: Medicsensors S.L

About Medicsen

Medicsen was founded in 2023 and has quickly developed into an award-winning start-up in the field of medical technology. The company specializes in pain-free drug delivery through the skin and wearable devices for patients with chronic diseases. The company focuses on developing user-friendly and pain-free solutions for a better quality of life.

Smart Gloves Could Teach New Physical Skills

Researchers at MIT have developed smart gloves that can record personalized haptic feedback.

MIT researchers have created smart gloves that can record, transmit, and provide customized haptic input. The technology has the potential to improve both in-person and virtual learning environments.

The training code and experimental data have been made openly available to interested parties by the researchers.

The scientists weave haptic actuators, similar to those found in smartphones, into textiles using a digital sewing machine. The present iteration replicates sensations that can simulate holding an object or pressing a button, reports TechSpot.

The gloves could enable novel teaching approaches by recording sensations and transferring them between users. In one piano-teaching test, a song was recorded by capturing the feel of the keys as they were pressed; students wearing the smart gloves could then experience the same sensation when hovering over the appropriate keys, adding a tangible component to digitally transmitted education that mimics hands-on training.

Training gloves could also be used by firefighters, pilots, and surgeons. They might also assist humans in teaching robots or provide them with more precise direct control. The researchers used haptic feedback to guide a robot in precisely how much pressure to use when handling fragile goods.

Additionally, a machine learning phase adapts the haptic feedback and the gloves to each user according to their hand measurements and reactions. It takes only 15 seconds to personalize a pair of gloves and about 10 minutes to make a pair for a new user. This personalization is essential because each person experiences tactile feedback differently.
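
The details of MIT's personalization step aren't spelled out in the article, but the general idea of tuning feedback to a user's responses can be sketched as a simple calibration loop. Everything below, from the 0-10 intensity scale to the proportional update, is a hypothetical illustration rather than the researchers' method.

```python
# Hypothetical sketch of per-user haptic calibration, assuming the system can
# drive an actuator at a chosen amplitude and read back the user's perceived
# intensity (simulated here). This illustrates the idea only; it is not MIT's
# actual personalization procedure.
import random

TARGET_PERCEIVED = 5.0     # desired perceived intensity on a 0-10 scale (assumed)
GAIN = 0.3                 # how aggressively to adjust the drive amplitude

def perceived_intensity(amplitude: float, sensitivity: float) -> float:
    """Simulated user response: perception scales with personal sensitivity."""
    return amplitude * sensitivity + random.uniform(-0.2, 0.2)

def calibrate(sensitivity: float, steps: int = 20) -> float:
    """Adjust amplitude until the reported intensity matches the target."""
    amplitude = 1.0
    for _ in range(steps):
        error = TARGET_PERCEIVED - perceived_intensity(amplitude, sensitivity)
        amplitude += GAIN * error          # simple proportional update
    return amplitude

# Two users with different tactile sensitivity end up with different settings.
for user_sensitivity in (0.8, 1.6):
    print(f"sensitivity {user_sensitivity}: amplitude {calibrate(user_sensitivity):.2f}")
```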

Related Researchers Develop Smart Gloves for Safe Surgery

The experiments also used video games tailored to the haptic feedback, for activities such as driving and rhythm following. Players with optimized feedback outperformed those with unoptimized haptics, demonstrating that individualized sensation enables more accurate tactile perception.

Accuracy might rise with additional development, and the technology could be used for other activities. Stronger haptics may make smart textiles possible for body parts that are less sensitive than the hands. More sophisticated AI might be used to teach more difficult tasks, such as flying an airplane or shaping clay, and more user data may result in more realistic tactile simulation and better-fitting gloves.

Linxens Expands Its Knowledge of Micro-Connectors

Linxens is approaching new markets in addition to growing into related industries.

Linxens is a world leader in the design and production of smartcard inlays and connectors. In addition to expanding into allied fields like RFID and reel-to-reel flexible electronics, Linxens is entering new markets with its early-warning odor sensors for the Li-ion batteries used in electric cars.

Cuong Duong, CEO of Linxens, stated that the company has a long history, having created the micro-connector, a crucial part of the smart card, more than 40 years ago. Linxens has carried on with this business, concentrating on creating cutting-edge technologies in various fields.

Linxens has expanded its scope of expertise over time to include RFID antennas and inlays, which are essential parts of RFID tags used in a variety of applications such as supply chain management, contactless payment systems, and hospitality, reports David Savastano, Editor of Printed Electronics.

“The world's biggest technology players entrust us with their projects in the fields of payments, telecoms, identity, access control, hospitality and leisure and logistics, etc.,” Duong said. “Leveraging on its knowledge in micro-connectors, Linxens contributed to the development of flexible electronic solutions, which are applied in diverse industries such as wearables, healthcare, and automotive.

Read more Wellysis ECG Patch Hits US, Indian Markets

“Today, with over 120 billion micro-connectors and more than 6 billion RFID antennas produced, Linxens is the global specialist in the design and manufacture of these electronic components,” added Duong. “Linxens is exploring new fields in which to apply its technological expertise, such as connected healthcare or authentication and traceability for the IoT or identification for the government.”

Duong pointed out that Linxens places a high priority on innovation and customer service in a number of areas, including government, healthcare, and the Internet of Things.

Duong believes Linxens has a promising future as it seeks to enter new markets.

“Linxens’ objective is to be the leader in all its historical market segments by enabling technology to facilitate the consumer’s life and providing more secure, reliable and sustainable solutions,” Duong concluded. “Linxens is striving to maximize its diversification strategy through innovation on new markets, including healthcare and IoT solutions.”

New Smart Glass Uses Snapdragon AR1 Platform

Qualcomm and Applied Materials are working with Avegant to develop a blueprint for smart glasses.

Building off the announcement of the Snapdragon AR1 Platform in October 2023, Avegant is collaborating with Qualcomm Technologies, Inc. and Applied Materials, Inc. to create a blueprint for lightweight, wireless AI smart glasses. This sleek design is under evaluation by Fortune 100 companies, with products expected to be in the market soon.

This AI smart glass architecture brings together industry-leading augmented reality (AR) technology providers to unlock a new wave of AR glass innovation. It features the smallest 30° LCoS light engine in the market, Avegant's AG-30L2, along with Applied Materials' high-efficiency waveguides and Qualcomm Technologies' latest Snapdragon AR1 Gen 1 Platform, delivering full color, binocular, bright daylight-capable experiences.

"Our AG-30L2 incorporates innovative illumination and optical designs to significantly reduce the light engine volume, enabling our customers to build true, glasses-like form factor products. The AG-30L2 is in production now and seeing extraordinary adoption and excitement from customers thanks to its small form factor and performance. We are excited to help bring AI smart glasses powered by the Snapdragon AR1 to market," said Ed Tang, CEO of Avegant.

"The Snapdragon AR1 Platform is the first XR platform designed for AI smart glasses. Packed with next-generation technology, the platform is the perfect blend of intelligence and style with support for binocular displays, premium dual ISPs, powerful on-device AI, and blazing fast connectivity. We are excited to work with Avegant and Applied Materials to bring this exciting AI glass category to life," said Said Bakadir, Senior Director, Product Management at Qualcomm Technologies, Inc.

Related These AR Glasses Can Translate Languages and Detect Images

"Consumers want stylish AR smart glasses that provide brilliant clarity and are comfortable to wear on a daily basis. Our waveguide technology offers unparalleled efficiencies and enables crisp and clear images in a lightweight form factor," said Paul Meissner, VP and GM of Applied Materials' Photonics Platforms Business in the Office of the CTO. "We are excited by the opportunity to co-optimize our leading component devices with Qualcomm Technologies and Avegant to create compelling user experiences."

About Avegant

Avegant is a well-funded, venture-backed technology company developing next-generation display technology to enable previously impossible augmented reality experiences. The company uses its deep scientific understanding of human sight and head-mounted display ergonomics together with its consumer electronics manufacturing experience to develop displays that enable realistic AR experiences for consumers. Avegant's AR Light Engines will enable customers to provide a compelling AR experience in a consumer wearable AR device.


Robot Avatar Lets People See and Feel Things Remotely

People can attend events without traveling by using a humanoid robot that can transmit touch.

People can attend events without traveling by using a humanoid robot that transmits touch and visual sensations to an individual hundreds of kilometers away, provided that person is wearing haptic feedback gloves and a virtual reality (VR) headset.

The iCub 3 robot has 54 points of movement throughout its plastic and aluminum alloy body. It weighs 52 kg and is 125 centimeters tall. Two cameras replace the eyes on its head, and an internet-connected computer sits where the brain should be. The robot's "brain" receives data from the cameras as well as from sensors all over its body. A remote human operator then dons a suit and VR headset that recreate these sensations, reports Chris Stokel-Walker in NewScientist.

The suit's sensors detect the operator's movements in response to these sensations, and the robot mimics those movements. “The key is to translate every signal and bit of numeric data that can be sent through the network,” explains Stefano Dafarra, an iCub 3 team member from the Italian Institute of Technology. There may be a delay of up to 100 milliseconds between the time the video footage is captured and the time it is transmitted; the operator can compensate by moving a little more slowly than usual.
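
To make that data flow concrete, the sketch below shows a minimal teleoperation loop in Python. It is not the iCub 3 team's actual software: the pose source, the 0.8 slowdown factor, and the loop structure are illustrative assumptions, with only the roughly 100-millisecond delay taken from the article. Operator poses are queued with a simulated one-way network delay, and the commanded motion is scaled down slightly to mimic the "move a little more slowly" compensation described above.

```python
# Hypothetical teleoperation sketch (not the iCub 3 team's code): operator poses
# are "sent" to the robot with a simulated one-way delay of ~100 ms, and the
# operator's motion is scaled down slightly to compensate for that lag.
import time
import random
from collections import deque

NETWORK_DELAY_S = 0.1   # worst-case one-way delay cited in the article (~100 ms)
SLOWDOWN = 0.8          # assumed factor for "moving a little more slowly than usual"

def read_operator_pose():
    """Stand-in for the motion-capture suit: returns a fake joint angle in degrees."""
    return 30.0 * random.uniform(-1.0, 1.0)

def main():
    in_flight = deque()   # entries: (time the packet reaches the robot, joint angle)
    robot_angle = 0.0

    start = time.time()
    while time.time() - start < 1.0:              # run the demo loop for one second
        now = time.time()

        # 1. Sample the operator, scale the motion down, and queue it with its arrival time.
        commanded = read_operator_pose() * SLOWDOWN
        in_flight.append((now + NETWORK_DELAY_S, commanded))

        # 2. The robot only applies packets whose simulated network delay has elapsed.
        while in_flight and in_flight[0][0] <= now:
            _, robot_angle = in_flight.popleft()

        print(f"t={now - start:5.2f}s  commanded={commanded:6.1f}  robot sees={robot_angle:6.1f}")
        time.sleep(0.05)                          # roughly a 20 Hz control loop

if __name__ == "__main__":
    main()
```

In the real system the same constraint applies in both directions, since video and touch data must travel back to the operator's headset and gloves within a similar latency budget.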

The robot was shown off by the team during the Venice Biennale, where it navigated an exhibit while its operator watched from Genoa, 290 kilometers away.

Read more Accelerating the Production of Soft Robots

Dafarra anticipates that more people will use the iCub 3 to attend events virtually, cutting down on travel. However, he notes that a fall could currently cause serious damage to the robot, and it is unclear whether it could get back up on its own.

“iCub 3 is an interesting robot and offers clear advantages from the previous iteration,” says Jonathan Aitken at the University of Sheffield, UK, whose laboratory owns a prior version of the robot. However, he is disappointed that the team wasn’t clear in its research about the data transmission requirements of the new version of the robot. “It would be good to know just how much data was required, and what the upper and lower bounds were,” he says.


January 2026: Nutromics Lab-on-a-Patch

Skin-worn patch enabling continuous, real-time biomarker monitoring for personalized healthcare.

December 2025: Miniaturized Temperature Sensing Accuracy

AS6223 – Miniaturized temperature sensing accuracy for next-generation wearables.

November 2025: Transforming Cancer Care with Wearables

Wearable implant delivering continuous, personalized cancer therapy for everyday life.

October 2025: The New Era of Meta Smart Glasses

Meta Smart Glasses 2025: Sleek, AI-powered eyewear for hands-free capture and connection.

September 2025: Innovation in Oxygen Monitoring

OxiWear - Innovation in wearable health, protecting you from silent hypoxia every day.

August 2025: Ultra-Thin Battery Revolution in Wearables

NGK's 0.45mm EnerCera Battery: Non-Swelling, Non-Flammable Power for Wearables

July 2025: Mudra Link - Neural Gesture Control Wristband

Touchless neural wristband for seamless gesture control across devices and platforms.

June 2025: Biobeat’s Next-Generation Wearable Solution

AI-powered wearable for continuous, cuffless vital sign monitoring in clinical and home settings.

May 2025: Breakthrough in Continuous Glucose Monitoring

Needle-free biosensor patch for real-time glucose monitoring and metabolic health insights.

April 2025: Robeauté’s Brain Microrobot

Robeauté's microrobot enables precise, minimally invasive brain intervention with cutting-edge tech.

March 2025: The Future of Cognitive Health

G.Brain boosts focus and brain health with AI-powered neurotechnology.

February 2025: Revolutionizing Women's Health

Nettle™ by Samphire Neuroscience: A non-invasive, drug-free solution for women's health.

January 2025: The Future of Heated Apparel

Revolutionizing heated clothing with sensor-driven, real-time temperature control.

December 2024: Remote Health with Smart Patches

Wearable tech enables non-invasive, continuous health monitoring, transforming patient care.

November 2024: Bearmind Launches Brain Health Wearable

Bearmind’s helmet sensor tracks head impacts in real time, advancing safety in contact sports.

October 2024: Ambiq Empowers Digital Health with Edge AI

Ambiq’s low-power chips enable personal AI on-device for digital health and remote monitoring.

September 2024: The Revolutionary .lumen Glasses

Empowering the visually impaired with smart, award-winning technology for greater independence.

August 2024: Breakthrough in the Field of Health Monitoring

BioButton: award-winning sensor for continuous vital health monitoring with advanced AI technology.

July 2024: Innovation in the Fight Against Voice Disorders

Speaking without vocal cords, thanks to a new AI-assisted wearable device.

June 2024: World's Most Accurate Hydration Sensor

To prevent cramps and collapses, the company FLOWBIO has launched its hydration sensor S1.