
Smart A.I. bodysuits could reveal when babies are developing mobility problems

In sci-fi shows like Star Trek, people wear jumpsuits because, well, it’s the future. In real life, babies could soon wear special high-tech jumpsuits designed to help doctors monitor their movements and look for any possible mobility issues that are developing.

The smart jumpsuit in question has been developed by medical and A.I. researchers at Finland’s Helsinki Children’s Hospital. In a recent demonstration, they fitted 22 babies, some as young as four months, with jumpsuits equipped with motion sensors. These enabled the suits to register wearers’ acceleration and positional data and relay it to a nearby smartphone. A neural network was then trained to recognize posture and movement by comparing data from the suits with video shot by the researchers.
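The researchers’ actual model and features aren’t detailed here, but the basic idea of the pipeline above can be sketched in miniature: windows of 3-axis accelerometer readings, labeled with a posture (labels that would, in practice, come from the video annotations), and a classifier trained to predict the label from simple per-window features. Everything below is a hypothetical illustration with simulated data, not the team’s implementation.

```python
import random
import math

random.seed(0)

def make_window(gravity_axis):
    """Simulate one window of 50 accelerometer samples x 3 axes.

    Gravity (~1 g) dominates whichever axis faces down, plus sensor noise.
    """
    return [[random.gauss(1.0 if ax == gravity_axis else 0.0, 0.2)
             for ax in range(3)] for _ in range(50)]

def features(window):
    """Per-axis mean acceleration: 3 summary features per window."""
    return [sum(sample[ax] for sample in window) / len(window)
            for ax in range(3)]

# Two hypothetical postures: gravity on the z-axis (label 0, e.g. lying on
# the back) versus gravity on the x-axis (label 1, e.g. lying on the side).
data = ([(features(make_window(2)), 0) for _ in range(100)] +
        [(features(make_window(0)), 1) for _ in range(100)])

# Tiny logistic-regression classifier trained by stochastic gradient descent
# (a stand-in for the neural network described in the article).
w, b = [0.0, 0.0, 0.0], 0.0
for _ in range(300):
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of label 1
        g = p - y                        # gradient of the log loss
        w = [wi - 0.1 * g * xi for wi, xi in zip(w, x)]
        b -= 0.1 * g

correct = sum(((sum(wi * xi for wi, xi in zip(w, x)) + b) > 0) == (y == 1)
              for x, y in data)
accuracy = correct / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

On this cleanly separable toy data the classifier reaches near-perfect accuracy; the real challenge the Helsinki team faced is that genuine infant movement data is far noisier and spans many more posture and movement classes.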

“We [showed] that it is possible to monitor infant’s motor activity very accurately in an out-of-hospital setting,” Sampsa Vanhatalo, a professor in the Department of Neurosciences, told Digital Trends. “This allows detailed assessment in a native environment, [such as] an infant’s home, which has been unreachable in the past. The current practice is to assess infants in the hospital rooms, during brief visits to doctor or physiotherapists. This assessment is mostly qualitative. It is known, however, that infant’s spontaneous performance in their native environment may be very different, and the current clinical assessment protocols fall short in their capture.”

[Image: Baby smart bodysuit. Credit: Mrs. Taru Häyrinen]

The hope is that identifying possible movement-related (or movement-indicated) issues at an early stage could allow necessary therapy or other interventions to begin sooner. While some people will likely balk at the idea of putting data-gathering wearables on very young children, from a healthcare perspective the suit could prove a genuinely useful invention.

“The current study shows that our system is as reliable as a human observer in assessing motility in up to seven-month-old infants,” Vanhatalo continued. “[That means] infants that have not yet learned sitting or standing. Our goal during the first half of 2020 is to train the algorithms for walking as well, and to enable longer at-home recordings. We are looking into commercialization as a medical product that could be used as part of neurological assessment in the clinical or academic use. The first clinical trials to validate our methods could start as early as [this year].”

A paper describing the work, titled “Automatic Posture and Movement Tracking of Infants with Wearable Movement Sensors,” is available to read online at the electronic preprint repository arXiv.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…