A new app successfully detects one of the telltale characteristics of autism in toddlers.
The technology could one day become an inexpensive and scalable early screening tool, the research suggests.
Researchers created the app to assess the eye gaze patterns of children while they watched short, strategically designed movies on an iPhone or iPad. Computer vision and machine learning were then applied to determine whether a child looked more often at the person in the video or at objects.
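The study's code is not part of the article; purely to make the person-versus-object measure concrete, here is a minimal Python sketch under assumed inputs: normalized (x, y) gaze estimates per video frame and hand-drawn screen regions for the person and the object. The region boundaries, names, and sample data are illustrative assumptions, not the study's actual pipeline.

```python
# Hedged sketch of a person-vs-object gaze summary. Regions and gaze
# samples are hypothetical; the study's gaze-estimation step is omitted.
from dataclasses import dataclass

@dataclass
class Box:
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

PERSON = Box(0.0, 0.0, 0.5, 1.0)  # assumption: person occupies the left half
OBJECT = Box(0.5, 0.0, 1.0, 1.0)  # assumption: object occupies the right half

def social_attention_ratio(gaze_points):
    """Fraction of on-screen gaze samples that land on the person region."""
    person = sum(PERSON.contains(x, y) for x, y in gaze_points)
    obj = sum(OBJECT.contains(x, y) for x, y in gaze_points)
    total = person + obj
    return person / total if total else float("nan")

# Hypothetical normalized (x, y) gaze samples: three of four fall on the person.
samples = [(0.2, 0.5), (0.3, 0.4), (0.7, 0.6), (0.25, 0.5)]
print(social_attention_ratio(samples))  # 0.75
```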
“We know that babies who have autism pay attention to the environment differently and are not paying as much attention to people,” says Geraldine Dawson, director of the Duke Center for Autism and Brain Development, and co-senior author of a study in JAMA Pediatrics.
“We can track eye gaze patterns in toddlers to assess risk for autism,” Dawson says. “This is the first time that we’ve been able to provide this type of assessment using only a smartphone or tablet. This study served as a proof of concept, and we’re very encouraged.”
In a sample clip from one of the app’s strategically designed movies, a man blows bubbles on one side of the frame while an inanimate object appears on the other.
Eye gaze a clue for autism in toddlers
Dawson and colleagues, including lead author Zhuoqing Chang, a postdoctoral associate in Duke’s electrical and computer engineering department, began collaborating several years ago to develop the app. For this latest version, the researchers strategically designed movies that would enable them to assess a young child’s preference for looking at objects over people.
One movie, for example, shows a cheerful woman playing with a top. She dominates one side of the screen while the top she is spinning is on the other side. Toddlers without autism scanned the entire screen throughout the video, focusing more often on the woman.
Toddlers later diagnosed with autism, however, more often focused on the side of the screen with the toy. Another similarly designed movie showed a man blowing bubbles. The researchers observed these differences in eye gaze patterns for toddlers with autism across several of the app’s movies.
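As a rough illustration of the contrast described above (broad scanning of the whole screen versus fixation on the toy side), here is a small Python sketch with made-up gaze traces. The dispersion and side-preference summaries are this sketch's own simplifications, not the study's measures or models.

```python
# Hedged sketch: two toy summaries of a horizontal gaze trace -- how widely
# a child scans the screen (dispersion) and how often gaze falls on the toy
# side. All numbers below are invented for illustration.
import statistics

def gaze_summary(xs, toy_side="right"):
    """xs: horizontal gaze positions normalized to [0, 1]."""
    spread = statistics.pstdev(xs)  # dispersion: how much of the screen was scanned
    on_toy = [x > 0.5 for x in xs] if toy_side == "right" else [x < 0.5 for x in xs]
    toy_fraction = sum(on_toy) / len(xs)
    return spread, toy_fraction

broad_scanner = [0.1, 0.4, 0.8, 0.3, 0.9, 0.5]    # scans the whole screen
toy_fixator = [0.8, 0.85, 0.9, 0.82, 0.88, 0.9]   # stays on the toy side
print(gaze_summary(broad_scanner))  # high spread (~0.28), toy fraction ~0.33
print(gaze_summary(toy_fixator))    # low spread (~0.04), toy fraction 1.0
```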
Researchers have previously used eye-tracking to assess gaze patterns in people with autism, but that approach requires special equipment and expertise to analyze the data.
The app, which takes less than 10 minutes to administer and uses the front-facing camera to record the child’s behavior, requires only an iPhone or iPad, making it readily accessible to primary care clinics and usable in home settings.
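The article does not describe the app’s computer-vision pipeline. As a loose sketch of how a front-facing camera can yield a coarse left/right gaze signal, here is an example built on the open-source MediaPipe Face Mesh and OpenCV libraries; the landmark choices and corner-based normalization are assumptions of this sketch, not the study’s method.

```python
# Hedged sketch of a front-camera gaze proxy, NOT the study's pipeline.
# Assumes the mediapipe and opencv-python packages. With
# refine_landmarks=True, FaceMesh adds iris landmarks at indices 468-477;
# 33 and 263 are commonly used as outer eye-corner landmarks.
import cv2
import mediapipe as mp

mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True, max_num_faces=1)

def horizontal_gaze(frame_bgr):
    """Return a rough 0..1 left-to-right gaze value, or None if no face found."""
    results = mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return None
    lm = results.multi_face_landmarks[0].landmark
    iris_x = sum(lm[i].x for i in range(468, 478)) / 10  # mean iris position
    left, right = lm[33].x, lm[263].x                    # outer eye corners
    return (iris_x - left) / (right - left)

cap = cv2.VideoCapture(0)  # front-facing camera
ok, frame = cap.read()
if ok:
    print(horizontal_gaze(frame))
cap.release()
```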
“This was a technical achievement many years in the making,” Chang says. “It required our research team to design the movies in a specific way to elicit and measure the gaze patterns of attention using only a hand-held device.
“It’s amazing how far we’ve come to achieve this ability to assess eye gaze without specialized equipment, using a common device many have in their pocket.”
Greater access to autism screening
To test the app, the researchers enrolled 993 toddlers ages 16 to 38 months; the average age was 21 months, around the age when autism spectrum disorder (ASD) is often first identified. Forty of the toddlers were diagnosed with ASD using gold-standard diagnostic methods.
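For context on what those numbers imply for screening, here is a small arithmetic sketch. Only the sample sizes come from the text; the sensitivity and specificity figures are hypothetical placeholders, not the study’s reported results.

```python
# Why base rate matters for a screening tool. Sample sizes (993 toddlers,
# 40 with ASD) are from the study; sensitivity and specificity are ASSUMED
# values for illustration only.
n_total, n_asd = 993, 40
prevalence = n_asd / n_total
print(f"prevalence: {prevalence:.1%}")  # ~4.0%

sens, spec = 0.80, 0.90                       # hypothetical screening performance
true_pos = sens * n_asd                       # 32 expected true positives
false_pos = (1 - spec) * (n_total - n_asd)    # ~95 expected false positives
ppv = true_pos / (true_pos + false_pos)
print(f"positive predictive value: {ppv:.1%}")  # ~25% at this prevalence
```

At a base rate this low, even a well-performing screener flags mostly children who will not ultimately be diagnosed, which is consistent with the authors framing the app as a first step toward evaluation rather than a diagnosis.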
Ongoing validation studies are underway, Dawson says. Additional studies with infants as young as 6 months are investigating whether the app-based assessment could identify differences in children who are later diagnosed with autism and neurodevelopmental disorders during the first year of life.
“We hope that this technology will eventually provide greater access to autism screening, which is an essential first step to intervention. Our long-term goal is to have a well-validated, easy-to-use app that providers and caregivers can download and use, either in a regular clinic or home setting,” Dawson says. “We have additional steps to go, but this study suggests it might one day be possible.”
Support came from a National Institutes of Health Autism Centers of Excellence Award, the National Institute of Mental Health, the National Science Foundation, the Marcus Foundation, the Simons Foundation, the Office of Naval Research, the National Geospatial-Intelligence Agency, Apple, Microsoft, Amazon Web Services, and Google. The funders and sponsors had no role in the design or conduct of the study.
Coauthors Dawson, Chang, Guillermo Sapiro, Jeffrey Baker, Kimberly Carpenter, Steven Espinosa, and Adrianne Harris developed technology related to the app that has been licensed to Apple, and both they and Duke University have benefited financially. Additional conflicts of interest are disclosed in the study.
Source: Duke University