Georgia Tech creating high-tech tools to study autism

The recently released film Barfi! depicted the challenges faced by people with autism, and these days many technologists are stepping forward to support people with autism through high-end technologies. Scientists and engineers at Georgia Tech are working toward the same goal.

Researchers in Georgia Tech’s Center for Behavior Imaging (CBI) have developed two new technological tools that automatically measure relevant behaviors of children and promise to have significant impact on the understanding of behavioral disorders such as autism. One of the tools, a system that uses special gaze-tracking glasses and facial-analysis software to identify when a child makes eye contact with the glasses-wearer, was created by combining two existing technologies into a novel capability for automatic detection of eye contact. The other is a wearable system that uses accelerometers to monitor and categorize problem behaviors in children with behavioral disorders. Both technologies are already being deployed in the CBI’s ongoing work to apply computational methods to the screening, measurement and understanding of autism and other behavioral disorders.

Children at risk for autism often display distinct behavioral markers from a very young age. One such marker is a reluctance to make frequent or prolonged eye contact with other people. An automated way to detect this and other telltale behavioral markers would be a significant step toward scaling autism screening up to much larger populations than are currently reached. This is one goal of the five-year, $10 million “Expeditions” project, funded in fall 2010 by the National Science Foundation under principal investigator and CBI Director Jim Rehg, also a professor in Georgia Tech’s School of Interactive Computing.

The eye-contact tracking system begins with a commercially available pair of glasses that can record the focal point of their wearer’s gaze. Researchers took video of a child captured by a front-facing camera on the glasses, worn by an adult who was interacting with the child. The video was then processed using facial-recognition software from a second manufacturer. Combining the glasses’ built-in ability to detect the wearer’s gaze with the facial-recognition software’s ability to detect the child’s gaze direction yields a system able to detect eye contact; in a test interaction with a 22-month-old, it did so with 80 percent accuracy. The study was conducted in Georgia Tech’s Child Study Lab (CSL), a child-friendly experimental facility richly equipped with cameras, microphones and other sensors.

“Eye gaze has been a tricky thing to measure in laboratory settings, and typically it’s very labor-intensive, involving hours and hours of looking at frames of video to pinpoint moments of eye contact,” Rehg said. “The exciting thing about our method is that it can produce these measures automatically and could be used in the future to measure eye contact outside the laboratory setting. We call these results preliminary because they were obtained from a single subject, but all humans’ eyes work pretty much the same way, so we’re confident the successful results will be replicated with future subjects.”
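The article does not name the glasses or the facial-analysis software, but the fusion logic it describes can be sketched in a few lines. In the hypothetical Python below, every field name and threshold is an assumption: a frame counts as eye contact only when the adult’s gaze point (from the glasses) lands inside the child’s detected face box and the child’s estimated gaze (from the facial-analysis software) points back toward the camera at the adult’s eyes.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Frame:
    """One video frame's measurements (all fields are hypothetical stand-ins)."""
    wearer_gaze: Tuple[float, float]  # adult's gaze point in image coordinates, from the glasses
    face_box: Optional[Tuple[float, float, float, float]]  # child's face (left, top, right, bottom)
    child_gaze_deg: float  # child's estimated gaze angle off the camera axis, in degrees

def detect_eye_contact(frames: List[Frame], angle_threshold_deg: float = 10.0) -> List[int]:
    """Return indices of frames where adult and child appear to look at each other."""
    contact = []
    for i, f in enumerate(frames):
        if f.face_box is None:
            continue  # facial-analysis software found no child face in this frame
        x, y = f.wearer_gaze
        left, top, right, bottom = f.face_box
        adult_on_child = left <= x <= right and top <= y <= bottom
        child_on_adult = f.child_gaze_deg <= angle_threshold_deg
        if adult_on_child and child_on_adult:
            contact.append(i)
    return contact
```

From the list of flagged frame indices, runs of consecutive frames could then be merged into discrete eye-contact episodes and timed, which is the kind of measure that currently takes hours of manual video coding.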
The other new system, developed in collaboration with the Marcus Autism Center in Atlanta and Dr. Thomas Ploetz of Newcastle University in the United Kingdom, is a package of sensors, worn via straps on the wrists and ankles, that uses accelerometers to detect movement by the wearer. Algorithms developed by the team analyze the sensor data to automatically detect episodes of problem behavior and classify them as aggressive, self-injurious or disruptive (e.g., throwing objects); a rough sketch of this kind of pipeline appears at the end of this post.

Researchers first developed the algorithms by putting the sensors on four Marcus clinic staff members, who together performed some 1,200 different behavior instances; the system detected “problem” behaviors with 95 percent accuracy and classified all behaviors with 80 percent accuracy. They then used the sensors with a child diagnosed on the autism spectrum, and the system detected the child’s problem-behavior episodes with 81 percent accuracy and classified them with 70 percent accuracy.

“These results are very promising in leading the way toward more accurate and reliable measurement of problem behavior, which is important in determining whether treatments targeting these behaviors are working,” said CSL Director Agata Rozga, a research scientist in the School of Interactive Computing and co-investigator on the Expeditions award. “Our ultimate goal with this wearable sensing system is to be able to gather data on the child’s behavior beyond the clinic, in settings where the child spends most of their time, such as their home or school. In this way, parents, teachers and others who care for the child can potentially be alerted to times and situations when problem behaviors occur so that they can address them immediately.”

“What these tools show is that computational methods and technologies have great promise and potential impact on the lives of many children and their parents and caregivers,” said Gregory Abowd, Regents’ Professor in the School of Interactive Computing and a prominent researcher in technology and autism. “These technologies we are developing, and others developed and explored elsewhere, aim to bring more effective early-childhood screening to millions of children nationwide, as well as enhance care for those children already diagnosed on the autism spectrum.”

Both technologies were presented in early September at the 14th ACM International Conference on Ubiquitous Computing (Ubicomp 2012). Among the other devices under study at the CSL are a camera/software system that can track children’s facial expressions and customized speech-analysis software to detect vocalization patterns.
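As noted above, the article reports the wearable system’s accuracy figures but not its actual algorithm. Purely as an illustration of the general approach, the sketch below shows the standard sliding-window pipeline used in much accelerometry research: cut the multi-channel stream into overlapping windows, extract simple statistics from each, and feed them to an off-the-shelf classifier. The window size, features, channel count and RandomForest are all assumptions, not the Georgia Tech team’s method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Assumed layout: four sensors (two wrists, two ankles) x three axes = 12 channels.
LABELS = {0: "none", 1: "aggressive", 2: "self-injurious", 3: "disruptive"}

def windows(stream: np.ndarray, size: int = 128, step: int = 64):
    """Slice an (n_samples, n_channels) accelerometer stream into overlapping windows."""
    for start in range(0, len(stream) - size + 1, step):
        yield stream[start:start + size]

def features(window: np.ndarray) -> np.ndarray:
    """Per-channel mean, standard deviation and peak magnitude for one window."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.abs(window).max(axis=0)])

def train_classifier(labeled_windows):
    """labeled_windows: list of (window, label) pairs with labels from LABELS."""
    X = np.array([features(w) for w, _ in labeled_windows])
    y = np.array([label for _, label in labeled_windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    return clf

def classify_stream(clf, stream: np.ndarray) -> np.ndarray:
    """Predict a behavior label for each window of a new recording."""
    X = np.array([features(w) for w in windows(stream)])
    return clf.predict(X)
```

In a deployment like the one Rozga describes, the per-window predictions would be smoothed over time and any non-“none” run would trigger an alert to a parent or teacher.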
