UTA professor’s aim: robots to assist with everyday tasks
Computer scientist developing robots to help people with disabilities lead more independent lives
Grant and Award Announcement
A University of Texas at Arlington computer science researcher is developing a robotic system that helps people with spinal cord injuries perform everyday tasks, and she’s involving members of the UTA Movin’ Mavs wheelchair basketball team in the effort.
Fillia Makedon, a distinguished professor in the Department of Computer Science and Engineering, received a $218,000 grant from the National Science Foundation’s (NSF) Disability and Rehabilitation Engineering (DARE) program. The project, “Collaborative Research: DARE: A Personalized Assistive Robotic System that Assesses Cognitive Fatigue in Persons with Paralysis,” is a collaboration with Santa Clara University Professor Maria Kyrarini, who received a similar-sized grant to fund their research.
“Assistive robots can play a significant role in assisting persons with disabilities at home, improving independence and everyday quality of life,” Makedon said. “For example, a robot may assist an individual with motor impairments to perform a task such as preparing lunch or getting ready for a meeting or work. That would be a huge plus for elderly people who need some assistance but aren’t ready or willing to move into a full-fledged assisted living facility.”
Makedon’s goal is to design a personalized assistive robotic system, which she has named Intelligent Robotic Cooperation for Safe Assistance (iRCSA), that can recognize, assess and respond to a person’s cognitive fatigue level during tasks such as cooking. To support these human-robot collaboration (HRC) tasks, Makedon and her team will develop a multi-sensory system that collects physiological and behavioral data, such as facial expressions, from the human teammate during an HRC task. The system then applies advanced machine learning and deep learning methods to automatically assess the individual’s cognitive fatigue.
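To make the pipeline concrete, here is a minimal sketch of the kind of sensor-to-assessment step the article describes. The feature names, weights, and thresholds are illustrative assumptions, not details of iRCSA: the real system would use a trained machine-learning model rather than this hand-weighted score.

```python
# Hypothetical sketch: bucket a person's cognitive-fatigue level from
# normalized multi-sensor features. All names and weights are assumptions
# standing in for iRCSA's trained ML/deep-learning model.

from dataclasses import dataclass

@dataclass
class SensorFrame:
    blink_rate: float     # blinks per minute, normalized to [0, 1]
    heart_rate: float     # normalized to [0, 1]
    facial_strain: float  # expression-based strain estimate in [0, 1]

def assess_fatigue(frames: list[SensorFrame]) -> str:
    """Average a weighted fatigue score over recent frames, then bucket it."""
    if not frames:
        return "unknown"
    score = sum(
        0.4 * f.blink_rate + 0.3 * f.heart_rate + 0.3 * f.facial_strain
        for f in frames
    ) / len(frames)
    if score < 0.33:
        return "low"
    if score < 0.66:
        return "moderate"
    return "high"

frames = [SensorFrame(0.8, 0.7, 0.9), SensorFrame(0.7, 0.6, 0.8)]
print(assess_fatigue(frames))  # high
```

In the full system, this assessment would feed back into the robot's behavior, for instance by slowing the pace of a shared cooking task when fatigue is high.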
“Based on the cognitive fatigue assessment, the iRCSA system will adapt the robot’s behavior in order to provide personalized support,” Makedon said. “We will develop human-robot collaboration scenarios where a person suffering from a spinal cord injury and a robot can cooperate easily to perform daily tasks.
“For the design, development, and evaluation of iRCSA, we will follow a participatory action research approach, involving students with spinal cord injuries in the system design. UTA’s Movin’ Mavs basketball team will participate in the project from its early phases. Their valuable insight and feedback will be crucial to ensuring the acceptability and usability of the proposed system.”
Hong Jiang, Wendell H. Nedderman Endowed Professor and chair of the Department of Computer Science and Engineering, said Makedon’s project could greatly aid people with spinal cord injuries or mobility difficulties.
“Collecting and using data that could immediately be used to help people has the opportunity to be life-changing,” Jiang said. “This grant supplies that important link between data analysis and helping people.”
Makedon, who joined UT Arlington in 2006, has received many NSF grants for research projects in the areas of human-computer interaction, human-robot interaction, pervasive computing, machine learning, computational multimedia, disability computing and cognitive computing.
She currently directs the Heracleia Human-Centered Computing Laboratory, which applies advanced artificial intelligence methods to develop technologies for human behavior monitoring, risk assessment and rehabilitation. She is a member of several journal editorial boards and chair of the international PETRA conference. She currently supervises a large team of doctoral students, as well as several undergraduate researchers and master’s students.
NJIT experts in augmented reality help train caretakers for the elderly
Augmented reality and artificial intelligence can merge to help healthcare students
With the world’s population of geriatric patients growing faster than caretakers can be trained for the difficult job, NJIT Assistant Professor of Informatics Salam Daher thinks augmented reality technology may help close the gap.
Daher and her students are prototyping a digital model of an older person that is aware of its feelings and environment. Existing simulators cover only physical aspects, so a patient simulator that teaches caretakers about the emotional and psychological aspects of their daily work breaks new ground.
This could attract more people to the field. People who are studying to become certified nurse assistants will interact with the patient through virtual reality or projections. The patient will be controlled initially by teachers and eventually through artificial intelligence.
“One day, 30 or 40 years from now, I might be on the other end of it, so I better do a good job now,” Daher said. “We want to do training that improves communication, empathy and perceptions. We want to create a proof of concept for this type of training, and use it to investigate if it makes a difference.”
The project will also be significant for pushing the limits of augmented reality. Daher said their software will cause the virtual patient to remember conversations or take actions such as turning off a television when a nursing student enters the room — "It's a new class of virtual agents … They may make comments that give the illusion that they're aware of you or the environment around them, as opposed to this is canned or unaware of what's going on."
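The behavior Daher describes, an agent that remembers what was said and reacts to changes in the room, can be sketched as a small stateful object. The class, method names, and events below are purely illustrative assumptions; NJIT's actual system is built in Unity with far richer AI.

```python
# Hypothetical sketch of an "environment-aware" virtual patient: it keeps a
# memory of conversation turns and reacts to room events. Names and events
# are illustrative, not NJIT's implementation.

class VirtualPatient:
    def __init__(self, name: str):
        self.name = name
        self.memory: list[str] = []  # remembered conversation turns

    def hear(self, utterance: str) -> None:
        """Store what a nursing student says for later recall."""
        self.memory.append(utterance)

    def on_event(self, event: str) -> str:
        """React to an environment change, e.g. a student entering the room."""
        if event == "student_enters":
            if self.memory:
                return f"Hello again - last time you said: '{self.memory[-1]}'"
            return "Hello, I don't believe we've met."
        if event == "tv_on":
            return "Could you turn the television off, please?"
        return ""

agent = VirtualPatient("patient")
agent.hear("How are you feeling today?")
print(agent.on_event("student_enters"))
```

Even this toy version shows the key difference from a canned, scripted simulator: the agent's responses depend on accumulated state, which is what gives the illusion of awareness.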
She's well suited for the job, as an award-winning specialist in healthcare simulation who previously developed mock patients using digital assistants, interactive video and physical models. Her latest work is funded by a $110,000 grant from the National Science Foundation. Her co-principal investigator is Distinguished Professor Julie Ancis. Collaborators include the University of Delaware and the University of Central Florida.
Behind the scenes, Daher's team is developing its software in Unity, an engine popular for applications such as mobile games. Latency could be a technical challenge, because realism would suffer if the virtual patient responds too slowly. Inherent limitations of artificial intelligence could also be problematic, especially if caretakers go off-script. Considering that humans often have misunderstandings with each other, asking software to understand body language, humor or nuance is a tall order, she noted.
Kimia Naeiji, a senior information technology major from Westfield, is working on the 3D modeling. She grew up wanting to become a dentist before discovering her passion for software, and said she is happy to have found a way to combine her interests in healthcare and technology. She also came to understand the importance of elderly care when her grandmother had a stroke.
Naeiji uses software called Maya to build her model. "It's pretty interesting. I learned a lot about 3D characters. I learned a lot about bones, joints and how I can make a character look realistic," she said.
The character is female but does not yet have a name. "We definitely have to give it a name. When it goes to the VR and doctors and nurses talk to this person, they have to call it something. I am currently working on the texture. I will also improve the rigging and gestures."