As nurse Trina slowly rolled into the room, the patient looked a little skeptical.
As the massive, mechanical robot brought her a red plastic cup, Michele Kuszajewski’s expression turned downright frightened.
“It felt scary when it was coming at me,” said Kuszajewski, who was pretending to be a patient in a simulation lab in Duke University’s School of Nursing.
Trina, which stands for Tele-Robotic Intelligent Nursing Assistant, is a first-generation robot being developed and refined through a collaboration between Duke’s School of Engineering and School of Nursing students and staff.
Since the Ebola outbreak in 2014, new technologies, including robots, have been tested as alternatives to human contact, reducing the risks providers face as they care for patients with infectious diseases.
“We are not trying to replace nurses,” said Margie Molloy, an assistant nursing professor, but to create a safer environment for health care providers.
When health care providers work with patients infected with diseases like Ebola, they dress in multiple layers of protective clothing, wipe everything down with bleach and utilize multiple rooms.
Duke officials are hoping to improve that process with Trina, a remote-controlled robot that nurses and doctors, working from another room, can navigate and direct to move linens, take vital signs, and pass food and medications.
Funded by a National Science Foundation grant, Duke officials started working on the $85,000 robot about a year and a half ago.
Trina looks like a mix heavy on science-fiction Transformers, with a dash of Rosie, the housecleaning robot from “The Jetsons.” Its face is a tablet that shows its human operator, as on a Skype call. On its head sits a gray wig topped with a surgical cap.
“We were trying to make it at least a little bit more personable,” said Ryan Shaw, an assistant nursing professor.
Currently, robots used in surgery help doctors perform tasks with precision and flexibility, but those machines don’t move about the room or handle chores like preparing drinks or adjusting an oxygen mask.
Despite robot portrayals in popular culture, creating mobile, human-like devices isn’t as advanced or easy as people may think.
Around noon Friday, nursing students moved about in the nursing school’s simulation lab. Covered with protective clothing, they simulated working with a patient with Ebola as two engineering students watched through a glass window.
The nursing students were learning what to do in a real-life situation, while the doctoral engineering students observed to better understand the tasks Trina needs to perform.
After the nursing students were done, engineering student Fan Wang moved to a console just outside the simulation lab and drove Trina in. The day marked the second time Trina visited the lab. The trip requires a U-Haul and care to make sure the robot isn’t damaged.
“Today we are trying to see what our robot can do in a simulation scenario,” Wang said.
Trina rolled in slowly, and Wang’s face appeared on the tablet screen. Jianqiao Li, also an engineering student, observed inside of the room, taking photos and video.
Trina’s tasks included delivering a red cup, a bowl, pills and a stethoscope to Kuszajewski. The robot’s movements were abrupt and clumsy. Kuszajewski remained still and calm, but her eyes grew big as Trina approached.
Li and others said this is just a start. They want to make Trina, or its next generation, more nimble, able to collect and test fluids, and friendlier and more human-looking.
“We need to establish a better interface with the human and the robot to make them work together and be more comfortable,” Li said.