In a significant step forward for the field of medical robotics, a team of researchers from the University of Illinois Urbana-Champaign and Duke University has been awarded $1.2 million by the National Institutes of Health. The funding will be used to enhance and expand the team’s groundbreaking robotic eye examination system.
The team, led by U. of I. computer science professor Kris Hauser, has already made impressive strides in automated medical examinations with its robotic system, which can accurately position examination sensors to scan human eyes. The system currently employs an optical scanning technique that operates at a safe distance from the eye. With the newly awarded funding, the researchers are set to add features that will enable the robot to perform most steps of a standard eye exam.
The implications of this development are significant. As Hauser noted, “A robotic system can automatically perform routine examinations, reducing the time patients spend in a doctor’s office and allowing for more widespread screening. This could lead to better health outcomes for many people.” However, achieving this goal requires the development of safer and more reliable controls, a challenge that this funding will help the team tackle.
Automated medical examinations have the potential to make routine medical services more accessible and to let healthcare workers treat more patients. The safety concerns, however, are unique and complex: the robots must operate reliably and safely in close proximity to sensitive body parts, a requirement that is especially demanding for eye examinations.
Earlier versions of the robotic eye examination system developed by Hauser and his team used optical coherence tomography to create a three-dimensional map of the eye’s interior, which allowed many conditions to be diagnosed. The researchers now plan to expand the system’s capabilities by adding a slit-lamp examination and an aberrometer. These additions will require the robot arm to operate within two centimeters of the eye, underlining the need for enhanced robotic safety.
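To give a sense of what operating safely at such a close working distance might involve, here is a minimal sketch, in Python, of a distance-gated speed limit for an instrument-carrying arm. The thresholds, names, and linear ramp are illustrative assumptions; the article does not describe the team’s actual control scheme.

```python
# Illustrative sketch only: slow and then stop an instrument arm as it
# approaches the eye. All names and thresholds are hypothetical.

EYE_CLEARANCE_M = 0.02   # 2 cm working distance mentioned in the article
HARD_STOP_M = 0.005      # hypothetical keep-out distance where motion halts

def limit_speed(distance_to_eye_m: float, nominal_speed_m_s: float) -> float:
    """Scale the arm's commanded speed down as it nears the eye."""
    if distance_to_eye_m <= HARD_STOP_M:
        return 0.0                      # inside the keep-out zone: halt
    if distance_to_eye_m >= EYE_CLEARANCE_M:
        return nominal_speed_m_s        # far from the eye: full speed allowed
    # Linear ramp between the hard stop and the 2 cm working distance.
    fraction = (distance_to_eye_m - HARD_STOP_M) / (EYE_CLEARANCE_M - HARD_STOP_M)
    return nominal_speed_m_s * fraction

if __name__ == "__main__":
    for d in (0.05, 0.02, 0.012, 0.004):
        print(f"{d * 100:4.1f} cm -> {limit_speed(d, 0.05) * 1000:5.1f} mm/s")
```

In a real clinical system, a gate like this would complement, not replace, force limits and hardware interlocks.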
Hauser compared the system’s control mechanisms to those used in autonomous vehicles. Just as self-driving cars must avoid “at-fault collisions,” the robotic eye examination system must respond to human movements to ensure patient safety.
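As a rough illustration of that principle, and not the team’s published control law, the sketch below retracts the instrument whenever the tracked head is closing the gap faster than a hypothetical reaction budget allows:

```python
# Hypothetical illustration of the "avoid at-fault collisions" idea: retract
# if the patient's head is approaching the instrument and the estimated time
# to contact falls below the arm's assumed reaction budget.
from dataclasses import dataclass

@dataclass
class SafetyMonitor:
    min_time_to_contact_s: float = 0.25   # assumed reaction budget, not a published figure

    def should_retract(self, gap_m: float, closing_speed_m_s: float) -> bool:
        """Retract if the head is moving toward the instrument and contact is imminent."""
        if closing_speed_m_s <= 0.0:
            return False                   # head is steady or moving away
        return gap_m / closing_speed_m_s < self.min_time_to_contact_s

monitor = SafetyMonitor()
print(monitor.should_retract(gap_m=0.015, closing_speed_m_s=0.10))  # True: retract
print(monitor.should_retract(gap_m=0.015, closing_speed_m_s=0.01))  # False: keep scanning
```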
The funding will also enable the team to conduct large-scale reliability testing. For this, the researchers have developed a second robot that manipulates mannequin heads to emulate a range of human behaviors. To help ensure the system works for the widest possible population, the test rig will also randomize the heads’ appearances to account for different skin tones, facial features, hair, and coverings, allowing the researchers to understand and mitigate any potential algorithmic bias in the system.
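A test harness for that kind of randomized trial might look something like the following sketch. The attribute lists, trial structure, and success metric are assumptions for illustration; the article says only that skin tone, facial features, hair, and coverings are varied.

```python
# Hypothetical test harness: randomize mannequin appearance and motion,
# then group outcomes by attribute to surface potential algorithmic bias.
import random

SKIN_TONES = ["I", "II", "III", "IV", "V", "VI"]      # Fitzpatrick-style labels (assumed)
COVERINGS = ["none", "glasses", "hijab", "cap"]       # assumed examples
MOTIONS = ["still", "slow_drift", "sudden_turn", "blink_burst"]

def sample_trial(rng: random.Random) -> dict:
    """Draw one randomized mannequin configuration for a reliability trial."""
    return {
        "skin_tone": rng.choice(SKIN_TONES),
        "covering": rng.choice(COVERINGS),
        "motion": rng.choice(MOTIONS),
    }

def success_rate_by_skin_tone(results: list[tuple[dict, bool]]) -> dict:
    """Group scan success rates by skin tone to check for uneven performance."""
    grouped: dict[str, list[bool]] = {}
    for trial, success in results:
        grouped.setdefault(trial["skin_tone"], []).append(success)
    return {tone: sum(ok) / len(ok) for tone, ok in grouped.items()}

rng = random.Random(0)
# Placeholder outcomes; a real harness would record whether each scan succeeded.
results = [(sample_trial(rng), rng.random() > 0.1) for _ in range(1000)]
print(success_rate_by_skin_tone(results))
```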
While the system is being designed for use in clinical settings, Hauser envisages a future where such systems could be deployed in retail settings, similar to blood pressure stations. “An automated examination system like this could be used in an eyeglass store to scan your eyes for a prescription, or it could provide a diagnostic scan in a pharmacy and forward the information to your doctor,” he said.
The project, co-led by Duke University professors Joseph Izatt of biomedical engineering and Anthony Kuo of ophthalmology, is co-sponsored by the National Robotics Initiative, and the funding will be distributed over three years. The work marks a notable advance for medical robotics, promising a future in which access to basic healthcare services is more widespread and efficient.