KEEPING HUMANS IN THE LOOP

Robo-doctors are still the domain of science fiction, but researchers in engineering and medicine are developing human-controlled advanced technologies that will improve healthcare.

The promise of robotics in healthcare is often equated with the development of self-driving cars. The latter technology has accelerated, and such cars are now being tested on roads, although not without challenges. Could we one day have autonomous medical droids performing surgery, like those seen in the Star Wars movies? Not for a long time yet, caution researchers. Although robotics and artificial intelligence (AI) technologies are being applied in medicine, humans remain firmly in the picture.

“Many people think AI will replace clinical duties, especially those of the radiologist who reviews and interprets medical images. But we can’t look forward to this any time soon,” said Dr Kwok Ka-wai of the Department of Mechanical Engineering, who specialises in surgical robotics, AI and related systems. “Taking the driver out of a car is easy in comparison. Taking out a clinician is a lot more difficult.”

That is because there are many more hurdles to overcome, says Dr Vince Vardhanabhuti of the Medical Faculty’s Department of Diagnostic Radiology, who uses big data and analytics in his research. Medical decisions are based on a wide variety of data – not just medical images but laboratory tests, pre-existing conditions, gender, age and a host of other factors. Plus, there is the liability factor.

“Who is responsible when AI gets it wrong? At the very least, this is why doctors need to remain in the loop with overall responsibility until such time as people can fully trust AI. I think we are still a long way from that,” he said.

“In the short term, I think it is more likely that AI and humans will work in collaboration – the AI will be used as a team member, a bit like in the multidisciplinary collaborative teams that we now see in medical practice.”

To that end, the two scholars and their teams have been developing ways to use robotics and AI alongside humans to improve diagnosis and treatment.

Robots in the machine

Dr Kwok’s big breakthrough has been developing a robot that can operate within a magnetic resonance imaging (MRI) machine. The strong magnetic field means no ferromagnetic metals can be used, so his robot is made of plastic or other MR-safe materials, and runs on hydraulic fluid.

[Photo: Tasks requiring dexterous movements can be carried out by the robotic arms even when channelled through an endoscope more than one metre long.]

[Photo: Dr Kwok Ka-wai’s Group for Interventional Robotic and Imaging Systems (IRIS) attending the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems in Macau.]

[Photo: MRI-guided needle robot for percutaneous ablation of primary liver cancer.]

“We already have a lot of minimally invasive surgical tools like endoscopy and laparoscopy, where a camera is mounted on a scope that goes inside the body to visualise the surgical site. Surgeons can then do procedures remotely by manipulating different instruments, like a puppet.

“My work is to see if we can work with the MRI’s ability to see through the whole body to provide safer, more accurate and more effective positioning for clinical surgeries.”

MRI-guided surgeries can already be done manually but are cumbersome – the surgeon checks the image, enters the MRI room, takes the patient out of the ‘donut’ for treatment, puts the patient back in, then checks the computerised image to see what should be done next. An MRI robot can be manipulated remotely to do the procedure, without needing to move the patient so many times and risk displacement.
Dr Kwok’s robot is being developed for brain surgery. Pre-clinical tests have been performed on cadavers and, while there are many hurdles to gaining clinical approval from the US Food and Drug Administration, collaborations are underway with a potential industry partner.

Another output from his laboratory is a tiny robot that can manoeuvre within an endoscope tunnel less than 2.8 millimetres wide, to perform procedures such as polyp dissection. This robot is easier to implement than the MRI robot, and a spinoff company, Agilis, is preparing it for clinical approval.

Learning the boundaries

Dr Vardhanabhuti has worked with Dr Kwok on the MRI robot project, as well as on another project that uses AI to improve MRI imaging of suspected prostate cancer tumours, reducing variability between the observations of different radiologists. Separately, he has also had success using AI to improve predictions of patient-related cancer outcomes based on imaging.

As the end-user of advanced technologies, Dr Vardhanabhuti thinks there is potential to make his work more efficient and accurate, but is cautious about over-selling it at this point. “I don’t think we know yet how best to work in conjunction with these technologies,” he said. “For imaging and diagnosis, AI could be seen as a second reader but even then, it can depend on how confident the person is in their diagnosis and whether they are swayed by the second reader.”

As an example, he cites mammograms using computer-assisted diagnosis (CAD). The technology was widely adopted before clinical trials were conducted, and when a proper trial was finally held, it found humans performed worse when they relied on CAD because they called too many false positives. “From a mass screening perspective, this is not a good result because it results in too many unnecessary biopsies and distresses the patients,” he said.
The University of Hong Kong Bulletin | May 2021 | COVER STORY