The first attempt to use robotics for medical purposes was undertaken by DARPA in the 1990s. At the time, however, communication networks could not provide the necessary connectivity, rendering, and processing power. Now, with advances in robotics and AI, our capabilities are expanding at a rapid pace, and a new frontier has arrived.

Recently, the U.S. Department of Defense funded trailblazing research at Carnegie Mellon University to create an autonomous robotic trauma care system, designed to treat soldiers who have been injured in remote locations. The resulting technology has enabled devices such as the “CorPath GRX” by Corindus Vascular Robotics. For patients located at a distance, robotic systems can be accessed from mobile devices or personal computers, expanding access to vital resources. This applies across the board, well beyond military scenarios.

Meanwhile, social companion robots like Jibo, Pepper, Paro, Zora, and Buddy are now equipped with sensors, cameras, microphones, and even some rudimentary machine-learning technology. This makes them capable of myriad tasks, from serving as prosthetics to simply reminding users to take their medication. Users can even hold conversations with such devices, making interaction easy and convenient.

But robots aren’t just there to help patients; they are there to empower medical professionals as well. Hospital personnel can benefit from the help of robots like “Moxi”, created by Diligent Robotics. This robot can restock shelves, deliver items, and clean rooms and equipment, all so that nurses can spend more time attending to patients.

Robotic systems never get tired. They are not hampered by emotional swings, and they are always 100% focused. In other words, they have some of the key traits of the perfect surgeon. Such “Waldo surgeons” can bridge the gap between humans and machines, performing tasks with incredible precision. As long as the program is correctly set for the procedure, human surgeons can take a secondary, supervising role. These robots can use machine vision to guide themselves to problematic areas and to make surgeons aware of things they might otherwise have missed.

AI makes this even more helpful: it can assist surgeons by creating a customized visual overlay in real time during surgery, highlighting blood vessels and other sensitive structures. If a robotic arm is used, a knowledge library can suggest appropriate tools based on a catalogue of best practices.
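To make the idea concrete, here is a minimal sketch of what such an overlay pipeline might look like, assuming an endoscopic video feed that OpenCV can read. The segmentation step here is a crude stand-in (contrast enhancement plus adaptive thresholding), not a clinically validated model.

```python
# Minimal sketch of a real-time "vessel highlighting" overlay, assuming a video
# feed readable by OpenCV. The segmentation is a crude illustrative stand-in.
import cv2
import numpy as np

def highlight_vessels(frame: np.ndarray) -> np.ndarray:
    """Return the frame with candidate vessel pixels tinted red."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Boost local contrast so thin, dark vessel-like structures stand out.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)
    # Adaptive threshold picks out locally dark structures (vessel candidates).
    mask = cv2.adaptiveThreshold(
        enhanced, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
        cv2.THRESH_BINARY_INV, 15, 5)
    overlay = frame.copy()
    overlay[mask > 0] = (0, 0, 255)          # tint candidates red (BGR)
    return cv2.addWeighted(frame, 0.6, overlay, 0.4, 0)

cap = cv2.VideoCapture(0)                    # stand-in for the surgical feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("overlay", highlight_vessels(frame))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```

A real system would replace the thresholding step with a trained segmentation model, but the structure of the loop (read a frame, segment it, blend the result back over the live image) is the same.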

And so it goes: robots can be used for precision operations during surgery and for various other medical procedures that stop short of full automation, so the human element is retained as needed. Medical robotics companies are now using AI (specifically, machine-learning software) to automate their systems. Notably, Intuitive Surgical and Medrobotics have developed surgical robots that perform minimally invasive surgeries. In addition, machine vision software can be used in robotic camera arms to provide a clearer view of body structures.

Surgical robots can also provide helpful information and recommendations during surgery. This includes monitoring heart rate, blood loss, and other relevant physiological signals, ensuring that surgeons are always fully informed. Intuitive’s robot includes an arm with an attached camera that offers greater visibility of the operation.
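At its simplest, that monitoring logic amounts to checking each incoming reading against thresholds and flagging anything out of range. The sketch below assumes a hypothetical stream of vital-sign readings; the threshold values are placeholders, not clinical guidance.

```python
# Illustrative sketch of intra-operative monitoring logic. The Vitals fields
# and all threshold values are assumptions for demonstration only.
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate_bpm: float
    est_blood_loss_ml: float
    spo2_pct: float

def check_vitals(v: Vitals) -> list[str]:
    """Return human-readable warnings for the surgical team."""
    warnings = []
    if v.heart_rate_bpm < 50 or v.heart_rate_bpm > 120:
        warnings.append(f"Heart rate out of range: {v.heart_rate_bpm:.0f} bpm")
    if v.est_blood_loss_ml > 500:
        warnings.append(f"Estimated blood loss high: {v.est_blood_loss_ml:.0f} mL")
    if v.spo2_pct < 92:
        warnings.append(f"SpO2 low: {v.spo2_pct:.0f}%")
    return warnings

# Example reading that would trigger a heart-rate warning on the console.
print(check_vitals(Vitals(heart_rate_bpm=130, est_blood_loss_ml=250, spo2_pct=97)))
```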

The catch, though, is that a “smart” robotic surgery system (equipped with machine-learning capabilities) requires extensive training and a massive dataset from which to work. Consequently, it may take a healthcare company a long time to acquire enough data to properly train a machine-learning algorithm.

When it comes to enhanced diagnostics, AI can detect patterns that characterize various conditions. Accurate diagnoses can be made by studying electronic health and patient records (EHRs/EPRs). A smart system can scour thousands of cases and look for correlations among hundreds of variables, some of which may not even be listed in current medical glossaries. Such systems can rival the best doctors; an endoscopic system from Japan detects colon cancer in real time with 86% accuracy.
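As a simplified illustration of the underlying idea, the sketch below fits a basic model to a hypothetical, de-identified EHR export and lists the variables most strongly associated with a diagnosis. The file name and column names are assumptions, not a real dataset.

```python
# Hedged sketch of "pattern detection" over tabular EHR-style data.
# "ehr_features.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

records = pd.read_csv("ehr_features.csv")          # hypothetical export
X = records.drop(columns=["diagnosis"])            # hundreds of numeric variables
y = records["diagnosis"]                           # 1 = condition present

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Inspect which variables correlate most strongly with the diagnosis.
weights = pd.Series(model.coef_[0], index=X.columns).sort_values(key=abs)
print(weights.tail(10))
```

Production diagnostic systems use far richer models and far more data, but the workflow (train on labeled records, validate on held-out cases, inspect which variables drive the prediction) follows this shape.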

And IBM Watson has reportedly hit the 99% mark in cancer diagnosis. Recently, IBM partnered with Memorial Sloan Kettering Cancer Center to create an algorithm for improving the diagnosis and treatment of various cancers; the program was named “Watson for Oncology”. Its primary obstacle was a limited ability to pull relevant patient information from EHRs/EPRs.

AI software is not only useful for automating medical (surgical) robotics; it can also streamline diagnostics. Indian software company SigTuple developed an AI-based tele-pathology system built around “smart” microscopes that take photos and send the images to the cloud. The system runs on software called “Shonit”: each microscope is fitted to a movable robotic base and paired with a smartphone camera, and the Shonit app on the phone connects the setup to the cloud. The microscope slides around on its robotic base, allowing the lens to home in on the desired area and take multiple pictures.
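To give a sense of the capture-and-upload step, here is a hedged sketch in Python. The endpoint URL, field names, and response format are hypothetical placeholders, not SigTuple’s actual API.

```python
# Minimal sketch of uploading one microscope field-of-view image to a cloud
# service. The URL, form fields, and response key are hypothetical.
import requests

UPLOAD_URL = "https://example-cloud.invalid/api/v1/slides"   # placeholder

def upload_field_image(path: str, slide_id: str, position: tuple[int, int]) -> str:
    """Send one field-of-view image plus its slide coordinates to the cloud."""
    with open(path, "rb") as f:
        resp = requests.post(
            UPLOAD_URL,
            files={"image": f},
            data={"slide_id": slide_id, "x": position[0], "y": position[1]},
            timeout=30,
        )
    resp.raise_for_status()
    return resp.json()["image_id"]        # hypothetical response field

# e.g. upload_field_image("field_0001.jpg", "slide-42", (120, 80))
```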

Here’s where smart technology comes in. The cloud service that receives the images uses machine vision to label them according to blood cell counts and any anomalies within the blood, flagging possible illnesses. The pictures can then be sent to a remote pathologist, who makes a diagnosis based on these pre-labeled, high-resolution images. This is especially useful for remote healthcare workers: any smartphone camera running the Shonit app can be used to conduct such analyses.
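As a simplified illustration of that cloud-side analysis, the sketch below counts cell-like blobs in a blood-smear image using classical machine vision. A production system like Shonit would rely on trained models; this only shows the basic counting idea.

```python
# Illustrative sketch: count cell-like blobs in a stained blood-smear image.
# The area threshold is an arbitrary placeholder, not a calibrated value.
import cv2

def count_cells(image_path: str) -> int:
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Otsu thresholding separates stained cells from the lighter background.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Morphological opening removes small specks before counting.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only blobs large enough to plausibly be cells (pixel-area filter).
    return sum(1 for c in contours if cv2.contourArea(c) > 50)

# print(count_cells("field_0001.jpg"))
```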

In addition, micro-bots guided by smart technology can travel precisely where they are needed to deliver medication locally, and even to perform targeted micro-surgeries (such as unclogging blood vessels).

For now, issues of liability are difficult to resolve when responsibility is transferred onto AI, because it is sometimes unclear exactly how a smart system came to its conclusion. So, before we can enjoy the full potential of these technologies, our healthcare system must evolve to accommodate their implications.