25 November 2020

Augmented Reality and the future operating room

Augmented reality looks set to have a major impact on image-guided surgery. We spoke to Dr Atul Gupta, Chief Medical Officer for Image Guided Therapy at Philips, about the company’s collaboration with Microsoft to develop the use of augmented reality to enhance image-guided surgery.

Philips, in collaboration with Microsoft, last year showcased for the first time a minimally invasive image-guided surgical concept that looks so futuristic, one would think it is science fiction. But it isn’t. It’s quite real and Philips has been trialling the concept at a number of locations in Europe and the US for some time now.

In essence, Philips has combined its state-of-the-art Azurion image-guided therapy platform with Microsoft’s HoloLens 2 holographic headset to bring live imaging of the anatomy and physiology in the surgical field, together with other vital data currently displayed on large 2D screens, into a 3D holographic image, creating a so-called augmented reality (AR) environment. The physician, wearing a HoloLens 2 headset and immersed in this augmented reality environment, can control the 3D hologram with their voice, eyes and hand gestures while performing the surgery. Fundamentally, this enables the physician to stay more focussed on the patient and to perform image-guided surgery more precisely.

We spoke to Atul Gupta, MD, the Chief Medical Officer for Image Guided Therapy at Philips and a practising interventional and diagnostic radiologist. He has played an important part in this collaboration since the start of the initiative. He explained: “This concept allows me to see the real world superimposed with the live data and 3D medical imagery needed to guide our precision therapy, and importantly also lets me control Azurion with voice recognition, eye tracking and hand gestures. It’s all about keeping our focus on the patient.

“Interventional specialists like myself must make use of an increasing number of sources of 2D and 3D data as we navigate our miniature instruments through the body to treat disease minimally invasively. It is crucial to stay focussed on our patients and not be distracted by controlling equipment or by having to mentally reconstruct 2D images into a 3D anatomy map.”

As a practising interventional radiologist, Dr Gupta outlined the benefits of this approach, saying the transition from open surgery to image-guided minimally invasive procedures has driven a seismic shift in improving patient outcomes and reducing costs – not least by dramatically reducing the length of time a patient stays in a hospital after their procedure.

“For the patient, it means no big incisions, so no big scars and oftentimes a local rather than a general anaesthetic. That results in a faster recovery and a better overall experience of their treatment.

“Excitingly, there are many new procedures that we can perform with image guided therapy that simply aren’t possible with traditional open surgery,” he added.

Interventional procedures, such as image-guided surgery, however, require the team treating the patient to keep track of many different sources of information and make a lot of quick decisions based on that data.

“When we launched our Philips Azurion image-guided therapy platform in 2017, we took a big step forward in terms of integration, with all of that information combined and intuitively controlled on a large LED screen suspended above the patient table.

“With augmented reality, we can take it to the next level, fully immersing each member of the team carrying out the procedure in an environment tailored to their specific role.”

Using an analogy of driving a car, Dr Gupta said: “You want to keep both hands on the steering wheel and your eyes on the road, not get distracted by looking at screens and having to press buttons. Similarly, when I’m working on a patient I want to keep my hands on my instruments and my eyes on the patient.”

The concept Philips has developed with Microsoft enables him to do this. Although, to be clear, it is still in the concept phase.

“Regardless of the disease or body part, one constant is that cardiac, vascular, oncologic and neurologic procedures are becoming more complex, with more clinical data and medical images necessary to guide the intervention. Also, physicians are working from more positions in the room, and are part of larger multi-disciplinary teams. So, we see the potential benefits of AR across many interventional procedures, from head to toe.

“We’re using the concept developed with Microsoft to gather further clinical insights to support the development of future commercially available augmented reality solutions.”

Dr Gupta pointed out that AR technology can help by projecting just the relevant information right where and when the surgical team needs it, adding that AR could help to declutter the crowded environment of the interventional suite, creating a customized cockpit and enabling more intuitive control of the operative or interventional system.

“We are conducting pre-clinical and clinical studies to validate several AR concepts where clinicians and staff are using head-mounted displays, such as HoloLens 2. These include:

  • AR flexible screens:
    In this concept, we create a personalized cockpit of holographic screens that provide me with the information I need, which typically resides on physical screens in the interventional suite such as the Azurion FlexVision. The holographic screens are more ergonomic since I can put them where I want and resize them how I want. The system anticipates my needs, so it will only show the data relevant to me at that point in the procedure.
  • 3D holograms:
    With images presented as holograms, we can explore them in three dimensions virtually, which gives me a full spatial understanding of the anatomy.
  • Intuitive control:
    It would be great if clinicians could interact with the interventional suite intuitively by using gestures, eye-tracking, and voice-control on the head-mounted displays. Adding touchless control to existing interfaces such as the Azurion Touch Screen Module overcomes sterility barriers and allows for seamless interaction from anywhere in the suite.
  • Virtual teamwork:
    We look at how specific information can be tailored to the needs of each member of the interventional team related to the task they are performing. The customized information is presented to every team member regardless of where they are located in the room.
  • Remote presence and collaboration:
    In this concept, we explore how virtual collaboration can enhance a procedure through remote consults with clinicians in other suites or other hospitals. The expert clinician does not have to scrub in and be physically present but can provide his/her expert opinion remotely while having full access to the intervention data and images.”

Extended Reality workshop

Earlier this year Dr Gupta was invited to a US FDA workshop on Medical Extended Reality, the purpose of which was to discuss evaluation techniques for hardware, standards development, and assessment challenges for applications of Extended Reality in medicine. At the workshop he said he believed image-guided therapy would be the first entry point for AR in medicine.

Noting that the best innovation comes from collaboration, he said it was extremely gratifying to collectively brainstorm with other invited global leaders in health technology, AR computing, academic centres, physicians, and researchers.

Challenges resolved by AR

Speaking at the event, Dr Gupta said AR potentially resolves a number of challenges the treatment team face in the interventional suite. The first of these is the conversion of 2D images into 3D holographic images.

“Three-D imaging is extraordinarily important. For example, if I am doing a chemoembolization procedure on a liver tumour, I need to navigate the hepatic artery. Our 2D screens tell us if I have to go up or down, left or right, but they don’t tell me if I have to go in or out, and we waste a lot of time – and also radiation dose – trying to see which way to navigate on 2D screens. The 3D hologram makes this so much better.”

Secondly, the physician doesn’t have to reach over to touch the 2D screen to zoom the image in or out, as these actions can be applied to the hologram with voice recognition, eye-tracking and smart gestures. “My hands are busy, really busy, and sometimes there are two individuals doing procedures. You have hands on catheters and wires and we just don’t want to let go of those to touch the buttons.”

Additionally, AR could enable each person on the treatment team wearing an AR headset to have their own customisable view of specific data relevant to their role in the treatment procedure.

Dr Gupta provided this example: “When I’m trying to gain access to the common femoral artery, it takes me about a minute. I don’t need the ultrasound image throughout the whole procedure; I just need that image there for one minute. Now, this is in addition to the master medical diagnostic-quality image screen that still exists in front of me and behind me. However, the nurse doesn’t care to see that. She wants to see the heart rate and the O2 saturation, and when she moves around the patient, she wants to be able to have it follow her. So, everybody gets their own customized, personalized displays that are a supplement to the existing master medical screens.”

Although it is clear that augmented reality will play an important role in the future of image-guided therapy, there are many steps that still need to be taken before the technology becomes integrated into daily clinical practice.

“We still need to overcome several challenges before we can start thinking about commercialization. Beyond the technology itself, there are challenges in, for example, standardization and regulatory approval.”

“The overwhelming take-home message the FDA received from us was that no one can innovate alone. Advancing AR technology is a complicated dance between physicians, manufacturers, industry leaders, software engineers, academia, and regulatory bodies.

“Philips is well-positioned to take a leading role in connecting the dots. We have already taken significant steps in this field. However, to make AR advances possible, all stakeholders involved need to work collaboratively to integrate the technology into everyday practice,” he told International Hospital.

Microsoft’s HoloLens 2

Microsoft’s HoloLens 2 is a goggle-like headset with a self-contained holographic computer that enables hands-free, heads-up interaction with three-dimensional digital objects, providing an immersive mixed reality experience.

The HoloLens 2, introduced last year, is a significant improvement on the original HoloLens introduced in 2016. Microsoft says the new version enhances immersion, particularly through improvements to the visual display system, making holograms even more vibrant and realistic. The field of view is doubled in HoloLens 2, while maintaining the industry-leading holographic density of 47 pixels per degree of sight.

Microsoft has also completely refreshed the way the user interacts with holograms in HoloLens 2. Taking advantage of its new time-of-flight depth sensor, combined with built-in AI and semantic understanding, HoloLens 2 enables direct manipulation of holograms with the same instinctual interactions you’d use with physical objects in the real world.

In addition, the HoloLens 2 connects to Microsoft’s Azure services, which provide the spatial, speech and vision intelligence needed for mixed reality, plus cloud services for storage, security and application insights.

Philips’ Azurion image-guided therapy platform

In the past few decades, clinical practices around the world have evolved to successfully treat more patients and perform more complex procedures in interventional labs. However, with more staff and technologies involved during these procedures, interventional lab environments can become crowded and cluttered. To enhance clinician focus and control during procedures, Philips, in collaboration with several leading hospitals, developed the Azurion image-guided therapy platform to integrate vital procedural information from various sources – such as medical imaging, interventional devices, navigation tools and patient health records – enabling interventional staff to control all compatible applications from a single touch screen while performing procedures.

In September this year Philips announced a new version of their Azurion platform. The company says the ‘upgrade’ marks an important step forward in optimizing clinical and operational lab performance and expands the role of image-guided interventions in the treatment of patients. Philips notes that the Azurion platform has already achieved rapid global adoption and has been used in well over two million procedures worldwide since its introduction three years ago.

With this next-generation Azurion platform, Philips is also introducing a new 3D imaging solution called SmartCT. With SmartCT, users are guided through the image acquisition and can review and interact with the acquired CT-like 3D images on the tableside touch screen module using 3D visualization and measurement tools.

Philips says that with the new Azurion platform, clinicians can easily switch between imaging, physiology, hemodynamic and informatics applications, including SmartCT and IntraSight – a comprehensive suite of co-registration modalities as well as iFR (instant wave-free ratio), FFR (fractional flow reserve) and IVUS (intravascular ultrasound), provided by advanced catheters and pressure wires capable of producing ultrasound images of the interior of blood vessels.