Introduction

Augmented reality (AR) is one of the recent innovations making inroads into several markets, including healthcare. AR takes digital or computer-generated information, such as audio, video, images and touch or haptic sensations, and overlays it on the real-world environment in real time. AR offers feasible solutions to many challenges within the healthcare system and, as such, presents numerous opportunities for implementation in areas such as medical training, surgical assistance and rehabilitation. AR innovations can help medical personnel diagnose, treat and operate on their patients more accurately.

Some of the pioneering AR solutions aimed at changing the face of healthcare and meeting these challenges are discussed below.

Medical Education / Training

AR has a deep impact on medical training, with applications ranging from 3D visualizations to bringing anatomical learning to life. AR applications project extensive information, visual 3D structures and links onto the traditional pages of medical textbooks for training in subjects such as anatomy. Recent hardware platforms, such as Microsoft's HoloLens glasses, have started supporting medical education applications. Using hand gestures, the wearer can deform 3D models, take fly-by views and interact with them in other ways to reveal hidden organs [1, 2].

Vimedix™, from CAE Healthcare, is used for training students in echocardiography. It consists of a mannequin and a transducer for transthoracic or transesophageal echocardiography, and it enables healthcare professionals to see how an ultrasound beam cuts through human anatomy in real time. CAE Healthcare has begun to integrate the HoloLens into its technology so that images can be viewed through the glasses, unrestricted by the dimensions of a screen [3].

The main application of AR in surgical training is telementoring, i.e. the supervisor teaches the trainee by demonstrating the proper surgical moves, paths and handling on the AR screen. This information is displayed to trainees as they are guided through the procedure. As a learning tool, AR provides the key benefit of creating a highly engaging and immersive educational experience by combining different sensory inputs [4].

Surgery

AR finds its most numerous applications in surgical fields such as nephrectomy, neurosurgery, orthopaedics and, in particular, laparoscopic surgery. Reliability and realism are extremely important in these fields, not only for the comfort of the user but also for ensuring that no inappropriate handling is learnt and that actual traumatic conditions are recreated faithfully. Compared with training on actual patients, AR offers advantages such as minimal cost per use, the absence of ethical issues and safety [5].

Oral and maxillofacial surgery (OMS) involves a sensitive and narrow surgical space and therefore requires high image-registration accuracy and low system processing time. Current systems suffer from image-registration problems when matching images taken in two different postures. Researchers from Charles Sturt University, Australia, developed a fast augmented reality system with improved visualization for OMS. The proposed system achieved improvements in both overlay accuracy and processing time [6].

Surgical navigation is essential for performing complex operations accurately and safely. The traditional navigation interface, intended only for two-dimensional observation, does not display the full spatial information of the lesion area. Researchers from Soochow University, China, have applied AR technology to spinal surgery to provide more intuitive information to surgeons. A virtual-real registration technique based on an improved identification method and a robot-assisted method was proposed to improve registration accuracy, and the effectiveness of the robot-performed puncture was verified by X-ray images. Both optimized methods proved highly effective. The proposed AR navigation system has high accuracy and stability, and is expected to become a valuable tool in future spinal surgery [7].
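
Virtual-real registration of this kind is commonly posed as a rigid alignment between fiducial points identified in the preoperative model and the same points tracked in the operating field. The snippet below is a minimal sketch of the classic SVD-based (Kabsch) solution to that sub-problem, using hypothetical fiducial coordinates; it is not the optimized identification or robot-assisted method described in [7].

```python
import numpy as np

def rigid_register(model_pts, world_pts):
    """Estimate rotation R and translation t aligning model_pts to world_pts
    (least-squares rigid fit via the Kabsch/SVD method)."""
    model_pts = np.asarray(model_pts, dtype=float)
    world_pts = np.asarray(world_pts, dtype=float)

    # Centre both point sets on their centroids
    mc, wc = model_pts.mean(axis=0), world_pts.mean(axis=0)
    A, B = model_pts - mc, world_pts - wc

    # SVD of the cross-covariance matrix gives the optimal rotation
    U, _, Vt = np.linalg.svd(A.T @ B)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = wc - R @ mc
    return R, t

# Hypothetical fiducial markers: positions in the CT-derived model (mm) and the
# same markers as reported by an optical tracker in the operating room.
model = [[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]]
world = [[102.1, 18.3, 5.0], [151.9, 19.1, 4.6],
         [101.7, 58.0, 5.9], [102.4, 18.0, 35.2]]

R, t = rigid_register(model, world)
# Fiducial registration error (mm) after applying the estimated transform
residual = np.linalg.norm((np.asarray(model) @ R.T + t) - np.asarray(world), axis=1)
print("fiducial registration error:", residual.round(2))
```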

The Fraunhofer Research Institute MEVIS has initiated a project that uses an iPad-based AR application during liver operations. It helps the surgeon locate blood vessels inside the organ by comparing the actual operation with planning data based on 3D X-ray images. The figure below shows an overlay of the planning data on the live camera image, as if looking inside the organ [8].

A recent study of 3D image-guided surgery indicates that AR providing visual cues to the subsurface anatomy could serve as an alternative approach for minimally invasive surgery in urology [8].


Fig. 1. iPad used during an operation [8]

Rehabilitation

AR is still at an exploratory stage in rehabilitation, where it shows advantages over traditional rehabilitation methods by creating interactive and immersive environments. Moreover, AR can provide reliable and accurate feedback to guide and correct the patient during an exercise, enhancing individual motor learning. In this way, the rehabilitation process can take place outside a clinical setting and without requiring therapist supervision. In the case of unsupervised rehabilitation, the AR system can provide the therapist with the user's performance data, enabling an offline physiotherapeutic evaluation [9, 10, 11].
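
As a concrete illustration of the kind of exercise feedback described above, the sketch below compares a measured joint angle against a target range and produces a corrective cue of the sort an AR rehabilitation system might render visually. The exercise, angles and thresholds are hypothetical and are not taken from the cited studies.

```python
def exercise_feedback(elbow_angle_deg, target=(80.0, 100.0)):
    """Return a simple corrective cue for a hypothetical elbow-flexion exercise.

    elbow_angle_deg -- joint angle estimated from a depth camera or IMU
    target          -- acceptable angle range for the current repetition
    """
    low, high = target
    if elbow_angle_deg < low:
        return f"Bend further: {low - elbow_angle_deg:.0f} degrees to go"
    if elbow_angle_deg > high:
        return f"Ease off: {elbow_angle_deg - high:.0f} degrees past the target"
    return "Good - hold this position"

# The per-sample cues can also be logged for later (offline) review by the therapist.
session_log = []
for angle in (62.0, 78.5, 91.0, 104.2):     # hypothetical samples during one repetition
    cue = exercise_feedback(angle)
    session_log.append({"angle_deg": angle, "cue": cue})
    print(f"{angle:6.1f} deg -> {cue}")
```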

Researchers from the University of Alberta, Canada, have developed a 3D spatial AR display that co-locates visual and haptic feedback to the user in three rehabilitative games. Users were put under cognitive load (CL) to simulate disability-induced cognitive deficiencies while performing tasks. AR was found to yield the best user performance, with or without cognitive loading; the benefit was most evident in dynamic exercises requiring quick reaction times and fast movements [12].

Researchers from the Polytechnic University of Turin, Italy, have developed an AR system for real-time visualization of an index of muscle activity superimposed on the investigated muscle. The system includes a video camera, one or more surface EMG (sEMG) detection systems, and a processing and visualization unit. It integrates the information from the video camera and the sEMG systems to create an augmented video frame, which the patient or the clinical operator can view in real time on a display [13].
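
The muscle-activity index in such a system is typically a smoothed amplitude measure of the sEMG signal, mapped to a colour and blended into the camera frame over the muscle's location. The following is a minimal sketch of that pipeline, a moving-RMS envelope normalised and rendered as a semi-transparent red patch, using synthetic data; the actual index and overlay method used in [13] may differ.

```python
import numpy as np

def rms_envelope(emg, fs, window_ms=100):
    """Moving-RMS envelope of a raw sEMG signal (a common activity index)."""
    n = max(1, int(fs * window_ms / 1000))
    padded = np.concatenate([np.zeros(n - 1), emg ** 2])
    kernel = np.ones(n) / n
    return np.sqrt(np.convolve(padded, kernel, mode="valid"))

# Synthetic 2 s sEMG burst sampled at 1 kHz (noise that grows, then decays)
fs = 1000
t = np.arange(0, 2.0, 1 / fs)
emg = np.random.randn(t.size) * (0.2 + np.exp(-((t - 1.0) ** 2) / 0.05))

activity = rms_envelope(emg, fs)
level = float(activity[-1] / activity.max())     # normalised 0..1 activity right now

# Blend a red patch into the camera frame over the muscle's (hypothetical) region.
frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame (RGB)
y0, y1, x0, x1 = 200, 300, 250, 400              # region located by the video tracking
overlay = frame.copy()
overlay[y0:y1, x0:x1, 0] = 255                   # red channel
alpha = 0.6 * level                              # stronger activity -> more opaque
frame[y0:y1, x0:x1] = (alpha * overlay[y0:y1, x0:x1]
                       + (1 - alpha) * frame[y0:y1, x0:x1]).astype(np.uint8)
print(f"activity level {level:.2f}, overlay alpha {alpha:.2f}")
```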

Researchers from the University of Tasmania have developed a visual augmentation system, Ghostman, that allows a physical therapist and a patient to inhabit each other's viewpoint in an augmented real-world environment. It consists of two subsystems, one operated by the patient and the other by the therapist, connected via the internet to enable remote telepresence. The therapist delivers instructions remotely and observes the patient's motor skills from the patient's perspective [14].


Fig. 2. Inhabiting visual augmentation [14]

It was observed that fine motor skills were developed much faster with the AR tool than with traditional face-to-face demonstration.

Mental Health Therapy

AR is helping individuals overcome phobias, particularly phobias of insects and other small animals. To treat such phobias, exposure therapy can be delivered through AR to decrease the fear of patients who are clinically diagnosed with a phobia. AR can improve on existing exposure therapy because healthcare providers retain full control over the AR stimulus and can ensure that nothing unexpected happens to the patient. It is effective, with the added advantage of complete control over the exposure conditions [15].

AR technologies can enhance emotional engagement and the sense of presence in mental health interventions. They can help adolescents with autism spectrum disorders practise emotional judgement and improve social skills through facial modelling, with AR animations of six basic facial expressions overlaid on the participants' faces. Such AR interventions have been found to improve the recognition of, and appropriate response to, facial emotional expressions, helping adolescents express their own feelings and respond to the expressions they encounter in everyday social situations. Participants also rated AR interventions for children with autism as engaging and fun, with outcomes including improved nonverbal communication, increased eye contact and greater social engagement [15].


Fig. 3. Concept of using augmented reality in therapy for phobias of small animals [8]

There have been creative AR implementations for children with various health conditions. Researchers have studied how AR-based music therapy increases motor exercise in children with cerebral palsy. Results from this work showed the potential of AR in music-therapy rehabilitation, promoting positive cognitive, physical, mental and social changes in individuals with cerebral palsy. This type of intervention could serve as a tool for various cognitive, motor, psychological and social health conditions [15].

Doctor-Patient Communication

Doctor-patient communication is a key component of healthcare: effective communication enhances patient motivation, increases the overall benefit patients derive from healthcare and raises the perceived level of support they receive. AR can potentially evoke empathy, an important factor in family communication and in perceived doctor-patient communication. Evoking empathy through AR could have unique applications for understanding what patients go through during an illness. Through visual effects, AR applications could, in theory, use presence and immersion to help people vicariously experience patient symptoms [15].

Diagnostics / Imaging

AR technology provides opportunities for improved diagnosis and characterization of diseases, including cancer. The user can simultaneously view a virtual image from the patient's imaging investigation and the real-world image of the surroundings. The real-world image would be the patient's anatomy during a combined physical exam and medical imaging assessment, pre-operative planning or an intraoperative procedure. In other situations, the real-world image could be another environment, such as the operating room, an educational lecture hall, a radiology reading room or the physician's office [16].

Microsoft’s HoloLens helps doctors diagnose patients more effectively by overlaying CT scans and other imagery onto a patients’ body. AR with depth 3-dimensional (D3D) imaging provides depth perception through binocular vision, head tracking for improved human –machine interface (HMI) and other key features. It can enhance diagnosis, save cost and improve patient care [16].

Ophthalmology

Ophthalmology is one of the medical specialties that has developed fastest in recent years. Many new procedures and practices arise in ophthalmology every year, and augmented reality is part of this trend.

University of Oxford researchers have developed a new system called OxSight to aid navigation for visually impaired individuals. It is designed to enhance vision for people with peripheral vision loss resulting from conditions such as glaucoma, retinitis pigmentosa and homonymous hemianopia. The glasses enable the visually impaired to see better by amplifying the sight they retain and helping the brain make sense of the information. OxSight's glasses can be customized to the unique visual restrictions each individual faces [17, 18].


Fig. 4. OxSight glasses [18]

The Transparent Optical Module (TOM) from NewSight Reality Inc. addresses many limitations of today's AR devices, such as size, weight, light and energy efficiency and transparency, along with the ability to be embedded within an ophthalmic lens. It has a proprietary transparent light engine and a micro-lens-array optical engine to project digital information from an image source to the eye of the user. It is hermetically sealed with a flexible multilayer coating that makes it resistant to sweat and environmental conditions. One of the main advantages of TOM is that it is a single thin, lightweight module operating with low power and long battery life. TOM is curved so that it can be integrated into an eyewear lens for monocular or binocular AR eyewear, resulting in highly efficient, high-image-quality AR [19].

Telemedicine

In telemedicine, the correct use of a medical device at the point of need is necessary to provide an appropriate service. In some scenarios where medical professionals are not available, untrained people may be required to interact with medical devices and patients. AR thus becomes an effective tool in telemedicine for guiding untrained people at the point of need [20].

Researchers from the University of Naples, Italy, have built a prototype system to assess how untrained users, with limited or no knowledge, can effectively interact with an ECG device and properly place ECG electrodes on a patient's chest. Markers are attached to the ECG device and to the patient's thorax to allow camera calibration. The video of the current scene is augmented in real time: objects and their poses in space are recognized, and pointers, text boxes and audio are added to help untrained users perform the correct operations. Voice commands are also included to improve usability. Untrained users were able to carry out a complete ECG test, first on a mannequin and then on a real patient, in a reasonable timespan. This approach can be adapted to support the use of other medical equipment as well as other telemedicine tasks [20].
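
The marker-based augmentation described above depends on estimating the camera's pose relative to each marker so that pointers and text boxes stay anchored to the ECG device and the patient's chest. Below is a minimal sketch of that step, assuming the marker's four corners have already been detected in the image and using OpenCV's generic solvePnP routine; the calibration pipeline actually used in [20] may differ.

```python
import numpy as np
import cv2

# Camera intrinsics from a prior calibration (values here are illustrative only).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                      # assume negligible lens distortion

# A square marker of 40 mm side, defined in its own coordinate frame (mm).
s = 40.0
marker_3d = np.array([[0, 0, 0], [s, 0, 0], [s, s, 0], [0, s, 0]], dtype=np.float64)

# Pixel coordinates of the same four corners as detected in the camera frame
# (hypothetical detections; a real system would obtain these from a marker detector).
marker_2d = np.array([[300, 220], [352, 224], [349, 275], [297, 271]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(marker_3d, marker_2d, K, dist)
if ok:
    # tvec is the marker's position in the camera frame; rvec its orientation
    # (Rodrigues vector). These suffice to project virtual pointers or text boxes
    # so they appear attached to the ECG device or electrode site.
    label_3d = np.array([[s / 2, s / 2, -30.0]])   # a point 30 mm above the marker
    label_2d, _ = cv2.projectPoints(label_3d, rvec, tvec, K, dist)
    print("draw label at pixel:", label_2d.ravel().round(1))
```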


Fig. 5. An untrained user placing precordial electrodes on the mannequin (right) and operating the portable ECG device (left) [20]

Researchers from the Memorial University of Newfoundland, Canada, have developed a new ultrasound telementoring AR application using the Microsoft HoloLens to facilitate and enhance remote medical training. The system is more immersive than conventional approaches because it presents a controllable hand model with an attached ultrasound transducer, enabling remote learners to perform complex procedures such as point-of-care ultrasound without visual interference. The HoloLens captures the first-person view of a simulated rural emergency room through mixed reality capture (MRC), while a Leap Motion controller captures the mentor's hand gestures and displays them virtually in the AR space of the HoloLens. The trainee wears the AR headset and follows voice instructions together with the mentor's transported hand gestures. This could become an alternative tool for ultrasound training [21].
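
Conceptually, such a telementoring link transmits a compact stream of hand-pose data, rather than video, from the mentor's tracker to the trainee's headset, where it is re-rendered as a virtual hand. The sketch below shows one plausible wire format for that stream, JSON over UDP with hypothetical field names and joint values; the actual HoloLens/Leap Motion integration in [21] is not published in this form.

```python
import json
import socket
import time

MENTOR_ADDR = ("127.0.0.1", 9050)   # hypothetical address of the trainee's renderer

def hand_frame(timestamp, joints):
    """Pack one frame of mentor hand-tracking data as a small JSON message.

    joints -- mapping of joint name -> (x, y, z) in metres, e.g. from a hand tracker.
    """
    return json.dumps({
        "t": timestamp,
        "hand": "right",
        "joints": {name: [round(c, 4) for c in xyz] for name, xyz in joints.items()},
    }).encode("utf-8")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Two hypothetical frames of a probe-holding gesture (static values for illustration).
for i in range(2):
    joints = {
        "wrist":     (0.02, -0.10, 0.35),
        "index_tip": (0.05, -0.02, 0.30 - 0.01 * i),
        "thumb_tip": (0.01, -0.03, 0.31),
    }
    sock.sendto(hand_frame(time.time(), joints), MENTOR_ADDR)
    time.sleep(1 / 60)               # tracker frame rate of ~60 Hz assumed
```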


Fig. 6. A novel Augmented Reality telemedicine platform involving real-time remote pointing and gesture capture [21]

Educating the Consumer

With AR, pharmaceutical companies can improve patient education by visualising complex products: patients can see a 3D version of a drug working in front of their eyes instead of just reading long descriptions on the bottle. Lab analysts can monitor their experiments with AR equipment. In factories, workers can start working without hands-on training, as the device tells them what to do and how to do it. AR tools can display the process of taking a medicine, along with its benefits, in a visual way. For instance, Pfizer uses AR powered by Blippar: consumers can learn how ThermaCare works and find out which product options are best suited to relieve their pain and stiffness [22].

Adoption of New Technology

One challenge the healthcare sector faces is the lag between a promising, innovative technology coming out of its development phase and its widespread adoption. This delay is due to the cost of purchasing new technology, the time it takes to raise awareness and the need to integrate new systems, from installation to staff training. Tools such as Proximie can help vendors reach potential customers all over the world at any time and offer in-depth demonstrations, rather than waiting for those all-too-rare opportunities to demonstrate new products face-to-face [23].

Key Market Players Developing Ground-Breaking AR Solutions
HoloAnatomy with Microsoft’s HoloLens

HoloAnatomy, a HoloLens app released by Case Western Reserve University and the Cleveland Clinic, in partnership with Microsoft, can visualize the human body easily and spectacularly.

Microsoft HoloLens is the first AR headset to create holograms of digital content within a real-world environment. Medical students can bring up a realistic image of a heart or brain in a classroom and rotate, enlarge, take apart and examine it in depth in front of their eyes. Surgeons can also use the holograms while preparing for complex operations [24].


Fig. 7. HoloLens 2 [24]

AccuVein

AccuVein has developed the world's first handheld, non-contact vein illumination solution, which makes blood drawing easier for both patients and practitioners. It employs augmented reality through a handheld scanner that projects light over the skin to show medical professionals the precise location of veins in a patient's body. The device detects the near-infrared signature of the patient's veins and converts it into an image superimposed on the skin. It increases efficiency and decreases the likelihood of a second or third needle stick [25].


Fig. 8. New AccuVein AV500 [25]

Orca Health

Orca Health is a mobile software company that uses mobile applications and integrated tools to educate patients and help them make better decisions about their health. The Orca Health app features a ground-breaking augmented reality experience with fully interactive 3D anatomical models that can be manipulated with the touch of a finger. Users can view 3D anatomical models of ten areas of the human body, including the spine, shoulder, hip, knee, hand, foot, heart, ENT, eye and mouth [26].


Fig. 9. Immersive 3D content in an intuitive app [26]

EchoPixel

EchoPixel develops diagnostic, surgical-planning and image-guided treatment applications. Its True 3D AR system gives clinicians a holographic experience for visualizing and interacting with patient-specific organs and tissue in an open 3D space, and assists physicians in planning surgical and interventional procedures to treat congenital heart defects. It enables physicians to interact with medical images (such as standard DICOM CT, MR, echocardiography and C-arm fluoroscopy) the way they would with physical objects in the real world, and facilitates interaction with virtual patient-specific anatomy without the need for a bulky VR/AR headset. Premier institutions such as Cincinnati Children's Hospital, Primary Children's Hospital, C.S. Mott Hospital, Lucile Packard Children's Hospital and Nemours Children's Health System have adopted True 3D for clinical use and are leading a renaissance in surgical imaging [27].


Fig. 10. True 3D platform [27]

Brain Power

Brain Power integrates pioneering neuroscience with the latest in wearable technology, in particular Google Glass. It builds software that converts wearables into neuro-assistive devices for the educational challenges of autism; the aim is to teach life skills to children and adults on the autism spectrum. It has developed a software suite, Empowered Brain, to help children with language, social skills and positive behaviour [28].


Fig. 11. Three children on the autism spectrum using the evidence-based “Empowered Brain” augmented-reality system to address the learning challenges of autism [28]

Augmedix

The Augmedix platform is powered by a combination of proprietary natural-language-processing technology and teams of medical documentation experts. It provides clinicians with hardware, smartphones or Google Glass, to securely stream the clinic visit to its cloud-based platform. Tech-enabled specialists remotely operate the platform and use proprietary automation modules to generate accurate, comprehensive and timely medical documentation. The service works with over 25 specialties and supports most EHRs. Augmedix is used by independent providers and more than a dozen major health systems, including Dignity Health, Sutter Health, The US Oncology Network, TriHealth and others [29].

Atheer

Atheer has developed the Augmented interactive Reality (AiR) smart glasses platform, which connects AR glasses to an Android-based computer. AiR smart glasses integrate hand tracking and gesture control with a see-through display, enabling users to view critical work information right in their field of view and interact with it using familiar gestures and voice commands. Users can also remotely collaborate with experts on video calls and receive guidance through real-time image annotations, increasing efficiency while staying focused on the task at hand [30, 31].


Fig. 12. Augmented interactive Reality (AiR) platform [30]

Aira

The name Aira is derived from Artificial Intelligence (AI) and Remote Assistance (RA). The service delivers instant access to visual information for a user anytime and anywhere, empowering trained professional agents to remotely assist blind or low-vision people. The team uses deep-learning algorithms to describe the environment to the user, read out text, recognize faces and give warnings about obstacles. The system allows an Aira agent to see what the blind person sees in real time and then talk them through whatever situation they are in [32].


Fig. 13. Aira platform [32]

Interesting Patents

Some recent patent applications, presented below, could be an indicator of things to come.

US10326975B2 from OnPoint Medical, Inc. deals with a real-time surgical navigation method and apparatus for image-guided surgery, particularly CT-guided, MR-guided, fluoroscopy-based or surface-based image-guided surgery, in which images of a portion of a patient are taken preoperatively or intraoperatively and used during surgery for guidance. The surgeon views a graphical representation of the preoperative or intraoperative images blended with the video images in a stereoscopic manner through a see-through head-mounted display.

WO2018US56422A from INTUITIVE SURGICAL INC. deals with a method comprising display of a surgical environment image that includes a virtual control element for controlling a component of a surgical system. The method also includes displaying an image of a body part of a user for interacting with the virtual control element, and receiving a user input from a gesture-based input device while the body part interacts with the virtual control element.

EP3574361A1 from NOVARTIS AG deals with a system in which an augmented reality device is coupled to the imaging system of an ophthalmic microscope. The augmented reality device includes a lens to project a digital image, a gaze control to detect the focus of the operator's eye, and a dimming system communicatively coupled to the gaze control and the outer surface. It also covers a method of performing ophthalmic surgery using the augmented reality device and a non-transitory computer-readable medium able to perform augmented reality functions.

WO2019171216A4 from STATE OF ISRAEL MINISTRY OF DEFENSE RAFAEL ARMAMENT DEVELOPMENT deals with an augmented reality device and method for assisting people with walking or movement disorders. The user wears a device that includes a superimposer for overlaying augmented information on the user's real-world field of view. The wearable device provides the user with a downward-directed field of view aimed at the foot or footwear, and the superimposed information is designed to appear within that downward field of view as if present close to the foot or footwear.

Startups to Watch
Proximie

Founded in 2016, Proximie is a secure surgical collaboration platform that enables surgeons to share knowledge before, during and after surgery, and to connect and interact with colleagues anywhere in the world. Its app allows the user to project anatomical cross-sections onto a patient or show 3D visualisations of internal organs, so the surgeon gets a see-through view as they plan an operation.

It is designed for use in low-resource settings: it works over low-speed internet and can be used with almost any device that has a camera. Coupled with an extensive library of recorded specialist cases and training materials, it is an affordable, long-term and sustainable solution. It was funded by Cedar Mundi Ventures and is working with the Global Smile Foundation to help provide quicker access to cleft surgery in Ecuador [23, 33].


Fig. 14. Proximie Remote Augmented Reality Surgery Platform

BioFlightVR

Founded in 2015, BioFlightVR offers a wide range of medical virtual reality and augmented reality services, including VR/AR doctor training and 360° enhanced videos to train physicians and surgeons on new products and procedures within their field. It has also developed an AR/VR medical training module to help students and healthcare providers refine their learning and increase knowledge retention.

It has raised a total of $250K in funding. It has created a training program for Children's Hospital Los Angeles for teaching proper procedures and maintaining readiness for low-frequency, high-risk situations. In partnership with Duke University, it set up an Immersive Training Lab to improve curriculum effectiveness and retention. It also partners with Oculus, HTC Vive, HP, Nvidia and many others [34, 35].

Medical Realities

Founded in 2015, Medical Realities is a medical training organization building an impressive repository of surgical resources captured through Google Glass. Users can experience surgery through the eyes of an experienced consulting surgeon, seeing what they see and picking up valuable techniques [36].

AIMS, an Italian academy specialising in physician training, has been using the Medical Realities platform for three years to enhance its training of surgical procedures with the Medical Realities 360 Player; over 400 students per year use the platform [36].

The Royal Brompton & Harefield NHS Foundation Trust uses the Medical Realities platform in a diverse set of ways within the hospital environment, to improve safety, enhance skills and improve the wellbeing of patients. It provides a low-cost and scalable way to reach employees and patients [36].

Roche has been using the platform because immersive technology is a great way to explain new or complex ideas. With the Medical Realities platform, it communicates with internal stakeholders, provides workplace education and helps its representatives explain new or existing products to healthcare providers [36].

Help Lightning

Founded in 2009, Help Lightning enables a remote surgeon to project their hands into the display of an on-site surgeon wearing the AR technology and thereby offer guidance during the procedure [37].

Help Lightning provides Remote Expertise through the use of Merged Reality, blending two real-time video streams, that of a remote expert and that of an onsite technician, into a collaborative environment. Its technology allows the expert to virtually reach out and touch what the service technician or customer is working on. It runs on existing mobile devices (iOS, Android) or a web browser, enabling experts to provide remote assistance as though they were working side by side. They can telestrate, freeze images, use hand gestures and even add real objects into the Merged Reality environment [37].


Fig. 15. Help Lightning Remote Expertise [37]

SentiAR

Founded in 2017, SentiAR adds a new dimension to clinical practice in interventional procedures. It has developed the first 3D visualization platform to use real-time holography of the patient's actual anatomy and catheter location, aiming to give the clinician and patient faster and safer delivery of care. The visualization is entirely controllable hands-free by the caregiver, an ergonomic breakthrough for the treatment and analysis of cardiac arrhythmias within an interventional catheter lab. SentiAR is developing the platform for eventual FDA submission [38].


Fig. 16. 3D augmented reality platform [38]

SentiAR’s technology converts CT, MRI, and real-time mapping/catheter location outputs into a real-time hologram in the clinician field of view. The EP can expand, measure, or enter the chambers of the “floating” cardiac model for a significantly faster procedure with higher accuracy. It allows for visualization to multiple clinicians with multiple headsets [38].

Meta View

Founded in 2013, Meta View makes an AR headset of the same name, as well as the software that runs on it, offering users a complete augmented reality experience for a variety of purposes, including assisting surgeons with patient information and data in the healthcare space. It offers wearable devices that provide augmented and virtual reality experiences, along with other gadgets based on machine learning and artificial intelligence [39].

Meta View purchased the assets of the original Meta, which were sold after the latter's creditor unexpectedly foreclosed on its loan. The new company is led by former Qualcomm executive Jay Wright [40].

ImmersiveTouch

Founded in 2009, ImmersiveTouch provides augmented and virtual reality training and surgical simulation in healthcare. Its AR platform, ImmersiveTouch3, is built into durable hardware and comes with an overhead monitor that lets others watch in real time as the user goes through a simulation. It uses proprietary ultra-realistic haptic feedback to simulate the tactile sensations of operating on human anatomy, such as the resistance of skin, tissue and bone. A hand-held robotic stylus mimics the relevant surgical tools and delivers the appropriate tactile feedback. The system is well suited to educational programs and can be used to rehearse surgical procedures for enhanced clinical performance [41].


Fig. 17. ImmersiveTouch platform [41]

ImmersiveTouch has partnered with numerous medical institutions to conduct scientific studies on the effectiveness of its surgical simulations. These studies have consistently shown that practice on its technology is linked to improvements in actual surgical performance in dimensions such as surgical accuracy, time efficiency, technical finesse and procedural knowledge [41].

Pravin K. Patel, M.D., Chief of Pediatric Plastic and Craniofacial Surgery at the Craniofacial Center of the University of Illinois Health & Medical Science Center and Chicago Shriners Hospitals for Children, who has conducted clinical studies with ImmersiveTouch, said that even though surgeons could see the 3D anatomy, they could never interact with it: the patient's 3D anatomy was trapped behind the 2D surface of the display screen. ImmersiveView™ allowed them to cross that glass barrier into a different universe where they could interact with the anatomy. For the first time, it gave the surgeon the ability to touch and to have depth perception [41].

Touch Surgery

Touch Surgery is an interactive surgical simulation platform that uses AR to deliver information to healthcare professionals. Its interactive simulators provide a realistic and detailed guide to every step of a procedure, enabling users to quickly learn and rehearse for surgery and instantly test their knowledge. Instead of working on a patient, however, the procedures are projected holographically on screen [42].

Touch Surgery's industry clients include Ethicon, Stryker, Lipogems, OrthoSensor, Zimmer Biomet, Vericel and many others [42].


Fig. 18. Touch Surgery platform [42]

Stephen Beckmann, Sr. Program Manager, Medical Education, Stryker, said that, working with Touch Surgery's talented team of animators and software engineers, they were able to create something unique both in the industry and for their respective companies. The end result was a unique learning tool that has been deployed to support training curricula for surgeons, Stryker technical support staff and the sales force [42].

Challenges in AR

Although current AR devices display information with increasing accuracy and lower latency, several difficulties remain to be addressed. Currently, images are reconstructed and prepared in advance using complex algorithms and powerful computers. As technology advances, real-time acquisition of high-resolution medical scans and 3D reconstructions may become possible, and such reconstructions could significantly improve overall accuracy. Although AR can speed up a surgical procedure itself, the need to prepare the whole system and make the required registrations and measurements increases the total time needed to complete the surgery. The introduction of fully automatic systems could eliminate this problem and reduce the total procedure time [43].

Even where AR shortens the procedure itself, certain limitations remain. Inattentional blindness is a concerning issue that needs to be addressed when using 3D overlays: the amount of information presented through AR during surgery keeps increasing and can become distracting. It is therefore necessary to display only useful data, or to provide a way to switch between different sets of information as required [43].

Adequate image contrast must also be achieved when projecting directly onto organs, and there are persistent difficulties in creating correct 3D and depth perception. The latency of the whole system is another issue, because excessive latency may lower precision and reduce the surgeon's comfort [43].

Current HMDs usually weigh several hundred grams and produce considerable heat, causing discomfort during long-term wear. Their ergonomics need to improve to allow continuous use over long periods. AR projections in HMDs are also known to produce simulator sickness, which presents as nausea, headache, vertigo or, in the worst case, vomiting. The exact causes of simulator sickness are unknown; an inconsistency between visual, proprioceptive and vestibular inputs is the probable cause [43].

Recent Technical Developments

Compared to the first generation of HMDs, significant changes have occurred in recent years, particularly in the quality of visual images. Early-generation HMDs provided a resolution comparable to that of standard TV, i.e. 800×600 pixels per eye, with an average field of view of about 30 degrees. New-generation HMDs can display full high-definition resolution or more, with a field of view of at least 110 degrees. This allows the creation of experiences that appear more immersive and realistic, and increases the level of detail and precision users need for professional applications such as those in the medical field.
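
A useful way to compare such displays is angular pixel density (pixels per degree of field of view), which governs perceived sharpness: widening the field spreads the available pixels over more of the eye's view, so resolution must grow roughly in proportion to preserve detail. A small worked example using the figures quoted above (horizontal-only arithmetic, ignoring lens distortion):

```python
def pixels_per_degree(horizontal_pixels, fov_deg):
    """Approximate angular pixel density, ignoring lens distortion."""
    return horizontal_pixels / fov_deg

early_density = pixels_per_degree(800, 30)               # early HMD: 800 px over ~30 deg
print(f"early HMD density: {early_density:.1f} px/deg")  # ~26.7 px/deg

# Horizontal pixels a 110-degree display would need to match that density:
needed = early_density * 110
print(f"pixels needed at 110 deg: {needed:.0f}")         # ~2933 px, i.e. beyond full HD
```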

The overlaid AR images can be either monoscopic or stereoscopic; stereoscopic images have the advantage that they can be positioned in 3D space in front of the viewer. Google Glass and Epson smart glasses are examples of this technology, and they are built from lightweight glass so that users can carry them in a pocket.

Medical professionals may be interested in using Google Glass in surgical settings because of the hands-free device's potential to streamline workflow and maintain sterile conditions in the operating room. Google Glass has been used in clinical simulation-based training, for capturing video during care delivery, for ultrasound-guided venous access and as a tool for paediatric surgery. Applying smart glasses in disaster scenarios could improve the quality of triage through telemedicine and AR features [44, 45, 46].

Current computers follow Moore's law and have sufficient power to perform graphical computation of high-quality images in real time. The underlying algorithms, based on machine-learning theory, are now supported by dedicated chips and purpose-built components in AR hardware.

Data-transfer technology has reached a level where huge amounts of video data can be moved with almost no latency over dedicated cable connections or even wirelessly. The Microsoft HoloLens is powered by high-performance batteries and currently allows untethered operation for approximately three hours. Additionally, Microsoft provides a software development kit (SDK) that gives access to lower-level algorithms performing elementary AR functions such as collision detection or recognition of the surrounding space.

Conclusion

AR is expected to serve as an advanced human-computer interface, working in symbiosis with healthcare professionals and allowing them to achieve even better results. Merging technical expertise with local needs and health contexts can lead to new types of interventions and opportunities. Understanding the relationship between AR and health, and its subsequent acceptance, may influence the willingness of people within organizations to adopt these technologies, particularly when it comes to wearing head-mounted displays. Future uses of AR may include electronic medical records (EMR) being automatically displayed on a device during a doctor's examination or consultation with a patient, again highlighting the immediacy that AR can bring to medical practice.

References
  1. Paschold M, Huber T, Zeißig SR, Lang H, Kneist W. Tailored instructor feedback leads to more effective virtual-reality laparoscopic training. Surgical Endoscopy. 2014;28: 967-973. DOI: 10.1007/s00464-013-3258-z
  2. Hamacher A, Kim SJ, Cho ST, Pardeshi S, Lee SH, Eun S-J, et al. Application of virtual, augmented, and mixed reality to urology. International Neurourology Journal. 2016; 20:172-181. DOI: 10.5213/inj.1632714.357
  3. Barsom EZ, Graafland M, Schijven MP. Systematic review on the effectiveness of augmented reality applications in medical training. Surgical Endoscopy. 2016; 30:4174-4183. DOI: 10.1007/s00464-016-4800-6
  4. Vera AM, Russo M, Mohsin A, Tsuda S. Augmented reality telementoring (ART) platform: A randomized controlled trial to assess the efficacy of a new surgical education technology. Surgical Endoscopy. 2014; 28:3467-3472. DOI: 10.1007/s00464-014-3625-4
  5. Evans CH, Schenarts KD. Evolving educational techniques in surgical training. The Surgical Clinics of North America. 2016; 96:71-88. DOI: 10.1016/j.suc.2015.09.005
  6. Mucahit Bayrak et al. A Novel Rotation Invariant and Manhattan Metric based Pose Refinement: Augmented Reality based Oral and Maxillofacial Surgery. 14 January 2020. https://doi.org/10.1002/rcs.2077
  7. Chen, L., Zhang, F., Zhan, W. et al. Optimization of virtual and real registration technology based on augmented reality in a surgical navigation system. BioMed Eng OnLine 19, 1 (2020). https://doi.org/10.1186/s12938-019-0745-z
  8. Hamacher, et al., Application of Virtual, Augmented, and Mixed Reality to Urology. Int Neurourol J 2016;20:172-181. http://dx.doi.org/10.5213/inj.1632714.357
  9. Al-Issa, H.; Regenbrecht, H.; Hale, L. Augmented reality applications in rehabilitation to improve physical outcomes. Phys. Ther. Rev. 2012, 17, 16–28
  10. Sigrist, R.; Rauter, G.; Riener, R.; Wolf, P. Augmented visual, auditory, haptic, and multimodal feedback in motor learning: A review. Psychon. Bull. Rev. 2013, 20, 21–53
  11. Da Gama, A.; Chaves, T.; Figueiredo, L.; Teichrieb, V. Guidance and movement correction based on therapeutics movements for motor rehabilitation support systems. In Proceedings of the 14th Symposium on Virtual and Augmented Reality, Rio de Janeiro, Brazil, 28–31 May 2012; pp. 191–200
  12. Renz Ocampo et al. Improving User Performance in Haptics-Based Rehabilitation Exercises by Colocation of User’s Visual and Motor Axes via a Three-Dimensional Augmented-Reality Display. IEEE Robotics and Automation Letters (Volume: 4, Issue: 2, April 2019). DOI: 10.1109/LRA.2019.2891283. https://ieeexplore.ieee.org/abstract/document/8604049/authors#authors
  13. M. Gazzoni et al. Augmented reality system for muscle activity biofeedback. Annals of Physical and Rehabilitation Medicine; Volume 61, Supplement, July 2018, Pages e483-e484. https://doi.org/10.1016/j.rehab.2018.05.1129
  14. https://www.researchgate.net/publication/262342964_Ghostman_Augmented_Reality_Application_for_Telerehabilitation_and_Remote_Instruction_of_a_Novel_Motor_Skill
  15. Tony Liao et al., Chapter 6 – Augmented reality in health and medicine: A review of augmented reality application for health professionals, procedures, and behavioral interventions. Technology and Health: Promoting Attitude and Behavior Change, 2020, Pages 109-128. https://doi.org/10.1016/B978-0-12-816958-2.00006-X
  16. David B. Douglas et al. Augmented Reality: Advances in Diagnostic Imaging. Multimodal Technologies Interact. 2017, 1(4), 29; https://doi.org/10.3390/mti1040029
  17. http://arinmed.com/breaking-visual-boundaries-augmented-reality/
  18. https://fyidoctors.com/en/blog/categories/health-and-wellness/oxsight-glasses-augmented-reality-for-the-visually-impaired
  19. Svetlana Samoilova, Amitava Gupta, Ron Blum, and Igor Landau, “NewSight Reality Inc. (NSR) novel transparent optical module for augmented reality eyewear”, Proc. SPIE 11062, Digital Optical Technologies 2019, 110620D (21 June 2019); https://doi.org/10.1117/12.2531774
  20. Bifulco et al., Telemedicine supported by Augmented Reality: an interactive guide for untrained people in performing an ECG test. BioMedical Engineering OnLine 2014, 13:153. http://www.biomedical-engineering-online.com/content/13/1/153
  21. Shiyao Wang et al., Augmented Reality as a Telemedicine Platform for Remote Procedural Training. Sensors 2017, 17, 2294; doi:10.3390/s17102294
  22. https://www.blippar.com/blog/2018/05/18/3-benefits-of-augmented-reality-in-healthcare
  23. https://www.proximie.com/
  24. https://www.microsoft.com/en-us/hololens
  25. https://www.accuvein.com/why-accuvein/ar/
  26. https://orcahealth.com/software/orca-health-app
  27. https://echopixeltech.com/index.html
  28. http://www.brain-power.com/autism/
  29. https://augmedix.com/
  30. https://powelldesigns.com/portfolio/atheer-air-medical/
  31. https://atheerair.com/platform/
  32. https://aira.io/
  33. https://www.crunchbase.com/organization/proximie
  34. https://www.bioflightvr.com/
  35. https://www.crunchbase.com/organization/bioflightvr
  36. https://www.medicalrealities.com/
  37. https://helplightning.com/
  38. https://sentiar.com/
  39. https://www.crunchbase.com/organization/meta-view#section-overview
  40. https://www.theverge.com/2019/5/28/18642350/meta-view-metavision-augmented-reality-ar-headset-new-company-launch-jay-wright
  41. https://www.immersivetouch.com/immersivesim-training
  42. https://www.touchsurgery.com/
  43. P. Vávra et al., Recent Development of Augmented Reality in Surgery: A Review. Journal of Healthcare Engineering, Volume 2017, Article ID 4574172, 9 pages. https://doi.org/10.1155/2017/4574172
  44. Wei et al., Using Google Glass in Surgical Settings: Systematic Review. JMIR Mhealth Uhealth 2018;6(3): e54; doi: 10.2196/mhealth.9409
  45. Chaballout et al., Feasibility of Augmented Reality in Clinical Simulations: Using Google Glass With Manikins. JMIR MEDICAL EDUCATION 2016;2(1): e2; doi: 10.2196/mededu.5159.
  46. Follmann et al., Technical Support by Smart Glasses During a Mass Casualty Incident: A Randomized Controlled Simulation Trial on Technically Assisted Triage and Telemedical App Use in Disaster Medicine. J Med Internet Res 2019;21(1):e11939; doi: 10.2196/11939

Disclaimer:

  • This document has been created for educational and instructional purposes only
  • Copyrighted materials used have been specifically acknowledged
  • We claim the right of fair use as ascertained by the author

AUTHOR

Mr. Anil Vadnala