Augmented reality (AR) is one of the recent innovations making inroads into several markets, including healthcare. AR takes digital or computer-generated information, such as audio, video, images and touch or haptic sensations, and overlays it on the real-world environment in real time. AR offers feasible solutions to many challenges within the healthcare system and, as such, presents numerous opportunities for implementation in areas such as medical training, surgical assistance and rehabilitation. AR innovations can help medical personnel diagnose, treat and operate on their patients more accurately.
Some of the pioneering AR solutions aimed at changing the face of healthcare to meet the challenges faced by the users are discussed below.
AR has a deep impact on medical training, with applications ranging from 3D visualizations to bringing anatomical learning to life. AR applications project extensive information, 3D visual structures and links onto the traditional pages of medical textbooks for training in, say, anatomy. Recent hardware platforms, such as Microsoft's HoloLens glasses, have started supporting medical education applications. Using hand gestures, learners can deform, fly through and otherwise interact with 3D models to reveal hidden organs [1, 2].
Vimedix™, from CAE Healthcare, is used for training students in echocardiography. It consists of a mannequin and a transthoracic or transesophageal echocardiography transducer. It enables healthcare professionals to see how an ultrasound beam cuts through human anatomy in real time. CAE Healthcare has begun to integrate HoloLens into its technology so that images can be viewed through glasses, unrestricted by the dimensions of a screen.
The main application of AR in surgical training is telementoring, i.e. a supervisor teaches the trainee by demonstrating the proper surgical moves, paths and handling on the AR screen. This information is displayed to the trainees as they are guided. As a learning tool, AR provides the key benefit of a highly engaging and immersive educational experience that combines different sensory inputs.
AR finds its widest application in surgery, in fields such as nephrectomy, neurosurgery, orthopaedics and, particularly, laparoscopic surgery. Reliability and realism are extremely important in these fields, not only for the comfort of the user but also to ensure that no inappropriate handling is learnt and that actual traumatic conditions are recreated. AR offers certain advantages, such as minimal cost per use, the absence of ethical issues and safety, as compared to training on actual patients.
Oral and maxillofacial surgery (OMS) involves sensitive, spatially confined procedures that demand high image-registration accuracy and low system processing time. Current systems suffer from registration problems when matching images of two different postures. Researchers from Charles Sturt University, Australia, developed a fast AR system with improved visualization for OMS. The proposed system achieved improvements in both overlay accuracy and processing time.
Surgical navigation is essential for performing complex operations accurately and safely. The traditional navigation interface, intended only for two-dimensional observation, does not display the full spatial information of the lesion area. Researchers from Soochow University, China, have applied augmented reality (AR) technology to spinal surgery to provide surgeons with more intuitive information. They proposed a virtual-real registration technique, based on an improved identification method and a robot-assisted method, to improve registration accuracy. The effectiveness of the puncture performed by the robot was verified with X-ray images, and both optimized methods proved highly effective. The proposed AR navigation system has high accuracy and stability, and is expected to become a valuable tool in future spinal surgery.
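At the heart of any such virtual-real registration is a rigid alignment that best maps fiducial points detected in the scene onto their known model positions. The following is a minimal 2D sketch of the closed-form least-squares solve (the 3D case is usually handled with an SVD-based Kabsch algorithm); all point values are illustrative, not taken from the cited system.

```python
import math

def register_2d(src, dst):
    """Closed-form 2D rigid registration (rotation + translation)
    that best maps paired fiducial points src -> dst in the
    least-squares sense."""
    n = len(src)
    cax = sum(p[0] for p in src) / n
    cay = sum(p[1] for p in src) / n
    cbx = sum(p[0] for p in dst) / n
    cby = sum(p[1] for p in dst) / n
    S = D = 0.0
    for (ax, ay), (bx, by) in zip(src, dst):
        ax -= cax; ay -= cay; bx -= cbx; by -= cby
        S += ax * bx + ay * by   # sum of dot products
        D += ax * by - ay * bx   # sum of cross products
    theta = math.atan2(D, S)     # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cbx - (c * cax - s * cay)   # translation after rotation
    ty = cby - (s * cax + c * cay)
    return theta, (tx, ty)

def apply(theta, t, p):
    """Map a point through the recovered rigid transform."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + t[0], s * p[0] + c * p[1] + t[1])
```

With noise-free paired points the rotation and translation are recovered exactly; with noisy fiducials the residual distance after mapping is the fiducial registration error that navigation systems report.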
The Fraunhofer Research Institute has initiated a project, MEVIS, that uses an iPad-based AR application during liver operations. It helps the surgeon locate blood vessels inside the organ by comparing the actual operation with planning data based on 3D X-ray images. The figure below shows an overlay of the planning data on the actual camera image, as if looking inside the organ.
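At the pixel level, such an overlay is plain alpha compositing of the rendered planning image over the live camera frame; a minimal per-pixel sketch (the alpha value is illustrative):

```python
def blend_pixel(camera, overlay, alpha):
    """Alpha-composite one RGB overlay pixel onto a camera pixel.
    alpha = 0.0 shows only the camera image, alpha = 1.0 only the
    rendered planning data; values in between give the see-through
    'look inside the organ' effect."""
    return tuple(round(alpha * o + (1.0 - alpha) * c)
                 for c, o in zip(camera, overlay))
```

For example, compositing a pure-red vessel overlay at 40% opacity over a dark camera pixel yields a dimmed red; real systems do this per pixel on the GPU, but the arithmetic is the same.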
A recent study on 3D image-guided surgery indicates that AR with visual cues to the subsurface anatomy could substitute for direct visualization in minimally invasive urologic surgery.
AR is at an exploratory stage in the rehabilitation field, where it shows advantages over traditional rehabilitation methods by creating interactive and immersive environments. Moreover, AR can provide reliable and accurate feedback to guide and correct the patient during an exercise, enhancing individual motor learning. In this way, rehabilitation can take place outside a clinical setting and without therapist supervision. Even in unsupervised rehabilitation, the AR system can provide the therapist with the user's performance data, enabling an offline physiotherapeutic evaluation [9, 10, 11].
Researchers from the University of Alberta, Canada, have developed a 3D-spatial augmented reality (AR) display that co-locates visual and haptic feedback to the user in three rehabilitative games. Users were placed under cognitive load (CL) to simulate disability-induced cognitive deficits while performing tasks. AR was found to yield the best user performance, with or without cognitive loading; the benefit was most evident in dynamic exercises requiring quick reaction times and fast movement.
Researchers from the Polytechnic University of Turin, Italy, have developed an AR system for real-time visualization of an index of muscle activity superimposed on the investigated muscle. The system includes a video camera, one or more surface EMG (sEMG) detection systems, and a processing and visualization unit. It integrates the information from the video camera and the sEMG systems to create an augmented video frame. The patient or the clinical operator can see the real-time augmented video on a display.
Researchers from the University of Tasmania have developed a visual augmentation system, Ghostman, that allows a physical therapist and patient to inhabit each other's viewpoint in an augmented real-world environment. It consists of two subsystems, one operated by the patient and the other by the therapist, connected via the internet to enable remote telepresence. The therapist delivers instructions remotely and observes the patient's motor skills from the patient's perspective.
Fig. 2. Inhabiting visual augmentation 
It was observed that fine motor skills developed much faster using the AR tool compared to traditional face-to-face demonstration.
AR is helping individuals overcome phobias, specifically bug phobias. To treat bug phobia, exposure therapy can be delivered through AR to decrease the fear of bugs in patients who are clinically diagnosed with phobias. AR can improve on existing exposure therapy because healthcare providers have total control over the AR stimulus and can ensure that nothing unexpected happens to the patient. It is effective, with the added advantage of full control over the exposure conditions.
AR technologies can enhance emotional engagement and the sense of presence in health treatment interventions.
There have been creative AR implementations for children with different health conditions. Researchers found that AR-based music therapy increases motor exercise in children with cerebral palsy. Results from this study showed the potential of AR in music-therapy rehabilitation, promoting positive cognitive, physical, mental and social changes in individuals with cerebral palsy. This type of intervention could serve as a tool for various cognitive, motor, psychological and social health conditions.
Doctor-patient communication is a key component of healthcare: effective communication enhances patient motivation, increases the overall benefit patients derive from healthcare and raises the perceived level of support they receive. AR can potentially evoke empathy, an important factor in family communication and perceived doctor-patient communication. Evoking empathy through AR could have unique applications for understanding patients' situations as they go through an illness. Through visual effects, AR applications could in theory use presence and immersion to help people vicariously experience patient symptoms.
AR technology provides opportunities for improved diagnosis and characterization of diseases, including cancer. The user can simultaneously view a virtual image from the patient's imaging investigation and the real-world image of the surroundings. The real-world image would be the patient's anatomy when a cohesive physical examination and medical imaging assessment, pre-operative planning assessment or intraoperative procedure is being conducted. In other situations, the real-world image could include other environments, such as the operating room, an educational lecture hall, a radiology reading room or a physician's office.
Microsoft's HoloLens helps doctors diagnose patients more effectively by overlaying CT scans and other imagery onto a patient's body. AR with depth 3-dimensional (D3D) imaging provides depth perception through binocular vision, head tracking for an improved human-machine interface (HMI) and other key features. It can enhance diagnosis, save cost and improve patient care.
Ophthalmology is one of the medical specialties that has developed fastest in recent years. New procedures and practices arise in ophthalmology every year, and augmented reality is among them.
University of Oxford researchers have developed a new system called OxSight to aid navigation for partially sighted individuals. It is designed to enhance vision for people with peripheral vision loss resulting from, for example, glaucoma, retinitis pigmentosa and homonymous hemianopia. The glasses enable the visually impaired to see better by amplifying their remaining sight and helping the brain make sense of the information. OxSight's glasses can be customized to the unique visual restrictions each individual faces [17, 18].
The Transparent Optical Module (TOM) from NewSight Reality Inc. addresses many limitations of today's AR devices, such as size, weight, light and energy efficiency and transparency, along with the ability to be embedded within an ophthalmic lens. It has an exclusive transparent light engine and a micro-lens array optical engine to project digital information from an image source to the eye of a user. It is hermetically sealed with a flexible multilayer coating that makes it resistant to sweat and environmental conditions. One of the main advantages of TOM is that it is a single thin, lightweight module that operates with low power and a long battery life. TOM is curved for integration into an eyewear lens for monocular or binocular AR eyewear, delivering highly efficient AR with high image quality.
In telemedicine, the correct use of a medical device at the point of need is essential to providing appropriate service. In some scenarios, when medical professionals are not available, untrained people may be required to interact with medical devices and patients. AR thus becomes an effective tool in telemedicine for guiding untrained people at the point of need.
Researchers from the University of Naples, Italy, have built a prototype system to assess how untrained users, with limited or no knowledge, can effectively interact with an ECG device and properly place ECG electrodes on a patient's chest. Markers are attached to the ECG device and to the patient's thorax to allow camera calibration. The video of the current scene is augmented in real time: objects and their pose in space are recognized, and pointers, text boxes and audio help the untrained users perform the appropriate operations. Voice commands are also supported to improve usability. Untrained users were able to carry out a complete ECG test, first on a mannequin and then on a real patient, in a reasonable timespan. This approach can be adapted to support other medical equipment as well as other telemedicine tasks.
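Once the markers yield the camera pose, anchoring a pointer or text box at a 3D location reduces to pinhole projection of that point into the camera image. A minimal sketch follows; the intrinsic parameters in the comment are hypothetical, not taken from the cited prototype.

```python
def project_point(p_cam, fx, fy, cx, cy):
    """Project a 3D point, given in camera coordinates (metres),
    onto the image plane of an ideal pinhole camera.
    fx, fy are the focal lengths and (cx, cy) the principal point,
    all in pixels; lens distortion is ignored in this sketch."""
    X, Y, Z = p_cam
    if Z <= 0:
        raise ValueError("point is behind the camera")
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return u, v

# Hypothetical 640x480 camera (fx = fy = 800, principal point at
# (320, 240)): a point 2 m straight ahead lands at the principal
# point; lateral offsets scale with focal length over depth.
```

Marker-based systems first recover the camera pose from the detected marker corners, transform the annotation's 3D anchor into camera coordinates, and then apply exactly this projection to decide where on screen the pointer is drawn.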
Fig. 5. Untrained user on a mannequin and on the portable ECG. Pictures representing an untrained user while placing precordial electrodes on the mannequin (right) and operating the ECG-device (left) 
Researchers from the Memorial University of Newfoundland, Canada, have developed a new ultrasound telementoring AR application using Microsoft HoloLens to facilitate and enhance remote medical training. The developed AR system is more immersive, as it presents a controlled hand model with an attached ultrasound transducer. It enables remote learners to perform complex medical procedures, such as point-of-care ultrasound, without visual interference. HoloLens captures the first-person view of a simulated rural emergency room through mixed reality capture (MRC). Leap Motion is used to capture the mentor's hand gestures and display them virtually in the AR space of the HoloLens. The trainee wears an AR headset and follows voice instructions together with the mentor's transported hand gestures. This could become an alternative tool for ultrasound training.
Fig. 6. A novel Augmented Reality telemedicine platform involving real-time remote pointing and gesture capture 
With AR, pharmaceutical companies can improve patient education by visualising complex products: patients can see a 3D version of a drug working in front of their eyes instead of just reading long descriptions on the bottle. Lab analysts can monitor their experiments with AR equipment. In factories, workers can start working without hands-on training, as the device tells them what to do and how to do it. AR tools can display the process of taking a medicine, along with its benefits, in a visual way. For instance, Pfizer uses AR powered by Blippar: consumers can learn how ThermaCare works and which product options are best suited to relieve their pain and stiffness.
One challenge the healthcare sector faces is that there is often a lag between a promising, innovative technology leaving its development phase and its widespread adoption. This delay is due to the cost of purchasing new technology, the time it takes to raise awareness and the need to integrate new systems, from installation to staff training. Tools such as Proximie can help vendors reach potential customers all over the world at any time and offer in-depth demonstrations, rather than waiting for those all-too-rare opportunities to demonstrate new products face-to-face.
HoloAnatomy is a HoloLens app released by Case Western Reserve University and the Cleveland Clinic in partnership with Microsoft.
Microsoft HoloLens is the first AR headset to create holograms of digital content within a real-world environment. Medical students can bring up a realistic image of a heart or brain in a classroom and rotate, enlarge, take apart and examine it in depth before their eyes. In addition, surgeons can use the holograms while preparing for complex operations.
AccuVein has come out with the world's first handheld, non-invasive vein visualization device, which projects a map of the veins onto the surface of the patient's skin.
Orca Health is a mobile software company that uses mobile applications and integrated tools to educate patients on making better decisions about their health. The Orca Health app features a ground-breaking augmented reality experience with fully interactive 3D anatomical models that can be manipulated with the touch of a finger. Using this, a user can view 3D anatomical models of ten areas of the human body, including the spine, shoulder, hip, knee, hand, foot, heart, ENT, eye and mouth.
EchoPixel develops diagnostic, surgical planning and image-guided treatment applications. Its True 3D AR system provides clinicians with a holographic experience to visualize and interact with patient-specific anatomy.
Brain Power integrates pioneering neuroscience with the latest in wearable technology, in particular Google Glass. It builds software that converts wearables into neuro-assistive devices for the educational challenges of autism; the aim is to teach life skills to children and adults on the autism spectrum. It has developed software, Empowered Brain, to help children with language, social skills and positive behaviour.
Fig. 11. Three children on the autism spectrum using the evidence-based “Empowered Brain” augmented-reality system to address the learning challenges of autism 
The Augmedix platform is powered by a combination of exclusive natural-language-processing technology and medical documentation expert teams. It provides clinicians with hardware, smartphones or Google Glass to securely stream the clinic visit to its cloud-based platform. The tech-enabled specialists remotely operate the platform and use exclusive automation modules to generate accurate, comprehensive and timely medical documentation. The service works with over 25 specialties and supports most EHRs. Augmedix is used by independent providers and more than a dozen major health systems, including Dignity Health, Sutter Health, The US Oncology Network, TriHealth and others.
Atheer has developed the Augmented interactive Reality (AiR) platform.
The name Aira is derived from Artificial Intelligence (AI) and Remote Assistance.
Some recent patent applications presented below could be an indicator of things to come in the future.
US10326975B2 from OnPoint Medical, Inc. deals with a real-time surgery navigation method and apparatus for image-guided surgery, particularly CT-guided, MR-guided, fluoroscopy-based or surface-based image-guided surgery, wherein images of a portion of a patient are taken in the preoperative or intraoperative setting and used during surgery for guidance. A stereoscopic video display is used by the surgeon to see the graphical representation of the preoperative or intraoperative images blended with the video images in a stereoscopic manner through a see-through head mounted display.
WO2018US56422A from INTUITIVE SURGICAL INC. deals with a method comprising display of a surgical environment image that includes a virtual control element for controlling a component of a surgical system. The method also includes displaying an image of a body part of a user for interacting with the virtual control element, and receiving a user input from a gesture-based input device while the body part interacts with the virtual control element.
EP3574361A1 from NOVARTIS AG deals with a system of augmented reality device coupled to an imaging system of an ophthalmic microscope. The augmented reality device includes a lens to project a digital image, a gaze control to detect the focus of an eye of an operator, and a dimming system communicatively coupled to the gaze control and the outer surface. It also deals with a method of performing ophthalmic surgery using the augmented reality device and a non-transitory computer readable medium that is able to perform augmented reality functions.
WO2019171216A4 from STATE OF ISRAEL MINISTRY OF DEFENSE RAFAEL ARMAMENT DEVELOPMENT deals with an augmented reality device and method for assisting with walking or movement disorders. A wearable device used by the user has a superimposer for overlaying augmented information on the user's real-world field of view. The wearable device provides the user with a downward-directed field of view aimed at the foot or footwear, and the superimposed information is designed to appear within that downward-directed field of view, close to the foot or footwear.
Founded in 2016, Proximie is a secure surgical collaboration platform that enables surgeons to share knowledge before, during and after surgery. It enables users to connect and collaborate remotely.
It is designed for use in low-resource settings: it works over low-speed internet and can be used with almost any device with a camera. Coupled with an excellent library of recorded specialist cases and training materials, it is an affordable, long-term and sustainable solution. It was funded by Cedar Mundi Ventures and is working with the Global Smile Foundation to help provide quicker access to cleft surgery in Ecuador [23, 33].
Founded in 2015, BioFlight offers a wide range of medical virtual reality and augmented reality services, including VR/AR doctor training and 360° enhanced videos to train physicians and surgeons on new products and procedures within their field. It has also developed an AR/VR medical training module to help students and healthcare providers refine their learning and increase knowledge retention.
It has raised a total of $250K in funding. It has created a training program for Children's Hospital Los Angeles for teaching proper procedures and maintaining readiness for low-frequency, high-risk situations. In partnership with Duke University, it set up an Immersive Training Lab to improve curriculum effectiveness and retention. It has also partnered with Oculus, HTC Vive, HP, Nvidia and many others [34, 35].
Founded in 2015, Medical Realities is a medical training organization that builds an impressive repository of surgical resources using Google Glass. Users can experience the surgery through the eyes of an experienced consulting surgeon, seeing what they see and picking up on valuable techniques .
AIMS, an Italian academy specialising in physician training, has been using the Medical Realities Platform for three years to enhance its training of surgical procedures using the Medical Realities 360 Player, with over 400 students per year using the platform.
The Royal Brompton & Harefield NHS Foundation Trust uses the Medical Realities platform in a diverse set of ways within a hospital environment, to improve safety, enhance skills and improve patient wellbeing. It provides a low-cost and scalable way to reach employees and patients.
Roche has been using this platform, as the immersive technology is a great way to explain new or complex ideas. With the Medical Realities platform, they communicate with internal stakeholders, provide workplace education, or help their representatives explain a new or existing product to health care providers .
Founded in 2009, Help Lightning enables surgeons to remotely assist and guide colleagues in real time.
Help Lightning exclusively provides Remote Expertise through the use of Merged Reality. It uses Merged Reality to blend two real-time video streams, that of a remote expert and an onsite technician, into a collaborative environment. Its technology allows the expert to virtually reach out and touch what their service technician or customer is working on. It runs on existing mobile devices (iOS, Android) or a web-browser enabling experts to provide remote assistance as though they're working side-by-side. They can telestrate, freeze images, use hand gestures, and even add real objects into the Merged Reality environment .
Founded in 2017, SentiAR adds a new dimension to clinical visualization.
SentiAR's technology converts CT, MRI and real-time mapping/catheter-location outputs into a real-time hologram in the clinician's field of view. The electrophysiologist can expand, measure or enter the chambers of the "floating" cardiac model for a significantly faster procedure with higher accuracy. Visualization can be shared among multiple clinicians wearing multiple headsets.
Founded in 2013, Meta View makes an AR headset of the same name, as well as the software that runs on it, offering users a complete augmented reality (AR) experience and serving a variety of purposes, including assisting surgeons with patient information and data in the healthcare space. It offers wearable devices that provide augmented and virtual reality experiences, and other gadgets based on machine learning and artificial intelligence.
Meta View purchased the assets of the original Meta, which were sold when the latter's creditor unexpectedly foreclosed on its loan. The new company is led by former Qualcomm executive Jay Wright.
ImmersiveTouch provides augmented and virtual reality training and surgical simulation in healthcare. Founded in 2009, it offers the AR platform ImmersiveTouch3, which is built into durable hardware and comes with an overhead monitor that lets others watch in real time as the user goes through a simulation. It uses exclusive ultra-realistic haptic feedback to simulate the tactile sense of operating on human anatomy.
ImmersiveTouch has partnered with numerous medical institutions to conduct scientific studies on the effectiveness of its surgical simulations. The studies have consistently shown that practice on its technology links to improvements in actual surgical performance in dimensions such as surgical accuracy, time efficiency, technical finesse, procedural knowledge, etc.
Pravin K. Patel, M.D., Chief of Pediatric Plastic and Craniofacial Surgery at the Craniofacial Center at the University of Illinois Health & Medical Science Center and Chicago Shriners Hospitals for Children, who has conducted clinical studies with ImmersiveTouch, said that even though they could see the 3D anatomy, they could never interact with it: the patient's 3D anatomy was trapped in the 2D surface of the display screen. ImmersiveView™ allowed them to cross that glass barrier to the other side, a different universe where they could interact with the anatomy. For the first time, it gave the surgeon the ability to touch and have depth perception.
Touch Surgery is an interactive surgical simulator platform.
Touch Surgery industry clients include Ethicon, Stryker, Lipogems, OrthoSensor, Zimmer Biomet, Vericel, and many others .
Stephen Beckmann, Sr. Program Manager, Medical Education, Stryker, said that working with Touch Surgery's talented team of animators and software engineers, they were able to create something unique both in the industry and for their respective companies. The end result was a unique learning tool deployed to support training curricula for surgeons, Stryker technical support staff and the sales force.
Though current AR devices display information with increasing accuracy and lower latency, several difficulties remain. Currently, images are reconstructed and prepared in advance using complex algorithms and powerful computers. As technology advances, real-time acquisition of high-resolution medical scans and 3D reconstruction may become possible; such reconstructions could significantly improve overall accuracy. Although AR can speed up a surgical procedure, the need to prepare the whole system and perform the required registrations and measurements adds to the time needed to complete the surgery. The introduction of fully automatic systems might eliminate this problem and reduce the total procedure time.
Though the time required to complete a procedure can be reduced using any form of AR, certain limitations remain. Inattentional blindness is a concern that needs to be addressed when using 3D overlays. The amount of information presented through AR during surgery is increasing and therefore distracting, so only useful data should be displayed, or a method provided to switch between different sets of information as required.
Adequate image contrast is necessary when projecting directly onto organs. There are persistent difficulties in creating correct 3D and depth perception. The latency of the whole system is also an issue, because excessive latency may lower precision and reduce the surgeon's comfort.
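The precision cost of latency can be estimated with a back-of-envelope model: during the system's motion-to-photon delay, the anatomy moves under the overlay by roughly the angular velocity times the latency. A sketch with purely illustrative numbers:

```python
def overlay_error(angular_velocity_dps, latency_s, pixels_per_degree):
    """Worst-case misregistration of an AR overlay during head or
    camera motion: the scene shifts while the rendered image lags
    behind. Returns (error in degrees, error in display pixels)."""
    error_deg = angular_velocity_dps * latency_s
    return error_deg, error_deg * pixels_per_degree

# Illustrative: 50 deg/s head motion, 40 ms total latency, on a
# display resolving 17.5 px/deg -> the overlay trails the anatomy
# by about 2 degrees, i.e. tens of pixels.
deg, px = overlay_error(50.0, 0.040, 17.5)
```

The model is deliberately crude (it ignores prediction and reprojection tricks that headsets use to mask latency), but it shows why even tens of milliseconds matter when an overlay must stay locked to a moving organ.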
Current HMDs usually weigh several hundred grams and produce plenty of heat, causing discomfort during long-term wear. The ergonomics need to improve to allow continuous usage over long periods. AR projections in HMDs are known to produce simulator sickness, which presents as nausea, headache and vertigo, or vomiting in the worst case. The exact causes of simulator sickness are unknown; an inconsistency between visual, proprioceptive and vestibular inputs is the probable cause.
Compared to the first generation of HMDs, significant changes have been observed in recent years, particularly in the quality of visual images. Early-generation HMDs provided a resolution comparable to that of a standard TV, i.e. 800×600 pixels per eye, with an average field of view of 30 degrees. New-generation HMDs can display full high-definition resolution or more, with a field of view of at least 110 degrees. This allows the creation of experiences that appear more immersive and realistic, and increases the level of detail and precision users may need for professional applications such as medicine.
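A useful figure of merit here is angular resolution, i.e. pixels per degree of field of view; resolution and field of view trade off, so both must grow for per-degree detail to improve. A quick sketch using the horizontal figures above (averaging over the field of view is a simplification):

```python
def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Average angular resolution across the horizontal field of view."""
    return horizontal_pixels / horizontal_fov_deg

early = pixels_per_degree(800, 30)      # early HMD: ~26.7 px/deg
modern = pixels_per_degree(1920, 110)   # full-HD HMD: ~17.5 px/deg
```

A wider field of view spreads even a full-HD panel more thinly per degree, which is why headset makers push resolution and field of view together rather than either alone.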
The overlaid AR images can be either monoscopic or stereoscopic; stereoscopic images have the advantage that they can be positioned in 3D space in front of the viewer. Google Glass and Epson SmartGlasses are prototypes of this technology, constructed from lightweight glass and small enough for users to carry in a pocket.
Medical professionals may be interested in using Google Glass in surgical settings because the hands-free device has the potential to streamline workflow and maintain sterile conditions in the operating room. Google Glass has been used in clinical simulation-based training, for capturing video during care delivery and ultrasound-guided venous access, and as a tool for paediatric surgery. Applying smart glasses in disaster scenarios could improve the quality of triage through telemedicine and AR features [44, 45, 46].
Current computers follow Moore's law and have sufficient power to perform graphical computation of high-quality images in real time. Underlying machine-learning algorithms are now supported by dedicated chips and purpose-built AR components.
Data transmission has reached a level where huge amounts of video data can be transferred with almost no latency over dedicated cable connections or even wirelessly. Microsoft HoloLens is powered by high-performance batteries and presently allows untethered operation for approximately three hours. Additionally, Microsoft delivers a software development kit (SDK) giving access to lower-level algorithms that perform elementary AR functions such as collision detection or environment recognition.
AR is expected to serve as an advanced human-computer interface, working in symbiosis with healthcare professionals and allowing them to achieve even better results. Merging technical expertise with local needs and health contexts can lead to new types of interventions and opportunities. Understanding the relationship between AR and health, and their subsequent acceptance, may play a role in the willingness of people within organizations to adopt them, particularly when it comes to wearing head-mounted displays. Future uses of AR may include electronic medical records (EMR) being automatically displayed on a device during a doctor's examination or consultation with a patient, again highlighting the immediacy that AR can bring to medical practice.