Do-It-Yourself Augmented Reality Heads-Up Display (DIY AR-HUD): A Technical Note
================================================================================

* Jang W. Yoon
* Michael Spadola
* Rachel Blue
* Anissa Saylany
* Nikhil Sharma
* Hasan S. Ahmad
* Vivek Buch
* Karthik Madhavan
* H. Isaac Chen
* Michael P. Steinmetz
* William C. Welch
* Neil R. Malhotra

## ABSTRACT

**Background:** We present a "Do-It-Yourself" method to build an affordable augmented reality heads-up display system (AR-HUD) capable of displaying intraoperative images. All components are commercially available products, which surgeons may use in their own practice for educational and research purposes.

**Methods:** Moverio BT-35E smart glasses were connected to operating room imaging modalities (ie, fluoroscopy and 3D navigation platforms) via a high-definition multimedia interface (HDMI) converter, allowing for continuous high-definition video transmission. The addition of an HDMI transmitter-receiver makes the AR-HUD system wireless.

**Results:** We used our AR-HUD system in 3 patients undergoing instrumented spinal fusion. The AR-HUD projected fluoroscopy images onto the surgical field, eliminating shifts of surgeon focus and interruptions to the procedure, with only a 40- to 100-ms transmission delay, which was not clinically impactful.

**Conclusions:** An affordable AR-HUD capable of displaying real-time information in the surgeon's view can be easily designed, built, and tested in surgical practice. As wearable heads-up display technology continues to evolve rapidly, the individual components presented here may be substituted to improve functionality and usability. Surgeons are in a unique position to conduct clinical testing in the operating room environment to optimize augmented reality systems for surgical use.

**Keywords:** spine, minimally invasive surgery, MIS, nerve surgery, orthopedic, 3D imaging, augmented reality, image-guided surgery, intraoperative imaging, navigation, heads-up display

## INTRODUCTION

Innovations in intraoperative imaging, from radiography to fluoroscopy to functional magnetic resonance imaging and tractography, have continually led to advances in diagnosis and surgical treatment options. However, each technological advancement has also met some inevitable resistance because of the modifications it requires in a surgeon's routine. For example, while the use of image guidance for spinal instrumentation has grown steadily among spinal surgeons since its inception in the 1990s, widespread adoption has been hampered by the surgical workflow alterations this technology requires, such as the prolonged steps of instrument registration and verification that can significantly increase operating room time.1–4 As the technology improved and nursing staff were able to preregister and verify the equipment before the start of a case, image guidance for procedures such as screw placement became more commonplace.5 Thus, new imaging technologies must not only improve upon the status quo but also minimize workflow disruptions and reduce barriers to adoption before they can achieve their true potential.

One drawback of intraoperative imaging that has yet to be successfully addressed is alternating attention: the back-and-forth shift of a surgeon's focus between the surgical field and the image monitor.
This phenomenon, which occurs with both navigated and fluoroscopic techniques, stems from intraoperative images being displayed on a separate screen, requiring the surgeon to take their attention away from the patient every time they refer to the image guidance. Alternating attention creates unnecessary distractions and movements, which can lead to surgeon fatigue and an increased risk of intraoperative complications.

In an attempt to solve these issues, the surgical community has shifted its attention to augmented reality (AR) with a heads-up display (HUD), which can project images directly onto the surgical field.6–10 Some platforms, such as XVision (Augmedics, Yokne'am Illit, Israel) and OpenSight (Novarad, American Fork, UT), also offer the capability of overlaying holograms onto the surgical field. The XVision system relies on a custom head-mounted display that allows a surgeon to visualize 3D navigation information through holograms projected onto the surgical field, while OpenSight uses the Microsoft HoloLens (Microsoft, Redmond, WA) as its heads-up display. XVision has obtained Food and Drug Administration (FDA) 510(k) clearance for intraoperative use, while OpenSight is approved only for preoperative surgical planning. While these products attempt to bring AR into the operating room, they, like all new surgical technology, face barriers to adoption: technical difficulties with image rendering and registration, limited battery life, and bulky headsets that impede surgical ergonomics and induce fatigue. Above all else, these systems are expensive; devices can cost up to US $250 000, with significant disposable costs that can total several thousand dollars per case. As a result, AR adoption by the surgical community has been slow, despite the advances that AR promises to bring.

In this manuscript, we present a "Do-It-Yourself" (DIY) method to build a customizable and affordable augmented reality heads-up display system (AR-HUD) that is capable of displaying 3D navigation or fluoroscopy images in the surgeon's field of view in test environments, such that an evolution of practice might be accelerated. All components described are commercially available "off-the-shelf" products that cost less than US $1000 in total. The lightweight design is comparable to the weight and feel of typical surgical loupes, allowing surgeons a seamless transition between loupes and the AR-HUD. The system is compatible with nearly all existing hospital equipment and can be easily implemented into the surgical workflow to demonstrate how such a technology can augment current image guidance platforms without driving up costs. Importantly, the intent of this technology development was not to obtain FDA approval or create a commercialized product. Rather, by introducing a low-cost, lightweight, and easily replicable AR-HUD system, we hope to lower the barrier of entry to AR technology for surgeons who are willing to explore it but may not be in a position, or have the desire, to make a major capital investment at their institution. While not as advanced as the AR devices mentioned previously, our system can enable surgeons to experience the benefits of reduced alternating attention while using intraoperative imaging, without the high costs and workflow disruptions that accompany current surgical AR systems.
Importantly, this DIY device also enables spine surgeons to become the key conceptualists and decision makers in the development and implementation of AR technology in the operating room. After testing and experiencing the advantages of AR, surgeons can advocate for the prioritization of the specific AR features and workflow changes that are most beneficial from a surgical perspective. These suggestions can then inform further institutional investment in more powerful, FDA-approved AR devices and the application of these technologies in surgery, research, and resident education.

## MATERIALS AND METHODS

We present a method to build an AR-HUD system capable of screen mirroring any imaging modality of the surgeon's choice. Three main components are needed to build this system (Figure 1):

1. HUD goggles (Moverio BT-35E, Epson Inc, Suwa, Japan);
2. a video graphics array (VGA) or digital visual interface (DVI) to high-definition multimedia interface (HDMI) converter; and
3. an HDMI cable or an optional HDMI wireless transmitter-receiver.

The Moverio BT-35E HUD offers a light (119 g) frame and is capable of displaying binocular images in the surgeon's visual field. The Moverio BT-35E has an HDMI input, which can receive video converted from any common video output format (ie, VGA or DVI) to HDMI. The converter is needed to make older video output formats, such as VGA or DVI, HDMI compatible so that they can connect to the HUD. After the HDMI cable is connected to the Moverio BT-35E, continuous video-streaming software is triggered, and the video size and resolution are formatted automatically to start screen mirroring on the HUD (Figure 2). Images are shown on the binocular display through a 1.09-cm-wide panel with 921 600 pixels (1280 × 720, each with red, green, and blue subpixels) and a 30-Hz refresh rate. This allows surgeons to see both the guidance images and the surgical anatomy simultaneously. The battery pack for the AR-HUD is small enough to be placed in the surgeon's pocket without interference.

**[Figure 1](https://www.ijssurgery.com/content/15/4/826/F1).** The components necessary to create the augmented reality heads-up display (AR-HUD) system: the connection from the Ziehm C-arm through a digital visual interface (DVI) to high-definition multimedia interface (HDMI) converter to the Moverio BT-35E glasses via an HDMI cable, with the battery pack that fits inside the user's pocket during the procedure.

**[Figure 2](https://www.ijssurgery.com/content/15/4/826/F2).** View of the image displayed within the augmented reality heads-up display (AR-HUD) system as seen from the operator's perspective (the black line is not on the actual display; it is used only to cover the patient's identity).

Optionally, an HDMI wireless transmitter-receiver can be used to transfer guidance images wirelessly to the AR-HUD. The HDMI transmitter sends the HDMI signal over a microwave frequency to a receiver connected to the AR-HUD, which decodes it back into HDMI video. This detethers the AR-HUD from the imaging modality so that the surgeon's mobility around the operating room is not limited by the HDMI cable connected to the imaging station. This optional transmitter-receiver can be added to streamline operating room workflow, but it is not an absolute necessity to build or run the system.
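To make the display and latency figures above concrete, the following minimal sketch (in Python, purely illustrative and not part of the original setup) verifies the panel's pixel count and converts the 40- to 100-ms transmission delay reported in the Results into display frames at the 30-Hz refresh rate:

```python
# Minimal sketch, assuming only the figures quoted in this note:
# a 1280 x 720 panel, a 30-Hz refresh rate, and a measured end-to-end
# transmission delay of 40-100 ms. Everything here is illustrative.

WIDTH, HEIGHT = 1280, 720      # Moverio BT-35E panel resolution
REFRESH_HZ = 30                # panel refresh rate
DELAY_RANGE_MS = (40, 100)     # reported transmission delay

# Pixel count: 1280 x 720 = 921,600, matching the quoted spec
# (each pixel carries red, green, and blue subpixels).
assert WIDTH * HEIGHT == 921_600

# Express the delay in display frames: one frame lasts 1000/30 = 33.3 ms,
# so the reported delay corresponds to roughly 1.2-3.0 frames of lag.
frame_time_ms = 1000 / REFRESH_HZ
for delay_ms in DELAY_RANGE_MS:
    print(f"{delay_ms} ms delay = {delay_ms / frame_time_ms:.1f} frames behind the source")
```

At 30 Hz, even the worst-case 100-ms delay amounts to about 3 frames of lag, consistent with our observation that the delay was not clinically impactful.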
After institutional review board approval, we used this customized AR-HUD system to display intraoperative fluoroscopic images and assess its potential value during spine instrumentation. Consent was obtained from all participating patients during preoperative clinical visits. A total of 3 patients underwent AR-HUD-assisted spine instrumentation. The Ziehm C-arm (Ziehm Imaging GmbH, Nuremberg, Germany) was used in all cases and was connected to the Moverio BT-35E. In all 3 cases, a direct connection between the AR-HUD and the Ziehm C-arm was made via HDMI cable.

## RESULTS

Three patients underwent AR-HUD-assisted spine instrumentation: a one-level cervical arthroplasty, a complex revision combined anterior and posterior thoracolumbar fusion with lateral lumbar interbody fusion, and a revision extension of fusion to the pelvis with posterior lumbar interbody fusion. Total operative time was 1158 minutes, with an average operative time of 386 minutes and an average estimated blood loss of 670 mL. Surgical time was 181 minutes for case 1, 421 minutes for case 2, and 556 minutes for case 3.

The first patient was a 35-year-old male who presented with medically intractable right C7 radiculopathy and early signs of myelopathy and was found to have C6-C7 cervical stenosis. The patient underwent C6-C7 cervical arthroplasty with the AR-HUD successfully displaying intraoperative fluoroscopic images. Lateral fluoroscopy was used to place two Caspar pins parallel to the endplate into the C6 and C7 vertebral bodies (Figure 3). Fluoroscopic images were immediately displayed via the AR-HUD directly in the surgeon's field of view, eliminating pauses and head turning during pin placement. The surgeon was also able to mallet a trial spacer and artificial disc into the disc space without turning away from the surgical field and losing sight of the instrumentation.

**[Figure 3](https://www.ijssurgery.com/content/15/4/826/F3).** Case 1. C6-C7 cervical arthroplasty was successfully performed using an augmented reality heads-up display (AR-HUD) system under fluoroscopic guidance (panel A) for a C6-C7 disc herniation (panel B). Placement of the Caspar pins (panel C) parallel to the endplates is important to allow symmetric distraction during cervical arthroplasty. The AR-HUD allows for continuous visualization of x-ray images while malleting the Caspar pins into the vertebral bodies. In addition, during placement of trials and the artificial disc into the disc space with the thecal sac exposed, the AR-HUD allows the surgeon to view the lateral x-ray images without taking his eyes off his hands. Postoperative imaging shows successful placement of the arthroplasty (panel D).

The second patient was a 73-year-old female with a prior L3-L5 fusion complicated by adjacent segment disease, with 21° of levoscoliosis and an apex at L2-L3. She underwent L1-L3 direct lateral lumbar interbody fusion and an extension of her prior posterior fusion to T10 (Figure 4). All fluoroscopic imaging for pedicle screw placement was immediately projected onto the AR-HUD during the case.
**[Figure 4](https://www.ijssurgery.com/content/15/4/826/F4).** Case 2. L1-L3 direct lateral interbody fusion and extension of fusion from T10 to L5 using an augmented reality heads-up display (AR-HUD) system (panel A). Preoperative imaging shows hardware from a previous surgery (panel B). During pedicle screw placement with fluoroscopic guidance, lateral x-ray images are displayed within the surgeon's field of view; therefore, the surgeon can easily adjust or continue the trajectory of the pedicle probe (or a screw) into the pedicle (panel C). Note that in panel A, the attending surgeon is placing a pedicle screw with his back turned to the monitor: he is able to see the lateral x-ray images while looking straight at his hands, without turning his body toward the monitor. This continuous overlay of x-ray images into the surgeon's field of view obviates the need to divert his or her attention. Panel D shows postoperative imaging with successful extension of the prior fusion.

The third patient was a 71-year-old male with a prior T10-L4 fusion and a right L4-L5 pars fracture requiring decompression and extension of fusion to the pelvis with L5-S1 posterior lumbar interbody fusion (Figure 5). The AR-HUD was again used for all critical portions of the case requiring image guidance. Starting points for the pedicle screws and iliac bolts were initially created using an anatomic landmark (ie, the point where the midpoint of the transverse process, the pars interarticularis, and the mamillary process converge), and the gearshift was advanced. Fluoroscopic images were then taken and immediately transferred to the AR-HUD, allowing instant confirmation of the gearshift trajectory. The trajectory was easily adjusted using the AR-HUD without the surgeon shifting attention away from the surgical field. The AR-HUD also allowed for confirmation of iliac bolt placement through seamless transitions between multiple x-ray views (ie, pelvic inlet and teardrop views), which obviated the need for alternating attention. Placement of the posterior lumbar interbody cages through Kambin's triangle was also completed with the AR-HUD, eliminating shifts in focus away from the instrumentation and decreasing the potential for inadvertent nerve root or thecal sac injury (see the Supplemental Video available online).

**[Figure 5](https://www.ijssurgery.com/content/15/4/826/F5).** Case 3. Extension of a prior T10-L4 fusion (panel B) to the pelvis with L5-S1 posterior lumbar interbody fusion using an augmented reality heads-up display (AR-HUD) system (panel A). Teaching a resident to place a pedicle screw requires careful supervision from an experienced spine surgeon. In panel A, the attending surgeon is standing on the right and the resident on the left; the attending surgeon is able to view the x-ray images on the AR-HUD while the resident places an S1 pedicle screw under lateral x-ray guidance (panel C). In addition, the AR-HUD allows for continuous visualization of x-ray images during discectomy, endplate preparation, and placement of an interbody cage through Kambin's triangle. Panel D shows postoperative imaging of the successful extension of fusion.

Overall, the device itself is comparable to regular loupes in size, weight, and comfort.
It did not obstruct the surgeon's view and could be worn comfortably throughout the duration of spine instrumentation placement.

While not intended for FDA approval and subsequent commercialization, this device would most likely be categorized as a class I low-risk medical device, considering that it does not come in direct contact with the patient. Because the DIY AR-HUD does not generate additional diagnostic information but rather allows for easier visualization of available imaging data, 510(k) premarket notification would likely not be required, similar to the exemption rules for surgical loupes. However, these exact classifications and exemptions may be clarified by the FDA if or when similar AR-HUD technology becomes commercialized.

## DISCUSSION

AR technology is a potentially useful adjunct to surgical interventions that rely on imaging. Herein, we describe a user-friendly DIY methodology for practicing surgeons to incorporate and develop this technology in surgical practice before making a major capital investment.

As AR technology continues to develop, its indications and applications will likely expand. However, for these systems to cross over successfully, surgeons will need time to adopt the technology into their workflow. Nowhere is this more evident than with navigation guidance in spine surgery: although the technology has existed for over 3 decades, its adoption has not been universal, despite documented advantages for patients and surgeons. Modifying a surgeon's workflow and routine is a challenging task that is often met with skepticism, and AR adoption will face similar challenges. To prove the value of AR, proponents will have to show that the improved ergonomics experienced by its users (ie, surgeons) are worth the cost and workflow adjustments in the operating room. However, in an economically challenging environment with limited resources, adding costly technology may not be feasible, even if it improves surgical ergonomics. Furthermore, surgeons may be unwilling to integrate new and unfamiliar technology into their operations without fully understanding and experiencing its impacts and benefits. Thus, to create a paradigm shift in surgery with AR technology, affordability and ease of implementation are critical.

Here, we described how to create an affordable, efficient, and comfortable AR-HUD system that is universally compatible with current navigation and fluoroscopic guidance technology, including systems that use older video formats (ie, VGA or DVI). This device improves surgical ergonomics and safety and offers surgeons a way to experience the intraoperative benefits of AR without a large monetary expenditure or drastic changes to the current surgical workflow. By expanding the number of surgeons who can experience and use initial AR technology, we hope to place surgeons in the "driver's seat" to guide further development of intraoperative AR systems.

This manuscript is also intended to share our initial subjective experience in 3 spine instrumentation cases. We did not collect objective data, such as screw placement time, screw accuracy, or the number of times the surgeon consulted the AR-HUD for x-ray images during surgery. However, the current work serves as a proof of concept for this technology, demonstrating cost and implementation feasibility, as well as the capacity to improve upon existing intraoperative imaging technology, across 3 surgical cases.
This simple and cost-effective system allows surgeons to easily experience the ergonomic and surgical benefits of AR and can enable surgeons to spearhead subsequent investment in and adoption of more powerful AR systems. This easily replicable, nonproprietary system can also be used by investigators conducting AR-HUD research who would benefit from an affordable alternative to current models, as well as by residency programs that could begin implementing AR for training purposes. In subsequent work, we intend to quantify the benefit of the AR-HUD by collecting objective measures that demonstrate its value beyond subjective opinion. Future directions for this system include several improvements upon the current technology, such as image magnification and fine-tuned interactive holograms with improved accuracy. Continued studies using this AR-HUD with navigation and microscopic images for both spinal and cranial surgeries would provide further guidance for implementation of the technology by potential users.

## CONCLUSIONS

In this manuscript, we describe a customizable and affordable DIY AR-HUD system that can be built with relative ease. By sharing our know-how on building an AR system, we hope that more surgeons will be able to experience and develop AR in surgical practice through a comprehensible, economical alternative. By simply trialing this system, surgeons will come to understand the technology better, which can spark further adoption of AR technology and expand its indications from the operating room to research and resident training. This is the requisite process through which AR technologies can become ubiquitous and user friendly. Such a transition may occur regardless of surgeons' input into the evolution of AR, but our hope is to inspire surgeons to become the decision makers and key leaders in that evolution.

## Footnotes

* **Disclosures and COI:** Dr Yoon is the founder and CEO of MedCyclops. This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
* This manuscript is generously published free of charge by ISASS, the International Society for the Advancement of Spine Surgery.

Copyright © 2021 ISASS

## REFERENCES

1. Wagner A, Ploder O, Enislidis G, et al. Image-guided surgery. *Int J Oral Maxillofac Surg*. 1996;25(2):147–151.
2. Iseki H, Masutani Y, Iwahara M. Volumegraph (overlaid three-dimensional image-guided navigation): clinical application of augmented reality in neurosurgery. *Stereotact Funct Neurosurg*. 1997;68(1-4):18–24.
3. Barnett GH, Steiner CP, Weisenberger J. Adaptation of personal projection television to a head-mounted display for intra-operative viewing of neuroimaging. *J Image Guided Surg*. 1995;1(2):109–112.
4. Rahmathulla G, Nottmeier EW, Pirris SM, et al. Intraoperative image-guided spinal navigation: technical pitfalls and their avoidance. *Neurosurg Focus*. 2014;36(3):E3. doi:10.3171/2014.1.FOCUS13516
5. Tjardes T, Shafizadeh S, Rixen D, et al. Image-guided spine surgery: state of the art and future directions. *Eur Spine J*. 2010;19(1):25–45. doi:10.1007/s00586-009-1091-9
6. Yoon JW, Chen RE, Han PK, et al. Technical feasibility and safety of an intraoperative head-up display device during spine instrumentation. *Int J Med Robot*. 2017;13(3). doi:10.1002/rcs.1770
7. Yoon JW, Chen RE, Kim EJ, et al. Augmented reality for the surgeon: systematic review. *Int J Med Robot*. 2018;14(4):e1914. doi:10.1002/rcs.1914
8. Mascitelli JR, Schlachter L, Chartrain AG, et al. Navigation-linked heads-up display in intracranial surgery: early experience. *Oper Neurosurg (Hagerstown)*. 2018;15(2):184–193. doi:10.1093/ons/opx205
9. Diaz RJ, Yoon JW, Chen RE, et al. Real-time video-streaming to surgical loupe mounted head-up display for navigated meningioma [published online ahead of print April 30, 2017]. *Turk Neurosurg*. doi:10.5137/1019-5149.JTN.20388-17.1
10. Yoon JW, Chen RE, ReFaey K, et al. Technical feasibility and safety of image-guided parieto-occipital ventricular catheter placement with the assistance of a wearable head-up display. *Int J Med Robot*. 2017;13(4). doi:10.1002/rcs.1836