•Complete workflow for use of mixed reality in surgery planning.
•Conducted qualitative user evaluation with a wide demographic of participants.
•Conducted use-cases in planning for liver resection and congenital heart surgery.
•Surgeons perceived the HoloLens to be useful and recommendable to others.
Meticulous preoperative planning is an important part of any surgery to achieve high levels of precision and avoid complications. Conventional 2D medical images and their corresponding three-dimensional (3D) reconstructions are the main components of an efficient planning system. However, these systems still use flat screens for visualisation of 3D information, thus losing depth information, which is crucial for 3D spatial understanding. Cutting-edge mixed reality systems have been shown to be a worthy alternative for providing 3D information to clinicians. In this work, we describe the development of the different steps in the workflow for the clinical use of mixed reality, including results from a qualitative user evaluation and clinical use-cases in laparoscopic liver surgery and heart surgery. Our findings indicate a very high general acceptance of mixed reality devices with our applications, which were consistently rated highly in the device, visualisation and interaction areas of our questionnaire. Furthermore, our clinical use-cases demonstrate that the surgeons perceived the HoloLens to be useful and recommendable to other surgeons, and that it provided a definitive answer at a multi-disciplinary team meeting.
•Collaborative MR technology is now mature enough to focus squarely on human needs.
•We distil representative literature focusing on collaboration in MR since 1990.
•We identify limitations of existing frameworks when applied to collaborative MR.
•We identify emerging trends and future directions to support collaboration in MR.
Collaborative Mixed Reality (MR) systems are at a critical point in time as they are soon to become more commonplace. However, MR technology has only recently matured to the point where researchers can focus deeply on the nuances of supporting collaboration, rather than needing to focus on creating the enabling technology. In parallel, but largely independently, the field of Computer Supported Cooperative Work (CSCW) has focused on the fundamental concerns that underlie human communication and collaboration over the past 30-plus years. Since MR research is now on the brink of moving into the real world, we reflect on three decades of collaborative MR research and try to reconcile it with existing theory from CSCW, to help position MR researchers to pursue fruitful directions for their work. To do this, we review the history of collaborative MR systems, investigating how the common taxonomies and frameworks in CSCW and MR research can be applied to existing work on collaborative MR systems, exploring where they have fallen behind, and looking for new ways to describe current trends. Through identifying emergent trends, we suggest future directions for MR, and also find where CSCW researchers can explore new theory that more fully represents the future of working, playing and being with others.
Augmented Reality (AR), Virtual Reality (VR), Mixed Reality, and Extended Reality (often – misleadingly – abbreviated as XR) are commonly used terms to describe how technologies generate or modify reality. However, academics and professionals have been inconsistent in their use of these terms. This has led to conceptual confusion and unclear demarcations. Inspired by prior research and qualitative insights from XR professionals, we discuss the meaning and definitions of various terms and organize them in our proposed framework. As a result, we conclude that (1) XR should not be used to connote extended reality, but as a more open approach where the X implies the unknown variable: xReality; (2) AR and VR have fundamental differences and thus should be treated as different experiences; (3) AR experiences can be described on a continuum ranging from assisted reality to mixed reality (based on the level of local presence); and (4) VR experiences can be conceptualized on a telepresence continuum ranging from atomistic to holistic VR.
•XR should not be used as an abbreviation for “Extended Reality”; X should be the placeholder for “all” new reality formats.
•AR and VR have fundamental differences and thus should be treated as different experiences.
•AR experiences can be described on a local-presence continuum ranging from assisted reality to mixed reality.
•VR experiences can be conceptualized on a telepresence continuum ranging from atomistic to holistic VR.
Augmented Reality (AR) is a promising and growing field in marketing research and practice. Very little is known about whether, how, and why AR apps can impact consumers’ perception and evaluation of brands. The following research presents and empirically tests a framework that theorizes how consumers perceive and evaluate the benefits and augmentation quality of AR apps, and how this evaluation drives subsequent changes in brand attitude. The study reveals consumer inspiration as a mediating construct between the benefits consumers derive from AR apps and changes in brand attitude. Besides providing novel insights into AR marketing theory, the study also suggests that marketers should consider evaluating mobile AR apps based on their inspiration potential (and not simply on consumer attitudes, such as star ratings in app stores).
Wearable Mixed Reality (MR) technology gives people a new, enhanced experience that they have not encountered before. This study shows the process of designing new museum experiences while considering how this technology changes previous museum experiences, what those experiences are, and what people should feel through them. This process was conducted systematically according to the UX design process of analysis, synthesis, and evaluation. In the analysis step, six types of museum artifact viewing experiences were defined through research and user surveys related to the museum experience: knowing, restoring, exploring, expanded scale, encountering, and sharing. In addition, through analysis of research related to MR technology, presence, flow, and natural interaction were defined as three essential factors that users should feel in the MR experience. In the synthesis stage, optimized wearable MR experiences were designed and implemented by applying the necessary experience types and essential factors according to the characteristics of each artifact. In the evaluation stage, user experience evaluations were conducted on the developed results from various perspectives: user experience tests for the essential factors of the MR experience, User Experience Questionnaire (UEQ) tests for interaction products, and the Visual Aesthetics of Websites Inventory (VisAWI) test for visual experiences. In these evaluations, users gave positive scores to the design results based on the experience types and essential factors defined in this study. When applying new media technologies such as wearable MR, improved technical implementation is important, but an understanding of the applied field must first be obtained and thorough user analysis conducted. This study can serve as a guide to the systematic development process to follow when applying wearable MR technology to other fields.
Metaverses embedded in our lives create virtual experiences inside the physical world. Moving towards metaverses in aircraft maintenance, mixed reality (MR) creates enormous opportunities for interaction with virtual airplanes (digital twins) that deliver a near-real experience while keeping physical distance during pandemics. 3D twins of modern machines exported to MR can be easily manipulated, shared, and updated, which creates colossal benefits for aviation colleges that still rely on retired models for practice. We therefore propose mixed reality education and training for Boeing 737 aircraft maintenance in smart glasses, enhanced with a deep learning speech interaction module that lets trainee engineers control virtual assets and workflow using speech commands, enabling them to operate with both hands free. Using a convolutional neural network (CNN) architecture for audio feature learning, with classification parts for command and language identification, the speech module handles intermixed requests in English and Korean and gives corresponding feedback. Evaluation on test data showed high prediction accuracy, averaging F1-scores of 95.7% for command prediction and 99.6% for language prediction. The proposed speech interaction module further improved education and training in the aircraft maintenance metaverse, giving intuitive and efficient control over operations and enhancing interaction with virtual objects in mixed reality.
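The abstract above describes a CNN that maps audio features to two predictions at once: a spoken command and its language (English or Korean). The paper's actual architecture and parameters are not given here, so the following is only a minimal NumPy sketch of that dual-head idea; all layer sizes, the number of commands, and the weights are invented for illustration and the network is untrained.

```python
import numpy as np

def relu_conv1d(x, kernels, bias):
    """Valid 1-D convolution over a (channels, frames) feature map, then ReLU."""
    out_ch, in_ch, k = kernels.shape
    frames_out = x.shape[1] - k + 1
    out = np.zeros((out_ch, frames_out))
    for o in range(out_ch):
        for t in range(frames_out):
            out[o, t] = np.sum(kernels[o] * x[:, t:t + k]) + bias[o]
    return np.maximum(out, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(mfcc, params):
    """Shared CNN front-end, global average pooling, then two softmax heads:
    one for the spoken command, one for the language (English vs. Korean)."""
    h = relu_conv1d(mfcc, params["kernels"], params["bias"])
    pooled = h.mean(axis=1)                                # (out_ch,)
    cmd_probs = softmax(params["w_cmd"] @ pooled + params["b_cmd"])
    lang_probs = softmax(params["w_lang"] @ pooled + params["b_lang"])
    return cmd_probs, lang_probs

# Random untrained weights, purely to show the data flow.
rng = np.random.default_rng(0)
params = {
    "kernels": 0.1 * rng.standard_normal((8, 13, 5)),  # 8 filters over 13 MFCCs
    "bias": np.zeros(8),
    "w_cmd": 0.1 * rng.standard_normal((4, 8)),        # 4 hypothetical commands
    "b_cmd": np.zeros(4),
    "w_lang": 0.1 * rng.standard_normal((2, 8)),       # English vs. Korean
    "b_lang": np.zeros(2),
}
mfcc = rng.standard_normal((13, 100))                  # 13 coefficients x 100 frames
cmd_probs, lang_probs = predict(mfcc, params)
```

In a trained system the convolution filters and head weights would be learned from labelled recordings, and the argmax of each probability vector would select the command to execute and the language for feedback.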
Background
Due to recent lockdown conditions, which restricted opportunities for face‐to‐face contact and the ability to be physically in schools, the need for novel, safe ways to train pre‐service teachers emerged even more pressingly. Whilst virtual simulation has received some attention in pedagogy and its benefits have been demonstrated in many disciplines, there appears to be less synthesized evidence on the use of physical and/or mixed‐reality simulation utilized in teacher training.
Objectives
The goal of this systematic scoping review was to summarize and synthesize the literature on the use of physical and/or mixed‐reality simulation in pre‐service teacher training.
Methods
A systematic scoping literature review combined with a textual narrative synthesis was undertaken. Ten reference databases were searched in May 2020: Academic search premier, CINAHL, Education Research Complete, Humanities International Complete, Psychology and Behavioural Sciences Collection, PsycInfo, Teacher Reference Center, Science Direct, Web of Science and Scopus.
Results and Conclusions
Following inclusion/exclusion criteria assessment and screening, 13 articles were included for appraisal and synthesis. Seven papers examined physical simulations, while the remainder examined mixed‐reality simulations. The evidence from this review suggests that simulation, including physical and mixed‐reality types, could be used as a tool to increase confidence, self‐efficacy, classroom management skills and communication.
Implications
In comparison to other fields (e.g., nursing, medicine and aviation), simulation in education appears to be in its infancy; more large‐scale research is needed. At the same time, this review indicates that mixed‐reality simulation in particular has the potential to contribute to teacher education, because it enables learning in various contexts when compared to traditional didactic teaching practices.
Lay Description
What is already known about this topic
Computer‐based simulation used in pre‐service teacher training can have a positive impact on specific skills development.
The use of virtual reality in relation to pedagogy and pedagogical environments already has a noteworthy evidence‐base, but an overview of physical and mixed‐reality simulations in pre‐service teacher education is underrepresented.
Virtual simulation is an emerging powerful tool to provide an inside experience, where pre‐service teachers can engage with the environment and virtual characters in a way that feels real.
Mixed‐reality simulation involves real and simulated environments in which virtual and real objects, as well as people, interact in real time.
What this paper adds
An overview of evidence for physical and mixed‐reality simulation used in pre‐service teacher education.
Evidence that physical and mixed‐reality simulation allow trainee‐teachers to reflect on their behaviour within a classroom setting that is observable by other participants.
Implications for practice and/or policy
Educators should explore a range of simulation options in pre‐service teacher education to harness the different strengths of the technologies.
In the face of future emergencies, which may reduce face‐to‐face learning contexts, mixed‐reality simulation appears to be a promising approach to instil teaching skills whilst interacting with other participants.
Numerous research studies and corporate press releases highlight the potential of a new form of wearable device appearing on the technology landscape: augmented reality smart glasses (ARSGs), i.e., digital eyeglasses that integrate virtual information into the user's field of vision. Yet little is known about this nascent technology. Therefore, the authors develop and empirically test a theoretical model to assess ARSG usage. Core findings are that expected utilitarian, hedonic, and symbolic benefits drive consumers' reactions to ARSGs. The results also show that the extent to which ARSGs threaten other people's, but not one's own, privacy can strongly influence users' decision making. A qualitative second study identifies multiple explanations for this surprising privacy finding. Theoretical and managerial implications conclude.
Background
The number of patients who suffer from glioma has been increasing, and this malignancy is a serious threat to human health. The mainstream treatment for glioma is surgical resection; therefore, accurate resection can improve postoperative patient recovery.
Purpose
Many studies have investigated surgical navigation guided by mixed reality, with good outcomes. However, limitations of mixed reality, such as spatial drift caused by environmental changes, hinder its clinical application. Therefore, we present a mixed reality surgical navigation system for glioma resection. Preoperative information can be fused precisely with the real patient using a spatial compensation method to achieve clinically suitable accuracy.
Methods
A head‐mounted device was used to display virtual information, and a markerless spatial registration method was applied to precisely align the virtual anatomy with the real patient preoperatively. High‐accuracy preoperative and intraoperative movement and spatial drift compensation methods were used to increase the positional accuracy of the mixed reality‐guided glioma resection system when the patient's head is fixed to the bed frame. Several experiments were designed to validate the accuracy and efficacy of this system.
Results
Phantom experiments were performed to test the efficacy and accuracy of this system under ideal conditions, and clinical tests were conducted to assess the performance of this system in clinical application. The accuracy of spatial registration was 1.18 mm in the phantom experiments and 1.86 mm in the clinical application.
Conclusions
Herein, we present a mixed reality‐based multimodality‐fused surgical navigation system for assisting surgeons in intuitively identifying the glioma boundary intraoperatively. The experimental results indicate that this system has suitable accuracy and efficacy for clinical usage.