The cover image is based on the Research Article "Augmented reality‐based method for road maintenance operators in human–robot collaborative interventions" by A. C. Bavelos et al., https://doi.org/10.1111/mice.13185.
Full text
Available for:
BFBNIB, FZAB, GIS, IJS, KILJ, NLZOH, NUK, OILJ, SAZU, SBCE, SBMB, UL, UM, UPUK
This research paper explores the impact of augmented reality (AR) tracking characteristics, specifically an AR head-worn display's tracking registration accuracy and precision, on users' spatial abilities and subjective perceptions of trust in and reliance on the technology. Our study aims to clarify the relationships between user performance and the different behaviors users may employ based on varying degrees of trust in and reliance on AR. Our controlled experimental setup used a 360° field-of-regard search-and-selection task and combined the immersive aspects of a CAVE-like environment with AR overlays viewed through a head-worn display.
We investigated three levels of simulated AR tracking errors in terms of both accuracy and precision (+0°, +1°, +2°). We controlled for four user task behaviors that correspond to different levels of trust in and reliance on an AR system: AR-Only (only relying on AR), AR-First (prioritizing AR over real world), Real-Only (only relying on real world), and Real-First (prioritizing real world over AR). By controlling for these behaviors, our results showed that even small amounts of AR tracking errors had noticeable effects on users’ task performance, especially if they relied completely on the AR cues (AR-Only). Our results link AR tracking characteristics with user behavior, highlighting the importance of understanding these elements to improve AR technology and user satisfaction.
•Augmented Reality (AR) tracking factors include accuracy and precision.
•Search-and-selection task tested effects of tracking factors on performance, trust.
•Results indicate even small AR tracking errors impact user performance, trust.
•Negative performance effects can be mitigated if users do not rely solely on AR.
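The two simulated error types studied above can be sketched in a few lines: a fixed angular bias models registration (in)accuracy, and per-frame Gaussian jitter models (im)precision. The function and parameter names below are illustrative, not from the paper.

```python
import random

def apply_tracking_error(azimuth_deg, accuracy_deg=1.0, precision_deg=1.0):
    """Offset an AR cue's azimuth by a simulated tracking error.

    accuracy_deg:  constant registration bias (systematic error).
    precision_deg: std. dev. of per-frame Gaussian jitter (random error).
    """
    jitter = random.gauss(0.0, precision_deg)
    return azimuth_deg + accuracy_deg + jitter
```

With `precision_deg=0` the cue is displaced by a constant bias only; raising it makes the cue shimmer around the biased position, mimicking imprecise tracking.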
Full text
Available for:
GEOZS, IJS, IMTLJ, KILJ, KISLJ, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UILJ, UL, UM, UPCLJ, UPUK, ZAGLJ, ZRSKP
Illumination estimation is an essential problem in computer vision, graphics and augmented reality. In this paper, we propose a learning-based method to recover low-frequency scene illumination represented as spherical harmonic (SH) functions from pairwise photos taken by the rear and front cameras of mobile devices. An end-to-end deep convolutional neural network (CNN) is designed to process images from symmetric views and predict SH coefficients. We introduce a novel Render Loss to improve the rendering quality of the predicted illumination. A high-quality high dynamic range (HDR) panoramic image dataset was developed for training and evaluation. Experiments show that our model produces visually and quantitatively superior results compared to the state of the art. Moreover, our method is practical for mobile-based applications.
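As a rough illustration of the idea (not the paper's actual network or loss), low-frequency lighting encoded as nine second-order SH coefficients can be evaluated per surface normal, and a render-style loss can compare the shading produced by predicted versus ground-truth coefficients:

```python
# Real spherical-harmonic basis up to order 2 (9 coefficients),
# using the standard real-SH normalization constants.
def sh_basis(x, y, z):
    return [
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ]

def shade(coeffs, normal):
    """Low-frequency radiance at a unit surface normal from 9 SH coefficients."""
    return sum(c * b for c, b in zip(coeffs, sh_basis(*normal)))

def render_loss(pred, truth, normals):
    """Mean squared shading difference over sampled normals (a render-style loss)."""
    errs = [(shade(pred, n) - shade(truth, n)) ** 2 for n in normals]
    return sum(errs) / len(errs)
```

Comparing rendered shadings rather than raw coefficients penalizes errors by their visual effect, which is the motivation such a loss is meant to capture.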
Full text
Available for:
BFBNIB, DOBA, FZAB, GIS, IJS, IZUM, KILJ, NLZOH, NUK, OILJ, PILJ, PNG, SAZU, SBCE, SBMB, UILJ, UKNU, UL, UM, UPUK
Augmented Reality (AR), Virtual Reality (VR), Mixed Reality, and Extended Reality (often – misleadingly – abbreviated as XR) are commonly used terms to describe how technologies generate or modify reality. However, academics and professionals have been inconsistent in their use of these terms, which has led to conceptual confusion and unclear demarcations. Inspired by prior research and qualitative insights from XR professionals, we discuss the meaning and definitions of various terms and organize them in our proposed framework. As a result, we conclude that (1) XR should not be used to connote extended reality, but as a more open approach where the X implies the unknown variable: xReality; (2) AR and VR have fundamental differences and thus should be treated as different experiences; (3) AR experiences can be described on a continuum ranging from assisted reality to mixed reality (based on the level of local presence); and (4) VR experiences can be conceptualized on a telepresence continuum ranging from atomistic to holistic VR.
•XR should not be used as an abbreviation for "Extended Reality"; X should be the placeholder for "all" new reality formats.
•AR and VR have fundamental differences and thus should be treated as different experiences.
•AR experiences can be described on a LOCAL PRESENCE continuum ranging from Assisted Reality to Mixed Reality.
•VR experiences can be conceptualized on a TELEPRESENCE-continuum ranging from atomistic to holistic VR.
Full text
Available for:
GEOZS, IJS, IMTLJ, KILJ, KISLJ, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UILJ, UL, UM, UPCLJ, UPUK, ZAGLJ, ZRSKP
Augmented reality (AR) applications have gained much research and industry attention. Moreover, the mobile counterpart, mobile augmented reality (MAR), is one of the most explosive growth areas for AR applications in the mobile environment (e.g., smartphones). Technical improvements in the hardware of smartphones, tablets, and smart glasses enable the wide use of mobile AR in the real world, letting users experience AR applications anywhere. However, the mobile nature of MAR applications can limit users' interaction capabilities, such as input and haptic feedback. In this survey, we analyze current research issues in the area of human-computer interaction for haptic technologies in MAR scenarios. The survey first presents human sensing capabilities and their applicability in AR applications. We classify haptic devices into two groups according to the triggered sense: cutaneous/tactile (touch, active surfaces, and mid-air) and kinesthetic (manipulandum, grasp, and exoskeleton). Due to MAR applications' mobile capabilities, we mainly focus our study on wearable haptic devices in each category and their AR possibilities. To conclude, we discuss the future paths that haptic feedback should follow for MAR applications and their challenges.
Full text
Available for:
IZUM, KILJ, NUK, PILJ, SAZU, UL, UM, UPUK
Augmented reality (AR) and virtual reality (VR) have played an important role in our society. AR aims to improve our perception of reality, while VR aims to create a new or artificial reality. There is no doubt that the results achieved in these two areas are remarkable. It is, however, quite surprising that our understanding of reality is not yet complete. In fact, a better understanding of reality will be essential for future developments in these areas.
Augmented Reality is a technique that enables users to interact with their physical environment through the overlay of digital information. While it has been researched for decades, Augmented Reality has more recently moved out of the research labs and into the field. While most applications are used sporadically and for one particular task only, current and future scenarios will provide a continuous and multi-purpose user experience. Therefore, in this paper, we present the concept of Pervasive Augmented Reality, which aims to provide such an experience by sensing the user's current context and adapting the AR system to changing requirements and constraints. We present a taxonomy for Pervasive Augmented Reality and context-aware Augmented Reality, which classifies the context sources and context targets relevant for implementing such a context-aware, continuous Augmented Reality experience. We further summarize existing approaches that contribute towards Pervasive Augmented Reality. Based on our taxonomy and survey, we identify challenges for future research directions in Pervasive Augmented Reality.
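A context-aware adaptation of the kind the taxonomy describes can be sketched as a simple mapping from sensed context sources to system-side context targets; all names and thresholds here are hypothetical, for illustration only.

```python
def adapt(context):
    """Map sensed context sources to AR system settings (context targets).

    Hypothetical rules: bright surroundings trigger high-contrast rendering,
    and walking reduces information density to limit distraction.
    """
    settings = {}
    settings["high_contrast"] = context.get("ambient_lux", 0) > 10000
    settings["info_density"] = "low" if context.get("activity") == "walking" else "full"
    return settings
```

A real Pervasive AR system would fuse many more sources (location, social situation, device state) and adapt more targets (input modality, content placement), but the structure stays the same: sense, classify, reconfigure.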
The popularity of wearable devices and smartphones has fueled the development of Mobile Augmented Reality (MAR), which provides immersive experiences over the real world using techniques such as computer vision and deep learning. However, hardware-specific MAR is costly and heavy, while app-based MAR requires an additional download and installation and lacks cross-platform ability. These limitations hamper the pervasive adoption of MAR. This paper argues that mobile Web AR (MWAR) holds the potential to become a practical and pervasive solution that can effectively scale to millions of end-users, because MWAR can be developed as a lightweight, cross-platform, and low-cost solution for end-to-end delivery of MAR. The main challenges in making MWAR a reality lie in the low efficiency of dense computing in Web browsers, the large delay of real-time interactions over mobile networks, and the lack of standardization. The good news is that the newly emerging 5G and Beyond 5G (B5G) cellular networks can mitigate these issues to some extent via techniques such as network slicing, device-to-device communication, and mobile edge computing. In this paper, we first give an overview of the challenges and opportunities of MWAR in the 5G era. Then we describe our design and development of a generic service-oriented framework (called MWAR5) that provides a scalable, flexible, and easy-to-deploy MWAR solution. We evaluate the performance of our MWAR5 system in an actually deployed 5G trial network under collaborative configurations, which shows encouraging results. Moreover, we share experiences and insights from our development and deployment, including some exciting future directions for MWAR over 5G and B5G networks.