Three-dimensional user interfaces that are controlled by the user’s bare hands are mostly based on purely gesture-based interaction techniques. However, these interfaces are often slow and error-prone. In the field of immersive 3D modelling in particular, gestures are unsuitable because they complicate and delay the modelling process. To address these problems, we present a new gesture-free 3D modelling technique called “3D touch-and-drag”, which allows users to select vertices by approaching them and to terminate operations by moving the 3D cursor (e.g. the forefinger) away from the constraint geometry (e.g. a straight line or a plane). Our proposed technique makes it possible to transfer existing 3D modelling concepts (“3D widgets”) to virtual environments, as demonstrated by an experimental 3D modelling tool. Gesture-free bare-hand interaction also improves the possibility of tactile feedback during 3D manipulation. We compared different modelling techniques for controlling the 3D widgets and found that controller-based techniques are significantly faster than finger-tracking-based techniques. The 3D touch-and-drag technique is about as fast as gesture-based interaction. Mouse interaction in a two-dimensional GUI is only slightly faster than the 3D modelling techniques. Since our proposed technique has proven to be at least equivalent to gesture-based interaction techniques in terms of accuracy and efficiency, its further development using more accurate tracking seems promising as a way to exploit the advantages of hands-free and gesture-free interaction for immersive 3D modelling.
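The selection/termination behaviour described in this abstract can be sketched as a small distance-threshold state machine. Everything below (function names, the two radii, the line-shaped constraint) is an illustrative assumption, not the authors’ implementation:

```python
import math

# Hypothetical sketch of the "3D touch-and-drag" logic: a vertex is grabbed
# when the tracked fingertip comes within a capture radius, and the operation
# ends when the fingertip leaves a release radius around the constraint
# geometry (here: a line through `origin` along unit vector `axis`).
# Thresholds are illustrative assumptions.

CAPTURE_RADIUS = 0.01   # 1 cm: start dragging when the finger is this close
RELEASE_RADIUS = 0.05   # 5 cm: stop when the finger strays this far from the line

def dist_point_line(p, origin, axis):
    """Distance from point p to the line origin + t * axis (axis normalized)."""
    d = [p[i] - origin[i] for i in range(3)]
    t = sum(d[i] * axis[i] for i in range(3))
    closest = [origin[i] + t * axis[i] for i in range(3)]
    return math.dist(p, closest)

def update(state, finger, vertex, origin, axis):
    """One tracking frame: returns the new interaction state."""
    if state == "idle" and math.dist(finger, vertex) < CAPTURE_RADIUS:
        return "dragging"   # finger approached the vertex: select it
    if state == "dragging" and dist_point_line(finger, origin, axis) > RELEASE_RADIUS:
        return "idle"       # finger left the constraint line: terminate
    return state
```

Because no explicit gesture is recognized, only distances are evaluated per frame, which is what makes the technique tolerant of tracking noise in the fingers.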
•2D input such as the mouse favors 3D visualization tasks where high accuracy is demanded, whereas 3D input devices such as a tangible tablet allow faster adjustment when accuracy is not a key requirement.•Compared with a 2D screen, 3D output greatly helps the spatial understanding of positions, but has limited effect on the understanding of rotation.•We found no evidence that parity of dimensionality between input and output devices plays an important role in users’ performance. Thus, for 3D visualization tasks, users are not constrained to choose an input device that matches the dimensionality of the output space; they are free to use any device that is good for a specific task.•The mouse remains an efficient tool for accomplishing 3D visualization tasks both with the screen and with the augmented reality (AR) output space. It thus remains a good choice for controlling a hybrid PC and AR setting.
Focusing on interaction needs for scientific data exploration, we evaluated people’s performance using a 2D mouse, 3D SpaceMouse, or 3D-tangible tablet as input devices to interact with visualizations on 2D screens or stereoscopic augmented reality (AR) head-mounted displays. The increasing availability and power of immersive displays drives us to understand how to choose input devices, interaction techniques and output displays for the visualization of scientific data, and thus to guide the interaction design for hybrid AR and PC visualization systems. With a docking task and a clipping plane placement/orientation task, we measured our participants’ performance (completion time and accuracy) with each of the different combinations of input and output. We also report on their perceived workload, their preference, and other qualitative feedback. Results show that the mouse remains effective with any display, especially for tasks that require high accuracy. Our results highlight the potential to retain the mouse as a primary input device, and to complement it with other 3D interaction devices for specific uses.
Techniques that can dexterously manipulate particles/cells are significant for biological detection, analytical chemistry, and material synthesis applications. Although there are many methods to realize 3D manipulation under static conditions, there is a lack of sufficient 3D manipulation strategies in a continuous flow. We develop an integrated microsystem with a cooling system based on thermal convection to realize 3D manipulation in continuous flow. Theoretical modeling and experiments demonstrate that, as the temperature of the microheater increases, the direction in which particles move shifts from the y-axis to the z-axis. Based on this principle, the focusing flow and single particles are manipulated along the y-axis and z-axis. In addition, five outlets at different z-axis positions were fabricated to demonstrate the application of particle sorting. This novel strategy offers a new direction for manipulating particles vertically and helps reveal heat transfer at the micro-scale, which can pave the way for the next generation of integrated microfluidic systems for particle/cell sorting.
•3D manipulation of fluid and particles based on thermal convection in continuous flow was demonstrated.•A simulation model to predict thermal convection in the microchannel was developed.•3D multi-outlet sorting of single particles was developed.
Hydrogel microstructures that encapsulate cells can be assembled into tissues and have broad applications in biology and medicine. However, 3D posture control for a single arbitrary microstructure remains a challenge. A novel 3D manipulation and assembly technique based on optothermally generated bubble robots is proposed. The generation, rate of growth, and motion of a microbubble robot can be controlled by modulating the power of a laser focused on the interface between the substrate and a fluid. In addition to 2D operations, bubble robots are able to perform 3D manipulations. The 3D properties of hydrogel microstructures are adjusted arbitrarily, and convex and concave structures with different heights are designed. Furthermore, annular micromodules are assembled into 3D constructs, including tubular and concentric constructs. A variety of hydrogel microstructures of different sizes and shapes are operated and assembled in both 2D and 3D conformations by bubble robots. The manipulation and assembly methods are simple, rapid, versatile, and can be used for fabricating tissue constructs.
Hydrogel microstructures are manipulated and assembled in 3D using bubble microrobots generated by laser-induced localized heating. The conditions for generating bubbles and the methods of controlling their size are studied. Additionally, the mechanism of 3D flipping of microstructures is studied. Microstructures of different shapes and sizes can be assembled into more complex constructs.
•3D manipulation of yeast cells inside a chamber with a height of 1 mm was realized.•ZnO/Si SAW devices could be seamlessly integrated into a lab-on-chip (LOC) device.•Factors influencing microparticle manipulation in both 2D and 3D were investigated.•A numerical model has been developed to investigate the 3D motions of yeast cells.
Manipulating biological cells or microparticles in three dimensions (3D) is invaluable for many biomedical applications, and effective, rapid manipulation of microparticles in 2D and 3D within microchannels or chambers using surface acoustic waves (SAWs) generated by bulk piezoelectric materials has recently been reported. However, such devices are generally expensive or brittle and cannot be easily integrated into a single lab-on-chip. In this paper, we realized microparticle/cell patterning and 3D manipulation of yeast cells inside a chamber with a height of 1 mm using thin film ZnO/Si SAW devices. The effects of SAW frequency, channel width and thickness on the alignment of microparticles were first investigated, and the positions of the microparticles in the direction of SAW propagation can be controlled precisely by changing the phase angle of the acoustic waves from the ZnO/Si SAW device. A numerical model has been developed to investigate the SAW acoustic field and the resulting 3D motions of microparticles under the acoustic radiation forces within the microchamber. Finally, we realized and observed the 3D patterning of yeast cells within the microchannel. Our work shows great potential for acoustofluidics, neural network research and biomedical applications using ZnO/Si SAW devices.
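The phase-controlled positioning described above follows from standing-wave geometry: in a standing SAW formed by two counter-propagating waves, shifting their relative phase moves the pressure nodes where particles collect. A back-of-the-envelope sketch, with an assumed wavelength that is not taken from the paper:

```python
import math

# Illustrative relation for phase-controlled particle positioning in a
# standing SAW: a relative phase shift dphi between the two counter-
# propagating waves displaces the pressure nodes by
#     dx = dphi * wavelength / (4 * pi),
# so a full 2*pi phase sweep moves particles by one node spacing (lambda/2).
# The wavelength value below is assumed for illustration.

def node_shift(dphi_rad, wavelength_m):
    """Displacement of pressure nodes for a relative phase shift dphi."""
    return dphi_rad * wavelength_m / (4 * math.pi)

wavelength = 100e-6                       # assume a 100 um SAW wavelength
shift = node_shift(math.pi, wavelength)   # 25 um for a 180 degree shift
```

This is why the phase angle gives a continuous, precise handle on particle position along the propagation direction.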
3D manipulation of two droplets in a parallel-plate configuration with patterned electrodes.
•3D droplet manipulation was achieved with hybrid forces, namely an asymmetric electrowetting force and an electrostatic force.•A numerical model was developed for the quantitative interpretation of droplet dynamic behavior between the two plates.•Both the threshold and the maximum applied voltage for driving the droplet were determined experimentally.
In this paper, we demonstrate three-dimensional (3D) manipulation of droplets between a pair of parallel liquid-infused membrane plates with patterned electrodes by programmably controlling electrical direct current signals. The asymmetric electrowetting behavior of a droplet between the parallel slippery liquid-infused porous surface (SLIPS) plates was investigated first. The droplet tends to move from the higher-potential plate to the lower-potential plate. The transport of the deionized water droplet in the vertical direction under electrostatic force was then investigated in detail. A model was developed to study the dynamic transport behavior of the droplet in the vertical direction. It was found that the droplet velocity increases with time but approaches a maximum value as the drag force grows. We also experimentally demonstrated that the droplet accelerates from the higher-potential plate to the lower one, and that the transport time decreases exponentially with increasing applied voltage. However, an excessive applied voltage causes irregular jumping of droplets between the two plates. The applied voltage range for driving a 0.5 μl droplet was determined experimentally for various parallel-plate spacings. Finally, we demonstrated the 3D manipulation of one droplet and two droplets in the parallel-plate configuration with patterned electrodes using hybrid forces, namely an asymmetric electrowetting force in the horizontal direction and an electrostatic force in the vertical direction. 3D droplet manipulation on SLIPS provides a promising solution for high-throughput analysis and high-density integrated devices on digital microfluidic chips.
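The saturating velocity behavior described above (velocity rising with time but tending to a maximum as drag grows) is what a constant driving force balanced against linear drag predicts. A minimal sketch, with placeholder parameter values rather than the paper’s measured quantities:

```python
import math

# Minimal model of the vertical droplet transport: a droplet of mass m driven
# by a constant electrostatic force F against linear drag c*v obeys
#     m * dv/dt = F - c * v
# whose solution is v(t) = (F/c) * (1 - exp(-c*t/m)), approaching the
# terminal velocity F/c. All parameter values are illustrative assumptions.

def velocity(t, F, c, m):
    """Droplet velocity at time t under constant force F and linear drag c."""
    return (F / c) * (1.0 - math.exp(-c * t / m))

F = 1e-6    # N, assumed electrostatic driving force
c = 1e-4    # N*s/m, assumed drag coefficient
m = 5e-7    # kg, mass of a ~0.5 uL water droplet

v_terminal = F / c                 # 0.01 m/s: the maximum velocity
v_late = velocity(0.1, F, c, m)    # essentially at terminal velocity by 0.1 s
```

The same balance explains the voltage dependence: a larger F (stronger field) raises the terminal velocity and shortens the transit time between the plates.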
Handheld Augmented Reality (HAR) has the potential to introduce Augmented Reality (AR) to large audiences due to the widespread use of suitable handheld devices. However, many of the current HAR systems are not considered very practical and do not fully answer the needs of users. One of the challenging areas in HAR is in-situ AR content creation, where the correct and accurate positioning of virtual objects in the real world is fundamental. Due to the hardware limitations of handheld devices and possible restrictions in the environment, correct 3D positioning of objects can be difficult to achieve when we are unable to use AR markers or to correctly map the 3D structure of the environment.
We present SlidAR, a 3D positioning method for Simultaneous Localization And Mapping (SLAM) based HAR systems. SlidAR utilizes 3D ray-casting and epipolar geometry for virtual object positioning. It requires neither a perfect 3D reconstruction of the environment nor any virtual depth cues. We conducted a user experiment to evaluate the efficiency of the SlidAR method against an existing device-centric positioning method that we call HoldAR. Results showed that SlidAR was significantly faster, required significantly less device movement, and received significantly better subjective evaluations from the test participants. SlidAR also had higher positioning accuracy, although not significantly so.
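The ray-casting plus epipolar idea can be sketched in two steps: a tap in the first view casts a ray from the camera through the touch point, and from a second viewpoint that ray projects to a line on screen, so a slide gesture along that line adjusts only the depth along the ray. The toy pinhole model and all names below are assumptions for illustration, not SlidAR’s actual implementation:

```python
# Hypothetical sketch of two-step ray-based positioning: the object sits at
# some depth t along a ray cast from the first camera; viewed from a second
# camera, every candidate depth falls on one screen-space line (the projected
# ray / epipolar line), which constrains the slide gesture to 1 DOF.

def ray_point(origin, direction, t):
    """Point at depth t along the ray origin + t * direction."""
    return tuple(origin[i] + t * direction[i] for i in range(3))

def project(point, cam_pos, focal=1.0):
    """Toy pinhole projection for a camera at cam_pos looking down +z."""
    x, y, z = (point[i] - cam_pos[i] for i in range(3))
    return (focal * x / z, focal * y / z)

ray_o = (0.0, 0.0, 0.0)             # first camera position
ray_d = (0.0, 0.0, 1.0)             # ray through the tapped pixel
p0 = ray_point(ray_o, ray_d, 2.0)   # initial guess at depth t = 2
p1 = ray_point(ray_o, ray_d, 3.0)   # after sliding, depth t = 3

# Seen from a second camera off to the side, both candidates project onto the
# same horizontal screen line, so sliding along it only changes the depth t.
side_cam = (-1.0, 0.0, 0.0)
u0 = project(p0, side_cam)
u1 = project(p1, side_cam)
```

Because only one parameter (the depth) remains free in the second view, no depth cues or dense reconstruction are needed to pin the object down.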
•The correct in-situ 3D positioning of virtual objects with HAR is fundamental.•We developed a 3D positioning method for SLAM-based handheld AR.•We evaluated our method against a conventional device-centric method.•Our method was significantly better in objective and subjective measurements.
Various touch-based interaction techniques have been developed to make interactions on mobile devices more effective, efficient, and intuitive. Finger orientation, especially, has attracted a lot of attention, since it intuitively brings three additional degrees of freedom (DOF) compared with two-dimensional (2D) touch points. The mapping of finger orientation can be classified as either absolute or relative, suitable for different interaction applications. However, only absolute orientation has been explored in prior work. Relative angles can be calculated from two estimated absolute orientations, although higher accuracy can be expected by predicting the relative rotation directly from input images. Consequently, in this paper, we propose to estimate complete 3D relative finger angles from two fingerprint images, which incorporate more information at a higher image resolution than capacitive images. For algorithm training and evaluation, we constructed a dataset consisting of fingerprint images and their corresponding ground truth 3D relative finger rotation angles. Experimental results on this dataset revealed that our method outperforms previous approaches based on absolute finger angle models. Further, extensive experiments were conducted to explore the impact of image resolution, finger type, and rotation range on performance. A user study was also conducted to examine the efficiency and precision of 3D relative finger orientation in a 3D object rotation task.
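The baseline mentioned above, computing a relative angle from two absolute orientation estimates, amounts to composing rotations: R_rel = R2 · R1ᵀ. A plain-Python sketch (the paper instead predicts R_rel directly from fingerprint image pairs; the helpers below are illustrative stand-ins):

```python
import math

# Sketch of the two-absolute-estimates baseline: given absolute finger
# orientations R1 and R2 (as 3x3 rotation matrices), the relative rotation
# is R_rel = R2 @ R1^T. Here we recover the yaw component of R_rel for two
# in-plane (z-axis) orientations.

def rot_z(angle):
    """3x3 rotation matrix about the z-axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(A):
    return [list(row) for row in zip(*A)]

def relative_yaw(R1, R2):
    """Yaw angle (radians) of R_rel = R2 @ R1^T."""
    R_rel = matmul(R2, transpose(R1))
    return math.atan2(R_rel[1][0], R_rel[0][0])

# Two absolute estimates 40 degrees apart yield a 40 degree relative angle.
angle = relative_yaw(rot_z(math.radians(10)), rot_z(math.radians(50)))
```

The weakness of this baseline is visible in the composition: estimation errors in R1 and R2 both propagate into R_rel, which motivates predicting the relative rotation directly.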
This paper presents a collaborative platform developed to allow communication between surgeons and engineers in the process of designing patient-specific surgical instruments. To date, only a few applications are available to collaboratively create surgical instruments from medical 3D models, and these are mostly dedicated to expert CAD modelers. This makes the preoperative planning process time-consuming and inefficient, limiting the usability of such applications and making planning difficult and inaccurate. Accordingly, we propose a solution in the form of a web-based, interactive, extendable 3D navigation and manipulation application, called Precise, which does not require client installation. Precise is a lightweight, high-performance application built to provide easy-to-use, powerful, on-demand visualization and manipulation of 3D images, implemented using open-source libraries.
•User-friendly and intuitive tools for the collaborative planning of surgical guides.•Platform to allow communication between surgeons and engineers.•Web-based application for navigating and editing three-dimensional models.•Full-stack cross-platform application usable on both computers and tablets.
Vortex trap manipulation of microscopic objects in three dimensions by helical microswimmers has great potential for non-contact biological cell manipulation and microassembly. However, the current state of the art is limited to 2D manipulation due to the conflicting requirements of optimizing the trapping force and the propulsion force. In this paper, we propose a new design of helical microswimmer enabling purely non-contact, selective, 3D vortex trap micromanipulation. The proposed helical microswimmers are fabricated by 3D nanoprinting based on two-photon laser absorption. The vertically standing helical microstructures on top of supporting micropillars allow uniform coating of a ferromagnetic metal layer with minimal shadow area during metallization by sputtering. Furthermore, this design reduces the risk of damaging or losing the microswimmers when releasing them after fabrication, which enables propulsion force characterization and optimization. Our characterization showed that the propulsion force was recovered to a level even higher than that of single helical microswimmers. We consider that the proposed helical microswimmers with 3D manipulation capability could have a great impact on non-contact biological cell manipulation.
•A new design of non-contact mobile micro-manipulator for 3D manipulation of cell-sized particles is proposed.•Design and fabrication process for increasing the propulsion force and stability of a bio-inspired helical micro-swimmer.•3D non-contact micro-manipulation is described and demonstrated.