Grasping force control is important for multi-fingered robotic hands to stabilize grasped objects. Humans adjust their grasping force and react quickly to instabilities through tactile sensing. However, grasping force control through tactile sensing with robotic hands remains relatively unexplored. In this paper, we use tactile sensing with multi-fingered robot hands to adjust the grasping force and stabilize unknown objects without prior knowledge of their shape or physical properties. In particular, an online detection module based on a deep neural network (DNN) is designed to detect contact events and object material simultaneously from tactile data. In addition, a force estimation method based on a Gaussian mixture model (GMM) is proposed to compute the contact information (i.e., contact force and contact location) from tactile data. Based on the tactile sensing results, an object stabilization controller then adjusts the hand's contact configuration to stabilize the object. The spatio-temporal structure of the tactile data is exploited throughout. Finally, the effectiveness of the proposed framework is evaluated in real-world experiments with a five-fingered Shadow Dexterous Hand equipped with BioTac sensors.
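GMM-based force estimation of the kind described above is commonly realized as Gaussian mixture regression: fit a joint GMM over tactile features and measured force, then condition the mixture on a new tactile reading to get the expected force. The sketch below illustrates that idea only; the feature layout, component count, and synthetic training data are assumptions for illustration, not the paper's actual configuration.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def gmr_predict(gmm, x, dx):
    """Condition a joint GMM p(x, y) on input x (first dx dims) to get E[y | x]."""
    resp, cond_means = [], []
    for k in range(gmm.n_components):
        mx, my = gmm.means_[k][:dx], gmm.means_[k][dx:]
        Sxx = gmm.covariances_[k][:dx, :dx]
        Syx = gmm.covariances_[k][dx:, :dx]
        # weight of component k given the observed input
        resp.append(gmm.weights_[k] * multivariate_normal.pdf(x, mx, Sxx))
        # conditional mean of y under component k
        cond_means.append(my + Syx @ np.linalg.solve(Sxx, x - mx))
    resp = np.asarray(resp) / np.sum(resp)
    return np.sum(resp[:, None] * np.asarray(cond_means), axis=0)

# synthetic stand-in for (tactile feature, contact force) pairs: y = 3x + 1 + noise
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, (500, 1))
y = 3.0 * x + 1.0 + rng.normal(0.0, 0.05, (500, 1))
gmm = GaussianMixture(n_components=3, random_state=0).fit(np.hstack([x, y]))
force = gmr_predict(gmm, np.array([0.5]), dx=1)  # should be close to 2.5
```

Conditioning rather than direct regression keeps the estimator multimodal, which is useful when the same tactile reading can arise from different contact configurations.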
While for vision and audio the same mass-produced units can be embedded in many different systems, from smartphones to robots, tactile sensors have to be built in application-specific shapes and sizes. To use a commercially available tactile sensor, it can be necessary to design the entire system around an existing sensor model. We present a set of open-source solutions for designing, manufacturing, reading out, and integrating custom application-specific tactile matrix sensors. Our manufacturing process requires only an off-the-shelf cutting plotter and widely available plastic and metal foils. This allows creating sensors of diverse sizes, shapes, and layouts, which can be adapted to various specific use cases, as demonstrated with exemplary robot integrations. For interfacing and readout, we developed an Arduino-like prototype board (Tacduino) with amplifier circuits to ensure good resolution and to suppress crosstalk. As an example, we give step-by-step instructions for building tactile fingertips for the RobotiQ 3-Finger Gripper, and we provide design files for the readout circuit board together with Arduino firmware and driver software. Both wired and wireless communication between the sensors and a host PC are supported by this system. The hardware was originally presented and investigated in [1].
The key role of tactile sensing in human grasping and manipulation is widely acknowledged, but most industrial robot grippers, and even multi-fingered hands, are still designed and used without any tactile sensors. While the basic design principles of resistive and capacitive sensors are well known, several factors keep tactile sensing from large-scale deployment: high sensor cost, short lifespan, poor reliability, difficult production processes, a lack of suitable software and tools for system integration, and the unique requirement that tactile sensors conform to application-specific shapes. In this work, we describe a very simple but efficient approach to designing low-cost resistive matrix sensors, in which sensor layout and geometry, taxel size, and measurement sensitivity can be customized over a wide range. Sensor assembly needs nothing more than a hobby cutting plotter for precise cutting of aluminum tape and Velostat foils, plus adhesive plastic tape. Our electronics combine transimpedance amplifiers with common Arduino microcontrollers, support standard communication protocols, and use either cabled or wireless data transfer to the host. We present three different application examples and sketch our ROS software for sensor calibration and visualization. All parts of the project, including detailed building instructions, bill of materials, electronics, and firmware, are available open-source.
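Host-side readout for such a resistive matrix can be sketched as follows: parse a serial frame into a taxel matrix, then convert ADC counts to taxel conductance via the transimpedance relation. The frame format (0xAA 0x55 header, row-major little-endian counts), the 4x4 matrix size, and the circuit constants below are illustrative assumptions, not the actual Tacduino protocol; the sketch also assumes the ADC reference equals the drive voltage.

```python
import struct
import numpy as np

ROWS, COLS = 4, 4  # illustrative 4x4 taxel matrix

def decode_frame(frame: bytes) -> np.ndarray:
    """Parse one frame: 0xAA 0x55 header, then ROWS*COLS little-endian uint16 counts."""
    if frame[:2] != b"\xaa\x55":
        raise ValueError("bad frame header")
    counts = struct.unpack("<%dH" % (ROWS * COLS), frame[2:2 + 2 * ROWS * COLS])
    return np.array(counts, dtype=np.uint16).reshape(ROWS, COLS)

def counts_to_conductance(counts, v_drive=3.3, r_feedback=10_000.0, adc_max=1023):
    """Transimpedance readout: V_out = I * R_f with the taxel driven at v_drive,
    so taxel conductance G = I / v_drive = V_out / (R_f * v_drive)."""
    v_out = counts / adc_max * v_drive
    return v_out / (r_feedback * v_drive)
```

Because the transimpedance amplifier holds each sensed column at virtual ground, parasitic current paths through neighboring taxels see no voltage difference, which is what suppresses crosstalk in this topology.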
Mixed reality (MR) opens up new vistas for human-robot interaction (HRI) scenarios in which a human operator can control and collaborate with co-located robots. For instance, when using a see-through head-mounted display (HMD) such as the Microsoft HoloLens, the operator can see the real robots, and additional virtual information can be superimposed over the real-world view to improve security, acceptability, and predictability in HRI situations. In particular, previewing potential robot actions in situ before they are executed has enormous potential to reduce the risk of damaging the system or injuring the human operator. In this paper, we introduce the concept and implementation of such an MR human-robot collaboration system in which a human can intuitively and naturally control a co-located industrial robot arm for pick-and-place tasks. In addition, we compared two multimodal HRI techniques for selecting the pick location on a target object using (i) head orientation (heading) or (ii) pointing, both in combination with speech. The results show that heading-based interaction is more precise, requires less time, and is perceived as less physically, temporally, and mentally demanding for MR-based pick-and-place scenarios. We confirmed these results in an additional usability study in a delivery-service task with a multi-robot system. The developed MR interface shows a preview of the current robot program to the operator, e.g., the pick selection or trajectory. The findings provide important implications for the design of future MR setups.
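Heading-based selection of this kind is typically implemented by casting a ray along the HMD's forward direction and intersecting it with the workspace. The sketch below shows that geometry under assumed coordinates (z up, table at z = 0); it illustrates the technique only and is not the paper's actual implementation.

```python
import numpy as np

def heading_pick_point(head_pos, head_dir, plane_point, plane_normal):
    """Intersect the gaze ray (head_pos + t * head_dir) with the table plane."""
    denom = np.dot(plane_normal, head_dir)
    if abs(denom) < 1e-9:
        return None  # gaze parallel to the table
    t = np.dot(plane_normal, plane_point - head_pos) / denom
    if t < 0:
        return None  # table is behind the operator
    return head_pos + t * head_dir

# operator at 1.7 m eye height, looking forward and down at 45 degrees
head = np.array([0.0, 0.0, 1.7])
gaze = np.array([0.0, 1.0, -1.0])
gaze = gaze / np.linalg.norm(gaze)
hit = heading_pick_point(head, gaze, np.zeros(3), np.array([0.0, 0.0, 1.0]))
```

In practice the intersection point is then snapped to the nearest candidate object, and the speech channel confirms the selection.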
In this work, we present a comprehensive multi-modal pipeline for grasping pieces of fabric from flat surfaces. The pipeline grasps the fabric with a success rate of 99% without needing information about material or shape. Maintaining pressure on the fabric and surface while closing the gripper creates a fold in the material, which is firmly pinched between the fingers of the gripper. We present and evaluate several grasp policies in different configurations with various materials. The achievable pulling force of the grasps, and therefore their quality, is analyzed. We demonstrate the versatility of the pipeline by transferring it to a different gripper. Furthermore, it features recurrent deep-learning-based slip and fall detection using tactile sensing.
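A recurrent slip detector of this kind typically consumes a sequence of tactile frames and outputs a slip probability from the final hidden state. The forward-pass sketch below shows that architecture in plain NumPy; the single LSTM layer, hidden size, input dimension, and untrained random weights are illustrative assumptions, not the network used in the paper.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SlipLSTM:
    """Single-layer LSTM with a scalar sigmoid head; forward pass only."""
    def __init__(self, n_in, n_hidden, rng=None):
        rng = rng or np.random.default_rng(0)
        s = 1.0 / np.sqrt(n_hidden)
        self.W = rng.uniform(-s, s, (4 * n_hidden, n_in + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        self.w_out = rng.uniform(-s, s, n_hidden)
        self.H = n_hidden

    def forward(self, seq):
        H = self.H
        h, c = np.zeros(H), np.zeros(H)
        for x in seq:  # one tactile frame per time step
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f = _sigmoid(z[:H]), _sigmoid(z[H:2 * H])
            g, o = np.tanh(z[2 * H:3 * H]), _sigmoid(z[3 * H:])
            c = f * c + i * g
            h = o * np.tanh(c)
        return _sigmoid(self.w_out @ h)  # P(slip) from the final hidden state

model = SlipLSTM(n_in=8, n_hidden=16)
frames = np.random.default_rng(1).normal(size=(50, 8))  # dummy tactile sequence
p_slip = model.forward(frames)
```

The recurrence matters here because slip shows up as a temporal pattern (e.g., micro-vibrations) rather than in any single tactile frame.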
This paper presents the design and capabilities of a custom-built encountered-type haptic display. With an X-Y-Z-yaw plotter mechanism below the tabletop and four permanent magnets in the end-effector, the device can manipulate multiple objects in three dimensions on top of the table. Four Hall-effect sensors in the end-effector are used to compensate for friction and increase the positioning accuracy of objects to 0.5 mm. With an end-effector speed of 66.6 cm/s, objects can be placed anywhere in a workspace of 47.2 × 26 cm. The device is built for human-computer interaction scenarios in virtual reality (VR) and augmented reality (AR), providing haptic feedback on demand. By design, arbitrary object shapes and materials can be presented to the user within the table's workspace. The usability and performance of the table are evaluated with a user study.