Underwater humanoid robots (UHRs) have emerged as a significant area of interest in robotics, with the potential to overcome the limitations of traditional underwater robots and revolutionize underwater activities. This review examines the development of UHRs, focusing on their perception, decision-making, and execution capabilities within a hierarchical human-machine cooperation framework. The Perception Layer gathers information from the environment and from human collaborators. The Decision-making Layer covers the different levels of robot autonomy and the current status of human-UHR collaborative decision-making. The Execution Layer encompasses the modeling, control, and actuation mechanisms that translate high-level intentions into physical actions. UHR implementations from various research teams are reviewed to provide a comprehensive overview of current advancements, and the open challenges surrounding UHR progress are discussed. Continued research and development of UHRs is a promising avenue for advancing human-machine cooperation and pushing the boundaries of underwater exploration, contributing to scientific discoveries and societal benefits.
Hierarchical least-squares optimization is often used in robotics to invert a direct function when multiple incompatible objectives are involved; typical examples are inverse kinematics and inverse dynamics. The objectives can be given as equalities to be satisfied (e.g., a point-to-point task) or as regions of satisfaction (e.g., the joint range). This paper proposes a complete solution for solving multiple least-squares quadratic problems with both equality and inequality constraints ordered into a strict hierarchy. The method solves a hierarchy of equalities only 10 times faster than iterative-projection hierarchical solvers and can handle inequalities at any level while running at typical control frequencies on whole-body-sized problems. This generic solver is used to resolve the redundancy of humanoid robots while generating complex movements in constrained environments.
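The strict-hierarchy idea the abstract describes can be illustrated with the classic iterative nullspace-projection scheme for equality-only hierarchies (the baseline the paper's solver reportedly accelerates). The sketch below is generic and not the paper's algorithm: each level is minimized only within the solution set of all higher-priority levels.

```python
import numpy as np

def hierarchical_lstsq(levels):
    """Solve a strict hierarchy of least-squares objectives A_i x ~ b_i.

    Each level is optimized only inside the nullspace of all
    higher-priority levels (iterative nullspace projection).
    """
    n = levels[0][0].shape[1]
    x = np.zeros(n)               # current solution
    N = np.eye(n)                 # remaining freedom after higher levels
    for A, b in levels:
        AN = A @ N                # objective restricted to that freedom
        AN_pinv = np.linalg.pinv(AN)
        x = x + N @ (AN_pinv @ (b - A @ x))   # improve this level only
        N = N @ (np.eye(n) - AN_pinv @ AN)    # shrink the nullspace
    return x

# Priority 1 fixes x0 = 1; priority 2 then asks for x1 = 2, x2 = 3.
x = hierarchical_lstsq([
    (np.array([[1.0, 0.0, 0.0]]), np.array([1.0])),
    (np.array([[0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]]), np.array([2.0, 3.0])),
])
print(x)  # -> [1. 2. 3.]
```

In an inverse-kinematics setting, level 1 would typically be the end-effector task and lower levels would hold posture objectives; inequality levels (joint limits) require an active-set or QP treatment beyond this projection scheme.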
Highly dynamic movements such as jumping are important for improving the agility and environmental adaptability of humanoid robots. This article proposes an online optimization method to realize a vertical jump with centroidal angular momentum (CAM) control and landing-impact absorption for a humanoid robot. First, the robot's center-of-mass (CoM) trajectory is generated by nonlinear optimization. Then, a quasi-sliding-mode controller is designed to ensure that the robot tracks the CoM trajectory accurately. To avoid unexpected spinning in the flight phase, a center-of-pressure-guided angular momentum controller is designed to stabilize the CAM. The CoM and CAM corrections are realized by online optimization of the dynamic components and inverse dynamics. Two quadratic-programming optimizations generate feasible contact forces/torques and joint accelerations from the upper-level CoM and CAM controllers. In addition, a viscoelastic-model-based controller is designed to absorb the vibration caused by the large contact impact. A simulation and an experiment of a 0.5-m-high (foot-lifting distance) vertical jump are demonstrated on a humanoid robot platform (Fig. 1).
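The per-cycle quadratic programs mentioned above (for contact force/torque and joint acceleration) in practice also carry inequality constraints such as friction cones, but their equality-constrained core can be sketched via the KKT system. The cost and constraint matrices below are illustrative placeholders, not the paper's formulation.

```python
import numpy as np

def solve_eq_qp(A, b, C, d):
    """Minimize 0.5 * ||A x - b||^2 subject to C x = d.

    Solves the KKT system
        [A^T A  C^T] [x     ]   [A^T b]
        [C      0  ] [lambda] = [d    ]
    which is the stationarity + feasibility condition of the QP.
    """
    n, m = A.shape[1], C.shape[0]
    K = np.block([[A.T @ A, C.T],
                  [C, np.zeros((m, m))]])
    rhs = np.concatenate([A.T @ b, d])
    sol = np.linalg.solve(K, rhs)
    return sol[:n]                 # primal variables; drop multipliers

# Toy instance: smallest-norm x satisfying x0 + x1 = 1 -> [0.5, 0.5].
x = solve_eq_qp(np.eye(2), np.zeros(2),
                np.array([[1.0, 1.0]]), np.array([1.0]))
print(x)  # -> [0.5 0.5]
```

In a whole-body controller, A and b would encode tracking of the commanded CoM/CAM accelerations while C and d would encode the contact and dynamics equalities; a dedicated QP solver replaces the dense linear solve at control rates.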
We investigated multisensory emotion perception from a humanoid robot. In the experiment, participants were presented with video clips containing the robot's emotionally colored eyes and voice (Task 1) or body gesture and voice (Task 2), which were either congruent or incongruent in emotional content (e.g., a happy body gesture paired with a sad voice on an incongruent trial). Participants were instructed to judge the robot's emotion as either happiness or sadness. We examined the proportion of responses based on visual versus auditory cues to the robot's expression. Results showed that participants relied more on auditory cues than on visual cues in Task 1; however, this vocal superiority was not observed in Task 2. These results suggest that multisensory emotion perception from a robot differs depending on whether the cues are natural or artificial. We propose a model of multisensory emotion perception from a robot.