Full text · Peer reviewed · Open access
  • Understanding the acceptanc...
    Ho, Manh-Tung; Le, Ngoc-Thang B.; Mantello, Peter; Ho, Manh-Toan; Ghotbi, Nader

    Technology in Society, Volume 72, February 2023
    Journal Article

    Despite having one of the most advanced healthcare systems in the world, Japan is expected to face a shortage of nearly half a million healthcare workers by 2025 due to its rapidly aging population. In response, government authorities plan to implement a wide range of AI-driven healthcare solutions. These include care robots that assist the physically handicapped or elderly, chatbots that provide anonymous online mental health consultation, and diagnostic software utilizing machine learning. Yet among the most popular smart technologies intended to augment the nation's already overstrained and undermanned healthcare system is a little-known but emerging class of emotional AI technologies, i.e., deep learning systems trained to read, classify, and respond to human emotions. These technologies are being sold commercially not only to the public but also to rehabilitation centers, local hospitals, and senior citizen residences. Although entrusting healthcare services to intelligent machines may seem like a logical step in a country well known for its long-standing affection toward robots, Japanese society is also known for its adherence to established social relations and traditional institutional practices, especially in the realm of medical care. To gauge Japanese acceptance of emotion-sensing technology, we analyze a dataset of 245 visitors to clinics and hospitals in a typical suburban area of Japan using multiple linear regression. The results show that, in general, senior and male patients perceive emotional AI technology more negatively. Among the behavioral variables, patients' level of familiarity correlates positively with attitudes toward emotional AI-based applications in a private setting (β_Familiarity_AttitudePri = 0.346, p < 0.001) and in a public setting (β_Familiarity_AttitudePub = 0.297, p < 0.001), while concern over losing control to AI correlates negatively with both attitude variables: private setting (β_LosingControl_AttitudePri = -0.262, p = 0.002) and public setting (β_LosingControl_AttitudePub = -0.188, p = 0.044). Interestingly, concerns over violation of privacy and discrimination are non-significant correlates, which contradicts the emerging literature on this subject. We further contextualize the findings with insights afforded by an understanding of Japanese culture as well as the relevant literature on care robots in Japan. Finally, policy and education implications for promoting the acceptance of emotional AI among the general public and senior members of society are provided.

    • Attitude toward emotional AI use in Japanese healthcare correlates positively with familiarity with the technology.
    • Fear of losing control to AI correlates negatively with the perception of emotional AI in both private and public settings.
    • Privacy and discrimination concerns are non-significant correlates of attitudes toward emotional AI use in healthcare.
    • The results call for a cultural framing of technology acceptance behaviors.
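
    For readers unfamiliar with the analysis described above, the sketch below shows how two such multiple linear regressions (attitude in a private setting and in a public setting, each regressed on demographic and behavioral predictors) could be fit with statsmodels. This is not the authors' code: all column names are assumptions, and the data generated here are purely synthetic placeholders that bear no relation to the study's survey responses or reported coefficients.

        # Minimal illustrative sketch, not the study's actual analysis.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 245  # sample size reported in the abstract

        # Purely synthetic stand-in for the survey data (assumed column names).
        df = pd.DataFrame({
            "familiarity": rng.normal(size=n),
            "losing_control": rng.normal(size=n),
            "privacy_concern": rng.normal(size=n),
            "discrimination_concern": rng.normal(size=n),
            "age": rng.integers(18, 90, size=n),
            "male": rng.integers(0, 2, size=n),
            "attitude_private": rng.normal(size=n),
            "attitude_public": rng.normal(size=n),
        })

        predictors = ("familiarity + losing_control + privacy_concern "
                      "+ discrimination_concern + age + male")

        # One OLS model per attitude variable; the abstract reports standardized
        # betas, so in practice the continuous variables would be z-scored first.
        model_private = smf.ols(f"attitude_private ~ {predictors}", data=df).fit()
        model_public = smf.ols(f"attitude_public ~ {predictors}", data=df).fit()

        # Coefficients and p-values, analogous to the β and p values quoted above.
        print(model_private.params, model_private.pvalues, sep="\n")
        print(model_public.params, model_public.pvalues, sep="\n")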