Peer-reviewed
  • Rubik’s Cube+: A self-super...
    Zhu, Jiuwen; Li, Yuexiang; Hu, Yifan; Ma, Kai; Zhou, S. Kevin; Zheng, Yefeng

    Medical Image Analysis, August 2020, Volume 64
    Journal Article

    Highlights:
    • We propose a pretext task, namely Rubik's cube+, consisting of three sub-tasks: cube ordering, cube orientation, and masking identification.
    • Experiments on two target tasks, cerebral hemorrhage classification and brain tumor segmentation, demonstrate the effectiveness of our Rubik's cube+.
    • Comprehensive discussions of the limitations and potential applications of our study are included.

    Abstract: Driven by the development of deep learning, an increasing number of research works have proposed automated analysis systems for 3D volumetric medical data to improve the quality of patient care. However, the large amount of annotated 3D medical data needed to train a neural network well is challenging to obtain, as manual annotation by physicians is time-consuming and laborious. Self-supervised learning is a potential solution for mitigating the strong requirement for data annotation by deeply exploiting the information in the raw data. In this paper, we propose a novel self-supervised learning framework for volumetric medical data. Specifically, we propose a pretext task, i.e., Rubik's cube+, to pre-train 3D neural networks. The pretext task involves three operations, namely cube ordering, cube rotating, and cube masking, forcing networks to learn translation- and rotation-invariant features from the original 3D medical data while tolerating noise in the data. Compared to training from scratch, fine-tuning from the Rubik's cube+ pre-trained weights remarkably boosts the accuracy of 3D neural networks on various tasks, such as cerebral hemorrhage classification and brain tumor segmentation, without the use of extra data.
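    The three pretext operations described in the abstract can be sketched as a data-generation routine. The sketch below is a hedged illustration, not the authors' implementation: the grid size, the choice of 180° rotations, the masked sub-block, and the function name `make_rubiks_cube_plus_sample` are all assumptions for demonstration; the paper specifies its own configurations.

    ```python
    import numpy as np

    def make_rubiks_cube_plus_sample(volume, grid=2, mask_ratio=0.5, rng=None):
        """Build one illustrative Rubik's cube+ pretext sample from a 3D volume.

        Returns the shuffled/rotated/masked cubes plus three self-supervision
        labels: the order permutation, per-cube rotation flags, and per-cube
        masking flags (details here are assumptions, not the paper's exact setup).
        """
        rng = np.random.default_rng(rng)
        d, h, w = volume.shape
        cd, ch, cw = d // grid, h // grid, w // grid

        # Partition the volume into grid**3 non-overlapping cubes.
        cubes = [
            volume[i*cd:(i+1)*cd, j*ch:(j+1)*ch, k*cw:(k+1)*cw].copy()
            for i in range(grid) for j in range(grid) for k in range(grid)
        ]

        # Cube ordering: shuffle the cubes; the permutation is the label.
        order = rng.permutation(len(cubes))
        cubes = [cubes[i] for i in order]

        # Cube rotating: rotate a random subset of cubes 180 degrees in-plane;
        # the network must predict which cubes were rotated.
        rotated = rng.integers(0, 2, size=len(cubes))
        for idx, flag in enumerate(rotated):
            if flag:
                cubes[idx] = np.rot90(cubes[idx], k=2, axes=(0, 1))

        # Cube masking: zero out a sub-block in a random subset of cubes;
        # the network must identify which cubes were masked.
        masked = (rng.random(len(cubes)) < mask_ratio).astype(int)
        for idx, flag in enumerate(masked):
            if flag:
                cubes[idx][:cd // 2, :ch // 2, :cw // 2] = 0

        return np.stack(cubes), order, rotated, masked
    ```

    A 3D network pre-trained to predict `order`, `rotated`, and `masked` from the stacked cubes would then be fine-tuned on the target task, as the abstract describes.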