Peer-reviewed · Open Access
  • Comparison of Interobserver...
    Lin, Jian; Sun, Wei‐Jin; Chen, Jian‐Hai; Dong, Jing‐Ming; Bao, Ding‐Su; Yan, Ling; Ni, Wei‐Dong; Xiang, Ming; Ding, Jian; Cai, Ming; Song, Jia‐Xiang; Mao, Ming‐Gui; Wu, Xiao‐Ming

    Orthopaedic Surgery, August 2023, Volume 15, Issue 8
    Journal Article

    Objective: Distal clavicle fracture classification directly affects treatment decisions, yet it is unclear whether the classification systems applied differ depending on surgeons' backgrounds. This study aimed to compare the interobserver agreement of four classification systems for lateral clavicle fractures between shoulder specialists and general trauma surgeons.

    Methods: Radiographs of 20 lateral clavicle fractures representing the full spectrum of adult fracture patterns were analyzed by eight experienced shoulder specialists and eight general trauma surgeons from 10 different hospitals. All cases were graded according to the Orthopedic Trauma Association (OTA), Neer, Jäger/Breitner, and Gongji classification systems. Observer agreement was assessed with Fleiss' kappa coefficient (κ).

    Results: When only X‐ray films were presented, both groups achieved fair agreement. However, when 3D‐CT scan images were provided, interobserver agreement improved in the specialist group for the OTA, Jäger/Breitner, and Gongji classification systems, and in the generalist group for the Gongji classification system. In terms of interobserver reliability, the OTA, Neer, and Jäger/Breitner classification systems showed better agreement among shoulder specialists, while a slightly lower level of specialist agreement, relative to the generalist group, was found with the Gongji classification system. For the OTA classification system, interobserver agreement had a mean kappa value of 0.418 (0.446 in the specialist group, 0.402 in the generalist group). For the Neer classification system, the mean kappa value was 0.368 (0.402 specialist, 0.390 generalist). For the Jäger/Breitner classification system, the mean kappa value was 0.380 (0.413 specialist, 0.404 generalist). For the Gongji classification system, the mean kappa value was 0.455 (0.480 specialist, 0.485 generalist).

    Conclusion: Generally speaking, 3D‐CT scans provide richer information that can lead to better agreement for most classification systems of lateral clavicle fractures, highlighting the value of digitization and specialization in diagnosis and treatment. The generalist group exhibited competitive interobserver agreement with the Gongji classification system, suggesting that the Gongji classification is suitable for general trauma surgeons who are not highly experienced in the shoulder field.

    A smartphone‐based survey was designed to verify the difference in therapeutic experience between shoulder specialists and general trauma surgeons and to compare the interobserver agreement of four classification systems for lateral clavicle fractures between these two groups of surgeons. The results showed that, when only X‐ray films were presented, both groups achieved fair agreement. However, when 3D‐CT scan images were provided, agreement improved in the specialist group using the OTA, Jäger/Breitner, and Gongji classification systems, and in the generalist group using the Gongji classification system. The interobserver reliability of the OTA, Neer, and Jäger/Breitner classification systems showed better agreement among shoulder specialists, while agreement between the two groups was competitive for the Gongji classification system, which may suggest that the Gongji classification is easier to grasp for general trauma surgeons who are not highly experienced in the shoulder field.
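    The abstract's agreement statistic, Fleiss' kappa, generalizes Cohen's kappa to more than two raters. A minimal sketch of how such a multi-rater κ is computed follows; the rating matrix is a hypothetical illustration, not the study's data (rows are cases, columns are classification categories, and each cell counts how many raters assigned that case to that category).

    ```python
    def fleiss_kappa(ratings):
        """Fleiss' kappa. ratings[i][j] = number of raters placing
        subject i into category j; each row must sum to the same
        number of raters."""
        n_subjects = len(ratings)
        n_raters = sum(ratings[0])               # raters per subject
        n_categories = len(ratings[0])
        total = n_subjects * n_raters

        # p_j: overall proportion of all assignments in category j
        p = [sum(row[j] for row in ratings) / total
             for j in range(n_categories)]

        # P_i: extent of agreement among raters for subject i
        P = [(sum(c * c for c in row) - n_raters)
             / (n_raters * (n_raters - 1))
             for row in ratings]

        P_bar = sum(P) / n_subjects              # mean observed agreement
        P_e = sum(x * x for x in p)              # expected chance agreement
        return (P_bar - P_e) / (1 - P_e)

    # Hypothetical example: 4 fracture cases, 8 raters, 3 categories
    ratings = [
        [8, 0, 0],   # unanimous classification
        [6, 2, 0],
        [4, 4, 0],   # maximal disagreement between two categories
        [0, 3, 5],
    ]
    print(round(fleiss_kappa(ratings), 3))
    ```

    Values in the 0.21-0.40 band are conventionally read as "fair" agreement and 0.41-0.60 as "moderate" (Landis and Koch), which is the scale behind the abstract's characterization of the reported kappas.
    
    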