TY - JOUR
T1 - Reliability and validity of a mobile phone for radiographic assessment of ankle injuries
T2 - A randomized inter- and intraobserver agreement study
AU - Tennant, Joshua N.
AU - Shankar, Viswanathan
AU - Dirschl, Douglas R.
N1 - Funding Information:
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This study was funded by a Small Student Grant from the University of North Carolina at Chapel Hill Injury Prevention Research Center.
PY - 2013/2
Y1 - 2013/2
N2 - Background: Current mobile phone technology may allow orthopaedic surgeons to make clinical decisions using radiographs viewed on a small mobile device screen. The purpose of this study was to examine the reliability and validity of interpreting ankle fracture images viewed on a mobile device and a computer monitor, with a hypothesis that the agreement in clinical decision making between the mobile device and computer monitor would be high. Methods: A randomized interobserver and intraobserver reliability study was conducted in which 16 mortise and lateral ankle images representing a severity spectrum of malleolar ankle, plafond, and extra-articular tibial fractures were shown to volunteer orthopaedic surgeons on both an Apple fourth-generation iPod Touch and a 23-inch liquid crystal display (LCD) computer monitor. Participants answered a multiple-choice questionnaire for each image regarding diagnosis, severity, need for higher level imaging, need for acute inpatient versus outpatient management, and plan of treatment. Inter- and intraobserver reliability was assessed by kappa (κ), multirater kappa statistics, and intraclass correlation coefficient (ICC). Results: Ninety-three orthopaedic surgeon volunteers completed the study. Excellent intraobserver agreement (κ ≥ 0.8) was found for all variables measured, including diagnosis (median κ = 0.84), need for computed tomography scan (κ = 0.86), need for reduction (κ = 0.82), treatment setting (κ = 0.82), and treatment type (κ = 0.87). Interobserver agreement was consistent between the mobile device and computer screen. Interobserver agreement for the severity assessment had a slightly higher ICC for the mobile device compared with the computer monitor (ICC = 0.83 vs 0.79). Sixty-seven percent (62/93) said at the completion of the study they were "completely" or "very" comfortable using a mobile device as a primary viewing device for new emergency room, inpatient, or transfer request consults. 
Conclusions: Strong reliability for radiographic assessment of ankle injuries existed between a 23-inch computer monitor and a handheld mobile device. Further study is warranted to validate the technology to apply to other anatomic locations and imaging modalities. Level of Evidence: Level II, diagnostic study.
KW - Agreement
KW - Ankle
KW - Fracture
KW - Mobile device
UR - http://www.scopus.com/inward/record.url?scp=84875898898&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84875898898&partnerID=8YFLogxK
U2 - 10.1177/1071100712466849
DO - 10.1177/1071100712466849
M3 - Article
C2 - 23413062
AN - SCOPUS:84875898898
SN - 1071-1007
VL - 34
SP - 228
EP - 233
JO - Foot and Ankle International
JF - Foot and Ankle International
IS - 2
ER -