Development and Validation of an Assessment Tool for Competency in Critical Care Ultrasound

Paru Patrawalla, Lewis Eisen, Ariel L. Shiloh, Brijen J. Shah, Oleksandr Savenkov, Wendy Wise, Laura Evans, Paul Mayo, Demian Szyld

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

BACKGROUND: Point-of-care ultrasound is an emerging technology in critical care medicine. Despite requirements for critical care medicine fellowship programs to demonstrate knowledge and competency in point-of-care ultrasound, tools to guide competency-based training are lacking.

OBJECTIVE: We describe the development and validity arguments of a competency assessment tool for critical care ultrasound.

METHODS: A modified Delphi method was used to develop behaviorally anchored checklists for 2 ultrasound applications: "Perform deep venous thrombosis study (DVT)" and "Qualify left ventricular function using parasternal long axis and parasternal short axis views (Echo)." One live rater and 1 video rater evaluated performance of 28 fellows. A second video rater evaluated a subset of 10 fellows. Validity evidence for content, response process, and internal consistency was assessed.

RESULTS: An expert panel finalized the checklists after 2 rounds of a modified Delphi method. The DVT checklist consisted of 13 items, including 1 global rating step (GRS). The Echo checklist consisted of 14 items and included 1 GRS for each of the 2 views. Interrater reliability, evaluated with Cohen's kappa between the live and video raters, was 1.00 for the DVT GRS, 0.44 for the parasternal long axis (PSLA) GRS, and 0.58 for the parasternal short axis (PSSA) GRS. Cronbach α was 0.85 for DVT and 0.92 for Echo.

CONCLUSIONS: The findings offer preliminary evidence for the validity of competency assessment tools for 2 applications of critical care ultrasound and data on live versus video raters.
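The reliability statistics the abstract reports (Cohen's kappa for live-versus-video interrater agreement on the GRS, Cronbach α for checklist internal consistency) can be sketched in plain Python. The rating data below are hypothetical, chosen only to illustrate the computations; they are not the study's data.

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / n ** 2          # chance agreement from marginals
    return (po - pe) / (1 - pe)

def cronbach_alpha(items):
    """Cronbach α for internal consistency; items is a list of
    per-item score lists, one score per subject."""
    k = len(items)
    n = len(items[0])
    def var(xs):                                          # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical pass/fail GRS ratings from a live and a video rater
live  = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
video = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(round(cohen_kappa(live, video), 2))

# Hypothetical 0/1 checklist item scores (3 items x 5 fellows)
items = [[1, 1, 0, 1, 0], [1, 0, 0, 1, 0], [1, 1, 0, 1, 1]]
print(round(cronbach_alpha(items), 2))
```

A kappa of 1.00, as observed for the DVT GRS, means the live and video raters agreed on every fellow beyond what chance marginals would predict; values in the 0.4-0.6 range, as for the PSLA and PSSA views, indicate only moderate agreement.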

Original language: English (US)
Pages (from-to): 567-573
Number of pages: 7
Journal: Journal of graduate medical education
Volume: 7
Issue number: 4
DOIs: 10.4300/JGME-D-14-00613.1
State: Published - Dec 1 2015

ASJC Scopus subject areas

  • Medicine (all)

Cite this

Development and Validation of an Assessment Tool for Competency in Critical Care Ultrasound. / Patrawalla, Paru; Eisen, Lewis; Shiloh, Ariel L.; Shah, Brijen J.; Savenkov, Oleksandr; Wise, Wendy; Evans, Laura; Mayo, Paul; Szyld, Demian.

In: Journal of graduate medical education, Vol. 7, No. 4, 01.12.2015, p. 567-573.

Research output: Contribution to journal › Article

Patrawalla, Paru ; Eisen, Lewis ; Shiloh, Ariel L. ; Shah, Brijen J. ; Savenkov, Oleksandr ; Wise, Wendy ; Evans, Laura ; Mayo, Paul ; Szyld, Demian. / Development and Validation of an Assessment Tool for Competency in Critical Care Ultrasound. In: Journal of graduate medical education. 2015 ; Vol. 7, No. 4. pp. 567-573.
@article{2c55fdc20cf446bea5d2f48997414c85,
title = "Development and Validation of an Assessment Tool for Competency in Critical Care Ultrasound",
abstract = "BACKGROUND: Point-of-care ultrasound is an emerging technology in critical care medicine. Despite requirements for critical care medicine fellowship programs to demonstrate knowledge and competency in point-of-care ultrasound, tools to guide competency-based training are lacking. OBJECTIVE: We describe the development and validity arguments of a competency assessment tool for critical care ultrasound. METHODS: A modified Delphi method was used to develop behaviorally anchored checklists for 2 ultrasound applications: {"}Perform deep venous thrombosis study (DVT){"} and {"}Qualify left ventricular function using parasternal long axis and parasternal short axis views (Echo).{"} One live rater and 1 video rater evaluated performance of 28 fellows. A second video rater evaluated a subset of 10 fellows. Validity evidence for content, response process, and internal consistency was assessed. RESULTS: An expert panel finalized the checklists after 2 rounds of a modified Delphi method. The DVT checklist consisted of 13 items, including 1 global rating step (GRS). The Echo checklist consisted of 14 items and included 1 GRS for each of the 2 views. Interrater reliability, evaluated with Cohen's kappa between the live and video raters, was 1.00 for the DVT GRS, 0.44 for the parasternal long axis (PSLA) GRS, and 0.58 for the parasternal short axis (PSSA) GRS. Cronbach α was 0.85 for DVT and 0.92 for Echo. CONCLUSIONS: The findings offer preliminary evidence for the validity of competency assessment tools for 2 applications of critical care ultrasound and data on live versus video raters.",
author = "Paru Patrawalla and Lewis Eisen and Shiloh, {Ariel L.} and Shah, {Brijen J.} and Oleksandr Savenkov and Wendy Wise and Laura Evans and Paul Mayo and Demian Szyld",
year = "2015",
month = "12",
day = "1",
doi = "10.4300/JGME-D-14-00613.1",
language = "English (US)",
volume = "7",
pages = "567--573",
journal = "Journal of graduate medical education",
issn = "1949-8349",
publisher = "Accreditation Council for Graduate Medical Education",
number = "4",

}

TY - JOUR

T1 - Development and Validation of an Assessment Tool for Competency in Critical Care Ultrasound

AU - Patrawalla, Paru

AU - Eisen, Lewis

AU - Shiloh, Ariel L.

AU - Shah, Brijen J.

AU - Savenkov, Oleksandr

AU - Wise, Wendy

AU - Evans, Laura

AU - Mayo, Paul

AU - Szyld, Demian

PY - 2015/12/1

Y1 - 2015/12/1

N2 - BACKGROUND: Point-of-care ultrasound is an emerging technology in critical care medicine. Despite requirements for critical care medicine fellowship programs to demonstrate knowledge and competency in point-of-care ultrasound, tools to guide competency-based training are lacking. OBJECTIVE: We describe the development and validity arguments of a competency assessment tool for critical care ultrasound. METHODS: A modified Delphi method was used to develop behaviorally anchored checklists for 2 ultrasound applications: "Perform deep venous thrombosis study (DVT)" and "Qualify left ventricular function using parasternal long axis and parasternal short axis views (Echo)." One live rater and 1 video rater evaluated performance of 28 fellows. A second video rater evaluated a subset of 10 fellows. Validity evidence for content, response process, and internal consistency was assessed. RESULTS: An expert panel finalized the checklists after 2 rounds of a modified Delphi method. The DVT checklist consisted of 13 items, including 1 global rating step (GRS). The Echo checklist consisted of 14 items and included 1 GRS for each of the 2 views. Interrater reliability, evaluated with Cohen's kappa between the live and video raters, was 1.00 for the DVT GRS, 0.44 for the parasternal long axis (PSLA) GRS, and 0.58 for the parasternal short axis (PSSA) GRS. Cronbach α was 0.85 for DVT and 0.92 for Echo. CONCLUSIONS: The findings offer preliminary evidence for the validity of competency assessment tools for 2 applications of critical care ultrasound and data on live versus video raters.

UR - http://www.scopus.com/inward/record.url?scp=85017250212&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85017250212&partnerID=8YFLogxK

U2 - 10.4300/JGME-D-14-00613.1

DO - 10.4300/JGME-D-14-00613.1

M3 - Article

C2 - 26692968

AN - SCOPUS:85017250212

VL - 7

SP - 567

EP - 573

JO - Journal of graduate medical education

JF - Journal of graduate medical education

SN - 1949-8349

IS - 4

ER -