Interrater reliability to assure valid content in peer review of CME-accredited presentations

Mark Quigg, Fred A. Lado

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Introduction: The Accreditation Council for Continuing Medical Education (ACCME) provides guidelines for continuing medical education (CME) materials to mitigate problems in the independence or validity of content in certified activities; however, the process of peer review of materials appears largely unstudied, and the reproducibility of peer-review audits for ACCME accreditation and designation of American Medical Association Category 1 Credit™ is unknown. Methods: Categories of presentation defects were constructed from discussions of the CME committee of the American Epilepsy Society: (1) insufficient citation, (2) poor formatting, (3) nonacknowledgment of non-FDA-approved use, (4) misapplied data, (5) 1-sided data, (6) self- or institutional promotion, (7) conflict of interest/commercial bias, (8) other, or (9) no defect. A PowerPoint lecture (n = 29 slides) suitable for presentation to general neurologists was purposefully created with the above defects. A multirater, multilevel kappa statistic was determined from the number and category of defects. Results: Of 14 reviewers, 12 returned completed surveys (86%), identifying a mean ± standard deviation of 1.6 ± 1.1 defects/slide. The interrater kappa equaled 0.115 (poor reliability) for number of defects/slide. No individual category achieved kappa > 0.38. Discussion: Interrater reliability in the rating of durable materials used in subspecialty CME was poor. Guidelines for CME-appropriate content are too subjective to be applied reliably by raters knowledgeable in their specialty field but relatively untrained in the specifics of CME requirements. The process of peer review of CME materials would be aided by education of physicians on validation of materials appropriate for CME.
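
Illustrative note: the abstract does not specify how the "multirater, multilevel kappa statistic" was computed. As a rough sketch of how agreement among 12 reviewers on per-slide defect categories could be quantified, the Python snippet below computes a standard Fleiss' kappa over a synthetic ratings matrix. The synthetic data, the one-category-per-slide assumption, and the use of Fleiss' formulation rather than the authors' exact multilevel procedure are all assumptions for illustration only, not a reproduction of the paper's analysis.

# A minimal sketch, assuming each of 12 reviewers assigns exactly one of the
# 9 defect categories to each of 29 slides. Ratings here are random, so the
# resulting kappa should sit near 0 (chance-level agreement).
import numpy as np

def fleiss_kappa(counts: np.ndarray) -> float:
    """counts[i, j] = number of raters who assigned category j to item i."""
    n_items, n_cats = counts.shape
    n_raters = counts[0].sum()                      # assumes all items rated by all raters
    p_j = counts.sum(axis=0) / (n_items * n_raters) # overall category proportions
    p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()                              # mean observed agreement per item
    p_e = np.square(p_j).sum()                      # expected agreement by chance
    return (p_bar - p_e) / (1.0 - p_e)

# Hypothetical example: 29 slides x 12 raters, categories coded 0-8.
rng = np.random.default_rng(0)
ratings = rng.integers(0, 9, size=(29, 12))
counts = np.stack([np.bincount(row, minlength=9) for row in ratings])
print(f"Fleiss' kappa = {fleiss_kappa(counts):.3f}")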

Original language: English (US)
Pages (from-to): 242-245
Number of pages: 4
Journal: Journal of Continuing Education in the Health Professions
ISSN: 0894-1912
Publisher: John Wiley and Sons Inc.
Volume: 29
Issue number: 4
DOIs: 10.1002/chp.20042
State: Published - 2009


Keywords

  • Accreditation
  • Content
  • Continuing
  • Credit
  • Education
  • Interrater reliability
  • Medical
  • Neurology

ASJC Scopus subject areas

  • Education
  • Medicine(all)

Cite this

Quigg, M., & Lado, F. A. (2009). Interrater reliability to assure valid content in peer review of CME-accredited presentations. Journal of Continuing Education in the Health Professions, 29(4), 242-245. https://doi.org/10.1002/chp.20042
