Abstract

Interobserver reliability in obtaining neurologic histories and examinations was investigated among neurologists collaborating in the Stroke Data Bank (SDB). Seventeen in-hospital stroke patients were examined by six neurologists experienced in stroke over the course of three days. Patients were examined twice a day for two successive days, with each patient seen by four different neurologists. Data were recorded on SDB forms, according to definitions and procedures established for the SDB. Percent agreement and kappa coefficients were calculated to assess the levels of agreement for each item. Important differences in levels of agreement were found among items on both neurologic history and examination. Agreement among neurologists was higher for neurologic examination than for history. Patterns of agreement for items with low prevalence or with numerous unknown ratings are discussed. Improvement in interobserver agreement due to data editing for intra-observer consistency was shown.
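The agreement statistic named in the abstract is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal sketch of the two-rater computation follows; the example ratings are hypothetical and are not Stroke Data Bank data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items rated identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence, from the raters' marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two neurologists rating the same finding in 10 patients.
a = ["present", "present", "absent", "absent", "present",
     "absent", "present", "absent", "absent", "absent"]
b = ["present", "absent", "absent", "absent", "present",
     "absent", "present", "absent", "present", "absent"]
print(round(cohens_kappa(a, b), 3))  # → 0.583
```

Here percent agreement is 0.80, but kappa is only 0.58 because the raters' marginal frequencies make substantial chance agreement likely; this gap is why the study reports kappa alongside percent agreement, especially for low-prevalence items.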

Keywords

Kappa, Stroke (engine), Medicine, Cohen's kappa, Neurological examination, Inter-rater reliability, Physical therapy, Consistency (knowledge bases), Physical medicine and rehabilitation, Pediatrics, Psychology, Psychiatry, Statistics, Developmental psychology, Rating scale

Publication Info

Year: 1985
Type: article
Volume: 42
Issue: 6
Pages: 557-565
Citations: 125 (OpenAlex)
Access: Closed

Cite This

David Shinar, Cynthia R. Gross, J.P. Mohr et al. (1985). Interobserver Variability in the Assessment of Neurologic History and Examination in the Stroke Data Bank. Archives of Neurology, 42(6), 557-565. https://doi.org/10.1001/archneur.1985.04060060059010

Identifiers

DOI
10.1001/archneur.1985.04060060059010