The authors report adequate levels of interrater agreement for the CPRSR (r = .
The first author compared 20% of each scorer's writing samples with her own and calculated interrater agreement (number of agreements divided by agreements plus disagreements, multiplied by 100).
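That percent-agreement formula is simple to reproduce in code. The sketch below is a minimal illustration, assuming the two scorers' codes are available as parallel lists; the function and variable names are hypothetical, not from the cited study.

```python
def percent_agreement(rater_a, rater_b):
    """Percent agreement: agreements / (agreements + disagreements) * 100."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same number of samples")
    agreements = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * agreements / len(rater_a)

# Illustrative data: 18 of 20 writing samples scored identically -> 90%
scorer = ["pass"] * 18 + ["pass", "fail"]
author = ["pass"] * 18 + ["fail", "pass"]
print(percent_agreement(scorer, author))  # 90.0
```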
The initial interrater agreement was 90%; after discussion, 100% agreement was reached.
The analysis of interrater agreement is available in the "Occupational Requirements Survey job observation report" at http://www.
Interrater agreement and internal consistency are high (κ = 0.
Therefore, human judgement bias can arise when assessing interrater agreement if specific procedures to characterize and classify the scanpaths are not developed.
Ratings of interrater agreement over the study period were consistent, ranging from 0.
Cronbach's alpha, inter-item and item-total correlations, and interrater agreement were used to evaluate the reliability of the tool.
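Of the statistics listed in that sentence, Cronbach's alpha has a compact closed form: α = k/(k−1) × (1 − Σ item variances / variance of total scores). The sketch below is a generic implementation, assuming item scores arranged as a respondents × items matrix; the example data are illustrative, not from the tool being evaluated.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total scores))
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative 5-respondent, 4-item data
data = [[3, 4, 3, 4],
        [5, 5, 4, 5],
        [2, 2, 3, 2],
        [4, 3, 4, 4],
        [1, 2, 1, 2]]
print(round(cronbach_alpha(data), 3))
```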
According to Whitehurst (1984), "There are many reasons to be concerned with interrater agreement in peer review, not the least of which is a prevalent impression that the fate of a manuscript .
Interrater Reliability (IRR) and Interrater Agreement (IRA) indices are often used to justify aggregating data used in composition models (LeBreton & Senter, 2008).
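One widely used IRA index in that literature is rWG (James, Demaree, & Wolf), which compares the observed variance of a group's ratings with the variance expected under a null response distribution, typically a uniform distribution over the scale points. The sketch below is an illustrative implementation of the single-item rWG(1), not code from LeBreton and Senter.

```python
def r_wg(ratings, scale_points):
    """rWG(1): interrater agreement for one item rated by one group.

    r_wg = 1 - s^2 / sigma_EU^2, where sigma_EU^2 = (A^2 - 1) / 12
    is the variance of a uniform null distribution over A scale points.
    """
    n = len(ratings)
    mean = sum(ratings) / n
    s2 = sum((x - mean) ** 2 for x in ratings) / (n - 1)  # observed variance
    sigma2_eu = (scale_points ** 2 - 1) / 12              # uniform-null variance
    return 1 - s2 / sigma2_eu

# Five raters on a 5-point scale, clustered around 4 -> high agreement
print(round(r_wg([4, 4, 5, 4, 3], scale_points=5), 2))  # 0.75
```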
Interrater agreement between biopsy-based diagnoses and EMR-based diagnoses was determined by using the kappa statistic.
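The kappa statistic corrects raw agreement for agreement expected by chance: κ = (p_o − p_e) / (1 − p_e). The sketch below assumes the two-rater (Cohen's) form; the diagnosis labels and patient data are hypothetical, not from the cited study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: sum over categories of the product of marginal proportions
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical biopsy vs. EMR diagnoses for ten patients
biopsy = ["malignant", "benign", "benign", "malignant", "benign",
          "benign", "malignant", "benign", "benign", "malignant"]
emr = ["malignant", "benign", "malignant", "malignant", "benign",
       "benign", "benign", "benign", "benign", "malignant"]
print(round(cohens_kappa(biopsy, emr), 2))  # 0.57
```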
Assessing risk of bias in prevalence studies: modification of an existing tool and evidence of interrater agreement