interrater agreement

(int′ĕr-rāt′ĕr)
The extent to which two or more individuals come to the same conclusions when they examine a single set of data.
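By way of illustration, the simplest measure of interrater agreement is percent agreement: the proportion of items on which two raters assign the same category. The sketch below is a minimal Python example; the rater labels and values are hypothetical and chosen only to show the arithmetic.

```python
# Minimal sketch: simple percent agreement between two raters.
# The ratings below are hypothetical, for illustration only.

rater_a = ["pass", "pass", "fail", "pass", "fail", "pass"]
rater_b = ["pass", "fail", "fail", "pass", "fail", "pass"]

# Count the items on which both raters assigned the same category.
matches = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = matches / len(rater_a)

print(f"Percent agreement: {percent_agreement:.0%}")  # 83% for these data
```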
References in periodicals archive
Data collection for the two students began as soon as interrater agreement of 100% was achieved for three consecutive trials with the first author.
In the second phase (assessment of variability), recordings of paired automated and manual assessments were made to analyze variability and interrater agreement for pupil size, detection of anisocoria, and PLR between 2 independent observers.
Prior to implementation of data collection, the TBOM and the SSIBOM were reviewed for clarity and face validity by experts in the field and were piloted to examine interrater agreement.
Interrater agreement on subgingival calculus detection following scaling.
Cohen's Kappa test was also used to evaluate interrater agreement for qualitative (categorical) items.
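Cohen's kappa corrects raw agreement for the agreement expected by chance: κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e is the chance agreement implied by each rater's marginal category proportions. The following is a hedged sketch of that formula, not the procedure used in the cited study; the two rating vectors are hypothetical.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same categorical items."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)

    # Observed agreement: proportion of items rated identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement: product of the two raters' marginal proportions,
    # summed over categories.
    marg_a = Counter(ratings_a)
    marg_b = Counter(ratings_b)
    p_e = sum((marg_a[c] / n) * (marg_b[c] / n) for c in categories)

    return (p_o - p_e) / (1 - p_e)

# Hypothetical categorical ratings from two raters.
a = ["yes", "yes", "no", "no", "yes", "no", "yes", "no"]
b = ["yes", "no",  "no", "no", "yes", "no", "yes", "yes"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.50 for these data
```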
Other authors have suggested that a significantly higher number of repeated scans is needed before an ultrasound-naive nurse can achieve good interrater agreement with an expert.
Nine of these 19 had an interrater agreement greater than 0.6 using the Cohen κ (Figure 7, a) versus 12 using the Gwet AC1 (Figure 7, b).
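Gwet's AC1 is an alternative chance-corrected agreement coefficient that is less sensitive than kappa to skewed category prevalence. For two raters it keeps kappa's general form but replaces the chance term with p_e = (1/(Q−1)) Σ_q π_q(1 − π_q), where π_q is the mean of the two raters' marginal proportions for category q and Q is the number of categories. The sketch below illustrates that formula on made-up, heavily skewed data; it is not the analysis from the cited study.

```python
from collections import Counter

def gwet_ac1(ratings_a, ratings_b):
    """Gwet's AC1 for two raters over the same categorical items."""
    n = len(ratings_a)
    categories = sorted(set(ratings_a) | set(ratings_b))
    q = len(categories)

    # Observed agreement, as for kappa.
    p_a = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement based on the mean marginal proportion per category.
    marg_a = Counter(ratings_a)
    marg_b = Counter(ratings_b)
    pi = {c: (marg_a[c] + marg_b[c]) / (2 * n) for c in categories}
    p_e = sum(pi[c] * (1 - pi[c]) for c in categories) / (q - 1)

    return (p_a - p_e) / (1 - p_e)

# Hypothetical skewed data: "absent" dominates. Percent agreement is 0.90;
# Cohen's kappa is about 0.44 here, while AC1 is about 0.88.
a = ["absent"] * 18 + ["present", "present"]
b = ["absent"] * 17 + ["present", "absent", "present"]
print(f"AC1 = {gwet_ac1(a, b):.2f}")
```

The contrast in the comment illustrates why the two coefficients can lead to different counts of raters clearing a 0.6 threshold, as in the excerpt above.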
An unbiased estimate of global interrater agreement. Educational and Psychological Measurement, 13164416654740.
The interrater agreement on the classification of potential eligible extrasystoles was "moderate" to "strong" by common standards [17, 18].
To determine interrater agreement and accuracy of the triage, 12 nurses working at the ED independently triaged 10 different written case scenarios.
The authors report adequate levels of interrater agreement for the CPRSR (r = .71, p = .001) and acceptable levels of concurrent criterion-related validity, in that scores have been found to be significantly correlated (r = .64) with scores on the Beck Scale for Suicidal Ideation (Beck & Steer, 1991).