inter-rater reliability

(redirected from Inter-rater agreement)

The property of scales yielding equivalent results when used by different raters on different occasions.
Segen's Medical Dictionary. © 2012 Farlex, Inc. All rights reserved.
References in periodicals archive
Our study also supported this subjectivity: there was fair (κ = 0.291) and moderate (κ = 0.523) inter-rater agreement for flattened posterior globe/sclera and tortuosity of the optic nerve, respectively.
We will calculate a kappa statistic for inter-rater agreement; results will be presented in figures and tables.
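The kappa statistic mentioned in these excerpts is typically Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e the agreement expected by chance from each rater's marginal label frequencies. A minimal Python sketch of that calculation for two raters; the function name and example data are illustrative, not taken from any cited study:

    import numpy as np

    def cohens_kappa(rater_a, rater_b):
        # Cohen's kappa for two raters labelling the same set of items.
        a, b = np.asarray(rater_a), np.asarray(rater_b)
        categories = np.union1d(a, b)
        # Observed agreement: proportion of items both raters labelled identically.
        p_o = np.mean(a == b)
        # Chance agreement: product of the raters' marginal frequencies per category.
        p_e = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical example: two raters scoring ten scans as 0 (absent) or 1 (present).
    print(cohens_kappa([0, 1, 1, 0, 1, 0, 1, 1, 0, 1],
                       [0, 1, 0, 0, 1, 0, 1, 1, 1, 1]))  # ~0.58

On the commonly used Landis and Koch interpretation, values such as the 0.291 and 0.523 reported in the first excerpt fall in the "fair" (0.21-0.40) and "moderate" (0.41-0.60) bands, respectively.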
Inter-rater agreement for this primary outcome measure was ultimately moderate.
Third, inter-rater agreement and reliability for all six personality traits and for hireability were consistent with past studies that used social media (SM) technologies other than Twitter, demonstrating the reliability of using Twitter to predict personality and hireability.
Our quality of care implicit review instrument had excellent internal consistency, moderate inter-rater reliability, and high inter-rater agreement when applied to a diverse cohort of acutely ill and injured children receiving care in a large sample of EDs.
The modified Wilson scale, a variant of the Ramsay [9] and Wilson [10] scales, is simple to use, with an inter-rater agreement of 84%.
The degree of inter-rater agreement between parents and children can vary widely, but has been found to correlate better for observable physical domains than for nonobservable emotional domains (Matza et al., 2004).
The intraclass correlation coefficient (ICC) was calculated to determine the level of inter-rater agreement; higher ICC values indicate greater agreement between raters.
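The excerpt does not specify which ICC variant was used; one common form is the one-way random-effects ICC(1), computed from the between-subject and within-subject mean squares of a one-way ANOVA. A minimal sketch assuming that form, with an illustrative function name and data:

    import numpy as np

    def icc_oneway(ratings):
        # ICC(1): one-way random-effects intraclass correlation.
        # ratings: 2-D array, rows = subjects, columns = raters.
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape
        row_means = x.mean(axis=1)
        # Between-subject and within-subject mean squares.
        ms_between = k * np.sum((row_means - x.mean()) ** 2) / (n - 1)
        ms_within = np.sum((x - row_means[:, None]) ** 2) / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    # Hypothetical example: five subjects each scored by three raters.
    print(icc_oneway([[9, 8, 9], [6, 5, 6], [8, 8, 7], [4, 5, 4], [7, 6, 7]]))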
The item content validity index (I-CVI) and scale content validity index (S-CVI) were calculated by the mean approach and inter-rater agreement. The scale was revised based on the comments from a team of five experts, after which it was evaluated by an additional group of four experts.
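For reference, the I-CVI is conventionally the proportion of experts rating an item 3 or 4 on a 4-point relevance scale, and the "mean approach" yields the S-CVI/Ave, the mean of the item-level I-CVIs. A minimal sketch assuming that convention; the function name and ratings are illustrative:

    def content_validity(item_ratings, relevant=(3, 4)):
        # item_ratings: one list per item, holding each expert's 1-4 relevance score.
        # I-CVI: proportion of experts rating the item relevant (3 or 4).
        i_cvi = [sum(r in relevant for r in item) / len(item) for item in item_ratings]
        # S-CVI/Ave ("mean approach"): average of the item-level I-CVIs.
        s_cvi_ave = sum(i_cvi) / len(i_cvi)
        return i_cvi, s_cvi_ave

    # Hypothetical example: four experts rate three items.
    print(content_validity([[4, 3, 4, 4], [3, 4, 2, 4], [4, 4, 4, 3]]))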
Tadi et al., "Generalized periodic discharges and 'triphasic waves': a blinded evaluation of inter-rater agreement and clinical significance," Clinical Neurophysiology, vol.
In spite of our best attempts to reduce subjectivity in the "Predatory Versus Non-Predatory Journal Rating Form," we found it difficult to reach close inter-rater agreement on several criteria across several articles, and this was reflected in the marginal inter-rater reliability measures we obtained for criteria three, four, and five.