
Table 6 Cohen’s kappa was used to quantify the interrater agreement between the radiologist readers

From: Practical application and validation of the 2018 ATS/ERS/JRS/ALAT and Fleischner Society guidelines for the diagnosis of idiopathic pulmonary fibrosis

HRCT readers    Overall agreement (kappa)
R1-R2           0.33 (0.20, 0.45)
R1-R3           0.81 (0.72, 0.90)
R2-R3           0.32 (0.20, 0.44)

  1. To quantify the uncertainty of these estimates, bootstrapped 95% confidence intervals were computed.
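The kappa statistics and percentile-bootstrap confidence intervals of the kind reported above can be computed as in the following sketch. This is an illustrative implementation, not the authors' code: the function names, the toy rating labels, and the choice of 2000 resamples with a percentile interval are assumptions for demonstration.

```python
import random
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two raters' paired categorical labels."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n       # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n ** 2  # chance agreement
    if pe == 1:
        return 1.0  # degenerate resample: both raters used a single category
    return (po - pe) / (1 - pe)

def bootstrap_ci(a, b, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for kappa: resample rating pairs with replacement."""
    rng = random.Random(seed)
    n = len(a)
    stats = sorted(
        cohens_kappa([a[i] for i in idx], [b[i] for i in idx])
        for idx in ([rng.randrange(n) for _ in range(n)] for _ in range(n_boot))
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical paired HRCT pattern reads from two readers (labels are illustrative)
r1 = ["UIP", "probable", "indeterminate", "UIP", "alternative", "UIP", "probable", "UIP"]
r2 = ["UIP", "probable", "UIP", "UIP", "alternative", "probable", "probable", "UIP"]
k = cohens_kappa(r1, r2)
lo, hi = bootstrap_ci(r1, r2)
print(f"kappa = {k:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

Resampling whole rating pairs (rather than each reader independently) preserves the pairing between readers, which is what makes the bootstrap interval valid for an agreement statistic.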