Help Forum

Forum (Archive)

This forum is kept largely for historical reasons and for announcements of our latest changes. (It was focused on the older EPPI Reviewer version 4.)

There are many informative posts and answers to common questions here, but if you are an EPPI Reviewer WEB user you may find our videos and other resources more helpful.

You can search the forum for existing answers. If you have questions or require support, please email eppisupport@ucl.ac.uk.


Inter-rater Reliability
16/06/2016 16:26
 

Is there a way to calculate inter-rater reliability in EPPI Reviewer for include vs. exclude AND for the exclusion reasons (we have about 8 reasons to exclude an article, and a reviewer can select multiple options)? How would I go about doing this?

 
16/06/2016 17:05
 

Hello Cara,

The answer depends on what you mean by inter-rater reliability. If you are looking for percentages of agreement and disagreement, you should be able to get that when you run a comparison. If you have used the screening codeset type, you will get the data both for include vs. exclude and for full disagreement (for example, where two people disagree on the reason for exclusion).

If you are looking for true kappa statistics, then you are talking about Cohen's kappa or possibly Fleiss's kappa (used when there are more than two raters, which you might have if you are triple screening). We don't have a function to calculate the kappa score, but if necessary we could retrieve the data needed so you can calculate it yourself.

Best regards,
Jeff
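
For anyone who does export the screening decisions and wants to calculate Cohen's kappa themselves: for two raters it is defined as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from each rater's marginal totals. Below is a minimal sketch in Python; the decision labels and data are purely illustrative and are not tied to any EPPI Reviewer export format.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters coding the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement derived from the marginal totals.
    """
    assert len(rater_a) == len(rater_b), "raters must code the same items"
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over categories of the product of marginals.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Illustrative include/exclude decisions for ten items (hypothetical data).
a = ["include", "exclude", "exclude", "include", "exclude",
     "include", "exclude", "exclude", "include", "exclude"]
b = ["include", "exclude", "include", "include", "exclude",
     "include", "exclude", "exclude", "exclude", "exclude"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 8/10 observed agreement -> ~0.58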

 

