Help Forum

Forum (Archive)

This forum is kept largely for historical reasons and for our latest-changes announcements. (It was focused on the older EPPI Reviewer version 4.)

There are many informative posts and answers to common questions here, but if you are an EPPI Reviewer WEB user you may find our videos and other resources more useful.

If you have questions or require support, please email eppisupport@ucl.ac.uk.


Kappa Statistics
15/06/2011 16:27
 

 Hi,

We would like the software to keep a note of all the disagreements in our group, since we would like to calculate kappa statistics for level 1 screening at the end.

My question is: will the system keep the information about the number of disagreements in each group, or will that information be lost on reconciliation?

Monali

 
15/06/2011 16:53
Accepted Answer 

Hello Monali,

At this time we do not have a function for calculating kappa (although it is something we would like to add), but you can use the data in the Collaborate tab to calculate it manually.

If you run 'Create comparison' after you have finished your double coding, but before you start 'completing' your agreements and reconciling your disagreements, you will have a static record of each person's coding in the comparison. Each time you run 'Create comparison' you create a static record of how the coding looked at the time the comparison was created.

If you click on 'View' in the stats column you will see the number of agreements and disagreements. If you run a 'Quick report', the individual coding of each person in the comparison is preserved from the time the comparison was generated.

The important thing to remember is to run your comparison before you start sorting out your agreements and disagreements. Also be sure that you have set up the comparison correctly (i.e. you are comparing the correct items against the correct codeset using the correct coders).
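For reference, here is a minimal sketch of the standard Cohen's kappa calculation for two screeners, using agreement and disagreement counts like those in the comparison stats. The counts and the function are illustrative only, not an EPPI-Reviewer feature:

def cohens_kappa(in_in, in_ex, ex_in, ex_ex):
    # 2x2 table of hypothetical counts: the first index is coder A's decision,
    # the second is coder B's (e.g. in_ex = A said include, B said exclude).
    n = in_in + in_ex + ex_in + ex_ex
    p_observed = (in_in + ex_ex) / n    # proportion of agreements
    p_a_in = (in_in + in_ex) / n        # coder A's include rate
    p_b_in = (in_in + ex_in) / n        # coder B's include rate
    # Agreement expected by chance, from each coder's marginal rates.
    p_expected = p_a_in * p_b_in + (1 - p_a_in) * (1 - p_b_in)
    return (p_observed - p_expected) / (1 - p_expected)

# e.g. 40 both-include, 5 + 7 split decisions, 148 both-exclude
print(round(cohens_kappa(40, 5, 7, 148), 3))  # -> 0.831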

Best regards,

Jeff

 
28/05/2012 11:41
 

Just to clarify - once you start reconciliation, there's no way to recover information on the individual codings?

thanks

Theo

 
28/05/2012 12:15
 

Hello Theo,

The system keeps a record of who applied which codes to which items, so there are a few different ways to see this information depending on what you require.

When you run 'Create comparison' in the Collaborate tab, a snapshot is made of how each item in the comparison is coded and by whom. If you then run a quick report on that comparison, you can see a table showing what codes each person in the comparison assigned to the item(s). The comparison is like a snapshot in time, so if you run it before you start reconciling differences you will have a permanent record of the individual codings. This method of comparison is often used when your codeset is a single hierarchy (such as a screening codeset).

If you have a more detailed codeset, such as a data extraction tool, you will want to compare the individual codings in the 'Coding record' tab, which you can find in the 'Document details' window. In this tab you first select the individuals to compare and the codeset to use. You then click on 'Run comparison' and a report will be generated for that item showing how each person coded it. This method can be used on any codeset, including screening codesets. If you don't want to run a comparison, you can look at an individual's coding by clicking on the 'View' button in the appropriate row (in the 'Coding record' tab).

Best regards,

Jeff

 

 
28/05/2012 13:15
 

Thanks Jeff. What I'd like to do at the moment is get inter-rater reliability statistics for codings (just the screening, so the first of your two options above) which have all been finalized already. So I'm looking again at the 'Create comparison' snapshots as suggested. This is a little tricky, firstly because the initial process was messy and took place in multiple stages, and it's hard to make out where there are overlaps etc., but that's manageable. The bigger issue is that I'd like to get inter-rater reliability for IN vs EX - i.e. I'm not interested in disagreements where different reviewers applied different EX codes. Using 'Create comparison' only gives me the raw numbers of agreements vs disagreements - is there any way to break this down?

 
28/05/2012 14:27
 

Hello Theo,

There isn't a way in the software to break the comparison down into codes that indicated include vs those that indicated exclude (when comparing each coder).

You might be able to work this out if you have the required comparisons in the Collaborate tab. In that case you could generate a 'quick report', copy and paste the report into Excel, and then use Excel formulas to count the IN and EX code disagreements.

It might be possible for us to generate a script to determine this information, but if the screening took place over multiple stages and multiple codesets then it becomes a bit more complicated.
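If the Excel route becomes unwieldy, the same counting can be scripted. A minimal sketch, assuming the quick report has been saved as a CSV with hypothetical column names coder1_code and coder2_code holding each coder's raw code, and assuming include codes start with 'IN' and exclude codes with 'EX' (your naming may differ):

import csv

def to_in_ex(code):
    # Collapse any exclude code (e.g. 'EX - wrong population') to plain 'EX'.
    return 'IN' if code.strip().upper().startswith('IN') else 'EX'

counts = {('IN', 'IN'): 0, ('IN', 'EX'): 0, ('EX', 'IN'): 0, ('EX', 'EX'): 0}
with open('comparison.csv', newline='') as f:
    for row in csv.DictReader(f):
        counts[(to_in_ex(row['coder1_code']), to_in_ex(row['coder2_code']))] += 1

# Only the ('IN','EX') and ('EX','IN') cells are genuine IN vs EX disagreements.
print(counts)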

Best regards,

Jeff

 
28/05/2012 14:58
 

Good thinking; will do it in Excel.

To be honest, at later stages of this review, rather than using Collaborate I created a separate code set for each reviewer, plus another one for the finalised codes. So to get the required info for those code sets, I can just run a crosstab. (The reconciliation process was more straightforward as well.) In future I'll be doing this at the outset and avoiding Collaborate altogether.
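The crosstab route maps directly onto the kappa calculation sketched earlier in this thread: the four cells of an IN vs EX crosstab are exactly the four counts the formula needs. A toy illustration, with hypothetical per-item decisions:

from collections import Counter

# Hypothetical per-item IN/EX decisions, one list per reviewer's code set.
reviewer_a = ['IN', 'EX', 'IN', 'EX', 'EX']
reviewer_b = ['IN', 'EX', 'EX', 'EX', 'EX']

crosstab = Counter(zip(reviewer_a, reviewer_b))
print(crosstab)  # the four cells feed straight into cohens_kappa above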

 
28/05/2012 17:07
 

Hello Theo,

I can see where each person having their own coding tool might be a good solution in certain situations, but the work assignments and comparison reports in the Collaborate tab will provide you with more functionality in most cases.

We have tried to keep EPPI-Reviewer very flexible and users are free to experiment with different methodologies to find what works best for them.

Best regards,

Jeff

 

 