Help Forum

Forum (Archive)

This forum is kept largely for historic reasons and for our latest changes announcements. (It focused on our older EPPI Reviewer Version 4.)

There are many informative posts and answers to common questions, but you may find our videos and other resources more useful if you are an EPPI Reviewer WEB user.

Click here to search the forum. If you do have questions or require support, please email eppisupport@ucl.ac.uk.

24/11/2011 15:45
 

We have made a change in the way comparisons are calculated in the Collaborate tab.

The original calculation of agreements and disagreements between two coders is based on a comparison of the first coder against the second coder. The identity of the first coder is determined by how the comparison is created when ‘Create comparison’ is run. The system looks at the codes the first coder has selected and determines whether the second coder has selected the same codes. It is a one-way comparison, as coder 1 is the basis of the comparison. If the second coder has selected all of the codes the first coder selected, it is called an agreement. If the second coder has not selected one of the codes the first coder selected, it is called a disagreement.
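To make the original one-way rule concrete, here is a minimal sketch (the function name is hypothetical, not EPPI Reviewer's actual implementation), treating each coder's selections as a set of codes:

```python
def one_way_agreement(coder1_codes, coder2_codes):
    """Old method: coder 1 is the control.

    It is an agreement if coder 2 selected every code that coder 1
    selected; any extra codes chosen only by coder 2 are ignored.
    """
    return set(coder1_codes) <= set(coder2_codes)


# Coder 2 matched everything coder 1 picked, plus an extra code:
print(one_way_agreement({"A", "B"}, {"A", "B", "C"}))  # True (agreement)

# Swap the roles and the same pair of coders becomes a disagreement:
print(one_way_agreement({"A", "B", "C"}, {"A", "B"}))  # False (disagreement)
```

Note how the result depends entirely on which coder is treated as coder 1, which is exactly the sensitivity to how the comparison is set up that the post describes.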
 
This method of comparison is dependent on how ‘Create comparison’ is run, as the first coder is the control. This approach has both advantages and disadvantages. The original expectation was that reviewers would use the comparison functions when double screening. It is well suited to this type of activity, as screening is normally an include/exclude scenario, which lends itself well to identifying agreements and disagreements.
 
A possible issue arises with this comparison method when it is used against a coding tool where multiple selections are expected to be made. An example would be a keywording coding tool where an item is being categorised. Coder 1 might select a number of appropriate codes and coder 2 might select a number of codes. The correct response might be a combination of what coder 1 and coder 2 have selected. If coder 2 has matched all of coder 1’s selections but also selected further codes, it would still be considered an agreement under the present comparison method, as coder 1 is the control. If the reviewer ‘completes’ the agreements and selects coder 1’s responses as the agreed version, then the extra selections of coder 2 are ignored. Those extra selections might be just as valid as the ones coder 1 and coder 2 agreed on.
 
We have found that the comparison functions are being used for much more than screening studies, so we have decided to amend our comparison method. It will no longer be dependent on how the comparison is set up and will run the comparison in both directions. This means that neither coder 1 nor coder 2 will be the control. If one coder has selected more codes than the other, even though there might be a one-to-one match in the other direction, it will be called a disagreement. In the past this could have been an agreement, depending on how the comparison was set up.
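The new two-way rule can be sketched the same way (again with a hypothetical function name): checking in both directions is equivalent to requiring the two sets of selected codes to be identical, so the result no longer depends on which coder the comparison was created from.

```python
def symmetric_agreement(coder1_codes, coder2_codes):
    """New method: compare in both directions; neither coder is the control.

    It is an agreement only if both coders selected exactly the same
    codes, so extra selections by either coder count as a disagreement.
    """
    return set(coder1_codes) == set(coder2_codes)


# The extra code "C" now produces a disagreement either way round:
print(symmetric_agreement({"A", "B"}, {"A", "B", "C"}))  # False
print(symmetric_agreement({"A", "B", "C"}, {"A", "B"}))  # False

# Identical selections remain an agreement:
print(symmetric_agreement({"A", "B"}, {"B", "A"}))  # True
```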
 
The effect of this change is that the user may now have more disagreements to resolve when a comparison is made. Although this might mean more work for some, it is a safer approach, as it avoids the situation where the same coding could count as either an agreement or a disagreement depending on how the comparison was originally set up.

As for existing reviews, we have identified the comparisons that might be affected by this change, and in most cases it is not an issue. The problem does not affect situations where only one code is selected per coder. It also does not affect situations in which many codes are selected only by reviewer A and many only by reviewer B, with only some in common between them. (This situation will be typical in data extraction: you will very rarely get situations in which the 'disagreements' are only one-sided.) If it looks like this change will affect any existing review in a detrimental fashion, we will contact the review owners.

If you have any questions about this please let us know.

Best regards,

Jeff

 

 


Copyright 2021 by EPPI-Centre