Hello Lisa,
You could do this using one codeset, but it would require some coordination whenever the coding tool is set to multiple data entry.
At the beginning, you would set your screening codeset to multiple data entry and create coding assignments in the ‘Collaborate’ tab containing the items to be used for inter-rater reliability. You would double-screen those items, run comparisons to find your agreements and disagreements, then ‘complete’ the agreements and ‘reconcile and complete’ the disagreements.
You would then change the codeset back to single data entry, create new coding assignments with half of the remaining items, and screen them.
Once those coding assignments are finished, you would switch the coding tool back to multiple data entry, create new coding assignments for the second inter-rater reliability phase, and repeat the process from the beginning (double-code, compare, reconcile and complete) with the new assignments.
When that inter-rater reliability phase is finished, you would return the codeset to single data entry, create new coding assignments with the remaining items (minus a batch reserved for the final inter-rater reliability test), and carry on coding.
The final step would be the last inter-rater reliability batch, which you would double-code just as you did previously.
When that is all done, you would return the coding tool to single data entry. At that point you would have all of your items screened and completed, three double-screening sessions behind you, and still be working with just one screening tool.
Best regards,
Jeff