The Consensus Stage
Consensus Stages enable quality assurance of annotations by measuring the extent of agreement between different annotators (and/or models) and allowing the best annotation to be selected when there is a disagreement. This page describes how to use Consensus Stages effectively in your workflows.
Overview
A Consensus Stage has three parts: setup, annotation, and review. This is best understood through an example, as described below.
A medical team has two radiologists and a cancer segmentation model that it wants to use to annotate cancerous/non-cancerous DICOM images. At a high level:
- The workforce manager sets up a Consensus Stage and assigns the two radiologists and the model to parallel stages within the Consensus Stage.
- The annotators request work and independently annotate the items in the Consensus Stage, whilst the model generates its predictions.
- If there is disagreement between the annotators and/or model, the items enter a Review Stage, where the workforce manager reviews and resolves any disagreements.
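The routing logic in the steps above can be sketched as follows. This is an illustrative sketch, not the platform's API: the function names, stage names, and data shapes are all hypothetical, and a toy "all labels identical" check stands in for the real agreement test.

```python
# Hypothetical sketch of Consensus Stage routing: each item is annotated
# independently; items whose annotations disagree are routed to a Review
# Stage, and the rest pass through automatically.

def route_item(annotations, agree):
    """annotations: independent annotations for one item (one per
    annotator/model). agree: predicate testing whether they all agree.
    Returns the item's next destination."""
    if agree(annotations):
        return "complete"  # the Champion Stage's annotation is kept
    return "review"        # the workforce manager resolves it manually

# Toy agreement test: class labels must be identical.
all_same = lambda anns: len(set(anns)) == 1
```

For example, `route_item(["cancerous", "cancerous", "cancerous"], all_same)` returns `"complete"`, while `route_item(["cancerous", "non-cancerous"], all_same)` returns `"review"`.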
How to Set Up a Consensus Stage
- Add a Consensus Stage to your workflow in the workflows editor.
- Add and specify which models and/or annotators will participate in the Consensus Stage. You can add multiple parallel stages and annotators per parallel stage.
- Configure the Consensus threshold per class. This specifies the minimum IoU score required for an item to automatically pass through the Consensus Stage.
- Specify which model or annotator will act as your Champion Stage. If an item automatically passes through a Consensus Stage due to agreement, the Champion Stage's annotation is deemed the ‘correct’ annotation and is used moving forward.
- Save your workflow and let your annotators request work.
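To make the per-class threshold from the setup steps concrete, here is a minimal sketch of how an IoU (Intersection over Union) comparison between two bounding boxes can decide agreement. The class names and threshold values are hypothetical examples, not platform defaults.

```python
# Illustrative IoU-based agreement check for axis-aligned bounding boxes.
# Boxes are (x_min, y_min, x_max, y_max) tuples.

def iou(a, b):
    """Intersection over Union of two axis-aligned boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # overlap width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # overlap height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Hypothetical per-class thresholds, e.g. stricter for tumours.
THRESHOLDS = {"tumour": 0.8, "organ": 0.5}

def boxes_agree(cls, box_a, box_b):
    """True if the two boxes meet the class's minimum IoU score."""
    return iou(box_a, box_b) >= THRESHOLDS[cls]
```

With these example thresholds, two boxes overlapping at IoU 0.6 would count as agreement for `"organ"` but enter review for `"tumour"`.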
How to Review Disagreements
- After all annotators and/or models have added their annotations, any disagreements can be reviewed in a Review Stage.
- To resolve a disagreement, accept an annotation or delete annotations in the sidebar until the disagreement is resolved.
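Conceptually, resolving a disagreement amounts to keeping exactly one of the competing annotations. The sketch below is illustrative only; the data model and names are assumptions, not the platform's API.

```python
# Hypothetical resolution step: accept one annotator's (or model's)
# annotation and discard the rest, so a single annotation moves forward.

def resolve(candidates, accepted_by):
    """candidates: {annotator_or_model: annotation}.
    Returns only the accepted annotation, keyed by its author."""
    if accepted_by not in candidates:
        raise KeyError(f"no annotation from {accepted_by!r}")
    return {accepted_by: candidates[accepted_by]}
```

For example, `resolve({"radiologist_1": "box_a", "model": "box_b"}, "radiologist_1")` keeps only `radiologist_1`'s annotation, mirroring the accept/delete actions in the sidebar.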
Current Limitations
We currently support automatic disagreement checking for bounding boxes and polygons on images, videos, DICOMs and PDFs, for any number of annotators and/or models.
We currently don’t support automatic disagreement checking for polylines, ellipses, keypoints and keypoint skeletons. We do, however, allow teams to add these annotation types in the parallel annotation stages so that you can identify and review potential disagreements manually and still use a Consensus Stage in your workflow. If you have a consensus use case for one of these annotation types, please ask us to turn this setting on for you via Intercom or via your dedicated customer success manager.