Beta: Consensus Stages

Setting up a workflow with Consensus stages lets your team assign the same file to multiple labellers and have them annotate in parallel, without any insight into each other's annotations.

Based on the overlap parameters that you define, labels that do not meet your requirements will be sent to a separate review stage where they can be accepted or discarded.

When the percentage of overlap between labels meets your requirements, one annotation will be randomly selected and moved to the next workflow stage in sequence.

📘

Enabling Consensus Stages

Consensus stages are currently in beta. Reach out to us at the Support/Feedback button in V7 to request access, and we'll turn the feature on so you can test it with your team.

Set up a consensus stage

To add one or more parallel stages to a workflow, add a Being Annotated stage and tick the option to make it a Blind Stage. Add a parallel stage for each annotator who will take part in the stage.

Define the minimum required overlap for polygon or bounding box annotations. When this requirement is met, the file will pass the Automated Test stage, and one of the overlapping labels will be selected to move to the next stage.

If the requirement is not met, the file will fail the Automated Test stage and be sent to a separate Review stage.
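
The docs don't spell out the exact overlap metric here, so the snippet below is not V7's implementation; it's a minimal Python sketch of how a consensus test of this kind can work for bounding boxes, assuming intersection over union (IoU) as the overlap measure. The `Box` type, `consensus_check` helper, and 0.8 threshold are all illustrative assumptions.

```python
import random
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Box:
    """Axis-aligned bounding box: top-left corner plus width and height."""
    x: float
    y: float
    w: float
    h: float

def iou(a: Box, b: Box) -> float:
    """Intersection over union of two boxes, in the range [0, 1]."""
    ix = max(0.0, min(a.x + a.w, b.x + b.w) - max(a.x, b.x))
    iy = max(0.0, min(a.y + a.h, b.y + b.h) - max(a.y, b.y))
    inter = ix * iy
    union = a.w * a.h + b.w * b.h - inter
    return inter / union if union > 0 else 0.0

def consensus_check(annotations: list[Box], min_overlap: float) -> Box | None:
    """Pass when every pair of blind annotations overlaps by at least
    min_overlap; on a pass, one annotation is picked at random to move on."""
    if all(iou(a, b) >= min_overlap for a, b in combinations(annotations, 2)):
        return random.choice(annotations)
    return None  # Failed the test: route the file to the Review stage.

# Three annotators label the same object in parallel.
boxes = [Box(10, 10, 100, 50), Box(12, 11, 98, 49), Box(11, 9, 101, 52)]
winner = consensus_check(boxes, min_overlap=0.8)
print("passed" if winner else "sent to review")
```

In this example all three boxes overlap pairwise by more than 80%, so the file passes and one annotation moves on; lowering any box's agreement below the threshold would instead send the file to review.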

Assign files in parallel

When a consensus stage is enabled, files can be assigned in three ways:

1. Automatic batch assignment

If your dataset is set up so that your team can assign themselves tasks in batches automatically, the available assignment slots in a file's consensus stage will fill automatically as your team assigns themselves work.

2. Bulk assignment

When assigning in bulk, select the files you would like to assign to a team member. Then re-select the same files and assign them to another team member, repeating until the available assignment slots for the file's consensus stage are filled.

3. Assignment from the Workview

From within the workview of a file, select the assignment dropdown for the consensus stage and assign a team member to each available assignment slot.

Review failed labels

If a file has failed the Automated Test stage, any reviewer assigned to the fail stage will be able to view all annotations and their authors, and discard or edit any labels that require correction before sending the file to the next stage.
