What we'll cover
How to navigate to the Scoring Configuration panel and use its controls.
- Navigating to the Scoring Configuration panel
- Overview of Scoring Configuration concepts
- Utilizing the Scoring Configuration Controls
- Recommendations on using Confidence Scoring
Navigating to the Scoring Configuration panel
- Click "Projects" in the left sidebar of the admin portal. Then click "Ontology" in the central panel. Finally, click "Scoring configuration".
The "Scoring Configuration" panel will pop out.
Overview of Scoring Configuration Concepts
The Scoring Configuration panel allows users to configure localization-accuracy thresholds for bounding boxes and key points. Each annotator's result is scored using these thresholds against the final reviewed result for that record.
Bounding box localization accuracy is measured using Intersection over Union (IOU). Key point localization accuracy is measured using pixel distance. A bounding box localization is considered correct when its IOU is greater than or equal to the IOU threshold; a key point localization is considered correct when its pixel distance is less than or equal to the distance threshold. An annotator's history of providing correct localizations is used to estimate a confidence value for newly completed records. This confidence estimate is reported in the portal for records that are ready to be reviewed.
Intersection over Union (IOU) is a metric that relates the overlap (intersection) between two bounding boxes and the total area (union) taken up by both boxes. The intersection area is divided by the union area to get the IOU ratio.
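The IOU calculation described above can be sketched as follows. This is a generic illustration, not the portal's internal implementation; boxes are assumed to be axis-aligned `(x1, y1, x2, y2)` tuples in pixel coordinates.

```python
def iou(box_a, box_b):
    """Intersection over Union for two axis-aligned boxes (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

For example, two 10x10 boxes offset by 5 pixels in each direction overlap in a 5x5 region, giving an IOU of 25 / 175 (about 0.14).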
Pixel distance is the straight-line distance between two points in the pixel coordinate system of an image or video frame; it is also called Euclidean (or Cartesian) distance.
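The straight-line distance described above is the standard Euclidean distance, which can be computed as:

```python
import math

def pixel_distance(p, q):
    """Euclidean distance between two pixel coordinates (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])
```

For example, points (0, 0) and (3, 4) are exactly 5 pixels apart.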
Utilizing the Scoring Configuration Controls
- To modify the Bounding Box IOU Accuracy threshold, slide the selector left or right to decrease or increase it, respectively.
- To modify the Key Point Pixel Distance threshold, use the up or down arrows to increase or decrease it, respectively.
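Putting the two thresholds together, the per-entity correctness rule can be sketched as below. Note the opposite directions: IOU must meet or exceed its threshold, while pixel distance must not exceed its. (The function names here are illustrative, not portal APIs.)

```python
def bbox_correct(iou_value, iou_threshold):
    """A bounding box is correct when its IOU meets or exceeds the threshold."""
    return iou_value >= iou_threshold

def keypoint_correct(distance_px, max_distance_px):
    """A key point is correct when its pixel distance does not exceed the threshold."""
    return distance_px <= max_distance_px
```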
Recommendations on using Confidence Scoring
When deciding whether to review a record or video to possibly edit its annotations, you can use its confidence score to inform that decision. Records that are 'Ready for Review' carry a confidence value (ranging from 0.0 to 0.99) based on prior reviews of the annotator's work. The value reflects how heavily the annotator's previously annotated records were modified by the people who reviewed them: a higher value indicates that past reviews resulted in few edits or corrections, which suggests the record may not warrant review and can simply be approved. For a given user, this score evolves continuously, incorporating all review work for a project. (Note: a confidence score will only exist for a record after at least one of the annotator's records has been reviewed.)
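The behavior described above can be illustrated with a minimal sketch. The portal's actual formula is not documented here; this assumes the score is simply the fraction of an annotator's reviewed localizations that passed the thresholds, capped at 0.99, with no score at all before the first review.

```python
def estimate_confidence(correct_count, total_count, cap=0.99):
    """Hypothetical confidence estimate: fraction of the annotator's reviewed
    localizations that were within threshold, capped below 1.0.

    Returns None when nothing has been reviewed yet, mirroring the note
    that a score only exists after at least one record is reviewed.
    """
    if total_count == 0:
        return None  # no reviewed records yet, so no score exists
    return min(correct_count / total_count, cap)
```

Under this sketch, an annotator whose reviewers accepted 9 of 10 localizations would score 0.9, and even a perfect history would cap at 0.99.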