Authoring Information

The Annotation Authoring Information feature provides comprehensive tracking and visibility into the creation and modification history of annotations (cuboids, 3D polylines, and 2D bounding boxes). This feature captures critical metadata about who created or modified an annotation, when changes were made, how the annotation was generated, and where in the workflow it was edited.

Key information tracked includes:

  • Timestamp: Creation and edit dates/times for each annotation

  • User Identity: Email address or employee ID of the creator/editor

  • Annotation Source: Origin method (prelabel, manual drawing, smart automation like interpolation or dynamic sizing)

  • Workflow Stage: The specific stage/node where changes occurred

  • Change History: All attribute modifications, including positional, dimensional, rotational, and property changes

  • Visual Comparison: Side-by-side visualization of any two historical versions

The system consolidates changes per session (stage + labeler combination) to show the final state rather than every incremental edit, providing a clear audit trail without overwhelming detail.
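
For illustration, a single authoring record could be pictured as the sketch below. The field names and values are hypothetical, not the platform's actual data schema; they simply mirror the categories listed above.

```python
# Hypothetical example of a single authoring record.
# Field names and values are illustrative only, not the platform's actual schema.
authoring_record = {
    "annotation_id": "cuboid_0042",
    "created_at": "2024-05-01T09:15:00Z",      # creation timestamp
    "created_by": "labeler@example.com",       # email address or employee ID
    "source": "prelabel",                      # prelabel | manual | automation
    "last_edited_at": "2024-05-02T14:30:00Z",  # most recent edit timestamp
    "last_edited_by": "EMP-1187",              # editor identity
    "stage": "Review/QC",                      # workflow node where the edit occurred
    "changes": ["position", "dimensions", "rotation"],  # attributes modified in that session
}
```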


Use Cases

  1. Quality Assurance & Review: Quality reviewers can trace annotation lineage to identify patterns in errors, verify that annotations meet quality standards, and understand which automated tools or manual processes produced specific results.

  2. Dispute Resolution: When questions arise about annotation accuracy or consistency, teams can review the complete history to understand decision-making context and resolve disagreements with factual evidence.

  3. Debugging & Issue Investigation: When annotation quality issues are detected downstream, teams can trace back through the history to identify where problems were introduced and prevent recurrence.


Benefits

Good: Reviewers can quickly identify suspicious changes, verify automation accuracy, and ensure annotations meet project standards, resulting in higher-quality datasets.

Fast: Reduce time spent investigating quality issues from hours to minutes by quickly pinpointing when and where problems occurred.

Cost Efficient:

  • Identify and address quality issues early before they propagate through the pipeline, avoiding expensive late-stage corrections.

  • Analyze prelabel vs. manual annotation performance to make data-driven decisions about where automation delivers ROI and where human expertise is essential.

  • Understand which stages and annotators require more support, allowing managers to allocate training, supervision, and resources where they'll have the greatest impact.


How to Use the Authoring Information

Accessing Authoring Information

  1. Click on any cuboid, 3D polyline, or 2D bounding box

  2. Right-click and select Authoring Information. The Authoring Information panel appears

    3. The Authoring Information panel displays the following details, sorted by the most recent change:

    • Creation timestamp and creator identity

    • Most recent edit timestamp and editor identity

    • Annotation source (prelabel/manual/smart automation)

    • Workflow stage where changes occurred

    • List of all attribute changes made

Understanding the Information Display

User Identity: Shows email address when available, otherwise displays employee ID for attribution.

Source Indicators:

  • Prelabel: Annotation originated from automated prelabeling pipeline

  • Manual: Created by human labeler from scratch

  • Automation: Generated or modified by tools like interpolation, dynamic sizing, or propagation

Stage Information: Displays the workflow node (e.g., "Labeling/OP", "Review/QC") where the annotation reached its current state in that session.

Change Consolidation: Only the final state per session (stage + labeler) is shown, not every intermediate adjustment.
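
Conceptually, this consolidation can be thought of as grouping edits by their (stage, labeler) session and keeping only the last state in each group. The following is a minimal sketch of that idea with made-up data; it is not the platform's implementation.

```python
# Minimal sketch of per-session consolidation: keep only the final state
# for each (stage, labeler) combination. Data shapes are hypothetical.
def consolidate(edits):
    """edits: list of dicts ordered by time, each with 'stage', 'labeler', 'state'."""
    sessions = {}
    for edit in edits:
        key = (edit["stage"], edit["labeler"])
        sessions[key] = edit["state"]  # later edits in the same session overwrite earlier ones
    return sessions

edits = [
    {"stage": "Labeling/OP", "labeler": "a@example.com", "state": {"x": 1.0}},
    {"stage": "Labeling/OP", "labeler": "a@example.com", "state": {"x": 1.2}},
    {"stage": "Review/QC",   "labeler": "EMP-1187",      "state": {"x": 1.1}},
]
print(consolidate(edits))
# {('Labeling/OP', 'a@example.com'): {'x': 1.2}, ('Review/QC', 'EMP-1187'): {'x': 1.1}}
```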

Visual Comparison Feature

  1. In the Authoring Information panel, click the toggle beside each version you want to compare

  2. The system displays both versions simultaneously, with the version history shown in the right panel

  3. Switch between split-screen, overlay, and difference highlighting modes
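
Conceptually, the difference-highlighting mode amounts to comparing two versions attribute by attribute and flagging whatever changed. The sketch below illustrates that idea with hypothetical attribute names; the product's actual comparison covers the positional, dimensional, rotational, and property changes described earlier.

```python
# Hypothetical field-by-field comparison of two annotation versions.
def diff_versions(old, new, tolerance=1e-6):
    """Return the attributes whose values differ between two versions."""
    changed = {}
    for key in old.keys() | new.keys():
        a, b = old.get(key), new.get(key)
        if isinstance(a, float) and isinstance(b, float):
            if abs(a - b) > tolerance:
                changed[key] = (a, b)
        elif a != b:
            changed[key] = (a, b)
    return changed

v1 = {"x": 1.00, "y": 2.00, "length": 4.2, "label": "car"}
v2 = {"x": 1.05, "y": 2.00, "length": 4.2, "label": "truck"}
print(diff_versions(v1, v2))
# {'x': (1.0, 1.05), 'label': ('car', 'truck')}  (key order may vary)
```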


Best Practices

For Project Managers

Regular Audits: Schedule periodic reviews of authoring information to identify process bottlenecks, training needs, or quality issues before they escalate.

Leverage Source Data: Analyze prelabel vs. manual annotation metrics to optimize automation investment and understand where human expertise adds most value.

Set Expectations: Communicate to teams that authoring information is tracked for quality improvement, not punitive measures, to maintain a positive learning culture.

For Quality Reviewers

Start with Source: When reviewing annotations, check the source first—prelabeled items may need closer scrutiny than manually created annotations that have been through multiple stages.

Compare Versions Strategically: Use visual comparison between initial creation and current state to quickly spot significant changes that warrant investigation.

Document Patterns: When you notice recurring issues from specific sources, stages, or individuals, document these patterns to drive systematic improvements.

Validate Automation: Regularly sample smart automation results to ensure interpolation, dynamic sizing, and other tools maintain accuracy.

General Best Practices

Version Comparison for Validation: Always use visual comparison when investigating annotation discrepancies.

Session Consolidation Awareness: Remember that only final session states are shown—if you need more granular detail, review changes in real-time rather than retroactively.

Document Exceptional Cases: When unusual patterns appear in authoring history (e.g., multiple rapid edits), investigate and document root causes to prevent future issues.
