Annotation queues provide a focused review interface for domain experts. You review one item at a time, examine the provided context, and submit structured feedback using predefined fields. You do not need to understand the underlying model or tracing system to complete reviews, so external experts can review the data itself without needing broader project context.

Annotation workflow

As an annotator, your task is to review the accuracy of the LLM responses. Your feedback is saved as annotations on the data. To review an annotation queue:
  1. Open the shared annotation queue using a queue link provided by your team.
  2. Review each item. You can move forward and backward through items in the queue at any time.
  3. Submit an annotation using the provided annotation fields.
Weave saves your progress automatically after you submit an item. If you need to pause and resume your work later, the review automatically resumes at the first incomplete item. Once all items have been reviewed, the Annotation Queue table reflects completion of the work.
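The resume behavior described above can be modeled as "find the first item without a submitted annotation." The sketch below is purely illustrative (it is not the Weave implementation, and the item IDs are hypothetical); it just makes the resume rule concrete:

```python
def first_incomplete_index(items, completed_ids):
    """Return the index of the first item without a submitted annotation.

    If every item has been reviewed, return len(items), signaling that
    the queue is complete.
    """
    for i, item_id in enumerate(items):
        if item_id not in completed_ids:
            return i
    return len(items)  # queue fully reviewed

# Hypothetical queue: the first two items were annotated in a prior session,
# so a returning annotator resumes at index 2.
queue = ["item-a", "item-b", "item-c", "item-d"]
done = {"item-a", "item-b"}
print(first_incomplete_index(queue, done))  # → 2
```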

Review a queue item

For each item, the review interface shows two panes:
  • A trace pane showing selected input context (such as prompts, documents, or images) and the corresponding model response or decision.
  • An item pane with an annotation entry form containing the queue's annotation fields.
Complete each annotation field according to the instructions provided for the queue, then click Submit and next to save the item and continue to the next one.
Screenshot: Annotation queue review interface for a single trace item, showing a trace information pane alongside an annotation form pane where reviewers submit structured feedback and move through the queue.
In the trace pane header, you can click the Trace ID suffix (for example, “b3ea”) to open a panel with the full trace details for the call, where you can inspect the trace and view or compare the annotations provided.
If any annotator has already reviewed an item in the queue, both pane headers of the review interface show Has annotation and Response submitted indicators. You can still provide an annotation for an item that someone else has already reviewed. In this case, the submission button displays Edit annotation, but submitting creates a new annotation entry with your responses.
If you submit feedback for an item that someone else has already reviewed, your feedback is stored in addition to the original annotator’s feedback.
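This additive storage model (new submissions are appended rather than overwriting earlier ones) can be sketched in a few lines. The following is a hypothetical in-memory model for illustration only; the store, reviewer names, and field names are all assumptions, not Weave's actual data model:

```python
from collections import defaultdict

# Hypothetical store: each item ID maps to a list of annotation entries,
# so a second reviewer's submission never overwrites the first.
annotations = defaultdict(list)

def submit_annotation(store, item_id, annotator, fields):
    """Append a new annotation entry for an item (additive, not destructive)."""
    store[item_id].append({"annotator": annotator, "fields": fields})

submit_annotation(annotations, "item-a", "reviewer-1", {"accuracy": 4})
submit_annotation(annotations, "item-a", "reviewer-2", {"accuracy": 5})

# Both entries are retained for the same item.
print(len(annotations["item-a"]))  # → 2
```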