Assessment Workflows

Learn how to create and configure your first assessment workflow in Highlighter.

Overview

This tutorial will guide you through creating a complete assessment workflow from start to finish. You'll learn how to set up data sources, configure assessment stages, assign reviewers, and monitor progress.

Prerequisites

Before starting this tutorial, ensure you have:

  • Access to a Highlighter platform account
  • Administrator or workflow creator permissions
  • Sample data files to work with
  • Basic understanding of your assessment requirements

Tutorial Goals

By the end of this tutorial, you will have:

  • Created a functional assessment workflow
  • Configured both machine and human assessment stages
  • Set up data routing and quality controls
  • Launched your first workflow order
  • Monitored workflow performance

Step 1: Planning Your Workflow

Define Your Objectives

Before creating the workflow, clearly define:

  • Purpose: What are you trying to assess or evaluate?
  • Input Data: What type of data will you be processing?
  • Output Goals: What decisions or classifications do you need?
  • Quality Standards: What level of accuracy is required?

Example Scenario

For this tutorial, we'll create a workflow that:

  • Processes uploaded images for quality assessment
  • Uses AI to detect potential issues automatically
  • Routes questionable images to human reviewers
  • Generates final approval/rejection decisions

Step 2: Creating the Workflow

Access the Workflow Builder

  1. Log into your Highlighter platform
  2. Navigate to the "Workflows" section
  3. Click "Create New Workflow"
  4. Choose "Assessment Workflow" as the template type

Basic Workflow Configuration

  1. Workflow Name: "Image Quality Assessment"
  2. Description: "Automated quality assessment of uploaded images with human review escalation"
  3. Category: Quality Control
  4. Priority: Normal

Data Source Configuration

  1. Input Source: Select your data source (file upload, database, API)
  2. File Types: Specify accepted formats (JPG, PNG, etc.)
  3. Size Limits: Set reasonable file size constraints
  4. Validation Rules: Configure basic data validation
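
If you want to sanity-check files before they reach the platform, constraints like these are easy to script as a pre-flight check. The following is a minimal sketch, not the Highlighter API; the validate_upload helper, the extension set, and the 10 MB limit are all illustrative assumptions:

    import os

    # Example intake constraints; replace with your own requirements.
    ACCEPTED_EXTENSIONS = {".jpg", ".jpeg", ".png"}
    MAX_FILE_SIZE_BYTES = 10 * 1024 * 1024  # 10 MB, an illustrative limit

    def validate_upload(path: str) -> tuple[bool, str]:
        """Return (is_valid, reason) for a single candidate file."""
        ext = os.path.splitext(path)[1].lower()
        if ext not in ACCEPTED_EXTENSIONS:
            return False, f"unsupported file type: {ext or 'none'}"
        size = os.path.getsize(path)
        if size > MAX_FILE_SIZE_BYTES:
            return False, f"file too large: {size} bytes"
        return True, "ok"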

Step 3: Configuring Assessment Stages

Stage 1: Data Intake and Validation

  1. Stage Name: "Data Validation"
  2. Stage Type: Automated Processing
  3. Functions:
    • File format verification
    • Size and resolution checks
    • Basic quality metrics
    • Duplicate detection

Configuration:

  • Set timeout limits for processing
  • Configure error handling for invalid files
  • Define quality thresholds for automatic rejection
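
Of these functions, duplicate detection deserves a closer look: a common approach is to hash file contents and reject repeats. The sketch below illustrates that general technique; it is not Highlighter's built-in implementation:

    import hashlib

    def file_digest(path: str) -> str:
        """SHA-256 of the file contents, used as a duplicate key."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    seen_digests: set[str] = set()

    def is_duplicate(path: str) -> bool:
        """True if an identical file has already been processed."""
        digest = file_digest(path)
        if digest in seen_digests:
            return True
        seen_digests.add(digest)
        return False

Note that a content hash only catches byte-identical copies; resized or re-encoded near-duplicates would need perceptual hashing instead.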

Stage 2: Machine Assessment

  1. Stage Name: "AI Quality Analysis"
  2. Stage Type: Machine Assessment
  3. Agent Selection: Choose or configure your AI quality assessment agent
  4. Processing Parameters:
    • Batch size: 10 images
    • Confidence threshold: 85%
    • Processing timeout: 30 seconds per image

Quality Gates:

  • High confidence (90% and above): Auto-approve
  • Medium confidence (70-89%): Route to human review
  • Low confidence (below 70%): Route to human review with a priority flag
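
Expressed as code, these gates reduce to a simple banding function. The sketch below is a generic illustration; the returned stage names are placeholders, and the thresholds should be tuned on your own data:

    def route_by_confidence(confidence: float) -> str:
        """Map an agent confidence score (0.0-1.0) to the next stage.

        Thresholds mirror the quality gates above; tune them to your data.
        """
        if confidence >= 0.90:
            return "auto_approve"
        if confidence >= 0.70:
            return "human_review"
        return "human_review_priority"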

Stage 3: Human Review

  1. Stage Name: "Expert Review"
  2. Stage Type: Human Assessment
  3. Review Type: Classification and Quality Control
  4. Reviewer Assignment:
    • Assign to "Quality Reviewers" group
    • Set workload distribution to "Even"
    • Configure backup reviewers

Review Configuration:

  • Review interface: Image annotation tool
  • Decision options: Approve, Reject, Request Clarification
  • Time limit: 2 minutes per image
  • Quality requirements: 95% accuracy
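
Taken together, the stage amounts to a small bundle of settings. Restating them as plain data can help when reviewing or versioning a configuration; the key names below are illustrative assumptions, not actual Highlighter configuration fields:

    # Hypothetical stage settings restated as plain data.
    expert_review_stage = {
        "name": "Expert Review",
        "type": "human_assessment",
        "reviewer_group": "Quality Reviewers",
        "workload_distribution": "even",
        "backup_reviewers": ["backup-reviewer-1", "backup-reviewer-2"],
        "decision_options": ["approve", "reject", "request_clarification"],
        "time_limit_seconds": 120,   # 2 minutes per image
        "required_accuracy": 0.95,   # 95% accuracy requirement
    }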

Step 4: Workflow Logic and Routing

Decision Points

Configure routing logic between stages:

  1. After Data Validation:

    • Valid files → AI Quality Analysis
    • Invalid files → Automatic rejection with notification
  2. After AI Assessment:

    • High confidence approval → Final approval
    • Medium confidence → Human review
    • Low confidence/rejection → Human review with priority flag
  3. After Human Review:

    • Approved → Final approval
    • Rejected → Final rejection with feedback
    • Clarification needed → Return to previous stage
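
A compact way to think about routing like this is a lookup table from (stage, outcome) pairs to the next stage, as in the generic sketch below. The stage identifiers are made up for illustration and are not Highlighter's routing syntax:

    # Made-up stage identifiers; this shows the routing pattern,
    # not Highlighter's routing configuration.
    ROUTES = {
        ("data_validation", "valid"): "ai_quality_analysis",
        ("data_validation", "invalid"): "reject_and_notify",
        ("ai_quality_analysis", "auto_approve"): "final_approval",
        ("ai_quality_analysis", "human_review"): "expert_review",
        ("ai_quality_analysis", "human_review_priority"): "expert_review",
        ("expert_review", "approve"): "final_approval",
        ("expert_review", "reject"): "final_rejection",
        ("expert_review", "request_clarification"): "ai_quality_analysis",
    }

    def next_stage(stage: str, outcome: str) -> str:
        """Look up the next stage; unknown pairs raise KeyError loudly."""
        return ROUTES[(stage, outcome)]

Keeping all decision points in one table makes the routing easy to audit and to change without touching the stage logic itself.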

Quality Control Measures

  • Audit Sampling: 10% of auto-approved items undergo human review
  • Consensus Reviews: Difficult cases reviewed by multiple experts
  • Escalation Rules: Complex cases escalated to senior reviewers
  • Feedback Loops: Human decisions used to improve AI performance
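
Audit sampling in particular is simple to reason about: it is just a random draw against a fixed rate. A minimal sketch of the 10% spot-check rule, assuming items are decided one at a time:

    import random

    AUDIT_RATE = 0.10  # spot-check 10% of auto-approved items

    def should_audit() -> bool:
        """Randomly select an auto-approved item for human review."""
        return random.random() < AUDIT_RATE

Each auto-approved item would pass through a check like this; when it returns True, the item is queued for human review despite the automatic approval.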

Step 5: Testing Your Workflow

Validation Testing

  1. Test Data Preparation:

    • Gather representative sample data (20-30 files)
    • Include edge cases and challenging examples
    • Prepare expected outcomes for comparison
  2. Workflow Testing:

    • Run test data through the workflow
    • Verify each stage functions correctly
    • Check routing logic and decision points
    • Validate output quality and format
  3. Performance Testing:

    • Test with larger data volumes
    • Monitor processing times and resource usage
    • Identify potential bottlenecks
    • Verify scalability under load
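
A small regression harness makes the expected-outcome comparison repeatable. In the hypothetical sketch below, run_workflow stands in for whatever call submits a file and returns the final decision in your environment, and the file names are placeholders:

    from typing import Callable

    # Expected outcomes for the labelled test files (hypothetical names).
    EXPECTED = {
        "sharp_photo.jpg": "approve",
        "blurry_photo.jpg": "reject",
        "truncated_file.jpg": "reject",
    }

    def run_regression(run_workflow: Callable[[str], str]) -> None:
        """run_workflow(path) -> final decision; supply your own integration."""
        failures = []
        for path, want in EXPECTED.items():
            got = run_workflow(path)
            if got != want:
                failures.append((path, want, got))
        for path, want, got in failures:
            print(f"MISMATCH {path}: expected {want}, got {got}")
        print(f"{len(EXPECTED) - len(failures)}/{len(EXPECTED)} checks passed")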

User Acceptance Testing

  1. Reviewer Training:

    • Train human reviewers on new workflow
    • Provide guidelines and examples
    • Conduct practice sessions
    • Address questions and concerns
  2. Feedback Collection:

    • Gather input from all stakeholders
    • Test user interfaces and tools
    • Identify usability improvements
    • Refine processes based on feedback

Step 6: Launching Your Workflow

Deployment Preparation

  1. Final Configuration Review:

    • Verify all settings and parameters
    • Double-check reviewer assignments
    • Confirm quality thresholds
    • Test notification systems
  2. Documentation:

    • Create user guides for reviewers
    • Document workflow processes
    • Prepare troubleshooting guides
    • Set up training materials

Go-Live Process

  1. Pilot Launch:

    • Start with limited data volume
    • Monitor closely for issues
    • Make adjustments as needed
    • Gradually increase volume
  2. Full Deployment:

    • Scale up to production volumes
    • Monitor performance metrics
    • Collect user feedback
    • Optimize based on real-world usage

Step 7: Monitoring and Optimization

Performance Metrics

Monitor these key indicators:

  • Throughput: Items processed per hour/day
  • Quality: Accuracy rates and error percentages
  • Efficiency: Average processing time per stage
  • User Satisfaction: Feedback from reviewers and stakeholders
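
If you can export per-item records from the workflow, these indicators reduce to simple aggregations. The sketch below assumes a hypothetical record shape (stage name, processing time, correctness flag) rather than Highlighter's actual reporting format:

    from dataclasses import dataclass

    @dataclass
    class ItemRecord:
        stage: str       # e.g. "ai_quality_analysis"
        seconds: float   # wall-clock processing time for this item
        correct: bool    # did the decision match ground truth / audit?

    def summarize(records: list[ItemRecord]) -> None:
        """Print average processing time and accuracy per stage."""
        for stage in sorted({r.stage for r in records}):
            rs = [r for r in records if r.stage == stage]
            avg_s = sum(r.seconds for r in rs) / len(rs)
            accuracy = sum(r.correct for r in rs) / len(rs)
            print(f"{stage}: {len(rs)} items, {avg_s:.1f}s avg, "
                  f"{accuracy:.0%} accurate")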

Regular Optimization

  1. Weekly Reviews:

    • Analyze performance metrics
    • Identify bottlenecks or issues
    • Review quality outcomes
    • Collect user feedback
  2. Monthly Optimization:

    • Adjust thresholds and parameters
    • Retrain AI models with new data
    • Update reviewer assignments
    • Implement process improvements
  3. Quarterly Assessment:

    • Comprehensive workflow review
    • Cost-benefit analysis
    • Strategic alignment check
    • Planning for future enhancements

Troubleshooting Common Issues

Low Throughput

  • Check for processing bottlenecks
  • Verify adequate reviewer capacity
  • Optimize batch sizes and timeouts
  • Consider parallel processing options

Quality Issues

  • Review and update assessment criteria
  • Provide additional reviewer training
  • Adjust AI model thresholds
  • Implement additional quality controls

User Adoption Challenges

  • Gather detailed user feedback
  • Provide additional training and support
  • Simplify complex processes
  • Improve user interface design

Next Steps

After completing this tutorial:

  1. Explore Advanced Features: Learn about complex routing, multi-agent systems, and advanced analytics
  2. Integration: Connect your workflow with existing business systems
  3. Scaling: Expand to handle larger volumes and more complex scenarios
  4. Optimization: Continuously improve performance and quality

Additional Resources

  • Advanced Workflow Configuration: Learn about complex scenarios
  • Agent Management: Deep dive into AI agent configuration
  • Performance Optimization: Advanced techniques for scaling
  • Integration Guides: Connect with external systems

Congratulations! You've successfully created your first assessment workflow. This foundation will serve you well as you build more sophisticated assessment systems in Highlighter.