[TD] Decouple AiExecutor Batch Pattern #236

@ArthurCRodrigues

Description

Summary

AI-enabled tests currently register themselves with AiExecutor and return placeholder results, relying on GraderService to detect the attached executor after grading and send a batch request. The batch results are then written back into the placeholder TestResult objects by in-place mutation, breaking the step-by-step pipeline model.
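To make the anti-pattern concrete, here is a minimal sketch of the current flow. All names (`AiExecutor.register`, `make_ai_test`, the result dicts) are illustrative stand-ins, not the actual autograder API:

```python
# Hypothetical sketch of the current anti-pattern: AI tests register a
# payload with an executor and return placeholders; the grader then sniffs
# for an `executor` attribute and mutates results in place.

class AiExecutor:
    """Collects AI test payloads and grades them in one batch call."""
    def __init__(self):
        self.pending = []

    def register(self, test_name, payload):
        self.pending.append((test_name, payload))

    def run_batch(self):
        # Stand-in for a single batched AI grading request.
        return {name: {"score": 1.0} for name, _ in self.pending}


def make_ai_test(executor):
    def test_essay(submission):
        executor.register("test_essay", submission)
        # Placeholder result; the real grade arrives only after the batch.
        return {"name": "test_essay", "score": None, "placeholder": True}
    test_essay.executor = executor  # GraderService sniffs this attribute
    return test_essay


def grade(tests, submission):
    results = [t(submission) for t in tests]
    for t in tests:
        if hasattr(t, "executor"):          # executor-specific check
            batch = t.executor.run_batch()
            for r in results:               # in-place mutation of results
                if r.get("placeholder"):
                    r.update(batch[r["name"]], placeholder=False)
    return results
```

The hidden second pass means a test's return value is not its final result, which is exactly what the redesign below removes.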

Scope

  • Redesign AI execution so batch processing happens in a dedicated pipeline-aware component (e.g., separate step or executor strategy) instead of hidden inside test functions.
  • Remove the hasattr(test_func, "executor") check in GraderService and disallow in-place mutation of TestResult objects.
  • Ensure AI tests follow the same contract as other tests (execute → return result).
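One possible shape for the redesign, sketched under the same illustrative naming (a dedicated `AiBatchStep` is an assumption, not the mandated design): batch resolution happens in a pipeline-aware step before tests run, so each AI test honors the ordinary execute → return-result contract and GraderService stays executor-agnostic.

```python
# Hypothetical sketch of the target design: a dedicated batch step gathers
# AI payloads up front and runs one batched call; each generated test then
# simply executes and returns its final result.

def batch_grade(payloads):
    # Stand-in for one batched AI grading request over all payloads.
    return {name: {"score": 1.0} for name in payloads}


class AiBatchStep:
    """Pipeline step: resolves all AI grades before tests execute."""
    def __init__(self, payloads):
        self.grades = batch_grade(payloads)

    def make_test(self, name):
        def test(submission):
            # Same contract as any other test: execute -> return result.
            return {"name": name, **self.grades[name]}
        return test


def grade(tests, submission):
    # No executor-specific checks, no in-place mutation of results.
    return [t(submission) for t in tests]
```

With this shape, `grade` needs no knowledge of AI at all, which satisfies the acceptance criterion that GraderService carry no executor-specific checks.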

Impacted Files

  • autograder/utils/executors/ai_executor.py
  • autograder/services/grader_service.py
  • Templates/tests using AI executors

Risks

  • AI grading may regress if batch semantics change; add targeted integration tests.

Acceptance Criteria

  • AI execution no longer hijacks test functions; results flow through the pipeline like other tests.
  • GraderService has no executor-specific checks.

References

  • docs/roadmaps/TECHNICAL_DEBT_ROADMAP.md (Item 21)
