feat: expose inference latency and FPS metrics in Streamlit GUI evaluator tab #538

@AdityaX18

Description

Problem

perceptionmetrics/cli/computational_cost.py already measures inference
latency for CLI users. However, tabs/evaluator.py runs inference with no
timing instrumentation, so GUI users have no way to see FPS or per-image latency.

For robotics deployment, accuracy alone is insufficient. A model with 0.85 mAP
at 2 FPS is unusable, while one at 0.78 mAP and 30 FPS is deployable. The GUI
currently gives zero signal on this axis, even though the CLI has full support.

Proposed Solution

  1. Add perceptionmetrics/utils/latency_profiler.py — lightweight LatencyReport
    class recording per-image wall-clock time (stdlib only: time + statistics)
  2. Instrument tabs/evaluator.py inference call with time.perf_counter()
  3. Render mean latency, FPS, P95, P99 as st.metric() row in the evaluator tab

No new dependencies. Consistent with existing computational_cost.py approach.
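A minimal sketch of what the profiler in step 1 could look like. Class and method names here (LatencyReport, measure, percentile_ms) are assumptions for illustration, not a final API; it uses only the stdlib modules named above:

```python
import statistics
import time
from contextlib import contextmanager


class LatencyReport:
    """Collects per-image wall-clock latencies and derives summary stats."""

    def __init__(self):
        self.samples = []  # per-image latency in seconds

    @contextmanager
    def measure(self):
        # perf_counter() is monotonic and high-resolution, so it is
        # suitable for timing short per-image inference calls.
        start = time.perf_counter()
        try:
            yield
        finally:
            self.samples.append(time.perf_counter() - start)

    @property
    def mean_ms(self):
        return statistics.mean(self.samples) * 1000

    @property
    def fps(self):
        return 1.0 / statistics.mean(self.samples)

    def percentile_ms(self, p):
        # quantiles(n=100) returns the 1st..99th percentile cut points.
        return statistics.quantiles(self.samples, n=100)[p - 1] * 1000
```

In tabs/evaluator.py (step 2) the per-image inference call would be wrapped in `report.measure()`, and the resulting `mean_ms`, `fps`, `percentile_ms(95)`, and `percentile_ms(99)` values fed into a row of `st.metric()` widgets for step 3.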

I plan to submit a PR for this.
