feat: display inference statistics in chat UI #45

Merged
shahondin1624 merged 3 commits from feature/issue-43-inference-stats into main 2026-03-13 14:50:35 +01:00

Summary

  • Add InferenceStats proto message (prompt tokens, completion tokens, total tokens, context window size, tokens/sec) and regenerate TypeScript types
  • Create InferenceStatsPanel.svelte collapsible component with token counts grid, throughput display, and color-coded context utilization progress bar
  • Integrate into chat page below FinalResult, shown after orchestration completes
  • Full dark mode and accessibility support (ARIA attributes on progress bar)

Closes #43
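The stats shape and the one derived figure the panel displays can be sketched roughly as below. This is a minimal sketch: the field names mirror the proto bullet above but are assumptions, since the actual TypeScript types are regenerated from the proto toolchain.

```typescript
// Hypothetical shape of the generated type; real field names come from
// the regenerated proto bindings, not from this sketch.
interface InferenceStats {
  promptTokens: number;
  completionTokens: number;
  totalTokens: number;
  contextWindowSize: number;
  tokensPerSecond: number;
}

// Derived figure behind the progress bar: context utilization in percent.
function contextUtilization(stats: InferenceStats): number {
  if (stats.contextWindowSize <= 0) return 0;
  return (stats.totalTokens / stats.contextWindowSize) * 100;
}

const stats: InferenceStats = {
  promptTokens: 1200,
  completionTokens: 300,
  totalTokens: 1500,
  contextWindowSize: 8192,
  tokensPerSecond: 42.5,
};
console.log(contextUtilization(stats).toFixed(1)); // "18.3"
```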

Test plan

  • Verify build, lint, and typecheck pass (npm run build && npm run check && npm run lint)
  • Confirm InferenceStatsPanel renders when backend sends inference_stats in the final ProcessRequestResponse
  • Verify panel is collapsed by default and expands on click
  • Check context utilization progress bar colors: blue (<70%), amber (70-90%), red (>90%)
  • Verify dark mode styling
  • Confirm panel does not appear during streaming, only after completion
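The color thresholds in the test plan can be captured as a small pure helper, which also pins down the boundary behavior (70% and 90% fall into amber). The function name and return values here are illustrative, not the component's actual code:

```typescript
// Maps context utilization (percent) to the progress bar color.
// Thresholds follow the test plan: blue below 70%, amber from 70% up to
// and including 90%, red above 90%.
type BarColor = "blue" | "amber" | "red";

function utilizationColor(percent: number): BarColor {
  if (percent < 70) return "blue";
  if (percent <= 90) return "amber";
  return "red";
}
```

Keeping this as a standalone function makes the thresholds trivially unit-testable, independent of the Svelte rendering.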
shahondin1624 added 3 commits 2026-03-13 14:48:02 +01:00
Add InferenceStats proto message and InferenceStatsPanel component that
displays token counts, throughput, and context window utilization as a
collapsible panel below assistant messages after orchestration completes.

Closes #43

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
shahondin1624 merged commit 6f83311d94 into main 2026-03-13 14:50:35 +01:00
shahondin1624 deleted branch feature/issue-43-inference-stats 2026-03-13 14:50:35 +01:00

Reference: llm-multiverse/llm-multiverse-ui#45