feat: connect chat UI to gRPC-Web streaming with loading indicator

- Wire processRequest() async generator to chat page
- Progressive message rendering as stream chunks arrive
- Animated loading dots while waiting for first chunk
- Error display with OrchestratorError code mapping
- Session ID management with crypto.randomUUID()

Closes #7

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
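The error mapping and session handling described above might look roughly like this (a sketch only; the `OrchestratorError` shape, the code table, and `describeError` are illustrative assumptions, not the actual client API from #4):

```typescript
// Sketch: session ID management and OrchestratorError code mapping.
// In the browser this would be crypto.randomUUID(); node:crypto is used
// here so the snippet runs standalone.
import { randomUUID } from "node:crypto";

// Assumed error shape surfaced by the gRPC-Web client (hypothetical).
interface OrchestratorError {
  code: number;
  message: string;
}

// Hypothetical mapping of gRPC status codes to user-facing messages.
const ERROR_MESSAGES: Record<number, string> = {
  4: "The orchestrator timed out. Please try again.",
  14: "The orchestrator is unreachable.",
};

// Fall back to the raw error message for unmapped codes.
function describeError(err: OrchestratorError): string {
  return ERROR_MESSAGES[err.code] ?? `Unexpected error: ${err.message}`;
}

// One session ID per chat tab, generated once and reused for every request.
const sessionId = randomUUID();
```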
Author: shahondin1624
Date: 2026-03-12 11:29:10 +01:00
Parent: 8802d1bb72
Commit: f6eef3a7f6
3 changed files with 77 additions and 1 deletion


@@ -0,0 +1,16 @@
---
---
# Issue #7: Streaming response rendering
**Status:** COMPLETED
**Issue:** https://git.shahondin1624.de/llm-multiverse/llm-multiverse-ui/issues/7
**Branch:** `feature/issue-7-streaming-response`
## Acceptance Criteria
- [x] Chat UI connected to gRPC-Web client service from #4
- [x] Streaming responses rendered in real-time as chunks arrive
- [x] `message` field displayed progressively
- [x] Handles stream completion and errors gracefully
- [x] Loading indicator shown while waiting for first chunk
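The progressive-rendering loop behind these criteria can be sketched as follows. This is a minimal illustration, not the actual service from #4: `processRequest` is stubbed, and the chunk shape (a `message` string field) plus the `ui` callbacks are assumptions.

```typescript
// Assumed chunk shape: each stream event carries a partial `message`.
interface StreamChunk {
  message: string;
}

// Stand-in for the real gRPC-Web streaming call (hypothetical).
async function* processRequest(prompt: string): AsyncGenerator<StreamChunk> {
  for (const piece of ["Hello", ", ", "world"]) {
    yield { message: piece };
  }
}

// Consume the stream: show the loading indicator until the first chunk,
// append each chunk as it arrives, and always clear the indicator on
// completion or error.
async function renderResponse(
  prompt: string,
  ui: { setLoading(on: boolean): void; append(text: string): void },
): Promise<string> {
  ui.setLoading(true); // animated dots while waiting for the first chunk
  let text = "";
  let first = true;
  try {
    for await (const chunk of processRequest(prompt)) {
      if (first) {
        ui.setLoading(false); // first chunk arrived: hide the indicator
        first = false;
      }
      text += chunk.message;
      ui.append(chunk.message); // progressive rendering
    }
  } finally {
    ui.setLoading(false); // stream ended or failed: never leave dots spinning
  }
  return text;
}
```

The `finally` block mirrors the "handles stream completion and errors gracefully" criterion: whether the generator completes, throws, or is cancelled, the loading state is reset.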