feat: connect chat UI to gRPC-Web streaming with loading indicator
- Wire processRequest() async generator to chat page
- Progressive message rendering as stream chunks arrive
- Animated loading dots while waiting for first chunk
- Error display with OrchestratorError code mapping
- Session ID management with crypto.randomUUID()

Closes #7

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
implementation-plans/issue-007.md (new file, +16)

@@ -0,0 +1,16 @@
---
---

# Issue #7: Streaming response rendering

**Status:** COMPLETED

**Issue:** https://git.shahondin1624.de/llm-multiverse/llm-multiverse-ui/issues/7

**Branch:** `feature/issue-7-streaming-response`

## Acceptance Criteria

- [x] Chat UI connected to gRPC-Web client service from #4
- [x] Streaming responses rendered in real-time as chunks arrive
- [x] `message` field displayed progressively
- [x] Handles stream completion and errors gracefully
- [x] Loading indicator shown while waiting for first chunk
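The "handles errors gracefully" criterion pairs with the commit's "OrchestratorError code mapping". A hedged sketch of that mapping follows; the actual `OrchestratorError` shape and its code values are not shown in this plan, so every name and code below is an assumption for illustration.

```typescript
// Assumed shape of the error delivered by the orchestrator stream.
interface OrchestratorError {
  code: number;
  message: string;
}

// Hypothetical code-to-text table; real codes would come from the proto.
const ERROR_MESSAGES: Record<number, string> = {
  1: "The orchestrator is unavailable. Please retry.",
  2: "The request was rejected. Check your input.",
};

function toUserMessage(err: OrchestratorError): string {
  // Fall back to the raw message for codes without a mapping.
  return ERROR_MESSAGES[err.code] ?? `Unexpected error: ${err.message}`;
}
```

The chat page would call `toUserMessage` in the stream's catch path and render the result in place of the pending assistant message.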