a38ea1db516863f30b138ce7c29e26d9221a1f95
Add async HTTP client wrapping the Ollama REST API with:

- OllamaClient with generate, generate_stream, chat, embed, list_models, is_healthy
- NDJSON streaming parser for /api/generate streaming responses
- Serde types for all Ollama API endpoints
- OllamaError enum with Http, Api, Deserialization, StreamIncomplete variants
- OllamaClientConfig for timeout and connection pool settings
- Integration into ModelGatewayServiceImpl (constructor now returns Result)
- 48 tests (types serde, wiremock HTTP mocks, error handling, config)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
llm-multiverse
System for orchestrating local LLM agents
Languages: Rust 60.3%, Python 37.5%, Shell 1.9%, Dockerfile 0.3%