aj · dc56021a77 · Feature: Add local LLM vision clients (llama.cpp and Ollama)
Add LlamaCppVisionClient and OllamaVisionClient for local AI inference
as alternatives to OpenAI and Claude. Includes text-only prompt support
for LLM-assisted receipt matching.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 16:53:51 -05:00
Repository size: 103 MiB
Languages
C# 68.4%
HTML 28.5%
JavaScript 1.6%
CSS 1.4%
Dockerfile 0.1%