AJ Isaacs dc56021a77 Feature: Add local LLM vision clients (llama.cpp and Ollama)
Add LlamaCppVisionClient and OllamaVisionClient so vision inference can
run against local llama.cpp and Ollama servers as alternatives to the
OpenAI and Claude clients. Also adds text-only prompt support for
LLM-assisted receipt matching.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 16:53:51 -05:00
Repository size: 103 MiB
Languages: C# 66.6%, HTML 31.2%, JavaScript 1.9%, CSS 0.2%, Dockerfile 0.1%