Local LLM Setup

Run large language models locally with Ollama for zero cloud costs and full data privacy, complete with a Streamlit UI and Tavily web search integration.
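
To give a sense of how the local setup is used, here is a minimal sketch of querying a locally running Ollama server over its default HTTP endpoint (`http://localhost:11434/api/generate`). The model name `llama3` is illustrative; substitute whichever model you have pulled.

```python
import json
import urllib.request

# Default endpoint exposed by a local Ollama server.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for a non-streaming generate call."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # "llama3" is a placeholder; use any model available locally.
    print(generate("llama3", "Why run models locally?"))
```

Because everything runs on localhost, no prompt or response ever leaves the machine.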