This is a quick web UI for testing your local LLMs (currently only Ollama is supported) and checking their performance, accuracy, token usage, and so on. Nothing fancy: just a dead-simple interface without the bloat, where everything stays local.