* "Self-hosted: Runs entirely on your infrastructure. No data leaves your network."
* "Bring Your Own LLM: Anthropic, OpenAI, Gemini, or open-weight models via vLLM."
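For the open-weight path, vLLM serves an OpenAI-compatible HTTP API, so "bring your own LLM" can be as simple as pointing a client at an internal host. A minimal stdlib sketch; the hostname, port, and model name below are placeholders, not anything from the product page:

```python
import json
import urllib.request

# Hypothetical internal vLLM server; requests to it never leave the private network.
BASE_URL = "http://vllm.internal:8000/v1"

def build_chat_request(prompt: str,
                       model: str = "meta-llama/Llama-3.1-8B-Instruct") -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions request for a self-hosted endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Summarize our deployment options.")
print(req.full_url)  # resolves to the internal host only
# urllib.request.urlopen(req)  # would only succeed from inside the network
```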
With so many newbies wanting these kinds of services, it might be worth adjusting the first bullet to say: "No data leaves your network, at least as long as you don't use any Anthropic, OpenAI, or Gemini models over the network, of course."
That's a good point; it makes sense to clarify that for individuals who want to self-host. I'll make the change, thanks!
Most organizations are going to be self-hosting on AWS, GCP, or Azure... so as long as you use their inference services as your LLM, you can keep it all within the private network.
Exactly, enterprise customers almost always use private model endpoints on their cloud provider for any serious deployment. Data stays within the customer's VPC, and data security and privacy are backed by the cloud provider's guarantees.
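The "stays within the VPC" property is checkable: with a private interface endpoint and private DNS enabled (e.g. an AWS VPC endpoint for a managed inference service), the provider's hostname resolves to private addresses inside your network; without one, it resolves to public IPs and requests leave the network. A quick stdlib sketch of that check; the Bedrock hostname in the comment is just an illustrative example:

```python
import ipaddress
import socket

def resolves_privately(hostname: str) -> bool:
    """True if every address the hostname resolves to is private
    (RFC 1918 / loopback), i.e. traffic to it stays in the local network."""
    infos = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
    addrs = {info[4][0] for info in infos}
    return all(ipaddress.ip_address(a).is_private for a in addrs)

# Inside a VPC with a private interface endpoint and private DNS, a hostname
# like "bedrock-runtime.us-east-1.amazonaws.com" would return True here;
# from the public internet it resolves to public IPs and returns False.
print(resolves_privately("localhost"))  # loopback counts as private → True
```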