Open Source · Self-Hosted · Community Edition

Own your AI stack.
On your hardware.

OpenGEENii is a self-hosted AI platform for building chat assistants, autonomous agents, and AI workflows. Your data stays where you put it — but you're never limited to local resources.

🏠

Local-first

Runs entirely on your own hardware. No cloud dependency required. Data never leaves your infrastructure.

🔒

Privacy by default

No telemetry, no usage tracking, no third-party data sharing. You control every layer of the stack.

🌐

Network-capable

Connect to any external LLM provider, agent network, or API when you need it. Local-first doesn't mean isolated.

⚙️

Open source

Built on the open-source GEENii framework. Inspect, modify, and extend every component. No vendor lock-in.

What you can build

AI on your terms.

From simple chat interfaces to fully autonomous multi-agent workflows — all running on your own infrastructure.

💬

Custom Chat Assistants

Deploy private, domain-specific chat assistants with full context control. Connect to your documents, databases, and internal tools.

  • RAG pipelines over private data
  • Custom system prompts & personas
  • Multi-user with access controls
  • Chat history stored locally
🤖

Autonomous Agents

Build agents that act, not just answer. Run multi-step tasks, use tools, and orchestrate complex workflows without human intervention.

  • Tool use & function calling
  • Long-running background tasks
  • Agent-to-agent communication
  • Skill marketplace integration
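Tool use follows a common pattern: the model emits a structured call, and the runtime dispatches it to a registered function and feeds the result back. A minimal, framework-agnostic sketch of that loop — the names here are illustrative, not the actual geenii API:

```python
import json

# Tool registry: name -> callable. In geenii this would come from the
# framework's tool interfaces; here it is a plain dict for illustration.
TOOLS = {
    "add": lambda a, b: a + b,
    "upper": lambda text: text.upper(),
}

def dispatch(tool_call_json: str):
    """Execute one structured tool call like {"name": "add", "args": {...}}."""
    call = json.loads(tool_call_json)
    fn = TOOLS[call["name"]]
    return fn(**call["args"])

# In a real agent loop, the JSON would come from the model's output.
result = dispatch('{"name": "add", "args": {"a": 2, "b": 3}}')
print(result)  # 5
```

The runtime's job is everything around this dispatch: validating arguments, handling errors, and returning the result to the model for the next step.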
🔁

AI Workflows

Chain models, tools, and agents into repeatable, auditable workflows. Schedule them, trigger them via API, or run them on demand.

  • Visual workflow builder
  • API & webhook triggers
  • Scheduled execution
  • Full run history & logs
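A repeatable, triggerable workflow like the ones described above is naturally expressed as a declarative definition. The shape below is a hypothetical sketch — the file format, keys, and template syntax are assumptions, not OpenGEENii's actual schema:

```yaml
# Hypothetical workflow definition -- illustrative only,
# not the shipped OpenGEENii schema.
name: summarise-inbox
trigger:
  schedule: "0 7 * * *"            # every morning at 07:00
  webhook: /hooks/summarise-inbox  # or fire it on demand via API
steps:
  - id: fetch
    tool: imap.fetch_unread
  - id: summarise
    model: local/llama3
    prompt: "Summarise these messages: {{ steps.fetch.output }}"
  - id: notify
    tool: slack.post
    input: "{{ steps.summarise.output }}"
```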
Open-source stack

Four components.
One coherent platform.

OpenGEENii is assembled from focused, independently useful open-source projects. Use the full stack or just the parts you need.

geenii
The core framework for building AI agents and workflows. Defines the agent model, tool interfaces, and orchestration primitives.
github →
geeniid
The service daemon that runs AI agents, manages interactions, and exposes the REST API. The runtime heart of OpenGEENii.
github →
geenii-ui
The web-based interface for managing agents, models, tools, and skills. Designed to be accessible to non-technical users.
github →
geenii-aigateway
The AI gateway that normalises connections to local and remote LLM providers behind a single unified API.
github →
LLM support

Local or cloud.
Your choice, always.

Local-first means you start with full privacy and zero external dependencies. But when you need more capability, connecting to an external provider is one configuration change away.

Local models — private by default
Ollama · LM Studio · llama.cpp · LocalAI · Jan · Kobold

Data never leaves your machine. No API keys, no usage costs, no rate limits.

Cloud providers — when you need scale
OpenAI · Anthropic · Google Gemini · Mistral · Groq · Together AI · Azure OpenAI · and more

Route specific workloads to cloud providers without changing your application code.
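The "one config change" routing could look something like this. The file layout, key names, and provider ids are assumptions for illustration, not the real geenii-aigateway format:

```yaml
# Hypothetical geenii-aigateway routing config -- key names are
# illustrative assumptions, not the shipped schema.
providers:
  local:
    type: ollama
    base_url: http://localhost:11434
  cloud:
    type: openai
    api_key: ${OPENAI_API_KEY}

routes:
  default: local           # everything stays on your machine...
  heavy-reasoning: cloud   # ...except workloads you explicitly route out
```

Because the gateway presents one unified API, application code addresses a route name, not a provider, so swapping the backend is a config edit rather than a code change.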

Community vs Enterprise

Start open.
Scale with GEENii.

OpenGEENii is the right starting point for most teams. When your needs grow to include compliance, SLAs, or vertical AI, GEENii Enterprise is a natural upgrade — not a migration.

                     OpenGEENii                      GEENii Enterprise
Hosting              Self-hosted, any hardware       Cloud, on-premise, air-gapped
Compliance           Community best-effort           ✓ ISO/IEC 42001 certified
SLA                  No SLA                          ✓ 99.9% uptime SLA
Skill Marketplace    Read access, community skills   ✓ Full access + Pro Skills
Cost                 Free & open source              Contact for pricing
Get started

Your AI. Your data.
Your infrastructure.

OpenGEENii is free, open-source, and runs wherever you do. Clone the repo, spin up the stack, and you're running in minutes.
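"Spin up the stack" could be a single compose file along these lines. The image names, ports, and service layout are assumptions based on the component names above, not the project's published configuration:

```yaml
# Hypothetical docker-compose.yml -- image names and ports are
# assumptions based on the four components described above.
services:
  geeniid:
    image: geenii/geeniid:latest
    ports: ["8080:8080"]         # REST API / agent runtime
  geenii-ui:
    image: geenii/geenii-ui:latest
    ports: ["3000:3000"]         # web interface
  geenii-aigateway:
    image: geenii/geenii-aigateway:latest
    ports: ["4000:4000"]         # unified LLM API
```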