Nova OS: The Single-Binary AI Platform That Runs Your Entire Agent Stack

Most enterprise software deployments involve a stack of moving parts.

A web server. A database. A cache layer. A message queue. Configuration files spread across directories. Dependencies that need to match specific versions. Services that need to start in the right order. Environment variables that need to be set correctly on every machine. Documentation that describes what the production environment should look like — and a production environment that inevitably drifts from that description.

Enterprise IT teams spend enormous amounts of time managing this complexity. Not building things. Not solving business problems. Managing the infrastructure that the software runs on.

Nova OS ships as a single compiled binary.

~30-50MB. One file. Your entire enterprise AI platform — 23+ specialized agents, AI Firewall, NovaBrain task planner, knowledge base, app marketplace, embedded React dashboard — compiled into a single executable.


What "Single Binary" Actually Means in Practice

When you deploy Nova OS, you run a Docker Compose stack with four services:

nova-os       ← The binary (~30-50MB), embedded dashboard included
surrealdb     ← Knowledge database
postgres      ← Relational storage
marketplace   ← Apps you choose to install

The nova-os binary contains everything: the HTTP server, the routing engine, the agent framework, the firewall, the task planner, the memory system, the skill runner, the marketplace manager, the authentication system, and the React dashboard.

There is no separate routing service to deploy. No separate firewall service to configure. No separate agent runtime to install. No dashboard server to maintain alongside the API server. All of it is in the binary.

Installation comes down to a single command:

curl -fsSL https://get.meganova.ai | sh

Configure your LLM API key, set your database connection, start the stack. Nova OS is running.


What Lives Inside the Binary

The Nova OS binary is not a thin wrapper. It is the full platform:

HTTP Server — Echo framework with authentication middleware (JWT HS256), rate limiting (6,000 req/min default), CORS handling, and request tracing. Every API endpoint, from agent chat to knowledge ingestion to marketplace management, is served by this layer.

Cascade Router — three-tier routing engine. Condition-based rules exit at 5ms when the request matches a known pattern. Semantic matching handles the middle tier at 20-50ms. LLM-based routing resolves ambiguous requests at 500-2000ms. A tier answers as soon as it reaches 0.8 confidence, so only uncertain requests fall through to the slower tiers and fast paths stay fast.

AI Firewall — 21 threat patterns evaluated on every request before it reaches the model. Prompt injection detection, PII identification and redaction, policy violation scoring. 84.6% F1 on PIGuard. 23ms average latency. No request bypasses it.

NovaBrain Task Planner — LLM-powered request decomposition. Complex multi-step requests are broken into structured dependency graphs and executed via Kahn's algorithm, which runs independent tasks in parallel within each dependency level.

Agent Framework — stateless SkillAgent runs an LLM loop with tool invocation, up to 10 turns per request. Each agent has defined skills, a knowledge scope, and behavioral constraints. The binary ships with 23+ specialized agents pre-configured.

Knowledge Layer — SurrealDB connector providing graph search, vector similarity search, and keyword search over your uploaded knowledge base. Retrieval accuracy: 85.4% F1 on LongMemEval.

Skill Runner — 13 skill packs including PDF extraction, Excel analysis, DOCX processing, web research, code execution (sandboxed), database queries, and image generation. Both native Go skills and Python subprocess skills (compiled to bytecode, source deleted) are bundled.

Embedded Dashboard — full React + Vite frontend, embedded via Go's go:embed directive. No separate frontend server. Open a browser, point it at port 8900, and the dashboard is there.


Why a Single Binary Matters for Enterprise IT

Deployment is deterministic. There is no "it works on staging but not production" problem caused by environment drift. The binary is the binary. What runs in your test environment is identical to what runs in production.

Updates are atomic. Upgrading Nova OS means replacing one file. Roll forward: replace the binary. Roll back: restore the previous binary. No partial upgrades. No service-by-service coordination.

Security surface is minimal. Every additional service in a stack is an additional attack surface, an additional set of credentials to manage, an additional process to monitor. Nova OS's core logic runs in one process with one set of credentials.

IP is protected. The binary ships as a compiled, stripped Go executable with symbols removed (-ldflags="-s -w"). The routing engine, firewall patterns, and task planning logic run on your infrastructure as opaque compiled code rather than readable source. Your investment in the platform doesn't become source code sitting on customer servers.

Operational overhead is low. Your IT team monitors one process, not twelve. Log aggregation, health checking, and alerting have one primary target. On-call runbooks are simpler.


What You Configure, What's Baked In

Nova OS draws a deliberate line between what customers control and what ships as protected IP:

You configure                           Baked into the binary
LLM provider credentials                Cascade routing engine
Knowledge base (document uploads)       NovaBrain task planner and DAG executor
Marketplace apps (install/remove)       AI Firewall rules and patterns
Ports, storage paths, env vars          Skill implementations
Agent personas and corporate identity   Dashboard and API layer

You control the data and the configuration. The intelligence — the routing logic, the planning engine, the safety system — is compiled in and protected.


The API

Nova OS exposes an OpenAI-compatible endpoint:

POST /v1/chat/completions

Any application built for the OpenAI API works with Nova OS without modification. Point your existing tools at http://localhost:8900, authenticate with your Nova OS JWT, and your stack integrates immediately.

Additional endpoints cover knowledge management, firewall validation, marketplace control, user management, and agent configuration — a complete management API alongside the inference layer.


Available Soon

Nova OS is launching soon. The binary is built. The benchmarks are done. One command, and your enterprise AI platform is running.

Get Early Access to Nova OS →

Stay Connected

💻 Website: meganova.ai

📖 Docs: docs.meganova.ai

✍️ Blog: Read our Blog

🐦 Twitter: @meganovaai

🎮 Discord: Join our Discord