How to Use MegaNova with OpenClaw
Access 100+ AI models through one API — at a fraction of what you'd pay for Claude or OpenAI.
What is MegaNova?
MegaNova is an API gateway that gives you access to 100+ AI models — DeepSeek, Qwen, Kimi, GLM, MiniMax, and more — through a single OpenAI-compatible endpoint. No need to manage separate accounts or billing with each provider.
Why Use MegaNova?
One API, massive cost savings
Models like DeepSeek-V3.2 and Qwen3.5-Plus deliver strong results at a tiny fraction of Claude or OpenAI pricing. MegaNova lets you access all of them with one key.
For comparison (input / output, per 1M tokens): Claude Sonnet 4.5: $2.40 / $12.00 · Claude Opus 4.5: $4.00 / $20.00
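If you want to sanity-check the savings yourself, here's a quick back-of-the-envelope in Python. It assumes the prices above are USD per million tokens (input/output) and uses the ~97% DeepSeek-V3.2 savings estimate quoted later in this guide — neither is official MegaNova pricing:

```python
def cost_usd(tokens_in: int, tokens_out: int, price_in: float, price_out: float) -> float:
    """Cost of a workload given per-1M-token input/output prices."""
    return tokens_in / 1_000_000 * price_in + tokens_out / 1_000_000 * price_out

# 1M input + 1M output tokens on Claude Sonnet 4.5 at the prices above:
sonnet = cost_usd(1_000_000, 1_000_000, 2.40, 12.00)

# Same workload at the article's ~97%-cheaper estimate for DeepSeek-V3.2:
deepseek_est = sonnet * 0.03

print(f"Sonnet: ${sonnet:.2f}, DeepSeek (est.): ${deepseek_est:.2f}")
```

Run it with your own monthly token counts to see what the gap means for your workload.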
Other benefits
- One API key for 100+ models — no juggling provider accounts
- OpenAI-compatible endpoint — drop-in replacement, works with any OpenAI SDK
- Unified billing and usage tracking across all models
- Free daily quota — test every model before spending a cent
Free Quota — Test OpenClaw for Free
MegaNova's free tier is perfect for trying out OpenClaw before committing any money.
What can you do with 550 free messages/day?
- Test OpenClaw for 1–2 weeks with real workloads
- Evaluate all available models side-by-side
- Build and test your first agents
- ~23 messages per hour, even if used around the clock
Just sign up → get your key → start using OpenClaw. No payment needed.
When you're ready for more volume, see meganova.ai for current paid pricing.
Setup (5 Minutes)
Step 1: Get Your API Key
- Go to meganova.ai
- Sign up (free, no credit card needed)
- Copy your API key from the Dashboard
Step 2: Configure OpenClaw
Choose Option A (interactive wizard) or Option B (manual config):
Option A: Use the Onboarding Wizard
```shell
openclaw onboard
```
When prompted, select the following options:
- Where will the Gateway run? → Local (this machine)
- Select sections to configure → Model
- Model/auth provider → Custom Provider
- API Base URL → https://api.meganova.ai/v1
- API Key → Paste your MegaNova API key
- Endpoint compatibility → OpenAI-compatible
- Model ID → Enter your model (e.g. moonshotai/Kimi-K2.5 or deepseek/DeepSeek-V3.2)
- Model alias (optional) → Set a shortcut name (e.g. kimi, deepseek)
The wizard will verify your endpoint and confirm the setup.
Option B: Edit Config Directly
Edit ~/.openclaw/openclaw.json and add the MegaNova provider block:
```json
{
  "models": {
    "providers": {
      "meganova": {
        "baseUrl": "https://api.meganova.ai/v1",
        "apiKey": "YOUR_MEGANOVA_API_KEY",
        "api": "openai-completions",
        "models": [
          {
            "id": "moonshotai/Kimi-K2.5",
            "name": "Kimi K2.5",
            "contextWindow": 128000,
            "maxTokens": 8192
          }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "meganova/moonshotai/Kimi-K2.5"
      }
    }
  }
}
```
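A hand-edited config's most common failure mode is a structural mistake. This is a minimal sketch, assuming the `~/.openclaw/openclaw.json` layout shown above; it only checks structure, not whether the key actually works against the API:

```python
import json
from pathlib import Path


def check_meganova_config(path: str = "~/.openclaw/openclaw.json") -> list[str]:
    """Return a list of problems in the MegaNova provider block (empty = OK)."""
    try:
        config = json.loads(Path(path).expanduser().read_text())
    except (OSError, json.JSONDecodeError) as exc:
        return [f"cannot read config: {exc}"]

    provider = config.get("models", {}).get("providers", {}).get("meganova")
    if provider is None:
        return ["no 'meganova' entry under models.providers"]

    problems = []
    for key in ("baseUrl", "apiKey", "api", "models"):
        if key not in provider:
            problems.append(f"missing key: {key}")
    if provider.get("apiKey") == "YOUR_MEGANOVA_API_KEY":
        problems.append("apiKey is still the placeholder")
    for model in provider.get("models", []):
        if model.get("maxTokens", 0) > model.get("contextWindow", 0):
            problems.append(f"{model.get('id')}: maxTokens exceeds contextWindow")
    return problems
```

An empty list means the block is structurally sound; anything else is worth fixing before restarting.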
Then restart OpenClaw:
```shell
openclaw gateway restart
```
Step 3: Set Your Default and Fallbacks
```shell
# Set DeepSeek as primary (best value)
openclaw models set meganova/deepseek/DeepSeek-V3.2

# Add Kimi as fallback
openclaw models fallbacks add meganova/moonshotai/Kimi-K2.5
```
Step 4: Test It
```shell
openclaw chat "Hello, test message"
```
If you get a response, you're good to go! ✅
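Why bother with fallbacks? Any single model can hit rate limits or an outage. OpenClaw handles the switching for you; the sketch below only illustrates the idea and is not OpenClaw's actual implementation:

```python
def call_with_fallbacks(send, models):
    """Try each model in order; return (model_used, response) for the first success.

    `send` is any callable that takes a model ID and either returns a
    response or raises (rate limit, timeout, outage, ...).
    """
    errors = []
    for model in models:
        try:
            return model, send(model)
        except Exception as exc:
            errors.append((model, exc))
    raise RuntimeError(f"all models failed: {errors}")


# The chain configured in Step 3:
chain = [
    "meganova/deepseek/DeepSeek-V3.2",  # primary
    "meganova/moonshotai/Kimi-K2.5",    # fallback
]
```

If the primary errors out, the request transparently retries on the next model in the chain.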
Recommended Models
By Use Case
🎭 AI Character RP / Chatbots:
- Primary: DeepSeek-V3.2 (best quality/price)
- Fallback: Kimi-K2.5 (128K context for long conversations)
- Budget: MiniMax-M2.5 (fastest, cheapest)
💻 Coding / Development:
- Primary: DeepSeek-V3.2 (excellent at code generation)
- Fallback: GLM-5 (reasoning enabled for complex logic)
- Alternative: Qwen3.5-Plus (good for debugging)
📚 Research / Document Analysis:
- Primary: Kimi-K2.5 (128K context window)
- Fallback: GLM-5 (200K context, reasoning)
- Budget: DeepSeek-V3.2 (128K, cheaper)
🎨 Multimodal (Images + Text):
- Primary: Qwen3.5-Plus (vision + text)
- Alternative: Kimi-K2.5 (multimodal support)
⚡ Fast Responses Needed:
- Primary: MiniMax-M2.5 (~200ms first token)
- Fallback: DeepSeek-V3.2 (~400ms first token)
🏢 Production / Enterprise:
- Primary: GLM-5 (reasoning, most capable)
- Fallback: DeepSeek-V3.2 + Kimi-K2.5 (multiple fallbacks)
- Why: GLM-5 has reasoning enabled for complex tasks
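If you script model selection, the picks above fit in a small lookup table. The DeepSeek, Kimi, and GLM IDs match the `provider/model` form used elsewhere in this guide; the Qwen and MiniMax prefixes are guesses — check MegaNova's model list for the exact IDs:

```python
# (primary, fallback) picks per use case, as recommended above.
# Qwen and MiniMax ID prefixes are assumptions, not confirmed IDs.
RECOMMENDED = {
    "roleplay":   ("deepseek/DeepSeek-V3.2", "moonshotai/Kimi-K2.5"),
    "coding":     ("deepseek/DeepSeek-V3.2", "zai-org/GLM-5"),
    "research":   ("moonshotai/Kimi-K2.5",   "zai-org/GLM-5"),
    "multimodal": ("Qwen/Qwen3.5-Plus",      "moonshotai/Kimi-K2.5"),
    "fast":       ("MiniMax/MiniMax-M2.5",   "deepseek/DeepSeek-V3.2"),
    "production": ("zai-org/GLM-5",          "deepseek/DeepSeek-V3.2"),
}


def pick(use_case: str) -> tuple[str, str]:
    """Return (primary, fallback) model IDs for a use case."""
    try:
        return RECOMMENDED[use_case]
    except KeyError:
        raise ValueError(f"unknown use case {use_case!r}; options: {sorted(RECOMMENDED)}")
```

Prefix each ID with `meganova/` when passing it to `openclaw models set`.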
My Personal Setup
```json
{
  "primary": "meganova/deepseek/DeepSeek-V3.2",
  "fallbacks": [
    "meganova/moonshotai/Kimi-K2.5",
    "meganova/zai-org/GLM-5"
  ]
}
```
Why this combo?
- DeepSeek-V3.2: ~97% cheaper than Claude Sonnet, handles most tasks well
- Kimi-K2.5: 128K context backup for long documents
- GLM-5: 200K context with reasoning for complex tasks
Code Example
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.meganova.ai/v1",
    api_key="sk-YOUR_KEY",
)

response = client.chat.completions.create(
    model="deepseek/DeepSeek-V3.2",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
TL;DR
- Sign up free at meganova.ai
- Run openclaw onboard to configure MegaNova as your provider
- Use your 550 free daily messages to test everything
- Save 80–98% compared to Claude or OpenAI pricing
Questions? Check docs.meganova.ai or ask in Discord.
Setup time: ~5 minutes
Stay Connected
- Website: meganova.ai
- Discord: Join our Discord
- Reddit: r/MegaNovaAI
- X: @meganovaai