Pricing

Pricing Built Around What You Need

Product-Line Pricing

Start with one product. Expand as your needs grow. Every plan includes controls mapped to SOC 2 TSC and request-level audit records.

LLM Gateway

Governed AI access for your entire organization. Route, monitor, and control every LLM request across providers.

Starter

For teams evaluating governed AI access

Contact Sales

Custom pricing for your needs

  • Rate limits and request throttling
  • Basic cost tracking and usage reports
  • 2 AI providers (e.g., OpenAI, Anthropic)
  • API key management
  • Audit logging
  • Email support (24hr response)
  • Policy-based routing
  • PKCE browser sessions
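The "PKCE browser sessions" item refers to Proof Key for Code Exchange (RFC 7636), which secures browser-based OAuth logins without a client secret: the client generates a random verifier, sends a SHA-256 challenge with the authorization request, and later proves possession by presenting the original verifier. A minimal, product-independent sketch of the verifier/challenge pair (the function name here is illustrative, not part of the product's API):

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Generate a PKCE code_verifier and S256 code_challenge per RFC 7636."""
    # Verifier: a high-entropy URL-safe string, 43-128 chars (RFC 7636 §4.1).
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # Challenge: base64url(SHA-256(verifier)) with padding stripped (§4.2, "S256").
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge
```

The challenge travels with the initial authorization request; the verifier is sent only on the token exchange, so an intercepted authorization code cannot be redeemed without it.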

Enterprise

Full governance at scale

Contact Sales

Tailored to your organization

  • Custom rate limits and routing rules
  • Priority support team, with response targets scoped to your contract
  • High-availability operating target (a formal SLA with service credits applies only where specified in your contract)
  • SSO / SAML integration
  • Custom organization policies
  • Priority model routing
  • SCIM provisioning
  • Dedicated infrastructure option
Every plan includes:

  • Controls mapped to SOC 2 TSC
  • TLS 1.3 in transit + AES-256 at rest
  • Audit logging
  • RBAC + MFA
  • High-availability operating target (formal SLA where contracted)
  • Priority support available

Frequently Asked Questions

Can I purchase a single product line on its own?

Absolutely. Each product line is available independently. Most organizations start with LLM Gateway to govern AI access across their teams, then expand to Magic Runtime or the full platform as their needs grow. There are no bundling requirements.
Is Magic Runtime open source?

Yes. The core Magic Runtime is released under the Apache 2.0 license. You can self-host it, modify it, and contribute to it. The Professional and Enterprise tiers add managed deployment, enterprise admin tooling, procurement materials, and priority support on top of the open-source core.
What does the Enterprise Platform bundle include?

The Enterprise Platform bundle covers three products: LLM Gateway for governed AI access, Magic Runtime for governed execution, and ThreadSync Core for enterprise data integration. Platform reliability and observability are built into the platform itself, not sold as a separate tier. Architecture review, SLA terms, and Customer Success Manager assignment are negotiated as part of the contract where included.
Does LLM Gateway pricing include AI provider costs?

No. LLM Gateway pricing covers the governance, routing, monitoring, and management layer. AI provider costs (tokens from OpenAI, Anthropic, etc.) are separate and either billed directly by those providers or passed through transparently. Our budget controls and cost tracking help you manage and optimize those provider costs across your organization.
Do you offer pricing for startups?

Yes. We work with companies at every stage. If you're a startup or growth-stage company, contact our sales team to discuss flexible pricing options tailored to your scale and trajectory. We also offer proof-of-concept periods for qualified organizations.

Ready to Get Started?

Talk to our solutions engineering team to find the right configuration for your organization.