Instant expert answers about HeliosDB — powered by a model trained on every page of documentation, every SQL example, and every configuration option.
Connect your HeliosDB deployment to an AI knowledge assistant that understands every SQL dialect, configuration parameter, and best practice across the entire HeliosDB platform. Ask questions in natural language, get answers with ready-to-run SQL.
Available for all three tiers — Nano, Lite, and Full. Included free with all HeliosDB Cloud plans. On-premises deployments use a licensed API key.
import os
from heliosdb import AISupport

# Read the key from the environment rather than hard-coding it
support = AISupport(api_key=os.environ["HELIOSDB_AISUPPORT_API_KEY"])
answer = support.ask(
    "How do I create an HNSW index with PQ compression?",
    context={"tier": "nano", "version": "3.6.0"},
)
print(answer.response)     # Step-by-step guide
print(answer.sql_example)  # Ready-to-run SQL
print(answer.references)   # Links to docs
AI Support combines a purpose-trained language model with a retrieval-augmented generation (RAG) pipeline over the entire HeliosDB knowledge base. Every question is matched against documentation, code examples, configuration references, and troubleshooting guides to deliver accurate, context-aware answers.
# AI Support Pipeline
User Question
→ "How do I set up vector search?"
│
Context Retrieval (RAG)
→ Semantic search over 6K+ doc pages
→ Match: tier=nano, topic=vector_search
→ Retrieve: SQL syntax, config params, examples
│
Trained Model
→ Generate answer with context
→ Include SQL examples
→ Cite documentation sources
│
Response
→ Step-by-step explanation
→ Ready-to-run SQL
→ Links to relevant docs
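The retrieval step in the pipeline above can be sketched in plain Python. This is an illustrative toy, not the production retriever: it scores documents against a question by cosine similarity over bag-of-words vectors, where a real deployment would use dense embeddings over the 6K+ doc pages. The tiny corpus below is made up for the example.

```python
# Toy semantic-retrieval sketch: rank doc chunks by cosine similarity
# between bag-of-words term vectors. Illustrative only -- production RAG
# pipelines use dense embedding models, not word counts.
import math
import re
from collections import Counter

def tokens(text: str) -> Counter:
    """Lowercased word counts, punctuation stripped."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, corpus: dict[str, str], k: int = 1) -> list[str]:
    """Return the ids of the k corpus entries most similar to the question."""
    q = tokens(question)
    ranked = sorted(corpus, key=lambda d: cosine(q, tokens(corpus[d])), reverse=True)
    return ranked[:k]

# Stand-in corpus; real retrieval runs over the HeliosDB documentation.
corpus = {
    "vector_search": "create hnsw index vector search pq compression",
    "branching": "create branch merge branch copy on write snapshot",
}
print(retrieve("How do I set up vector search?", corpus))  # ['vector_search']
```

The retrieved chunks, plus the tier and version from the request context, are what the trained model conditions on when generating the answer.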
AI Support is trained on the complete HeliosDB documentation corpus, including internal architecture guides, code examples across all supported languages, and every configuration parameter. It understands the differences between tiers and can guide you from getting started to production-grade optimization.
AI Support covers every heliosdb.toml parameter, ALTER SYSTEM command, and runtime tuning option, with explanations and recommended values.
# Ask about configuration
> "What's the optimal wal_sync_mode for SSD?"
Answer:
For SSD deployments, set wal_sync_mode to
"fdatasync" in heliosdb.toml:
[storage]
wal_sync_mode = "fdatasync"
This avoids the overhead of full fsync while
still guaranteeing durability on SSD hardware.
For NVMe drives, consider "io_uring" mode
(Lite/Full only) for async I/O.
# Ask about SQL
> "How do I branch a database?"
Answer:
Database branching creates an isolated copy-on-
write snapshot:
CREATE BRANCH dev;
USE BRANCH dev;
-- Changes here don't affect main
MERGE BRANCH dev INTO main;
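The copy-on-write semantics behind CREATE BRANCH can be pictured with a two-layer map: the branch reads through to main but keeps its own writes in a separate delta, and a merge folds that delta back. This is a conceptual sketch of the semantics only; the actual storage engine would snapshot at a much lower level than Python dicts.

```python
# Copy-on-write branching, sketched with ChainMap: writes land in the
# branch's own delta layer, reads fall through to main. Conceptual only.
from collections import ChainMap

main = {"users": 100}

# CREATE BRANCH dev; -- dev sees main's data, but its writes are isolated
dev_delta: dict = {}
dev = ChainMap(dev_delta, main)

dev["users"] = 150           # change made on the branch...
assert main["users"] == 100  # ...does not affect main

# MERGE BRANCH dev INTO main; -- apply the branch's delta back to main
main.update(dev_delta)
assert main["users"] == 150
```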
AI Support is included free with HeliosDB Cloud. For self-hosted and on-premises deployments, a licensed API key is required.
AI Support is built on an extensible architecture. Add knowledge about your own systems, connect external tools, or integrate with other database platforms.
Deep understanding of all three tiers — Nano, Lite, and Full — including SQL syntax, configuration, architecture internals, and deployment best practices.
Add custom knowledge about your specific deployment, internal runbooks, team conventions, and proprietary schema documentation. Extend the knowledge base via RAG pipelines over your own documents.
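Ingesting your own runbooks typically starts with splitting them into overlapping chunks a retriever can index alongside the built-in docs. The chunk size and overlap below are illustrative tuning knobs, not HeliosDB defaults, and the helper is a hypothetical sketch rather than a documented API.

```python
# Hypothetical ingestion step: split a document into overlapping
# fixed-size word chunks for a RAG index. Sizes are illustrative.
def chunk(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Split text into chunks of `size` words, each overlapping the
    previous chunk by `overlap` words so context isn't cut mid-topic."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size]) for i in range(0, len(words), step)]

# A stand-in runbook of 100 words (5 words repeated 20 times)
runbook = "restart the replica before failover " * 20
pieces = chunk(runbook)
print(len(pieces))  # 4
```

Each chunk would then be embedded and added to the retrieval index so answers can cite your internal documents the same way they cite the official ones.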
Connect AI Support to external tools and workflows via the Model Context Protocol (MCP). Integrate with CI/CD pipelines, monitoring dashboards, and AI agent frameworks like LangChain and CrewAI.
New developers get instant answers to HeliosDB questions without waiting for senior team members. Reduce onboarding time from weeks to days.
No more waiting for support tickets or searching through documentation. Get expert-level answers about HeliosDB at any hour, in any timezone.
Validate SQL queries before production. Get optimized alternatives, identify potential performance issues, and generate complex queries from natural language descriptions.
Ask about encryption setup, access control policies, audit logging, and regulatory compliance configuration. Get step-by-step guidance for SOC 2, HIPAA, and GDPR requirements.
Be among the first to use HeliosDB AI Support. Available for all tiers — free on Cloud, licensed for on-premises.