Beta: The Aivyx ecosystem is currently in beta. Features may change and bugs are expected.

Documentation

Everything you need to get started and go deep.

Quick Start

From zero to chatting with your own AI agent.

📦 Install

cargo install --git https://github.com/AivyxDev/aivyx.git aivyx-cli

Need Rust? See the Download page for complete step-by-step instructions.

🔑 Set Up

aivyx init

Creates your encrypted data directory, configures your LLM provider, and sets your passphrase.

💬 Chat

aivyx chat assistant

Start an interactive conversation. Or use aivyx tui assistant for the full terminal UI.
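Put together, the three steps above make up a complete first session. This is just the commands from the cards above in sequence; nothing new is introduced:

```shell
# 1. Install the CLI from source (requires a Rust toolchain)
cargo install --git https://github.com/AivyxDev/aivyx.git aivyx-cli

# 2. One-time setup: encrypted data directory, LLM provider, passphrase
aivyx init

# 3. Start an interactive conversation with the assistant agent
aivyx chat assistant
```

From here, swap `chat` for `tui` whenever you want the full-screen terminal UI instead.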

Everyday Commands

The commands you'll use most often.

Run a Task

aivyx run assistant "Explain closures in Rust"

Single-turn — ask a question, get an answer.

Interactive Chat

aivyx chat assistant

Multi-turn conversation with memory.

Terminal UI

aivyx tui assistant

Full-screen chat with sessions, markdown, and vim keys.

Use a Skill

aivyx run assistant --skill code-review

22 built-in skills for research, audits, analysis, and more.

Store an API Key

aivyx secret set OPENAI_API_KEY

Encrypted secret store — enter the value interactively.

Add an MCP Tool

aivyx mcp add github \
  --command "npx -y @mcp/server-github"

Connect external tools via MCP.

Search Memories

aivyx memory search "Rust ownership"

Find knowledge your agent has learned.

Verify Audit Log

aivyx audit verify

Confirm no one has tampered with your logs.

Backup Everything

aivyx backup

Create an encrypted archive of all your data.

Supported LLM Providers

Choose the provider that fits your needs.

🏠 Ollama (Local)

100% offline. Your data never leaves your machine.

aivyx config set provider ollama
aivyx config set ollama.model llama3.2
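Combining the two config commands above with `aivyx run` from the Everyday Commands section gives a fully offline workflow. The model name `llama3.2` is the one shown above; substitute any model you have already pulled into Ollama:

```shell
# Point Aivyx at the local Ollama provider
aivyx config set provider ollama
aivyx config set ollama.model llama3.2

# Single-turn question, answered entirely on your machine
aivyx run assistant "Explain closures in Rust"
```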

OpenAI

aivyx config set provider openai
aivyx secret set OPENAI_API_KEY

Anthropic

aivyx config set provider anthropic
aivyx secret set ANTHROPIC_API_KEY

Google Gemini

aivyx config set provider gemini
aivyx secret set GEMINI_API_KEY