Quickstart Guide
Get your AI remembering things in under 5 minutes. No complex setup required.
- Get Your API Key: Sign up and create an API key from the dashboard.
- Install the SDK: Choose Python or TypeScript and install the package.
- Start Building: Add memories and search with just a few lines of code.
Step 1: Create an account & get an API key
Brand new to Hebbrix? Follow these three short steps:
- Sign up at /signup using an email & password (free tier: 1,000 credits/month, no credit card required).
- Confirm your email, then log in to the Developer Dashboard.
- Open Dashboard → API Keys and click Create key. Copy it — you only see the full value once.
Prefer scripting the signup? You can also hit POST /v1/auth/register with {email, password, full_name} and use the returned access_token directly.
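If you take the scripted route, a minimal sketch looks like the following. The endpoint and request body come from the note above; the error handling and timeout are illustrative assumptions, not part of the documented contract.

```python
import requests

BASE = "https://api.hebbrix.com/v1"

def register(email: str, password: str, full_name: str) -> str:
    """Create an account via the API and return its access token."""
    resp = requests.post(
        f"{BASE}/auth/register",
        json={"email": email, "password": password, "full_name": full_name},
        timeout=10,  # assumed sensible default, not API-mandated
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```

You can export the returned token as HEBBRIX_API_KEY and use it exactly like a dashboard-issued key.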
```shell
export HEBBRIX_API_KEY="mem_sk_your_api_key_here"
```

Step 2: Install the SDK
We support Python and TypeScript. Pick your favorite:
```shell
pip install hebbrix
```

```shell
npm install hebbrix
```

Step 3: Create Your First Memory
Store a memory and then search for it. That's all it takes.
```python
import asyncio

from hebbrix import MemoryClient


async def main():
    async with MemoryClient(api_key="mem_sk_...") as client:
        # Create a collection (first time only)
        collection = await client.collections.create(name="my-first-agent")

        # Store a memory
        memory = await client.memories.create(
            collection_id=collection["id"],
            content="User prefers dark mode and uses Python for development",
        )
        print(f"Created memory: {memory['id']}")

        # Search it
        results = await client.search(
            query="what programming language does the user prefer?",
            collection_id=collection["id"],
        )
        for r in results:
            print(f"Found: {r['content']}")

asyncio.run(main())
```

```typescript
import { MemoryClient } from 'hebbrix';

const client = new MemoryClient({ apiKey: 'mem_sk_...' });

// Create a collection (first time only)
const collection = await client.collections.create({
  name: 'my-first-agent',
});

// Store a memory
const memory = await client.memories.create({
  collectionId: collection.id,
  content: 'User prefers dark mode and uses TypeScript for development',
});
console.log('Created memory:', memory.id);

// Search it
const results = await client.search({
  query: 'what programming language does the user prefer?',
  collectionId: collection.id,
});

for (const r of results) {
  console.log('Found:', r.content);
}
```

Step 4: Organize with collections
Collections are buckets you can route memories, documents, and media into. If you don't pick one, Hebbrix assigns everything to your __default__ collection — which is fine for solo experiments but hard to untangle once you have real traffic. A minute spent creating a collection now saves an hour of sorting later.
```python
import requests

BASE = "https://api.hebbrix.com/v1"
H = {"Authorization": f"Bearer {YOUR_API_KEY}"}

# 1) Create a collection for this customer / project / agent
r = requests.post(
    f"{BASE}/collections",
    headers=H,
    json={"name": "customer-acme", "description": "Acme Corp data"},
)
collection_id = r.json()["id"]

# 2) Store a memory INSIDE that collection
requests.post(
    f"{BASE}/memories/raw",
    headers=H,
    json={"content": "Acme uses Python on the backend",
          "collection_id": collection_id},
)

# 3) Search inside that collection only
results = requests.post(
    f"{BASE}/search",
    headers=H,
    json={"query": "What does Acme use?",
          "collection_id": collection_id},
).json()

# 4) Delete the collection when the customer leaves. This also removes
#    every memory, document, and media file inside it in a single call.
requests.delete(f"{BASE}/collections/{collection_id}", headers=H)
```

Want to enforce explicit routing? Add the header X-Hebbrix-Require-Collection: true on any upload or create request. A request missing collection_id will then return HTTP 422 instead of silently defaulting.
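Strict routing can be captured in a small helper. The header name and the 422 behavior are as documented above; the wrapper function and its error message are hypothetical, shown only to illustrate the pattern.

```python
import requests

BASE = "https://api.hebbrix.com/v1"

def strict_headers(api_key: str) -> dict:
    """Auth headers that opt this request in to explicit collection routing."""
    return {
        "Authorization": f"Bearer {api_key}",
        "X-Hebbrix-Require-Collection": "true",
    }

def create_memory_strict(api_key: str, content: str, collection_id: str) -> dict:
    """Store a memory; with strict routing, collection_id must be present."""
    r = requests.post(
        f"{BASE}/memories/raw",
        headers=strict_headers(api_key),
        json={"content": content, "collection_id": collection_id},
        timeout=10,
    )
    if r.status_code == 422:
        # Strict routing rejected a request without a collection_id
        raise ValueError("collection_id is required when strict routing is on")
    r.raise_for_status()
    return r.json()
```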
Deleting a collection is destructive. It hard-deletes every memory, document, and media file inside — orphans are never left behind. The collection ID becomes unusable. Back up anything you need first, or use DELETE /v1/memories/{id} to remove individual items.
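When you only need to prune a single item, the per-memory endpoint mentioned above is the safer tool. A sketch, assuming the endpoint returns 404 for an already-deleted ID (that status handling is an assumption, not documented here):

```python
import requests

BASE = "https://api.hebbrix.com/v1"

def delete_memory(api_key: str, memory_id: str) -> bool:
    """Delete one memory by ID, leaving the rest of its collection intact.

    Returns False if the memory was already gone (assumed 404 semantics).
    """
    r = requests.delete(
        f"{BASE}/memories/{memory_id}",
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    if r.status_code == 404:
        return False
    r.raise_for_status()
    return True
```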
Bonus: OpenAI-Compatible Chat
Use our chat completions endpoint as a drop-in replacement for OpenAI. Memories are automatically injected into the context.
```python
from openai import OpenAI

# Point to Hebbrix instead of OpenAI
client = OpenAI(
    api_key="your_hebbrix_api_key",
    base_url="https://api.hebbrix.com/v1",
)

# Chat with memory
response = client.chat.completions.create(
    model="gpt-5-nano",  # or any allowed model
    messages=[
        {"role": "user", "content": "What do I prefer?"}
    ],
)

# The response will include relevant memories automatically
print(response.choices[0].message.content)
```

What's Next?
Now that you've got the basics, explore more features:
