Installation
Quickstart
Without configuration, the SDK defaults to the demo server. For production use:
- Get your API key at app.honcho.dev/api-keys
- Set `environment="production"` and provide your `api_key`
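A minimal sketch of that setup, assuming the Honcho Python SDK's `Honcho` client with `environment`/`api_key` constructor parameters, and `peer()`/`session()`/`add_messages()` as the basic helpers:

```python
from honcho import Honcho

# Production client; the key comes from app.honcho.dev/api-keys.
honcho = Honcho(
    environment="production",
    api_key="YOUR_API_KEY",
)

# Handles for a peer and a session (both are created lazily; see below).
alice = honcho.peer("alice")
session = honcho.session("support-thread-1")

# Record a message from alice in the session.
session.add_messages([alice.message("Hi, I need help with my order.")])
```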
Core Concepts
Peers and Representations
Representations are how Honcho models what peers know. Each peer has a global representation (everything they know across all sessions) and local representations (what other specific peers know about them, scoped by session or globally).
Core Classes
Honcho Client
The main entry point for workspace operations. The client recognizes these environment variables:
- `HONCHO_API_KEY` - API key for authentication
- `HONCHO_BASE_URL` - Base URL for the Honcho API
- `HONCHO_WORKSPACE_ID` - Default workspace ID
Peer and session creation is lazy - no API calls are made until you actually use the peer or session.
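To illustrate laziness, a sketch under the same assumed SDK surface:

```python
from honcho import Honcho

honcho = Honcho()  # defaults to the demo server

# No API calls yet: peer() and session() return lightweight local handles.
bob = honcho.peer("bob")
session = honcho.session("chat-42")

# The first network round-trip happens only when a handle is used, e.g.:
session.add_messages([bob.message("Hello!")])
```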
Peer
Represents an entity that can participate in conversations.
Peer Context
The `get_context()` method on peers retrieves both the working representation and peer card in a single API call:
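A sketch of that call; the attribute names on the returned object (`representation`, `peer_card`) are assumptions based on the description above:

```python
from honcho import Honcho

honcho = Honcho()
alice = honcho.peer("alice")

# One API call returns both pieces of peer state.
context = alice.get_context()
print(context.representation)  # working representation of alice
print(context.peer_card)       # alice's peer card
```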
Observations
Peers can access their observations (facts derived from messages) through the `observations` property and the `observations_of()` method:
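A sketch of both access paths; the iteration style and the `content` attribute on observations are assumptions:

```python
from honcho import Honcho

honcho = Honcho()
alice = honcho.peer("alice")

# Facts Honcho has derived about alice from her messages.
for obs in alice.observations:
    print(obs.content)

# What alice knows about a specific other peer.
for obs in alice.observations_of("bob"):
    print(obs.content)
```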
Creating Observations Manually
You can also create observations directly, which is useful for importing data or adding explicit facts. Manually created observations are marked as “explicit” and are treated the same as system-derived observations. Each observation must be tied to a session, and the content length is validated against the embedding token limit.
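A hedged sketch of manual creation; the method name `add_observations` and its signature are hypothetical, since only the behavior (explicit marking, session requirement, token-limit validation) is described above:

```python
from honcho import Honcho

honcho = Honcho()
alice = honcho.peer("alice")
session = honcho.session("import-batch-1")

# Hypothetical call: each observation must be tied to a session, and the
# content is validated against the embedding token limit.
alice.add_observations(
    ["Alice prefers email over phone support."],
    session=session,
)
# These observations are marked "explicit" but otherwise behave like
# system-derived ones.
```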
Session
Manages multi-party conversations.
Theory of Mind controls whether peers can form models of what other peers think. Use `observe_others=False` to prevent a peer from modeling others within a session, and `observe_me=False` to prevent others from modeling this peer within a session.
SessionContext
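A sketch of per-peer session configuration; the `SessionPeerConfig` options object is an assumption, but the `observe_others` and `observe_me` flags come from the description above:

```python
from honcho import Honcho, SessionPeerConfig  # SessionPeerConfig is assumed

honcho = Honcho()
session = honcho.session("group-chat")
alice = honcho.peer("alice")
bob = honcho.peer("bob")

session.add_peers([
    alice,
    # bob neither models other peers (observe_others) nor is modeled by
    # them (observe_me) within this session.
    (bob, SessionPeerConfig(observe_others=False, observe_me=False)),
])
```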
Provides formatted conversation context for LLM integration:

| Parameter | Type | Description |
|---|---|---|
| `summary` | `bool` | Whether to include summary (default: `true`) |
| `tokens` | `int` | Maximum tokens to include |
| `peer_target` | `str` | Peer ID to get representation for |
| `peer_perspective` | `str` | Peer ID for perspective (requires `peer_target`) |
| `last_user_message` | `str` | Most recent message for semantic search |
| `limit_to_session` | `bool` | Limit representation to session only |
| `search_top_k` | `int` | Number of semantic search results (1-100) |
| `search_max_distance` | `float` | Max semantic distance (0.0-1.0) |
| `include_most_derived` | `bool` | Include most derived observations |
| `max_observations` | `int` | Max observations to include (1-100) |
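Combining the parameters above, a sketch of a context request (the call site is illustrative; parameter names follow the table):

```python
from honcho import Honcho

honcho = Honcho()
session = honcho.session("chat-42")

context = session.get_context(
    summary=True,                # include the session summary
    tokens=2000,                 # cap the context size
    peer_target="alice",         # representation of alice...
    peer_perspective="bob",      # ...as modeled by bob
    last_user_message="What did I say about shipping?",
    search_top_k=10,
    search_max_distance=0.8,
    max_observations=25,
)
```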
Advanced Usage
Multi-Party Conversations
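A sketch of a multi-party session under the same assumed SDK surface, with several peers contributing messages to one conversation:

```python
from honcho import Honcho

honcho = Honcho()
session = honcho.session("team-standup")

alice = honcho.peer("alice")
bob = honcho.peer("bob")
carol = honcho.peer("carol")

session.add_peers([alice, bob, carol])
session.add_messages([
    alice.message("I shipped the billing fix yesterday."),
    bob.message("Nice, I'll pick up the search bug next."),
    carol.message("I'm reviewing both PRs this afternoon."),
])
```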
LLM Integration
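A sketch of wiring session context into an LLM call; the `to_openai()` conversion helper is an assumption, and the client usage follows the openai Python library:

```python
from honcho import Honcho
from openai import OpenAI

honcho = Honcho()
session = honcho.session("chat-42")
assistant = honcho.peer("assistant")

# Formatted context for the model (summary plus recent messages).
context = session.get_context(summary=True, tokens=2000)
messages = context.to_openai(assistant=assistant)  # assumed helper

client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
)
print(completion.choices[0].message.content)
```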
Custom Message Timestamps
When creating messages, you can optionally specify a custom `created_at` timestamp instead of using the server’s current time:
- Importing historical conversation data
- Backfilling messages from other systems
- Maintaining accurate timeline ordering when processing batch data
If `created_at` is not provided, messages will use the server’s current timestamp.
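A sketch of backfilling historical data, assuming `created_at` is accepted by the message helper:

```python
from datetime import datetime, timezone

from honcho import Honcho

honcho = Honcho()
session = honcho.session("import-batch-1")
alice = honcho.peer("alice")

# Preserve the original timeline when importing historical messages.
session.add_messages([
    alice.message(
        "Order #1234 arrived damaged.",
        created_at=datetime(2023, 6, 1, 14, 30, tzinfo=timezone.utc),
    ),
])
```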