Learn how to use get_context() to retrieve and format conversation context for LLM integration
The `get_context()` method is a powerful feature that retrieves formatted conversation context from sessions, making it easy to integrate with LLMs like OpenAI, Anthropic, and others. This guide covers everything you need to know about working with session context.
By default, the context includes a blend of summary and messages that covers the entire history of the session. Summaries are generated automatically at intervals, and recent messages are included based on the token limit set for the context. You can specify any token limit you want, and you can disable summaries to fill that limit entirely with recent messages.
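As an illustration, here is a minimal sketch using the Python SDK. The client construction, peer names, and session ID are placeholder assumptions, and credentials are assumed to be configured in the environment:

```python
from honcho import Honcho

# Client setup (assumes credentials are configured via environment variables)
honcho = Honcho()

# Placeholder peers and session used throughout this guide
user = honcho.peer("alice")
assistant = honcho.peer("assistant")
session = honcho.session("support-thread-1")

# Default context: an automatically generated summary of older history
# blended with the most recent messages
context = session.get_context()
```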
The `get_context()` method is available on all `Session` objects and returns a `SessionContext` object that contains the formatted conversation history.
The `get_context()` method accepts several optional parameters to customize the retrieved context.
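For example, the token budget and summary behavior described above can be controlled like this; the `tokens` and `summary` parameter names are assumptions based on that description and may differ in your SDK version:

```python
# Cap the context at roughly 1,500 tokens of summary plus recent messages
context = session.get_context(tokens=1500)

# Disable summaries so the token budget is filled entirely with recent messages
context = session.get_context(summary=False, tokens=1500)
```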
The `SessionContext` object provides methods to convert the context into formats compatible with popular LLM APIs. When converting to OpenAI format, you must specify the assistant peer so the context can be formatted in a way the LLM understands.
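A sketch of the conversion step, assuming `to_openai()` and `to_anthropic()` accept the assistant peer so the SDK knows which messages map to the assistant role:

```python
# Messages formatted for the OpenAI Chat Completions API
openai_messages = context.to_openai(assistant=assistant)

# Anthropic-compatible messages work the same way
anthropic_messages = context.to_anthropic(assistant=assistant)
```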
The `get_context()` method is essential for integrating Honcho sessions with LLMs. By understanding how to retrieve context, tune its token budget and summary behavior, and convert it into provider-specific formats, you can bring a session's full conversation history into your LLM calls.
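Putting the pieces together, an end-to-end flow might look like the following sketch. The Honcho calls carry the same assumptions as above, including a hypothetical `add_messages` / `peer.message` pattern for writing turns; the OpenAI client usage is standard:

```python
from honcho import Honcho
from openai import OpenAI

honcho = Honcho()
openai_client = OpenAI()

user = honcho.peer("alice")
assistant = honcho.peer("assistant")
session = honcho.session("support-thread-1")

# Record the latest user turn (message helper assumed from the peer API)
session.add_messages([user.message("What's the status of my order?")])

# Retrieve a summary + recent-message blend capped at roughly 2,000 tokens
context = session.get_context(tokens=2000)

# Format for OpenAI and generate a reply
response = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=context.to_openai(assistant=assistant),
)
reply = response.choices[0].message.content

# Persist the assistant's reply back into the session
session.add_messages([assistant.message(reply)])
```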