Welcome to SocialAGI
Subroutines for AI Souls
SocialAGI offers developers clean, simple, and extensible abstractions for directing the cognitive processes of large language models (LLMs), a task critical to the creation of AI Souls. AI Souls will comprise thousands of linguistic instructions (formerly known as 'prompts'): our focus is on streamlining the management of this complexity, freeing you to create more effective and engaging AI experiences.
The library has three main value propositions:
- Streamlined Context Management with `new CortexStep(...)`. CortexStep facilitates the ordered construction of context with LLMs. It works on the principle of treating each interaction as a single step, or functional transformation, on working memory, offering a predictable and manageable way to guide the thought process of an LLM. This approach results in consistent, easier-to-follow interaction flows (see the sketch after this list).
- Efficient Scheduling of Mental Processes with `new CortexScheduler(...)`. CortexScheduler orchestrates the scheduling and dispatching of mental processes, ensuring a synchronous flow of memory transformations from one event to the next. By turning the event-driven world into a synchronous system, CortexScheduler allows for straightforward debugging, testing, and reasoning, making the cognitive structure of your AI more understandable and predictable.
- Relevant Memory Retrieval with `new MemoryStream()`. MemoryStream provides a simple way to store and retrieve relevant memories based on vector embeddings, importance, and recency.
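To make the CortexStep flow concrete, here is a minimal sketch of a single step over working memory. It assumes the `CortexStep`, `ChatMessageRoleEnum`, and `externalDialog` exports from `socialagi/next` and an `OPENAI_API_KEY` in the environment; treat the exact method and option shapes as illustrative rather than canonical, since the API may have evolved.

```typescript
import { CortexStep, ChatMessageRoleEnum, externalDialog } from "socialagi/next";

async function main() {
  // Each CortexStep is an immutable snapshot of working memory,
  // named after the entity doing the "thinking".
  let step = new CortexStep("Samantha");

  // withMemory returns a NEW step with the extra memories appended,
  // leaving the original step untouched.
  step = step.withMemory([
    {
      role: ChatMessageRoleEnum.System,
      content: "You are modeling the mind of Samantha, a friendly AI soul.",
    },
    {
      role: ChatMessageRoleEnum.User,
      content: "Hi Samantha, how are you today?",
    },
  ]);

  // next(...) runs one functional transformation over the working memory
  // and returns another CortexStep whose .value holds the generated reply.
  const reply = await step.next(externalDialog("Greet the user warmly."));
  console.log(reply.value);
}

main();
```

Each call produces a new step rather than mutating the old one, which is what keeps the interaction flow predictable and easy to replay or test.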
Getting Started with SocialAGI
You can start using SocialAGI's cognitive tools by installing the package:
$ npm install socialagi
These docs describe importing from the "socialagi/next" export, which is a recent addition to our codebase:
import { CortexStep } from "socialagi/next"
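With the package installed, the scheduler described above can be wired around a CortexStep. The sketch below is based on older SocialAGI examples: the `register`/`dispatch` methods, the `(signal, newMemory, lastStep)` process signature, and importing `CortexScheduler` from the top-level `socialagi` package are all assumptions about the current API, so check the API reference before relying on these details.

```typescript
import { ChatMessageRoleEnum, CortexStep, externalDialog } from "socialagi/next";
// Assumption: CortexScheduler is exposed by the top-level package.
import { CortexScheduler } from "socialagi";

// One mental process: fold the incoming message into working memory,
// generate a reply, and return the new step so it becomes the input
// to the next dispatched event.
const samanthaReplies = async (
  signal: AbortSignal,
  newMemory: any, // assumed shape: { role, content }
  lastStep: CortexStep
) => {
  let step = lastStep.withMemory([newMemory]);
  step = await step.next(externalDialog("Reply to the user."));
  console.log("Samantha:", step.value);
  return step;
};

// Seed the scheduler with an initial step describing the soul.
const firstStep = new CortexStep("Samantha").withMemory([
  {
    role: ChatMessageRoleEnum.System,
    content: "You are modeling the mind of Samantha, a curious AI soul.",
  },
]);

const scheduler = new CortexScheduler(firstStep);
scheduler.register({ name: "SamanthaReplies", process: samanthaReplies });

// dispatch queues the event; registered processes run one at a time,
// so memory flows synchronously from step to step.
scheduler.dispatch("SamanthaReplies", {
  role: ChatMessageRoleEnum.User,
  content: "Hi Samantha!",
});
```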
Supported LLMs
SocialAGI is primarily intended to work with OpenAI; however, any language model can be substituted through our executor and streaming interfaces.
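As a rough sketch of what swapping in another model looks like, a CortexStep can be constructed with an alternative language model executor. Both the `processor` option name and the `OpenAILanguageProgramProcessor` class shown here are assumptions drawn from earlier releases; the executor and streaming interfaces documented in the current API reference are authoritative.

```typescript
import { CortexStep } from "socialagi/next";
// Assumption: this OpenAI-backed executor class and its constructor
// arguments are illustrative and may differ in the current release.
import { OpenAILanguageProgramProcessor } from "socialagi/next";

// Point the executor at a different model or endpoint; any object that
// implements the executor interface could be substituted here instead.
const processor = new OpenAILanguageProgramProcessor(
  { apiKey: process.env.OPENAI_API_KEY }, // client configuration
  { model: "gpt-3.5-turbo" }              // default completion parameters
);

// Steps derived from this one route all LLM calls through `processor`.
const step = new CortexStep("Samantha", { processor });
```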