langchain_experimental.generative_agents.memory.GenerativeAgentMemory
- class langchain_experimental.generative_agents.memory.GenerativeAgentMemory[source]
Bases: BaseMemory
Memory for the generative agent.
- param add_memory_key: str = 'add_memory'
- param aggregate_importance: float = 0.0
Track the sum of the ‘importance’ of recent memories.
Triggers reflection when it reaches reflection_threshold.
- param current_plan: List[str] = []
The current plan of the agent.
- param importance_weight: float = 0.15
How much weight to assign the memory importance.
- param llm: BaseLanguageModel [Required]
The core language model.
- param max_tokens_limit: int = 1200
- param memory_retriever: TimeWeightedVectorStoreRetriever [Required]
The retriever to fetch related memories.
- param most_recent_memories_key: str = 'most_recent_memories'
- param most_recent_memories_token_key: str = 'recent_memories_token'
- param now_key: str = 'now'
- param queries_key: str = 'queries'
- param reflecting: bool = False
- param reflection_threshold: Optional[float] = None
When aggregate_importance exceeds reflection_threshold, stop to reflect.
- param relevant_memories_key: str = 'relevant_memories'
- param relevant_memories_simple_key: str = 'relevant_memories_simple'
- param verbose: bool = False
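Example (a minimal construction sketch, not the canonical setup: import paths and the FAISS constructor vary across LangChain versions, and the OpenAI models are placeholders):
```python
import math

import faiss
from langchain.retrievers import TimeWeightedVectorStoreRetriever
from langchain_community.docstore import InMemoryDocstore
from langchain_community.vectorstores import FAISS
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

from langchain_experimental.generative_agents import GenerativeAgentMemory


def relevance_score_fn(score: float) -> float:
    # Map the FAISS L2 distance onto a [0, 1] relevance score.
    return 1.0 - score / math.sqrt(2)


embeddings = OpenAIEmbeddings()
index = faiss.IndexFlatL2(1536)  # embedding size assumed for the chosen model
vectorstore = FAISS(
    embeddings,
    index,
    InMemoryDocstore({}),
    {},
    relevance_score_fn=relevance_score_fn,
)
retriever = TimeWeightedVectorStoreRetriever(
    vectorstore=vectorstore, other_score_keys=["importance"], k=15
)

memory = GenerativeAgentMemory(
    llm=ChatOpenAI(model="gpt-4o-mini"),
    memory_retriever=retriever,
    reflection_threshold=8.0,  # reflect once enough importance accumulates
    verbose=False,
)
```
The later examples on this page reuse this `memory` instance.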
- async aclear() → None
Async clear memory contents.
- Return type
None
- add_memories(memory_content: str, now: Optional[datetime] = None) → List[str] [source]
Add observations or memories to the agent’s memory.
- Parameters
memory_content (str) –
now (Optional[datetime]) –
- Return type
List[str]
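Example (a sketch reusing the `memory` instance from above; that a ‘;’-separated string is split into individual memories, each scored for importance, is an assumption here):
```python
from datetime import datetime

# Several observations in one call; each ";"-separated piece is assumed to be
# stored (and importance-scored) as its own memory.
added = memory.add_memories(
    "Tommie feels tired from driving so far; Tommie sees the new home",
    now=datetime(2024, 5, 1, 9, 0),
)
print(added)  # ids of the documents added to the retriever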
- add_memory(memory_content: str, now: Optional[datetime] = None) → List[str] [source]
Add an observation or memory to the agent’s memory.
- Parameters
memory_content (str) –
now (Optional[datetime]) –
- Return type
List[str]
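Example (continuing with the same `memory` instance):
```python
from datetime import datetime

# Store a single observation; `now` controls the timestamp used by the
# time-weighted retriever and may be omitted.
memory.add_memory(
    "Tommie tries to get some rest.",
    now=datetime(2024, 5, 1, 22, 30),
)
```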
- async aload_memory_variables(inputs: Dict[str, Any]) → Dict[str, Any]
Async return key-value pairs given the text input to the chain.
- Parameters
inputs (Dict[str, Any]) – The inputs to the chain.
- Returns
A dictionary of key-value pairs.
- Return type
Dict[str, Any]
- async asave_context(inputs: Dict[str, Any], outputs: Dict[str, str]) → None
Async save the context of this chain run to memory.
- Parameters
inputs (Dict[str, Any]) – The inputs to the chain.
outputs (Dict[str, str]) – The outputs of the chain.
- Return type
None
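Example (a sketch covering both async methods; the 'add_memory' output key and 'queries' input key simply reuse the default key names listed above, and which keys the context methods actually inspect is an assumption):
```python
import asyncio


async def main() -> None:
    # Async counterparts of save_context / load_memory_variables.
    await memory.asave_context(
        {},  # inputs are left empty when only recording an observation (assumption)
        {"add_memory": "Tommie starts unpacking boxes."},
    )
    variables = await memory.aload_memory_variables(
        {"queries": ["What has Tommie been doing?"]}
    )
    print(variables)


asyncio.run(main())
```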
- chain(prompt: PromptTemplate) → LLMChain [source]
- Parameters
prompt (PromptTemplate) –
- Return type
LLMChain
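Example (a sketch assuming chain() simply wraps the given prompt and the memory’s llm in an LLMChain; LLMChain.run is used here for brevity):
```python
from langchain.prompts import PromptTemplate

prompt = PromptTemplate.from_template(
    "What is the most salient point in these memories?\n{memories}"
)
llm_chain = memory.chain(prompt)  # LLMChain backed by memory.llm (assumed)
print(llm_chain.run(memories="Tommie feels tired; Tommie sees the new home"))
```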
- fetch_memories(observation: str, now: Optional[datetime] = None) → List[Document] [source]
Fetch related memories.
- Parameters
observation (str) –
now (Optional[datetime]) –
- Return type
List[Document]
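Example:
```python
from datetime import datetime

docs = memory.fetch_memories(
    "How is Tommie feeling?",
    now=datetime(2024, 5, 2, 8, 0),  # optional; used for time-weighted scoring
)
for doc in docs:
    print(doc.page_content, doc.metadata)
```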
- format_memories_detail(relevant_memories: List[Document]) → str [source]
- Parameters
relevant_memories (List[Document]) –
- Return type
str
- format_memories_simple(relevant_memories: List[Document]) → str [source]
- Parameters
relevant_memories (List[Document]) –
- Return type
str
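Example (a sketch for both formatting helpers; the exact output layout, e.g. whether creation timestamps are included, is an assumption):
```python
docs = memory.fetch_memories("the new home")

# Multi-line view, assumed to include each memory's creation time.
print(memory.format_memories_detail(docs))

# Compact single-line view of the same memories (layout assumed).
print(memory.format_memories_simple(docs))
```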
- load_memory_variables(inputs: Dict[str, Any]) → Dict[str, str] [source]
Return key-value pairs given the text input to the chain.
- Parameters
inputs (Dict[str, Any]) –
- Return type
Dict[str, str]
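Example (a sketch assuming the method dispatches on the default key names above: 'queries' yields the relevant-memory variables, while 'recent_memories_token' yields the most recent memories within max_tokens_limit):
```python
# Query-driven lookup (keys assumed from the defaults listed above).
relevant = memory.load_memory_variables(
    {"queries": ["What does Tommie want to do today?"]}
)
print(relevant["relevant_memories"])

# Most recent memories, with 0 tokens already consumed (assumed semantics).
recent = memory.load_memory_variables({"recent_memories_token": 0})
print(recent["most_recent_memories"])
```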
- pause_to_reflect(now: Optional[datetime] = None) → List[str] [source]
Reflect on recent observations and generate ‘insights’.
- Parameters
now (Optional[datetime]) –
- Return type
List[str]
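Example:
```python
# Normally triggered once aggregate_importance crosses reflection_threshold,
# but it can also be called directly.
insights = memory.pause_to_reflect()
for insight in insights:
    print(insight)
```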
- save_context(inputs: Dict[str, Any], outputs: Dict[str, Any]) → None [source]
Save the context of this model run to memory.
- Parameters
inputs (Dict[str, Any]) –
outputs (Dict[str, Any]) –
- Return type
None
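Example (a sketch assuming save_context reads the new memory from the 'add_memory' output key, per add_memory_key above):
```python
# Record the agent's side of an exchange as a new memory.
memory.save_context(
    {},
    {"add_memory": "Tommie said: I am looking for a job in town."},
)
```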
- property memory_variables: List[str]
Input keys this memory class will load dynamically.