langchain_core.language_models.llms.update_cache
- langchain_core.language_models.llms.update_cache(cache: Optional[Union[BaseCache, bool]], existing_prompts: Dict[int, List], llm_string: str, missing_prompt_idxs: List[int], new_results: LLMResult, prompts: List[str]) → Optional[dict] [source]
Update the cache and get the LLM output.
- Parameters
cache (Optional[Union[BaseCache, bool]]) – Cache object, or a boolean toggling use of the globally configured cache.
existing_prompts (Dict[int, List]) – Dictionary mapping prompt indexes to generations already retrieved from the cache.
llm_string (str) – String representation of the LLM configuration, used as part of the cache key.
missing_prompt_idxs (List[int]) – List of indexes of prompts that were not found in the cache.
new_results (LLMResult) – LLMResult returned by the model for the missing prompts.
prompts (List[str]) – List of prompts.
- Returns
LLM output.
- Raises
ValueError – If cache is True but no global cache has been set.
- Return type
Optional[dict]
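A minimal usage sketch, assuming an InMemoryCache and a single prompt whose generation was not found in the cache. The prompt text, cache key string, and token counts below are illustrative only, and the exact call signature may vary between langchain_core versions:

```python
from langchain_core.caches import InMemoryCache
from langchain_core.language_models.llms import update_cache
from langchain_core.outputs import Generation, LLMResult

# Hypothetical setup: one prompt, nothing cached yet.
cache = InMemoryCache()
prompts = ["What is the capital of France?"]
existing_prompts = {}            # no cached generations for any prompt index
missing_prompt_idxs = [0]        # index 0 must be filled from new_results
llm_string = "fake-llm-params"   # serialized LLM config used in the cache key

# Result freshly returned by the model for the missing prompt.
new_results = LLMResult(
    generations=[[Generation(text="Paris")]],
    llm_output={"token_usage": {"total_tokens": 7}},
)

# Writes the new generation into both existing_prompts and the cache,
# then returns the llm_output of new_results.
llm_output = update_cache(
    cache, existing_prompts, llm_string, missing_prompt_idxs, new_results, prompts
)

print(existing_prompts)  # {0: [Generation(text='Paris')]}
print(llm_output)        # {'token_usage': {'total_tokens': 7}}
```

On a subsequent call with the same prompt and llm_string, the generation can be served from the cache instead of invoking the model again.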