langchain_core.globals.set_llm_cache
- langchain_core.globals.set_llm_cache(value: Optional[BaseCache]) → None
Set a new LLM cache, overwriting the previous value, if any.
- Parameters
value (Optional[BaseCache]) – The new LLM cache to use. If None, the LLM cache is disabled.
- Return type
None