langchain_core.globals.set_llm_cache

langchain_core.globals.set_llm_cache(value: Optional[BaseCache]) → None

Set a new LLM cache, overwriting the previous value, if any.

Parameters

value (Optional[BaseCache]) – The new LLM cache to use globally; pass None to disable caching.

Return type

None

Examples using set_llm_cache
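
A minimal usage sketch, assuming the in-memory cache shipped in langchain_core.caches (other BaseCache implementations can be substituted the same way):

from langchain_core.caches import InMemoryCache
from langchain_core.globals import set_llm_cache

# Enable global LLM caching with a simple in-process cache.
# Repeated identical prompts to a model are then served from this cache.
set_llm_cache(InMemoryCache())

# Passing None clears the global cache and disables caching again.
set_llm_cache(None)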