langchain_community.chat_models.premai.chat_with_retry¶
- langchain_community.chat_models.premai.chat_with_retry(llm: ChatPremAI, project_id: int, messages: List[dict], stream: bool = False, run_manager: Optional[CallbackManagerForLLMRun] = None, **kwargs: Any) → Any [source]¶
Use tenacity to retry the completion call.
- Parameters
llm (ChatPremAI) – The ChatPremAI instance whose client performs the completion call.
project_id (int) – The Prem project id to run the completion against.
messages (List[dict]) – The chat messages to send.
stream (bool) – Whether to stream the response.
run_manager (Optional[CallbackManagerForLLMRun]) – Callback manager for the LLM run.
kwargs (Any) – Additional keyword arguments forwarded to the completion call.
- Return type
Any
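The retry-with-backoff behavior that tenacity provides here can be illustrated with a small stand-alone sketch. This is not the library's implementation; it is a plain-Python approximation (no PremAI credentials needed), and `flaky_completion` is a hypothetical stand-in for the provider call:

```python
import time


def with_retry(fn, max_attempts=3, base_delay=0.01):
    """Retry fn on exceptions with exponential backoff (tenacity-style sketch)."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                # Out of attempts: re-raise the last error.
                raise
            # Back off exponentially before the next attempt.
            time.sleep(base_delay * 2 ** (attempt - 1))


# Hypothetical completion call that fails twice, then succeeds.
calls = {"n": 0}


def flaky_completion():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient error")
    return "assistant reply"


result = with_retry(flaky_completion)
```

After the transient failures, `result` holds the completion and the call counter shows three attempts were made.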