langchain_community.chat_models.premai.chat_with_retry

langchain_community.chat_models.premai.chat_with_retry(llm: ChatPremAI, project_id: int, messages: List[dict], stream: bool = False, run_manager: Optional[CallbackManagerForLLMRun] = None, **kwargs: Any) → Any

Uses tenacity to retry the completion call.

Parameters
  • llm (ChatPremAI) – The ChatPremAI instance making the call.

  • project_id (int) – The Prem project ID to run the completion against.

  • messages (List[dict]) – The chat messages to send, as a list of dicts.

  • stream (bool) – Whether to stream the response.

  • run_manager (Optional[CallbackManagerForLLMRun]) – Callback manager for the LLM run.

  • kwargs (Any) – Additional keyword arguments forwarded to the completion call.

Return type

Any
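
For context, the pattern below is a minimal, illustrative sketch of how a tenacity-based retry wrapper around a completion call can be structured. The helper names (_chat_with_retry, flaky_completion), the retry limit, and the backoff settings are assumptions for demonstration, not the exact internals of chat_with_retry.

from typing import Any, Callable

from tenacity import retry, stop_after_attempt, wait_exponential


def _chat_with_retry(completion_fn: Callable[..., Any], **kwargs: Any) -> Any:
    """Wrap an arbitrary completion callable with tenacity-based retries (illustrative)."""

    @retry(
        stop=stop_after_attempt(3),                   # assumed: give up after 3 attempts
        wait=wait_exponential(multiplier=1, max=10),  # assumed: exponential backoff, capped at 10s
        reraise=True,                                 # surface the last exception to the caller
    )
    def _completion_with_retry(**inner_kwargs: Any) -> Any:
        return completion_fn(**inner_kwargs)

    return _completion_with_retry(**kwargs)


# Usage with a flaky stand-in for the Prem completion endpoint (hypothetical):
_attempts = {"n": 0}

def flaky_completion(**kwargs: Any) -> str:
    _attempts["n"] += 1
    if _attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return f"ok after {_attempts['n']} attempts"

print(_chat_with_retry(flaky_completion, project_id=8, messages=[{"role": "user", "content": "Hi"}]))

The real function additionally threads the run_manager through so callbacks can observe retries; the sketch omits that to stay focused on the retry mechanics.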