langchain.chains.openai_tools.extraction.create_extraction_chain_pydantic¶

langchain.chains.openai_tools.extraction.create_extraction_chain_pydantic(pydantic_schemas: Union[List[Type[BaseModel]], Type[BaseModel]], llm: BaseLanguageModel, system_message: str = 'Extract and save the relevant entities mentioned in the following passage together with their properties.\n\nIf a property is not present and is not required in the function parameters, do not include it in the output.') → Runnable[source]¶

[Deprecated] Creates a chain that extracts information from a passage.

Parameters
  • pydantic_schemas (Union[List[Type[BaseModel]], Type[BaseModel]]) – The schema of the entities to extract.

  • llm (BaseLanguageModel) – The language model to use.

  • system_message (str) – The system message to use for extraction.

Returns

A runnable that extracts information from a passage.

Return type

Runnable
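
A minimal usage sketch (not from the official docs): the Person and Location schemas, the gpt-3.5-turbo model choice, and passing the passage under the "input" key are illustrative assumptions. Any chat model that supports OpenAI-style tool calling should work, and a single schema may be passed instead of a list.

    from langchain_core.pydantic_v1 import BaseModel, Field
    from langchain_openai import ChatOpenAI

    from langchain.chains.openai_tools import create_extraction_chain_pydantic


    class Person(BaseModel):
        """Information about a person mentioned in the passage."""

        name: str = Field(description="The person's name")
        age: int = Field(description="The person's age")


    class Location(BaseModel):
        """A place mentioned in the passage."""

        city: str = Field(description="The name of the city")


    # Any chat model with OpenAI-style tool calling; gpt-3.5-turbo is an assumption.
    llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

    # Passing a list of schemas lets the model extract several entity types at once;
    # a single schema can be passed directly as well.
    chain = create_extraction_chain_pydantic([Person, Location], llm)

    # The passage goes under the "input" key (an assumption); the result is a list
    # of populated pydantic objects (Person and Location instances here).
    chain.invoke({"input": "Alice, 30, lives in Toronto with her brother Bob, 25."})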

Notes

Deprecated since version 0.1.14: LangChain has introduced a method called with_structured_output that is available on ChatModels capable of tool calling. You can read more about the method here: https://python.langchain.com/docs/modules/model_io/chat/structured_output/. Please follow our extraction use case documentation for more guidelines on how to do information extraction with LLMs: https://python.langchain.com/docs/use_cases/extraction/. Note that with_structured_output does not currently support a list of pydantic schemas. If this is a blocker or if you notice other issues, please provide feedback here: https://github.com/langchain-ai/langchain/discussions/18154

Use the following instead:

    from langchain_core.pydantic_v1 import BaseModel, Field
    from langchain_anthropic import ChatAnthropic


    class Joke(BaseModel):
        setup: str = Field(description="The setup of the joke")
        punchline: str = Field(description="The punchline to the joke")


    # Or any other chat model that supports tools.
    # Please refer to the documentation of with_structured_output
    # to see an up-to-date list of which models support it.
    model = ChatAnthropic(model="claude-3-opus-20240229", temperature=0)
    structured_llm = model.with_structured_output(Joke)
    structured_llm.invoke(
        "Tell me a joke about cats. Make sure to call the Joke function."
    )

Examples using create_extraction_chain_pydantic¶