langchain.chains.combine_documents.stuff.create_stuff_documents_chain(llm: Runnable[Union[PromptValue, str, Sequence[Union[BaseMessage, List[str], Tuple[str, str], str, Dict[str, Any]]]], Union[BaseMessage, str]], prompt: BasePromptTemplate, *, output_parser: Optional[BaseOutputParser] = None, document_prompt: Optional[BasePromptTemplate] = None, document_separator: str = '\n\n') → Runnable[Dict[str, Any], Any][source]

Create a chain for passing a list of Documents to a model.

Parameters

  • llm (Runnable[Union[PromptValue, str, Sequence[Union[BaseMessage, List[str], Tuple[str, str], str, Dict[str, Any]]]], Union[BaseMessage, str]]) – Language model.

  • prompt (BasePromptTemplate) – Prompt template. Must contain input variable “context”, which will be used for passing in the formatted documents.

  • output_parser (Optional[BaseOutputParser]) – Output parser. Defaults to StrOutputParser.

  • document_prompt (Optional[BasePromptTemplate]) – Prompt used for formatting each document into a string. Input variables can be “page_content” or any metadata keys that are present in all documents. “page_content” is automatically retrieved from Document.page_content, and all other input variables are automatically retrieved from the Document.metadata dictionary. Defaults to a prompt that contains only Document.page_content.

  • document_separator (str) – String separator to use between formatted document strings. Defaults to "\n\n".
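To illustrate how document_prompt and document_separator interact, here is a plain-Python sketch of the formatting step. This is conceptual only (the real chain uses BasePromptTemplate objects, and plain dicts stand in for Document here); it models document_prompt as a format string, with the default using only page_content.

```python
# Conceptual sketch of how documents are stuffed into the "context" string.
# Each document is rendered with document_prompt, then the results are
# joined with document_separator.

def format_documents(docs, document_prompt="{page_content}", document_separator="\n\n"):
    formatted = []
    for doc in docs:
        # "page_content" comes from Document.page_content; every other
        # input variable is looked up in Document.metadata.
        formatted.append(
            document_prompt.format(page_content=doc["page_content"], **doc["metadata"])
        )
    return document_separator.join(formatted)

docs = [
    {"page_content": "Jesse loves red but not yellow", "metadata": {"source": "a.txt"}},
    {"page_content": "Jamal loves green", "metadata": {"source": "b.txt"}},
]

format_documents(docs)
# -> "Jesse loves red but not yellow\n\nJamal loves green"

format_documents(docs, document_prompt="[{source}] {page_content}")
# -> "[a.txt] Jesse loves red but not yellow\n\n[b.txt] Jamal loves green"
```

Note that a metadata key (here the illustrative "source") can only be referenced in document_prompt if every document carries it.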


Returns

An LCEL Runnable. The input is a dictionary that must have a “context” key mapping to a List[Document], plus any other input variables expected in the prompt. The Runnable return type depends on the output_parser used.

Return type

Runnable[Dict[str, Any], Any]
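Conceptually, the returned Runnable behaves like prompt | llm | output_parser, with the Document list under “context” pre-formatted into a single string first. A minimal plain-Python sketch of that data flow (the function and argument names here are illustrative, not the library's):

```python
def make_stuff_chain(llm, prompt, output_parser=str, format_docs="\n\n".join):
    """Sketch of the chain's data flow: format docs -> prompt -> llm -> parse."""
    def invoke(inputs):
        # Replace the Document list under "context" with one formatted string,
        # then run the usual prompt -> model -> parser pipeline.
        rendered = prompt({**inputs, "context": format_docs(inputs["context"])})
        return output_parser(llm(rendered))
    return invoke

# Toy stand-ins for the prompt template and the model:
chain = make_stuff_chain(
    llm=lambda text: text.upper(),
    prompt=lambda d: f"Question: {d['question']}\nContext: {d['context']}",
)
chain({"question": "colors?", "context": ["red", "green"]})
# -> "QUESTION: COLORS?\nCONTEXT: RED\n\nGREEN"
```

This is why the return type is Runnable[Dict[str, Any], Any]: the output type is whatever the output parser produces (a string for the default StrOutputParser).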


# pip install -U langchain langchain-community

from langchain_community.chat_models import ChatOpenAI
from langchain_core.documents import Document
from langchain_core.prompts import ChatPromptTemplate
from langchain.chains.combine_documents import create_stuff_documents_chain

prompt = ChatPromptTemplate.from_messages(
    [("system", "What are everyone's favorite colors:\n\n{context}")]
)
llm = ChatOpenAI(model="gpt-3.5-turbo")
chain = create_stuff_documents_chain(llm, prompt)

docs = [
    Document(page_content="Jesse loves red but not yellow"),
    Document(page_content="Jamal loves green but not as much as he loves orange"),
]

chain.invoke({"context": docs})

Examples using create_stuff_documents_chain