langchain_core.messages.ai.AIMessage
- class langchain_core.messages.ai.AIMessage[source]
Bases:
BaseMessage
Message from an AI.
AIMessage is returned from a chat model as a response to a prompt.
This message represents the output of the model and consists of both the raw output as returned by the model together with standardized fields (e.g., tool calls, usage metadata) added by the LangChain framework.
Pass in content as positional arg.
- Parameters
content – The content of the message.
kwargs – Additional arguments to pass to the parent class.
- param additional_kwargs: dict [Optional]
Reserved for additional payload data associated with the message.
For example, for a message from an AI, this could include tool calls as encoded by the model provider.
- param content: Union[str, List[Union[str, Dict]]] [Required]
The string contents of the message.
- param example: bool = False
Use to denote that a message is part of an example conversation.
At the moment, this is ignored by most models. Usage is discouraged.
- param id: Optional[str] = None
An optional unique identifier for the message. This should ideally be provided by the provider/model which created the message.
- param invalid_tool_calls: List[InvalidToolCall] = []
If provided, tool calls with parsing errors associated with the message.
- param name: Optional[str] = None
An optional name for the message.
This can be used to provide a human-readable name for the message.
Usage of this field is optional, and whether it's used or not is up to the model implementation.
- param response_metadata: dict [Optional]
Response metadata. For example: response headers, logprobs, token counts.
- param type: Literal['ai'] = 'ai'
The type of the message (used for deserialization). Defaults to 'ai'.
- param usage_metadata: Optional[UsageMetadata] = None
If provided, usage metadata for a message, such as token counts.
This is a standard representation of token usage that is consistent across models.
- pretty_print() → None
- Return type
None