class langchain_core.messages.system.SystemMessage

Bases: BaseMessage

Message for priming AI behavior.

The system message is usually passed in as the first of a sequence of input messages.


from langchain_core.messages import HumanMessage, SystemMessage

messages = [
    SystemMessage(
        content="You are a helpful assistant! Your name is Bob."
    ),
    HumanMessage(
        content="What is your name?"
    ),
]

# Define a chat model and invoke it with the messages

Pass in content as a positional arg.

param additional_kwargs: dict [Optional]

Reserved for additional payload data associated with the message.

For example, for a message from an AI, this could include tool calls as encoded by the model provider.

param content: Union[str, List[Union[str, Dict]]] [Required]

The string contents of the message.

param id: Optional[str] = None

An optional unique identifier for the message. This should ideally be provided by the provider/model which created the message.

param name: Optional[str] = None

An optional name for the message.

This can be used to provide a human-readable name for the message.

Usage of this field is optional, and whether it’s used or not is up to the model implementation.

param response_metadata: dict [Optional]

Response metadata. For example: response headers, logprobs, token counts.

param type: Literal['system'] = 'system'

The type of the message. Must be a string that is unique to the message type.

The purpose of this field is to allow for easy identification of the message type when deserializing messages.

pretty_print() → None

Print a pretty representation of the message.

Return type
None

pretty_repr(html: bool = False) → str

Get a pretty representation of the message.

Parameters
html (bool) – Whether to format the message as HTML. Default is False.

Return type
str

Examples using SystemMessage