langchain_core.runnables.fallbacks.RunnableWithFallbacks

Note

RunnableWithFallbacks implements the standard Runnable Interface. 🏃

class langchain_core.runnables.fallbacks.RunnableWithFallbacks[source]

Bases: RunnableSerializable[Input, Output]

Runnable that can fall back to other Runnables if it fails.

External APIs (e.g., APIs for a language model) may at times experience degraded performance or even downtime.

In these cases, it can be useful to have a fallback Runnable that can be used in place of the original Runnable (e.g., fallback to another LLM provider).

Fallbacks can be defined at the level of a single Runnable, or at the level of a chain of Runnables. Fallbacks are tried in order until one succeeds or all fail.

While you can instantiate a RunnableWithFallbacks directly, it is usually more convenient to use the with_fallbacks method on a Runnable.
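
For instance, the two forms below build equivalent objects. This is a minimal sketch using placeholder RunnableLambda runnables chosen only for illustration:

from langchain_core.runnables import RunnableLambda
from langchain_core.runnables.fallbacks import RunnableWithFallbacks

primary = RunnableLambda(lambda x: x.upper())
backup = RunnableLambda(lambda x: x.lower())

# Direct construction...
explicit = RunnableWithFallbacks(runnable=primary, fallbacks=[backup])
# ...versus the convenience method, which produces the same kind of object.
convenient = primary.with_fallbacks([backup])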

Example

from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic

model = ChatAnthropic(
    model="claude-3-haiku-20240307"
).with_fallbacks([ChatOpenAI(model="gpt-3.5-turbo-0125")])
# Will usually use ChatAnthropic, but fall back to ChatOpenAI
# if ChatAnthropic fails.
model.invoke('hello')

# Fallbacks can also be used at the level of a chain.
# Here, if both LLM providers fail, we'll fall back to a good
# hardcoded response.

from langchain_core.prompts import PromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnableLambda

def when_all_is_lost(inputs):
    return ("Looks like our LLM providers are down. "
            "Here's a nice 🦜️ emoji for you instead.")

chain_with_fallback = (
    PromptTemplate.from_template('Tell me a joke about {topic}')
    | model
    | StrOutputParser()
).with_fallbacks([RunnableLambda(when_all_is_lost)])

Create a new model by parsing and validating input data from keyword arguments.

Raises ValidationError if the input data cannot be parsed to form a valid model.

param exception_key: Optional[str] = None

If a string is specified, handled exceptions will be passed to the fallbacks as part of the input under the specified key. If None, exceptions will not be passed to the fallbacks. If used, the base Runnable and its fallbacks must accept a dictionary as input.
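
For example, a minimal sketch in which the fallback reads the handled exception from its input (the key name "exception" and the functions unreliable and recover are illustrative, not part of the API):

from langchain_core.runnables import RunnableLambda

def unreliable(inputs: dict) -> str:
    raise ValueError("primary provider failed")

def recover(inputs: dict) -> str:
    # The exception raised by the previous runnable is passed in here.
    return f"Recovered from: {inputs['exception']}"

chain = RunnableLambda(unreliable).with_fallbacks(
    [RunnableLambda(recover)], exception_key="exception"
)
chain.invoke({"topic": "parrots"})
# -> "Recovered from: primary provider failed"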

param exceptions_to_handle: Tuple[Type[BaseException], ...] = (Exception,)

The exceptions on which fallbacks should be tried.

Any exception that is not a subclass of these exceptions will be raised immediately.
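
For example, to fall back only on timeouts, a minimal sketch with placeholder functions (flaky and backup are illustrative names):

from langchain_core.runnables import RunnableLambda

def flaky(_: str) -> str:
    raise TimeoutError("upstream timed out")

def backup(_: str) -> str:
    return "served from fallback"

# Only TimeoutError triggers the fallback; a ValueError, for instance,
# would be raised immediately.
guarded = RunnableLambda(flaky).with_fallbacks(
    [RunnableLambda(backup)],
    exceptions_to_handle=(TimeoutError,),
)
guarded.invoke("hi")  # -> "served from fallback"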

param fallbacks: Sequence[Runnable[Input, Output]] [Required]

A sequence of fallbacks to try.

param runnable: Runnable[Input, Output] [Required]

The runnable to run first.

property runnables: Iterator[Runnable[Input, Output]]

An iterator over the original Runnable followed by its fallbacks, in the order in which they are tried.
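
A small sketch, assuming the model defined in the example above:

# Yields the primary runnable first, then each fallback in order.
for r in model.runnables:
    print(type(r).__name__)
# ChatAnthropic
# ChatOpenAI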