langchain_community.embeddings.localai.LocalAIEmbeddings¶

class langchain_community.embeddings.localai.LocalAIEmbeddings[source]¶

Bases: BaseModel, Embeddings

LocalAI embedding models.

Since LocalAI and OpenAI have 1:1 API compatibility, this class uses the openai Python package’s openai.Embedding as its client. Thus, you should have the openai Python package installed and set the OPENAI_API_KEY environment variable to a random string, since the openai client requires a key even though LocalAI does not use it. You also need to set OPENAI_API_BASE to point to your LocalAI service endpoint.

Example

from langchain_community.embeddings import LocalAIEmbeddings
embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",           # any non-empty string; LocalAI does not use the key
    openai_api_base="http://localhost:8080",  # your LocalAI endpoint
)

Create a new model by parsing and validating input data from keyword arguments.

Raises ValidationError if the input data cannot be parsed to form a valid model.

param allowed_special: Union[Literal['all'], Set[str]] = {}¶
param chunk_size: int = 1000¶

Maximum number of texts to embed in each batch.

param deployment: str = 'text-embedding-ada-002'¶
param disallowed_special: Union[Literal['all'], Set[str], Sequence[str]] = 'all'¶
param embedding_ctx_length: int = 8191¶

The maximum number of tokens to embed at once.

param headers: Any = None¶
param max_retries: int = 6¶

Maximum number of retries to make when generating.

param model: str = 'text-embedding-ada-002'¶
param model_kwargs: Dict[str, Any] [Optional]¶

Holds any model parameters valid for the create call that are not explicitly specified.

param openai_api_base: Optional[str] = None¶
param openai_api_key: Optional[str] = None¶
param openai_api_version: Optional[str] = None¶
param openai_organization: Optional[str] = None¶
param openai_proxy: Optional[str] = None¶
param request_timeout: Optional[Union[float, Tuple[float, float]]] = None¶

Timeout in seconds for the LocalAI request.

param show_progress_bar: bool = False¶

Whether to show a progress bar when embedding.
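
As an illustration, a minimal configuration sketch combining several of the parameters above; the endpoint URL, model name, and values are placeholders to adjust for your LocalAI deployment:

from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",           # not used by LocalAI, but must be set
    openai_api_base="http://localhost:8080",  # placeholder LocalAI endpoint
    model="text-embedding-ada-002",           # name of the embedding model served by LocalAI
    chunk_size=256,                           # embed at most 256 texts per request
    request_timeout=60,                       # seconds
    show_progress_bar=True,
)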

async aembed_documents(texts: List[str], chunk_size: Optional[int] = 0) List[List[float]][source]¶

Call out to LocalAI’s embedding endpoint asynchronously to embed search documents.

Parameters
  • texts (List[str]) – The list of texts to embed.

  • chunk_size (Optional[int]) – The chunk size of embeddings. If None, the chunk size specified by the class is used.

Returns

List of embeddings, one for each text.

Return type

List[List[float]]

async aembed_query(text: str) List[float][source]¶

Call out to LocalAI’s embedding endpoint asynchronously to embed query text.

Parameters

text (str) – The text to embed.

Returns

Embedding for the text.

Return type

List[float]
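
For illustration, a minimal async sketch using both aembed_documents and aembed_query; the endpoint URL is a placeholder for your LocalAI instance:

import asyncio

from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",
    openai_api_base="http://localhost:8080",  # placeholder; point at your LocalAI instance
)

async def main() -> None:
    # Embed a small batch of documents, then a single query.
    doc_vectors = await embeddings.aembed_documents(["first document", "second document"])
    query_vector = await embeddings.aembed_query("what does the first document say?")
    print(len(doc_vectors), len(query_vector))

asyncio.run(main())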

classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) Model¶

Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.

Parameters
  • _fields_set (Optional[SetStr]) –

  • values (Any) –

Return type

Model

copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) Model¶

Duplicate a model, optionally choosing which fields to include, exclude, and change.

Parameters
  • include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in the new model

  • exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from the new model; this takes precedence over include

  • update (Optional[DictStrAny]) – values to change or add in the new model. Note: the data is not validated before creating the new model; you should trust this data

  • deep (bool) – set to True to make a deep copy of the model

Returns

new model instance

Return type

Model

dict(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny¶

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters
  • include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –

  • exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –

  • by_alias (bool) –

  • skip_defaults (Optional[bool]) –

  • exclude_unset (bool) –

  • exclude_defaults (bool) –

  • exclude_none (bool) –

Return type

DictStrAny

embed_documents(texts: List[str], chunk_size: Optional[int] = 0) List[List[float]][source]¶

Call out to LocalAI’s embedding endpoint to embed search documents.

Parameters
  • texts (List[str]) – The list of texts to embed.

  • chunk_size (Optional[int]) – The chunk size of embeddings. If None, the chunk size specified by the class is used.

Returns

List of embeddings, one for each text.

Return type

List[List[float]]
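
For example, a short sketch of embedding a small batch of documents; the endpoint URL is a placeholder:

from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    openai_api_key="random-string",
    openai_api_base="http://localhost:8080",  # placeholder; point at your LocalAI instance
)

texts = ["LocalAI serves models locally.", "Embeddings map text to vectors."]
vectors = embeddings.embed_documents(texts, chunk_size=2)  # both texts in a single request
assert len(vectors) == len(texts)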

embed_query(text: str) List[float][source]¶

Call out to LocalAI’s embedding endpoint to embed query text.

Parameters

text (str) – The text to embed.

Returns

Embedding for the text.

Return type

List[float]
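
Continuing the sketch above with the same embeddings instance, embedding a single query:

query_vector = embeddings.embed_query("Which sentence mentions vectors?")
print(len(query_vector))  # dimensionality of the returned embedding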

classmethod from_orm(obj: Any) Model¶
Parameters

obj (Any) –

Return type

Model

json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode¶

Generate a JSON representation of the model, include and exclude arguments as per dict().

encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().

Parameters
  • include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –

  • exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –

  • by_alias (bool) –

  • skip_defaults (Optional[bool]) –

  • exclude_unset (bool) –

  • exclude_defaults (bool) –

  • exclude_none (bool) –

  • encoder (Optional[Callable[[Any], Any]]) –

  • models_as_dict (bool) –

  • dumps_kwargs (Any) –

Return type

unicode

classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model¶
Parameters
  • path (Union[str, Path]) –

  • content_type (unicode) –

  • encoding (unicode) –

  • proto (Protocol) –

  • allow_pickle (bool) –

Return type

Model

classmethod parse_obj(obj: Any) Model¶
Parameters

obj (Any) –

Return type

Model

classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model¶
Parameters
  • b (Union[str, bytes]) –

  • content_type (unicode) –

  • encoding (unicode) –

  • proto (Protocol) –

  • allow_pickle (bool) –

Return type

Model

classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny¶
Parameters
  • by_alias (bool) –

  • ref_template (unicode) –

Return type

DictStrAny

classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode¶
Parameters
  • by_alias (bool) –

  • ref_template (unicode) –

  • dumps_kwargs (Any) –

Return type

unicode

classmethod update_forward_refs(**localns: Any) None¶

Try to update ForwardRefs on fields based on this Model, globalns and localns.

Parameters

localns (Any) –

Return type

None

classmethod validate(value: Any) Model¶
Parameters

value (Any) –

Return type

Model

Examples using LocalAIEmbeddings¶