langchain_text_splitters.base.split_text_on_tokens

langchain_text_splitters.base.split_text_on_tokens(*, text: str, tokenizer: Tokenizer) → List[str]

Split the incoming text and return the resulting chunks, using the provided tokenizer.

Parameters
  • text (str) – Text to split into chunks.

  • tokenizer (Tokenizer) – Tokenizer configuration supplying the encode/decode callables and the chunk settings (tokens per chunk and chunk overlap) used to form the chunks.

Return type

List[str]
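
Example

A minimal usage sketch, assuming the Tokenizer dataclass exposes chunk_overlap, tokens_per_chunk, decode, and encode fields as in langchain_text_splitters.base; the character-level codec below is only a stand-in for a real tokenizer such as tiktoken.

    from langchain_text_splitters.base import Tokenizer, split_text_on_tokens

    # Character-level stand-in tokenizer: each character counts as one token.
    # In practice, wrap a real tokenizer's encode/decode callables instead.
    tokenizer = Tokenizer(
        chunk_overlap=2,       # tokens shared between consecutive chunks
        tokens_per_chunk=10,   # maximum tokens per chunk
        decode=lambda ids: "".join(chr(i) for i in ids),
        encode=lambda text: [ord(c) for c in text],
    )

    chunks = split_text_on_tokens(
        text="Split incoming text and return chunks using tokenizer.",
        tokenizer=tokenizer,
    )
    print(chunks[0])  # 'Split inco' -- first 10-character chunk; later chunks overlap by 2

Each chunk holds at most tokens_per_chunk tokens, and consecutive chunks share chunk_overlap tokens, so the function slides a fixed-size window over the encoded text and decodes each window back to a string.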