Class JsonOutputFunctionsParser

Class for parsing the function-calling output of an LLM into a JSON object. Uses an instance of OutputFunctionsParser to extract the function call from the model output before parsing it.
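
Example

A minimal usage sketch. The model setup and import paths are illustrative and may differ between LangChain versions; the function definition ("extractor") and the "tone" field are placeholders.

  import { ChatOpenAI } from "@langchain/openai";
  import { JsonOutputFunctionsParser } from "langchain/output_parsers";

  // Bind a single function so the model always responds with a function call.
  const model = new ChatOpenAI({ temperature: 0 }).bind({
    functions: [
      {
        name: "extractor",
        description: "Extracts the requested fields from the input.",
        parameters: {
          type: "object",
          properties: {
            tone: { type: "string", description: "The overall tone of the text." },
          },
          required: ["tone"],
        },
      },
    ],
    function_call: { name: "extractor" },
  });

  // Parse the function-call arguments into a plain JSON object.
  const parser = new JsonOutputFunctionsParser();
  const chain = model.pipe(parser);

  const result = await chain.invoke("The weather is wonderful today!");
  // e.g. { tone: "positive" }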

Hierarchy

Constructors

Properties

argsOnly: boolean = true
  Whether to return only the arguments of the function call (true) or the full function call, including its name (false).
outputParser: OutputFunctionsParser
  The underlying OutputFunctionsParser instance used to extract the function call from the LLM output.
diff: boolean = false
  When streaming, whether to emit jsonpatch diff operations between successive parsed outputs instead of the accumulated object.
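
A sketch of configuring these fields through the constructor (assuming the constructor accepts them as optional fields, which may vary by version):

  import { JsonOutputFunctionsParser } from "langchain/output_parsers";

  // Return the full function call ({ name, arguments }) rather than just the arguments.
  const fullCallParser = new JsonOutputFunctionsParser({ argsOnly: false });

  // Emit jsonpatch diffs between successive parsed chunks when streaming.
  const diffParser = new JsonOutputFunctionsParser({ diff: true });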

Methods

  • getFormatInstructions: Returns a string describing the format of the output.

    Returns string

    Format instructions.

    Example

    {
      "foo": "bar"
    }
  • invoke: Calls the parser with a given input and optional configuration options. If the input is a string, it wraps it in a generation with the string as the text and calls parseResult. If the input is a BaseMessage, it wraps it in a generation with the message itself and its content as the text, then calls parseResult.

    Parameters

    • input: string | BaseMessage

      The input to the parser, which can be a string or a BaseMessage.

    • Optional options: BaseCallbackConfig

      Optional configuration options.

    Returns Promise<object>

    A promise of the parsed output.
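
    Example

    A sketch of invoking the parser directly on a chat message that carries a function call; the "extractor" function and "tone" field are placeholders, and import paths may differ between versions:

      import { AIMessage } from "@langchain/core/messages";
      import { JsonOutputFunctionsParser } from "langchain/output_parsers";

      const parser = new JsonOutputFunctionsParser();

      // An AI message containing a function call, as produced by a function-calling model.
      const message = new AIMessage({
        content: "",
        additional_kwargs: {
          function_call: {
            name: "extractor",
            arguments: '{"tone": "positive"}',
          },
        },
      });

      const output = await parser.invoke(message);
      // e.g. { tone: "positive" }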

  • parse: Parses the output of an LLM call.

    Parameters

    • text: string

      LLM output to parse.

    Returns Promise<object>

    Parsed output.

  • parseResult: Parses the output and returns a JSON object. If argsOnly is true, only the arguments of the function call are returned.

    Parameters

    Returns Promise<object>

    A JSON object representation of the function call or its arguments.
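
    Example

    A sketch of parsing a ChatGeneration that carries a function call; the function name and field are placeholders, and import paths may vary:

      import { AIMessage } from "@langchain/core/messages";
      import { JsonOutputFunctionsParser } from "langchain/output_parsers";

      const parser = new JsonOutputFunctionsParser(); // argsOnly defaults to true

      const parsed = await parser.parseResult([
        {
          text: "",
          message: new AIMessage({
            content: "",
            additional_kwargs: {
              function_call: {
                name: "extractor",
                arguments: '{"tone": "positive"}',
              },
            },
          }),
        },
      ]);
      // e.g. { tone: "positive" } (only the arguments, since argsOnly is true)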

  • parseResultWithPrompt: Parses the result of an LLM call with a given prompt. By default, it simply calls parseResult.

    Parameters

    Returns Promise<object>

    A promise of the parsed output.

  • pipe: Creates a new runnable sequence that runs each individual runnable in series, piping the output of one runnable into another runnable or runnable-like.

    Type Parameters

    • NewRunOutput

    Parameters

    • coerceable: RunnableLike<object, NewRunOutput>

      A runnable, function, or object whose values are functions or runnables.

    Returns RunnableSequence<string | BaseMessage, Exclude<NewRunOutput, Error>>

    A new runnable sequence.
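
    Example

    A sketch of piping the parser into a further runnable-like; here a plain function, which pipe coerces into a runnable:

      import { JsonOutputFunctionsParser } from "langchain/output_parsers";

      const parser = new JsonOutputFunctionsParser();

      // The function receives the parsed JSON object and returns a formatted string.
      const chain = parser.pipe((parsed: object) => JSON.stringify(parsed, null, 2));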

  • stream: Streams output in chunks.

    Parameters

    Returns Promise<IterableReadableStream<object>>

    A readable stream that is also an iterable.
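
    Example

    A sketch of streaming partial JSON objects from a chain that ends with this parser; `model` is assumed to be a chat model bound to a function definition, as in the first example above:

      const chain = model.pipe(new JsonOutputFunctionsParser());

      const stream = await chain.stream("The weather is wonderful today!");
      for await (const chunk of stream) {
        // Each chunk is the JSON object parsed from the arguments streamed so far,
        // e.g. {}, { tone: "" }, { tone: "positive" }.
        console.log(chunk);
      }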

  • streamLog: Streams all output from a runnable, as reported to the callback system. This includes all inner runs of LLMs, Retrievers, Tools, etc. Output is streamed as Log objects, which include a list of jsonpatch ops that describe how the state of the run has changed in each step, and the final state of the run. The jsonpatch ops can be applied in order to construct state.

    Parameters

    • input: string | BaseMessage
    • Optional options: Partial<BaseCallbackConfig>
    • Optional streamOptions: Omit<LogStreamCallbackHandlerInput, "autoClose">

    Returns AsyncGenerator<RunLogPatch, any, unknown>
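
    Example

    A sketch of consuming the run log stream; `chain` is assumed to be a model-plus-parser chain as in the earlier examples:

      for await (const patch of chain.streamLog("The weather is wonderful today!")) {
        // Each RunLogPatch carries a list of jsonpatch ops describing how the run
        // state changed, e.g. additions to /streamed_output as chunks arrive.
        console.log(patch.ops);
      }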

  • transform: Transforms an asynchronous generator of input into an asynchronous generator of parsed output.

    Parameters

    • inputGenerator: AsyncGenerator<string | BaseMessage, any, unknown>

      An asynchronous generator of input.

    • options: BaseCallbackConfig

      A configuration object.

    Returns AsyncGenerator<object, any, unknown>

    An asynchronous generator of parsed output.
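
    Example

    A sketch of transforming a generator of chat messages into a generator of parsed objects; the single message below stands in for a model's streamed output, the function name and field are placeholders, and import paths may vary:

      import { AIMessageChunk } from "@langchain/core/messages";
      import { JsonOutputFunctionsParser } from "langchain/output_parsers";

      const parser = new JsonOutputFunctionsParser();

      // Simulated model output: one chunk carrying a complete function call.
      async function* modelOutput() {
        yield new AIMessageChunk({
          content: "",
          additional_kwargs: {
            function_call: { name: "extractor", arguments: '{"tone": "positive"}' },
          },
        });
      }

      for await (const parsed of parser.transform(modelOutput(), {})) {
        console.log(parsed); // e.g. { tone: "positive" }
      }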
