
This function generates a callback function that processes streaming responses from different language model APIs. The callback is specific to the API provided ("claude", "ollama", "mistral", or "chatgpt"): it processes incoming data chunks, prints the streamed content to the console, and accumulates it in the package's streaming environment for further use.

Usage

generate_callback_function(.api)

Arguments

.api

A character string indicating the API type. Supported values are "claude", "ollama", "mistral", and "chatgpt".

Value

A function that serves as a callback to handle streaming responses from the specified API. The callback function processes the raw data, updates the .tidyllm_stream_env$stream object, and prints the streamed content to the console. The function returns TRUE if streaming should continue, and FALSE when streaming is finished.
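For illustration, the sketch below shows one way such a callback might be wired into a streaming request with httr2::req_perform_stream(). The endpoint URL, model name, and request body fields are illustrative assumptions, not part of this function.

library(httr2)

# Build a callback for the Ollama API and stream a chat completion through it.
# The URL, model, and body fields below are illustrative assumptions.
callback <- generate_callback_function("ollama")

resp <- request("http://localhost:11434/api/chat") |>
  req_body_json(list(
    model    = "llama3",
    messages = list(list(role = "user", content = "Hello!")),
    stream   = TRUE
  )) |>
  req_perform_stream(callback)

# After streaming finishes, the accumulated text is available in
# .tidyllm_stream_env$stream, as described above.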

Details

  • For Claude API: The function processes event and data lines, and handles the message_start and message_stop events to control streaming flow.

  • For Ollama API: The function parses each streamed chunk as JSON and extracts the message$content field (see the sketch after this list).

  • For ChatGPT API: The function handles JSON data streams and processes content deltas. It stops processing when the [DONE] message is encountered.

  • For Mistral API: The function mirrors the ChatGPT callback and likewise stops processing when the [DONE] message is encountered.
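To make the Ollama case concrete, below is a minimal, hedged sketch of the kind of callback this function builds for that API. It assumes newline-delimited JSON chunks and jsonlite for parsing; apart from .tidyllm_stream_env$stream (named under "Value"), the names and details are illustrative rather than the package's actual implementation.

# Minimal sketch of an Ollama-style streaming callback.
# Assumes newline-delimited JSON chunks and jsonlite; illustrative only.
.tidyllm_stream_env <- new.env()
.tidyllm_stream_env$stream <- ""

ollama_callback_sketch <- function(raw_chunk) {
  lines <- strsplit(rawToChar(raw_chunk), "\n", fixed = TRUE)[[1]]
  lines <- lines[nzchar(lines)]
  for (line in lines) {
    parsed <- jsonlite::fromJSON(line)
    piece  <- parsed$message$content
    if (!is.null(piece) && nzchar(piece)) {
      cat(piece)                                      # print streamed text
      .tidyllm_stream_env$stream <-
        paste0(.tidyllm_stream_env$stream, piece)     # accumulate for later use
    }
    if (isTRUE(parsed$done)) return(FALSE)            # Ollama signals completion
  }
  TRUE                                                # keep streaming
}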