LLM Message Handling

llm_message()
Create or Update a Large Language Model Message Object
df_llm_message()
Convert a Data Frame to an LLMMessage Object
last_reply()
Retrieve the Last Assistant Reply
last_user_message()
Retrieve the Last User Message
get_reply()
Retrieve an Assistant Reply by Index
get_user_message()
Retrieve a User Message by Index
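
A minimal sketch of how these message-handling functions fit together, assuming a valid Anthropic API key is configured for the `claude()` call; argument names beyond the message object itself are illustrative:

```r
library(tidyllm)

# Build a conversation: llm_message() creates the message object,
# claude() sends it and appends the assistant reply
conversation <- llm_message("Summarise rate limiting in one sentence.") |>
  claude()

# Inspect the conversation history
last_reply(conversation)         # most recent assistant reply
last_user_message(conversation)  # most recent user message
get_reply(conversation, 1)       # first assistant reply, by index
```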

Core API Functions

chatgpt()
Call the OpenAI API to interact with ChatGPT or o-series reasoning models
claude()
Call the Anthropic API to interact with Claude models
groq()
Call the Groq API to interact with fast open-source models hosted on Groq
mistral()
Send an LLMMessage to the Mistral API
ollama()
Send an LLMMessage to the Ollama API
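
Because each provider function accepts the same `LLMMessage` object, the same prompt can be routed to different backends. A sketch, assuming the relevant API keys (or a local Ollama server) are available:

```r
library(tidyllm)

# One message object, several providers
msg <- llm_message("Why is the sky blue?")

msg |> chatgpt()  # OpenAI
msg |> claude()   # Anthropic
msg |> ollama()   # local Ollama server
```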

Ollama Functions

ollama_download_model()
Download a model from the Ollama API
ollama_embedding()
Generate Embeddings Using Ollama API
ollama_list_models()
Retrieve and return model information from the Ollama API

Internals and Utility Functions

LLMMessage
Large Language Model Message Class
perform_api_request()
Perform an API request to interact with language models
generate_callback_function()
Generate API-Specific Callback Function for Streaming Responses
initialize_api_env()
Initialize or Retrieve API-specific Environment
rate_limit_info()
Get the current rate limit information for all or a specific API
update_rate_limit()
Update the standard API rate limit info in the hidden .tidyllm_rate_limit_env environment
wait_rate_limit()
Wait for rate-limit reset times to elapse, if necessary
parse_duration_to_seconds()
An internal function to parse the duration strings that OpenAI APIs return for rate-limit resets

PDF Processing

pdf_page_batch()
Batch Process PDF into LLM Messages