
Package index
- llm_message() - Create or Update Large Language Model Message Object
- df_llm_message() - Convert a Data Frame to an LLMMessage Object
- get_reply() last_reply() - Retrieve Assistant Reply as Text
- get_reply_data() last_reply_data() - Retrieve Assistant Reply as Structured Data
- get_user_message() last_user_message() - Retrieve a User Message by Index
- get_metadata() last_metadata() - Retrieve Metadata from Assistant Replies
- get_logprobs() - Retrieve Log Probabilities from Assistant Replies
- rate_limit_info() - Get the current rate limit information for all or a specific API
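The functions above build message histories and pull answers back out of them. A minimal sketch of that round trip, using the chat() verb and a provider function from the next group; the model name is an illustrative assumption, so check the individual help pages for exact signatures.

```r
library(tidyllm)

# Build a conversation and send it to a provider
# (the .model value is an illustrative assumption)
conversation <- llm_message("List three advantages of R for data analysis.") |>
  chat(openai(.model = "gpt-4o-mini"))

# Latest assistant reply as plain text
last_reply(conversation)

# Metadata (model, token usage, timestamps) for the last reply
last_metadata(conversation)
```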
Tidyllm Main Verbs
Core verbs facilitating interactions with LLMs, including sending messages, generating embeddings, and managing batch requests.
- chat() - Chat with a Language Model
- embed() - Generate text embeddings
- send_batch() - Send a batch of messages to a batch API
- check_batch() - Check Batch Processing Status
- fetch_batch() - Fetch Results from a Batch API
- list_batches() - List all Batch Requests on a Batch API
- list_models() - List Available Models for a Provider
- tidyllm_schema() - Create a JSON Schema for Structured Outputs
- field_chr() field_fct() field_dbl() field_lgl() - Define Field Descriptors for JSON Schema
- field_object() - Define a nested object field
- tidyllm_tool() - Create a Tool Definition for tidyllm
- img() - Create an Image Object
- openai() - OpenAI Provider Function
- claude() - Provider Function for Claude models on the Anthropic API
- gemini() - Google Gemini Provider Function
- groq() - Groq API Provider Function
- mistral() - Mistral Provider Function
- ollama() - Ollama API Provider Function
- perplexity() - Perplexity Provider Function
- deepseek() - Deepseek Provider Function
- voyage() - Voyage Provider Function
- azure_openai() - Azure OpenAI Endpoint Provider Function
- chatgpt() - Alias for the OpenAI Provider Function
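A sketch of how the main verbs compose with the provider functions. Model names are examples only, and details such as how a schema built with tidyllm_schema() is handed to chat() (shown here as .json_schema) and the argument order of embed() and send_batch() are assumptions to verify against the documentation.

```r
library(tidyllm)

# Route the same message to different providers
msg <- llm_message("Summarise the Solow growth model in two sentences.")
msg |> chat(claude())
msg |> chat(ollama(.model = "gemma2"))   # model name is an example, not a default

# Structured output: describe the expected fields, then parse the reply
bio_schema <- tidyllm_schema(
  full_name = field_chr(),
  age       = field_dbl(),
  employed  = field_lgl()
)
answer <- llm_message("Invent a short biography of a fictional economist.") |>
  chat(openai(), .json_schema = bio_schema)   # schema argument name is an assumption
get_reply_data(answer)

# Embeddings through the same provider interface
embed(c("gravity model", "trade elasticity"), openai())

# Batch workflow: send, poll, fetch
msgs  <- lapply(c("Capital of France?", "Capital of Peru?"), llm_message)
batch <- send_batch(msgs, claude())
check_batch(batch)               # poll processing status
results <- fetch_batch(batch)    # once the batch has finished
```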
OpenAI-Specific Functions
Functions for OpenAI services, including chat interactions, batch processing and embedding generation.
- openai_chat() - Send LLM Messages to the OpenAI Chat Completions API
- send_openai_batch() - Send a Batch of Messages to OpenAI Batch API
- check_openai_batch() - Check Batch Processing Status for OpenAI Batch API
- fetch_openai_batch() - Fetch Results for an OpenAI Batch
- list_openai_batches() - List OpenAI Batch Requests
- cancel_openai_batch() - Cancel an In-Progress OpenAI Batch
- openai_embedding() - Generate Embeddings Using OpenAI API
- openai_list_models() - List Available Models from the OpenAI API
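The OpenAI-specific functions mirror the main verbs but address the OpenAI endpoints directly. A sketch of a direct chat call and the batch round trip; the model name and the input type accepted by openai_embedding() are assumptions.

```r
library(tidyllm)

# Provider-specific functions can be called directly instead of via chat()
reply <- llm_message("Explain overfitting in one paragraph.") |>
  openai_chat(.model = "gpt-4o-mini")    # model name is an assumption
last_reply(reply)

# Embeddings straight from the OpenAI endpoint
openai_embedding("ordinary least squares")

# Batch processing on the OpenAI Batch API
msgs  <- lapply(c("2 + 2?", "3 * 7?"), llm_message)
batch <- send_openai_batch(msgs)
check_openai_batch(batch)              # poll the processing status
list_openai_batches()                  # overview of batches on the account
results <- fetch_openai_batch(batch)   # once completed
# cancel_openai_batch(batch)           # or abort an in-progress batch
```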
Claude-Specific Functions
Functions for Claude services, covering chat interactions, batch processing, and file management.
- claude_chat() - Interact with Claude AI models via the Anthropic API
- send_claude_batch() - Send a Batch of Messages to Claude API
- check_claude_batch() - Check Batch Processing Status for Claude API
- fetch_claude_batch() - Fetch Results for a Claude Batch
- list_claude_batches() - List Claude Batch Requests
- claude_list_models() - List Available Models from the Anthropic Claude API
- claude_upload_file() - Upload a File to Claude API
- claude_delete_file() - Delete a File from Claude API
- claude_file_metadata() - Retrieve Metadata for a File from Claude API
- claude_list_files() - List Files in Claude API
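Beyond chat and batching, the Claude functions include file management. A sketch of the upload/inspect/delete cycle; the unnamed path argument and the id field used below are assumptions rather than the documented interface.

```r
library(tidyllm)

# Upload a document to the Anthropic Files API
# (the returned metadata shape is an assumption)
upload <- claude_upload_file("reports/q3_summary.pdf")

# List stored files and inspect the one just uploaded
claude_list_files()
claude_file_metadata(upload$id)

# Remove the file again
claude_delete_file(upload$id)
```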
Gemini-Specific Functions
Functions specific to Google Gemini services, including chat, embedding, and file management operations.
- gemini_chat() - Send LLMMessage to Gemini API
- gemini_embedding() - Generate Embeddings Using the Google Gemini API
- gemini_upload_file() - Upload a File to Gemini API
- gemini_list_files() - List Files in Gemini API
- gemini_file_metadata() - Retrieve Metadata for a File from Gemini API
- gemini_delete_file() - Delete a File from Gemini API
- send_gemini_batch() - Submit a list of LLMMessage objects to Gemini's batch API
- check_gemini_batch() - Check the Status of a Gemini Batch Operation
- list_gemini_batches() - List Recent Gemini Batch Operations
- fetch_gemini_batch() - Fetch Results for a Gemini Batch
- gemini_list_models() - List Available Models from the Google Gemini API
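A sketch of the Gemini file and embedding helpers; the unnamed path argument, the name field on the upload result, and the input type accepted by gemini_embedding() are assumptions to confirm in the function documentation.

```r
library(tidyllm)

# Upload a file to the Gemini Files API and look it up again
upload <- gemini_upload_file("data/interview_transcript.mp3")
gemini_list_files()
gemini_file_metadata(upload$name)

# Embeddings from a Gemini embedding model
gemini_embedding(c("monetary policy", "fiscal policy"))

# Clean up
gemini_delete_file(upload$name)
```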
Ollama-Specific Functions
Functions for engaging with Ollama services, including chat, embedding, and model management.
- ollama_chat() - Interact with local AI models via the Ollama API
- ollama_embedding() - Generate Embeddings Using Ollama API
- send_ollama_batch() - Send a Batch of Messages to Ollama API
- ollama_download_model() - Download a model from the Ollama API
- ollama_delete_model() - Delete a model from the Ollama API
- ollama_list_models() - Retrieve and return model information from the Ollama API
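The Ollama functions target a locally running Ollama server. A sketch of pulling a model, chatting against it, and embedding locally; the model names are examples and the .model argument to ollama_embedding() is an assumption.

```r
library(tidyllm)

# Pull a model into the local Ollama installation and list what is available
ollama_download_model("gemma2")
ollama_list_models()

# Chat against the local model, bypassing the generic chat() verb
llm_message("What is a fixed-effects estimator?") |>
  ollama_chat(.model = "gemma2") |>
  last_reply()

# Local embeddings
ollama_embedding("panel data", .model = "nomic-embed-text")
```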
Mistral-Specific Functions
Functions for Mistral services, including chat, embedding, and batch processing.
- mistral_chat() - Send LLMMessage to Mistral API
- mistral_embedding() - Generate Embeddings Using Mistral API
- send_mistral_batch() - Send a Batch of Requests to the Mistral API
- check_mistral_batch() - Check Batch Processing Status for Mistral Batch API
- fetch_mistral_batch() - Fetch Results for a Mistral Batch
- list_mistral_batches() - List Mistral Batch Requests
- mistral_list_models() - List Available Models from the Mistral API
Perplexity-Specific Functions
Functions for chat interactions with the Perplexity API.
- perplexity_chat() - Send LLM Messages to the Perplexity Chat API (All Features, No .json Option)
Groq-Specific Functions
Functions for interacting with Groq services, such as chat and transcription.
- groq_chat() - Send LLM Messages to the Groq Chat API
- groq_transcribe() - Transcribe an Audio File Using Groq transcription API
- groq_list_models() - List Available Models from the Groq API
- send_groq_batch() - Send a Batch of Messages to the Groq API
- check_groq_batch() - Check Batch Processing Status for Groq API
- fetch_groq_batch() - Fetch Results for a Groq Batch
- list_groq_batches() - List Groq Batch Requests
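Groq additionally offers audio transcription. A sketch combining groq_transcribe() with a Groq chat call; the file path is a placeholder and the assumption that groq_transcribe() returns plain transcript text should be checked against its help page.

```r
library(tidyllm)

# Transcribe an audio file with Groq's hosted transcription models
transcript <- groq_transcribe("interviews/session_01.mp3")

# Feed the transcript into a Groq-hosted chat model
llm_message(paste("Summarise this interview:", transcript)) |>
  groq_chat() |>
  last_reply()
```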
Azure OpenAI-Specific Functions
Functions for OpenAI services hosted on Azure, including chat, embedding, and batch processing.
- azure_openai_chat() - Send LLM Messages to an Azure OpenAI Chat Completions endpoint
- azure_openai_embedding() - Generate Embeddings Using OpenAI API on Azure
- send_azure_openai_batch() - Send a Batch of Messages to Azure OpenAI Batch API
- check_azure_openai_batch() - Check Batch Processing Status for Azure OpenAI Batch API
- list_azure_openai_batches() - List Azure OpenAI Batch Requests
- fetch_azure_openai_batch() - Fetch Results for an Azure OpenAI Batch
Other Functions
Remaining provider-specific functions, PDF batch processing, and the underlying message class.
- deepseek_chat() - Send LLM Messages to the DeepSeek Chat API
- voyage_embedding() - Generate Embeddings Using Voyage AI API
- pdf_page_batch() - Batch Process PDF into LLM Messages
- LLMMessage() - Large Language Model Message Class
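pdf_page_batch() combines with the batch verbs for document-scale workflows: each page becomes an LLMMessage that can be processed asynchronously. In the sketch below the .general_prompt argument name and the shape of the fetched results (a list of LLMMessage objects) are assumptions.

```r
library(tidyllm)

# Turn each page of a PDF into its own LLMMessage
pages <- pdf_page_batch("papers/working_paper.pdf",
                        .general_prompt = "Summarise this page in two sentences.")

# Push all pages through a batch API and collect one reply per page
batch   <- send_batch(pages, claude())
results <- fetch_batch(batch)     # once the batch has finished
sapply(results, get_reply)
```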