Package index
llm_message() - Create or Update Large Language Model Message Object
df_llm_message() - Convert a Data Frame to an LLMMessage Object
get_reply(), last_reply() - Retrieve Assistant Reply as Text
get_reply_data(), last_reply_data() - Retrieve Assistant Reply as Structured Data
get_user_message(), last_user_message() - Retrieve a User Message by Index
get_metadata(), last_metadata() - Retrieve Metadata from Assistant Replies
tidyllm_schema() - Create a JSON schema for structured outputs
rate_limit_info() - Get the current rate limit information for all or a specific API
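A minimal sketch of how these message and retrieval helpers fit together with the chat() verb documented below; the tidyllm_schema() field syntax and the .json_schema argument name are assumptions and may differ in the installed version:

    library(tidyllm)

    # Build a conversation object and send it with the chat() verb (see Main Verbs below)
    conv <- llm_message("List three renewable energy sources.") |>
      chat(openai())

    # Latest assistant reply as plain text
    last_reply(conv)

    # Structured output: define a schema, request it, then parse the reply
    # (field syntax and the .json_schema argument are assumptions)
    schema <- tidyllm_schema(name = "sources", source = "character", capacity_gw = "numeric")
    conv2 <- llm_message("Name one renewable energy source and its global capacity in GW.") |>
      chat(openai(), .json_schema = schema)
    last_reply_data(conv2)

    # Token usage and current rate limits
    last_metadata(conv2)
    rate_limit_info()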
Tidyllm Main Verbs
Core verbs facilitating interactions with LLMs, including sending messages, generating embeddings, and managing batch requests.
chat() - Chat with a Language Model
embed() - Generate text embeddings
send_batch() - Send a batch of messages to a batch API
check_batch() - Check Batch Processing Status
fetch_batch() - Fetch Results from a Batch API
list_batches() - List all Batch Requests on a Batch API
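A sketch of the verb-based workflow, with the provider object passed to each verb; exact argument names (for example .model) are assumptions:

    library(tidyllm)

    # One chat turn against a local model
    conv <- llm_message("Explain overfitting in one sentence.") |>
      chat(ollama(.model = "llama3.2"))      # .model argument name is an assumption

    # Embeddings for a character vector
    embeddings <- embed(c("first text", "second text"), openai())

    # Batch lifecycle: send, poll, fetch, list
    msgs  <- list(llm_message("Summarise document A"), llm_message("Summarise document B"))
    batch <- send_batch(msgs, claude())
    check_batch(batch)                       # poll processing status
    results <- fetch_batch(batch)
    list_batches(claude())                   # overview of batch requests on the API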
Provider Functions
Functions that create provider objects, which are passed to the main verbs to select the backend API.

openai() - OpenAI Provider Function
claude() - Provider Function for Claude models on the Anthropic API
gemini() - Google Gemini Provider Function
groq() - Groq API Provider Function
mistral() - Mistral Provider Function
ollama() - Ollama API Provider Function
azure_openai() - Azure-OpenAI Endpoint Provider Function
chatgpt() - Alias for the OpenAI Provider Function
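Providers are interchangeable, so the same message can be routed to any backend by swapping the provider object; the model name below is only an example:

    library(tidyllm)

    msg <- llm_message("What is the capital of Australia?")

    msg |> chat(openai(.model = "gpt-4o"))   # OpenAI API (model name is an example)
    msg |> chat(claude())                    # Anthropic API
    msg |> chat(gemini())                    # Google Gemini API
    msg |> chat(mistral())                   # Mistral API
    msg |> chat(groq())                      # Groq API
    msg |> chat(azure_openai())              # Azure-hosted OpenAI endpoint
    msg |> chat(ollama())                    # local models via Ollama
    msg |> chat(chatgpt())                   # alias for openai()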
OpenAI-Specific Functions
Functions for OpenAI services, including chat interactions, batch processing and embedding generation.
openai_chat() - Send LLM Messages to the OpenAI Chat Completions API
send_openai_batch() - Send a Batch of Messages to OpenAI Batch API
check_openai_batch() - Check Batch Processing Status for OpenAI Batch API
fetch_openai_batch() - Fetch Results for an OpenAI Batch
list_openai_batches() - List OpenAI Batch Requests
openai_embedding() - Generate Embeddings Using OpenAI API
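The provider-specific functions can also be called directly instead of going through the generic verbs; argument and model names here are assumptions:

    library(tidyllm)

    # Chat directly against the OpenAI Chat Completions API
    conv <- llm_message("Write a haiku about autumn.") |>
      openai_chat(.model = "gpt-4o-mini")    # argument and model names are assumptions

    # Embeddings straight from the OpenAI API
    vecs <- openai_embedding(c("alpha", "beta"))

    # Batch lifecycle on the OpenAI Batch API
    batch <- send_openai_batch(list(llm_message("Task 1"), llm_message("Task 2")))
    check_openai_batch(batch)
    fetch_openai_batch(batch)
    list_openai_batches()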
Claude-Specific Functions
Functions for Claude services, covering chat interactions and batch processing.
claude_chat() - Interact with Claude AI models via the Anthropic API
send_claude_batch() - Send a Batch of Messages to Claude API
check_claude_batch() - Check Batch Processing Status for Claude API
fetch_claude_batch() - Fetch Results for a Claude Batch
list_claude_batches() - List Claude Batch Requests
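A sketch of the same chat and batch pattern against the Anthropic API; argument names are assumptions:

    library(tidyllm)

    # Chat with Claude models on the Anthropic API
    conv <- llm_message("Summarise the idea of comparative advantage.") |>
      claude_chat()

    # Claude batch lifecycle
    batch <- send_claude_batch(list(llm_message("Task A"), llm_message("Task B")))
    check_claude_batch(batch)
    fetch_claude_batch(batch)
    list_claude_batches()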
Gemini-Specific Functions
Functions specific to Google Gemini services, including chat, embedding, and file management operations.
gemini_chat() - Send LLMMessage to Gemini API
gemini_embedding() - Generate Embeddings Using the Google Gemini API
gemini_upload_file() - Upload a File to Gemini API
gemini_list_files() - List Files in Gemini API
gemini_file_metadata() - Retrieve Metadata for a File from Gemini API
gemini_delete_file() - Delete a File from Gemini API
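A sketch of chat, embedding, and the file-management cycle on the Gemini API; whether the file functions take a path, a name, or a file object is an assumption:

    library(tidyllm)

    # Chat and embeddings on the Gemini API
    conv <- llm_message("Describe the Kalman filter intuitively.") |>
      gemini_chat()
    vecs <- gemini_embedding("a short text to embed")

    # File management: upload, list, inspect, delete
    # (the exact form of the file reference is an assumption)
    uploaded <- gemini_upload_file("report.pdf")
    gemini_list_files()
    gemini_file_metadata(uploaded)
    gemini_delete_file(uploaded)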
Groq-Specific Functions
Functions for interacting with Groq services, such as chat and transcription.
groq_chat() - Send LLM Messages to the Groq Chat API
groq_transcribe() - Transcribe an Audio File Using Groq transcription API
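A short sketch of chat and transcription on the Groq API; how the audio path is passed is an assumption:

    library(tidyllm)

    # Chat on the Groq API
    conv <- llm_message("Give a one-line definition of entropy.") |>
      groq_chat()

    # Transcribe an audio file with the Groq transcription API
    # (the way the audio path is supplied is an assumption)
    transcript <- groq_transcribe("interview.mp3")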
Mistral-Specific Functions
Functions for interacting with Mistral services, including chat and embedding generation.

mistral_chat() - Send LLMMessage to Mistral API
mistral_embedding() - Generate Embeddings Using Mistral API
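A short sketch of direct chat and embedding calls on the Mistral API; argument names are assumptions:

    library(tidyllm)

    # Chat and embeddings on the Mistral API
    conv <- llm_message("Translate 'good morning' into French.") |>
      mistral_chat()
    vecs <- mistral_embedding(c("un", "deux", "trois"))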
Azure OpenAI-Specific Functions
Functions for OpenAI services hosted on Azure, including chat and embedding generation.

azure_openai_chat() - Send LLM Messages to an OpenAI Chat Completions endpoint on Azure
azure_openai_embedding() - Generate Embeddings Using OpenAI API on Azure
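A short sketch for Azure-hosted OpenAI endpoints; how the endpoint and deployment are configured (arguments or environment variables, and their names) is an assumption:

    library(tidyllm)

    # Azure-hosted OpenAI endpoint (endpoint/deployment configuration is an assumption)
    conv <- llm_message("Say hello.") |>
      azure_openai_chat()
    vecs <- azure_openai_embedding("text to embed")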
Ollama-Specific Functions
Functions for engaging with Ollama services, including chat, embedding, and model management.
ollama_chat() - Interact with local AI models via the Ollama API
ollama_embedding() - Generate Embeddings Using Ollama API
ollama_download_model() - Download a model from the Ollama API
ollama_list_models() - Retrieve and return model information from the Ollama API
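A sketch of pulling a model into a local Ollama server and then using it; the model-name and .model arguments are assumptions:

    library(tidyllm)

    # Pull a model into the local Ollama server, then use it
    # (the model-name argument is an assumption)
    ollama_download_model("llama3.2")
    ollama_list_models()

    conv <- llm_message("Explain the CAP theorem briefly.") |>
      ollama_chat(.model = "llama3.2")
    vecs <- ollama_embedding("text to embed")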
Other Functions and Classes
PDF preprocessing and the underlying message class.

pdf_page_batch() - Batch Process PDF into LLM Messages
LLMMessage - Large Language Model Message Class
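A sketch of feeding a PDF into the batch workflow, assuming pdf_page_batch() returns a list of LLMMessage objects (one per page) as its title suggests; the prompt argument name is an assumption:

    library(tidyllm)

    # Turn each page of a PDF into an LLMMessage, then process them as a batch
    # (the prompt argument name is an assumption)
    page_messages <- pdf_page_batch("paper.pdf",
                                    .general_prompt = "Extract the key claims on this page.")
    batch   <- send_batch(page_messages, openai())
    results <- fetch_batch(batch)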