

tidyllm is an R package designed to access various large language model APIs, including Claude, ChatGPT, Groq, Mistral, and local models via Ollama. Built for simplicity and functionality, it helps you generate text, analyze media, and integrate model feedback into your data workflows with ease.

Features

  • Multiple Model Support: Seamlessly switch between providers such as Claude, ChatGPT, Groq, Mistral, or Ollama, using the best of what each has to offer.
  • Media Handling: Extract and process text from PDFs and capture console output for messages. Upload image files or the last plot pane to multimodal models.
  • Interactive Messaging History: Manage an ongoing conversation with models, maintaining a structured history of messages and media interactions that is automatically formatted for each API.
  • Stateful Handling of Rate Limits: API rate limits are tracked statefully within each R session, and API functions can wait automatically for rate limits to reset.
  • Tidy Workflow: Use R’s functional programming features for a side-effect-free, pipeline-oriented operation style.
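
The provider-switching idea can be sketched as follows. This is a minimal illustration, not canonical package usage: it assumes valid API keys are configured, a running Ollama server, and that the illustrative model names are available to you.

```r
library("tidyllm")

# The same message object can be routed to different providers
msg <- llm_message("Summarise the idea of tidy data in one sentence.")

msg |> claude()                   # Anthropic's Claude
msg |> groq()                     # Groq
msg |> ollama(.model = "gemma2")  # a local model served by Ollama
```

Because each provider verb accepts the same message object, comparing answers across providers is a matter of changing the last step of the pipeline.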

Installation

To install tidyllm from CRAN, use:

install.packages("tidyllm")

Or for the development version from GitHub:

# Install devtools if not already installed
if (!requireNamespace("devtools", quietly = TRUE)) {
  install.packages("devtools")
}
devtools::install_github("edubruell/tidyllm")

Basic Example

Here’s a quick example using tidyllm to describe an image with the Claude model and follow up with a local open-source model:

library("tidyllm")
library("here")

# Describe an image with Claude
conversation <- llm_message("Describe this image", 
                            .imagefile = here("image.png")) |>
  claude()

# Use the description to query further with a local model via Ollama
conversation |>
  llm_message("Based on the previous description,
  what could the research in the figure be about?") |>
  ollama(.model = "gemma2")
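
To use a model's answer in downstream data work, the reply text can be pulled out of the conversation object. A hedged sketch, assuming the reply-extraction helper `last_reply()` available in your installed version of tidyllm (check the package reference for the exact helper name):

```r
# Extract the most recent assistant reply as a character vector
answer <- conversation |>
  ollama(.model = "gemma2") |>
  last_reply()

cat(answer)
```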

For more examples and advanced usage, check the Get Started vignette.

Please note: To use tidyllm, you need either a local Ollama installation or an active API key for one of the supported providers (e.g., Claude, ChatGPT). See the Get Started vignette for setup instructions.
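
API keys are typically supplied through environment variables before the first request. A sketch, with variable names following the providers' common conventions (see the Get Started vignette for the exact names tidyllm expects):

```r
# Set keys for the current session; for persistence, put these
# lines in your ~/.Renviron file instead
Sys.setenv(ANTHROPIC_API_KEY = "your-key-here")
Sys.setenv(GROQ_API_KEY = "your-key-here")

# Local models via Ollama need no key, only a running Ollama server
```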

Learn More

For detailed instructions and advanced features, see the Get Started vignette and the function reference on the package website.

Contributing

We welcome contributions! Feel free to open issues or submit pull requests on GitHub.

License

This project is licensed under the MIT License - see the LICENSE file for details.