Tidy Integration of Large Language Models

Documentation for package ‘tidyllm’ version 0.1.0

Help Pages

chatgpt Call the OpenAI API to interact with ChatGPT or o-reasoning models
claude Call the Anthropic API to interact with Claude models
df_llm_message Convert a Data Frame to an LLMMessage Object
generate_callback_function Generate API-Specific Callback Function for Streaming Responses
groq Call the Groq API to interact with fast open-source models hosted on Groq
initialize_api_env Initialize or Retrieve API-specific Environment
last_reply Retrieve Last Reply from an Assistant
LLMMessage Large Language Model Message Class
llm_message Create or Update Large Language Model Message Object
ollama Send an LLMMessage to the Ollama API
ollama_list_models Retrieve and return model information from the Ollama API
parse_duration_to_seconds An internal function to parse the duration strings that OpenAI APIs return for rate-limit resets
perform_api_request Perform an API request to interact with language models
rate_limit_info Get the current rate limit information for all or a specific API
update_rate_limit Update the standard API rate limit info in the hidden .tidyllm_rate_limit_env environment
wait_rate_limit Wait for rate-limit reset times to elapse if necessary
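
The functions above compose into a simple pipeline: build a message with llm_message(), send it to a provider function such as claude() or chatgpt(), and extract the answer with last_reply(). A minimal sketch, assuming an API key (e.g. ANTHROPIC_API_KEY) is set in the environment; exact argument names may differ from the installed version:

```r
library(tidyllm)

# Build a message object and send it to a model backend.
# claude() could be swapped for chatgpt(), groq(), or ollama().
conversation <- llm_message("Explain rate limiting in one sentence.") |>
  claude()

# Extract the assistant's last reply as a character string.
last_reply(conversation)
```

The same conversation object can be piped into further llm_message() calls to continue a multi-turn exchange, which is the "tidy" workflow the package name refers to.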