Sends a single user message (with optional system message) to a specified LLM provider and model, handling authentication and API differences. Depending on output_format, returns the extracted text response or appends the full JSON response to a JSONL file.
Arguments
- user
Character string. The user's message/prompt. Required.
- system
Character string. An optional system message/instruction. Default is NULL.
- org
Character vector. The LLM provider. Defaults to "google". Handles partial matching (e.g., "goog", "anthro", "open"). Allowed values: "google", "anthropic", "openai".
- model
Character string. The specific model ID to use. If NULL (default), a provider-specific default is chosen (e.g., "gemini-2.0-flash", "claude-3-haiku-20240307", "gpt-4o-mini").
- temperature
Numeric. Sampling temperature (>= 0). Lower values are more deterministic. Default is 0.0. Note: Different providers may have different effective upper bounds (e.g., Google <= 1.0, OpenAI/Anthropic <= 2.0). Validation only checks for >= 0.
- max_tokens
Integer. Maximum number of tokens to generate in the response. Default is 1024L. Required by some providers (Anthropic).
- timeout
Numeric. Request timeout in seconds. Default is 60.
- output_format
Character vector. How to return the result. Allowed values: "text" (default), "jsonl". Handles partial matching.
- jsonl_file
Character string. The path to the output file if output_format is "jsonl". Required in that case; otherwise ignored. The full JSON response will be appended as a single line.
Value
If output_format is "text", returns the extracted text content as a character string (or NA if extraction fails).
If output_format is "jsonl", appends the full JSON response to the specified file and returns invisible(extracted_text), where extracted_text is the attempted text extraction (possibly NA).
Stops execution with an error message on failure (e.g., missing API key, API error, validation failure, JSON parsing failure). Extraction failures when output_format is "jsonl" produce a warning but allow the function to complete the file write.
Examples
if (FALSE) { # \dontrun{
# Ensure API keys are set as environment variables:
# Sys.setenv(GOOGLE_API_KEY = "YOUR_GOOGLE_KEY")
# Sys.setenv(ANTHROPIC_API_KEY = "YOUR_ANTHROPIC_KEY")
# Sys.setenv(OPENAI_API_KEY = "YOUR_OPENAI_KEY")
# --- Text Output Examples ---
# Google (default org)
response_text_google <- single_turn(user = "Explain the concept of recursion simply.")
print(response_text_google)
# Anthropic
response_text_anthropic <- single_turn(
  user = "Write a short poem about R programming.",
  org = "anthropic",
  model = "claude-3-sonnet-20240229" # Use Sonnet instead of default Haiku
)
print(response_text_anthropic)
# OpenAI with system message
response_text_openai <- single_turn(
  user = "Why is the sky blue?",
  system = "Explain like I'm five years old.",
  org = "openai",
  model = "gpt-4o",
  temperature = 0.7
)
print(response_text_openai)
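# --- Partial Matching Example ---
# As documented for the `org` argument, abbreviated provider names are
# accepted via partial matching, so "anthro" resolves to "anthropic".
# (The prompt below is illustrative only.)
response_text_partial <- single_turn(
  user = "Name one common use of the apply() family in R.",
  org = "anthro" # resolves to "anthropic"
)
print(response_text_partial)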
# --- JSONL Output Example ---
tmp_file <- tempfile(fileext = ".jsonl")
print(paste("Writing JSONL output to:", tmp_file))
# The return value is now the text (invisibly)
invisible_text_google <- single_turn(
  user = "What is the capital of France?",
  org = "google",
  output_format = "jsonl",
  jsonl_file = tmp_file
)
# Can capture it if needed:
print(paste("Invisible text from Google call:", invisible_text_google))
invisible_text_openai <- single_turn(
  user = "What are the main benefits of using version control?",
  system = "You are a helpful software development assistant.",
  org = "openai",
  output_format = "jsonl",
  jsonl_file = tmp_file
)
print(paste("Invisible text from OpenAI call:", invisible_text_openai))
# Read the results from the file
results <- readLines(tmp_file)
cat("Contents of JSONL file:\n")
cat(results, sep = "\n")
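# Since each line in the file is a standalone JSON document, individual
# records can be parsed back into R lists, e.g. with jsonlite (assumed
# to be installed; the exact response fields depend on the provider):
first_record <- jsonlite::fromJSON(results[1])
str(first_record, max.level = 1)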
# Clean up the temporary file
unlink(tmp_file)
} # }