Description

Unified Interface for Large Language Model Interactions.

Provides a unified interface for interacting with Large Language Models (LLMs) through various providers including OpenAI <https://platform.openai.com/docs/api-reference>, Ollama <https://ollama.com/>, and other OpenAI-compatible APIs. Features include automatic connection testing, max_tokens limit auto-adjustment, structured JSON responses with schema validation, interactive JSON schema generation, prompt templating, and comprehensive diagnostics.

llmhelper


Overview

llmhelper provides a unified, user-friendly interface for interacting with Large Language Models (LLMs) from R. Key features:

  • Multiple Provider Support: OpenAI, Ollama, DeepSeek, and any OpenAI-compatible APIs
  • Automatic Connection Testing: Validates your LLM setup before use
  • Smart max_tokens Handling: Auto-adjusts when model limits are exceeded
  • Structured JSON Responses: With schema validation
  • Interactive JSON Schema Generator: Create schemas through conversation
  • Prompt Templating: Build dynamic prompts with variable injection
  • Comprehensive Diagnostics: Debug connection issues easily

Installation

From CRAN (once available)

install.packages("llmhelper")

From GitHub

# install.packages("pak")
pak::pak("Zaoqu-Liu/llmhelper")

From R-universe

install.packages("llmhelper", repos = "https://Zaoqu-Liu.r-universe.dev")

Quick Start

Setting up an LLM Provider

library(llmhelper)

# OpenAI
openai_client <- llm_provider(
  base_url = "https://api.openai.com/v1/chat/completions",
  api_key = Sys.getenv("OPENAI_API_KEY"),
  model = "gpt-4o-mini"
)

# Ollama (local)
ollama_client <- llm_ollama(
  model = "qwen2.5:1.5b-instruct",
  auto_download = TRUE
)

# DeepSeek
deepseek_client <- llm_provider(
  base_url = "https://api.deepseek.com/v1/chat/completions",
  api_key = Sys.getenv("DEEPSEEK_API_KEY"),
  model = "deepseek-chat"
)

Getting Responses

# Simple text response
response <- get_llm_response(
  prompt = "What is machine learning?",
  llm_client = openai_client,
  max_words = 100
)

# Structured JSON response
schema <- list(
  name = "analysis_result",
  schema = list(
    type = "object",
    properties = list(
      summary = list(type = "string", description = "Brief summary"),
      key_points = list(
        type = "array",
        items = list(type = "string"),
        description = "Main key points"
      ),
      confidence = list(type = "number", description = "Confidence score 0-1")
    ),
    required = c("summary", "key_points", "confidence")
  )
)

json_response <- get_llm_response(
  prompt = "Analyze the benefits of R programming",
  llm_client = openai_client,
  json_schema = schema
)

Using Prompt Templates

template <- "
Analyze the following dataset: {dataset_name}
Focus on: {focus_area}
Output format: {output_format}
"

prompt <- build_prompt(
  template = template,
  dataset_name = "iris",
  focus_area = "species classification",
  output_format = "bullet points"
)

Interactive JSON Schema Generation

result <- generate_json_schema(
  description = "A user profile with name, email, and preferences",
  llm_client = openai_client
)

# Use the generated schema
final_schema <- extract_schema_only(result)

Managing Ollama Models

# List available models
ollama_list_models()

# Download a new model
ollama_download_model("llama3.2:1b")

# Delete a model
ollama_delete_model("old-model:latest")

Diagnostics

# Debug connection issues
diagnose_llm_connection(
  base_url = "https://api.openai.com/v1/chat/completions",
  api_key = Sys.getenv("OPENAI_API_KEY"),
  model = "gpt-4o-mini"
)

Main Functions

Function                     Description
llm_provider()               Create an OpenAI-compatible LLM provider
llm_ollama()                 Create an Ollama LLM provider
get_llm_response()           Get text or JSON responses from an LLM
build_prompt()               Build prompts from templates
set_prompt()                 Create prompt objects with system/user messages
generate_json_schema()       Interactively generate JSON schemas
diagnose_llm_connection()    Debug connection issues
ollama_list_models()         List available Ollama models
ollama_download_model()      Download Ollama models
ollama_delete_model()        Delete Ollama models
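set_prompt() is the only function above without a Quick Start example. A hypothetical sketch of separating system and user messages; the exact argument names may differ, so check ?set_prompt:

```r
# Hypothetical usage: argument names are assumptions, not confirmed API.
p <- set_prompt(
  system = "You are a concise data-analysis assistant.",
  user   = "Summarize the structure of the iris dataset."
)

# A prompt object should be usable wherever a plain prompt string is.
response <- get_llm_response(prompt = p, llm_client = openai_client)
```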

Environment Variables

Set your API keys as environment variables:

# In your .Renviron file or before using the package:
Sys.setenv(OPENAI_API_KEY = "your-openai-key")
Sys.setenv(DEEPSEEK_API_KEY = "your-deepseek-key")
Sys.setenv(LLM_API_KEY = "your-default-key")
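For keys that should persist across sessions, the same variables can go in your .Renviron file (one KEY=value pair per line, no quotes needed), which R reads at startup:

```
# ~/.Renviron
OPENAI_API_KEY=your-openai-key
DEEPSEEK_API_KEY=your-deepseek-key
LLM_API_KEY=your-default-key
```

After editing .Renviron, restart R (or call readRenviron("~/.Renviron")) for the values to take effect.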

Requirements

  • R >= 4.1.0
  • A running Ollama server (for local LLM usage)
  • Valid API keys (for cloud LLM providers)

Related Packages

Citation

citation("llmhelper")

License

GPL (>= 3)

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Author

Zaoqu Liu ([email protected])

Metadata

Version

1.0.0
