Description

Local Large Language Model Inference Engine.

Enables R users to run large language models locally using 'GGUF' model files and the 'llama.cpp' inference engine. Provides a complete R interface for loading models, generating text completions, and streaming responses in real-time. Supports local inference without requiring cloud APIs or internet connectivity, ensuring complete data privacy and control. Based on the 'llama.cpp' project by Georgi Gerganov (2023) <https://github.com/ggml-org/llama.cpp>.
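A minimal sketch of the load/complete/free workflow the description outlines. The function names (`edge_load_model`, `edge_completion`, `edge_free_model`), their arguments, and the model path are assumptions inferred from the description, not verified against the package's actual API:

```r
library(edgemodelr)

# Load a local GGUF model file (path and context size are illustrative)
ctx <- edge_load_model("models/llama-3.2-1b-q4.gguf", n_ctx = 2048)

# Generate a text completion entirely on-device; no cloud API or
# internet connectivity is involved
out <- edge_completion(ctx,
                       prompt = "Summarize in one line: R can run LLMs locally.",
                       n_predict = 64)
cat(out)

# Release the model's memory when finished
edge_free_model(ctx)
```

Because inference runs against a local GGUF file, the model must be downloaded once beforehand; after that the workflow is fully offline.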

edgemodelr Examples

This directory contains examples for using edgemodelr with modern small language models.

📁 Organization

🚀 Quick Start Examples

For immediate solutions and learning:

  • working_document_analysis.R - Simple document analysis (works immediately)
  • manual_setup_example.R - Troubleshooting model downloads
  • document_analysis_example.R - Alternative analysis approach

🎯 Professional Examples

For production applications:

  • 01_model_comparison.R - Modern model selection and comparison
  • 02_document_analysis.R - Advanced document analysis system
  • 03_content_generation.R - Multi-format content generation
  • 04_streaming_chat.R - Interactive streaming conversations
  • 05_model_benchmarking.R - Systematic model evaluation

🏃 Quick Start

# Option 1: Quick working solution
source(system.file("examples", "working_document_analysis.R", package = "edgemodelr"))

# Option 2: Professional system  
source(system.file("examples", "02_document_analysis.R", package = "edgemodelr"))
analyzer <- DocumentAnalyzer()
results <- analyzer$analyze_documents(your_texts)
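For the interactive streaming use case covered by 04_streaming_chat.R, a hedged sketch of what token-by-token output might look like; `edge_stream_completion` and its callback signature are assumptions based on the "streaming responses in real-time" feature in the description, not confirmed API:

```r
library(edgemodelr)

# Model path is illustrative; any local GGUF file would do
ctx <- edge_load_model("models/llama-3.2-1b-q4.gguf")

# Print each token as it is generated instead of waiting for the
# full completion (assumed callback protocol: return TRUE to continue)
edge_stream_completion(ctx, "Explain GGUF in one sentence.",
                       callback = function(chunk) {
                         cat(chunk$token)
                         TRUE
                       })

edge_free_model(ctx)
```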

📚 Choose Your Path

  • Need it working NOW → Start with working_document_analysis.R
  • Building production app → Use 02_document_analysis.R
  • Learning about models → Check 01_model_comparison.R
  • Having download issues → Use manual_setup_example.R

🆚 Quick vs Professional Examples

| Aspect         | Quick Examples          | Professional Examples     |
|----------------|-------------------------|---------------------------|
| Goal           | Solve immediate problem | Production-ready systems  |
| Complexity     | Simple, direct          | Comprehensive, feature-rich |
| Error Handling | Basic                   | Extensive                 |
| Documentation  | Minimal                 | Complete                  |

📖 Additional Documentation

  • ../../MODERN_MODELS.md - Guide to 2024 small language models
  • ../../EXAMPLES_GUIDE.md - Complete navigation guide

All examples include error handling and work with the latest 2024 quantized models including Llama 3.2, Phi-3.5, Qwen2.5, and Gemma 2.

Metadata

Version

0.2.0

License

Unknown

Platforms (78)

  • aarch64-darwin
  • aarch64-freebsd
  • aarch64-genode
  • aarch64-linux
  • aarch64-netbsd
  • aarch64-none
  • aarch64-uefi
  • aarch64-windows
  • aarch64_be-none
  • arm-none
  • armv5tel-linux
  • armv6l-linux
  • armv6l-netbsd
  • armv6l-none
  • armv7a-linux
  • armv7a-netbsd
  • armv7l-linux
  • armv7l-netbsd
  • avr-none
  • i686-cygwin
  • i686-freebsd
  • i686-genode
  • i686-linux
  • i686-netbsd
  • i686-none
  • i686-openbsd
  • i686-windows
  • javascript-ghcjs
  • loongarch64-linux
  • m68k-linux
  • m68k-netbsd
  • m68k-none
  • microblaze-linux
  • microblaze-none
  • microblazeel-linux
  • microblazeel-none
  • mips-linux
  • mips-none
  • mips64-linux
  • mips64-none
  • mips64el-linux
  • mipsel-linux
  • mipsel-netbsd
  • mmix-mmixware
  • msp430-none
  • or1k-none
  • powerpc-linux
  • powerpc-netbsd
  • powerpc-none
  • powerpc64-linux
  • powerpc64le-linux
  • powerpcle-none
  • riscv32-linux
  • riscv32-netbsd
  • riscv32-none
  • riscv64-linux
  • riscv64-netbsd
  • riscv64-none
  • rx-none
  • s390-linux
  • s390-none
  • s390x-linux
  • s390x-none
  • vc4-none
  • wasm32-wasi
  • wasm64-wasi
  • x86_64-cygwin
  • x86_64-darwin
  • x86_64-freebsd
  • x86_64-genode
  • x86_64-linux
  • x86_64-netbsd
  • x86_64-none
  • x86_64-openbsd
  • x86_64-redox
  • x86_64-solaris
  • x86_64-uefi
  • x86_64-windows