Description

R Wrapper for OpenAI API.

An R wrapper of OpenAI API endpoints (see <https://platform.openai.com/docs/introduction> for details). This package covers Models, Completions, Chat, Edits, Images, Embeddings, Audio, Files, Fine-tunes, Moderations, and legacy Engines endpoints.

openai


Overview

{openai} is an R wrapper of OpenAI API endpoints. The package covers the Models, Completions, Chat, Edits, Images, Embeddings, Audio, Files, Fine-tunes, Moderations, and legacy Engines endpoints. The Engines endpoints are kept only for backward compatibility and will be removed soon.

Installation

The easiest way to install {openai} from CRAN is to use the “official” install.packages() function:

install.packages("openai")

You can also install the development version of {openai} from GitHub with:

if (!require(remotes))
    install.packages("remotes")
remotes::install_github("irudnyts/openai")

Authentication

To use the OpenAI API, you need to provide an API key. First, sign up for the OpenAI API on the OpenAI platform website. Once you have signed up and logged in, open your account page, click on Personal, and select View API keys in the drop-down menu. You can then copy the key by clicking on the green text Copy.

By default, the functions of {openai} will look for the OPENAI_API_KEY environment variable. To set it for the current R session, you can use the following command (where xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx should be replaced with your actual key):

Sys.setenv(
    OPENAI_API_KEY = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
)

Otherwise, you can add the key to the .Renviron file of the project. The following commands will open .Renviron for editing:

if (!require(usethis))
    install.packages("usethis")

usethis::edit_r_environ(scope = "project")

You can add the following line to the file (again, replace xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx with your actual key):

OPENAI_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Note: If you are using GitHub/GitLab, do not forget to add .Renviron to .gitignore!

Finally, you can always provide the key manually to the functions of the package.
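
If you go this route, a minimal sketch looks like the following. It assumes the functions accept an openai_api_key argument that overrides the environment variable, and the key shown is a placeholder:

library(openai)

# Placeholder key shown for illustration only; avoid hard-coding real keys.
create_completion(
    model = "ada",
    prompt = "Hello",
    openai_api_key = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
)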

Example

Functions of {openai} have self-explanatory names. For example, to create a completion, one can use the create_completion() function:

library(openai)

create_completion(
    model = "ada",
    prompt = "Generate a question and an answer"
)
#> $id
#> [1] "cmpl-6MiImjcaCSuQYY6u8UA2Mm0rCdbEo"
#> 
#> $object
#> [1] "text_completion"
#> 
#> $created
#> [1] 1670871532
#> 
#> $model
#> [1] "ada"
#> 
#> $choices
#>                                                                             text
#> 1  within 5 minutes, up to an hour depending on how your users are different and
#>   index logprobs finish_reason
#> 1     0       NA        length
#> 
#> $usage
#> $usage$prompt_tokens
#> [1] 7
#> 
#> $usage$completion_tokens
#> [1] 16
#> 
#> $usage$total_tokens
#> [1] 23

Further, one can generate an image with the DALL·E text-to-image model using create_image():

create_image("An astronaut riding a horse in a photorealistic style")
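
The call returns the parsed API response. Assuming the default response format, which returns image URLs in a data element (an assumption, not something shown above), the URL could be pulled out roughly like this:

# Sketch: keep the response and extract the generated image URL.
img <- create_image("An astronaut riding a horse in a photorealistic style")
img$data$url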

It is also possible to use ChatGPT’s gpt-3.5-turbo model via create_chat_completion():

create_chat_completion(
    model = "gpt-3.5-turbo",
    messages = list(
        list(
            "role" = "system",
            "content" = "You are a helpful assistant."
        ),
        list(
            "role" = "user",
            "content" = "Who won the world series in 2020?"
        ),
        list(
            "role" = "assistant",
            "content" = "The Los Angeles Dodgers won the World Series in 2020."
        ),
        list(
            "role" = "user",
            "content" = "Where was it played?"
        )
    )
)
#> $id
#> [1] "chatcmpl-6r7N6YXcMhg8xmVM4ohOcAmzPOy3f"
#> 
#> $object
#> [1] "chat.completion"
#> 
#> $created
#> [1] 1678117740
#> 
#> $model
#> [1] "gpt-3.5-turbo-0301"
#> 
#> $usage
#> $usage$prompt_tokens
#> [1] 56
#> 
#> $usage$completion_tokens
#> [1] 19
#> 
#> $usage$total_tokens
#> [1] 75
#> 
#> 
#> $choices
#> finish_reason index message.role
#> 1          stop     0    assistant
#> message.content
#> 1 The 2020 World Series was played at Globe Life Field in Arlington, Texas.
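
Judging by the printed output above, choices is returned as a data frame, so the assistant's reply could be extracted as sketched below (based on that structure, not a documented accessor):

chat <- create_chat_completion(
    model = "gpt-3.5-turbo",
    messages = list(
        list("role" = "user", "content" = "Where was the 2020 World Series played?")
    )
)

# The structure printed above suggests the reply text lives here:
chat$choices$message.content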

Finally, the Whisper speech-to-text model is available via create_transcription() and create_translation():

voice_sample_ua <- system.file("extdata", "sample-ua.m4a", package = "openai")
create_translation(file = voice_sample_ua, model = "whisper-1")
#> $text
#> [1] "I want to check how this model works"
Metadata

Version

0.4.1

License

Unknown

Platforms (75)

    Darwin
    FreeBSD
    Genode
    GHCJS
    Linux
    MMIXware
    NetBSD
    none
    OpenBSD
    Redox
    Solaris
    WASI
    Windows
  • aarch64-darwin
  • aarch64-genode
  • aarch64-linux
  • aarch64-netbsd
  • aarch64-none
  • aarch64_be-none
  • arm-none
  • armv5tel-linux
  • armv6l-linux
  • armv6l-netbsd
  • armv6l-none
  • armv7a-darwin
  • armv7a-linux
  • armv7a-netbsd
  • armv7l-linux
  • armv7l-netbsd
  • avr-none
  • i686-cygwin
  • i686-darwin
  • i686-freebsd
  • i686-genode
  • i686-linux
  • i686-netbsd
  • i686-none
  • i686-openbsd
  • i686-windows
  • javascript-ghcjs
  • loongarch64-linux
  • m68k-linux
  • m68k-netbsd
  • m68k-none
  • microblaze-linux
  • microblaze-none
  • microblazeel-linux
  • microblazeel-none
  • mips-linux
  • mips-none
  • mips64-linux
  • mips64-none
  • mips64el-linux
  • mipsel-linux
  • mipsel-netbsd
  • mmix-mmixware
  • msp430-none
  • or1k-none
  • powerpc-netbsd
  • powerpc-none
  • powerpc64-linux
  • powerpc64le-linux
  • powerpcle-none
  • riscv32-linux
  • riscv32-netbsd
  • riscv32-none
  • riscv64-linux
  • riscv64-netbsd
  • riscv64-none
  • rx-none
  • s390-linux
  • s390-none
  • s390x-linux
  • s390x-none
  • vc4-none
  • wasm32-wasi
  • wasm64-wasi
  • x86_64-cygwin
  • x86_64-darwin
  • x86_64-freebsd
  • x86_64-genode
  • x86_64-linux
  • x86_64-netbsd
  • x86_64-none
  • x86_64-openbsd
  • x86_64-redox
  • x86_64-solaris
  • x86_64-windows