# genai-lib

## Synopsis
A library for interacting with various generative AI LLMs
## Description
A library for performing completions and chats with various generative AI LLMs (Large Language Models). Works today with Ollama and OpenAI with more to come in the future.
This project is very much in-progress and incomplete.
## Using this library

An example follows; more are available in `src/example`.
```haskell
import Control.Exception (Handler (..), catches)
import Data.Text.Lazy.IO qualified as TL
import GenAILib (ClientError, Request (..), jsonToText, mkRequest, numopt,
  stringopt, systemmsg, usermsg)
import GenAILib.HTTP (GenAIException, openaiV1Chat, openaiV1ChatJ,
  tokenFromFile)
import GenAILib.OpenAI (OpenAIRequest, getMessage)

main :: IO ()
main = do
  openaiJSON
  openaiData

-- A simple example with no error handling that expects an Aeson Value
-- (OpenAIRequest is an instance of ToJSON) and displays the encoded JSON
-- response
openaiJSON :: IO ()
openaiJSON = do
  let req :: OpenAIRequest = mkRequest "gpt-3.5-turbo" $ usermsg "Why is the sky blue?"
  token <- tokenFromFile "path/to/openai/key"
  TL.putStrLn . jsonToText =<< openaiV1ChatJ token Nothing req

-- Another example with exception handling that expects an OpenAIResponse
-- data structure, displaying that and also just the response text
openaiData :: IO ()
openaiData = do
  let req :: OpenAIRequest = mkRequest "gpt-3.5-turbo"
        (  systemmsg "Answer in the style of Bugs Bunny. Try to work the phrase \"What's up Doc?\" in somewhere."
        <> usermsg "Why is the sky blue?"
        <> numopt "temperature" 0.8
        <> stringopt "service_tier" "default"
        )
  token <- tokenFromFile "path/to/openai/key"
  res <- openaiV1Chat token Nothing req `catches`
    [ Handler (\(e :: GenAIException) -> error . show $ e)
    , Handler (\(e :: ClientError) -> error . show $ e)
    ]
  print res                      -- The entire OpenAIResponse
  TL.putStrLn . getMessage $ res -- Just the response Message Content
```
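The handlers above turn failures into `error` calls, which abort the program. A gentler pattern is to catch the exception with `Control.Exception.try` and branch on the result. A minimal sketch, assuming the same `openaiV1Chat` API shown above (`openaiSafe` is a hypothetical name, not part of the library):

```haskell
import Control.Exception (try)
import Data.Text.Lazy.IO qualified as TL
import GenAILib (mkRequest, usermsg)
import GenAILib.HTTP (GenAIException, openaiV1Chat, tokenFromFile)
import GenAILib.OpenAI (OpenAIRequest, getMessage)

-- Hypothetical helper: like openaiData above, but recovers from API
-- failures instead of crashing the program
openaiSafe :: IO ()
openaiSafe = do
  let req :: OpenAIRequest = mkRequest "gpt-3.5-turbo" $ usermsg "Why is the sky blue?"
  token <- tokenFromFile "path/to/openai/key"
  -- try returns Left if a GenAIException is thrown, Right on success
  result <- try $ openaiV1Chat token Nothing req
  case result of
    Left (e :: GenAIException) -> putStrLn $ "Request failed: " <> show e
    Right res                  -> TL.putStrLn . getMessage $ res
```

Note that a `ClientError` would still propagate out of this `try`; to cover both exception types, use `catches` with two handlers as in `openaiData`.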
## Getting source

Source code is available from Codeberg at the genai-lib project page.
## Contact

Dino Morelli <[email protected]>