Description

An implementation of LangChain in Haskell.

Please see README.md for examples and usage.

TypeChain

An attempt to recreate LangChain in Haskell.

This is currently more a proof-of-concept than an actual functioning library.

Basic Model Prediction

Currently, GPT-3.5 Turbo is the only model that has been implemented. Below is an example of a simple program that asks the model what 1 + 1 is.

module Main where

import DotEnv

import TypeChain.ChatModels.Types
import TypeChain.ChatModels.OpenAI

-- evalStateT runs the chain with the model threaded through as state
import Control.Monad.State (evalStateT)

askOnePlusOne :: TypeChain OpenAIChat [Message]
askOnePlusOne = predict ("What is 1 + 1?" :: String)

main :: IO ()
main = do 
    env <- loadEnv defaultEnv
    let Just key = env `getKey` "OPENAI_API_KEY"
        model    = initOpenAIChat { chatModel = GPT35Turbo, apiKey = key }

    response <- evalStateT askOnePlusOne model

    mapM_ print response

This provides the output:

assistant: 1 + 1 equals 2.
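
Note that evalStateT returns the chain's result (the predicted messages) and discards the updated model. If you instead want the model back, with its accumulated conversation history, execStateT (also from Control.Monad.State) can be used. Here is a small sketch, assuming, as the multi-model example below does, that messages returns a model's conversation log:

printConversation :: OpenAIChat -> IO ()
printConversation model = do 
    -- Run the chain, keeping the final model state instead of the result
    model' <- execStateT askOnePlusOne model
    -- Print the whole conversation (question and answer) stored in the model
    mapM_ print (messages model')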

Model Prediction with Context

Let's say we want to ask our model what 1 + 1 is after setting a rule that 1 + 1 is 3. We can do this by passing in an initial system message when we create the model. Here is an example:

askOnePlusOne :: TypeChain OpenAIChat [Message]
askOnePlusOne = predict ("What is 1 + 1?" :: String)

main :: IO ()
main = do 
    env <- loadEnv defaultEnv
    let Just key = env `getKey` "OPENAI_API_KEY"
        -- Pass the rule in as an initial system message so it is part of the 
        -- conversation before any prediction is made
        model    = initOpenAIChat { chatModel = GPT35Turbo
                                  , apiKey    = key
                                  , messages  = [Message System "New rule: 1 + 1 = 3."]
                                  }

    response <- evalStateT askOnePlusOne model

    mapM_ print response

This provides the output:

assistant: According to the new rule, 1 + 1 equals 3.
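
Because the conversation lives in the model state, several predict calls can also be chained in a single computation, with later calls seeing the earlier exchange. Below is a minimal sketch; the prompts are only illustrative:

-- Sketch: two sequential predictions in one chain. The second question is 
-- answered in the context of the first exchange, since the conversation is 
-- carried in the model state.
askAndFollowUp :: TypeChain OpenAIChat [Message]
askAndFollowUp = do 
    _ <- predict ("What is 1 + 1?" :: String)
    predict ("Now add 1 to that result." :: String)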

Model Prediction with Multiple Models

Let's say we want some sort of interaction between models. Instead of using a single model as the TypeChain state, we can use any datatype, as long as we have lenses to access the individual models. In this example, we use a tuple with the _1 and _2 lenses to refer to the two models (a sketch with a custom record instead of a tuple follows the example's output below).


import Control.Lens

import DotEnv

import TypeChain.ChatModels.Types
import TypeChain.ChatModels.OpenAI

import Control.Monad.State (execStateT)

-- Helper function to turn assistant messages into user messages 
-- We do this so we don't confuse the model and make it think it's talking 
-- to itself
toUserMessage :: Message -> Message 
toUserMessage msg = msg { _role = User }

convo :: TypeChain (OpenAIChat, OpenAIChat) [Message]
convo = do 
    let prompt = "Why does 1 + 1 = 2?"

    -- Add appropriate context to model 2 so it thinks it asked this 
    -- question.
    _2 `memorizes` [Message Assistant prompt]

    -- Ask model 1 the same question to start a conversation between 
    -- the two models
    response1 <- _1 `predicts` prompt

    -- Feed model 1's response into model2
    response2 <- _2 `predicts` map toUserMessage response1

    return response2

main :: IO ()
main = do 
    env <- loadEnv defaultEnv 
    let Just key = env `getKey` "OPENAI_API_KEY" 
        model1   = initOpenAIChat { chatModel = GPT35Turbo, apiKey = key }
        model2   = initOpenAIChat { chatModel = GPT35Turbo, apiKey = key }

    -- Keep the final state of the second model and print its entire 
    -- conversation log
    (_, model2') <- execStateT convo (model1, model2)

    mapM_ print (messages model2')

This produces the output:

assistant: Why does 1 + 1 = 2?

user: The equation 1 + 1 = 2 is a fundamental concept in mathematics and follows the principles of arithmetic. In the base-10 number system, 1 represents a single unit, and when another single unit (1) is added to it, we get a total of two units. This is the basic principle of addition, where combining two quantities or numbers gives us the sum or total of those quantities. Consequently, 1 + 1 equals 2.

assistant: The equation 1 + 1 = 2 is derived from the principles of mathematical logic and the base-10 number system. In this system, we assign the numeral 1 to represent a single unit or quantity. When we add two quantities or units of 1 together, we combine them to get a total of two units.

This concept of addition is a fundamental principle in mathematics, and it follows the properties and rules of arithmetic. It holds true not just in the base-10 number system, but in any other number system as well. The equation 1 + 1 = 2 is a universally accepted fact in mathematics and forms the basis for further mathematical operations and calculations.
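
The tuple is only a convenience: since predicts and memorizes address a model through a lens into the state, any datatype with suitable lenses should work. Below is a sketch using a hypothetical Agents record, with the teacher and student lenses generated by Control.Lens; toUserMessage is the helper defined above:

{-# LANGUAGE TemplateHaskell #-}

-- Hypothetical state type holding two named models instead of a tuple
data Agents = Agents { _teacher :: OpenAIChat, _student :: OpenAIChat }

-- Generate the `teacher` and `student` lenses for the fields above
makeLenses ''Agents

convo' :: TypeChain Agents [Message]
convo' = do 
    let prompt = "Why does 1 + 1 = 2?"

    -- Same conversation as before, addressed through the named lenses
    student `memorizes` [Message Assistant prompt]
    response <- teacher `predicts` prompt
    student `predicts` map toUserMessage response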

Prompt Templates

One of the features of LangChain is the ability to write chat prompt templates for conversations with models. TypeChain seeks to implement this feature as closely to LangChain as possible, with type safety to prevent runtime errors.

For example, consider the following LangChain code:

template = "You are a helpful assistant that translates {from} to {to}."
human_template = "{text}"

chat_prompt = ChatPromptTemplate.from_messages([
    ("system", template)
    ("human", human_template)
])

messages = chat_prompt.format_messages(from='English', to='French', text='Hello, World!')

The same code can be implemented in TypeChain:

{-# LANGUAGE TemplateHaskell #-}

import TypeChain.ChatModels

makeTemplate "Translate" [ system "You are an assistant that translates {lang1} to {lang2}."
                         , user "{text}"
                         ]

-- Fill in the template using the generated helper
messages :: [Message]
messages = mkTranslate "English" "French" "Hello, World!"

-- Fill in the template explicitly.
messages' :: [Message]
messages' = toPrompt $ Translate { lang1 = "English", lang2 = "French", text = "Hello, World!" }

The makeTemplate function generates a record type at compile time, where each field corresponds to a placeholder in the template strings. makeTemplate also creates a ToPrompt instance, so the toPrompt function can convert a value of the record type into a list of messages. For convenience, makeTemplate additionally generates a helper function (in this case mkTranslate) that takes all of the prompt parameters as arguments and returns the resulting list of messages.

So for the above example, the makeTemplate function would expand to the following code:

data Translate = Translate { lang1 :: String, lang2 :: String, text :: String }

instance ToPrompt Translate where 
    toPrompt template = [ Message System $ "You are an assistant that translates " ++ lang1 template ++ " to " ++ lang2 template ++ "."
                        , Message User $ text template 
                        ]

mkTranslate :: String -> String -> String -> [Message]
mkTranslate lang1 lang2 text = toPrompt $ Translate lang1 lang2 text
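
The rendered messages can be used like any other messages, for example as a model's initial conversation in the same way as the context example above. Below is a sketch, again assuming the messages field holds the model's starting conversation:

-- Sketch: seed a model with the rendered template. Per the expansion above, 
-- mkTranslate "English" "French" "Hello, World!" yields a System message and 
-- a User message; the messages field is assumed to hold the model's initial 
-- conversation, as in the context example.
translator :: String -> OpenAIChat
translator key = initOpenAIChat { chatModel = GPT35Turbo
                                , apiKey    = key
                                , messages  = mkTranslate "English" "French" "Hello, World!"
                                }
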
Metadata

Version

0.2.0.0

Platforms (77)

  • aarch64-darwin
  • aarch64-freebsd
  • aarch64-genode
  • aarch64-linux
  • aarch64-netbsd
  • aarch64-none
  • aarch64-windows
  • aarch64_be-none
  • arm-none
  • armv5tel-linux
  • armv6l-linux
  • armv6l-netbsd
  • armv6l-none
  • armv7a-darwin
  • armv7a-linux
  • armv7a-netbsd
  • armv7l-linux
  • armv7l-netbsd
  • avr-none
  • i686-cygwin
  • i686-darwin
  • i686-freebsd
  • i686-genode
  • i686-linux
  • i686-netbsd
  • i686-none
  • i686-openbsd
  • i686-windows
  • javascript-ghcjs
  • loongarch64-linux
  • m68k-linux
  • m68k-netbsd
  • m68k-none
  • microblaze-linux
  • microblaze-none
  • microblazeel-linux
  • microblazeel-none
  • mips-linux
  • mips-none
  • mips64-linux
  • mips64-none
  • mips64el-linux
  • mipsel-linux
  • mipsel-netbsd
  • mmix-mmixware
  • msp430-none
  • or1k-none
  • powerpc-netbsd
  • powerpc-none
  • powerpc64-linux
  • powerpc64le-linux
  • powerpcle-none
  • riscv32-linux
  • riscv32-netbsd
  • riscv32-none
  • riscv64-linux
  • riscv64-netbsd
  • riscv64-none
  • rx-none
  • s390-linux
  • s390-none
  • s390x-linux
  • s390x-none
  • vc4-none
  • wasm32-wasi
  • wasm64-wasi
  • x86_64-cygwin
  • x86_64-darwin
  • x86_64-freebsd
  • x86_64-genode
  • x86_64-linux
  • x86_64-netbsd
  • x86_64-none
  • x86_64-openbsd
  • x86_64-redox
  • x86_64-solaris
  • x86_64-windows