Rocky LIU Yan

Introducing F# with Semantic Kernel: Simplifying AI App Development with the Pipeline Pattern

This blog is based on the C# version: https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/GettingStarted/Step8_Pipelining.cs

The full F# code is available at https://github.com/rocklau/fsharpPlayground


As developers, we are continuously looking for tools and frameworks that can help us streamline our workflows and build efficient applications. In the realm of AI app development, Microsoft's Semantic Kernel (SK) has emerged as a powerful framework to simplify the integration of AI capabilities into applications. Coupling this with the functional programming prowess of F#, we can create robust, maintainable, and scalable AI solutions. Today, we'll explore how to harness the pipeline pattern in F# with SK to develop AI apps more easily.

Why F# and Semantic Kernel?

F# is a functional-first programming language that empowers developers to write concise, expressive, and high-performance code. It shines in scenarios requiring complex data manipulations, asynchronous programming, and domain-specific languages (DSLs).

Semantic Kernel, on the other hand, is an open-source SDK from Microsoft designed to make it easy to embed AI capabilities in applications. It abstracts much of the complexity of integrating AI models and provides tools, such as plugins and automatic function calling, for building intelligent systems.

Combining F# with Semantic Kernel allows us to leverage the pipeline pattern, a well-known functional programming construct, to process data through a sequence of operations seamlessly. Let's delve into the practical implementation of this approach.
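Before diving into the full sample, here is a minimal, self-contained sketch of the pipeline pattern using FSharp.Core's built-in Result type. The names below are illustrative and not part of the Semantic Kernel sample; the real implementation later defines its own Success/Failure type.

// bind runs the next step only if the previous one succeeded
let bind step input =
    match input with
    | Ok value -> step value
    | Error e -> Error e

let plusOne x = Ok (x + 1)
let halve x = if x % 2 = 0 then Ok (x / 2) else Error "odd number"

let good = Ok 3 |> bind plusOne |> bind halve   // Ok 2
let bad  = Ok 2 |> bind plusOne |> bind halve   // Error "odd number"

Each step takes a plain value and returns a Result; bind threads the value through and short-circuits on the first Error. The rest of the article builds the same shape around JSON contexts and Semantic Kernel functions.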

Implementing the Pipeline Pattern

Here's a step-by-step guide on how to implement a simple pipeline pattern using F# and Semantic Kernel.

1. Define Custom Types and Functions

First, we set up our custom types and essential functions. We define a JsonValue type to represent different JSON structures and a JsonSchema type for schema validation.

type JsonValue =
    | JsonObject of Map<string, JsonValue>
    | JsonArray of JsonValue list
    | JsonString of string
    | JsonNumber of float
    | JsonBool of bool
    | JsonNull

type JsonSchema =
    | SchemaObject of Map<string, JsonSchema>
    | SchemaArray of JsonSchema
    | SchemaString
    | SchemaNumber
    | SchemaBool
    | SchemaNull
    | SchemaRequired of JsonSchema
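To make the shape of these types concrete, here is an illustrative document and schema built by hand (not part of the sample):

// { "name": "Ada", "scores": [1, 2] } expressed as a JsonValue
let sampleDoc =
    JsonObject (Map.ofList [
        "name", JsonString "Ada"
        "scores", JsonArray [ JsonNumber 1.0; JsonNumber 2.0 ] ])

// A schema requiring "name" and describing "scores" as an array of numbers
let sampleSchema =
    SchemaObject (Map.ofList [
        "name", SchemaRequired SchemaString
        "scores", SchemaArray SchemaNumber ])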

Next, we define a context type and a result type to handle the outcomes of our operations:

open System.Text.Json   // JsonElement, JsonDocument, JsonValueKind

type Context = { Input: JsonElement }

type Result<'T> =
    | Success of 'T
    | Failure of string
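For readers who prefer explicit chaining, here is an illustrative bind for this Result type (it is not used later; the pipeline in step 4 achieves the same effect with Array.fold):

// Run f only if the previous operation succeeded; otherwise propagate the failure
let bindResult (f: 'T -> Result<'U>) (r: Result<'T>) : Result<'U> =
    match r with
    | Success x -> f x
    | Failure msg -> Failure msg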

2. JSON Value Conversion and Validation

We then create functions to convert JSON elements to our JsonValue types and validate JSON against schemas:

let rec toJsonValue (jsonElement: JsonElement): JsonValue =
    match jsonElement.ValueKind with
    | JsonValueKind.Object ->
        jsonElement.EnumerateObject()
        |> Seq.map (fun prop -> prop.Name, toJsonValue prop.Value)
        |> Map.ofSeq
        |> JsonObject
    | JsonValueKind.Array ->
        jsonElement.EnumerateArray()
        |> Seq.map toJsonValue
        |> List.ofSeq
        |> JsonArray
    | JsonValueKind.String -> JsonString (jsonElement.GetString())
    | JsonValueKind.Number -> JsonNumber (jsonElement.GetDouble())
    | JsonValueKind.True -> JsonBool true
    | JsonValueKind.False -> JsonBool false
    | JsonValueKind.Null -> JsonNull
    | _ -> failwith "Unsupported JSON value kind"

let rec validateJson (schema: JsonSchema) (json: JsonValue): bool =
    match schema, json with
    | SchemaObject properties, JsonObject obj ->
        properties |> Map.forall (fun key valueSchema ->
            match obj.TryFind key with
            | Some value -> validateJson valueSchema value
            | None ->
                // a missing key is only acceptable if the schema does not mark it as required
                match valueSchema with
                | SchemaRequired _ -> false
                | _ -> true)
    | SchemaArray itemSchema, JsonArray items ->
        items |> List.forall (validateJson itemSchema)
    | SchemaString, JsonString _
    | SchemaNumber, JsonNumber _
    | SchemaBool, JsonBool _
    | SchemaNull, JsonNull -> true
    | SchemaRequired innerSchema, value -> validateJson innerSchema value
    | _ -> false
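A quick check of the validator with illustrative values:

// { "value": 3 } satisfies a schema that requires a numeric "value"; an empty object does not
let demoSchema = SchemaObject (Map.ofList [ "value", SchemaRequired SchemaNumber ])
let valid = validateJson demoSchema (JsonObject (Map.ofList [ "value", JsonNumber 3.0 ]))   // true
let missing = validateJson demoSchema (JsonObject Map.empty)                                // false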

3. Creating Contexts and JSON Operations

We create functions to build contexts and perform JSON operations like addition, multiplication, and squaring:

let createContextWithNewValue newValue =
    let json = JsonDocument.Parse($"{{ \"value\": {newValue} }}").RootElement
    { Input = json }

let jsonOperation opFunc (context: Context) : Result<Context> =
    try
        let value = context.Input.GetProperty("value").GetInt32()
        let newValue = opFunc value
        Success (createContextWithNewValue newValue)
    with
    | ex -> Failure ex.Message

let addOne = jsonOperation ((+) 1)
let multiplyByTwo = jsonOperation ((*) 2)
let square = jsonOperation (fun x -> x * x)
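Each operation takes a Context and returns a new one wrapped in our Result type. For example, applied to a context holding { "value": 3 }:

// Illustrative: running a single step
match addOne (createContextWithNewValue 3) with
| Success ctx -> printfn "new value: %d" (ctx.Input.GetProperty("value").GetInt32())   // new value: 4
| Failure msg -> printfn "operation failed: %s" msg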

4. Mapping Functions and Defining the Pipeline

We map function names to actual functions, add a small validateSchema helper that bridges validateJson to our Result type, and define the pipeline function, which validates the input and then folds the context through the named functions. Because the fold carries any Failure forward, the remaining functions are skipped after the first error:

let functionMap =
    [ "addOne", addOne
      "multiplyByTwo", multiplyByTwo
      "square", square ]
    |> Map.ofList

let toFunction name : Context -> Result<Context> =
    match Map.tryFind name functionMap with
    | Some func -> func
    | None -> fun _ -> Failure (sprintf "Function %s not found" name)

let validateSchema schema (context: Context) : Result<Context> =
    if validateJson schema (toJsonValue context.Input) then Success context
    else Failure "Input does not match the expected schema"

let pipeline schema context functions =
    match validateSchema schema context with
    | Failure msg -> Failure msg
    | Success validatedContext ->
        functions
        |> Array.fold (fun acc funcName ->
            match acc with
            | Failure msg -> Failure msg
            | Success ctx -> toFunction funcName ctx) (Success validatedContext)

5. Implementing and Testing the Pipeline

Finally, we implement and test our pipeline:

let schemaJson = SchemaObject (Map.ofList [ ("value", SchemaRequired SchemaNumber) ])
let initialContext = createContextWithNewValue 3
let functionNames = [| "addOne"; "multiplyByTwo"; "square" |]
let resultContext = pipeline schemaJson initialContext functionNames

match resultContext with
| Success ctx -> printfn "Result: %s" (ctx.Input.ToString())
| Failure msg -> printfn "Error: %s" msg
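With the initial value 3, the steps run left to right: addOne gives 4, multiplyByTwo gives 8, and square gives 64, so the program should print Result: { "value": 64 }.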

Integrating with Semantic Kernel

To make our AI app more realistic, we expose the pipeline to Semantic Kernel as a plugin and let an OpenAI model invoke it through automatic function calling (the snippet assumes the Microsoft.SemanticKernel NuGet package, which includes the OpenAI connector):

open System
open System.ComponentModel
open Microsoft.SemanticKernel
open Microsoft.SemanticKernel.Connectors.OpenAI

type Calculate() =
    [<KernelFunction>]
    [<Description("Runs a pipeline of numeric operations over the input")>]
    member this.Pipeline([<Description("input number")>] input: int,
                         [<Description("choose from addOne multiplyByTwo square")>] functionNames: string[]) =
        let schemaJson = SchemaObject (Map.ofList [ ("value", SchemaRequired SchemaNumber) ])
        let initialContext = createContextWithNewValue input
        let resultContext = pipeline schemaJson initialContext functionNames
        match resultContext with
        | Success ctx -> sprintf "Result: %s" (ctx.Input.ToString())
        | Failure msg -> sprintf "Error: %s" msg

let key = "sk-"                                             // replace with your OpenAI API key
let model = "gpt-4o-mini"
let chat_url = "https://api.openai.com/v1/chat/completions"
let builder = Kernel.CreateBuilder().AddOpenAIChatCompletion(modelId = model, endpoint = Uri(chat_url), apiKey = key)
let plugins = builder.Plugins.AddFromType<Calculate>()
let kernel = builder.Build()

// asyncEx is an async computation expression that can bind Tasks directly (e.g. from the IcedTasks package)
asyncEx {
    let settings = OpenAIPromptExecutionSettings()
    settings.ToolCallBehavior <- ToolCallBehavior.AutoInvokeKernelFunctions
    let arguments = KernelArguments(settings)
    printfn "Plugins count: %d" kernel.Plugins.Count
    let! response = kernel.InvokePromptAsync("Calculate: input 3 to Pipeline [addOne addOne]", arguments)
    Console.WriteLine(response)
} |> Async.RunSynchronously |> ignore
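Given this prompt, the model should call the Pipeline function with input 3 and functionNames [addOne; addOne], so the response should report a value of 5 (assuming the model resolves the tool call as intended).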

Conclusion

By combining the power of F# with the versatility of Semantic Kernel, we can efficiently build AI applications around the pipeline pattern. This combination not only simplifies the development process but also keeps our applications maintainable, scalable, and robust. Whether you're new to F# or to Semantic Kernel, diving into this pattern can significantly enhance your ability to build sophisticated AI solutions with minimal complexity. Happy coding!
