OpenAI Function Calling

Liam Stone

Hello, fellow coders! If you've been exploring the world of AI and chatbots, you've likely heard about OpenAI's amazing language model, GPT-4, and its counterpart GPT-3.5 Turbo. They're powerful tools for transforming the way we interact with technology.

In this post, we're diving into one of their fascinating features: function calling. We'll demystify what it is, why it's useful, and how to use it, even if you're a beginner. So grab a cup of coffee, sit back, and let's get started!

What is Function Calling?

Function calling in the context of GPT-4 and GPT-3.5 Turbo is the ability for these models to understand and generate JSON objects for specific function calls based on user queries. This doesn't mean the model is executing the function, but it's providing you with the necessary information to call the function in your own code.
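To make this concrete, here is the rough shape of the assistant message the API returns when the model decides a function should be called. This is a minimal sketch with made-up values; note that `arguments` arrives as a JSON string, not a parsed object:

```python
import json

# Illustrative assistant message containing a function call (values are made up).
response_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"location": "Boston, MA", "unit": "fahrenheit"}',
    },
}

# "arguments" is a JSON *string*, so parse it before using the values.
args = json.loads(response_message["function_call"]["arguments"])
print(args["location"])  # Boston, MA
```

Your code is responsible for validating and executing the call; the model only proposes it.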

Why is Function Calling Useful?

This feature opens up a world of possibilities. You can:

  • Create chatbots that answer questions by calling external APIs (like a weather API, for instance).
  • Convert natural language into API calls (imagine turning "Who are my top customers?" into an actual API call).
  • Extract structured data from a block of text.

And that's just scratching the surface!
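As a sketch of the data-extraction use case, a function schema can coax the model into returning structured fields instead of prose. The function name and fields below are hypothetical, purely for illustration:

```python
# Hypothetical schema for extracting contact details from free text.
# Passed in the `functions` list (and optionally forced with
# function_call={"name": "extract_contact"}), the model replies with a
# JSON object matching `parameters` rather than a conversational answer.
extract_contact = {
    "name": "extract_contact",
    "description": "Extract a person's name and email address from the text",
    "parameters": {
        "type": "object",
        "properties": {
            "name": {"type": "string", "description": "Full name"},
            "email": {"type": "string", "description": "Email address"},
        },
        "required": ["name", "email"],
    },
}
print(extract_contact["parameters"]["required"])
```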

How to Use Function Calling

Function calling involves four main steps:

  1. Call the model with the user query and a set of functions. You describe the functions you want the model to consider when analyzing the user's input.
  2. Check if the model generates a JSON object for a function call. If the model thinks a function needs to be called based on the user query, it will generate a JSON object.
  3. Parse the JSON and call your function. Take the output from the model and use it to call your function with the appropriate arguments.
  4. Call the model again with the function response. Let the model summarize the results back to the user.

Let's take a look at a Python example:

```python
import openai
import json

# A dummy function that always returns the same weather information
def get_current_weather(location, unit="fahrenheit"):
    weather_info = {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return json.dumps(weather_info)

def run_conversation():
    messages = [{"role": "user", "content": "What's the weather like in Boston?"}]
    functions = [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ]

    # Step 1: call the model with the user query and the function definitions
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages,
        functions=functions,
        function_call="auto",  # let the model decide whether to call a function
    )
    response_message = response["choices"][0]["message"]

    # Step 2: check whether the model generated a function call
    if response_message.get("function_call"):
        # Step 3: parse the JSON arguments and call the function ourselves
        available_functions = {"get_current_weather": get_current_weather}
        function_name = response_message["function_call"]["name"]
        function_to_call = available_functions[function_name]
        function_args = json.loads(response_message["function_call"]["arguments"])
        function_response = function_to_call(
            location=function_args.get("location"),
            unit=function_args.get("unit"),
        )

        # Step 4: send the function's result back so the model can summarize it
        messages.append(response_message)
        messages.append(
            {"role": "function", "name": function_name, "content": function_response}
        )
        second_response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=messages,
        )
        return second_response

    # If the model answered directly, return its first reply instead of None
    return response

print(run_conversation())
```

Note: this example uses the pre-1.0 `openai` Python SDK (`openai.ChatCompletion`); newer SDK versions replace this with `client.chat.completions.create` and a `tools` parameter.

Example courtesy of OpenAI

This script mimics a chatbot interaction with a user asking about the weather in Boston. The run_conversation function handles the conversation, using the function calling feature of GPT-3.5 Turbo.

Handling Hallucinated Outputs

Sometimes the model generates calls to functions that were never provided to it; these are known as hallucinated outputs. To mitigate this, include a system message reminding the model to use only the functions it has been given.
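For example, a system message along these lines can be prepended to the conversation (the exact wording is a suggestion, not an official recommendation):

```python
# A system message discouraging hallucinated function calls; the wording
# here is illustrative and can be adapted to your application.
messages = [
    {
        "role": "system",
        "content": "Only use the functions you have been provided with.",
    },
    {"role": "user", "content": "What's the weather like in Boston?"},
]
print(messages[0]["role"])  # system
```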

Conclusion

That's it! With this simple introduction, you are now ready to explore the world of function calling in GPT-4 and GPT-3.5 Turbo. It's a powerful tool that can help you build more advanced and interactive chatbots or data extraction methods. So don't wait - start coding and see where these amazing tools can take you!

Top comments (7)

exec

Hey, awesome post! Love how you broke down function calling and its utility.

The way these AI models can tap into the broader digital world with function calling is mind-blowing. The proficiency of the gpt-4-0613 model at effectively utilizing function calls never ceases to amaze me. For instance, I've developed a bot that can do cool stuff on GitHub - creating repos, tweaking code, all directed by natural language, using function calls. It's wild to see what's possible!

Anyway, just wanted to say thanks for shedding light on this. I'm hoping lots of people are developing around function calls right now, the possibilities seem to be limitless. Can't wait to see where we go next!

Liam Stone

Hey Dylan, thanks for commenting. This functionality opens up so much potential with LLMs. Happy coding!

Petr Brzek

Hey, if anyone wants to play around with OpenAI function calls in UI playground, I created one - LangTale. Here's an example of the weather function. langtale.ai/playground/p/duxgbEYjnW

Liam Stone

Thanks for sharing this!

charliemday

This is great, have you had any luck limiting the number of items in a JSON array using min/max item keys?

Liam Stone

I haven't tried that just yet but will take a look!

Hide Shidara

This is also a really good guide on how to do this (for the super nerds :))

marcotm.com/articles/information-e...