OpenAI API Functions & Embeddings Course (3/7): Multiple Functions and Multiple Calls

💡 Full Course with Videos and Course Certificate (PDF): https://academy.finxter.com/university/openai-api-function-calls-and-embeddings/

Course Overview

Welcome back to part 3, where we’ll take things one step further and look at having multiple functions and even calling multiple functions in a row.

First, let’s make a second function for ChatGPT to call. We have ChatGPT with access to the current weather right now. Let’s also give it access to the weather forecast!

Open your weather.py file in the apis folder (not the func_descriptions folder!) that has your get_current_weather function we made in tutorial part 2.

Keep all the imports and the existing function as is and just scroll down to the bottom to start the declaration of a new function below:

def get_weather_forecast(location, days=7) -> str:
    try:
        days = 1 if days < 1 else 14 if days > 14 else days
    except TypeError:
        days = 7

    params = {
        "key": config("WEATHER_API_KEY"),
        "q": location,
        "days": days,
        "aqi": "no",
        "alerts": "no",
    }

We declare a function that takes a location argument and a number of days, which defaults to 7 if not provided. The function will return a string, as this is what ChatGPT needs.

We then clamp the days value: anything below 1 becomes 1 and anything above 14 becomes 14. If the comparison raises a TypeError because days isn't a number, we just fall back to the default of 7.
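To see the clamping expression in isolation, here is the same logic pulled out into a standalone helper (clamp_days is a name made up just for this illustration):

```python
def clamp_days(days):
    # Same clamping logic as in get_weather_forecast
    try:
        return 1 if days < 1 else 14 if days > 14 else days
    except TypeError:
        # e.g. days is None or a string: comparing it to 1 raises TypeError
        return 7

print(clamp_days(20))    # 14
print(clamp_days(0))     # 1
print(clamp_days(5))     # 5
print(clamp_days(None))  # 7
```

Note that the conditional expression chains right to left: first check the lower bound, then the upper bound, otherwise pass the value through unchanged.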

Next, we declare the params as a dictionary ahead of the function call, again passing in our API key, the location, and the number of days, setting air quality index and alerts to no. After this basic setup, we can continue with the API call:

    response: requests.models.Response = requests.get(
        "http://api.weatherapi.com/v1/forecast.json", params=params
    )

    response: dict = response.json()
    filtered_response = {}
    filtered_response["location"] = response["location"]
    filtered_response["current"] = response["current"]
    filtered_response["forecast"] = [
        [day["date"], day["day"]] for day in response["forecast"]["forecastday"]
    ]
    return dumps(filtered_response)

We make a requests.get call passing in the API address and the params dictionary we just made, catching the return in the response variable, which is of type requests.models.Response.

Check the previous tutorial for the type explanation. We then convert the response to a dict type by running .json() on it.
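Under the hood, requests URL-encodes the params dictionary into a query string appended to the URL. Roughly, using only the standard library for illustration (with a subset of our actual params):

```python
from urllib.parse import urlencode

# requests does this encoding for us when we pass params=...
params = {"q": "Seoul", "days": 4}
url = "http://api.weatherapi.com/v1/forecast.json?" + urlencode(params)
print(url)
# http://api.weatherapi.com/v1/forecast.json?q=Seoul&days=4
```

This is why we can keep the base URL clean and let the params dictionary carry all the variable parts of the request.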

The response dictionary we get back is massive, about 4,000 lines, containing detailed information for every hour of every day. That much detail is unnecessary and would overwhelm ChatGPT's context, so let's filter it down to what we really need.


First, we make an empty dictionary called filtered_response. We then set the location key of this new dictionary to have the value of the location key in the response we got, and we do the same for the current key in both dictionaries. Now our filtered_response has both the location and current weather data.

Inside the response is a key named 'forecast' which has another key 'forecastday' within which is a ton of information. We want to only extract the ['forecast']['forecastday']['date'] and ['forecast']['forecastday']['day'] parts for each day in the forecast we received.

We can do this with a list comprehension, which is a fancy way of saying we can make a list of the data we want in a single line of code. We do this by looping over the response['forecast']['forecastday'] list and for each day in that list we add a list to our filtered_response['forecast'] containing the date and day data we want.
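Here is a toy version of the same comprehension with made-up forecast data; the real WeatherAPI response has many more keys per day, including the bulky hourly data we want to drop:

```python
# Made-up miniature of the API response structure
response = {
    "forecast": {
        "forecastday": [
            {"date": "2023-07-08", "day": {"maxtemp_c": 24.1}, "hour": ["...huge..."]},
            {"date": "2023-07-09", "day": {"maxtemp_c": 22.3}, "hour": ["...huge..."]},
        ]
    }
}

# Keep only the date and the daily summary for each forecast day
forecast = [[day["date"], day["day"]] for day in response["forecast"]["forecastday"]]
print(forecast)
# [['2023-07-08', {'maxtemp_c': 24.1}], ['2023-07-09', {'maxtemp_c': 22.3}]]
```

Each iteration produces a two-element list of [date, day summary]; everything else per day, like the "hour" key, simply never gets copied over.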

Now that we have an appropriately sized response to send to ChatGPT, we use the dumps (dump string) method again to convert the dictionary to a string and return it.

Our weather forecast data has now gone from about 4000 lines to about 1/20th of that, cutting away all the hourly data. This is much more manageable and the weather data we send to ChatGPT looks something like this:

{
    "location": {
        ... location data ...
    },
    "current": {
        ... current weather data ...
    },
    "forecast": [
        [
            "2023-07-08",
            {
                ... day data ...
            },
        ],
        [
            "2023-07-09",
            {
                ... day data ...
            },
        ],
        [
            "2023-07-10",
            {
                ... day data ...
            },
        ],
        [
            "2023-07-11",
            {
                ... day data ...
            },
        ],
    ],
}

If you're in a production environment, you'd want to filter the data at a much more granular level, dropping every field you can do without, since token costs add up as the number of users of your website or program grows. But for now, this will do.
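As a sketch of what that more granular filtering might look like, you could keep only a handful of fields per forecast day. The key names below (maxtemp_c, mintemp_c, condition) are assumptions about WeatherAPI's "day" object, so check an actual response before relying on them:

```python
def slim_day(day: dict) -> dict:
    # Keep only a few fields per forecast day; the key names are assumptions
    # about WeatherAPI's "day" object and may need adjusting
    keep = ("maxtemp_c", "mintemp_c", "condition")
    return {key: day[key] for key in keep if key in day}

# Hypothetical usage inside get_weather_forecast:
# filtered_response["forecast"] = [
#     [day["date"], slim_day(day["day"])]
#     for day in response["forecast"]["forecastday"]
# ]

print(slim_day({"maxtemp_c": 22.6, "mintemp_c": 13.9, "uv": 5}))
# {'maxtemp_c': 22.6, 'mintemp_c': 13.9}
```

The `if key in day` guard means the function degrades gracefully if the API ever stops sending one of the fields.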

Your full get_weather_forecast function at the bottom of the weather.py file in the apis folder now looks like this:

def get_weather_forecast(location, days=7) -> str:
    try:
        days = 1 if days < 1 else 14 if days > 14 else days
    except TypeError:
        days = 7

    params = {
        "key": config("WEATHER_API_KEY"),
        "q": location,
        "days": days,
        "aqi": "no",
        "alerts": "no",
    }

    response: requests.models.Response = requests.get(
        "http://api.weatherapi.com/v1/forecast.json", params=params
    )

    response: dict = response.json()
    filtered_response = {}
    filtered_response["location"] = response["location"]
    filtered_response["current"] = response["current"]
    filtered_response["forecast"] = [
        [day["date"], day["day"]] for day in response["forecast"]["forecastday"]
    ]
    return dumps(filtered_response)

Let’s add a temporary print statement to the bottom of the file again to test it out:

print(get_weather_forecast("Seoul", days=4))

And if you run your file now you should see a bunch of weather data for Seoul, South Korea, for the next 4 days. Make sure to remove the print statement again after testing as we do not want to run this every time we import the file.

Go ahead and close the apis > weather.py file.

Function Description

Remember we also need a function description to pass to ChatGPT to describe this new function. Open the func_descriptions > weather.py file and add the following to the bottom below the existing function description for the get_current_weather function:

describe_get_weather_forecast = {
    "name": "get_weather_forecast_in_location",
    "description": "This function provides the weather forecast in a specific location for a specified number of days.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The location as a city name, e.g. Amsterdam.",
            },
            "days": {
                "type": "integer",
                "description": "The number of days to forecast, between 1 and 14.",
            },
        },
        "required": ["location"],
    },
}

Most of this is familiar from the function descriptions we’ve made so far.

💡 Note: We simply added a new key to the properties named days of type integer with a description that it’s the number of days to forecast between 1 and 14. In the required part we’ve only set the location as required.
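Because days is not in the required list, ChatGPT may omit it entirely. In that case the arguments JSON only contains location, and the **-unpacking in our calling code lets the Python default days=7 take over. A quick sketch with a stand-in function (the real one makes an API call):

```python
import json

def get_weather_forecast(location, days=7):
    # Stand-in for the real function, just to show the default kicking in
    return f"{location} forecast for {days} days"

# Hypothetical model output that omits the optional "days" argument
arguments = '{"location": "Amsterdam"}'
result = get_weather_forecast(**json.loads(arguments))
print(result)
# Amsterdam forecast for 7 days
```

This is why marking a parameter as optional in the description and giving it a sensible Python default go hand in hand.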

With us now having both a current weather and a weather forecast function, let’s start playing around with this.

Make a new file called 'Ca_multi_function_weatherGPT.py' in your main base directory. Again, you can use more sensible names but I like the alphabetical lineup for tutorial purposes.

First, we’ll look at providing multiple functions to ChatGPT, then we’ll look at calling multiple functions in a row.

Using Multiple Functions

In your Ca_multi_function_weatherGPT.py file start with the following imports and setup:

import json
import openai
from decouple import config

from apis.weather import get_current_weather, get_weather_forecast
from func_descriptions.weather import (
    describe_get_current_weather,
    describe_get_weather_forecast,
)
from utils.printer import ColorPrinter as Printer

openai.api_key = config("CHATGPT_API_KEY")

These should all be familiar by now: the usual imports and OpenAI API key setup. We also import our two weather functions, the weather function descriptions, and the ColorPrinter utility. Keeping all these in separate files keeps our code readable and lets us reuse it.

Now let’s start our function below the imports:

def ask_chat_gpt(query):
    messages = [
        {"role": "user", "content": query},
    ]

    functions = [describe_get_current_weather, describe_get_weather_forecast]

    first_response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages,
        functions=functions,
        function_call="auto",  # auto is default
    )["choices"][0]["message"]
    messages.append(first_response)

Again, we don't provide any setup message but get straight into the user query as the first message in the message history. The functions list now has two functions. We make the GPT call and then append the first_response to our messages history list.

    if first_response.get("function_call"):
        available_functions = {
            "get_current_weather_in_location": get_current_weather,
            "get_weather_forecast_in_location": get_weather_forecast,
        }
        function_name = first_response["function_call"]["name"]
        function_to_call = available_functions[function_name]
        function_args = json.loads(first_response["function_call"]["arguments"])
        function_response = function_to_call(**function_args)

        messages.append(
            {
                "role": "function",
                "name": function_name,
                "content": function_response,
            }
        )

        second_response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=messages,
        )["choices"][0]["message"]
        messages.append(second_response)

        Printer.color_print(messages)
        return second_response["content"]

    Printer.color_print(messages)
    return first_response["content"]

We again test whether there is a function call in the first_response; if not, we skip the whole block, print the message history, and return the first message's content.

If there is a function call, we make a dictionary of available functions, with the function name ChatGPT knows as the key and the function itself as the value.

We then get the function name from the first_response, get the function to call from the available_functions dictionary, and get the function arguments from the first_response, finally calling the function with the arguments and catching the response in function_response.

We append the function’s response to the message history just like the previous part and then make our second GPT call, appending the second response to the message history. We then print the message history and return the second response’s content.

Now test it out with some print statements at the bottom of the file:

print(ask_chat_gpt("What is the most delicious fruit?"))

Will still return a normal ChatGPT answer without any function calls.

print(ask_chat_gpt("What is the weather in Amsterdam?"))

Will call the current weather function and return an appropriate response.

###### Conversation History ######
user : What is the weather in Amsterdam?
assistant : get_current_weather_in_location({
"location": "Amsterdam"
})
function : {..object with loads of weather data cut out for brevity..}
assistant : The current weather in Amsterdam is 20°C (68°F) with light rain. The wind
is blowing from the southwest at a speed of 33.1 km/h (20.6 mph), and the humidity level is 83%. The visibility is good at 10.0 km (6.0 miles), and there is a 50% cloud cover.
##################################

Let’s ask for a forecast!

print(ask_chat_gpt("What is the weather forecast in Leipzig for the coming 3 days?"))

And we get:

###### Conversation History ######

user : What is the weather forecast in Leipzig for the coming 3 days?
assistant : get_weather_forecast_in_location({
"location": "Leipzig",
"days": 3
})
function : {..object with loads of weather data cut out for brevity..}
assistant : The weather forecast for Leipzig in the coming 3 days is as follows:

- August 2: Patchy rain is possible with a maximum temperature of 22.6°C (72.7°F) and
a minimum temperature of 13.9°C (57.0°F).
- August 3: Patchy rain is possible with a maximum temperature of 22.9°C (73.2°F) and
a minimum temperature of 16.4°C (61.5°F).
- August 4: Patchy rain is possible with a maximum temperature of 22.1°C (71.8°F) and
a minimum temperature of 15.3°C (59.5°F).

Please note that weather forecasts are subject to change, so it's always a good idea to check for updates closer to the date.
##################################

Awesome, ChatGPT requests the correct function based on which question it is asked: a general question requiring no function call, a question requiring the current weather in a specific location, or a question requiring the weather forecast in a specific location for a specified number of days.

Simplifying the GPT Call

So far, that was pretty simple. Now let’s have some fun and get it to call multiple functions in a row!

First some light prep work again. Inside your apis folder create a new file called 'chat_gpt.py'.

> apis
    > chat_gpt.py

Then inside, let’s abstract away some of the repetitive code we’ve been using as it’s getting pretty tiresome:

import openai
from decouple import config

openai.api_key = config("CHATGPT_API_KEY")


def gpt_3_5_turbo_0613(messages, functions=None, function_call="auto"):
    params = {
        "model": "gpt-3.5-turbo-0613",
        "messages": messages,
    }
    if functions:
        params["functions"] = functions
        params["function_call"] = function_call

    return openai.ChatCompletion.create(**params)

We import openai and set our API key using the decouple.config module.

We then declare a function that executes a GPT API call using the gpt-3.5-turbo-0613 model, taking a messages list as argument, and optionally taking a functions list and a function_call string.

We then declare a params dictionary with the model and messages, and if there are functions or the function_call argument has been specified we add those to the params dictionary as well.

We then simply return the ChatCompletion with the unpacked parameters passed in.
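The conditional-params pattern can be verified without making an API call. The build_params function below is a hypothetical name that mirrors just the dictionary-building part of the helper:

```python
def build_params(messages, functions=None, function_call="auto"):
    # Mirrors the dictionary-building part of gpt_3_5_turbo_0613
    params = {"model": "gpt-3.5-turbo-0613", "messages": messages}
    if functions:
        params["functions"] = functions
        params["function_call"] = function_call
    return params

plain = build_params([{"role": "user", "content": "Hi"}])
# No "functions" or "function_call" key: a plain chat completion
with_funcs = build_params([{"role": "user", "content": "Hi"}], functions=[{"name": "f"}])
# Both keys present: the model may now choose to call a function
```

Only adding the functions and function_call keys when they're needed matters because the API rejects an empty functions list, so a plain chat call has to leave them out entirely.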

Prompt Setup

Go ahead and save and close this file for now.

Before we attempt to call multiple functions in a row we will use a setup/system message instead of passing the user query straight to ChatGPT as the first message.

To keep our code readable, let’s make a new folder called 'prompt_setups' and create a new file called 'weather.py' inside:

> prompt_setups
    > weather.py

Then put the following variable containing the setup inside:

current_and_forecast_setup = "You are a regular ChatGPT chatbot, just like normal, however, you also have access to some functions that can be called if you need them. One will provide the current weather and one will provide the weather forecast. YOU WILL NOT COMMUNICATE WITH THE USER UNTIL YOU FINISH ALL FUNCTION CALLS. YOU WILL NOT SEND A finish_reason OF STOP UNTIL YOU FINISH ALL FUNCTION CALLS."

These kinds of setups are a bit of a trial-and-error deal, finding the wording that gets ChatGPT to do what you want it to.

💡 Note: I found it helpful to stress not communicating with the end user until all function calls are finished and not setting a finish_reason of “stop” before completing all function calls. This kind of setup just grows by trial and error to address whatever issues you happen to run into.

Go ahead and save and exit this file too.

Now it’s time to have some fun! Go back to your base directory and create a new file called 'Cb_multi_functions_multi_calls.py'. Again, using the alphabetical names for tutorial purposes only, so it’s easier to look back through the lessons and review them for your future reference.

Calling Multiple Functions in a Row

In this new file, let’s start by adding our imports again:

import json

from utils.printer import ColorPrinter as Printer
from apis.weather import get_weather_forecast, get_current_weather
from apis.chat_gpt import gpt_3_5_turbo_0613
from prompt_setups.weather import current_and_forecast_setup
from func_descriptions.weather import (
    describe_get_current_weather,
    describe_get_weather_forecast,
)

So far, so good. These should all look very familiar as we made these 'modules' ourselves. Now let’s start our function declaration:

def ask_chat_gpt(query):
    messages = [
        {"role": "system", "content": current_and_forecast_setup},
        {"role": "user", "content": query},
    ]

    functions = [
        describe_get_current_weather,
        describe_get_weather_forecast,
    ]

    current_response = gpt_3_5_turbo_0613(messages, functions)
    current_message = current_response["choices"][0]["message"]
    messages.append(current_message)

    is_calling_function = True if current_message.get("function_call") else False

We pass in a list of messages where we use the setup we just made in the separate setup file and folder. We declare the list of functions again, passing in our function descriptions for ChatGPT to understand them.

As we’re not quite sure this time about how many calls we’ll be making, we’ll just name it current_response instead of first_response.

Notice the ChatGPT function call is much simplified using the helper function we made in the apis > chat_gpt.py file. We catch the response in current_response and then catch the first message in the response in current_message.

We append the current_message to our messages list and then set a boolean variable named is_calling_function to either True or False depending on whether the current_message has a function_call key.

Let's finish up our function, adding the following code still within the function block:

    while is_calling_function:
        available_functions = {
            "get_current_weather_in_location": get_current_weather,
            "get_weather_forecast_in_location": get_weather_forecast,
        }
        function_name = current_message["function_call"]["name"]
        function_to_call = available_functions[function_name]
        function_args = json.loads(current_message["function_call"]["arguments"])
        function_response = function_to_call(**function_args)

        messages.append(
            {
                "role": "function",
                "name": function_name,
                "content": function_response,
            }
        )

        current_response = gpt_3_5_turbo_0613(messages, functions)
        current_message = current_response["choices"][0]["message"]
        messages.append(current_message)

        is_calling_function = True if current_message.get("function_call") else False

    Printer.color_print(messages)
    return current_message["content"]

We start a while loop running as long as is_calling_function is True.

Inside the loop, we again make a dictionary of available functions, get the function name from the current_message, get the function to call from the available_functions dictionary, get the function arguments from the current_message, and finally call the function with the arguments and catch the response in function_response.

We append the function response to our messages history list, then make a new ChatGPT call, catching the response in current_response and the message in current_message, and appending the current_message to our messages history list again.

Finally, we set the is_calling_function boolean to True or False depending on whether the current_message has yet another function_call key. So if there is a second function call in a row, the while loop will simply trigger again.

Once the while loop is done, or if it was never triggered in the first place, we print the message history and return the current_message‘s content.
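To see this control flow without spending API calls, here is a toy simulation: the scripted list stands in for ChatGPT's replies (two function calls, then a final answer), and double is a made-up function; everything else follows the loop above:

```python
import json

# Scripted stand-ins for ChatGPT's replies: two function calls, then an answer
scripted = iter([
    {"function_call": {"name": "double", "arguments": '{"x": 2}'}},
    {"function_call": {"name": "double", "arguments": '{"x": 4}'}},
    {"content": "2 doubled twice is 8"},
])

available_functions = {"double": lambda x: str(x * 2)}

messages = [{"role": "user", "content": "Double 2, then double the result."}]
current_message = next(scripted)  # stands in for the first GPT call
messages.append(current_message)
is_calling_function = bool(current_message.get("function_call"))

while is_calling_function:
    call = current_message["function_call"]
    function_response = available_functions[call["name"]](**json.loads(call["arguments"]))
    messages.append({"role": "function", "name": call["name"], "content": function_response})
    current_message = next(scripted)  # stands in for the next GPT call
    messages.append(current_message)
    is_calling_function = bool(current_message.get("function_call"))

print(current_message["content"])
# 2 doubled twice is 8
```

The loop runs exactly as many times as the "model" keeps requesting functions, then exits as soon as a reply arrives without a function_call key, which is precisely the behavior we rely on in ask_chat_gpt.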

Now let’s test that everything still works:

print(ask_chat_gpt("What is the most delicious fruit?"))

Yep, still a diplomatic answer that AIs don't have personal preferences.

print(ask_chat_gpt("What is the weather in Seoul?"))

Yep, still calls the function for the current weather in Seoul.

print(ask_chat_gpt("What is the weather forecast in Hamburg for the coming 3 days?"))

Check, still a weather forecast for Hamburg for the coming three days. Looks like heavy rain tomorrow, sorry people of Hamburg!

Now let’s push it a bit!

print(ask_chat_gpt("Please give me the current weather in Seoul and the weather forecast in Amsterdam for the coming three days in a single response."))

And we get:

###### Conversation History ######
system : You are a regular ChatGPT chatbot, just like normal, however you also have access to some functions that can be called if you need them. One will provide the current weather and one will provide the weather forecast. YOU WILL NOT COMMUNICATE WITH THE USER UNTIL YOU FINISH ALL FUNCTION CALLS. YOU WILL NOT SEND A finish_reason OF STOP
UNTIL YOU FINISH ALL FUNCTION CALLS.
user : Please give me the current weather in Seoul and the weather forecast in Amsterdam for the coming three days in a single response.
assistant : get_current_weather_in_location({
"location": "Seoul"
})
function : {...object with loads of weather data cut out for brevity...}
assistant : get_weather_forecast_in_location({
"location": "Amsterdam",
"days": 3
})
function : {...object with loads of weather data cut out for brevity...}
assistant : The current weather in Seoul, South Korea is partly cloudy with a temperature of 32°C (89.6°F). The wind is coming from the west at 5.6 mph, and the humidity is 52%.

The weather forecast for Amsterdam, Netherlands for the coming three days is as follows:

- August 3: Expect moderate rain throughout the day. The maximum temperature will be 18.4°C (65.1°F) and the minimum temperature will be 15.2°C (59.4°F).

- August 4: There may be patchy rain possible. The maximum temperature will be 20.2°C
(68.4°F) and the minimum temperature will be 13.6°C (56.5°F).

- August 5: Expect moderate rain throughout the day. The maximum temperature will be 16.2°C (61.2°F) and the minimum temperature will be 12.6°C (54.7°F).

Please note that these weather conditions and temperatures are subject to change.
##################################

There you go, that’s pretty cool, isn’t it? ChatGPT called our get_current_weather function for Seoul and then called our weather_forecast function for Amsterdam, combining both answers in a single human-readable response.

This is pretty mind-blowing stuff. It actually feels like some kind of intelligence is helping us out here.

Now that we’ve gone pretty deep into extending ChatGPT’s powers using internet APIs, let’s take this up another notch and connect up a database for ChatGPT to query instead of an API to call. We’ll do this in the next tutorial, so see you there!

Check out lesson 4 to keep learning! 🚀
