OpenAI API Functions & Embeddings Course (2/7): Function Calls with Parameters


💡 Full Course with Videos and Course Certificate (PDF):

OpenAI API Mastery 2/7: Function Calls with Parameters for AI Coders (Beginners)

Course Overview

Welcome to the second part of the tutorial, where we’ll look at the shortcomings of ChatGPT and use our functions with parameters to overcome them.

First, create a new Python file in your base directory and copy/paste the following basic code into it:

from decouple import config
import openai

openai.api_key = config("CHATGPT_API_KEY")

def ask_chat_gpt(query):
    result = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # or whichever chat model you prefer
        messages=[
            {"role": "user", "content": query},
        ],
    )
    return result["choices"][0]["message"]["content"]

This is just a simple ChatGPT call we’re familiar with. If we ask it something like “What is Python?”, we’ll get a pretty good answer. The problem is that it doesn’t have access to any recent information.

Add the following to your file:

print(ask_chat_gpt("What is the weather in Seoul?"))

And now try running it and see what comes back.

I'm sorry, but as an AI language model, I don't have real-time data. However, you can easily check the current weather in Seoul by searching online or using a weather app on your smartphone.

Well, that’s no good, is it? Let’s go ahead and fix that.

Adding a Weather API

First, sign up for a free account with the weather API service. They will give you the pro tier free for 14 days, but it automatically switches back to the free tier afterward, and you don’t have to provide any payment or credit card information, so don’t worry about it.

👩‍💻 Recommended: How I Built a Weather App with Python Streamlit

Make sure you get your API key and add it to the '.env' file we made to store our ChatGPT API key. Again, make sure not to use any spaces.

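Your '.env' file should now look something like this (both key values below are placeholders; use your own keys):

```
CHATGPT_API_KEY=your_openai_api_key_here
WEATHER_API_KEY=your_weather_api_key_here
```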

We’ll need to create a function that can provide ChatGPT with the current weather. In the apis folder, create a new file for it:

> apis

In this file, we’ll first add our imports

from decouple import config
from json import dumps
import requests

We use config as always to read our API key (the weather API key this time) from the .env file. We will use the requests library to make the API call and the json library to ‘dump to string’ the response we get from the API, since ChatGPT cannot work with objects and the like; it only accepts strings.
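As a quick illustration of the ‘dump to string’ step (using made-up weather data here, not a real API response):

```python
from json import dumps

# ChatGPT only accepts strings, so a parsed API response (a dict here)
# must be converted to a string before we hand it back to ChatGPT.
weather_data = {"location": {"name": "Amsterdam"}, "current": {"temp_c": 15.0}}
weather_string = dumps(weather_data)

print(type(weather_string).__name__)  # str
print(weather_string)
```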

Get the Current Weather

Let’s define our get_current_weather function for ChatGPT to use.

def get_current_weather(location) -> str:
    if not location:
        return (
            "Please provide a location and call the get_current_weather_function again."
        )
    API_params = {
        "key": config("WEATHER_API_KEY"),
        "q": location,
        "aqi": "no",
        "alerts": "no",
    }
    response: requests.models.Response = requests.get(
        "", params=API_params  # the weather API's current-weather endpoint URL goes here
    )
    str_response: str = dumps(response.json())
    return str_response

We declare a function named get_current_weather which takes a location as an argument and returns a string type (again, the -> str part is optional for this tutorial, but it’s good practice to declare the return type of your functions).

First, we test whether a location was given; if not, we return an error message in string format. Then we create a simple object called API_params which contains the parameters we need to send to the API. The 'key' is read from our .env file using the config method, 'q' takes the location from the location argument, and we set the air quality index (AQI) and alerts to 'no'.

We declare a variable named response which will hold the response we get from the API.

We use the requests library to make a get request to the API, passing the API_params object as the parameters. We then use the json() method to get the response in JSON format and use the dumps method to convert it to a string, finally returning the string.

The response: requests.models.Response = requests.get(...) part might look confusing if you’re not familiar with it, but it’s just a way to say that the response variable is of type requests.models.Response.

We could declare a variable in the same manner by saying my_variable: str = 'hello world' to declare a variable named my_variable of type string with the value 'hello world'.

You don’t have to provide these type hints but they can be helpful to keep track of what kind of object you are working with. If we look at the documentation for a requests.models.Response type object we will see it has a .json() method which parses the JSON for us, which is why we were able to call the response.json() in the code above.
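A tiny standalone example of such annotations (the function here is made up purely for illustration):

```python
# Type hints are optional annotations; Python does not enforce them at runtime,
# but they document what each name is expected to hold.
def shout(text: str) -> str:
    return text.upper() + "!"

greeting: str = shout("hello")
print(greeting)  # HELLO!
```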

Add the following to the bottom of your file to test it out:

print(get_current_weather("Amsterdam"))
Now run your Python file to test it and you should get something like the following:

{
    "location": {
        "name": "Amsterdam",
        "region": "North Holland",
        "country": "Netherlands",
        "lat": 52.37,
        "lon": 4.89,
        "tz_id": "Europe/Amsterdam",
        "localtime_epoch": 1690961507,
        "localtime": "2023-08-02 9:31"
    },
    "current": {
        "last_updated_epoch": 1690960500,
        "last_updated": "2023-08-02 09:15",
        "temp_c": 15.0,
        "temp_f": 59.0,
        "is_day": 1,
        "condition": {
            "text": "Moderate rain",
            "icon": "//",
            "code": 1189
        },
        "wind_mph": 13.6,
        "wind_kph": 22.0,
        "wind_degree": 130,
        "wind_dir": "SE",
        "pressure_mb": 992.0,
        "pressure_in": 29.29,
        "precip_mm": 0.0,
        "precip_in": 0.0,
        "humidity": 100,
        "cloud": 75,
        "feelslike_c": 13.6,
        "feelslike_f": 56.5,
        "vis_km": 6.0,
        "vis_miles": 3.0,
        "uv": 4.0,
        "gust_mph": 19.2,
        "gust_kph": 31.0
    }
}
Looks good! Make sure to remove the print statement again, then go ahead and save and close the file; we’re done with it for now.

This is the actual function we will be running, but as we learned in the previous tutorial part we also need a description of what our function is and what it does to pass into ChatGPT with our request.

Function Description

To keep our code a little bit organized let’s create a new folder in our base directory named func_descriptions, then create a new file in it called

> apis
> func_descriptions
> utils

Now open the file in your func_descriptions folder and let’s write a function description to feed to ChatGPT:

describe_get_current_weather = {
    "name": "get_current_weather_in_location",
    "description": "This function provides the current weather in a specific location.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The location as a city name, e.g. Amsterdam.",
            },
        },
        "required": ["location"],
    },
}
I named the Python object describe_get_current_weather and gave the function itself the name get_current_weather_in_location. Remember, this name is just for ChatGPT and doesn’t necessarily have to match the name of the real function.

We have a simple description of what the function does, which is to provide the current weather in a specific location.

The parameters we need for our function will be of type ‘object’ and have only a single property, namely ‘location’, which will be of type string. We then provide a description for the location property. This description is for ChatGPT to understand what this property is, as ChatGPT is going to generate these properties/parameters for us when calling the function.

Finally, we set the ‘location’ property as required, telling ChatGPT it must provide a location in order to call this function, as we need it to be able to provide the weather for a specific location. Note we can add multiple required properties to the list. Go ahead and save and close the file in your func_descriptions folder.
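For instance, a hypothetical forecast function (not part of this tutorial's code, shown only to illustrate multiple required properties) could be described like this:

```python
# Hypothetical function description: both 'location' and 'days' are required,
# so ChatGPT must supply values for both before calling the function.
describe_get_forecast = {
    "name": "get_forecast_in_location",
    "description": "Provides the weather forecast for a location, a number of days ahead.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The location as a city name, e.g. Amsterdam.",
            },
            "days": {
                "type": "integer",
                "description": "How many days ahead to forecast.",
            },
        },
        "required": ["location", "days"],
    },
}
```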

Upgrading ChatGPT

Now that we have all our setup done let’s make our super-powered ChatGPT.

Create a new file in your base directory for our upgraded ChatGPT function.

(Or feel free to choose a more sensible name, but as this is a tutorial series I’ll stick with this naming as it keeps the files in order alphabetically.)

In this new file let’s first add our imports:

import json
import openai

from decouple import config

from import get_current_weather
from import describe_get_current_weather
from utils.printer import ColorPrinter as Printer

openai.api_key = config("CHATGPT_API_KEY")

We will use json to parse arguments that ChatGPT provides for us. The next two should be familiar by now: openai to call ChatGPT and decouple to read the API key from our .env file.

Then we import the get_current_weather function and its description we just prepared from their respective folders and our color printer utility we made in the previous tutorial part.

We also set our api_key in the openai module so it’s all set up for our coming calls.

Now let’s define a new function:

def ask_chat_gpt(query):
    messages = [
        {"role": "user", "content": query},
    ]

    functions = [describe_get_current_weather]

    first_response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # or whichever chat model you prefer
        messages=messages,
        functions=functions,
        function_call="auto",  # auto is default
    )["choices"][0]["message"]
    messages.append(first_response)

Our function takes an argument named query, which represents whatever the user typed on our fictional website. We first define the messages list we will send to ChatGPT, which will consist of a single object containing the user’s message. We do not provide any system or setup message before the user query for now.

We define a variable called functions, and even though we only have a single function for now, this must be a list as the ChatGPT API expects a list of functions. We then make our initial ChatGPT call as we did many times before, passing in the model, messages, functions, and function_call="auto" parameters.

Remember function_call="auto" will allow ChatGPT to choose whether or not it wants to call the function we provide.
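For reference, the ChatCompletion endpoint accepts three forms for this parameter (the forced-call form below uses our function's name as an example):

```python
# The three accepted forms of the function_call parameter:
function_call_auto = "auto"    # ChatGPT decides whether to call a function
function_call_none = "none"    # ChatGPT must answer without calling a function
function_call_forced = {"name": "get_current_weather_in_location"}  # force this specific call
```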

We directly index into the object to get ["choices"][0]["message"] and catch this in the variable named first_response. Then we access the messages list we declared at the start of the function and append the first response we got back from ChatGPT to it.

Now add the following code while still inside your function declaration:

    if first_response.get("function_call"):
        available_functions = {
            "get_current_weather_in_location": get_current_weather,
        }
        function_name = first_response["function_call"]["name"]
        function_to_call = available_functions[function_name]
        function_args = json.loads(first_response["function_call"]["arguments"])
        function_response = function_to_call(**function_args)

        messages.append(
            {
                "role": "function",
                "name": "get_current_weather_in_location",
                "content": function_response,
            }
        )

        second_response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=messages,
        )["choices"][0]["message"]
        messages.append(second_response)
        Printer.color_print(messages)  # the color printer from part 1 (use the method name you defined there)
        return second_response["content"]

    Printer.color_print(messages)
    return first_response["content"]

First, we call the .get() method on the first_response we received to see if it has a ‘function_call’ attribute. If it does, we know ChatGPT wants us to call a function.

We then create a dictionary named available_functions which contains the name of the function we gave to ChatGPT as the key and the actual function as the value. We then get the name of the function ChatGPT wants to call from the first_response object and use it to get the actual function from our available_functions dictionary.

We then use the json.loads() method to parse the arguments we received from ChatGPT into a Python object and catch it in the function_args variable, after which we call the function passing in the arguments we received and catch the response in the function_response variable.
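The dispatch-and-call steps can be sketched in isolation, using a stand-in weather function and a hard-coded arguments string in place of a real ChatGPT response:

```python
import json

# Stand-in for the real weather function (the real one calls the weather API)
def get_current_weather(location):
    return f"Fake weather report for {location}"

# Name-to-function lookup, as in our ask_chat_gpt function
available_functions = {
    "get_current_weather_in_location": get_current_weather,
}

# ChatGPT returns the function name plus the arguments as a JSON string, e.g.:
function_name = "get_current_weather_in_location"
raw_arguments = '{"location": "Amsterdam"}'

function_to_call = available_functions[function_name]
function_args = json.loads(raw_arguments)  # JSON string -> Python dict
print(function_to_call(**function_args))  # Fake weather report for Amsterdam
```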

Now we must again extend our message history. We append a new object to the messages list, this time with the role of ‘function’, the name of the function we called, and the function’s return as content.

We then make a second ChatGPT call, this time without providing any functions, but adding the entire message history including the function call and response, and then catch the response in the second_response variable. We append this response to the messages list as well and then print the messages using our color printer utility.

Finally, we return the content of the second response.

If there was no function_call in the first response, all the above code is bypassed and only the final two lines are executed, which is to print the messages and return the content of the first response.

Let’s Test It Out

Go ahead and add the following print statement to the bottom of your file:

print(ask_chat_gpt("What is the most delicious fruit?"))

Now run it and you should get a normal ChatGPT answer as always, with your color printer producing output like the following (naturally, ChatGPT will not tell you what the most delicious fruit is!):

###### Conversation History ######
user : What is the most delicious fruit?
assistant : The most delicious fruit is subjective and varies from person to person. Some popularly loved fruits include strawberries, mangoes, watermelons, and bananas. It ultimately depends on personal preference and taste.

ChatGPT read the query and obviously, no weather call was required, so it just gave us a normal answer. Now let’s try asking it about the current weather in Amsterdam. Replace the previous print statement with this:

print(ask_chat_gpt("What is the weather in Amsterdam?"))

Run your file and check it out!

###### Conversation History ######
user : What is the weather in Amsterdam?
assistant : get_current_weather_in_location({
"location": "Amsterdam"
})
function : {"location": {"name": "Amsterdam", "region": "North Holland", "country": "Netherlands", "lat": 52.37, "lon": 4.89, "tz_id": "Europe/Amsterdam", "localtime_epoch": 1690969509, "localtime": "2023-08-02 11:45"}, "current": {"last_updated_epoch": 1690968600, "last_updated": "2023-08-02 11:30", "temp_c": 17.0, "temp_f": 62.6, "is_day": 1, "condition": {"text": "Light rain", "icon": "//", "code": 1183}, "wind_mph": 15.0, "wind_kph": 24.1, "wind_degree": 170, "wind_dir": "S", "pressure_mb": 989.0, "pressure_in": 29.21, "precip_mm": 0.0, "precip_in": 0.0, "humidity": 94, "cloud": 50, "feelslike_c": 17.0, "feelslike_f": 62.6, "vis_km": 10.0, "vis_miles": 6.0, "uv": 4.0, "gust_mph": 21.0, "gust_kph": 33.8}}
assistant : The current weather in Amsterdam is light rain with a temperature of 17Β°C
(62.6Β°F). The wind is coming from the south at 24.1 km/h (15 mph), and the humidity is 94%. The visibility is 10.0 km (6.0 miles), and there is a 50% cloud cover.

We can see the user query first and ChatGPT then deciding to call a function, passing in the location argument of “Amsterdam”. We then see the function response, which is the weather in Amsterdam, and finally, the ChatGPT response which is a nicely formatted weather report.

We’ve just made ChatGPT even more powerful! Pretty awesome, right?

That’s it for part 2, I’ll see you soon in the next part, where we’ll look at having multiple potential functions and even making multiple consecutive function calls in a row.
