OpenAI's recent post about their Chat Completions API brings exciting new possibilities for developers. With the newly introduced function calling capability, developers can leverage OpenAI's language models to call functions directly within their applications. This reduces reliance on external tools like LangChain, streamlining integration and making it more efficient and production-ready.
In this blog post, we explore a simplified approach to function integration using Python's ast and inspect modules, combined with the power of decorators.
Check out the code for this post here.
To streamline the process of incorporating functions into OpenAI's Chat Completions API, we leverage the flexibility of Python decorators. By using decorators, we can dynamically extract relevant information about functions, such as their names, descriptions, arguments, and return types. This information can then be presented to the language model for enhanced interaction.
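As a sketch of what that looks like in practice (the decorator and function names here are illustrative, not necessarily the exact ones from the Gist), a minimal decorator built on `inspect` might look like this:

```python
import inspect

def openai_function(func):
    """Hypothetical decorator: attaches a .function() method that builds
    an OpenAI-style schema from the function's signature and docstring."""
    def schema():
        sig = inspect.signature(func)
        properties = {
            name: {"type": "string"}  # OpenAI passes arguments as strings
            for name in sig.parameters
        }
        return {
            "name": func.__name__,
            "description": (func.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": list(properties),
            },
        }
    func.function = schema
    return func

@openai_function
def get_top_stories(limit: str) -> list:
    """Fetch the top stories from HackerNews."""
    return []
```

The decorated function still works normally; it simply gains a `.function()` method that describes it to the model.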
Now let's inspect the output of the decorated function's .function() method, which the decorator provides:
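For a hypothetical `get_top_stories(limit)` function, the output is a JSON-schema-style description along these lines (the field values here are illustrative):

```python
# Illustrative shape of the .function() output for a one-argument function
schema = {
    "name": "get_top_stories",
    "description": "Fetch the top stories from HackerNews.",
    "parameters": {
        "type": "object",
        "properties": {
            "limit": {"type": "string"},
        },
        "required": ["limit"],
    },
}
```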
The great part about this approach is that we can pass it directly into the OpenAI completion call:
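A sketch of that wiring (the model name and prompt are placeholders, and the live call is commented out because it needs an API key and network access):

```python
# Schema dict of the kind produced by the decorator's .function() method
schema = {
    "name": "get_top_stories",
    "description": "Fetch the top stories from HackerNews.",
    "parameters": {
        "type": "object",
        "properties": {"limit": {"type": "string"}},
        "required": ["limit"],
    },
}

request = {
    "model": "gpt-3.5-turbo-0613",  # a function-calling-capable model
    "messages": [{"role": "user", "content": "Summarize the top 5 HN stories."}],
    "functions": [schema],       # schemas extracted by the decorator
    "function_call": "auto",     # let the model decide when to call
}

# The actual call (requires a valid API key):
# import openai
# response = openai.ChatCompletion.create(**request)
```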
At the core of our approach is the FunctionWrapper class. This class utilizes the ast and inspect modules to extract essential details from the function being decorated. Through introspection, we can retrieve the function's name, description, argument names, types, and descriptions, as well as the return type. This information is organized and made accessible to the API, enabling a seamless integration experience with nearly any function.
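A simplified sketch of such a class follows; the method and field names are assumptions based on the description above, and this version leans on `inspect` and `__doc__` (the original also parses the source with `ast`). See the Gist for the full implementation:

```python
import inspect

class FunctionWrapper:
    """Sketch of the wrapper described above; not the full Gist version."""

    def __init__(self, func):
        self.func = func
        self.info = self.extract_function_info()

    def extract_function_info(self):
        sig = inspect.signature(self.func)
        properties = {
            name: {"type": "string"}  # arguments arrive from the model as strings
            for name in sig.parameters
        }
        return {
            "name": self.func.__name__,
            "description": (self.func.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": list(properties),
            },
            "returns": str(sig.return_annotation),
        }

    def function(self):
        """Return the schema to hand to the Chat Completions API."""
        return self.info

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)


def get_top_stories(limit: str) -> list:
    """Fetch the top stories from HackerNews."""
    return []

wrapped = FunctionWrapper(get_top_stories)
```

Because the wrapper also forwards calls via `__call__`, the decorated function remains directly callable.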
NOTE: OpenAI's function support appears to be limited to strings, so you should assume arguments are passed in as strings and cast them to other types as needed.
The FunctionWrapper class's extract_function_info() method uses the ast and inspect modules, along with the function's __doc__ attribute, to extract the information the model needs:
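To see what each tool contributes, here is a small standalone example (the sample function is illustrative): ast recovers structure from the source text (the name, the docstring via ast.get_docstring, the argument names), while inspect reads the live function's signature for type annotations:

```python
import ast
import inspect

source = '''
def get_top_stories(limit: str) -> list:
    """Fetch the top stories from HackerNews."""
    return []
'''

# ast: parse the source text and read the FunctionDef node
func_def = ast.parse(source).body[0]
name = func_def.name                          # "get_top_stories"
docstring = ast.get_docstring(func_def)       # same text as __doc__
arg_names = [a.arg for a in func_def.args.args]

# inspect: read the live function's signature for its annotations
namespace = {}
exec(source, namespace)
sig = inspect.signature(namespace["get_top_stories"])
return_type = sig.return_annotation           # <class 'list'>
```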
NOTE: Be sure to use Pythonic return type annotations so the return type is accessible (as in the example code in the Gist repo):
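For example (both functions here are illustrative): omit the annotation and inspect reports an empty annotation; include it and the return type is available:

```python
import inspect

def untyped_stories(limit):           # no return annotation
    return []

def get_top_stories(limit: str) -> list:
    """Fetch the top stories from HackerNews."""
    return []

missing = inspect.signature(untyped_stories).return_annotation
present = inspect.signature(get_top_stories).return_annotation

assert missing is inspect.Signature.empty   # nothing to report
assert present is list                      # annotation recovered
```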
This example fetches the top X stories from HackerNews and then thinks about which ones might be related to "AI".
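A sketch of the fetching half, using HackerNews's public Firebase API (the prompt that asks the model to pick AI-related stories isn't shown; note the string-to-int cast on the argument, per the note above):

```python
import json
import urllib.request

HN_TOP = "https://hacker-news.firebaseio.com/v0/topstories.json"
HN_ITEM = "https://hacker-news.firebaseio.com/v0/item/{}.json"

def get_top_stories(limit: str) -> list:
    """Fetch the top `limit` stories from HackerNews."""
    count = int(limit)  # arguments arrive as strings, so cast
    with urllib.request.urlopen(HN_TOP) as resp:
        ids = json.loads(resp.read())[:count]
    stories = []
    for story_id in ids:
        with urllib.request.urlopen(HN_ITEM.format(story_id)) as resp:
            item = json.loads(resp.read())
        stories.append({"title": item.get("title"), "url": item.get("url")})
    return stories
```

The returned titles can then be handed back to the model, which decides which ones look AI-related.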
Check out the code from the GitHub Gist:
To use this, create a config.py file in the directory and add a variable with your OpenAI token:
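Something like this (the variable name here is an assumption — match whatever name the example script imports):

```python
# config.py — placeholder token; the variable name is an assumption
openai_token = "sk-..."
```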
You'll want to make sure you also install the OpenAI library for Python:
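At the time of writing, that's:

```shell
pip install openai
```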
To run the example, do the following:
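Assuming you saved the Gist code as hackernews.py (the filename is a guess — use whatever you named the script):

```shell
python hackernews.py
```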
Once you run the script, you can ask for the type of stories you'd like summarized:
The simplified approach to function integration presented in this blog post demonstrates a Pythonic way to leverage OpenAI's Chat Completions API. By utilizing decorators, the ast and inspect modules, and the power of Python's introspection capabilities, developers can seamlessly integrate functions and unlock the full potential of OpenAI's language models. With a simplified configuration and a straightforward integration process, developers can now create more efficient and interactive applications.
Keep your eyes peeled for an integration of this approach into PythonGPT, a project that writes and executes code interactively in Python using GPT.
You may also want to check out our recent exploration of Harnessing the Power of Semantic Knowledge Graphs for Unstructured Data with DoctorGPT.