Let us continue our discussion on using Mathematica to interact with OpenAI (you may want to go through the earlier article as well).
The simplest function to interact with the LLM is LLMSynthesize[].
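Since the original notebook screenshot is not reproduced here, a minimal sketch of such a call might look like the following (this assumes you have already configured LLM service credentials, e.g. your OpenAI API key, in Mathematica; the prompt is my own illustration, not the one from the notebook):

```
(* Synchronous call: blocks until the complete response string is returned *)
LLMSynthesize["Give me a one-sentence definition of a prime number."]
```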
As you might have guessed, this is a synchronous (non-streaming) call: the function blocks until the complete response is available. What if you expect a long response and, instead of waiting for the full answer, you want to display each chunk as and when it is received?
The LLMSynthesizeSubmit[] function works in “streaming” (async) mode. The following snippet demonstrates this:
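A sketch of what such a snippet might look like is shown below. The variable name response matches the discussion that follows; the prompt is my own, and the HandlerFunctions event name "ContentChunkReceived" and the "ContentChunk" key are taken from my reading of the Wolfram documentation, so do verify them against the reference pages:

```
response = "";
Dynamic[response]   (* displays in the notebook and updates as chunks arrive *)

task = LLMSynthesizeSubmit[
   "Write a short essay on the history of Mathematica.",
   HandlerFunctions ->
    <|"ContentChunkReceived" ->
      (response = StringJoin[response, #ContentChunk] &)|>
  ];
```

StringJoin is used here so the handler works whether a chunk arrives as a single string or as a list of strings.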
In the above, the variable "response" is updated dynamically, with each chunk concatenated as it arrives. This won't be obvious from the image, but you can see it happen when you execute the code in Mathematica.
As a slight variant of the above code, if you want to keep track of the individual chunks, then the following snippet does that:
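A possible sketch of this variant is shown below: instead of concatenating into one string, each chunk is appended to a list (again, the prompt is my own and the handler key names are from my reading of the documentation):

```
chunks = {};

task = LLMSynthesizeSubmit[
   "List five interesting facts about prime numbers.",
   HandlerFunctions ->
    <|"ContentChunkReceived" ->
      (AppendTo[chunks, #ContentChunk] &)|>
  ];

Dynamic[Length[chunks]]   (* watch the number of chunks grow *)
```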
Another useful function is LLMTool[], which enables OpenAI's "tool calling" feature. Let us use it to check whether a given number is prime:
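A sketch of such a tool might look like the following. The tool name, description, and prompt are my own illustrations, and the exact LLMTool argument forms (name/description pair, parameter specification with "Interpreter" and "Help") should be checked against the Wolfram reference pages:

```
primeTool = LLMTool[
   {"prime_check", "determine whether an integer is prime"},
   {"n" -> <|"Interpreter" -> "Integer", "Help" -> "the integer to test"|>},
   ToString[PrimeQ[#n]] &
  ];

LLMSynthesize["Is 9973 a prime number?",
  LLMEvaluator -> <|"Tools" -> {primeTool}|>]
```

Here PrimeQ[] does the actual work; the LLM decides to call the tool and then phrases the answer based on the tool's result.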
Here is an example that calculates the “factorial” of a given number:
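A sketch along the same lines (again with my own names and prompt, and the same hedged LLMTool argument forms as above):

```
factorialTool = LLMTool[
   {"factorial", "compute the factorial of a non-negative integer"},
   {"n" -> <|"Interpreter" -> "Integer",
      "Help" -> "the number whose factorial is required"|>},
   ToString[Factorial[#n]] &
  ];

LLMSynthesize["What is 12 factorial?",
  LLMEvaluator -> <|"Tools" -> {factorialTool}|>]
```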
As the final example, let us use a tool to find out the current temperature at select cities:
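Since the article's actual data source for temperatures is not shown here, the sketch below stands in a made-up lookup table for a real weather service; the city names, values, and tool name are all hypothetical:

```
(* Hypothetical data standing in for a real weather service *)
weatherData = <|"Chennai" -> 34, "London" -> 11, "Tokyo" -> 18|>;

temperatureTool = LLMTool[
   {"current_temperature", "get the current temperature (Celsius) in a city"},
   {"city" -> <|"Interpreter" -> "String", "Help" -> "name of the city"|>},
   ToString@Lookup[weatherData, #city, "unknown"] &
  ];

LLMSynthesize["What is the current temperature in Chennai and in London?",
  LLMEvaluator -> <|"Tools" -> {temperatureTool}|>]
```

In a real notebook you would replace the lookup table with a call to an actual weather API.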
These examples clearly demonstrate how the LLM can use “external tools” to do tasks that it cannot do directly.
Incidentally, although I have configured my system to use an OpenAI model (GPT-4o-mini), any other supported model can be used as well.
I am hoping to see support for Agents in Mathematica in the near future!
The Mathematica Notebook used in this article can be downloaded here. I used Mathematica 14.2 for this article.
Have a nice weekend!