Using OpenAI from Mathematica: Part 2

Written on October 2, 2024 in Mathematica, OpenAI, Programming

I wrote an earlier article showing how to use OpenAI models from Mathematica version 14.0. Wolfram Mathematica version 14.1 was released recently, with several improvements in the area of LLMs (large language models). There are many other core additions as well, of course, but our focus in this article is on the LLM features.

This version supports many vendors besides OpenAI; for example, Anthropic, Cohere, and MistralAI are also supported. We can even use many open-source models via Groq and TogetherAI. That is great news!

I will be using my OpenAI account for all the examples discussed in today’s article.

Let us start with the LLMSynthesize[] function that we looked at in the previous article. It is the easiest to get started with.

LLMSynthesize Function

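As a refresher, a minimal call looks like this (the prompt is my own illustrative choice; the original screenshot may have used a different one):

```mathematica
(* One-shot text generation: send a prompt, get back a string *)
LLMSynthesize["Write a one-line tagline for a symbolic computation system."]
```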

Let us suppose we wanted to generalize this and turn it into a function. Although we can implement a Mathematica function over LLMSynthesize[] to take arguments and return values, it is more convenient to use LLMFunction[] for this.

LLMFunction

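Here is a sketch of the idea (the template text is my own example): the `` placeholder in the template is filled with the argument at call time, and the result is an ordinary function we can apply repeatedly.

```mathematica
(* LLMFunction builds a reusable function from a prompt template;
   `` marks the slot that the argument fills in *)
capitalOf = LLMFunction[
  "What is the capital of ``? Answer with just the city name."];

capitalOf["France"]  (* typically returns "Paris" *)
```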

What about those cases where the LLM has to use an external “function” or source to produce the correct answer? One common use case is retrieving a fact that the LLM’s knowledge base (fixed at training time) might not contain. To keep things simple, I will compute the factorial of a number using Mathematica’s built-in Factorial function instead of relying on the LLM to produce the correct answer.

For this we use the LLMTool[] function. 

LLMTool Function


Here we pass the name, a brief description, the argument, and the actual function that does the job.
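A sketch of what such a tool definition might look like (the tool name, description text, and parameter specification here are my assumptions; the exact form in the screenshot may differ):

```mathematica
(* A tool the LLM can invoke: a name plus description, one integer
   parameter, and the Wolfram function that does the actual work *)
factorialTool = LLMTool[
  {"factorial", "Computes the factorial of a non-negative integer"},
  {"n" -> <|"Interpreter" -> "Integer", "Help" -> "The input integer"|>},
  Factorial[#n] &
];
```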

Here is how we use it in LLMSynthesize[]:

Using the Tool Function

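Assuming a tool definition like the factorialTool sketched above, the call might look like this; the tool is handed to the LLM through the LLMEvaluator option, and the prompt is my own:

```mathematica
(* The LLM delegates the arithmetic to our tool instead of guessing *)
LLMSynthesize[
  "What is the factorial of 13?",
  LLMEvaluator -> <|"Tools" -> {factorialTool}|>
]
```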

We can wrap this inside a function that takes an integer and passes it to the LLMSynthesize[] function.

Parameterizing the Tool Call

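One possible wrapper, again assuming the hypothetical factorialTool from earlier:

```mathematica
(* Wrap the tool-assisted call in an ordinary Mathematica function *)
llmFactorial[n_Integer?NonNegative] := LLMSynthesize[
  "Use the factorial tool to compute the factorial of " <> IntegerString[n],
  LLMEvaluator -> <|"Tools" -> {factorialTool}|>
]
```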

Another cool addition is LLMExampleFunction[]. It builds a prompt dynamically from a list of input-to-output examples and then applies the inferred transformation to new inputs. Let us check it out.

Learning from Examples

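A minimal sketch (the example pairs are my own; the screenshot may have used different ones):

```mathematica
(* The LLM infers the transformation from the example pairs *)
classify = LLMExampleFunction[
  {"apple" -> "fruit", "carrot" -> "vegetable", "salmon" -> "fish"}];

classify["broccoli"]  (* typically returns "vegetable" *)
```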

Pretty neat, isn’t it?

There is also the ChatObject[] that models an ongoing chat with the LLM. It stores the complete conversation along with metadata. Here is how we can use it.

Chat Mode

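A minimal sketch of a multi-turn conversation (the messages are my own): ChatEvaluate[] returns an updated ChatObject, so earlier turns provide context for later ones.

```mathematica
(* Start an empty chat and carry context across turns *)
chat = ChatObject[];
chat = ChatEvaluate[chat, "My favorite number is 7."];
chat = ChatEvaluate[chat, "What is the square of my favorite number?"];
(* The second reply should draw on the earlier turn, i.e. 49 *)
```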

The new version even has support for Semantic Search and RAG. More on this in another article.

Have a Great Day!

