
Using OpenAI from Mathematica

Mathematica was among the first computational systems to integrate with OpenAI. The functionality is nicely exposed through a few built-in functions. Let us explore some of this functionality in today’s article.

The simplest way to get started is to use the LLMSynthesize function:

LLMSynthesize Function

It can take a few seconds before you get the answer.
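In code, a minimal call looks like the following; the prompt here is illustrative, not the one from the screenshot above:

```wolfram
(* Generate text from a single prompt; returns a String *)
LLMSynthesize["Explain the Monty Hall problem in two sentences."]
```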

If this is the first time you are trying to access OpenAI functionality from Mathematica, you will be prompted to enter your access key:

OpenAI Authorization

The key can be saved for future use, so you won’t be prompted again.

Here is another interaction with the LLM:

Another Example

If you wish to restrict the number of generated tokens, it is easy to do that:

Controlling Generated Tokens
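One way to cap the output length is through an LLMConfiguration passed as the evaluator; a sketch, with an assumed limit of 50 tokens:

```wolfram
(* Cap the response length via the evaluator's "MaxTokens" property *)
LLMSynthesize["List the first ten prime numbers.",
  LLMEvaluator -> LLMConfiguration[<|"MaxTokens" -> 50|>]]
```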

Sometimes we want to experiment with different models, and we can specify which model to use:

Specifying the Model

I could specify the recently released GPT-4o model too!

Using the Latest Model
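Model selection works the same way, via the "Model" property of an LLMConfiguration; a sketch, assuming "gpt-4o" is available on your account:

```wolfram
(* Select a specific OpenAI model by name *)
LLMSynthesize["Summarize special relativity in one paragraph.",
  LLMEvaluator -> LLMConfiguration[<|"Model" -> "gpt-4o"|>]]
```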

The above interactions show that we can use the LLMSynthesize function to talk directly to the LLM. What is nice, but not surprising, is that we can build other functions on top of LLMSynthesize. Here is an example:

Building on LLMSynthesize
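As a sketch of this idea, here is a hypothetical helper (the name summarize and its prompt are my own, not from the article) that hides the LLM call behind an ordinary Mathematica function:

```wolfram
(* A wrapper around LLMSynthesize; callers never see the prompt engineering *)
summarize[text_String, n_Integer : 2] :=
  LLMSynthesize[
    StringTemplate["Summarize the following in `` sentences: ``"][n, text]]

summarize["Mathematica integrates LLMs through functions such as LLMSynthesize."]
```

The caller just sees a function that takes text and returns text; whether the work is done by an LLM or by conventional code is an implementation detail.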

This means we can write Mathematica functions that take advantage of Gen-AI where necessary, without directly exposing the problem-solving logic!

Mathematica also allows us to create Chat-Enabled and Chat-Driven Notebooks, instead of the traditional Notebook. I haven’t explored these, but will do so in the coming weeks.

I hope Wolfram expands the scope of Gen-AI integration by supporting other models as well, including open source ones.

The above examples have been tested in Mathematica 14.0.

Have a nice week ahead!
