Mathematica was among the first tools to integrate with OpenAI. The functionality is exposed through a handful of pre-defined functions. Let us explore some of it in today’s article.
The simplest way to get started is to use the LLMSynthesize function:
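A minimal call looks like this (the prompt is illustrative):

```wolfram
(* Ask the LLM a question; the result is returned as a string *)
LLMSynthesize["Explain the Monty Hall problem in two sentences."]
```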
It can take a few seconds before you get the answer.
If this is the first time you are trying to access OpenAI functionality from Mathematica, you will be prompted to enter your access key:
The key can also be saved for future use, so you won’t be prompted again.
Here is another interaction with the LLM:
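For instance, a follow-up prompt of this shape (the prompt text is my own example):

```wolfram
(* Output is plain text by default *)
LLMSynthesize["Write a haiku about prime numbers."]
```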
If you wish to restrict the number of generated tokens, it is easy to do that:
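The token limit is set through an LLMConfiguration passed via the LLMEvaluator option; the limit of 50 below is just an illustrative value:

```wolfram
(* Cap the response length at roughly 50 tokens *)
LLMSynthesize["Summarize the history of calculus.",
 LLMEvaluator -> LLMConfiguration[<|"MaxTokens" -> 50|>]]
```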
Sometimes we want to experiment with different models, and we can specify which one to use:
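The model is selected by name in the same LLMConfiguration (the prompt is illustrative):

```wolfram
(* Route the request to a specific OpenAI model *)
LLMSynthesize["State the Riemann hypothesis in one sentence.",
 LLMEvaluator -> LLMConfiguration[<|"Model" -> "gpt-4"|>]]
```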
I could specify the recently released GPT-4o model too!
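Switching models is just a change of the "Model" setting:

```wolfram
(* Same call as before, now using GPT-4o *)
LLMSynthesize["State the Riemann hypothesis in one sentence.",
 LLMEvaluator -> LLMConfiguration[<|"Model" -> "gpt-4o"|>]]
```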
The above interactions show that we can use the LLMSynthesize function to talk directly to the LLM. What is nice, but not surprising, is that we can build other functions on top of LLMSynthesize. Here is an example:
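As a sketch of the idea, here is a hypothetical helper (the function name and prompt are my own) that hides the LLM call behind an ordinary Wolfram Language function:

```wolfram
(* explainCode is a user-defined wrapper around LLMSynthesize *)
explainCode[expr_] :=
 LLMSynthesize[
  StringJoin["Explain in one paragraph what this Wolfram Language code does: ",
   ToString[expr, InputForm]]]

(* Usage: the caller never touches the LLM machinery directly *)
explainCode[Hold[Table[i^2, {i, 1, 10}]]]
```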
This means we can write Mathematica functions that take advantage of Gen-AI where necessary, without directly exposing the problem-solving logic!
Mathematica also allows us to create Chat-Enabled and Chat-Driven Notebooks, in addition to traditional Notebooks. I haven’t explored these yet, but will do so in the coming weeks.
I hope Wolfram expands the scope of Gen-AI integration by supporting other models as well, including open source ones.
The above examples have been tested in Mathematica 14.0.
Have a nice week ahead!