LLMs with MATLAB updated to support the latest OpenAI Models

Toshiaki Takeuchi on 5 May 2025 (Edited on 5 May 2025)
Latest activity Reply by cui,xingxing on 7 May 2025

Large Language Models (LLMs) with MATLAB, a free add-on that lets you access LLMs from OpenAI, Azure, and Ollama (for local models) from MATLAB, has been updated to support OpenAI GPT-4.1, GPT-4.1 mini, and GPT-4.1 nano.
According to OpenAI, "These models outperform GPT‑4o and GPT‑4o mini across the board, with major gains in coding and instruction following. They also have larger context windows—supporting up to 1 million tokens of context—and are able to better use that context with improved long-context comprehension."
You can follow this tutorial to create your own chatbot with LLMs with MATLAB.
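For reference, connecting to one of the new models from MATLAB looks roughly like the sketch below. This is a minimal example assuming the add-on's openAIChat interface and that your OpenAI API key is stored in the OPENAI_API_KEY environment variable; the model name string is taken from the announcement above.

% Minimal sketch: chat with GPT-4.1 via the LLMs with MATLAB add-on.
% Assumes the add-on is installed and the OPENAI_API_KEY environment
% variable holds a valid OpenAI API key.

% Create a chat object with a system prompt and the new model.
chat = openAIChat("You are a helpful MATLAB assistant.", ...
    ModelName="gpt-4.1");

% Send a prompt and display the generated reply.
prompt = "Write a MATLAB one-liner that plots a sine wave.";
response = generate(chat, prompt);
disp(response)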
What would you build with the latest update?
cui,xingxing on 6 May 2025
It would be ideal if you could fully customize the LLM used in your own Copilot plugin to assist with programming. Currently, the performance of the official MATLAB Copilot LLM is somewhat lacking!
Hans Scharler on 6 May 2025
Thanks. Can you describe the performance issues you are seeing?
cui,xingxing on 7 May 2025
The attached compressed file contains the MP4 demonstration video I uploaded. Currently, MATLAB Copilot's code completion still has some limitations, and I'm not sure whether the issue lies with the LLM itself. Would it be better to give users more flexibility, such as letting them switch to their own LLM or choose a specific API to call? Of course, these are just my personal thoughts.
Hans Scharler on 5 May 2025
GPT-4.1 is so good...