Generative AI


Did you know you can use Local LLMs with MATLAB?

David on 13 Sep 2024 at 20:29 (Edited on 18 Sep 2024 at 18:33)

Local large language models (LLMs) such as Llama, Phi-3, and Mistral are now available in the Large Language Models (LLMs) with MATLAB repository through Ollama™!
Read about it here:
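If you want to try it, a minimal sketch looks something like this (assuming Ollama is running locally and the repository is on your MATLAB path; the model name "mistral" is just an example):

% Connect to a model served by a locally running Ollama instance
chat = ollamaChat("mistral");
% Send a prompt and display the generated text
txt = generate(chat, "What can I do with a local LLM in MATLAB?");
disp(txt)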
David on 13 Sep 2024 at 20:59
I've been 'playing' with Ollama since it was released. It's great to pull down new and updated models to see what they can do. Typically, models of 7 billion parameters or fewer run well on my personal laptop with 16 GB of RAM. While I haven't found one that's good enough for coding, there are a couple, such as Mistral and Llama 3, that serve several helpful use cases like summarization and brainstorming.
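For example, a quick summarization call might look like this (the model name and file are illustrative):

% Summarize a local text file with a locally hosted model
chat = ollamaChat("llama3");
doc = fileread("notes.txt");  % any plain-text file you want summarized
summary = generate(chat, "Summarize the following text in three sentences: " + doc);
disp(summary)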
Hans Scharler on 13 Sep 2024 at 20:46
Wow, this is awesome. I hadn't thought about local models. They are getting more and more capable.
