Did you know you can use Local LLMs with MATLAB?
Local large language models (LLMs) such as Llama, Phi-3, and Mistral are now available in the Large Language Models (LLMs) with MATLAB repository through Ollama™!
Read about it here:
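As a rough illustration, connecting to a local model from MATLAB might look like the sketch below. This assumes the `ollamaChat` interface from the Large Language Models (LLMs) with MATLAB repository is on your path, and that an Ollama server is running locally with the chosen model already pulled; the model name and prompt are placeholders.

```matlab
% Sketch: chat with a locally hosted model via Ollama.
% Assumes the llms-with-matlab repository is installed and
% `ollama pull mistral` has been run beforehand.
chat = ollamaChat("mistral");
reply = generate(chat, "Summarize the benefits of running LLMs locally.");
disp(reply)
```

Since the model runs on your own machine, no data leaves it and no API key is required, at the cost of being limited by local RAM and compute.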
2 Comments
I've been 'playing' with Ollama since it was released. It's great to pull down new/updated models to see what they can do. Typically, models of 7 billion parameters or less run well on my personal laptop with 16 GB of RAM. While I haven't found one that's good enough for coding, there are a couple, such as Mistral and Llama 3, that can serve several helpful use cases, such as summarization, brainstorming, etc.
Wow, this is awesome. Didn't think about local models. They are getting more and more capable.