Ollama Model Review: Microsoft’s Phi3


About Ollama:
Ollama is a local LLM environment that lets users run open-source large language models and create their own. Ollama works offline and runs locally on a user’s machine, offering increased self-ownership and privacy for independent offline workflows.
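Assuming Ollama is installed, fetching and running a model like Phi3 from the terminal is a two-command affair (a minimal sketch of the standard Ollama CLI workflow):

```shell
# Download the Phi3 model weights from the Ollama library
ollama pull phi3

# Start an interactive chat session with the model
ollama run phi3

# Or pass a single prompt non-interactively
ollama run phi3 "Explain what a local LLM is in one sentence."
```

Everything happens on the local machine; once the weights are pulled, no network connection is needed to run the model.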

Phi3 from Microsoft is wordy and inaccurate. Phi3 is fast, though – very fast – giving near-instantaneous answers on an M1 MacBook Air with 8 GB of RAM.

The fly in the ointment is Phi3’s inaccuracy. I would not rely on this model for much at the moment.

#Ai #Opensource #LLM #LocalAi #Ollama #Phi3
