About Ollama:
Ollama is a local LLM environment that lets users run open-source large language models and create their own. Ollama works offline and runs entirely on the user’s machine, offering greater self-ownership and privacy for independent offline workflows.
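For context, here is roughly how a model like Phi3 is fetched and run with Ollama’s CLI, plus a query against its local HTTP API (which by default listens on port 11434). This is a sketch assuming Ollama is already installed and the server is running; exact model tags may vary.

```shell
# Download the Phi3 model weights to the local machine (one-time)
ollama pull phi3

# Start an interactive chat session with the model
ollama run phi3

# Or send a one-shot prompt to the local REST API and get a single JSON response
curl http://localhost:11434/api/generate \
  -d '{"model": "phi3", "prompt": "Summarize what Ollama does in one sentence.", "stream": false}'
```

Everything stays on the machine: no prompt or response leaves the local network, which is the privacy benefit described above.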
Phi3 from Microsoft is wordy and inaccurate. Phi3 is fast, though – very fast – giving near-instantaneous answers on an M1 MacBook Air with 8 GB of RAM.

The fly in the ointment is Phi3’s inaccuracy. I would not rely on this model for much at the moment.
#Ai #Opensource #LLM #LocalAi #Ollama #Phi3