EmNudge
@Hacubu @LangChainAI @Cloudflare @ollama Anecdotally, ollama with llama3 runs really quickly on an M2 Pro. I suspect any M1+ MacBook will have no trouble with it.
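For anyone who wants to try it locally, here's a minimal sketch (not from the original post) that queries a running ollama server over its local REST API. It assumes llama3 has already been pulled (e.g. via `ollama pull llama3`) and the default server is listening on localhost:11434; the prompt is just an illustrative placeholder.

    # Minimal sketch: ask a local ollama server for a llama3 completion.
    # Assumes `ollama pull llama3` has been run and the default server
    # is listening on localhost:11434.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",
            "prompt": "Say hello in one short sentence.",
            "stream": False,  # return one JSON object instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])

The same endpoint streams token-by-token JSON lines if "stream" is left at its default, which is what the ollama CLI uses for its interactive output.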
posted 4 weeks ago