MacRumors
Ollama Now Runs Faster on Macs Thanks to Apple's MLX Framework
Ollama, the popular app for running AI models locally, has released an update that takes advantage of Apple's machine learning framework, MLX. The result is a hefty speed boost on Macs with Apple silicon: according to Ollama, the new version processes prompts roughly 1.6 times faster (prefill speed) and nearly doubles the rate at which it generates responses (decode speed). Macs with M5-series chips are said to see the largest improvements, thanks to Apple's new GPU Neural Accelerators.
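The prefill and decode speeds cited above are just token counts divided by elapsed time. As a rough illustration (the response metrics below are hypothetical, not benchmark results), Ollama's `/api/generate` endpoint reports `prompt_eval_count`/`prompt_eval_duration` for the prompt phase and `eval_count`/`eval_duration` for generation, with durations in nanoseconds, so the two rates can be computed like this:

```python
def tokens_per_second(token_count: int, duration_ns: int) -> float:
    """Convert a token count and a nanosecond duration into tokens/sec."""
    return token_count / (duration_ns / 1e9)

# Hypothetical metrics from a single /api/generate response:
metrics = {
    "prompt_eval_count": 512,             # prompt tokens processed (prefill)
    "prompt_eval_duration": 400_000_000,  # 0.4 s, in nanoseconds
    "eval_count": 256,                    # tokens generated (decode)
    "eval_duration": 4_000_000_000,       # 4 s, in nanoseconds
}

prefill_tps = tokens_per_second(metrics["prompt_eval_count"],
                                metrics["prompt_eval_duration"])
decode_tps = tokens_per_second(metrics["eval_count"],
                               metrics["eval_duration"])
print(f"prefill: {prefill_tps:.0f} tok/s, decode: {decode_tps:.0f} tok/s")
# → prefill: 1280 tok/s, decode: 64 tok/s
```

A 1.6x prefill improvement and a near-2x decode improvement would scale these per-phase rates independently, which is why the update's gains are reported as two separate numbers.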