The new M5 Max MacBook with 128 GB of RAM can now easily run Llama 70B as a local LLM. A year and a half ago, running that model would have required a roughly $40k GPU cluster.