Discussion about this post

Linh Tran

Not sure if you know this: llama.cpp now offers several backends (e.g. Vulkan, OpenCL, SYCL) that let you use iGPUs, which can run pretty fast even on 16 GB laptops.
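As a minimal sketch of what the commenter describes, this is roughly how you'd build llama.cpp with the Vulkan backend and offload layers to an iGPU (assumes cmake and the Vulkan SDK are installed; `model.gguf` is a placeholder for whatever model file you use):

```shell
# Fetch llama.cpp and configure a build with the Vulkan backend enabled
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release

# Offload layers to the (i)GPU at run time with -ngl
# (model.gguf is a placeholder path, not a real file)
./build/bin/llama-cli -m model.gguf -ngl 99 -p "Hello"
```

SYCL and other backends are selected the same way, via their own cmake flags (e.g. `-DGGML_SYCL=ON`) as documented in the llama.cpp build instructions.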

