Here's an incredible new way to run LLMs on your own machine: llamafile - https://hacks.mozilla.org/2023/11/introducing-llamafile/

It bundles an LLM with the code needed to run it in a single binary using DEEP magic (Cosmopolitan Libc) such that the same binary works on six different operating systems
Best part: it works with LLaVA multi-modal... so you can download a single 4GB file and:
chmod 755 llamafile-server-0.1-llava-v1.5-7b-q4
./llamafile-server-0.1-llava-v1.5-7b-q4
Visit the URL it prints (http://127.0.0.1:8080 by default) in your browser and you get a chat interface for talking to the model, including uploading images for LLaVA to describe.
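Since llamafile-server embeds the llama.cpp HTTP server, you can also talk to it from the command line instead of the browser. A minimal sketch, assuming the server is running on the default port and exposes llama.cpp's /completion endpoint (the prompt and token count here are just example values):

```shell
# POST a prompt to the local llamafile server's completion endpoint.
# n_predict caps how many tokens the model generates in the response.
curl -s http://127.0.0.1:8080/completion \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "Why is the sky blue?", "n_predict": 64}'
```

The response comes back as JSON with the generated text in a `content` field, which makes it easy to pipe into other tools.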