LocalLlama

llama.cpp server now supports multimodal! : r/LocalLLaMA

Upgraded to a 3rd GPU (x3 RTX 3060 12GBs) : r/LocalLLaMA

what am I doing wrong? : r/LocalLLaMA

Nvidia gives a shoutout to r/LocalLLaMA in a blog post on generative AI with their Jetson platform : r/LocalLLaMA

I made llama.cpp easier to use : r/LocalLLaMA

Free Sydney - Sydney finetune on LLaMA 2 : r/LocalLLaMA

Just to be clear 🦙 : r/LocalLLaMA

Your settings are (probably) hurting your model - Why sampler settings matter : r/LocalLLaMA