Local LLM on EL8

I got the idea of trying to run a local LLM on one of my Rocky Linux 8 computers, but every front end (framework?) I try errors out looking for a newer glibc or libstdc++. Jan actually let me download an LLM, but it errored out looking for a newer libstdc++.so.6 when I tried to interact with the model. The others I tried (gpt4all and lm-studio) didn't even finish loading before they quit.

Has anyone successfully run a local LLM on Rocky 8? Or am I out of luck unless I upgrade to version 9? I'm really not in a hurry to do that, and this is the first thing I've come across that I want to do on EL8 and can't.

If something requires a newer glibc, then yes, you will need to migrate to a newer release; glibc isn't something you can swap out on an existing install. For comparison, here are the glibc versions each ships:

Rocky Linux 8: 2.28
Rocky Linux 9: 2.34
Fedora 40: 2.39
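You can check what your own system ships before trying anything. A quick sketch (the version numbers in the comment are the ones listed above):

```shell
# Print the glibc version this system provides
# (Rocky 8 reports 2.28, Rocky 9 reports 2.34, Fedora 40 reports 2.39)
ldd --version | head -n 1
```

Prebuilt binaries like the ones Jan and lm-studio ship are linked against whatever glibc/libstdc++ their build machines had, which is why they refuse to start on an older distro even though nothing is "broken."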

I’ve been doing some more experimenting.

Answering my own question here:

llama.cpp compiles and works fine on Rocky 8.
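Building from source sidesteps the glibc problem entirely, since the binary links against whatever the build box has. Roughly, the build goes like this — a sketch, not my exact session, and the CLI binary name depends on which llama.cpp revision you check out (older trees produce `./main` instead of `./build/bin/llama-cli`):

```shell
# Prerequisites on Rocky 8 (sketch):
#   sudo dnf install -y gcc-c++ cmake git make

# Fetch and build llama.cpp with CMake
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build -j"$(nproc)"

# Run a quantized GGUF model interactively
# (model filename here is an example; use whatever you downloaded)
./build/bin/llama-cli -m Meta-Llama-3-8B-Instruct.Q2_K.gguf -p "Hello" -n 64
```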

I have Meta-Llama-3-8B-Instruct.Q2_K.gguf running on this computer right now. 🙂

It isn’t breaking any speed records but it works fine.