this post was submitted on 20 May 2024
-14 points (37.5% liked)

Technology

[–] possiblylinux127 10 points 3 months ago (1 children)

That's not how that works. Also, we have our own. It's called Ollama.

[–] [email protected] 1 points 3 months ago (1 children)

How would one set this up with ollama?

[–] possiblylinux127 2 points 3 months ago* (last edited 3 months ago) (1 children)

On which platform?

Basically you need three things: the Ollama software, an LLM such as Mistral, and a front end like Open WebUI.

Ollama is pretty much just a daemon that exposes a web API apps can use to query LLMs.
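For example, assuming Ollama is running on its default port (11434) and you've already pulled the mistral model, a query against that API looks roughly like this:

```shell
# Ask the local Ollama daemon for a one-shot completion.
# Assumes `ollama pull mistral` has been run beforehand.
curl http://127.0.0.1:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Why is the sky blue?", "stream": false}'
```

Front ends like Open WebUI talk to this same API behind the scenes.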

[–] [email protected] 1 points 3 months ago (1 children)

Linux, specifically Nobara (a gaming-focused Fedora distro) for me.

Do you have any guides you would recommend?

[–] possiblylinux127 2 points 3 months ago

Actually it is pretty easy. You can either run it in a VM or you can run it in podman.

For a VM, you could install virt-manager and then Debian. From there, do the usual setup: enable SSH and disable root login.

Once you have a Debian VM, you can install Ollama and pull down llava and mistral. Make sure you give the VM plenty of resources, including almost all cores and 8 GB of RAM. To set up Ollama you can follow the guides.
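The install-and-pull step above would look roughly like this inside the VM (using Ollama's official install script; a sketch, not a full guide):

```shell
# Install Ollama via its official install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull the two models mentioned above
ollama pull mistral
ollama pull llava
```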

Once you have Ollama working, you can set up Open WebUI. I had to run the container with network: host and point its Ollama environment variable at 127.0.0.1 (loopback).
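That host-network setup might look something like this with podman (container name is illustrative; OLLAMA_BASE_URL is the variable Open WebUI reads to find the Ollama API):

```shell
# Run Open WebUI with host networking so it can reach
# the Ollama daemon on the loopback address.
podman run -d --network=host \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```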

Once that's done, you should be able to access it at the VM's IP on port 8080. The first time it runs you need to click create account.

Keep in mind that a blank screen means that it can't reach ollama.

The alternative setup to this would be podman. You could theoretically create an Ollama container and an Open WebUI container, attached to the same internal network. It would probably be simpler to run, but I haven't tried it.
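An untried sketch of that two-container layout (network and container names are illustrative, and the commenter above hasn't verified this setup):

```shell
# Shared internal network for the two containers
podman network create llm-net

# Ollama container, with a volume so pulled models persist
podman run -d --name ollama --network llm-net \
  -v ollama:/root/.ollama \
  docker.io/ollama/ollama

# Open WebUI, pointed at the Ollama container by its network alias
podman run -d --name open-webui --network llm-net \
  -p 8080:8080 \
  -e OLLAMA_BASE_URL=http://ollama:11434 \
  ghcr.io/open-webui/open-webui:main
```

Since both containers share llm-net, Open WebUI can reach Ollama by container name instead of an IP.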