Ollama and local AI (self.freesoftware)
submitted 1 month ago by possiblylinux127 to c/freesoftware

So you can run a local AI assistant with Ollama.

Ollama isn't a complete piece of software on its own; you will need a front end such as oterm or Open WebUI.
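Front ends like oterm and Open WebUI talk to Ollama over its local HTTP API (default port 11434). A minimal sketch of what such a request looks like, assuming a default install and an already-pulled model (the model name and prompt here are just examples):

```python
import json
import urllib.request

def build_generate_request(model, prompt):
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,      # must already be pulled, e.g. `ollama pull mistral`
        "prompt": prompt,
        "stream": False,     # return one JSON object instead of a token stream
    }
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("mistral", "Why is the sky blue?")
# To actually send it (requires a running `ollama serve`):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```

This is all a front end really does under the hood: post JSON to the local server and render the response.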

Keep in mind that Ollama offers models under various licenses. I would use models under a free license, such as Mistral, Mixtral, Llama 2, and LLaVA. (LLaVA-NeXT uses Llama 3, which isn't under a free license.)

Keep away from models such as Llama 3 and Gemma, as they are under non-free licenses.

You can also fine-tune a model, but you will need a significant amount of compute.
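Note that Ollama doesn't do the training itself; the actual fine-tuning happens elsewhere. What Ollama can do is build a derived model from a Modelfile, where the `FROM` line points at either a pulled base model or a fine-tuned GGUF file you trained externally. A minimal sketch (the model name and system prompt are just examples):

```shell
# Write a Modelfile that derives a custom model from a base model.
# FROM can also point at a local fine-tuned GGUF file instead.
cat > Modelfile <<'EOF'
FROM mistral
PARAMETER temperature 0.7
SYSTEM "You are a concise assistant that prefers free software."
EOF

# Then build and run it (requires ollama to be installed):
# ollama create my-assistant -f Modelfile
# ollama run my-assistant
```

This is much cheaper than a real fine-tune and covers a lot of the "make the model behave differently" use cases.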

no comments (yet)
this post was submitted on 30 May 2024
8 points (100.0% liked)

Free Software

929 readers

What is free software?

Free software is software that respects the 4 software freedoms: the freedom to run the program, to study and change it, to redistribute copies, and to distribute modified versions.

Please note: free software does not refer to monetary price. Free software can be sold or be gratis (no cost).


  1. Please keep on topic
  2. Follow the Lemmy.zip rules
  3. No memes
  4. No "circle jerking" or inflammatory posts
  5. No discussion of illegal content

Please report anything you believe violates the rules, and be sure to include your reasoning for why you think it should be removed.

If you would like to contest mod actions, please DM me with your rationale as to why you feel the relevant mod action should be reversed. Remember to make a reasoned argument and to cite any relevant sources. You will only get one chance to argue your point, and continued harassment will result in a ban.

Overall this community is pretty laid back, and none of the things listed above are normally an issue.

founded 1 year ago