this post was submitted on 10 Aug 2024

Perchance - Create a Random Text Generator

392 readers
6 users here now

⚄︎ Perchance

This is a Lemmy Community for perchance.org, a platform for sharing and creating random text generators.

Feel free to ask for help, share your generators, and start friendly discussions at your leisure :)

This community is mainly for discussions between those who are building generators. For discussions about using generators, especially the popular AI ones, the community-led Casual Perchance forum is likely a more appropriate venue.

See this post for the Complete Guide to Posting Here on the Community!

Rules

1. Please follow the Lemmy.World instance rules.

2. Be kind and friendly.

  • Please be kind to others in this community (and in general), and remember that for many people Perchance is their first experience with coding. We have members for whom English is not their first language, so please take that into account too :)

3. Be thankful to those who try to help you.

  • If you ask a question and someone has made an effort to help you out, please remember to be thankful! Even if they don't manage to help you solve your problem - remember that they're spending time out of their day to try to help a stranger :)

4. Only post about stuff related to perchance.

  • Please only post about Perchance-related stuff, such as generators on the site, bugs, and the site itself.

5. Refrain from requesting Prompts for the AI Tools.

  • We would like to ask that you refrain from posting here for help specifically with prompting/achieving certain results with the AI plugins (text-to-image-plugin and ai-text-plugin), e.g. "What is a good prompt for X?", "How do I achieve X with Y generator?"
  • See Perchance AI FAQ for FAQ about the AI tools.
  • You can ask for help with prompting at the 'sister' community Casual Perchance, which is for more casual discussions.
  • We will still be helping/answering questions about the plugins as long as it is related to building generators with them.

6. Search through the Community Before Posting.

  • Please search through the community posts here (and on Reddit) before posting, to see if something similar has already been posted.

founded 1 year ago

As usual, the Chrome team is leading the charge on some exciting new web platform tech. The goal is to release some prototypes and eventually write up the feature as a browser standard that would make its way into all browsers (i.e. not just Chrome).

The point is, it'd run completely on-device (no cloud access, works offline), so it'd be a very small model, but would likely still be smart enough for a lot of tasks - e.g. summarizing text, converting a list of words into a grammatically correct sentence/description, guessing an appropriate emotion based on some character dialogue, etc.

Article: https://developer.chrome.com/docs/ai/built-in
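
To give a rough sense of what this might look like from a generator's JavaScript, here's a minimal sketch based on the early prototype Prompt API described in the docs. The method names are experimental and likely to change before any standard, and `someLongText` is just a placeholder input:

```javascript
// Sketch of the experimental built-in Prompt API (early Chrome prototype).
// Names are still in flux, so treat this as illustrative only.
async function summarize(someLongText) {
  if (!window.ai) return null; // feature not available in this browser

  // "readily" means the on-device model is already downloaded and ready;
  // "after-download" means it can be fetched in the background first.
  const availability = await window.ai.canCreateTextSession();
  if (availability === "no") return null;

  const session = await window.ai.createTextSession();
  const summary = await session.prompt(
    "Summarize this in one sentence:\n\n" + someLongText
  );
  session.destroy(); // free the on-device resources when done
  return summary;
}
```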

The key problem with these text generation models is how massive they are. They're so big that they could literally fill your device's entire storage (on smartphones and cheap laptops, at least), and would bloat the initial browser download time from a few minutes to a few days for a lot of people.

Still, smaller models are getting surprisingly smart, and while they're still several times the size of the actual browser download itself, this download can be done in the background.

Either way, I'm excited about this new direction, because there are lots of tasks that don't require an extremely smart model, and so it's overkill to use /ai-text-plugin, especially since it means ads will be shown for non-logged-in users.

One problem that I do anticipate is that the models will be extremely "safety-oriented", meaning refusals to generate even stuff like violence in a DnD fantasy adventure, and the like. I know from experience that Google's Gemini models have false-positive refusal rates that almost make them unusable even for many SFW tasks. There is a mention of LoRA fine-tuning in the article, which is very exciting and might help with that.

If you're a web dev, you can use the links on the page to test their prototypes and give constructive+professional feedback on them. It'd be good for the health of the web platform to have some of the feedback be for use-cases like Perchance, and not just e.g. business applications.

Tangentially, builders here may also be interested in Transformers.js, which allows you to run AI models directly in your browser. Ad-free AI plugins could already be created using this project, although for a lot of models the download and processing times are still a bit too long (especially on mobile devices). Still, the situation is improving quite rapidly. /ai-character-chat already uses Transformers.js for text embedding.
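
For anyone curious what that looks like in practice, here's a minimal Transformers.js sketch that computes a text embedding client-side. The model name is just an example of a small embedding model, not necessarily what /ai-character-chat actually uses:

```javascript
// Minimal Transformers.js example: compute a text embedding in the browser.
import { pipeline } from "@xenova/transformers";

// Downloads (and caches) a small embedding model the first time it runs.
const embed = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

// Mean-pooled, normalized embedding vector for a piece of character dialogue.
const output = await embed("The knight drew her sword and smiled.", {
  pooling: "mean",
  normalize: true,
});
console.log(output.data.length); // 384 numbers representing the text
```

Embeddings like this can be compared with cosine similarity to find related pieces of text, which is exactly the sort of task that's well within reach of small on-device models.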

top 4 comments
[–] [email protected] 2 points 1 month ago

Eww. I just want my browser to view the web without anything built into it as an excuse for insane hardware integration that is only there to fingerprint and enable stalkerware.

I run AI models on my hardware every day. One is running on my network right now. I can access that in my browser. I don't want my browser running that code with this kind of hardware access using anything less than Rust; C/C++ in the worst case.

LLMs are invasive stalkerware on a never-before-seen level. I will never use any corporate LLM. The last thing I want is more/deeper stalkerware tentacles in a browser.

[–] [email protected] 2 points 1 month ago

Oh yeah I saw that the other day, pretty cool... I guess you could just have another plugin that uses that instead, perhaps with the same API. And even use it as a fallback or something.

What do you mean by using JS for text embedding? How does that work with the server-side stuff? Does it add to the API request?

[–] [email protected] 1 points 1 month ago

🔥 🔥 🔥 🔥

[–] [email protected] 0 points 1 month ago

"Built in" - I would guess just an interface like edge did to a online model. If not then how will google track them?