rumba

joined 2 months ago
[–] rumba 2 points 2 weeks ago

I would cancel that subscription SOOO FAST.

I'd argue that YTMusic is a superior product to YT, but both put together aren't worth anywhere near the cost. You can get a premium TV/Movie service for that price with family access.

[–] rumba -2 points 2 weeks ago

We've GOT A PAYERR OVER HEREEEEE!!!!!!!

[–] rumba 1 points 2 weeks ago

I think Wil Wheaton had something that was supposed to air on Freevee, but the link his PR person gave him just threw you back to the Amazon video page. I've never actually seen any information about the service or a working video stream surface.

It seems like a lot of places are ready to throw millions of dollars into a system and then just never freaking market it.

[–] rumba 5 points 2 weeks ago (2 children)

Oh god yes, I ran into this asking for a shell.nix file with a handful of tricky dependencies. It kept trying to do this insanely complicated temporary pull-and-build from git instead of just writing a six-line file asking for the right packages.
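Something like this minimal sketch is all it needed (the package names here are placeholders, not the actual dependencies I was asking about):

```nix
# Hypothetical shell.nix — swap in the packages you actually need
{ pkgs ? import <nixpkgs> {} }:
pkgs.mkShell {
  buildInputs = with pkgs; [
    python3
    ffmpeg
    pkg-config
  ];
}
```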

[–] rumba 10 points 2 weeks ago

This has already started to happen. The new llama3.2 model is only 3.7GB and it's WAAAY faster than anything else. It can throw a wall of text at you in just a couple of seconds. You're still not running it on $20 hardware, but you no longer need a 3090 to have something useful.

[–] rumba 2 points 2 weeks ago

We've never tried a post-mortem candidate, but stranger things have happened (just happened)

[–] rumba 2 points 2 weeks ago

You can get a lot done currently with ARC. The mobile ARC versions share system memory, so if you get a mini PC with ARC and upgrade it to 96GB, you can share system RAM with the GPU and load decently large models. They're a little slow, it not being VRAM and all, but still useful (and cheap)

https://www.youtube.com/watch?v=xyKEQjUzfAk

I have it running on a Zenbook Duo with 32GB, so I can't load the 70B models, but it works shockingly well.
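As a rough rule of thumb (my own back-of-the-envelope math, not an official formula), you can estimate whether a quantized model fits in shared RAM like this:

```python
def model_size_gb(params_billions: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    """Rough estimate of memory needed to load a quantized model.

    overhead covers KV cache, activations, and runtime buffers;
    the 1.2x factor is a guess, not a measured figure.
    """
    bytes_total = params_billions * 1e9 * (bits_per_weight / 8)
    return bytes_total * overhead / 1e9

# A 70B model at 4-bit quantization:
print(round(model_size_gb(70, 4), 1))  # ~42 GB: fits in 96GB shared RAM, not in 32GB
```

Which is why the 96GB mini PC can take the 70B models and my 32GB laptop can't.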

[–] rumba 2 points 2 weeks ago (1 children)

Sounds like you're getting better numbers than we do :) Wonder if there's some incompatibility in our fleet hardware that you don't have. We're mostly Dell XPS. The biggest problem we regularly have is the audio outputs and mic inputs going rogue. They'll be using the machine with sound all day with no problem, then go into a meeting and there's no sound. They'll have the same problem with microphones. Somehow the browser session behind the scenes doesn't pick up the current default device settings, and the volume for the Slack session ends up being muted.

[–] rumba 3 points 2 weeks ago (2 children)

I certainly don't want to run Windows on it :)

I've been running llama to keep my telemetry out of the hands of Microsoft/Google/"open"AI. I'm kind of shocked how much I can do locally with a half-assed video card, an offline model, and a hacked-up copy of searxng.

[–] rumba 10 points 2 weeks ago (2 children)

I had my money on Zombie Burnie Sanders, but you might be on to something.

[–] rumba 2 points 2 weeks ago

I honestly had no idea what was in chorizo. I had been making chili with it at home, and when it came time to make it for work, I stopped by the market near work and they didn't have any. I was all "FINE! I'll make my own" and looked it up; there are TONS of variations. The one I went for was basically vinegar, coriander, cinnamon, cloves, and most of the spices I already use in chili.

One of my favorite taco shops made one that was very hot and just a touch sweet. The cinnamon was forward, which I didn't care for at first, but it ended up being amazing. It was also ground fine, like ground beef. I've been trying to replicate that for a while.

[–] rumba 2 points 2 weeks ago (3 children)

we have 60 ppl, it varies
