this post was submitted on 16 Jul 2024
33 points (100.0% liked)

top 22 comments
[–] [email protected] 9 points 1 month ago

Well, I'm gonna say it: future PCs will run 60B-parameter models at 200 tokens/s.
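
For scale, here's a rough back-of-the-envelope sketch of what that claim would take, assuming a dense model at batch size 1 (weights streamed once per generated token) and 4-bit quantization; the numbers are illustrative assumptions, not a spec:

```python
# Rough memory-bandwidth estimate for local LLM token generation.
# Assumption: at batch size 1, each generated token requires reading
# roughly every weight from memory once.

params = 60e9            # 60B parameters
bytes_per_weight = 0.5   # 4-bit quantization (assumption)
tokens_per_second = 200

bandwidth_bytes = params * bytes_per_weight * tokens_per_second
print(f"~{bandwidth_bytes / 1e12:.0f} TB/s of memory bandwidth")
# -> ~6 TB/s, several times the ~1 TB/s of today's high-end consumer GPUs
```

So the bottleneck is less raw FLOPS and more a big jump in memory bandwidth (or much more aggressive compression/sparsity).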

[–] [email protected] 6 points 1 month ago (2 children)

That's so unnecessary. The way we're going about "AI" is like brute-forcing a password. Sure, you'll get the job done, but it's the least efficient way to do it. I'm not saying I have the answer, but why not try to find a more efficient way to get the same results instead of building unnecessarily powerful PCs? I can only imagine how much redundancy is present in those LLM black boxes. It's just a word calculator.

[–] [email protected] 9 points 1 month ago (2 children)

I'm gonna go out on a limb and say that they're likely doing both.

[–] [email protected] 1 points 1 month ago

Also going to go out on a limb and say both is better. Plus, AMD, being a compute company, has a... ahem... vested interest... in the further utilization of ever-increasing amounts of compute.

[–] [email protected] -2 points 1 month ago (2 children)

Well they are definitely prioritizing one over the other.

[–] [email protected] 7 points 1 month ago (1 children)

You definitely sound educated, and certainly not like a novice just bitching. In terms of AI research, which methods (explored or not) do you feel need more direct investment?

[–] [email protected] 1 points 1 month ago

Look, man, he said he doesn't understand anything. Why don't you just accept that everyone working in AI is stupid and there's a completely better way to do everything?

/s

[–] [email protected] 4 points 1 month ago* (last edited 1 month ago) (1 children)

Man, I sure do wonder which one is going to show visible returns sooner: fundamentally reworking how the models work, or simply duct-taping more processing power onto them?

Obviously the brute-force method is going to show the most returns immediately; you're just throwing more resources at it. Efficiency gains take time. While it's absolutely a much bigger deal with AI, that's pretty much the path all these things have taken: crypto mining, ray tracing, 3D graphics, hell, even all the way back to 2D graphics.

There's no magic "make run more efficient" button.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

While I agree with you to a certain extent, these technologies always take different paths depending on those priorities. The thing with 3D and 2D graphics was that they were working with limited technology. In fact, I would even use that as an argument against just "building a better machine." Back then they had to make software work within the limitations of the hardware. You couldn't just duct-tape two SNESes together and get better performance; they had to be efficient or have no product to release at all. Nowadays you can just buy more computing power. Even when it comes to graphics, there are so many companies that release unoptimized software onto the market because the consumer can just "build a better machine." Crypto has so much unnecessary redundancy that almost all of the computation just gets thrown out the window, while only one computer gets to add to the blockchain and collect the reward for the actual mining.
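
Just to illustrate that redundancy, here's a toy sketch of the proof-of-work loop every miner grinds through in parallel (hypothetical difficulty, plain SHA-256; not any particular coin's real parameters):

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Grind nonces until the hash has `difficulty` leading zeros."""
    target = "0" * difficulty  # hypothetical difficulty target
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # the one winning nonce
        nonce += 1

# Every miner on the network runs this same loop at once; only the first
# to find a valid nonce adds the block and collects the reward -- all the
# other hashes ever computed are simply discarded.
print(mine("example block header"))
```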

Those older industries had more limitations than we do, so they had to make things as efficient as possible. Now we have so much computing power that there is no incentive to make things more efficient, save for long-term viability, which none of these companies give a shit about as long as they are making money. I'm not saying they need to hit a magic "efficiency" button. I'm just saying they're lazy and making everyone else pay the price.

[–] [email protected] 4 points 1 month ago

I'm not saying I have the answer, but why not try and find a more efficient way

??? You don't understand the problem, yet you claim there's a better way that everyone has missed until now?

Well sure. That's everything.

"Planes are so unnecessary, why haven't they found a better way."

"CPU's have billions of transistors, why haven't they found a better way."