From house fires to datacenter fires 🥲 they grow up so fast
Hardware
All things related to technology hardware, with a focus on computing hardware.
Rules:
- Follow the Lemmy.world Rules - https://mastodon.world/about
- Be kind. No bullying, harassment, racism, sexism, etc. against other users.
- No spam, illegal content, or NSFW content.
- Please stay on topic; adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware. Another example would be business news for hardware-focused companies.
- Please try to post original sources when possible (as opposed to summaries).
- If posting an archived version of an article, please include a URL link to the original article in the body of the post.
Some other hardware communities across Lemmy:
- Augmented Reality - [email protected]
- Gaming Laptops - [email protected]
- Laptops - [email protected]
- Linux Hardware - [email protected]
- Mechanical Keyboards - [email protected]
- Microcontrollers - [email protected]
- Monitors - [email protected]
- Raspberry Pi - [email protected]
- Retro Computing - [email protected]
- Single Board Computers - [email protected]
- Virtual Reality - [email protected]
Icon by "icon lauk" under CC BY 3.0
120 kW per rack
I knew GPU compute took a lot of energy, but I didn't realize it was 120 kW per rack. That is a stupid amount.
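For scale, here's a rough back-of-envelope sketch of how an NVL72-style rack (72 GPUs plus CPUs, switches, and cooling overhead) lands near that figure. The per-component wattages are assumptions for illustration, not vendor specs:

```python
# Back-of-envelope rack power estimate (illustrative numbers, not vendor specs).
# Assumes an NVL72-style layout: 72 GPUs, 36 CPUs, plus NICs/switches/cooling.

GPU_COUNT = 72
GPU_WATTS = 1_200        # assumed per-GPU board power
CPU_COUNT = 36
CPU_WATTS = 500          # assumed per-CPU power
OVERHEAD_WATTS = 15_000  # assumed NICs, NVLink switches, fans/pumps, PSU losses

total_watts = GPU_COUNT * GPU_WATTS + CPU_COUNT * CPU_WATTS + OVERHEAD_WATTS
print(f"Estimated rack power: {total_watts / 1000:.0f} kW")  # ~119 kW
```

Even with conservative numbers, the GPUs alone account for the vast majority of the load, which is why per-rack figures have jumped so far past the ~10-20 kW racks traditional datacenters were built around.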
Nvidia? Overheating?
Some things never change... 🔥
Nvidia ~~Thermi~~ Fermi, never forget.
Oof. Combined with demand and their strong MI300 series, AMD might finally gain some meaningful market share in data centers.
Really hope AMD can compete on power efficiency in data centers too, just like they are crushing Intel with the very power-efficient X3D chips for gaming.