this post was submitted on 30 Jan 2024
3 points (53.5% liked)


Currently, AI models are trained using GPUs. In the future, though, generative AI will probably require its own specialized ASICs to achieve the best performance. The same shift happened with Bitcoin mining a few years ago, and it's also why big tech companies are making their own CPUs now.

Since there are only a few companies on the planet capable of producing these chips in bulk, the government could easily place restrictions on the purchase of AI hardware. This would control who has access to the best AI.

Only the government and a few permitted parties would have access to the best AI. Everyone else would use weaker AI that, while still good enough for most people, could be detected by the government. The government could use its superior models to detect, for example, whether a post is AI-generated, and provide that insight as a service to citizens.
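
For illustration, such a "detection as a service" could be exposed as a simple API that citizens query. Everything in the sketch below is hypothetical: the endpoint URL, request schema, and response fields are assumptions, since no such service exists.

```python
import requests

# Hypothetical endpoint for the proposed government-run detection service;
# the URL, request schema, and response fields are illustrative only.
VERIFY_URL = "https://verify.example.gov/api/v1/detect"

def check_ai_generated(text: str, api_key: str) -> dict:
    """Submit a piece of text and return the service's verdict."""
    response = requests.post(
        VERIFY_URL,
        json={"text": text},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    response.raise_for_status()
    # Assumed response shape: {"ai_generated": bool, "confidence": float}
    return response.json()

if __name__ == "__main__":
    verdict = check_ai_generated("Example post pulled from a social feed.", api_key="...")
    print(f"AI-generated: {verdict['ai_generated']} (confidence {verdict['confidence']:.2f})")
```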

Effectively, the government becomes the sole purveyor of truth, as opposed to that power being in the hands of whoever can afford the biggest computer.

[–] [email protected] 7 points 7 months ago

Based on your replies, it doesn't seem like you want a discussion of the idea so much as you want people to tell you what a good idea it is.

Truth is, it's not. It's a half-thought-out idea that can't work. ASICs aren't so exotic that only a handful of chip manufacturers can make them. Established companies can move quickly because they already have the infrastructure and processes in place, but other chip manufacturers can enter the space.

This assumes there is no black market or secondary market for ASICs.

This assumes that one government's restrictions would be effective when chip makers exist in more countries than the one imposing them.

Restricting hardware also assumes that today's hardware (or tomorrow's ASICs) will remain the technology AI runs on. It also hampers R&D on this type of hardware.

It creates a barrier to entry for startups and smaller businesses that may use generative AI in positive ways.

It implies that the use of generative AI is inherently dangerous and needs to be regulated.

It assumes that consumer hardware wouldn't be able to match ASICs. ASICs are certainly fast, but enough consumer GPUs working together can match the throughput of a single ASIC, as the rough numbers below show.
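
As a back-of-the-envelope illustration (all throughput and efficiency figures below are assumed, not measured from any real product), aggregate consumer throughput scales with the number of cards:

```python
# Back-of-the-envelope comparison: how many consumer GPUs roughly match one
# dedicated AI ASIC? All figures are illustrative assumptions, not benchmarks.
import math

ASIC_TFLOPS = 1000.0        # assumed throughput of a hypothetical AI ASIC
CONSUMER_GPU_TFLOPS = 80.0  # assumed throughput of one high-end consumer GPU
SCALING_EFFICIENCY = 0.7    # assumed loss from interconnect/communication overhead

effective_gpu_tflops = CONSUMER_GPU_TFLOPS * SCALING_EFFICIENCY
gpus_needed = math.ceil(ASIC_TFLOPS / effective_gpu_tflops)

print(f"Roughly {gpus_needed} consumer GPUs match one ASIC under these assumptions.")
```

The exact numbers don't matter; the point is that the gap between ASICs and consumer GPUs is one of cost and efficiency, not a hard capability limit.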

It assumes the government is good, truthful, effective, honest, and moral.

It assumes that truth is a black-and-white construct.

It assumes that there will be a process to check, identify, communicate, and regulate AI-generated information.