this post was submitted on 14 Jun 2023

Stable Diffusion

While the motives may be noble (regulating surveillance), models like Stable Diffusion could get caught in the regulatory crossfire, to the point where using the original models becomes illegal and new models are neutered until they are useless. Furthermore, this might make it impossible for individuals or smaller startups to train open source models (maybe even LoRAs). Adobe and the other large corporations would rule the market.

[–] [email protected] 1 points 1 year ago

Corporations passing their products off as "technically open source" would also be a problem. Google controls much of the web through open source Chrome, and Microsoft controls dev IDEs through open source VS Code. I'm sure OpenAI would find more ways to pretend to be "open" again if that were more profitable than claiming they have scary monster AIs they can't release publicly.

Open source exceptions would have to go hand in hand with somehow breaking these corporations up and setting limits on their activities.