this post was submitted on 17 Jun 2023
1084 points (98.9% liked)

CEO Steve Huffman says tech giants should not be able to trawl Reddit’s huge store of data for free. But that information came from users, not the company

That “corpus of data” is the content posted by millions of Reddit users over the decades. It is a fascinating and valuable record of what they were thinking and obsessing about. Not the tiniest fraction of it was created by Huffman, his fellow executives or shareholders. It can only be seen as belonging to them because of whatever skewed “consent” agreement its credulous users felt obliged to click on before they could use the service.

Ouch

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

That's how they did it. They put a 10-requests-per-minute limit on bots and a higher OAuth limit (100 per minute) for individual users. Large user-facing client apps could have converted to that system somewhat easily, but due to the time constraint they didn't. I do think they extorted their third-party devs, sure, but honestly the individual user limit isn't super unreasonable as long as you aren't liking or disliking every post. The search API returns up to 100 posts per request; it was more the no-NSFW and no-advertising restrictions they put on it that sucked.

edit: it's actually 10 or 100 per minute, not per hour
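As an illustration of what staying under those tiers looks like on the client side, here's a minimal sketch in Python. `MinuteThrottle` is a hypothetical helper, not part of Reddit's API or any SDK; the 100-per-minute figure is just the OAuth tier mentioned above.

```python
import time
from collections import deque

class MinuteThrottle:
    """Hypothetical helper: keep a rolling one-minute window of request
    timestamps and sleep whenever the chosen ceiling would be exceeded."""

    def __init__(self, max_per_minute: int):
        self.max_per_minute = max_per_minute
        self.sent = deque()

    def wait(self):
        now = time.monotonic()
        # Drop timestamps older than 60 seconds from the window.
        while self.sent and now - self.sent[0] >= 60:
            self.sent.popleft()
        if len(self.sent) >= self.max_per_minute:
            # Sleep until the oldest request in the window ages out.
            time.sleep(60 - (now - self.sent[0]))
        self.sent.append(time.monotonic())

# 10 QPM without OAuth, 100 QPM with an OAuth bearer token.
throttle = MinuteThrottle(max_per_minute=100)
for _ in range(3):
    throttle.wait()
    # ... one API request would go here ...
```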

[–] [email protected] 1 points 1 year ago (1 children)

It's not that simple, because the third-party apps ship with a single API key. So when I used Relay for Reddit, I used the same API key as everyone else on that app. You could create an app and have everyone make their own key, but that is just asking for trouble: definitely too technical for most people, and you would probably need to put in billing info in case you go above the free-tier call limit.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Yeah, but if you're going to use the OAuth 2 method, you don't use the same API key as everyone else. How it works is: you authorize your account with the app, Reddit gives you a bearer token, and that token is what the rate limit is applied to. The app's client token is not used in that process; the OAuth 2 bearer token is.

This is taken from the Reddit API docs: As of July 1, 2023, we will enforce two different rate limits for those eligible for free access usage of our Data API. The limits are:

If you are using OAuth for authentication: 100 queries per minute (QPM) per OAuth client ID
If you are not using OAuth for authentication: 10 QPM

So apparently I undershot it; it's actually 100 requests per minute, not per hour like I originally thought.
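For anyone curious what that token flow looks like in practice, here's a minimal sketch in Python using the `requests` library and Reddit's password grant for a personal script app. The client ID, secret, username, password, and user agent are placeholders; the rate-limit headers read at the end are the ones Reddit normally returns, checked on a best-effort basis.

```python
import requests

CLIENT_ID = "your-app-client-id"      # placeholder
CLIENT_SECRET = "your-app-secret"     # placeholder
USERNAME = "your-reddit-username"     # placeholder
PASSWORD = "your-reddit-password"     # placeholder
USER_AGENT = "rate-limit-demo/0.1 by u/your-username"  # placeholder

# Step 1: exchange the user's credentials for an OAuth 2 bearer token.
token_resp = requests.post(
    "https://www.reddit.com/api/v1/access_token",
    auth=requests.auth.HTTPBasicAuth(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "password", "username": USERNAME, "password": PASSWORD},
    headers={"User-Agent": USER_AGENT},
)
token = token_resp.json()["access_token"]

# Step 2: call the Data API with the bearer token; the rate limit is
# tracked against this token, not the app key shared by every user.
resp = requests.get(
    "https://oauth.reddit.com/api/v1/me",
    headers={"Authorization": f"bearer {token}", "User-Agent": USER_AGENT},
)

# Reddit reports remaining quota in response headers.
print(resp.headers.get("X-Ratelimit-Remaining"))
print(resp.headers.get("X-Ratelimit-Reset"))
```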