this post was submitted on 12 Jun 2023
386 points (99.7% liked)
Lemmy.World Announcements
Ensuring there's no data leakage in those cached calls can be tricky, especially if any API calls return anything sensitive (login tokens, authentication information, etc.), but I can see caching all read-only endpoints that return the same data regardless of permissions for a second or two being helpful for the larger servers.
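To make the leakage concern concrete, here is a minimal sketch (not Lemmy's actual code; the paths and class are invented for illustration) of a short-TTL cache that only serves anonymous, permission-independent requests and deliberately bypasses anything carrying auth:

```python
import time

class MicroCache:
    """Short-TTL cache for read-only endpoints whose responses
    don't depend on the caller's permissions."""

    def __init__(self, ttl=2.0):
        self.ttl = ttl    # seconds to keep a cached response
        self.store = {}   # path -> (expiry_time, response)

    def get(self, path, has_auth=False):
        if has_auth:
            return None   # never serve cached data to authed requests
        entry = self.store.get(path)
        if entry and entry[0] > time.monotonic():
            return entry[1]
        return None

    def put(self, path, response, has_auth=False):
        if has_auth:
            return        # never cache permission-dependent responses
        self.store[path] = (time.monotonic() + self.ttl, response)

cache = MicroCache(ttl=2.0)
cache.put("/api/v3/post/list", {"posts": []})
print(cache.get("/api/v3/post/list"))                 # cached within TTL
print(cache.get("/api/v3/post/list", has_auth=True))  # None: auth bypasses
```

Keying only on the path and refusing authenticated traffic is what keeps tokens and per-user data out of the shared cache.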
It's also worth noting that Postgres does its own query-level caching, quite aggressively too. I've worked in some places where we had to add a `SELECT RANDOM()` to a query to ensure it was pulling the latest data.

In my experience, the best benefits gained from caching come from layers in front of the backend that store responses in RAM, so the query never even reaches those services at all. I've used Varnish for this (which is also what the big CDN providers use). In Lemmy, I imagine that would be the nginx proxy that sits in front of the backend.
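That kind of in-RAM, before-the-backend caching could be sketched as an nginx micro-cache. This is a hypothetical config fragment, not anything Lemmy ships; the zone name, upstream name, and the `jwt` cookie check are illustrative assumptions:

```nginx
# Hypothetical micro-cache for an nginx proxy in front of the backend.
proxy_cache_path /var/cache/nginx keys_zone=microcache:10m max_size=100m;

server {
    location /api/ {
        proxy_cache microcache;
        proxy_cache_valid 200 2s;        # cache good responses very briefly
        proxy_cache_use_stale updating;  # serve stale while one refresh runs
        proxy_cache_lock on;             # collapse concurrent cache misses

        # Bypass and never store responses for authenticated requests.
        proxy_cache_bypass $http_authorization $cookie_jwt;
        proxy_no_cache     $http_authorization $cookie_jwt;

        proxy_pass http://lemmy_backend;
    }
}
```

Even a 2-second TTL can absorb a thundering herd on hot endpoints like the front-page post list, while authenticated traffic always goes straight through.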
I haven't heard admins discussing web-proxy caching, which may have something to do with the fact that the Lemmy API is currently pretty much entirely over websockets. I'm not an expert in websockets, and I don't want to say that websocket API responses absolutely can't be cached... but it's not like caching a RESTful API. They are working on moving away from websockets, btw... but it's not there yet.
The comments from Lemmy devs in https://github.com/LemmyNet/lemmy/issues/2877 make me think that there's a lot of database query optimization low-hanging fruit to be had, and that admins are frequently focusing on app configs like worker counts and db configs to maximize the effectiveness of db-level caches, indexes, and other optimizations.
Which isn't to say there aren't gains in the direction you're suggesting, but I haven't seen evidence that anyone's secret sauce is in effective web-proxy caches.
I work on nginx cache modules for a CDN provider.
While websockets can be proxied, they're impractical to cache. There are no turnkey solutions for this that I'm aware of, but an interesting approach might be to build something on top of NChan with some custom logic in ngx_lua.
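Purely as a speculative sketch of that idea (the locations and channel naming are invented, and the glue logic in ngx_lua is left out), NChan's pub/sub endpoints look roughly like this:

```nginx
# Speculative only: NChan pub/sub endpoints that custom ngx_lua logic
# could publish cached payloads into. Paths/channel ids are illustrative.
location ~ ^/sub/([\w-]+)$ {
    nchan_subscriber websocket;   # clients hold a websocket subscription
    nchan_channel_id $1;
}
location ~ ^/pub/([\w-]+)$ {
    nchan_publisher;              # backend (or lua) pushes updates here
    nchan_channel_id $1;
}
```

The point being that instead of caching websocket responses, you fan one backend-generated message out to many subscribers.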
I agree with you that web proxy caches aren't the silver bullet solution. They need to be part of a more holistic approach, which should start with optimizing the database queries.
Caching with auth is possible, but it's a whole can of worms that should be a last resort, not a first one.