this post was submitted on 12 Mar 2024
32 points (100.0% liked)

Reddthat Announcements


Edited: this post is now the Lemmy.World federation issue post.

We are now ready to upgrade to Postgres 16. When this post is 30 minutes old, the maintenance will start.


Updated & pinned a comment in this thread explaining my complete investigation and ideas on how the Lemmy app will & could move forward.

[–] [email protected] 2 points 7 months ago (1 children)
[–] [email protected] 4 points 7 months ago (1 children)

Yes... it is VERY annoying. We have so many resources available and Lemmy/Postgres will not use them.

[–] [email protected] 5 points 7 months ago (1 children)

That's sad.

I've seen you posted in the Lemmy Matrix channel, hopefully you'll be able to find a way soon. I guess you already read the write-up from Db0? https://dbzer0.com/blog/post-mortem-the-massive-lemmy-world-lemmy-dbzer0-com-federation-delays/

[–] [email protected] 4 points 7 months ago (1 children)

Yes. Unfortunately the information gleaned boils down to two reasons:

  • their db was slow to respond
  • their db server ended up being 25ms away from their backend servers, which caused the slowness (a rough sketch of what that latency alone implies is below).
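
To put rough numbers on that second point, here is a back-of-the-envelope sketch. Only the 25ms figure comes from the bullet above; the rest (one DB round trip per activity, strictly sequential delivery) is assumed purely for illustration:

```python
# Back-of-the-envelope sketch (assumed values, not measured figures):
# if every activity needs at least one database round trip and activities
# are delivered strictly one at a time, network latency alone caps throughput.
rtt_s = 0.025                    # assumed 25 ms round trip to the DB
ceiling = 1 / rtt_s              # best-case activities per second
print(f"latency-bound ceiling: {ceiling:.0f} activities/s")   # ~40/s
```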

Our db server is occasionally slow to respond, but most requests from LW complete in less than 0.1 seconds. Unfortunately, there are times when they take longer, and these longer ones are (I believe) going to be the problem. As all activities are sequential, servers can only catch up as fast as they can process them.
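
As a rough illustration of why the occasional slow request matters so much when delivery is sequential, here is a toy simulation with made-up numbers (not real Reddthat measurements):

```python
# Toy model of strictly sequential delivery (illustrative numbers only).
import random

random.seed(42)
fast_s, slow_s = 0.1, 2.0    # assumed: typical request ~0.1 s, occasional ~2 s
slow_fraction = 0.05         # assumed: 5% of requests are the "longer ones"

n = 10_000
total = sum(slow_s if random.random() < slow_fraction else fast_s
            for _ in range(n))
print(f"average time per activity: {total / n:.3f} s")
print(f"effective catch-up rate:   {n / total:.1f} activities/s")
# With only fast requests the rate would be ~10/s; a 5% tail of slow
# requests roughly halves it, so a backlog drains much more slowly.
```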

What I've found in the past 30 seconds is that it is not necessarily our database that is the problem, but possibly the way Lemmy handles federation. I'm chatting with some of the admins on Reddthat and making pretty graphs while looking at walls of logs.

[–] [email protected] 2 points 7 months ago (1 children)

I see, thank you for your work!