Looks like it works.

Edit: still seeing some performance issues. Needs more troubleshooting.

Update: Registrations have been re-opened. We encountered a bug where people could not log in (see https://github.com/LemmyNet/lemmy/issues/3422#issuecomment-1616112264). As a workaround, we re-opened registrations.

Thanks

First of all, I would like to thank the Lemmy.world team and the two admins of other servers, @[email protected] and @[email protected], for their help! We did some thorough troubleshooting to get this working!

The upgrade

The upgrade itself isn't too hard: create a backup, change the image names in docker-compose.yml, and restart.
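Concretely, that's just a tag bump. A minimal sketch, assuming the default `dessalines/lemmy` images and service names (the exact rc tag shown is illustrative):

```yaml
# docker-compose.yml (excerpt): point both services at the new release,
# then recreate the containers with `docker-compose up -d`.
services:
  lemmy:
    image: dessalines/lemmy:0.18.1-rc.4      # exact rc tag is an assumption
    restart: always
  lemmy-ui:
    image: dessalines/lemmy-ui:0.18.1-rc.4
    restart: always
```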

But, just like on the first two tries, after a few minutes the site started getting slower until it stopped responding entirely. Then the troubleshooting started.

The solutions

What I had noticed previously is that the lemmy container would reach around 1500% CPU usage, and above that the site got slow. That is weird, because the server has 64 threads, so 6400% should be the maximum. So we tried what @[email protected] had suggested before: we created extra lemmy containers (and extra lemmy-ui containers) to spread the load, and used nginx to load-balance between them.
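A minimal sketch of that load-balancing idea; the container names, balancing method, and routing are assumptions rather than our exact config (8536 is Lemmy's default API port):

```nginx
# nginx.conf (excerpt): fan requests out over several identical
# lemmy backend containers instead of a single one.
upstream lemmy-backend {
    least_conn;              # prefer the backend with the fewest open connections
    server lemmy-1:8536;
    server lemmy-2:8536;
    server lemmy-3:8536;
}

server {
    listen 80;
    location / {
        proxy_pass http://lemmy-backend;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```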

Et voilà. That seems to work.

Also, as he suggested, we start those lemmy containers with the scheduler disabled, and run one extra lemmy container with the scheduler enabled that is not used for anything else.
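Roughly, the compose file then looks like this. Note that the `--disable-scheduled-tasks` flag is an assumption about how the scheduler is toggled; adjust it to whatever mechanism your Lemmy build actually provides:

```yaml
# docker-compose.yml (excerpt): load-balanced workers run without the
# scheduler; one dedicated instance runs it and stays out of the
# nginx upstream pool.
services:
  lemmy-1:
    image: dessalines/lemmy:0.18.1-rc.4
    command: lemmy_server --disable-scheduled-tasks   # flag name is an assumption
  lemmy-2:
    image: dessalines/lemmy:0.18.1-rc.4
    command: lemmy_server --disable-scheduled-tasks
  lemmy-scheduler:
    image: dessalines/lemmy:0.18.1-rc.4               # scheduler enabled (default)
```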

There is still room for improvement, and probably new bugs, but we're very happy that lemmy.world is now on 0.18.1-rc, which fixes a lot of bugs.

[–] [email protected] 1 points 1 year ago (2 children)

It has nothing to do with cookies (and my advice is to NOT clear your cookies if you have a working session). The login form is (was?) broken: the API endpoint for the login kept returning a 404 status code.
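(For anyone who wants to verify this themselves, here is a quick status check against the standard Lemmy v3 login route; the credentials are deliberately fake placeholders:)

```sh
# Print only the HTTP status of the login endpoint.
# 404 means the route itself is broken; a healthy endpoint returns
# 200 on success or a 4xx error body for bad credentials.
curl -s -o /dev/null -w "%{http_code}\n" \
  -X POST "https://lemmy.world/api/v3/user/login" \
  -H "Content-Type: application/json" \
  -d '{"username_or_email": "someuser", "password": "not-my-password"}'
```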

[–] [email protected] 1 points 1 year ago (1 children)

Mh.. the login endpoint seems to work

Could you please share some more information?

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

> Mh… the login endpoint seems to work

Yes, but only now; shortly before I typed that comment, the endpoint consistently returned a 404 error.

> Could you please share some more information?

It was a server-side error. Many users confirmed it multiple times in the comments, and it was a bit surprising that none of those reports were acknowledged: it had nothing to do with browser cache or cookies, the login itself was broken. Ruud finally acknowledged that it's a bug related to closing registrations; apparently closing them prevents users from logging in.

[–] [email protected] 3 points 1 year ago

I'm sorry, you are absolutely right!
He was working on a different issue, and I honestly didn't give those reports much attention because I was investigating the cookie issue.

The issue regarding the API endpoint could be "fixed" by re-enabling registrations.
The still-existing cookie issue has been reported (https://github.com/LemmyNet/lemmy-ui/issues/1740), and @[email protected] has even fixed it already (https://github.com/LemmyNet/lemmy-ui/pull/1741).
We're now just waiting for it to get merged.

[–] [email protected] 1 points 1 year ago

But the cookies are set just for the path