Announcements


Announcements about system updates or other things related to this instance.

founded 1 year ago
1

Hi everyone,

We just had some unexpected downtime due to a disk quickly filling up overnight... :/

We will adjust our monitoring to alert us earlier when disk space is running low, so we can actually do something about it before it runs out.
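
For illustration, here is a minimal sketch of the kind of cron-driven disk check that could send an earlier warning. It is not our actual monitoring setup (which isn't described here); the threshold, filesystem, and mail address are placeholders.

```bash
#!/bin/sh
# Hypothetical disk-space check, run from cron: warn when the root filesystem
# passes a usage threshold. Assumes GNU `df` and a working `mail` command.
THRESHOLD=80                                   # percent used (placeholder value)
USAGE=$(df --output=pcent / | tail -n 1 | tr -dc '0-9')
if [ "$USAGE" -ge "$THRESHOLD" ]; then
    echo "Disk usage on $(hostname) is at ${USAGE}%" \
        | mail -s "Disk space warning" admin@example.com
fi
```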

Since the server was down anyway, I took the chance to install the latest OS patches and updates.

Hope you didn't notice the downtime too much!

2
Well Hello There! (lemmy.today)
submitted 5 months ago* (last edited 5 months ago) by [email protected] to c/[email protected]

Yes, I am actually an OG Star Wars nerd. I saw "A New Hope" in the theater on its opening weekend with my Dad. We've seen every SW movie in the theater together. Yes, all of them.

Anyway, I'm a new Admin here at lemmy.today, and @[email protected] asked me to introduce myself, so here goes.

As a GenXer, my 'online' experience started back in the mid-80s, dialing into BBSs on my Commodore; then it was BBSs on my custom-built 286 (Computer Shopper FTW!). By the early 90s I was running rampant on CompuServe with my Tandy 386, in the mid 90s it was AOL on my IBM Aptiva, and by the late 90s it was ISP connections on my custom-built Pentium II PCs.

Along the way I've participated in the rise, fall, and replacement of all the Operating Systems, Applications, Forums, and Aggregators that the last four decades have had to offer. (Dammit I'm old!)

Like many Lemmy users, I ~~left~~ escaped Reddit last summer when they started seriously enshittifying the site in the run-up to their IPO. I was actually on lemmy.world first but ended up here after they had too much downtime and too many defederations. I like it here; it's a fast and fairly open instance with very little drama.

Speaking of admin/mod styles, mine is "Digital Janitor", and I really try to be as no/low-drama as possible in that role. I'm here to keep this instance functional and federated, and to keep the content in line with whatever policies mrmanager or a community sets for itself. I clean up after spammers, remove objectionable or illegal content, and help with user management. That's pretty much it. I'm simply not interested in the power-tripping rot that seems to infect so many Admins/Mods.

I ended up as Admin through an offer to help mrmanager when some other instances were threatening to defederate us over spam and content problems. In the thread where it was being discussed, I offered to lend a hand, and the next thing I knew I had a red "A" next to my name! (I'm joking; they did actually ask me first, and I took a couple of days to think about it before I agreed.)

I'm around quite a bit, so if you run into something that needs attention, feel free to reach out. 🙂

3
submitted 6 months ago* (last edited 6 months ago) by [email protected] to c/[email protected]

Hi all,

We received several reports of people linking to CSAM content in the Signal groups community. We tried to contact the moderator but got no response for several days.

It's difficult to moderate this kind of content. Moderators of a Lemmy community cannot be expected to visit each posted group around the clock to make sure it doesn't contain CSAM, and Signal groups can change their content at any time.

Sorry to say, the community had to be removed. If you were using it, I hope you find some other way to discover new Signal groups.

4

We decided to double the server memory and add more CPU, since we were sometimes close to running out.

So if you notice better performance, now you know the reason. :)

Short post, but just wanted to mention it.

5

We just deployed the latest Lemmy version, 0.19.2, whose code on GitHub includes a possible fix for the outgoing federation issues we have been having.

But let's see before we celebrate. Help us test whether outgoing federation works now by making comments, posts, and upvotes and checking whether they appear on other instances.

Of course, if the other instances are on Lemmy 0.19.0 or 0.19.1, they could still have issues with their own outgoing federation until they update.

Release notes for 0.19.2: https://join-lemmy.org/news/2024-01-10_-_Lemmy_Release_v0.19.2_-_More_Federation_Fixes

6

Please try to comment and post things now, and see if they federate again.

Hopefully you see your activity federate instantly. I have tried making comments to instances running both Lemmy 0.19.1 and 0.18.5, and they all federate like they should.

Hope you have the same experience! 🥳

7
submitted 10 months ago* (last edited 10 months ago) by [email protected] to c/[email protected]

As you may have read in other threads, this version of Lemmy (0.19.1) seems to have bugs in outgoing federation on some instances.

As a temporary fix, we have added a scheduled restart of Lemmy every hour. The restart only takes a few seconds, and the big advantage is that your comments and posts are delayed at most one hour before they federate to other instances. You probably won't even notice the restart.
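
For the curious, here is a minimal sketch of how such an hourly restart can be scheduled with cron. It assumes a Docker Compose deployment where the Lemmy backend service is called `lemmy` and lives in `/srv/lemmy`; those names are illustrative, not necessarily our exact setup.

```bash
# /etc/cron.d/lemmy-restart (hypothetical file, paths and service name assumed)
# At the top of every hour, restart the Lemmy container and log the output.
0 * * * * root cd /srv/lemmy && docker compose restart lemmy >> /var/log/lemmy-restart.log 2>&1
```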

This will be in effect until a bug fix arrives from the Lemmy developers, probably sometime after New Year's.

Thanks for reading, and merry Xmas to everyone. :)

8

Today we spent some time preparing for the big upgrade by bumping the Lemmy version up to 0.18.5, merging in the latest changes from the lemmy-ansible git repository, and cleaning up some disk space on the instance.
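
For reference, that preparation plus tomorrow's upgrade boils down to roughly the following lemmy-ansible workflow. This is a sketch based on the standard layout of the lemmy-ansible repository; the inventory paths and the location of the pinned version are assumptions rather than our exact configuration.

```bash
# Hypothetical outline of an upgrade run with lemmy-ansible.
cd lemmy-ansible
git pull                                  # merge in the latest playbook changes
# Bump the pinned Lemmy version in the host/inventory vars for the server,
# then re-run the playbook against it:
ansible-playbook -i inventory/hosts lemmy.yml --become
```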

So tomorrow at 2:00 am Oregon time we will do the update to Lemmy 0.19.

That is 11:00 am CET for people in Europe.

Hopefully all goes well and we come out the other side with a nice new 0.19 version. It's supposed to take about 30 minutes of downtime if there are no issues to solve.

Wish us luck :)


IMPORTANT: You will probably need to log out and log in again to be able to post anything, since authentication was reworked in this release.


9

Hi,

We are planning to install Lemmy 0.19 soon, hopefully in the coming week or so. It's a huge release with many new features, and I personally really like that it allows users to block other instances if they want to.

You can read about all the new features in the link above.

More info coming in a few days about planned downtime and so on. :)

10

We just had another unscheduled downtime, due to a Linux kernel bug.

Yesterday we noticed some issues with the server, mainly that we couldn't stop some Docker containers. As you know, we are running some extra user web interfaces for Lemmy, and we noticed that they started acting weirdly, with one CPU constantly running at 100%.

I wanted to restart those containers but couldn't stop them. I found some posts online indicating that this is a bug in the Ubuntu Linux kernel: https://github.com/moby/moby/issues/43094.
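
For anyone debugging the same thing, the symptom looks roughly like this; these are illustrative commands, not a transcript from the incident.

```bash
# Hypothetical reproduction of the symptom described above.
docker ps --filter name=lemmy     # container still shows as running
docker stop <container_id>        # the stop request hangs and never completes
dmesg | tail -n 50                # check the kernel log for related errors
# With the container runtime wedged by the kernel bug, a reboot (plus the
# pending kernel updates) ends up being the only clean way out.
```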

Our stop attempts caused the Docker platform to behave weirdly, and it started to affect the main Lemmy software, so we rebooted the server and installed the latest updates.

We are very sorry for this unscheduled downtime. :/ Did you notice any weirdness with Lemmy in the last 8 hours or so?

11

Hi guys!

This weekend we will move lemmy.today over to using object storage for images. We will be serving images from Amazon S3 in the Oregon region (western USA).

The way the Lemmy software is designed right now, it caches every image federated from other instances. So even though we are a small instance, we still have to store a lot of federated images locally on our disk. This leads to disk space running out quickly, and we have previously had to delete images because of this.

When we delete images, it removes not only those cached images but also user profile icons and banners, as well as community icons and banners. This is why we have some missing images under Communities right now, and also why users have lost their profile pics.

It's been very embarrassing to have to do this, so now we will move to object storage to prevent it from happening in the future. It's much cheaper than ordinary disk space and gives better performance for users, so it's a win-win. We just need to do a one-time migration over to it.
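
For the technically curious, the move essentially means pointing pict-rs (the image service Lemmy uses) at an S3 bucket instead of the local filesystem. The sketch below shows the kind of settings involved; the variable names assume a pict-rs 0.4-style configuration and the values are placeholders, not our real bucket or credentials.

```bash
# Hypothetical pict-rs object-storage settings, in environment-variable form.
PICTRS__STORE__TYPE=object_storage
PICTRS__STORE__ENDPOINT=https://s3.us-west-2.amazonaws.com
PICTRS__STORE__REGION=us-west-2
PICTRS__STORE__BUCKET_NAME=example-lemmy-images
PICTRS__STORE__ACCESS_KEY=<access-key>
PICTRS__STORE__SECRET_KEY=<secret-key>
```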

**Estimated downtime hours:**

Oregon time: Sunday 3 am - 6 am

CET: Sunday 12 pm to 3 pm.

If you have any questions, you know what to do. :)


EDIT: Looks like it went well, and images are now served from S3 instead of filling up our disks. :) The image URLs still look like they are served by the instance, but that's by design, apparently; in the background, the files are fetched from S3.

Please feel free to re-upload any banners, user avatars, or community pictures you had in place before that may have been broken by the earlier disk cleaning.

  • When you do, you have to create a new picture with a new name for Lemmy to actually replace the image. Otherwise it won't work - I've tried it myself. :)

12

Hi everyone,

As part of cleaning out old cached images when the disk went full, it seems that images like your profile picture and banner (if you had those) also got deleted.

If you don't mind, would you upload those again? When you do, you can't upload the same picture. I tried uploading the same picture myself, but it needs to be a new picture (not even a rename of the pic works).

Next time I will make a DB query to figure out which pics are local and which are not, and delete only the remote ones (cached images from other instances). There is a column in the DB for that, so I just need to export a list of remote images and then delete only those.
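
Something along the lines of the sketch below; since the exact schema isn't quoted here, the `image_upload` table and `pictrs_alias` column names are assumptions for illustration, and it approaches the goal from the other direction (export what is known to be a local upload, and treat everything else as deletable cache).

```bash
# Hypothetical sketch: export the pict-rs aliases of locally uploaded images
# (avatars, banners, user uploads) from the Lemmy database, so that cached
# files NOT on this list (images federated from other instances) can be
# removed while local images are kept. Table/column names are assumptions.
psql -d lemmy -c "COPY (SELECT pictrs_alias FROM image_upload) TO STDOUT" \
    > /tmp/local_image_aliases.txt
```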

Despite these growing pains, I think Lemmy is still pretty awesome, and there will be tools to make these sorts of issues go away in the future. I hear they are already working on something for the next version, so we will see.

Anyway, enjoy the weekend and once again, sorry for the mess around this issue.

13

Hi all,

The disk on the instance ran out of space today, due to the way the Lemmy software caches images from all other instances. That cache had filled up about 60 GB of disk, despite us being a small instance with very little local activity.

I had to delete the last 10 days of cached images again, and I plan to delete quite a lot of older cached images as well. The mobile apps seem to be unaffected (they have a local image cache, I believe), but on the website this leads to missing thumbnail images.

They are working on a fix for this in the Lemmy software so that disks don't fill up so enormously with cached thumbnails, and as soon as it's out, we will install it here.

Hope you didn't get too annoyed or sad about the instance being unavailable for a while.

14
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]

Some bots posted lots of illegal pictures in the https://lemmy.world/c/lemmyshitpost community, and because of federation, those pictures have spread to all instances, including this one.

The Lemmy software doesn't have good moderation tools for abuse like this, and the quickest way to get rid of the images was to delete all cached images from the last couple of days.
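
In practice that cleanup amounts to removing recently written files from the pict-rs media volume. A sketch is below; the volume path and the two-day window are illustrative assumptions, not the exact command that was run.

```bash
# Hypothetical cleanup: delete cached media files written in the last 2 days.
# The actual path depends on how pict-rs is deployed on the server.
find /srv/lemmy/volumes/pictrs/files -type f -mtime -2 -print -delete
```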

You may see some thumbnail images missing in the web interface, but I personally don't see any missing images in my mobile app; I guess it has its own thumbnail cache.

15
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]

These ones have been added:

People like them on lemmy.world, so I didn't want them to be missing here. :)

16

I noticed that the web interface sometimes didn't show all pictures when doing a full reload of the front page. This has now been fixed. It was related to some custom security settings I added last week, and I didn't notice the problem since I use mobile apps myself. For everyone who uses the web interface a lot, this must have been annoying.

17
Lemmy themes! (lemmy.today)
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]

I added some themes for people who use the web interface. Some are pretty nice, I think:

Modern Light

Hanubeki Cold

Hanubeki Mint Alt Lt

And others.

How to use

  • After you switch to a theme and save your settings, it's really important to reload while bypassing your browser cache, otherwise the theme will look wonky.

  • Do this by holding Shift and clicking the "Reload current page" button in your browser (or press Ctrl-Shift-R if you are on Firefox).

18
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]

A new version of Lemmy was just released by the devs. :)

I plan to wait a few days before I upgrade the instance, just to make sure no weirdness gets discovered. It's a small bug-fix release, so nothing major.

Link to the announcement: https://lemmy.ml/post/3021118

19
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]

Hi guys,

Lemmy.today was just updated to 0.18.3, which came out yesterday.

Major changes

This version brings major optimizations to the database queries, which significantly reduce CPU usage. There is also a change to the way federation activities are stored, which reduces database size by around 80%. Special thanks to @phiresky for their work on DB optimizations.

The federation code now includes a check for dead instances, which is used when sending activities. This helps reduce the number of outgoing POST requests and also reduces server load.

In terms of security, Lemmy now performs HTML sanitization on all messages that are submitted through the API or received via federation. Together with the tightened Content-Security-Policy from 0.18.2, cross-site scripting attacks are now much more difficult.

Other than that, there are numerous bug fixes and minor enhancements.