Lemmy.World Announcements


This Community is intended for posts about the Lemmy.world server by the admins.

Follow us for server news 🐘

Outages 🔥

https://status.lemmy.world/

For support with issues at Lemmy.world, go to the Lemmy.world Support community.

Support e-mail

Any support requests are best sent to the info@lemmy.world e-mail address.

Report contact

Donations 💗

If you would like to make a donation to support the cost of running this platform, please do so at the following donation URLs.

If you can, please use or switch to Ko-Fi, as it has the lowest fees for us.

Ko-Fi (Donate)

Bunq (Donate)

Open Collective backers and sponsors

Patreon

Join the team

founded 2 years ago
1
8
submitted 2 weeks ago* (last edited 2 weeks ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world
 
 

Hello world,

we've had various smaller updates lately that didn't each warrant their own announcement post, and since we don't want to post too many announcements around the same time, we've collected them into a single larger post.

New alternative Lemmy frontend: Tesseract

We have recently added a new alternative Lemmy frontend to our collection: Tesseract.

Tesseract was forked from Photon a while back and includes a variety of additional customization options and moderation utilities.

It is available at https://t.lemmy.world/.

Lemmy-UI update to 0.19.11-ish

We have deployed a custom build of Lemmy-UI, the default Lemmy web interface, which includes most of the features included in the official 0.19.11 release.

We haven't updated our backend to a newer version yet, as we still have to find a solution for dealing with the newly integrated functionality that sends emails on rejected registration applications. However, all the frontend features that don't require a backend update have been included. The only part currently missing is Lemmy's new donation dialog, as this requires a backend upgrade as well.

You can find the list of changes in the frontend section in the announcement for the 0.19.11 release.

Defederation from lemmy.one and r.nf

A while back, a Lemmy.World user informed us that an instance we are federated with was hosting highly illegal content. This content was the result of an attack more than a year ago, and it federated to many other instances, which made local copies of the material. Unfortunately, when the material was taken down at the source, that removal did not federate to all linked instances, meaning that some instances are still showing this material.

Once we were made aware of this, we realized that this was likely not the only occurrence, so we started looking for other instances where this content may also still exist. We have identified more than 50 affected instances and already reached out to many of them to inform them about this content to have it taken down.

Among these instances, r.nf and lemmy.one were some of the first to be informed, but even 2 months after the initial report there has been zero reaction from either instance. Neither instance appears to be moderated, as is also evident from posts asking whether the instance is still maintained on lemmy.one and from 2 month old spam in r.nf's main community.

The community that gets hit the hardest by this is !privacyguides@lemmy.one, which is the only larger community across these instances. We recommend looking for alternative communities on other instances.

Due to the lack of action and response we have since also reported this directly to their hosting providers through Cloudflare, which includes an automatic report to NCMEC.

Even if this material gets taken down now, we don't currently believe that the instance operators are willing or able to moderate these instances properly, so we will keep them defederated unless they can convince us that they are going to moderate their instances more actively and ensure that they provide usable abuse contacts that don't require going through their hosting provider.

We also defederate from other instances from time to time due to lack of moderation and unreachable admins, among other reasons. If you're interested in the reasons for our defederations, we aim to always document them on Fediseer. Be warned though, as this list contains mentions of or references to various disturbing or illegal material.

Most of those instances are either very small, don't interact with Lemmy much anyway, or are explicitly stating support of content that is incompatible with our policies.

We also usually try to reach out to affected instances prior to defederation if we believe that they may not intentionally be supporting the problematic content.

We have temporarily re-federated with lemmy.one to allow this post and https://lemmy.world/post/28173100 to federate to them. We originally defederated a day ago, and we're waiting for federation to catch up with the activities since then before we defederate again.

Reliability of media uploads

We have recently been receiving some reports of media uploads not working from time to time. We have already addressed one of the underlying issues and are working on addressing another one currently. Please continue to let us know about issues like that to ensure that they're on our radar.

We're also currently working on improving our overall application monitoring, to collect more useful information that helps us track down specific issues, improves visibility for errors, and hopefully allows us to identify potential performance issues.

Parallel federation

Back in Lemmy 0.19.6, Lemmy introduced the option to send federated activities in parallel. Without this, Lemmy would only ever have one activity in the process of being transmitted to another instance at a time. While most instances don't have a large number of activities going out, we're at the point where instances far away from us are no longer able to keep up with our traffic, simply due to the physical limits of waiting for data from the other side of the world.

While this was not yet implemented in Lemmy and deployed on our end, some instances mitigated the problem by setting up an external federation queue near our instance that batched activities together. Unfortunately, this also meant having to maintain an additional server, which means a time investment, a few bucks to pay every month, and another potential component that could break.

We enabled 2 parallel sends around a week ago, and aussie.zone, who had been lagging behind by multiple days pretty much constantly, have finally caught up with us again. We will continue to monitor this and, if needed, increase the number of parallel sends further in the future, but so far it looks like we should be fine with 2 for a good while.
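
As a rough back-of-the-envelope illustration of why even a small number of parallel sends makes such a difference, here is a short sketch; the round-trip time and activity volume below are illustrative assumptions, not measured values:

```python
# Rough illustration of why sending one federation activity at a time
# can't keep up over long distances. All numbers are assumptions.

round_trip_seconds = 0.3      # assumed ~300 ms round trip to the other side of the world
activities_per_day = 500_000  # assumed outgoing activity volume of a large instance

# With one activity in flight at a time, each send costs at least one round trip.
sequential_ceiling = 24 * 60 * 60 / round_trip_seconds
print(f"Sequential ceiling: ~{sequential_ceiling:,.0f} activities/day")

# With N parallel sends, the ceiling scales roughly linearly.
for parallel_sends in (1, 2, 4):
    ceiling = sequential_ceiling * parallel_sends
    print(f"{parallel_sends} parallel send(s): ~{ceiling:,.0f}/day "
          f"-> keeps up: {ceiling >= activities_per_day}")
```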


edit: added section about parallel federation

2
 
 

Hello world,

as many of you may already be aware, there is an ongoing spam attack by a person claiming to be Nicole.

It is very likely that these images are part of a larger scale harassment campaign against the person depicted in the images shared as part of this spam.

Although the spammer claims to be the person in the picture, we strongly believe that this is not the case and that they're only trying to frame them.

Starting immediately, we will remove any images depicting "Nicole" and information that may lead to identifying the real person depicted in those images to prevent any possible harassment.
This includes older posts and comments once identified.

We also expect moderators to take action if such content is reported.

While we do not intend to punish people who post this once without being aware of the context, we may take additional action if they continue to post this content, as we consider that to be supporting the harassment campaign.

Discussion that does not include the images themselves or references that may lead to identifying the real person behind the image will continue to be allowed.

If you receive spam PMs, please continue reporting them, and we'll continue working on our spam detection to identify these messages early, before they reach many users.

3
59
submitted 3 weeks ago* (last edited 3 weeks ago) by Thekingoflorda@lemmy.world to c/lemmyworld@lemmy.world
 
 

Hello world,

Today we are starting an April Fools charity event called Lemmy Silver! It is accessible to all users of Lemmy with an account made before today (non-world users too!).

From now on, every 24 hours, you can comment !lemmysilver under any post in participating communities (see this post for more information), or send a PM to this bot account with (!lemmysilver [username]) to award the poster points. We will keep a score based on the number of votes you send and receive. More details can be found in this post.

In this post we will keep a live leaderboard. At the end of April the users with the highest score will get these prizes:

  1. €150 to a charity of choice
  2. €100 to a charity of choice
  3. €75 to a charity of choice
  4. €50 to a charity of choice
  5. €25 to a charity of choice

If you are a moderator and want to add your community to the whitelist, type !whitelistsilver in the comments or in a PM to the bot account to whitelist all the communities you moderate, or send a PM to me (Thekingoflorda).

The prize fund is made up of personal donations from members of the admin team and the moderation team; no money from the Fedihosting Foundation is used for this event. If you want to contribute to the prize fund, please send me a PM.

We also made a little survey for this event, so if you have 2 minutes, please fill it out: https://app.formbricks.com/s/cm8x96xjc022vvt01xr0z9tdp

Feel free to leave any feedback here in the comments, in !LemmySilver@lemmy.world, or by sending me a PM. PMs sent to the bot will probably not be read. Please be honest but respectful, and keep the ToS in mind.

Thank you everyone! Hope you have fun.

PS. The bot might be a bit unstable in the beginning, so please send all your complaints my way so I can fix the inevitable bugs.

4
 
 

Hello,

we will be updating pict-rs to the latest version in about 2 hours.

We expect a short downtime of 1-2 minutes during the planned migration window, as there are no major database changes involved.

Most users won't be affected by this, as the majority of our media is cached and served by Cloudflare. This should primarily only affect thumbnail generation and uploads of new media while the service is down.

You can convert this to your local time here: https://inmytime.zone/?iso=2025-03-28T22%3A00%3A00.000Z
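
If you prefer to do the conversion locally instead, a minimal Python snippet like the following (standard library only, using the timestamp from the link above) does the same thing:

```python
# Convert the announced UTC maintenance time to your local time zone.
from datetime import datetime, timezone

utc_time = datetime(2025, 3, 28, 22, 0, tzinfo=timezone.utc)  # 2025-03-28T22:00:00Z
local_time = utc_time.astimezone()  # converts to the system's local time zone

print("UTC:  ", utc_time.isoformat())
print("Local:", local_time.isoformat())
```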


The update has been completed successfully.

5
 
 

Hello,

as some of you may have noticed, we just had about 25 minutes of downtime due to the update to Lemmy 0.19.10.

Lemmy release notes: https://join-lemmy.org/news/2025-03-19_-_Lemmy_Release_v0.19.10_and_Developer_AMA

This won't fix YouTube thumbnails for us, as YouTube banned all IPs belonging to our hosting provider.

We intended to apply this update without downtime, as we're looking to apply the database migration that allows marking PMs as removed, in response to the recent spam waves.

Although this update contains database migrations, we expected to still be able to apply the migration in the background before updating the running software, as the database schema between the versions was backwards compatible. Unfortunately, once we started the migrations, we started seeing the site go down.

In the first few minutes we assumed that the migrations contained in this upgrade were somehow blocking more than intended but still making progress, but it turned out that nothing was actually happening on the database side. Our database had deadlocked due to what appears to be an orphaned transaction, which didn't die even after we killed all Lemmy containers other than the one running the migrations.

While the orphaned transaction was pending, the schema migration was waiting for it to complete or be rolled back, so nothing was moving anymore. As the orphaned transaction also wasn't making any progress, everything started to die. We're not entirely sure why the original transaction broke down, as it was started about 30 seconds before the schema migration query, and the two running at the same time shouldn't have broken it.
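
For anyone curious how a stuck migration like this can be diagnosed, a query against pg_stat_activity along the lines of the sketch below shows which backends are waiting and which transaction is blocking them. The connection details are placeholders, and this is an illustrative sketch rather than the exact procedure we followed:

```python
# Sketch: list PostgreSQL backends together with the PIDs blocking them.
# Connection parameters are placeholders for illustration only.
import psycopg2

conn = psycopg2.connect("dbname=lemmy user=lemmy host=localhost")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT pid,
               state,
               wait_event_type,
               now() - xact_start AS xact_age,
               pg_blocking_pids(pid) AS blocked_by,
               left(query, 80) AS query
        FROM pg_stat_activity
        WHERE datname = current_database()
        ORDER BY xact_start NULLS LAST;
    """)
    for row in cur.fetchall():
        print(row)
```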

Lemmy has a "replaceable" schema, which is applied separately from the regular database schema migrations and runs every time a DB migration occurs. We unfortunately did not consider this replaceable schema in our planning, as we would otherwise have realized that it would likely have a larger impact on the overall migration.

After we identified that the database had deadlocked, we resorted to restarting our postgres container and then running the migration again. Once we restarted the database, everything was back online in less than 30 seconds, which includes first running the remaining migrations and then starting up all containers again.

When we tested this process on our test instance prior to deploying it to the Lemmy.World production environment, we did not run into this issue. Everything was working fine with the backend services running on Lemmy 0.19.9 and the database already upgraded to the Lemmy 0.19.10 schema, but the major difference there is the lack of user activity during the migration.

Our takeaway from this is to always plan for downtime for Lemmy updates that include database migrations, as it does not appear to be possible to apply them "safely" even when they seem small enough to theoretically be doable without downtime.

6
 
 

Hello,

we will be performing the long-awaited update to Lemmy 0.19.9 tomorrow.

We are planning for around 1 hour of downtime between 16:00-17:00 UTC on the 16th of March.

You can convert this to your local time here: https://inmytime.zone/?iso=2025-03-16T16%3A00%3A00.000Z

You can find an overview of the changes in our previous announcement here and in the Lemmy release notes:


Update 16:50 UTC:

The upgrade was successfully completed at around 16:27 UTC, but we're still fighting some performance issues after the upgrade. Our database and the outbound federation container are currently using significantly more CPU than expected, which is still being investigated to identify the root cause.

7
 
 

Hello World,

as many of you know, several newer Lemmy versions have been released since the one we are currently using.

As this is a rather long post, the TLDR is that we're currently planning for late January/early February to update Lemmy.World to a newer Lemmy release.

We're currently running Lemmy 0.19.3 with a couple patches on top to address some security or functionality issues.

As new Lemmy versions have been released, we've been keeping an eye on other instances' experiences with the newer versions, as well as tracking certain issues on GitHub, which might impact stability or moderation experience.

We updated to Lemmy 0.19.3 back in March this year. At that point, 0.19.3 had already been released for a little over a month, and all the major issues that troubled the earlier 0.19 releases had been addressed.

Several months later, in June, Lemmy 0.19.4 was released with several new features. This was a rather big release, as a lot of changes had happened since the last release. Only 12 days later, 0.19.5 was released, which fixed a few important issues with the 0.19.4 release. Unfortunately, Lemmy 0.19.5 also introduced some changes that were, and in part still are, not fully addressed.

Prior to Lemmy 0.19.4, regular users could see the contents of removed or deleted comments in some situations, primarily when using third-party apps. Ideally, this would have been fixed by restricting access to the contents of removed comments to community moderators in the communities they moderate, as well as admins on each instance. Deleted comments are overwritten in the database after some delay, but they might still be visible prior to that. This is especially a problem when moderators want to review previously removed comments, either to potentially restore them or to understand context in a thread with multiple removed comments. The Lemmy modlog does not always record individual entries for bulk-removed items; for example, banning a user while also removing their content will only log the ban, but not the individual posts or comments that were removed.

We were considering writing a patch to restore this functionality for moderators in their communities, but this is unfortunately a rather complex task, which also explains why this isn't a core Lemmy feature yet.

While admins can currently filter modlog for actions by a specific moderator, this functionality was lost somewhere in 0.19.4. While this isn't something our admin team is using very frequently, it is still an important feature to have available for us for the times we need it.

This release also included a few security changes for ActivityPub handling, which broke the ability to find e.g. Mastodon posts in Lemmy communities by entering the post URL in the search. It also caused issues with changes to communities made by remote moderators.

The 0.19.4 release also broke marking posts as read in Sync for Lemmy. Although this isn't really something we consider a blocker, it's still worth mentioning, as there are still a lot of Sync for Lemmy users out there who haven't noticed this issue yet because they're only active on Lemmy.World. Over the last 2 weeks we've had nearly 5k active Sync for Lemmy users. This is unfortunately something that will break during the upgrade, as the API has changed in upstream Lemmy.

There are also additional issues with viewing comments on posts in local communities that appear to be related to the 0.19.4/0.19.5 releases, and these appear to be a lot more serious. There have been various reports of posts showing zero comments in Sync, while viewing them in a browser or another client will show various comments. It's not entirely clear to us right now what the full impact is and to what extent it can be mitigated by user actions, such as subscribing to communities. If anyone wants to research what is needed to restore compatibility, and potentially even propose a patch for compatibility with both the updated and the previous API version, we'll consider applying it as a custom patch on top of the regular Lemmy release.

If there won't be a Sync update in time for our update and we won't have a viable workaround available, you may want to check out !lemmyapps@lemmy.world to find potential alternatives.

Several instances also reported performance issues after their upgrades, although these mostly seemed to last only a relatively short time after the upgrade and were not persistent.

Lemmy 0.19.6 ended up getting released in November and introduced quite a few bug fixes and changes again, including filtering the modlog by moderator. Due to a bug breaking some DB queries, 0.19.7 was released just 7 days later to address that.

Among the issues fixed in this release were being able to resolve Mastodon URLs in the search again and remote moderators being able to update communities again.

0.19.6 also changed the way post thumbnails are generated, which resulted in thumbnails missing on various posts.

A month later (now we're in December), 0.19.8 was released.

One of the issues addressed by 0.19.8 was Lemmy once again returning the content of removed comments to admins. For community moderators this functionality has not yet been restored, due to the complexity of having to check mod status in every community present in the comment listing.

At this point it seems that most of the issues have been addressed, although there still appear to be some remaining issues with thumbnails not being reliably created in some cases. We'll keep an eye on any updates on that topic to see whether it might be worth waiting a little longer for another fix, or possibly deploying an additional patch even if it is not yet part of an official Lemmy release at the time.

While we were backporting some security/stability related changes, including a fix for a bug that can break federation in some circumstances when a community is removed, we accidentally reverted this patch while applying another backport, which resulted in our federation with lemmy.ml breaking back in November. This issue was already addressed upstream a while back, so other instances running more recent Lemmy versions were not affected by this.

Among the new features released in the Lemmy versions we have missed out on so far, here are a couple highlights:

  • Users will be able to see and delete their uploads on their profile. This will include all uploads since we updated to 0.19.3, which is the Lemmy version that started tracking which user uploaded media.
  • Several improvements to federation code, which improve compatibility with WordPress, Discourse, and NodeBB.
  • Fixed signed fetch for federation, enabling federation with instances that require linked instances to authenticate themselves when fetching remote resources. Not having this is something we've seen cause issues with a small number of Mastodon instances that require it.
  • Site bans will automatically issue community bans, which means they federate more reliably.
  • Deleted and removed posts and comments will no longer show up in search results.
  • Bot replies and mentions will no longer be included in notification counts when a user has blocked all bots.
  • Saved posts and comments will now be returned in the reverse order of saving them rather than the reverse order of them being created.
  • The image proxying feature has evolved to a more mature state. This feature intends to improve user privacy by reducing requests to third party websites when browsing Lemmy. We do not currently plan on enabling it with the update, but we will evaluate it later on.
  • Local only communities. We don't currently see a good use for these, as they will prevent federation of such communities. This cuts off users on all other instances, so we don't recommend using them unless you really want that.
  • Parallel sending of federated activities to other instances. This can be especially useful for instances on the other side of the world, where latency introduces serious bottlenecks when only sending one activity at a time. A few instances have already been using intermediate software to batch activities together, which is not standard ActivityPub behavior, but it allows them to eliminate most of the delays introduced by latency. This mostly affects instances in Australia and New Zealand, but we've also seen federation delays with instances in the US from time to time. This will likely not be enabled immediately after the upgrade, but we're planning to enable it shortly after.

edit: added information about sync not showing comments on posts in local communities

8
 
 

Hello World,

following feedback we have received in the last few days, both from users and moderators, we are making some changes to clarify our ToS.

Before we get to the changes, we want to remind everyone that we are not a (US) free speech instance. We are not located in the US, which means different laws apply. As written in our ToS, we're primarily subject to Dutch, Finnish, and German law. Additionally, it is at our discretion to further limit discussion that we don't consider tolerable. There are plenty of other websites out there hosted in the US and promoting free speech on their platform. You should be aware that even free speech in the US does not cover true threats of violence.

Having said that, we have seen a lot of comments removed with reference to our ToS which were not explicitly intended to be covered by it. After discussion with some of our moderators, we have determined that there is both some ambiguity in our ToS and a lack of clarity about what we expect from our moderators.

We want to clarify that, when moderators believe certain parts of our ToS do not appropriately cover a specific situation, they are welcome to bring these issues up with our admin team for review, escalating the issue without taking action themselves when in doubt. We also allow for moderator discretion in a lot of cases, as we generally don't review each individual report or moderator action unless they're specifically brought to admin attention. This also means that content that may be permitted by ToS can at the same time be violating community rules and therefore result in moderator action. We have added a new section to our ToS to clarify what we expect from moderators.

We are generally aiming to avoid content that organizes, glorifies, or suggests harming people or animals, but we are limiting the scope of our ToS to build the minimum framework inside which we all can have discussions, leaving a broader area for moderators to decide what is and isn't allowed in the communities they oversee. We trust the moderators' judgement, and in cases where we see a gross disagreement between moderators' and admins' criteria, we can have a conversation and reach an agreement, as in many cases the decision is case-specific and context matters.

We have previously asked moderators to remove content relating to jury nullification when this was suggested in the context of murder or other violent crimes. Following a discussion in our team, we want to clarify that we are no longer requesting moderators to remove content relating to jury nullification in the context of violent crimes when the crime in question has already happened. We will still consider suggestions of jury nullification for crimes that have not (yet) happened to be advocating violence, which violates our terms of service.

As always, if you stumble across content that appears to be violating our site or community rules, please use Lemmy's report functionality. Especially when threads are very active, moderators will not be able to go through every single comment for review. Reporting content and providing accurate reasons for reports will help moderators deal with problematic content in a reasonable amount of time.

9
0
submitted 2 years ago* (last edited 2 years ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world
 
 

Hello World,

Today, after careful consideration and evaluation of recent events, we have decided to defederate from Lemmygrad.

Regrettably, we have observed a significant increase in hate speech and calls to violence originating from the Lemmygrad instance. Due to the severity of the posts and comments, we are not waiting for the next Lemmy update that will allow users to block instances.

At Lemmy.world, we have always strived to foster an inclusive and welcoming user environment. However, recent posts and comments from Lemmygrad have clearly violated our server rules and, more importantly, our core values. We firmly believe that hate speech and incitement of violence have no place in our community, regardless of personal beliefs or affiliations.

As always, we encourage all users to report any content they deem inappropriate or harmful. No matter one's stance in any conflict, Lemmy.world will always take immediate action to remove and ban any posts or comments that incite violence or propagate hatred.

We encourage everyone to continue engaging in discussions within the boundaries of respect and understanding. As we move forward with this decision, we remain committed to providing all community members with a safe and welcoming space. We appreciate your continued support and cooperation in upholding our shared principles.

Thank you,

The Lemmy.World Team

10
 
 

Hello World!

As we've all known and talked about quite a lot, we previously blocked several piracy-focused communities. These communities, as announced, were:

In our removal announcement, we stated that we would continue to look into this in more detail and re-allow these communities if and when we deemed it safe. It was a solid concern at the time, because we were already receiving takedown requests as well as constant attacks, and didn't want to put our volunteer team at risk. We had zero measures in place, and the tools we had were insufficient to deal with anything at scale.

Well, after some back and forth with some very cool people, and now that we have proper measures and tooling in place to protect ourselves, we decided it's time to welcome these communities back again. Long live the IT nerds!

We know it's been a rough ride with everything, and we'd like to thank every one of you who were understanding of us, and stayed with us all the way. Please know that as users, you are what makes this platform what it is, and damned we be if we ever forget it.

With love, and as always, stay safe in the high seas!

Lemmy.world Team

❤️

11
 
 

We set up a bridge between the Lemmy.world General Matrix room and the Public-1 channel on the Lemmy World Discord server yesterday. It's not perfect: emoji reactions aren't visible, and there are some minor quirks, such as how it handles edited messages, but other than that it seems to be working well.

Now people on both chat clients can interact with each other!

12
-1
submitted 2 years ago* (last edited 2 years ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world
 
 

Earlier, after review, we blocked and removed several communities that were providing assistance to access copyrighted/pirated material, which is currently not allowed per Rule #1 of our Code of Conduct. The communities that were removed due to this decision were:

We took this action to protect lemmy.world, lemmy.world's users, and lemmy.world staff as the material posted in those communities could be problematic for us, because of potential legal issues around copyrighted material and services that provide access to or assistance in obtaining it.

This decision is about liability and does not mean we are otherwise hostile to any of these communities or their users. As the Lemmyverse grows and instances get bigger, precautions like this may become necessary. We will keep monitoring the situation closely, and if in the future we deem it safe, we would gladly re-allow these communities.

The discussions that have happened in various threads on Lemmy make it very clear that removing the communities before we announced our intent to remove them is not the level of transparency the community expects. As stewards of this community, we need to be extremely transparent before we do something like this again, and we need to make sure we gather feedback on the planned changes, because lemmy.world is yours as much as it is ours.

13
0
submitted 2 years ago* (last edited 2 years ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world
 
 

Update:
The comments from this post will not be removed, so as to preserve the discussion around the announcement. Any continued discussions outside of this thread that violate server rules will be removed. We feel that everyone who has an opinion, and wanted to vent, has been heard.

————-

Original post:
Yesterday, we received information about the planned federation by Hexbear. The announcement thread can be found here: https://www.hexbear.net/post/280770. After reviewing the thread and the comments, it became evident that allowing Hexbear to federate would violate our rules.

Our code of conduct and server rules can be found here.

The announcement included several concerning statements, as highlighted below:

  • “Please try to keep the dirtbag lib-dunking to hexbear itself. Do not follow the Chapo Rules of Posting, instead try to engage utilizing informed rhetoric with sources to dismantle western propaganda. Posting the western atrocity propaganda and pig poop balls is hilarious but will pretty quickly get you banned and if enough of us do it defederated.”
  • “The West's role in the world, through organizations such as NATO, the IMF, and the World Bank - among many others - are deeply harmful to the billions of people living both inside and outside of their imperial core.”
  • “These organizations constitute the modern imperial order, with the United States at its heart - we are not fooled by the term "rules-based international order." It is in the Left's interest for these organizations to be demolished. When and how this will occur, and what precisely comes after, is the cause of great debate and discussion on this site, but it is necessary for a better world.”

The rhetoric and goals of Hexbear are clear from their announcement: aiming to "dismantle western propaganda" and "demolish organizations such as NATO" shows that Hexbear has no intention of "respecting the rules of the community instance in which they are posting/commenting." Their intent is to push their beliefs and ideology.

In addition, several comments from a Hexbear admin demonstrate that instance rules will not be respected.

Here are some examples:

“I can assure you there will be no lemmygrad brigades, that energy would be better funneled into the current war against liberalism on the wider fediverse.”

“All loyal, honest, active and upright Communists must unite to oppose the liberal tendencies shown by certain people among us, and set them on the right path. This is one of the tasks on our ideological front.”

Overall community comments:

To clarify, for those who have inquired about why Hexbear and not Lemmygrad, it should be noted that we are currently exploring the possibility of defederating from Lemmygrad as well, based on comments similar to those Hexbear has made.

Defederation should only be considered as a last resort. However, based on their comments and behavior, no positive outcomes can be expected.

We made the decision to preemptively defederate from Hexbear for these reasons. While we understand that not everyone may agree with our decision, we believe it is important to prioritize the best interests of our community.

14
1
submitted 2 years ago* (last edited 2 years ago) by ruud@lemmy.world to c/lemmyworld@lemmy.world
 
 

Looks like it works.

Edit: we still see some performance issues. Needs more troubleshooting.

Update: Registrations re-opened. We encountered a bug where people could not log in; see https://github.com/LemmyNet/lemmy/issues/3422#issuecomment-1616112264. As a workaround we re-opened registrations.

Thanks

First of all, I would like to thank the Lemmy.world team and the 2 admins of other servers @stanford@discuss.as200950.com and @sunaurus@lemm.ee for their help! We did some thorough troubleshooting to get this working!

The upgrade

The upgrade itself isn't too hard. Create a backup, and then change the image names in the docker-compose.yml and restart.

But, like the first 2 tries, after a few minutes the site started getting slow until it stopped responding. Then the troubleshooting started.

The solutions

What I had noticed previously is that the lemmy container could reach around 1500% CPU usage; above that, the site got slow. That is weird, because the server has 64 threads, so 6400% should be the max. So we tried what @sunaurus@lemm.ee had suggested before: we created extra lemmy containers (and extra lemmy-ui containers) to spread the load, and used nginx to load-balance between them.

Et voilà. That seems to work.

Also, as suggested by him, we start the lemmy containers with the scheduler disabled, and have 1 extra lemmy container running with the scheduler enabled that isn't used for anything else.

There is still room for improvement, and probably new bugs, but we're very happy lemmy.world is now on 0.18.1-rc, which fixes a lot of bugs.