this post was submitted on 26 Nov 2025
97 points (96.2% liked)

Selfhosted


Got a warning for my blog going over 100GB in bandwidth this month... which sounded incredibly unusual. My blog is text and a couple images and I haven't posted anything to it in ages... like how would that even be possible?

Turns out it's possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for 'Unknown robot'? This is actually bonkers.

Edit: As Thunraz points out below, there's a footnote that reads "Numbers after + are successful hits on 'robots.txt' files" and not scientific notation.

Edit 2: After doing more digging, the culprit is a post where I shared a few wallpapers for download. The bots have been downloading these wallpapers over and over, burning through 100GB of bandwidth in the first 12 days of November. That's when my account was suspended for exceeding bandwidth (an artificial limit I put in place a while back and forgot about...), which is also why the 'last visit' for all the bots is November 12th.

[–] dual_sport_dork@lemmy.world 27 points 2 hours ago (1 children)

I run an ecommerce site and lately they've latched onto one very specific product with attempts to hammer its page and any of those branching from it for no readily identifiable reason, at the rate of several hundred times every second. I found out pretty quickly, because suddenly our view stats for that page in particular rocketed into the millions.

I had to insert a little script to IP ban these fuckers, which kicks in if I see a malformed user agent string or if you try to hit this page specifically more than 100 times. Through this I discovered that the requests are coming from hundreds of thousands of individual random IP addresses, many of which are located in Singapore, Brazil, and India, and mostly resolve down into those owned by local ISPs and cell phone carriers.
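A filter like that can be sketched roughly as follows. This is a minimal illustration, not the commenter's actual script: the page path, the threshold of 100, and all the names here are assumptions, and the "malformed user agent" check is just a crude placeholder heuristic.

```python
from collections import defaultdict

HOT_PAGE = "/products/popular-widget"  # hypothetical path of the hammered page
HIT_LIMIT = 100                        # threshold mentioned in the comment

hits = defaultdict(int)  # per-IP hit counter for the hot page
banned = set()           # IPs we've decided to block

def looks_malformed(user_agent: str) -> bool:
    """Crude heuristic: empty UA, or no 'product/version' token at all."""
    ua = (user_agent or "").strip()
    return not ua or "/" not in ua

def check_request(ip: str, user_agent: str, path: str) -> bool:
    """Return True if the request should be served, False if the IP is banned."""
    if ip in banned:
        return False
    if looks_malformed(user_agent):
        banned.add(ip)
        return False
    if path == HOT_PAGE:
        hits[ip] += 1
        if hits[ip] > HIT_LIMIT:
            banned.add(ip)
            return False
    return True
```

In practice you'd want the counters to expire over time and the ban list to live somewhere persistent (or be pushed down into the firewall), but the core logic is just a per-IP counter plus a UA sanity check.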

Of course they ignore your robots.txt as well. This smells like some kind of botnet thing to me.

[–] panda_abyss@lemmy.ca 9 points 1 hour ago (1 children)

I don’t really get those bots.

Like, there are bots that are trying to scrape product info, or prices, or scan for quantity fields. But why the hell do some of these bots behave the way they do?

Do you use Shopify by chance? With Shopify the bots could be scraping the product.json endpoint unless it's disabled in your theme. Shopify seems to expose the database's updated_at timestamp in its headers and product data, so inventory quantity changes result in a timestamp change that can be used to estimate your sales.

There are companies that do that and sell sales numbers to competitors.

No idea why they have inventory info on their products table, it’s probably a performance optimization.
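The estimation trick described above boils down to polling the endpoint and diffing timestamps. A minimal sketch, assuming each poll is reduced to a product-id-to-updated_at mapping (the real /products.json payload is much richer than this):

```python
def changed_products(before: dict, after: dict) -> list:
    """before/after map product id -> updated_at string, scraped from two
    successive polls of a store's product.json endpoint.  Returns the ids
    whose timestamp moved between polls: a rough proxy for inventory
    changes, and therefore sales, in that window."""
    changed = []
    for pid, ts in after.items():
        if pid in before and before[pid] != ts:
            changed.append(pid)
    return changed
```

Counting how many timestamps move per polling interval is enough to estimate sales velocity per product, which is presumably what those competitor-intelligence services sell.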

I haven’t really done much scraping work in a while, not since before these new stupid scrapers started proliferating.

[–] dual_sport_dork@lemmy.world 8 points 1 hour ago (1 children)

Negative. Our solution is completely home grown. All artisanal-like, from scratch. I can't imagine I reveal anything anyone would care about much except product specs, and our inventory and pricing really doesn't change very frequently.

Even so, you think someone bothering to run a botnet to hound our site would distribute page loads across all of our products, right? Not just one. It's nonsensical.

[–] panda_abyss@lemmy.ca 6 points 1 hour ago

Yeah, that’s the kind of weird shit I don’t understand. Someone on the other end is paying for servers and a residential proxy to send that traffic, too. Why?