Glitchvid

joined 2 months ago
[–] Glitchvid@lemmy.world 5 points 21 hours ago

There can be theoretical audit or blame issues: since you're not "paying", how does the company pass the buck (SLA contracts) if something fucks up with LE?

[–] Glitchvid@lemmy.world 4 points 1 day ago (5 children)

Ironically, the shortening of cert lifetimes is what pushed me to automated systems and away from the traditional paid trust providers.
I used to roll a 1-year cert for my CDN and manually buy renewals, going through the process of signing and uploading the new ones. It wasn't particularly onerous, but then they moved to (I think) either 3- or 6-month max signing, which was the point where I just automated it with Let's Encrypt.

In general I'm not a fan of how we do root of trust on the web; I would have much preferred DANE to catch on, where I could pin a cert at the DNS level, secured with DNSSEC and trusted through IANA and the root zone.
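For illustration, that pinning is done with a TLSA record published next to the site's normal DNS records; something roughly like this (hostname and hash are placeholders, not anything from my setup):

```
; Hypothetical DANE pin for HTTPS on www.example.com
; 3 = DANE-EE (trust this end-entity cert/key directly)
; 1 = match on the SubjectPublicKeyInfo
; 1 = using a SHA-256 digest
_443._tcp.www.example.com. IN TLSA 3 1 1 <sha256-hex-of-the-servers-public-key>
```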

[–] Glitchvid@lemmy.world 16 points 2 days ago* (last edited 2 days ago)

IP law needs overhauling, but these are the last people (aside from Disney et al) I'd trust to draft the new ones.

[–] Glitchvid@lemmy.world 16 points 2 days ago* (last edited 2 days ago) (1 children)

The US manages to store 1.5B pounds of cheese it doesn't do anything with; I think China can handle constructing some warehouses to hold what it digs up from the ground.

[–] Glitchvid@lemmy.world 3 points 6 days ago

if not x then … end is very common in Lua for similar purposes; very rarely do you see hard nil comparisons or calls to typeof (the last time I did was for a serializer).
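A minimal sketch of the idiom in plain Lua, with made-up names for illustration:

```lua
-- Only nil and false are falsy in Lua, so "if not x" covers the usual
-- "caller didn't pass anything / value is missing" case in one check.
local function greet(name)
  if not name then
    name = "world"              -- default when name is nil (or false)
  end
  print("Hello, " .. name)
end

greet()        --> Hello, world
greet("Lua")   --> Hello, Lua

-- The explicit forms you rarely reach for outside of e.g. a serializer:
-- if name == nil then ... end
-- if type(name) == "string" then ... end   (typeof in Luau)
```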

[–] Glitchvid@lemmy.world 1 points 6 days ago* (last edited 6 days ago)

Most of the VCS ops in Hg are actually written in C.

GitHub is mostly written in Ruby, so that's not really a performance win.

Like I said, we're stuck with Git's UX, but we were never stuck with Hg's performance.

[–] Glitchvid@lemmy.world 1 points 1 week ago (2 children)

I don't think it's hyperbole to say a significant percentage of Git activity happens on GitHub (and other "foundries") – which are themselves a far cry from efficient.

My ultimate takeaway on the topic is that we're stuck with Git's very counterintuitive porcelain, and only satisfactory plumbing, regardless of performance/efficiency; but if Mercurial had won out, we'd still have its better interface (and IMO workflow), and any performance problems could've been addressed by a rewrite in C (or the Rust one that is so very slowly happening).

[–] Glitchvid@lemmy.world 25 points 1 week ago* (last edited 1 week ago)

If only. This is "modern" PhysX; we'd need the source to the original Ageia PhysX 2.X branch to fix it properly.

[–] Glitchvid@lemmy.world 33 points 2 weeks ago

The amount of stupid AI scraping behavior I see even on my small websites is ridiculous: they'll endlessly pound identical pages as fast as possible over an entire week, apparently not even checking whether the contents changed. Probably some vibe-coded shit that barely functions.
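For contrast, checking whether a page changed is a one-header affair. A rough sketch of a conditional GET using LuaSocket (plain HTTP only, and the URL/helper name are just placeholders for illustration):

```lua
-- Re-fetch politely: send the ETag from the previous fetch and let the
-- server answer 304 Not Modified instead of resending the whole page.
-- (LuaSocket only speaks plain HTTP; HTTPS needs LuaSec's ssl.https.)
local http  = require("socket.http")
local ltn12 = require("ltn12")

local function fetch_if_changed(url, cached_etag)
  local chunks = {}
  local ok, code, headers = http.request{
    url     = url,
    headers = cached_etag and { ["If-None-Match"] = cached_etag } or nil,
    sink    = ltn12.sink.table(chunks),
  }
  if not ok then return nil, code end            -- code carries the error message here
  if code == 304 then return nil, "unchanged" end
  -- response header keys come back lowercased; keep the new ETag for next time
  return table.concat(chunks), headers and headers["etag"]
end

-- Hypothetical usage: remember the returned etag and pass it on the next poll.
local body, etag = fetch_if_changed("http://example.com/page", nil)
```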

[–] Glitchvid@lemmy.world 1 points 2 weeks ago

Man, this reminds me of the lockers we had in middle school that used dial locks, cheap Masterlock jobbies that, despite having notches between the major numbers, would register anything within 2 of the actual number.
Plus it felt like they'd slip internally, so if you dialed too quickly (because class started in 3 minutes at the other end of the building) you'd have to start all over.

[–] Glitchvid@lemmy.world 2 points 2 weeks ago

Yeah, electric motors are what I notice the most, be it on washers/dryers, garbage disposals (which range across 1/3, 1/2, 3/4, and 1 HP), and more.

[–] Glitchvid@lemmy.world 8 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Probably a mix of Z systems; that stuff goes back 20-odd years, and even older code can still run on new Z systems, which is something IBM brags about.
Mainframes aren't old, they're just niche technology, and that includes enterprise Java software.
