traches

joined 2 years ago
[–] traches@sh.itjust.works 2 points 21 hours ago

Base on the left, enemy base on the right

[–] traches@sh.itjust.works 3 points 22 hours ago (2 children)

Woah, there’s a nostalgia hit. Those games would go on for SO LONG

Also, if you have 2 monitors it’ll put an independent map on each one

[–] traches@sh.itjust.works 19 points 6 days ago* (last edited 6 days ago) (3 children)

Aside from getting treatment, what’s helped me:

  • Know that you’re incapable of forming habits. You have to choose to do the thing every time.
  • Trick yourself into hyperfixation if you can. It’s a superpower, if you can aim it in a useful direction. Find something gratifying or satisfying about the task to focus on. Sometimes watching the todo list get smaller or the inbox get cleared out does it for me.
  • This one’s hard to describe, but sometimes you have to realize that the hyperfixation easy-mode motivation isn’t coming and you’ll have to draw motivation from somewhere else inside yourself. Sometimes just getting over the initial jump of starting a task makes the rest of it flow naturally.
  • Break big problems into little problems.
  • Simplify your life as much as possible. You only have so many executive function tokens every day, don’t waste them.
  • Have a todo list. Don’t make it complicated. If possible, just do the thing immediately instead of putting it on the list.
  • Get enough sleep. This is not optional.
  • Exercise, specifically some kind of cardio. 30 minutes above about 120bpm, 3 times per week. There are studies, it makes a big difference in a million ways.
  • Give yourself grace. You’re playing on hard mode, don’t compare yourself to neurotypical people.

Edit: also it’s not laziness, not really. Lazy people are comfortable with it.

[–] traches@sh.itjust.works 1 points 2 weeks ago

Vercel isn’t an AI company; it’s a web host.

[–] traches@sh.itjust.works 2 points 2 weeks ago (1 children)

Eh, I try to keep this username separate from my real name. It’s not too hard, though: you just need ‘@media print {‘. Set display: none on stuff like the navbar and footer, and you also need to think about page breaks and such; there are guides.

Browser dev tools can simulate print styles, and you can preview with the regular print preview. To get consistency across browsers you probably want to set a definite width, so the sizing stays the same.
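In case it helps anyone else, here’s a minimal sketch of that kind of print stylesheet. The class names are made up; substitute whatever your page actually uses:

```css
@media print {
  /* Hide chrome that makes no sense on paper (selectors are hypothetical) */
  .navbar,
  .site-footer {
    display: none;
  }

  /* Keep headings attached to the content that follows them */
  h2, h3 {
    break-after: avoid;
  }

  /* Avoid splitting an individual entry across two pages */
  .entry {
    break-inside: avoid;
  }

  /* Fix the content width so sizing stays consistent across browsers */
  main {
    width: 7in;
    margin: 0 auto;
  }
}
```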

[–] traches@sh.itjust.works 5 points 2 weeks ago (3 children)

Page on my personal site, with good print styles so I can print to pdf if needed.

[–] traches@sh.itjust.works 7 points 2 weeks ago

I mean… let’s say you set up a Postgres role for each of your application users, with appropriate grants and row-level security policies; you could actually do it without Bobby Tables issues. I think.
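Roughly like this; a hypothetical Postgres sketch where every app user logs in as their own database role (table, policy, and role names are all made up):

```sql
CREATE TABLE posts (
  id     serial PRIMARY KEY,
  author text NOT NULL DEFAULT current_user,
  body   text NOT NULL
);

ALTER TABLE posts ENABLE ROW LEVEL SECURITY;

-- Each role can only see and modify rows it owns
CREATE POLICY posts_owner ON posts
  USING (author = current_user)
  WITH CHECK (author = current_user);

CREATE ROLE little_bobby LOGIN;
GRANT SELECT, INSERT, UPDATE, DELETE ON posts TO little_bobby;
GRANT USAGE ON SEQUENCE posts_id_seq TO little_bobby;
```

Even if little_bobby’s session is fully compromised by injection, the policy caps the blast radius at his own rows.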

[–] traches@sh.itjust.works 2 points 3 weeks ago

https://sudoku.coach/ is installable as a progressive web app and is probably the best sudoku app I’ve encountered. Extremely customizable and good for learning new solving techniques

[–] traches@sh.itjust.works 2 points 3 weeks ago

One of the most charming games I’ve ever played and I play a shitload of games

[–] traches@sh.itjust.works 37 points 3 weeks ago (1 children)

Hey that’s unethical! It should be “main”

 

I'm working on a project to back up my family photos from TrueNAS to Blu-ray discs. I have other, more traditional backups based on restic and zfs send/receive, but I don't like the fact that I could delete every copy using only the mouse and keyboard from my main PC. I want something that can't be ransomwared and that I can't screw up once created.

The dataset is currently about 2TB, and we're adding about 200GB per year. It's a lot of discs, but manageably so. I've purchased good-quality 50GB blank discs and a burner, as well as a nice box and some silica gel packs to keep them cool, dark, dry, and generally protected. I'll be making one big initial backup, and then I'll run incremental backups ~monthly to capture new photos and edits to existing ones, at which time I'll also spot-check a disc or two for read errors using dvdisaster. I'm hoping to get 10 years out of this arrangement, though longer is of course better.

I've got most of the pieces worked out, but the last big question I need to answer is which software I will actually use to create the archive files. I've narrowed it down to two options: dar and bog-standard GNU tar. Both can create multipart, incremental backups, which is the core capability I need.
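To make that concrete, here's roughly how the tar side works; a toy sketch with made-up paths and a tiny split size (a real run would use `-b 50G`):

```shell
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p demo/photos archives
echo one > demo/photos/a.jpg

# Level 0 (full): tar records file state in the .snar metadata file,
# and split cuts the stream into disc-sized parts.
tar --listed-incremental=archives/photos.snar -cf - -C demo photos \
  | split -b 1M - archives/photos.0.tar.part-

# Later a new photo arrives; take a level-1 incremental against a COPY
# of the snar, so the level-0 state is preserved for future level-1s.
echo two > demo/photos/b.jpg
cp archives/photos.snar archives/photos.1.snar
tar --listed-incremental=archives/photos.1.snar \
    -cf archives/photos.1.tar -C demo photos

# The incremental archive contains only what changed since level 0
tar -tf archives/photos.1.tar
```

This needs GNU tar; bsdtar (macOS default) doesn't understand --listed-incremental.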

Dar Advantages (that I care about):

  • This is exactly what it's designed to do.
  • It can detect and tolerate data corruption. (I'll be adding ECC data to the discs using dvdisaster, but defense in depth is nice.)
  • More robust file-change detection; it appears to be hash-based?
  • It allows me to create a database I can use to locate and restore individual files without searching through many disks.
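For the record, the dar workflow looks roughly like this. This is an untested sketch using dar's documented flags (`-c` create, `-R` root, `-s` slice size, `-A` archive of reference); the paths and archive names are made up, so check the man page before trusting it:

```shell
# Full backup, cut into 50GB slices: photos-full.1.dar, photos-full.2.dar, ...
dar -c photos-full -R /mnt/photos -s 50G

# Later: a differential backup referencing the full one via -A;
# only files changed since photos-full get saved.
dar -c photos-incr1 -A photos-full -R /mnt/photos -s 50G
```

The dar_manager companion tool is what builds the "which slice has this file" database mentioned above.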

Dar disadvantages:

  • It appears to be a pretty obscure, generally inactive project. The documentation looks straight out of the early 2000s, and the site doesn't even have HTTPS. I worry it will go offline, or that I'll run into some weird bug that ruins the show.
  • Doesn't detect renames; it will back up a whole new copy. (Problematic if I get to reorganizing.)
  • I can't find a maintained GUI project for it, and my wife ain't about to learn a CLI. It would be nice if I'm not the only person in the world who could get photos off of these discs.

Tar Advantages (that I care about):

  • Battle-tested, reliable, not going anywhere.
  • It's already installed on every single Linux and Mac machine, and it's trivial to put on a Windows PC.
  • Correctly detects renames, does not create new copies.
  • There are maintained GUIs available, so non-nerds may be able to access the archives.

Tar disadvantages:

  • I don't see an easy way to locate individual files, beyond grepping through snar metadata files (that aren't really meant for that).
  • The file-change detection logic makes me nervous: it appears to be based on modification time and inode numbers. The photos are in a ZFS dataset on TrueNAS, mounted on my local machine via SMB. I don't even know what an inode number is; how can I be sure they won't change somehow? Am I stuck with this exact NAS setup until I'm ready to make a whole new base backup? This many Blu-rays aren't cheap, and burning them will take a while; I don't want to do it unnecessarily.
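One workaround for the locate-a-file problem that avoids snar files entirely: record each archive's listing in a plain-text index at burn time, then grep the index later. A toy sketch (disc labels and paths are made up):

```shell
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p demo/photos
echo x > demo/photos/beach.jpg
tar -cf disk01.tar -C demo photos

# At burn time, append the archive's contents under its disc label;
# the index is tiny and a copy can live on every disc.
tar -tf disk01.tar | sed 's/^/disk01: /' >> contents.txt

# Later: which disc holds beach.jpg?
grep beach.jpg contents.txt
```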

I'm genuinely conflicted, but I'm leaning towards dar. Does anyone else have any experience with this sort of thing? Is there another option I'm missing? Any input is greatly appreciated!

 

I have a load-bearing Raspberry Pi on my network: it runs a DNS server, zigbee2mqtt, a UniFi controller, and a restic REST server. This Raspberry Pi, as is tradition, boots from a microSD card. As we all know, microSD cards suck a little bit and die pretty often; I've personally had this happen not all that long ago.

I'd like to keep a reasonably up-to-date hot spare ready, so when it does give up the ghost I can just swap them out and move on with my life. I can think of a few ways to accomplish this, but I'm not sure which is best:

  • The simplest is probably cron + dd, but I'm worried about filesystem corruption from imaging a running system, and wouldn't constant full re-images also wear out the spare card?
  • Recreate the partition structure, create an fstab with the new UUIDs, and rsync everything else. Backups are incremental and we won't get filesystem corruption, but we still aren't taking a point-in-time backup, which means data files could be inconsistent with each other. (Honestly unlikely with the services I'm running.)
  • Migrate to BTRFS or ZFS and send/receive snapshots. This would be annoying to set up because I'd need to switch the Pi's filesystem, but once done I think this might be the best option? We get incremental updates, point-in-time backups, and even rollback on the original card if I want it.

I'm thinking out loud a little bit here, but do y'all have any thoughts? I think I'm leaning towards ZFS or BTRFS.
