eli

joined 2 years ago
[–] eli@lemmy.world 8 points 1 day ago (1 children)

We're a Linux shop at my work. We do have Windows PCs due to corporate policies...but everything we do on them we could do from Linux.

Outlook? Website. Excel? Website. Jira? Website. Teams? Website. Nearly everything we do on the front end is web based. I know Electron sucks, but from a "Linux as a main desktop environment" standpoint...I'm pretty damn happy with everything being web based nowadays. It's all OS agnostic.

[–] eli@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

There are a lot of great commands in here, so here are my favorites that I haven't seen yet:

  • crontab -e
  • && and || operators
  • > and >> output redirection (and < for input)
  • for loops, while/if/then/else
  • Basic scripts
  • stdin vs. stdout vs. /dev/null
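Most of those combine in a few lines. A tiny self-contained sketch (the file names under /tmp/demo are made up for illustration):

```shell
# && runs the next command only if the previous one succeeded;
# || runs it only if the previous one failed
mkdir -p /tmp/demo && echo "made it" || echo "could not make it"

# > truncates and writes, >> appends
echo "first line"  >  /tmp/demo/log.txt
echo "second line" >> /tmp/demo/log.txt

# stdout and stderr can be redirected separately
ls /tmp/demo /no/such/dir > /tmp/demo/out.txt 2> /tmp/demo/err.txt

# /dev/null discards whatever you send to it
ls /tmp/demo/nothing 2> /dev/null || echo "ls failed quietly"
```

After running that, log.txt has two lines, out.txt holds the listing of /tmp/demo, and err.txt holds only the "/no/such/dir" error.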

Need to push a file out to a couple dozen workstations and then install it?

for i in $(cat /tmp/wks.txt); do echo "$i"; rsync -azvP /tmp/file "$i":/opt/dir/; ssh -qo ConnectTimeout=5 "$i" "touch /dev/pee/pee"; done

Or script it with if/else statements: pull info from the remote machines to see if an update is needed, then push it only where it's out of date. And if it's in a script file, you don't have to search through days of old shell history to find that one command.

Or just throw that script into crontab and automate it entirely.
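A hedged sketch of that if/else pattern (the hostnames, paths, and checksum check are placeholders for whatever your environment actually uses; DRY_RUN=1 echoes the remote commands instead of running them):

```shell
#!/usr/bin/env bash
# Push /tmp/file to each workstation, but only if its copy is out of date.
set -u
DRY_RUN="${DRY_RUN:-1}"
SRC=/tmp/file
DST=/opt/dir/file

# sample inputs so the sketch runs standalone; replace with your real list
echo "payload" > /tmp/file
printf 'wks01\nwks02\n' > /tmp/wks.txt

want_sum=$(md5sum "$SRC" | awk '{print $1}')

# in dry-run mode, print the command instead of executing it
run() {
    if [ "$DRY_RUN" = 1 ]; then echo "would run: $*"; else "$@"; fi
}

while read -r host; do
    [ -z "$host" ] && continue
    have_sum=$(run ssh -qo ConnectTimeout=5 "$host" "md5sum $DST" | awk '{print $1}')
    if [ "$have_sum" != "$want_sum" ]; then
        echo "$host: out of date, pushing"
        run rsync -azP "$SRC" "$host:$DST"
    else
        echo "$host: up to date"
    fi
done < /tmp/wks.txt
```

Drop DRY_RUN, put it in a script file, and a crontab line like `0 3 * * * /opt/scripts/push-update.sh` runs it nightly.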

[–] eli@lemmy.world 2 points 1 day ago

You can do "ss -aepni" and that will dump literally everything ss can get its hands on.

Also, ss can't find everything; it does have some limitations. I believe ss can only see what the kernel sees (the host's own connections), while tcpdump sees the actual traffic on the network layer: incoming, outgoing, the raw packet data in transit, etc.

I usually reach for ss first since most of its functionality doesn't require sudo, and if it can't find what I'm after, I bring out sudo tcpdump.

[–] eli@lemmy.world 3 points 1 day ago (1 children)

And I believe ctrl+s will let you go forward in history if you're spamming ctrl+r too fast and miss whatever you're looking for
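One caveat: in plain bash/readline, forward incremental search is bound to ctrl+s, but terminal flow control usually swallows that keystroke (it freezes the terminal instead). Freeing it is a one-line dotfile change; a sketch, assuming a typical ~/.bashrc:

```shell
# ~/.bashrc -- disable XON/XOFF flow control so ctrl+s reaches readline
# (only in interactive shells; stty errors out without a tty)
[[ $- == *i* ]] && stty -ixon
```

With that in place, ctrl+r searches backward and ctrl+s searches forward through the same match.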

[–] eli@lemmy.world 6 points 1 day ago* (last edited 1 day ago)

Get a cocksleeve. Blissful creations makes some really great pieces. It's a dildo that you wear essentially

[–] eli@lemmy.world 4 points 2 days ago (1 children)

I don't have any books in particular to recommend, but with homelabbing we should be learning the command line of our OS (PowerShell on Windows; bash or zsh in a terminal on Linux/macOS).

Learning the ins and outs of bash, cron, environment variables, for loops, systemd services (managing them, creating your own), command-line networking...all things I've had to learn to set up, manage, and/or troubleshoot my homelab.

So maybe basic Linux command line books? O'Reilly probably has some, along with bash-specific titles.
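On the "creating your own systemd services" bit: a homelab service usually boils down to a small unit file like this (the name and script path are made up for illustration):

```ini
# /etc/systemd/system/mybackup.service -- hypothetical example unit
[Unit]
Description=Nightly homelab backup (example)
After=network-online.target
Wants=network-online.target

[Service]
Type=oneshot
ExecStart=/usr/local/bin/backup.sh

[Install]
WantedBy=multi-user.target
```

Then `sudo systemctl daemon-reload && sudo systemctl enable --now mybackup.service`; pair it with a matching `.timer` unit if you'd rather schedule it through systemd than cron.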

[–] eli@lemmy.world 2 points 2 days ago (1 children)

This is pretty much my setup as well: Proxmox on bare metal, then everything I run is in Ubuntu LXC containers, each with Docker installed inside running whatever Docker stack.

I just installed Portainer and got the standalone agents installed on each LXC container, and it's helped massively with managing each Docker setup.

Of course you can use whatever base image you want for the LXC containers; I just prefer Ubuntu for my homelab.

I do need to set up a golden image though to make stand-ups easier...one thing at a time!

[–] eli@lemmy.world 17 points 2 days ago

Probably the biggest carrot on a stick we've ever seen lol

[–] eli@lemmy.world 4 points 2 days ago

While I love and run Grafana and Prometheus myself, it's like taking an RPG to an ant.

There are simpler tools that do the job of "is X broken?" just fine.

Even just running Portainer and attaching it to a bunch of standalone Docker environments is pretty good too.

[–] eli@lemmy.world 24 points 3 days ago (14 children)

There are millions of devices that still use SATA and will continue to.

My Synology NAS only accepts SATA. So if one of my SSDs dies, I'm just shit out of luck and have to find an 8-bay M.2 NAS to get a comparable alternative?

Your comment is beyond ridiculous

[–] eli@lemmy.world 3 points 4 days ago

I run Proxmox for my own homelab and another instance for very small services inside my LAN.

Anyway, I've gotten into Docker recently, and my method so far has been to spin up an LXC container with just a base OS (Ubuntu, Alpine, whatever), then install Docker and whatever else inside that container and run my service.

So I have one container per service. Now my problem is managing the Docker side without having to go into each container individually. I've tried Portainer, but it's not clicking with me.

I've actually been trying to find a solution where Docker sits on a bare-metal OS install and acts as my hypervisor, but I can't get a clear answer on anything, so Proxmox seems to be my only option.

Proxmox is a very solid option, but it is not "less intensive" than Debian, since it's built on top of Debian. Proxmox doesn't install a desktop environment (it has a web GUI), so that may help keep resources low, but it isn't some magical solution.

I would 100% recommend trying it; there's a bit of a learning curve getting to know Proxmox, but it's the best hypervisor I've used for homelab so far.

[–] eli@lemmy.world 3 points 4 days ago* (last edited 4 days ago) (1 children)

Your situation sounds like a two-server solution locally: one server for the hypervisor/VMs, with snapshots and backups going to a separate box like a NAS. As for the "house burning down" scenario, the answer is off-site backups. I'm thinking: build a small TrueNAS server, install it at a friend's or your parents' house, and then find a backup tool to sync to it (Syncthing may be the answer here for you?).

I don't care about my homelab much, but I do care about my family photos. For those I follow my own 3-2-1 rule:

  • 3 copies of my data
  • 2 copies are local
  • 1 copy is off-site

I have a NAS at my house and another NAS at my parents' house. They're both linked with Syncthing, and I do a one-way backup to the other NAS. My parents are 10 minutes away by car, so I consider that NAS "local".

And then I back up my NAS to Backblaze for my off-site copy.
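For reference, the one-way part in Syncthing is the folder type. In the folder settings (or config.xml) it looks roughly like this; the folder id and path here are made up:

```xml
<!-- hypothetical fragment of Syncthing's config.xml on the sending NAS -->
<folder id="family-photos" label="Family Photos"
        path="/volume1/photos" type="sendonly">
    <!-- the receiving NAS sets its copy of the folder to type="receiveonly" -->
</folder>
```

That way local changes flow out, but nothing on the remote NAS can overwrite the originals.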
