fubarx

joined 10 months ago
[–] fubarx@lemmy.world 1 points 22 hours ago

Got one for my kid, as well as a neon bright case so it would stand out. He's on his second case and third screen protector. The case cracked and chipped on the corners, and all protectors ended up with spiderweb cracks. He's due for another protector.

The phone is still fine.

[–] fubarx@lemmy.world 2 points 1 day ago

Auto shutoff when you fall off.

Solid design.

[–] fubarx@lemmy.world 1 points 2 days ago

Day after Easter.

[–] fubarx@lemmy.world 2 points 3 days ago

this side up

[–] fubarx@lemmy.world 4 points 3 days ago

The ground level weedwhacker blades could do double-duty clearing out brush (and pesky pedestrians).

[–] fubarx@lemmy.world 6 points 3 days ago

How many board-feet per year?

[–] fubarx@lemmy.world 2 points 3 days ago

Have had good luck with electronic traps. Caught mice and a couple of rats that were hanging around our driveway and chewing up car cables.

I got one with WiFi that sends a message when it catches something. Good for out-of-the-way spots. Bait it with peanut butter. For deterrence, I ended up spraying the area with capsicum pepper spray.

https://www.victorpest.com/store/mouse-control/electronic-traps

[–] fubarx@lemmy.world 3 points 4 days ago (3 children)

The processors inside every single device we have in our homes today will no longer be manufactured in 10 years.

In 20 years, few will know how to build the firmware for them or how to fix them.

As time goes on, those numbers could drop to 5 and 10 years.

[–] fubarx@lemmy.world 2 points 4 days ago

Given that 50% of the time, the generated code is unworkable garbage, having an AI automatically write code to create new training models will either solve all problems, or spontaneously combust into a pile of ash.

My money's on the latter.

[–] fubarx@lemmy.world 13 points 4 days ago

Management.

[–] fubarx@lemmy.world 23 points 4 days ago

La Femme Nikita (original French), Alien, and Terminator 2 all had seriously kick-ass female leads.

[–] fubarx@lemmy.world 27 points 4 days ago (1 children)

If you wanted to run Unix, your main choices were workstations (Sun, Silicon Graphics, Apollo, IBM RS/6000) or servers (DEC, IBM). They all ran different flavors of BSD or System V Unix and weren't compatible with each other. Third-party software packages had to be ported and compiled for each one.

On x86 machines, you mainly had commercial SCO Unix, Xenix, and Novell's UnixWare. Their main advantage was that they ran on considerably cheaper hardware (under $10K, instead of $30-50K), but only on very specifically configured machines.

Then along came Minix, which showed that a clean, non-AT&T version of Unix was doable. It was 16-bit, though, and mainly ended up as a learning tool. But it really goosed the idea of an open-source OS not beholden to System V. AT&T had sued over BSD, which scared off a lot of startup adoption and limited Unix to those with deep pockets. Once AT&T lost the case, things opened up.

Shortly after that, Linux came out. It ran on 32-bit 386s, was a clean-room build, and was fully open source, so AT&T couldn't lay claim to it. The FSF was also working on its own open-source Unix kernel called GNU Hurd, but Linux caught fire and that was that.

The thing about running on PCs was that there were so many variations on hardware (disk controllers, display cards, sound cards, networking boards, even serial interfaces).

Windows was trying to corral all this crazy variety into a uniform driver interface, but you still needed a custom driver, delivered on a floppy, that you had to install after mounting the board. And if the driver didn't match your DOS or Windows OS version, tough luck.

Linux eventually gained a way to support pluggable device drivers. I remember having to rebuild the OS from scratch with every little change. Over time, a lot of settings moved into config files instead of #defines (which required a rebuild). And once there was dynamic library loading, you didn't even have to reboot to update drivers.

The number of people who would write and post up device drivers just exploded, so you could put together a decent machine with cheaper, commodity components. Some enlightened hardware vendors started releasing with both Windows and Linux drivers (I had friends who made a good living writing those Linux drivers).

Later, with the Apache web server and databases like MySQL and Postgres, Linux started getting adopted in data centers. But on the desktop, it was mostly for people comfortable in a terminal. X was ported, but it wasn't until Red Hat came around that I remember doing much with UIs. And those looked pretty janky compared to what you saw on NeXTSTEP or SGI.

Eventually, people got Linux working on brand-name hardware like Dells and HPs, so you didn't have to learn how to assemble PCs from scratch. But Microsoft had tied up these vendors so that if you bought their hardware, you also had to pay for a copy of Windows, even if you didn't want to run it. It took a government case against Microsoft before hardware makers were allowed to offer systems with Linux preloaded, without the Windows tax. That's when things really took off.

It's been amazing watching things grow, and software like LibreOffice, Wayland, and Snap has helped move things into the mainstream. If it weren't for Linux virtualization, we wouldn't have cloud computing. And now, with the Steam Deck, a new generation of people is learning about Linux.

PS, this is all from memory. If I got any of it wrong, hopefully somebody will correct it.

 

cross-posted from: https://lemmy.world/post/28263533

👊 TARIFF 🔥

The GREATEST, most TREMENDOUS Python package that makes importing great again!

TARIFF is a fantastic tool that lets you impose import tariffs on Python packages. We're going to bring manufacturing BACK to your codebase by making foreign imports more EXPENSIVE!
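The blurb doesn't show TARIFF's actual API, so purely as an illustration, here's a minimal sketch of how an import "tariff" could be implemented in plain Python: a wrapped `__import__` that makes selected imports proportionally slower. The module names and rates below are made up, not the package's real interface.

```python
import builtins
import time

# Hypothetical tariff schedule: percentage surcharge on import time.
TARIFF_RATES = {"json": 50}  # a 50% "tariff" on importing json

_original_import = builtins.__import__

def tariffed_import(name, *args, **kwargs):
    """Wrap the builtin import and sleep proportionally for tariffed modules."""
    start = time.perf_counter()
    module = _original_import(name, *args, **kwargs)
    rate = TARIFF_RATES.get(name)
    if rate:
        elapsed = time.perf_counter() - start
        time.sleep(elapsed * rate / 100)  # foreign imports get more EXPENSIVE
    return module

builtins.__import__ = tariffed_import

import json  # pays the 50% surcharge on top of its normal load time
print(json.dumps({"tariff": "tremendous"}))
```

The module itself still works normally after import; only the import statement pays the surcharge.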

 


 
128 points, submitted 3 weeks ago* (last edited 3 weeks ago) by fubarx@lemmy.world to c/technology@lemmy.world
 

Fragments of a rare Merlin manuscript from c. 1300 have been discovered and digitised in a ground-breaking three-year project at Cambridge University Library

A fragile 13th-century manuscript fragment, hidden in plain sight as the binding of a 16th-century archival register, has been discovered in Cambridge and revealed to contain rare medieval stories of Merlin and King Arthur.

...

What followed the discovery has been a ground-breaking collaborative project, showcasing the work of the University Library’s Cultural Heritage Imaging Laboratory (CHIL) and combining historical scholarship with cutting-edge digital techniques, to unlock the manuscript's long-held secrets - without damaging the unique document.

...

To achieve this, the team undertook:

Multispectral Imaging (MSI)

This technique, used in CHIL, involved capturing the fragment under various wavelengths of light, from ultraviolet to infrared.

...

Computed Tomography (CT) scanning

Conducted with equipment and expertise from the University’s Zoology department, the team used a powerful X-ray scanner—typically used for scanning fossils or skeletons—to virtually penetrate the layers of parchment and uncover hidden structures in the binding.

...

3D modelling

Industrial scanning techniques created highly detailed virtual models of the fragment, allowing researchers to study its creases, stitching, and folds in remarkable detail.

...

The digital results of the project are now available for everyone to explore online via the Cambridge Digital Library.
