GamingChairModel

joined 2 years ago
[–] GamingChairModel@lemmy.world 19 points 1 day ago (1 children)

this battery can deliver 0.03mA of power

0.03 mA of current. That times the 3 volts = 0.09 mW (roughly 0.1 mW) of power.
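As a quick sanity check of the arithmetic (the 3 V figure is the nominal cell voltage from the comment above), power is just current times voltage:

```python
# Power = current x voltage; checking the figures above.
current_a = 0.03e-3   # 0.03 mA expressed in amperes
voltage_v = 3.0       # nominal cell voltage

power_w = current_a * voltage_v
print(f"{power_w * 1000:.2f} mW")  # prints "0.09 mW", i.e. roughly 0.1 mW
```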

[–] GamingChairModel@lemmy.world 14 points 1 day ago (3 children)

This is pretty normal, in my opinion. Every time people complain about Common Core arithmetic, there are dozens of us who come out of the woodwork to argue that the concepts being taught are important for a deeper understanding of math, beyond just rote memorization of pencil-and-paper algorithms.

Do you have a source for AMD chips being especially energy efficient?

I remember reviews of the HX 370 commenting on that. Problem is that chip was produced on TSMC's N4P node, which doesn't have a direct Apple comparison (the M2 was on N5P and the M3 was on N3B). The Ryzen 7 7840U was on N4, one year behind that. It just shows that AMD can't get onto a TSMC node within even a year or two of Apple.

Still, I haven't seen anything really putting these chips through their paces and actually measuring real-world energy usage while running a variety of benchmarks. And benchmarks themselves only correlate to specific ways that computers are used and aren't necessarily supported on all hardware or OSes, so it's hard to get a real comparison.

SoCs are inherently more energy efficient

I agree, but that's a separate issue from the instruction set. The AMD HX 370 is an SoC (well, technically a SiP, since the pieces are all packaged together but not actually printed on the same piece of silicon).

And in terms of actual chip architectures, as you allude to, the design dictates how specific instructions are processed, which is why the RISC-versus-CISC distinction is basically obsolete. Chip designers make engineering choices about how much silicon area to devote to specific functions, based on their modeling of how the chip might be used: multithreading, different cores optimized for efficiency or performance, speculative execution, specialized tasks like hardware-accelerated video, cryptography, or AI, and so on. Then they decide how all of that fits into the broader chip design.

Ultimately, I'd think that the main reason why something like x86 would die off is licensing reasons, not anything inherent to the instruction set architecture.

[–] GamingChairModel@lemmy.world 2 points 1 week ago (2 children)

it's kinda undeniable that this is where the market is going. It is far more energy efficient than an Intel or AMD x86 CPU and holds up just fine.

Is that actually true, when comparing node for node?

In the mobile and tablet space, Apple's A-series chips have always been a generation ahead of Qualcomm's Snapdragon chips in performance per watt, and Samsung's Exynos has lagged even further behind. That's obviously not an instruction-set issue, since all three lines are on ARM.

Much of Apple's advantage has been a willingness to pay for early runs on each new TSMC node, and a willingness to dedicate a lot of square millimeters of silicon to their gigantic chips.

But when comparing node for node, last I checked, AMD's lower-power chips designed for laptop TDPs have similar performance and power draw to the Apple chips on that same TSMC node.

The person who wrote it has been gone for like four years

Four years? You gotta pump those numbers up. Those are rookie numbers.

[–] GamingChairModel@lemmy.world 1 point 1 week ago* (last edited 1 week ago)

Yeah, Firefox in particular gave me the most issues.

Configuring each app separately is also annoying.

And I definitely never got things to work on an external monitor that was a different DPI from my laptop screen. I wish I had the time or expertise to be able to contribute, but in the meantime I'm left hoping that the Wayland and DE devs find a solution that at least achieves feature parity with Windows or macOS.

I mean, that's basically the author's problem, then. I suspect the software support just isn't there for the hardware that ships on this particular laptop, so the easiest path is to manually put it in some blurry non-native resolution as the least crappy solution.

[–] GamingChairModel@lemmy.world 1 point 1 week ago (5 children)

What's the current state of Linux support for high-DPI screens? As of two years ago I had issues getting things to work right in KDE, especially with GTK apps, and ended up manually fiddling with system font sizes and button sizes before donating that laptop to someone else.
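For context, the per-toolkit fiddling involved usually comes down to environment variables. A minimal sketch, assuming a 2× HiDPI panel (`GDK_SCALE`/`GDK_DPI_SCALE` are real GTK knobs and `QT_SCALE_FACTOR`/`QT_AUTO_SCREEN_SCALE_FACTOR` are Qt's, but the values here are just example settings, and whether they help depends on the compositor):

```shell
# ~/.profile sketch for a 2x HiDPI panel (example values only)
export GDK_SCALE=2                    # integer UI scaling for GTK apps
export GDK_DPI_SCALE=0.5              # counter-scale GTK fonts so text isn't scaled twice
export QT_SCALE_FACTOR=2              # scaling factor for Qt apps
export QT_AUTO_SCREEN_SCALE_FACTOR=0  # disable Qt's own auto-detection
```

Per-app overrides like this are exactly the kind of manual workaround that a proper compositor-level solution would make unnecessary.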

[–] GamingChairModel@lemmy.world 11 points 1 week ago (1 children)

What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash.

Also, I think it's worth discussing whether to include in the baseline certain driver assistance technologies, like automated braking, blind spot warnings, other warnings/visualizations of surrounding objects, cars, bikes, or pedestrians, etc. Throw in other things like traction control, antilock brakes, etc.

There are ways to make human driving safer without fully automating the driving, so it may not be appropriate to compare fully automated driving with fully manual driving. Hybrid approaches might be safer today, but we don't have the data to actually analyze that, as far as I can tell.
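The baseline question can be made concrete: per-mile crash rates only line up if both fleets count crashes the same way. A toy sketch with entirely made-up numbers (every figure below is hypothetical, purely to illustrate how under-reporting shifts the comparison):

```python
# Hypothetical figures only -- showing how the human baseline moves
# depending on what fraction of human crashes actually get reported.
av_crashes = 30        # reported AV crashes (AVs must report essentially everything)
av_miles = 50e6        # AV fleet miles driven
human_reported = 900   # human crashes that made it into the data
human_miles = 1500e6   # human fleet miles driven

av_rate = av_crashes / av_miles * 1e6  # crashes per million miles
for reporting_rate in (1.0, 0.6):      # assume 100% vs. 60% human reporting
    true_human = human_reported / reporting_rate
    human_rate = true_human / human_miles * 1e6
    print(f"reporting {reporting_rate:.0%}: human {human_rate:.2f} "
          f"vs AV {av_rate:.2f} per million miles")
```

With these invented inputs the two fleets tie at 100% reporting, but the human rate climbs to 1.0 per million miles if only 60% of human crashes are reported, which is the kind of distortion that makes picking the baseline so hard.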

What's annoying, too, is that a lot of the methods that have traditionally been used for discounts (education, nonprofit, employer-based discounts) are now only applicable to the subscriptions. So if you do want to get a standalone copy and would ordinarily qualify for a discount, you can't apply that discount to that license.

[–] GamingChairModel@lemmy.world 8 points 1 week ago (3 children)

Is it just me, or do new office features seem kinda pointless or unnecessary?

I feel like almost all the updates of the last two decades have been:

  • Security updates in a code base that was traditionally quite vulnerable to malware.
  • Technical updates in taking advantage of the advances in hardware, through updated APIs in the underlying OS. We pretty seamlessly moved from single core, 32-bit x86 CPU tasks to multicore x86-64 or ARM, with some tasks offloaded to GPUs or other specialized chips.
  • Some improvement in collaboration and sharing, unfortunately with a thumb on the scale to favor other Microsoft products like SharePoint or OneDrive or Outlook/Exchange.
  • Some useless nonsense, like generative AI.

Some of these are important (especially the first two), but the user experience shouldn't change much for them.

[–] GamingChairModel@lemmy.world 32 points 1 week ago (2 children)

Everyone gets to run sequencing, but this post is about 23andMe nearing bankruptcy, where they would run an auction for their records, including the genetic information of their customers.
