I'm skeptical that those 4k developers are using their entire GPU for 8 hours a day. I would be surprised if even 10% of the ~~brain~~ GPU was being used. There are CI servers running on top of that, but typically far fewer of them than there are developers. I would estimate 5 GWh as a liberal upper bound.
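The 5 GWh figure can be sanity-checked with a quick back-of-the-envelope calculation. A sketch, assuming ~300 W per GPU at full load and a one-year accounting period (both are my assumptions, not from the comment):

```python
# Hypothetical inputs: 4,000 developers, one ~300 W GPU each,
# 8 hours/day at 100% utilization, 365 days/year.
devs = 4_000
gpu_kw = 0.3           # assumed per-GPU draw at full load, in kW
hours_per_day = 8
days_per_year = 365

kwh_per_year = devs * gpu_kw * hours_per_day * days_per_year
gwh_per_year = kwh_per_year / 1_000_000
print(f"{gwh_per_year:.1f} GWh/year")  # -> 3.5 GWh/year at full utilization
```

Even at 100% utilization this lands around 3.5 GWh/year, so 5 GWh is indeed a liberal upper bound; at 10% utilization it would be an order of magnitude lower.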
jsomae
Seems more like "putting things in scale" than "whataboutism." I'm not sure I agree with the premise, but I don't think it's whataboutism at all. Whataboutism would be "it's fine, because something else is worse," whereas I think the commenter is trying to say "it's not much, since it's less than something else that isn't much either."
This really fits my two hot takes about how we need to fix the left:
- (1) end purity tests: an imperfect ally is still an ally.
- (2) we need to appeal to centrists more; in particular, straight cis white men feel alienated when we talk about privilege as though it's a bad thing. We could instead talk about privilege as a great thing that you should be proud of. Then we'd have privileged allies, which would be really helpful.
e^𝘪θ^ is not just notation. You can graph the entire function e^x+𝘪θ^ across the whole complex plane and find that it matches up smoothly with both the version restricted to the real axis (e^x^) and the one restricted to the imaginary axis (e^𝘪θ^). The complete version is:
e^x+𝘪θ^ := e^x^(cos(θ) + 𝘪sin(θ))
Various proofs of this can be found on Wikipedia. Since those proofs use only basic calculus, no new notation needed to be invented along the way.
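The identity above is also easy to sanity-check numerically (a sketch, not a proof): Python's built-in `cmath.exp` should agree with the e^x^(cos(θ) + 𝘪sin(θ)) form at any sample points.

```python
import cmath
import math

# Check e^(x+i*theta) == e^x * (cos(theta) + i*sin(theta))
# at a few arbitrary sample points.
for x, theta in [(0.0, 0.0), (1.0, math.pi / 3), (-2.5, 2.0)]:
    lhs = cmath.exp(complex(x, theta))
    rhs = math.exp(x) * complex(math.cos(theta), math.sin(theta))
    assert cmath.isclose(lhs, rhs), (x, theta)
```

If the two sides disagreed anywhere, the assertion would fire; they agree to floating-point precision everywhere.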
How much energy does AI really use? (zdnet) Seems like queries aren't that expensive, so I guess the enormous energy cost of AI must be mostly from training. I reckon this is why apologists try to minimize it.