this post was submitted on 26 Mar 2025
15 points (100.0% liked)

Programmer Humor

top 20 comments
[–] lime@feddit.nu 5 points 11 months ago (2 children)

all programs are single threaded unless otherwise specified.

[–] firelizzard@programming.dev 1 points 11 months ago (1 children)

It’s safe to assume that any non-trivial program written in Go is multithreaded

[–] kbotc@lemmy.world 1 points 11 months ago (2 children)

And yet: You’ll still be limited to two simultaneous calls to your REST API because the default HTTP client was built in the dumbest way possible.

[–] LodeMike@lemmy.today 1 points 10 months ago (1 children)

The client object or the library?

[–] kbotc@lemmy.world 1 points 10 months ago

… Is this a trick question? The object, provided by the library (net/http, which is about as default as they come), sets "DefaultMaxIdleConnsPerHost" to 2. This is significant because if you finish a connection and you've got more than 2 idles, it slams that connection closed. If you have a lot of simultaneous short-lived requests to the same IP (say, a load-balanced IP), your Go programs will exhaust the ephemeral port list quickly. It's one of the most common "gotchas" I see where Go programs work great in dev and blow themselves apart in prod.

https://dev.to/gkampitakis/http-connection-churn-in-go-34pl is a fairly decent write up.

[–] firelizzard@programming.dev 1 points 11 months ago

Really? Huh, TIL. I guess I've just never run into a situation where that was the bottleneck.

[–] Successful_Try543@feddit.org 0 points 11 months ago (1 children)

Does Python have a way to specify loops that should be executed in parallel, the way e.g. Matlab uses parfor instead of for?

[–] lime@feddit.nu 1 points 11 months ago (2 children)

python has way too many ways to do that: asyncio, concurrent.futures, threading, multiprocessing...
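There's no parfor keyword, but the stdlib pools get close. A small sketch contrasting two of the options above (the pool sizes and the toy workload are illustrative):

```python
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor


def cpu_heavy(n: int) -> int:
    """A CPU-bound stand-in task: sum of squares below n."""
    return sum(i * i for i in range(n))


if __name__ == "__main__":
    inputs = [50_000] * 4

    # Threads share one interpreter, so CPU-bound work serializes on the
    # GIL; a thread pool mainly helps with I/O-bound loops.
    with ThreadPoolExecutor() as pool:
        threaded = list(pool.map(cpu_heavy, inputs))

    # Separate processes each get their own interpreter, so CPU-bound
    # work can actually run on multiple cores.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(cpu_heavy, inputs))

    assert threaded == parallel
    print(parallel[0])
```

Both pools expose the same `map` interface, so switching between them is a one-line change.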

[–] WolfLink@sh.itjust.works 2 points 11 months ago (1 children)

Of the ways you listed, the only one that will actually take advantage of a multi-core CPU is multiprocessing

[–] lime@feddit.nu 1 points 11 months ago (1 children)

yup, that's true. most meaningful tasks are io-bound, so "parallel" basically qualifies as "whatever allows multiple threads of execution to keep going". if you're doing number crunching in python without a proper library like pandas that can parallelize your calculations, you're doing it wrong.

[–] WolfLink@sh.itjust.works 1 points 11 months ago* (last edited 11 months ago) (1 children)

I’ve used multiprocessing to squeeze more performance out of numpy and scipy. But yeah, resorting to multiprocessing is a sign that you should be dropping into something like Rust or a C variant.

[–] itslilith@lemmy.blahaj.zone 0 points 11 months ago

Most numpy array functions already utilize multiple cores, because they're optimized and written in C
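A quick way to see this without reaching for multiprocessing at all (a sketch, assuming numpy is installed with a threaded BLAS such as OpenBLAS or MKL):

```python
import numpy as np

# Matrix multiplication dispatches to the underlying BLAS library, which
# in most builds (OpenBLAS, MKL) is itself multi-threaded -- the Python
# code stays single-threaded while the heavy lifting can use several cores.
a = np.random.rand(500, 500)
b = np.random.rand(500, 500)
c = a @ b

print(c.shape)
```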

[–] danhab99@programming.dev 1 points 11 months ago

I've always hated object oriented multi threading. Goroutines (green threads) are just the best way 90% of the time. If I need to control where threads go I'll write it in rust.

[–] Fortatech@gregtech.eu 2 points 11 months ago

!lemmySilver

[–] alcasa@lemmy.sdf.org 1 points 11 months ago

It only took us how many years?

[–] SaharaMaleikuhm@feddit.org 0 points 11 months ago (1 children)

Oh wow, a programming language that is not supposed to be used for every single software in the world. Unlike Javascript for example which should absolutely be used for making everything (horrible). Nodejs was a mistake.

[–] lena@gregtech.eu 0 points 11 months ago (1 children)

> Nodejs was a mistake.

More choice is always better

[–] _stranger_@lemmy.world 2 points 11 months ago* (last edited 11 months ago)

And some of those choices are mistakes.

[–] kSPvhmTOlwvMd7Y7E@programming.dev 0 points 11 months ago (1 children)

let's be honest here, he actually means 0.01 core performance

[–] burlemarx@lemmygrad.ml 1 points 11 months ago

Yes, 0.99 performance being consumed by the interpreter.