this post was submitted on 26 Mar 2025
13 points (100.0% liked)
Programmer Humor
all programs are single threaded unless otherwise specified.
It’s safe to assume that any non-trivial program written in Go is multithreaded
And yet: you’ll still be limited to two idle connections per host when calling your REST API, because the default HTTP client was built in the dumbest way possible.
Really? Huh, TIL. I guess I've just never run into a situation where that was the bottleneck.
The client object or the library?
… Is this a trick question? The object, provided by the library (net/http, which is about as default as they come), sets “DefaultMaxIdleConnsPerHost” to 2. This is significant because if you finish a connection and you’ve got more than 2 idle, it slams that connection closed. If you have a lot of simultaneous, short-lived requests to the same IP (say a load-balanced IP), your Go programs will exhaust the ephemeral port list quickly. It’s one of the most common “gotchas” I see where Go programs work great in dev and blow themselves apart in prod.
https://dev.to/gkampitakis/http-connection-churn-in-go-34pl is a fairly decent write up.
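A minimal sketch of the usual workaround, assuming Go 1.13+ for Transport.Clone (the URL and timeout values are just placeholders): clone the default transport and raise the idle-connection caps so connections to the same host get reused instead of slammed shut.

```go
package main

import (
	"net/http"
	"time"
)

func main() {
	// Clone the default transport and raise the idle-connection caps so
	// short-lived requests to the same host reuse connections instead of
	// churning through the ephemeral port range.
	t := http.DefaultTransport.(*http.Transport).Clone()
	t.MaxIdleConns = 100
	t.MaxIdleConnsPerHost = 100 // the default is 2, as discussed above
	t.IdleConnTimeout = 90 * time.Second

	client := &http.Client{
		Transport: t,
		Timeout:   10 * time.Second,
	}

	// Placeholder endpoint, just to show the client in use.
	resp, err := client.Get("https://example.com/api")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
}
```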
Does Python have the ability to specify loops that should be executed in parallel, as e.g. Matlab uses parfor instead of for?
Python has way too many ways to do that: asyncio, future, thread, multiprocessing...
Of the ways you listed, the only one that will actually take advantage of a multi-core CPU is multiprocessing.
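For illustration, a minimal parfor-style sketch using multiprocessing.Pool; the heavy function and its inputs are made-up stand-ins for a CPU-bound loop body.

```python
from multiprocessing import Pool

def heavy(n):
    # Placeholder CPU-bound work; stands in for the real loop body.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Pool.map spreads the loop body across worker processes,
    # sidestepping the GIL for CPU-bound work.
    with Pool() as pool:
        results = pool.map(heavy, range(10_000, 10_008))
    print(results)
```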
Yup, that's true. Most meaningful tasks are IO-bound, so "parallel" basically means "whatever allows multiple threads of execution to keep going". If you're doing number crunching in Python without a proper library like pandas that can parallelize your calculations, you're doing it wrong.
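As a contrast for the IO-bound case, a small sketch with concurrent.futures.ThreadPoolExecutor; the URLs and worker count are placeholders, and the point is only that threads overlap fine while they wait on the network.

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request

# Placeholder URLs, purely for illustration.
URLS = [
    "https://example.com/a",
    "https://example.com/b",
    "https://example.com/c",
]

def fetch(url):
    # IO-bound work: the thread mostly waits on the network,
    # so the GIL is released and threads make progress concurrently.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return url, resp.status

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=8) as pool:
        for url, status in pool.map(fetch, URLS):
            print(url, status)
```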
I’ve used multiprocessing to squeeze more performance out of numpy and scipy. But yeah, resorting to multiprocessing is a sign that you should be dropping into something like Rust or a C variant.
Many numpy array functions already utilize multiple cores, because they dispatch to optimized C and BLAS routines under the hood.
I've always hated object-oriented multithreading. Goroutines (green threads) are just the best way 90% of the time. If I need to control where threads go, I'll write it in Rust.
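For reference, a tiny goroutine sketch with sync.WaitGroup; the worker count and output are arbitrary, and the runtime scheduler (not the programmer) decides which OS threads the goroutines land on.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup

	// Spin up a handful of goroutines; the Go scheduler multiplexes
	// them onto OS threads without any explicit thread management.
	for i := 0; i < 5; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			fmt.Println("worker", id, "done")
		}(i)
	}

	wg.Wait()
}
```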