SpiderShoeCult

joined 2 years ago
[–] SpiderShoeCult@sopuli.xyz 2 points 1 week ago (1 children)

not sure what you mean by opioid+NSAID prescription painkillers, so I'll assume it's a mix of opioids plus NSAIDs. wouldn't make much sense to add aspirin/naproxen/ibuprofen on top of that, as they are also NSAIDs and that role's already been filled

paracetamol is also a fairly good painkiller so my guess is they're probably going for some synergy there as well

[–] SpiderShoeCult@sopuli.xyz 2 points 1 week ago (1 children)

Fully agreed. And to add on top of that, the world cannot take adequate care of all the children in it right now, and there are voices shouting we need more?

[–] SpiderShoeCult@sopuli.xyz 0 points 1 week ago (1 children)

guessing some of them had (have?) some ties to russia or russian assets one way or another

or, it could be that some russian resources may be vital and hard to replace, so they had to make some compromises. I know from a few years back, for instance, that cobalt-60 gamma sources were sourced nearly exclusively from russia. and those are needed worldwide for sterilization via gamma-irradiation of all sorts of things (including the single-use syringes widely used in hospitals). and that's just one example. wonder how many more there are

otherwise, I can't really understand why russia wasn't briefly excluded from SWIFT

[–] SpiderShoeCult@sopuli.xyz 4 points 1 week ago (1 children)

why, though? the other options would be just as likely

[–] SpiderShoeCult@sopuli.xyz 5 points 1 week ago (1 children)

The article title is not the title of this post. "Women warned weight loss jabs may affect the pill" appears for me when I click it. Is this one of those things where they switch the title around for different people?

[–] SpiderShoeCult@sopuli.xyz 7 points 1 week ago

Nationality of somebody born on a plane wouldn't be a big deal as long as at least one of the parents comes from a country where jus sanguinis (citizenship by descent) applies. If only jus soli (citizenship by birthplace, as in the USA) applies, then they could in fact be stateless unless their parents have some other nationality.

And, if I remember correctly, the captain has the responsibility to record births and deaths on board an airplane. So you might be on to something with the paperwork.

[–] SpiderShoeCult@sopuli.xyz 3 points 1 week ago

what about the blood?

[–] SpiderShoeCult@sopuli.xyz 24 points 1 week ago

I think he means that the ureter itself cannot be transplanted, so the root cause is not solvable. A kidney transplant would be like putting in a new lightbulb in a defective lamp.

[–] SpiderShoeCult@sopuli.xyz 7 points 1 week ago (1 children)

I know it's probably a typo but, man, 'burning Koreans' is a bit of an extreme stance to have on migration, even for extremists.

[–] SpiderShoeCult@sopuli.xyz 2 points 2 weeks ago (1 children)

OP's username checks out. Another case of nominative determinism?

[–] SpiderShoeCult@sopuli.xyz 21 points 2 weeks ago

you now have Tourette's and speak that incantation uncontrollably. the only medication for it is lead-based. lead is otherwise harmless to you.

[–] SpiderShoeCult@sopuli.xyz 2 points 3 weeks ago (1 children)

1 mL (or cubic centimeter) of water weighs 1 g, not 1 mg. 1 mg would be 1 microliter of water, or one millionth of a liter.
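The conversion as a quick sketch (plain arithmetic, assuming water's density of 1 g/mL):

```python
# Density of water: 1 g per mL (i.e. per cubic centimeter).
DENSITY_G_PER_ML = 1.0

def water_mass_g(volume_ml: float) -> float:
    """Mass in grams of a given volume of water, in mL."""
    return volume_ml * DENSITY_G_PER_ML

print(water_mass_g(1.0))           # 1 mL of water -> 1.0 g
print(water_mass_g(1e-3) * 1000)   # 1 microliter -> 1.0 mg
```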

 

Hey, so first off, this is my first time dabbling with LLMs, and most of the information I found by rummaging through GitHub repos.

I have a fairly modest set-up: an older gaming laptop with an RTX 3060 video card with 6 GB of VRAM. I run everything inside WSL2.

I have had some success running FastChat with the Vicuna 7B model, but it's extremely slow: roughly 1 word every 2-3 seconds of output, with --load-8bit, lest I get a CUDA OOM error. It starts faster, at 1-2 words per second, but slows to a crawl later on (I suspect it's because it also uses a bit of the 'shared video RAM', according to Task Manager).

So I heard about quantization, which is supposed to compress models at the cost of some accuracy. I tried ready-quantized models (compatible with the FastChat implementation) from huggingface.co, but ran into an issue: whenever I'd ask something, the output would be repeated quite a lot. Say I'd say 'hello' and I'd get 200 'Hello!' in response.

I then tried quantizing a model myself with exllamav2 (using some .parquet wikitext files, also from huggingface, for calibration) and running it through FastChat, but the problem persists: endless repeated output. It does work faster at the actual generation, though, so at least that part is going well.
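For reference, here's the rough VRAM arithmetic that pushed me toward 8-bit loading and quantization in the first place (a back-of-the-envelope sketch of weight memory only; the real footprint is higher because of the KV cache, activations, and CUDA overhead):

```python
# Back-of-the-envelope weight memory for a 7B-parameter model.
# Ignores KV cache, activations, and framework overhead, so the
# real requirement is higher than these numbers.
PARAMS = 7e9  # 7 billion parameters

def weights_gb(bits_per_param: float) -> float:
    """Approximate weight footprint in GB (1 GB = 1e9 bytes)."""
    return PARAMS * bits_per_param / 8 / 1e9

print(weights_gb(16))  # fp16:  ~14.0 GB -- no chance on 6 GB VRAM
print(weights_gb(8))   # 8-bit:  ~7.0 GB -- still spills into shared RAM
print(weights_gb(4))   # 4-bit:  ~3.5 GB -- actually fits on the card
```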

Any ideas on what I'm doing wrong?
