• 0 Posts
  • 34 Comments
Joined 11 months ago
Cake day: January 1st, 2024

  • it means you’re getting fucked by them and not in a good way

    So anal sex is a not-good way to have sex? Yeah sorry but that does sound pretty homophobic to me.

    without lube

    Ah, well that changes things. Anal without lube is a pretty universally bad experience, so sure, use that. But framing being on the receiving end of anal as bad without any further context? We can do better than that, that’s all I’m saying.

  • I’m German, and I would not want that. German grammar works differently in a way that makes programming a lot more awkward. For example, “.forEach” would technically need three different spellings depending on the grammatical gender of the type of element in the collection it’s called on. Of course you could just go with neuter and say it refers to the “items” in the collection, but that’s just one of many small pieces of awkwardness that stack on top of each other when you try to translate programming languages and APIs. I really appreciate how much more straightforward all of that is in English.
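
    To make that concrete, here’s a small TypeScript sketch of the problem. The German method names are invented for illustration and not a real API:

    ```typescript
    // German "for each" agrees with the grammatical gender of the element noun:
    //   der Wert (masc.)    -> für jeden Wert
    //   die Zahl (fem.)     -> für jede Zahl
    //   das Element (neut.) -> für jedes Element

    const werte: number[] = [1, 2, 3];

    // English: one spelling of .forEach covers every element type.
    werte.forEach((wert) => console.log(wert));

    // A faithful German translation would need three spellings:
    //   werte.fürJedenWert(...)
    //   zahlen.fürJedeZahl(...)
    //   elemente.fürJedesElement(...)
    ```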

  • It is an algorithm that searches a dataset and when it can’t find something it’ll provide convincing-looking gibberish instead.

    This is very misleading. An LLM doesn’t have access to its training dataset in order to “search” it. Producing convincing-looking gibberish is what it always does; that’s its only mode of operation. The key is that the gibberish coming out of today’s models is so convincing that it actually becomes broadly useful.

    That also means that no, not everything an LLM produces has to have been in its training dataset; they can absolutely output things that have never been said before. There’s even research showing that LLMs are capable of creating actual internal models of real-world concepts, which suggests a deeper kind of understanding than the “stochastic parrot” moniker would have you believe.

    LLMs do not make decisions.

    What do you mean by “decisions”? LLMs constantly make decisions about which token comes next; that’s really all they do. And in doing so, on a higher, emergent level, they can make any kind of decision you ask them to. The only question is how good those decisions are going to be, which in turn depends entirely on the training data, how good the model is, and how good your prompt is.
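
    As a toy sketch of what that per-token “decision” looks like mechanically (TypeScript, made-up numbers, no real model involved): the model scores every token in its vocabulary, and the next token is sampled from the resulting probability distribution.

    ```typescript
    // Turn raw per-token scores (logits) into a probability distribution.
    function softmax(logits: number[], temperature = 1.0): number[] {
      const max = Math.max(...logits);
      const exps = logits.map((l) => Math.exp((l - max) / temperature));
      const sum = exps.reduce((a, b) => a + b, 0);
      return exps.map((e) => e / sum);
    }

    // The per-step "decision": sample one token index from the distribution.
    function sampleNextToken(logits: number[]): number {
      const probs = softmax(logits);
      let r = Math.random();
      for (let i = 0; i < probs.length; i++) {
        r -= probs[i];
        if (r <= 0) return i;
      }
      return probs.length - 1; // guard against floating-point rounding
    }

    // Toy 4-token vocabulary where the model strongly prefers token 2.
    console.log(sampleNextToken([0.1, 0.3, 2.5, -1.0])); // usually 2
    ```

    Every higher-level “decision” you see in a reply is built out of many of these little sampling steps in a row.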

  • My “best we got” was in regard to the potential to become a lot worse because of shareholder pressure. Given that CD Projekt is a publicly traded company, GOG is much worse in that regard than Steam.

    I fully agree that GOG, as it currently is, could be the better product for you depending on your values, but its defenses against enshittification are objectively much worse than Steam’s*, and that’s all I was talking about.

    *That is, until Gabe dies, I guess. Who knows what’ll happen then.

  • Not really. Timezones, at their core (so without DST or any other special rules), are just a constant offset that you can very easily translate back and forth across; that’s trivial as long as you remember to do it. Having lots of them doesn’t really make anything harder, as long as you can look them up somewhere. DST, leap seconds, etc. are what make shit complicated, because they bend, break, or overlap a single timeline to the point where suddenly you have points in time that happen twice, or that never happen, or where time runs faster or slower for a bit. That is incredibly hard to deal with consistently, much more so than just applying the simple offset you’re operating within.
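
    A quick sketch of that “constant offset” model in TypeScript (toy code with invented names; real software should lean on a timezone library):

    ```typescript
    const HOUR_MS = 3_600_000;

    // With a fixed offset, conversion is plain arithmetic in both directions.
    function utcToLocal(utcMs: number, offsetHours: number): number {
      return utcMs + offsetHours * HOUR_MS;
    }

    function localToUtc(localMs: number, offsetHours: number): number {
      return localMs - offsetHours * HOUR_MS;
    }

    // Round-trips exactly for any fixed offset, e.g. UTC+5:30:
    const now = Date.now();
    console.log(localToUtc(utcToLocal(now, 5.5), 5.5) === now); // true

    // DST breaks this model: the offset becomes a function of the timestamp
    // itself, so some wall-clock times happen twice and others never happen.
    ```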

  • You’re not wrong, but the way you put it makes it sound a little too intentional, I think. It’s not like the camera sees infrared light and makes a deliberate choice to display it as purple. The camera sensor has red, green, and blue pixels, and it just so happens that these pixels are receptive to a wider range of the light spectrum than their counterparts in the human eye, including some infrared. Infrared light apparently triggers the pixels in roughly the same way that purple light does, and the sensor can’t distinguish between infrared light and light that actually appears purple to humans, so that’s why it shows up like that. It’s just an accidental byproduct of how camera sensors work, plus the budgetary decision not to include an infrared filter in the lens to prevent it from happening.
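
    To illustrate that indistinguishability with a toy TypeScript sketch (the response numbers are invented; real sensor curves vary per camera): the sensor only ever reports per-pixel red/green/blue intensities, so any light that excites the pixels the way purple light does gets rendered as purple.

    ```typescript
    type RGB = { r: number; g: number; b: number };

    // Hypothetical sensor responses to two different light sources:
    const purpleLight: RGB = { r: 0.7, g: 0.1, b: 0.8 };
    const infrared: RGB = { r: 0.7, g: 0.1, b: 0.7 }; // leaks past the color filters

    // All downstream processing ever sees is these three numbers per pixel.
    function asCss(p: RGB): string {
      const c = (x: number) => Math.round(x * 255);
      return `rgb(${c(p.r)}, ${c(p.g)}, ${c(p.b)})`;
    }

    console.log(asCss(purpleLight)); // rgb(179, 26, 204)
    console.log(asCss(infrared));    // rgb(179, 26, 179), nearly the same purple
    ```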