Well that sucks I really do like the smell. But maybe I’m just crazy.
You caught me
I don’t usually recognize Lemmy users, but I’ve run into your posts a lot these last few days. I assumed you were a bot, but from interactions like these it’s clear you mean well. Love to see Lemmy growing from power users like yourself!
This is a great list! Thanks for sharing!
I had a similar relationship with my father. He was an alcoholic. These days I don’t have much of a relationship with him. I recognize that he’s a better person now that he’s older, but I don’t really see him as “dad”. Just “father”.
My mom sometimes asks if I will regret not spending more time with him. Honestly, I’m not sure I would. I don’t have many fond memories with him at all. It’s weird saying this knowing that I have a father who loves me in his own way when others might not have one at all.
This is a really thoughtful and educational answer. I learned a lot from this. Thank you!
That’s what I already do, which got me thinking: why are apps so bad at this when they can do the same thing?
Sounds like the system is just stuck on old tech. If I can tell from a precipitation radar map that rain will reach my area, I’m fairly certain an ML-based system can do the same.
I suppose there’s just no money in it.
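The naive version of that “eyeball the radar” forecast is easy to sketch: estimate how the rain moved between two radar frames, then shift the latest frame forward by the same amount. A toy example with NumPy (the 5×5 grid and single rain cell are made up for illustration):

```python
import numpy as np

# Toy "radar frames": a single rain cell moving one grid cell east per step
t0 = np.zeros((5, 5)); t0[2, 1] = 1.0
t1 = np.zeros((5, 5)); t1[2, 2] = 1.0

# Estimate the motion by brute-force cross-correlation over small shifts:
# the shift that best maps t1 back onto t0 is the cell's motion per step
best = None
for dy in range(-2, 3):
    for dx in range(-2, 3):
        score = (np.roll(t1, (-dy, -dx), axis=(0, 1)) * t0).sum()
        if best is None or score > best[0]:
            best = (score, dy, dx)
_, dy, dx = best

# Extrapolate: apply the same shift to the latest frame for a one-step forecast
forecast = np.roll(t1, (dy, dx), axis=(0, 1))
```

Real nowcasting systems do a fancier version of this (optical flow plus learned growth/decay), but the core idea is the same extrapolation a person does by eye.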
This is exactly what I’m looking for! I also often check the weather to get a gauge on whether it’s clear to walk my dogs. 2-3 hours ahead is perfect.
Gotcha. All I can add is a vote for Plausible. It’s great!
I spun up Plausible for my company site. Pretty straightforward, honestly. It’s completely dockerized, so there’s not a lot going on.
Your biggest problem is exposing the service on your home server to the internet. I personally wouldn’t recommend it, but if you know what you’re doing it’s possible.
I see your point. Rereading the OP, it looks like I jumped to a conclusion about LLMs and not AI in general.
My takeaway still stands for LLMs. These models have gotten huge with little net gain on each increase. But a Moore’s Law equivalent should apply to context sizes. That has a long way to go.
What drastically better results are you thinking of?
Model sizes have grown far beyond practical necessity, so a Moore’s Law equivalent doesn’t have much room to matter there. That is, models have become so huge that they’re already performing at 99% of the capability they ever will.
Context size, however, has a lot farther to go. You can think of context size as “working memory”, whereas model size is more akin to “long-term memory”. The larger the context, the more a model can understand beyond the scope of its original training in one go.
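To make the “working memory” framing concrete, here’s a rough back-of-the-envelope check of whether a document fits in a model’s context window. The window sizes and the ~4-characters-per-token ratio are illustrative assumptions, not real model specs:

```python
# Illustrative context windows (in tokens) -- made-up model names and sizes
CONTEXT_WINDOWS = {"small-model": 4_096, "large-model": 128_000}

def rough_token_count(text: str) -> int:
    # Common rule of thumb for English text: roughly 4 characters per token
    return len(text) // 4

def fits(text: str, model: str) -> bool:
    return rough_token_count(text) <= CONTEXT_WINDOWS[model]

doc = "word " * 10_000  # ~50,000 characters, ~12,500 estimated tokens
print(fits(doc, "small-model"))  # False: exceeds the 4K window
print(fits(doc, "large-model"))  # True: fits in the 128K window
```

The point is that the same model weights (“long-term memory”) can only reason over whatever fits in that window at once, which is why growing it matters so much.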
Whoa that’s a nice piece of trivia. Did some googling and it definitely has roots in MUDs, but Andrew obviously had higher ambitions visually. That’s cool.
That’s awesome! I’ve noticed it on lists of top-voted MUDs for a long time, but never quite got into that particular flavor.
MUDs. Text-based (generally RPG) games with incredibly immersive storytelling, near-infinite levels of character customization, and many even feature ways for players to build on the world itself.
I’m surprised it’s not more popular amongst D&D enthusiasts.
In their heyday, people spent thousands of dollars just to boost their characters on massive for-profit MUDs like those created by Iron Realms. But smaller MUDs like Ancient Anguish were just as good.
Sadly they’re going extinct. Only a few MUDs are still actively maintained.
Image file format with excellent compression. It’s designed for web browsers, so what you’re probably running into is compatibility with other programs. It’s fairly easy to convert to GIF or JPEG, though.
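For example, converting a WebP in Python with Pillow is basically a one-liner (this assumes your Pillow build includes WebP support, which modern wheels do; the filenames are made up, and the sketch creates its own sample file so it runs standalone):

```python
from PIL import Image

# Create a tiny sample WebP (stand-in for a file you downloaded from the web)
Image.new("RGB", (8, 8), "red").save("sample.webp")

# The actual conversion: open the WebP, re-save as JPEG.
# JPEG has no alpha channel, hence the convert("RGB") before saving.
Image.open("sample.webp").convert("RGB").save("sample.jpg", "JPEG")
```

Command-line tools like ImageMagick or `cwebp`/`dwebp` work just as well if you’d rather not script it.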
Now he’s going to pay the bots! 🤣
This is a really awesome answer. Thanks for taking the time to write it all out.
I think changing out my detergent for something really scent free is a good first step.