Yeah, BI didn’t want to get beaten to the headline, so they went to press, but what a wasted opportunity.
@knowledge_seeker@lemmy.world this comment is mostly it. Live in America for a bit and you’ll see there are people like that and people who aren’t. And guess what: every other country, too. Americans also have speech patterns that push more air, which sounds loud and projecting to many other nationalities. (Am American)
This is so, so variable. Cost of living in US can swing dramatically with food and housing.
But surely you know another beekeeper who is on Lemmy? There must be 1s of you!
But you’d need to spend some of that memory convincing yourself you actually were in a time loop, or you’d spend the whole time thinking the memory is some weird dream.
For me, the pinned issue keeps me from using it: https://github.com/vogler/free-games-claimer/issues/183
And for GOG you can have something refresh a link daily and it grabs whatever is there.
But immich does so much more than faces. I searched for “tent on the beach” the other day, worked like a charm.
So. Much. Better. I love Nextcloud but Immich uploads just work, private AI tagging, better UX, maps, it’s just better.
Searching. XNG makes the “ch” then “ng” sound in some languages.
Thus Searx was “Search”
@Crozekiel@lemmy.zip
@sibachian@lemmy.ml
@AnotherPenguin@programming.dev
@catloaf@lemm.ee
@fatboy93@lemm.ee
@Sturgist@lemmy.ca
I came really close back then, had a deposit but canceled due to realizing I had a perfectly good old car and didn’t “need” one at the time. Hindsight says I lucked out.
If I did get it, and still had it (I would, I keep cars for 10+ years), I would probably rebadge it as an indicator of my dislike for what the brand has become.
I’d actually love to see a rebadging campaign by dissatisfied owners.
When you can predict the behavior of a company 10 years out, you let me know. I’ll make you my banker.
Until then, that whole comment is hot garbage.
Or spam detection. A standard Bayesian filter trained on user reports would catch these.
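For anyone curious what that actually looks like: a minimal multinomial naive Bayes sketch, where “reports” are just messages labeled spam. The tokenizer and training data here are simplified assumptions, not any particular library’s API.

```python
import math
from collections import Counter

def tokenize(text):
    # Naive whitespace tokenizer; real filters normalize much harder.
    return text.lower().split()

class BayesFilter:
    def __init__(self):
        self.counts = {"spam": Counter(), "ham": Counter()}
        self.totals = {"spam": 0, "ham": 0}
        self.docs = {"spam": 0, "ham": 0}

    def train(self, text, label):
        words = tokenize(text)
        self.counts[label].update(words)
        self.totals[label] += len(words)
        self.docs[label] += 1

    def score(self, text):
        # log P(label) + sum of log P(word | label), Laplace-smoothed.
        vocab = len(set(self.counts["spam"]) | set(self.counts["ham"])) or 1
        total_docs = self.docs["spam"] + self.docs["ham"]
        out = {}
        for label in ("spam", "ham"):
            logp = math.log(self.docs[label] / total_docs)
            for w in tokenize(text):
                logp += math.log((self.counts[label][w] + 1) /
                                 (self.totals[label] + vocab))
            out[label] = logp
        return out

    def is_spam(self, text):
        s = self.score(text)
        return s["spam"] > s["ham"]
```

Every report shifts the word statistics, so the filter keeps improving as users flag posts.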
While I know we all find this funny, it’s also fantastic.
With the use of agents bound to grow, this removes the need for TTS and STT, meaning no power-hungry GPU in the mix. A low-power microprocessor can handle this kind of communication.
You can run your own sync server in Docker.
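For the self-hosters: a hypothetical Compose sketch of what that could look like. The image name, port, and volume path are placeholders, not the project’s actual values; check the sync server’s own docs for the real ones.

```yaml
# docker-compose.yml — illustrative sketch only; swap in the real
# image name and environment variables from the project's README.
services:
  syncserver:
    image: example/sync-server:latest   # placeholder image
    ports:
      - "8000:8000"                     # placeholder port
    volumes:
      - ./data:/data                    # persist sync state across restarts
    restart: unless-stopped
```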
Love me some Obtainium. Did my first PR this week (adding cross-device sync via SxncD)
To anyone complaining about non-replaceable RAM: This machine is for AI, that is why.
Think of it like a GPU with a CPU on the side, vs the other way around.
Inference requires very fast RAM transfer speeds, and that is (currently) only possible on soldered buses. Even this is pretty slow at 256 GB/s, but its ability to dedicate up to 96 GB of RAM to the GPU makes it interesting for larger models.
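The bandwidth point can be made concrete with back-of-envelope arithmetic: during decode, each generated token reads roughly every model weight once, so bandwidth divided by model size bounds tokens per second. The numbers below are illustrative assumptions, not benchmarks of this machine.

```python
# Memory bandwidth bounds LLM decode speed: each token reads ~all weights.
BANDWIDTH_GB_S = 256   # the soldered-bus figure from the comment above
VRAM_GB = 96           # RAM allocatable to the GPU

def max_tokens_per_sec(model_size_gb):
    """Rough upper bound on decode tokens/sec: bandwidth / bytes per token."""
    return BANDWIDTH_GB_S / model_size_gb

# A ~70B-parameter model at 4-bit quantization is roughly 40 GB,
# which fits in 96 GB but is bandwidth-limited:
print(round(max_tokens_per_sec(40), 1))  # ceiling of ~6.4 tokens/sec
```

So the large soldered pool lets big models fit at all, while the 256 GB/s figure caps how fast they run.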
Especially the dishwashers.
This person is a consumer, just like you. Your gaming is no more important than their fiddling. Your angst is pointed in the wrong direction.