Evilschnuff

joined 1 year ago
[–] [email protected] 47 points 3 months ago (2 children)

They look exactly golf-ball-sized, wtf. That's the default unit of measurement for hail!

[–] [email protected] 2 points 4 months ago

On iOS you can use Yattee and link it to an alternative frontend. Works well for me.

[–] [email protected] 23 points 6 months ago (1 children)

He studied and taught history. Ignorance is not a possible excuse there.

[–] [email protected] 19 points 6 months ago

One more reason not to buy Milka.

[–] [email protected] 28 points 7 months ago

I believe this is true for nearly all products. They have to be super simple to test, because you need to assess whether they fit your needs, and the mental model for an a priori assessment usually isn't strong enough.

[–] [email protected] 1 points 7 months ago

Yeah looks interesting!

[–] [email protected] 2 points 9 months ago

The name fits

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago)

LLMs don’t learn into long-term memory while running. They have frozen weights that can only be fine-tuned manually. Everything else is input and feedback tokens, and those are processed against the frozen weights, so there is no long-term learning. That is short-term memory only.
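
To make the frozen-weights point concrete, here is a minimal sketch (assuming PyTorch and Hugging Face transformers, with gpt2 only as a small stand-in model): inference never touches the weights, and the only thing that does is a separate, manual fine-tuning step.

```python
# Sketch: inference with frozen weights vs. manual fine-tuning.
# Assumes torch + transformers are installed; "gpt2" is just an example model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

# Inference: no gradients, no weight updates -> nothing is "learned" long-term.
# The only "memory" is whatever fits in the input token context.
with torch.no_grad():
    ids = tok("The user said their name is Alice.", return_tensors="pt")
    out = model.generate(**ids, max_new_tokens=20)
print(tok.decode(out[0]))

# Fine-tuning is the only path that changes the weights, and it is an offline,
# manual step (sketched here, not run):
# optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
# loss = model(**batch, labels=batch["input_ids"]).loss
# loss.backward(); optimizer.step()   # <- weights change only here
```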

[–] [email protected] 1 points 9 months ago (2 children)

The term embodiment is used kinda loosely. My usage means an AI learning about the world through a body, with its capabilities and social implications. What you are describing is simply not possible today. We don’t have stable lifelong learning yet. We don’t even have stable humanoid walking, even if Boston Dynamics looks advanced. Maybe in the next 20 years, but my point stands. Humans are very good at detecting minuscule differences in others, and robots won’t get the benefit of "growing up" in society as one of us. This means that an advanced AI won’t be able to connect on the same level, since it doesn’t share the same experiences. Even human therapists don’t match every patient; people usually search for a fitting therapist. An AI will be worse.

[–] [email protected] 5 points 10 months ago (4 children)

There is a theory that most therapy methods work by building a healthy relationship with the therapist and using that relationship for growth, since it’s more reliable than the relationships that caused the issues in the first place. As others have said, I don’t believe a machine has this capability, simply because it is too different. It’s an embodiment problem.

[–] [email protected] 43 points 10 months ago (1 children)
[–] [email protected] 4 points 10 months ago (1 children)

Is there no C#?
