[email protected] | 7 points | 1 month ago (last edited)

> But there isn’t any mechanism inherent in large language models (LLMs) that would seem to enable this and, if real, it would be completely unexplained.

There's no known mechanism in LLMs that allows for anything. It's a black box; everything we know about them is empirical.

> LLMs are not brains and do not meaningfully share any of the mechanisms that animals or people use to reason or think.

It's a lot like a brain. A small, unidirectional brain, but a brain.

> LLMs are a mathematical model of language tokens. You give an LLM text, and it will give you a mathematically plausible response to that text.

I'll bet you a month's salary that this guy couldn't explain said math to me. Somebody just told him this, and he's extrapolated way more than he should from "math".
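
For what it's worth, the output end of "said math" is short enough to write down. A minimal sketch (NumPy; the `logits` vector of per-token scores is assumed to come from the model's forward pass, which isn't shown):

```python
import numpy as np

# Minimal sketch: turn a model's raw per-token scores ("logits") into a
# probability distribution and sample the next token from it. The logits
# vector is assumed given; this is only the sampling step, not the model.
def sample_next_token(logits: np.ndarray, temperature: float = 1.0) -> int:
    scaled = logits / temperature
    scaled -= scaled.max()                         # for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()  # softmax
    return int(np.random.choice(len(probs), p=probs))
```

That's the sense in which the response is "mathematically plausible": it's literally sampled from a probability distribution over tokens.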

I could possibly implement one of these things from memory, given the weights. Definitely if I'm allowed a few reference checks.
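
To make that concrete, here's roughly the piece I'd write down first, from memory: causal scaled dot-product self-attention, the core of a transformer layer. A sketch under my own naming, not any library's API; the projection matrices are the "given weights":

```python
import numpy as np

# Sketch of causal self-attention for one layer. x is a (seq_len, d_model)
# matrix of token embeddings; w_q, w_k, w_v are the given weight matrices.
def causal_self_attention(x, w_q, w_k, w_v):
    q, k, v = x @ w_q, x @ w_k, x @ w_v            # queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])        # scaled pairwise similarity
    # "unidirectional": each position attends only to itself and earlier ones
    scores += np.triu(np.full(scores.shape, -np.inf), k=1)
    scores -= scores.max(axis=-1, keepdims=True)   # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ v                             # context-weighted values
```

A full model is this plus embeddings, feed-forward layers, and normalization, stacked many times; tedious, but not mysterious math.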


Okay, this article is pretty long, so I'm not going to read all of it, but it isn't only in front of naive audiences that LLMs seem capable of complex tasks; measured scientifically, a lot of that capability holds up. I get the sense the author's conclusion was a motivated one.