[–] [email protected] 0 points 1 year ago (1 children)

The thing is, the LLM doesn’t actually know anything, and lies about it.

Just like your average human journalist. If you've ever read an article in a non-specialist outlet on a topic you're familiar with, you know. This actually seems to be where LLMs are very similar to how the human brain works: if we don't know something, we come up with some bullshit.

[–] [email protected] 0 points 1 year ago (1 children)

Even mediocre human writers can comprehend their work as a whole, though. There is a cohesiveness even to the bullshit. The LLM is just putting words down that match the prompt. It's RNG-driven, readable Lorem Ipsum.

If the results were still edited afterwards, there might be some merit to the output, but any company going full LLM isn't looking for quality. They want to use it to churn out endless content that they simply couldn't get from even a team of humans: more than could be edited even if they kept editors on staff.
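
To put the "RNG-driven" bit in concrete terms: the generation loop is basically a weighted dice roll over a probability table, repeated one token at a time. A toy sketch (made-up tokens and probabilities, not any real model's API):

```python
import random

# Toy "next-token" distribution. In a real LLM these probabilities come from
# a neural network conditioned on the prompt and everything generated so far;
# here they are simply made up for illustration.
next_token_probs = {
    "the": 0.40,
    "a": 0.25,
    "journalism": 0.20,
    "percussionist": 0.15,
}

def sample_next_token(probs):
    # Weighted random choice -- this is where the RNG comes in.
    tokens = list(probs.keys())
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

# Each pick is locally plausible given the table, but nothing in this loop
# models the text "as a whole".
generated = [sample_next_token(next_token_probs) for _ in range(5)]
print(" ".join(generated))
```

A real model conditions the table on the prompt, but the final step is still a sample from a distribution, which is the point.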

[–] [email protected] 0 points 1 year ago (1 children)

Even mediocre human writers can comprehend their work as a whole, though

Sure, but a lot of humans are rather bad writers.

but any company going full LLM isn’t looking for quality.

That is true for the 24-hour news cycle of online media, regardless of LLMs.

[–] [email protected] 0 points 1 year ago (1 children)

Sure, but a lot of humans are rather bad writers.

Bad writing is still a step above RNG junk, imo.

but any company going full LLM isn’t looking for quality.

That is true for the 24-hour news cycle of online media, regardless of LLMs.

Yes, that was my point. Setting up your company to put out more content than can possibly be processed by humans is a glaring sign of their values, i.e. quantity far above quality.

[–] [email protected] 0 points 1 year ago (1 children)

Bad writing is still a step above RNG junk, imo.

I've read writing worse than GPT's. I had to help someone write an essay, and I just wrote it for him in the end, because he absolutely lacked the skills to write a long, meaningful text. At the same time, he's a genius of a percussionist.

[–] [email protected] 0 points 1 year ago (1 children)

Do you think that person was signing up for jobs writing for blogs or content farms?

[–] [email protected] 0 points 1 year ago (1 children)

Have you read some low-quality journalism? The whole yellow press could be replaced with GPT and no one would ever see a difference.

[–] [email protected] 0 points 1 year ago (1 children)

Ok, so do you wanna talk about your terrible writing partner in school? Or "yellow press"? Or maybe the topic of the article, which isn't journalism in the slightest? Or how about my point, which was, again, that even bad writers have context, as opposed to an LLM, which is just filling in the arbitrary patterns it's programmed to delineate? Readability is not what I'm talking about.

[–] [email protected] 0 points 1 year ago (1 children)
[–] [email protected] -1 points 1 year ago* (last edited 1 year ago)

Did you get the room you were looking for, since you asked for it thrice? Trivago.