sciencesebi

joined 9 months ago
[–] [email protected] 1 points 9 months ago

I've known companies (usually smaller ones) that gave out genuine tasks or bugs they had during interviews.

[–] [email protected] 0 points 9 months ago (1 children)

Who said anything about production-grade or non-garbage? We're not talking about quality of responses or how widespread it is. You can use distilled RoBERTa for all I give a fuck. We're talking about whether they're the first. They're not.

Are they the first to embed an LLM in an OS? Yes. A model with over X billion params? Maybe, probably.

But they ARE NOT the first to deploy gen AI on mobile.

[–] [email protected] 0 points 9 months ago (1 children)

https://llm.mlc.ai/docs/deploy/android.html

Or does it have to be on the Play Store, or meet some other BS requirement you'll use to backpedal?
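
For context, driving an MLC-compiled model looks roughly like this from the Python side; the Android build in those docs pushes the same compiled model through the same runtime on-device. This is only a sketch: the class and method names follow MLC LLM's quickstart as I remember it and may differ between versions, and the model id is just an example.

```python
# Rough sketch only: class/method names follow the MLC LLM Python quickstart
# as I remember it and may not match your installed version.
from mlc_llm import MLCEngine

# Example model id (substitute any MLC-compiled model you have).
model = "HF://mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC"
engine = MLCEngine(model)

# OpenAI-style chat call, streamed chunk by chunk.
for chunk in engine.chat.completions.create(
    messages=[{"role": "user", "content": "Say hi from an on-device LLM."}],
    model=model,
    stream=True,
):
    for choice in chunk.choices:
        print(choice.delta.content or "", end="", flush=True)
print()

engine.terminate()
```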

[–] [email protected] 0 points 9 months ago* (last edited 9 months ago) (3 children)

Are you familiar with the difference between a title and a paragraph? Apparently not.

Answered the same question here

Feel free to not respond when you realize you are wrong and you have no clue what I'm talking about.

[–] [email protected] -2 points 9 months ago* (last edited 9 months ago) (1 children)

IEEE defines it as any software whose actions automate a human behavior. All of those fall under that definition.

[–] [email protected] -2 points 9 months ago

What does that have to do with CACHING? That's client-server.

No clue what you're talking about

[–] [email protected] -1 points 9 months ago (3 children)

MLC LLM does the exact same thing. Lots of chat apps have low-quality LLMs embedded in them. Low-res image generation apps built on diffusion models similar to DALL-E mini have been around for a while.

Also, Qualcomm used its AI stack to deploy Stable Diffusion to mobile back in February. And that's not the low-res one.

Think before you write.

[–] [email protected] 2 points 9 months ago (7 children)

Why would that matter?

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (6 children)

I was talking about the title, not the 10th paragraph way down. Use your reading skills and tell me where the fuck "generative" is in the title.

No. Autocomplete is a feature. The model behind it can be gen AI, and has been for a number of years. IDGAF if it's not general purpose.

The point is you have no fucking clue what you're defending. LLMs and diffusion models have been in apps for months. You can say that embedding general-purpose LLMs into mobile OS functions is novel; the rest of it is bullshit.

[–] [email protected] 2 points 9 months ago

That's my point. AI includes features that were added years ago. Even ML is too broad: autocomplete uses small ML models, and so do spam filters.

I think they mean LLMs, and specifically distilled BARDs. So a subset of a subset of a subset of AI.

Neckbeard marketing

[–] [email protected] -1 points 9 months ago (8 children)

It says AI, not gen AI. Anyway, autocomplete is gen AI, even though it may be just simple GloVe embeddings and a Markov chain.

You don't know what the fuck you're talking about.
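
To spell out why even that counts as generative, here's a toy sketch: a first-order Markov chain over a made-up corpus that "autocompletes" by sampling the next word. Everything below (the corpus, the function name) is invented for illustration; real keyboard autocomplete uses small n-gram or neural models, but it generates text the same way.

```python
import random
from collections import defaultdict

# Tiny made-up corpus; real autocomplete models are trained on far more text.
corpus = (
    "see you soon . see you later . talk to you soon . "
    "talk to you later . see you at work ."
).split()

# First-order Markov chain: record which words follow each word.
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

def suggest(prefix_word: str, n_words: int = 3) -> str:
    """Generate a short completion by sampling the chain word by word."""
    out, word = [], prefix_word
    for _ in range(n_words):
        candidates = transitions.get(word)
        if not candidates:
            break
        word = random.choice(candidates)
        out.append(word)
    return " ".join(out)

# The suggestion is generated by sampling, not looked up verbatim.
print("see ->", suggest("see"))
print("talk ->", suggest("talk"))
```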

[–] [email protected] 3 points 9 months ago (24 children)

"The first phone with AI built in."

LOL, Google are delirious.

What about autocomplete? Face detection? Virtual assistants?
