this post was submitted on 06 Dec 2023
38 points (93.2% liked)

[–] [email protected] 1 points 9 months ago (1 children)

What about autocomplete? Face detection? Virtual assistants?

How much of that is really built in vs. offloaded to their cloud and then cached locally (or just not usable offline, like Assistant)?

[–] [email protected] 2 points 9 months ago (2 children)
[–] [email protected] 1 points 9 months ago (1 children)

Services running in GCP aren't built into the phone, which is kinda the main point of the statement you took issue with.

[–] [email protected] -2 points 9 months ago

What does that have to do with CACHING? That's client-server.

No clue what you're talking about

[–] [email protected] 0 points 9 months ago (1 children)

That's the entire point. Running the LLM on device is what's new here...

[–] [email protected] -1 points 9 months ago (1 children)

MLC LLM does the exact same thing. Plenty of chat apps already embed low-quality LLMs, and low-res image generation apps built on diffusion models similar to DALL-E Mini have been around for a while.

Also, Qualcomm used its AI Stack to deploy Stable Diffusion to mobile back in February, and that one wasn't the low-res version.

Think before you write.

[–] [email protected] 0 points 9 months ago (1 children)

I can't find a single production app that uses MLC LLM, for the reasons I listed earlier (like multi-GB models that aren't garbage).

Qualcomm's announcement was a tech demo, and they promised to actually ship it next year...

[–] [email protected] 0 points 9 months ago (1 children)

Who said anything about production or non-garbage? We're not talking about response quality or how widespread it is. You can use DistilRoBERTa for all I give a fuck. We're talking about whether they're the first. They're not.

Are they the first to embed an LLM in an OS? Yes. A model with over x Bn params? Maybe, probably.

But they ARE NOT the first to deploy gen AI on mobile.

[–] [email protected] 1 points 9 months ago

You're just moving the goalposts. I ran an LLM on device in an Android app I built a month ago. Does that make me the first to do it? No. They are the first to reach production with an actual product.
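
For context, here's a minimal sketch of what on-device LLM inference in an Android app can look like, using Google's MediaPipe LLM Inference task — not necessarily how I did it. The model path and token limit are placeholders, and it assumes the MediaPipe tasks-genai dependency plus a compatible model file already pushed to the device:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Runs a single blocking generation entirely on the device;
// no request ever leaves the phone.
fun runOnDeviceLlm(context: Context, prompt: String): String {
    // Point the task at a locally stored model file (placeholder path).
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/model.bin")
        .setMaxTokens(256) // placeholder generation limit
        .build()

    // Create the inference engine and generate a response for the prompt.
    val llm = LlmInference.createFromOptions(context, options)
    return llm.generateResponse(prompt)
}
```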