this post was submitted on 04 Oct 2023
93 points (100.0% liked)

Technology

[–] [email protected] 2 points 11 months ago (1 children)

While this is a reasonable take, the Tensor chips are supposedly focused on AI (which would make sense given their push into the AI space for phone tools like spam filtering, photo/video editing, the assistant, etc.), and this refresh builds on AI features they rolled out to previous-gen phones. I doubt any of it is so CPU-intensive that whatever AI they've created in a few years won't also run on the older phones; it just might not be as snappy.

[–] [email protected] 2 points 11 months ago (1 children)

I have a different impression of their plans to backport new AI features, but we'll see. My point is that AI-targeted hardware could drive the next smartphone evolution, which is currently slowing down.

[–] [email protected] 1 points 11 months ago (1 children)

Training AI models takes a lot of development on the software side, and is computationally intense on the hardware side. Loading a shitload of data into the process and letting the training algorithms work out how to weight each of billions or even trillions of parameters takes a lot of storage space, memory, and raw computation on ASICs dedicated to that task.

Using pre-trained models, though, is a far less computationally intensive task. Once the parameters are learned on that huge training set, the model can be applied by software that simply takes the parameters already defined in training and applies them to smaller inputs.

So I would expect the AI/ML chips in actual phones would continue to benefit from AI development, including models developed many chip generations later.
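The training-versus-inference gap above can be sketched in a toy pure-Python example (a made-up 1-D linear model, not a real ML framework or Google's actual stack): training grinds through every example many times to learn the parameters, while inference is a single cheap forward pass with parameters that are already fixed.

```python
# Toy illustration: training = many iterative passes over data;
# inference = one forward pass with already-learned parameters.

def forward(w, b, x):
    """One forward pass: the only work inference has to do."""
    return w * x + b

def train(data, epochs=1000, lr=0.01):
    """SGD on y = w*x + b. Every epoch touches every example --
    this repeated work is what makes training expensive."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            err = forward(w, b, x) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

# Expensive once, typically on big hardware:
data = [(float(x), 2.0 * x + 1.0) for x in range(10)]
w, b = train(data)  # 10,000 gradient updates in total

# Cheap ever after, on phone-class silicon:
print(round(forward(w, b, 5.0), 2))  # ≈ 11.0 for y = 2x + 1
```

Real on-device inference adds tricks like quantization and dedicated NPU kernels, but the asymmetry is the same: the heavy iterative loop happens once, off-device.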

[–] [email protected] 1 points 11 months ago

The thing is more complicated than that. Moreover, there's a wish/need to train or fine-tune models locally. That's not comparable to the initial training of ChatGPT-like models, but it still requires some power. Just today I read that some Pixel 8 video-improvement features will not be ported to the Pixel 7 because they need the Tensor G3's power.
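Local fine-tuning usually sits between the two extremes because most of the pre-trained model stays frozen and only a small head is updated. A toy pure-Python sketch (all functions and numbers invented for illustration, not any phone's actual pipeline):

```python
# Toy sketch: frozen "pre-trained" feature extractor + small trainable
# head. Only the head's few parameters get gradient updates, which is
# why fine-tuning can fit a phone's power budget while full training
# cannot.

def features(x):
    """Frozen feature extractor: never updated, so it needs no
    gradients or optimizer state during tuning."""
    return [x, x * x]

def tune_head(data, epochs=5000, lr=0.01):
    """Train only the linear head on top of the frozen features."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            f = features(x)
            err = sum(wi * fi for wi, fi in zip(w, f)) + b - y
            for i, fi in enumerate(f):
                w[i] -= lr * err * fi
            b -= lr * err
    return w, b

# Local data whose target (y = 3*x^2 + 2) the frozen features can express:
data = [(k / 4.0, 3.0 * (k / 4.0) ** 2 + 2.0) for k in range(8)]
w, b = tune_head(data)
pred = sum(wi * fi for wi, fi in zip(w, features(1.0))) + b
print(round(pred, 1))  # ≈ 5.0
```

Still "some power", as the comment says: there is a gradient loop, just over far fewer parameters than the base model has.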