Why Apple Intelligence doesn’t work on older iPhones or Vision Pro

When asked why Apple Intelligence will not be available on older iPhones, the company has responded that their chips are not powerful enough to provide a good experience — responses would simply take too long on older devices.

But many have asked: if their phone isn’t up to the task, why not just route everything through Apple’s servers – aka Private Cloud Compute? After all, that already happens today with many Siri requests. We now have the answer to that…

John Gruber put this question to Apple:

A question I get asked frequently is why devices that aren’t eligible for Apple Intelligence can’t just do everything via Private Cloud Compute. Everyone understands that if a device isn’t fast or powerful enough for on-device processing, that’s that. But why can’t older iPhones (or, in the case of the non-pro iPhone 15, newer iPhones with two-year-old chips) simply use Private Cloud Compute for everything?

From what I understand, that’s simply not how Apple Intelligence is designed to work. The models that run on-device are entirely different from the models that run in the cloud, and one of those on-device models is the heuristic that determines which tasks can be handled with on-device processing and which require Private Cloud Compute or ChatGPT.

Gruber suggests that’s probably not the only reason: Apple already has to provision a great deal of server capacity to handle the requests that can’t be processed on-device, and demand would be far higher if the servers had to handle all requests from older phones as well. Still, the stated reason seems plausible. Another surprising revelation is that Apple Intelligence will not be supported on the Vision Pro, despite its M2 chip being powerful enough to run it.

The reason here, Gruber explains, is that the chip is already working at close to full capacity and therefore doesn’t have enough headroom left for Apple Intelligence. According to well-informed little birdies, Vision Pro already makes significant use of the M2’s Neural Engine to supplement the R1 chip for real-time processing — occlusion and object detection, things like that. On M-series Macs and iPads, the Neural Engine is essentially sitting there, fully available for Apple Intelligence features. On the Vision Pro, it’s already spoken for.

Again, the explanation makes sense, but it’s a real shame, because the platform seems almost tailor-made for AI – and it’s the forerunner of a future product, Apple Glasses, that will very likely include Apple Intelligence.

One other tidbit from Gruber’s piece: prepare to be repeatedly nagged by permission prompts for handing requests off to ChatGPT. At least as things currently stand, there is no way around them. Some people would like an “Always Allow” option for sending requests to ChatGPT, but according to the Apple representatives Gruber spoke to, no such option exists.

I suspect this will change, because the constant prompts will quickly become an annoyance, but Apple may want to play it safe and conservative while the feature is new. Update: Mark Gurman reports that Apple is looking to bring Apple Intelligence to the Vision Pro next year, although there is no word on how the company will overcome the challenges Gruber described:

Apple Intelligence is coming to the Vision Pro. When Apple Intelligence was unveiled earlier this month, it was only made available for Macs, iPhones and iPads. But there’s another device ready-made for it: the Vision Pro headset. I’m told Apple is actively working on bringing these features to the device, but it won’t happen this year.
