Apple Intelligence is cool, useful, and charmingly unfinished. Unlike some other AI projects, this software update for Macs, iPhones, and iPads does not seem to pose a threat to human survival. Two of its most notable promised features are a functional Siri and privacy. But though it is about to launch, none of it fully works just yet, and much of it likely won't for several months.
At this year's annual new iPhone event, months after first announcing the product, Apple finally disclosed that Apple Intelligence will launch in October. It's compatible only with the most recent devices: the latest iPhone models, including the iPhone 15 Pro, as well as MacBooks and iPads with M1 processors or later. And it won't ship as a finished product, either. When you update your devices in a few weeks, you'll get a beta version, which may or may not work better than the beta I've been testing on my laptop and phone for the past few weeks.
Despite the unresolved bugs, I can already see how Apple Intelligence will change the way I use my Mac and iPhone every day. It's a small but meaningful shift, though I doubt any of these new AI-powered habits will drastically alter my life, at least not in the next year or so.
Before I dig into what it's like to use Apple Intelligence, let's review Apple's surprisingly sober promises about the product. Apple Intelligence is not meant to astound you. Apple calls it "AI for the rest of us," which, when you consider the competition, essentially translates to "AI that won't frighten you." Earlier this year, Google's Gemini ran into trouble when its image generation revealed odd biases. ChatGPT has unnerved people almost from the moment it launched in the fall of 2022. And experts have warned that the unchecked advance of extremely powerful AI, a technology that consumes enormous resources and remains poorly understood, could ultimately doom humanity.
So yes, I'll take the diet version of that. I am the rest of us. I'm all for Apple Intelligence. Unfortunately, I won't be able to use most of it for quite some time.
Apple Intelligence is slow and steady, and not frightening at all.
The new Apple Intelligence technology is integrated into the latest versions of iOS, iPadOS, and macOS. Once your software is updated, you'll be able to turn it on, though you might not notice it at first because some features are tucked away in menus. Once you locate them, however, they mostly work!
In Monday's iPhone 16 announcement, Apple outlined the four pillars of Apple Intelligence: language, images, action, and personal context. The first two refer to capabilities found in many generative AI programs. Language describes Apple Intelligence's ability to read and summarize things, such as your email inbox and notifications, and to revise text you enter in apps like Notes and Pages. Images refers to its image-editing tools, such as Clean Up, which lets you remove objects from photos.
Each of these features was available in the developer beta I tried. (Beta versions of software are pre-release builds that companies make available to a large population to stress-test them.) The new Apple Intelligence features are fine. The summaries are adequate, if occasionally prone to errors. The email summaries were the first Apple Intelligence feature I noticed, though you won't see them if you use the Gmail app. The Writing Tools feature can do quite a bit more: highlight the text you want to play around with, then tap to bring up the dashboard, where you can adjust your writing in fairly simple ways, making it sound friendlier or more formal. You can also use Apple Intelligence to record meetings or phone calls and then transcribe and summarize what was said. None of this is groundbreaking, but even the most basic writing tools can save you time.
Clean Up is a useful image-editing tool, though it bears a striking resemblance to the Magic Eraser feature that has been available in Google Photos for some time. With Clean Up, you use your finger or mouse to rub out the unwanted part of a photo, and it magically vanishes. But it left my pictures looking noticeably altered: when I removed a swan from a photo of my wife and daughter by a pond, the water looked doctored. Then again, that at least addresses the concern about AI producing fake images that are hard to detect.
Apple says that two more image-generation features will be available “later this year and in the months following.” Image Playground allows you to create illustrations, while Genmoji allows you to create your own custom emoji.
The same caveat applies to the other two Apple Intelligence pillars, action and personal context. These vague terms generally refer to the new Siri's capabilities and increased personalization. One example Apple gives of this evolution: you can ask Siri to send pictures of a particular group of people at a particular event to a particular person, e.g., "Text Grandma the pictures of our family from last weekend's barbecue," and Siri will comply. You can also now type your Siri requests into a new menu that appears when you double-tap the bottom of the screen. For those of us who find it awkward to talk to computers in public, this is a huge game changer.
I have no idea if any of this works, since these new Siri features were not present in the version of Apple Intelligence I tested, and it's unclear when they will be. From what Apple has shown so far, though, the new Siri is certainly better than the old one. It can keep track of context across follow-up questions, and it can now understand you even if you stammer or change your mind mid-sentence. That's not a major leap in natural language processing, but it's a welcome improvement for the famously unreliable and awkward Siri.
Once Apple Intelligence is released to the general public, we'll also start to see what third-party apps do with the new Siri functionality. "Siri will gain screen awareness," Craig Federighi, Apple's senior vice president of software engineering, said at Monday's event. "It will have access to hundreds of new app actions."
In other words, "action" and "personal context" mean that when you make a request, Siri will know what you've been doing on your phone and follow through accordingly. But again, as cool as these more sophisticated features sound, they're not quite ready.
The best features will have to wait. That's a good thing.
To say that Apple Intelligence has revolutionized the way I use my MacBook and iPhone would be a massive overstatement. The AI-powered features are so limited and tucked away that I genuinely forget they exist. I should also reiterate that I'm testing a beta: the version of Apple Intelligence I've been using is unfinished and glitchy, but it looks a lot like what Apple will release next month. Apple will fix many of the bugs over the next few weeks. What ships in October still won't be a finished product, though.
When will Apple Intelligence exit beta? Apple hasn't said; it could take years. Google, after all, kept Gmail in beta for five years before dropping the label. And we should expect the beta label to stick around for as long as Apple wants to distance itself from Apple Intelligence's mistakes, including generative AI's tendency to hallucinate, which is essentially baked into how the technology works.
One potential roadblock in Apple's quest to bring more advanced AI features to its users is its commitment to privacy. That's one important feature I haven't shown you, because it's invisible. While AI tools like OpenAI's ChatGPT and Google's Gemini send large amounts of your data to cloud servers, Apple, which famously takes privacy and security seriously, promises that its proprietary AI models will do as much as possible on your device, just as Apple already does with much of your data. When a task requires more power, a new system called Private Cloud Compute sends your data to Apple's servers securely. It will be interesting to see how this system stacks up against its rivals as more sophisticated features demand more processing power.
Then there's the question of cost. I got to test Apple Intelligence for free in the iOS 18 beta, but it's unclear whether all of its features will remain free. Again, your first requirement is a device that supports Apple Intelligence, which only the newest devices do; for most people, that means buying a new iPhone. And Apple will reportedly charge a monthly fee for the most advanced features at some point in the future. So unless you're willing to pay for the privilege, an extremely intelligent Siri probably won't be transforming your life.
And so here I am, at the dawn of Apple Intelligence. I upgraded to an iPhone 15 Pro before Apple Intelligence was announced, and even if I hadn't, I wouldn't buy a new phone just to get the new AI capabilities. That said, I've been enjoying how Apple Intelligence has simplified my life since it started working on my phone. When I remember to use it, it saves me a few minutes of work that I'd otherwise spend reading through every notification or editing a photo. For now, I have to review the AI's work, but I don't mind.
I do dread a near future in which AI's work is indistinguishable from a human's. For now, though, I'm happy with the simplicity of Apple's diet AI, and I don't mind that it screws up sometimes. So do I.