Apple’s most important new product of the year (sorry, it’s not a $3,500 headset) will arrive this month. Apple Intelligence, a suite of software tools bringing what Apple describes as artificial intelligence to its devices, will be released through free software updates for owners of some iPhones, Macs and iPads.
The initial version of Apple Intelligence, which Apple is publishing as an unfinished “beta,” will include a slightly improved version of Apple’s virtual assistant, Siri, and tools that automatically summarize text, transcribe audio recordings and remove distractions like photo bombers from pictures.
For Apple, this debut is the beginning of a new era. Apple Intelligence is the result of a major restructuring of the Cupertino, California, giant nearly two years after the tech industry was upended by the ChatGPT chatbot from OpenAI.
Apple executives had been concerned that, without similar AI technology, the iPhone would eventually look antiquated, so Apple killed its self-driving car project, which had been more than a decade in the making, and reassigned its engineers to work on Apple Intelligence.
Apple Intelligence is arriving without many of the most hyped features that Apple announced in June. Although the company struck a deal with OpenAI to include ChatGPT in its software, the chatbot will not be part of this initial release. Siri also isn’t smart enough (yet) to do things like stitch together data from multiple apps to tell you whether a last-minute meeting will make you late for your child’s play. Apple said those features and others would be gradually rolled out through next year.
To get a sneak preview, I tested an early version of Apple Intelligence over the last week. The new features were a little tricky to find — they have been integrated into different parts of Apple’s software system, including into its buttons for editing text and photos.
I found a few features, including tools for proofreading text and transcribing audio, to be very handy. Others, like a tool for generating summaries of web articles and a button for removing unwanted distractions from photos, were so hit or miss that they should be ignored.
This is all to say that Apple Intelligence is worth watching over the next few years to see whether it evolves into a must-have product, but that it’s not a compelling reason to splurge on new hardware.
Apple Intelligence will work on the latest iPhone 16s and last year’s iPhone 15 Pro, as well as on some iPads and Macs released in the last four years. Here are the tools that will be most useful and the ones you can skip when the software lands on devices this month.
Apple Intelligence tools that are useful
Transcribe audio recordings
Apple Intelligence delivers a feature that feels long overdue: When you use the voice memos app to record audio, the app will now automatically produce a transcript alongside the file.
As a journalist who regularly records interviews, I was gung-ho about trying this tool and pleased that it worked well. When I met with a tech company last week, I pressed the record button in the app, and after I hit stop, the transcript was ready for me. Apple Intelligence detected whenever a different person was speaking and created a new paragraph accordingly in the transcript. It transcribed some words incorrectly whenever a person mumbled. But overall, the transcript made it easy for me to look up a keyword to pull a portion of the conversation.
Ask Siri for help with an Apple product
Smartphones and tablets may be easy to pick up, but Apple’s software has grown increasingly complex over the years, so it can be difficult to take advantage of features that are hard to find. Apple Intelligence has imbued Siri with the ability to offer help with navigating Apple products.
I can never remember, for the life of me, how to run two apps side by side on the iPad, for instance. So I asked Siri, “How do I use split screen on the iPad?” Siri quickly showed me a list of instructions, which involved tapping a button on the top of an app.
Ironically, Siri could not offer help on how to use Apple Intelligence to rewrite an email. Instead, it loaded a list of Google search results showing other websites with the steps.
Speed through writing
Speaking of email, Apple Intelligence includes writing tools to edit your words, and it can even generate canned email responses.
I used the automatic response tool to quickly shoo away a salesperson at a car dealership: “Thanks for reaching out. I’m no longer interested in purchasing a vehicle at this time.”
As for editing text, I highlighted an email I quickly wrote to a colleague and hit the “Proofread” button. Apple Intelligence quickly edited the text to insert punctuation that I had skipped.
Apple AI tools you can ignore
Removing distractions from photos
One of Apple Intelligence’s most anticipated features is the ability to automatically edit a photo to remove a distraction, such as a photo bomber in an otherwise perfect family portrait. Plenty of people will want to try this tool, called Clean Up, but prepare to be disappointed.
To try it, I opened a photo I shot of family members at an outdoor wedding a few years ago. I hit the “Clean Up” button with hopes of removing people sitting on lawn chairs in the background. The software deleted the people and lawn chairs, but they were replaced with an unintelligible jumble of black-and-white pixels.
I tried the tool again on a photo of my corgi, Max, sleeping on my couch next to a blanket. Apple Intelligence removed the blanket and tried to reproduce the couch cushion. Instead, it generated a deep, unflattering butt groove.
Summarizing text
Apple seems to think that the internet is filled with too many words. One of Apple Intelligence’s most prominent features is its ability to generate summaries of text in many applications, including emails, web articles and documents.
By pressing the “Summarize” button in the Safari browser, I got a three-sentence summary of a 1,200-word New York Times article about the pros and cons of eating tuna. Apple Intelligence summed up the premise of the article — that tuna was a nutritious food that could be high in mercury, and consumers should consider species of tuna with lower mercury levels.
Unfortunately, in its summary, Apple Intelligence recommended that people consume albacore, one of the species listed in the article as having the highest levels of mercury. This is what’s known in the tech industry as a hallucination, a common problem in which AI confidently presents fabricated or incorrect information as fact.
The tool also fell short when summarizing my notes. Recently, to prepare for an office meeting, I took notes on three colleagues I was going to meet with. Instead of producing a tight dossier on each person, the tool generated a summary of only one person’s role.
Apple declined to comment.
In summary, you can skip this tool.
This article originally appeared in The New York Times.