Apple to revamp Siri as built-in chatbot on iPhone, Mac to counter OpenAI, Google

Apple Inc plans to revive Siri by turning the digital assistant into the company’s first artificial intelligence chatbot later this year, drawing the iPhone maker into the generative AI race dominated by OpenAI and Google.

The chatbot – code-named Campos – will reportedly be deeply embedded in the iPhone, iPad and Mac operating systems and replace the current Siri interface. (Reuters/Representative image)

According to people familiar with the plan, the chatbot — code-named Campos — will be deeply embedded in the iPhone, iPad and Mac operating systems and replace the current Siri interface. Users will be able to summon the new service the same way they open Siri, by saying the “Siri” command or pressing the side button on their iPhone or iPad.

The new approach will go far beyond the capabilities of today’s Siri, and even beyond a long-promised update due in early 2026. Today’s Siri lacks the chat-like feel and back-and-forth conversational abilities of OpenAI’s ChatGPT or Google’s Gemini.

The feature is a central part of Apple’s turnaround plan for the AI market, where it has lagged behind its Silicon Valley peers. The Apple Intelligence platform had a difficult rollout in 2024, with features that were sparse or slow to arrive.

Apple shares rose 1.7% on the chatbot news to a session high of $250.83. Google parent Alphabet Inc., which is supplying the underlying technology for the project, rose 2.6% to $330.32 as of 2:54 p.m. in New York.

A previously promised, non-chatbot update to Siri – keeping the current interface intact – is planned for iOS 26.4 in the coming months. The idea behind that upgrade is to add features unveiled in 2024, including the ability to analyze on-screen content and tap into personal data. It will also be better at searching the web.

Chatbot capabilities will come later this year, the people said, speaking on condition of anonymity because the plans are private. The company aims to unveil that technology at its Worldwide Developers Conference in June and release it in September.

Campos, which will have both voice- and typing-based modes, will be the primary new addition to Apple’s upcoming operating systems. The company is integrating it into iOS 27 and iPadOS 27, both code-named Rave, as well as macOS 27, known internally as Fizz.

Apart from the chatbot interface, there are no major changes to the operating system this year. Apple is focusing more on improving performance and fixing bugs. Last year, it launched a major design overhaul, unifying the look and feel of its operating systems.

Internally, Apple is testing the chatbot technology as a standalone Siri app, similar to the ChatGPT and Gemini apps available in the App Store. However, the company does not plan to offer that version to customers. Instead, it will integrate the software into its operating systems, much like today’s Siri.

A spokesman for Cupertino, California-based Apple declined to comment.

The adoption of a chatbot approach represents a strategic shift for Apple, which has long downplayed the popular conversational AI tools from OpenAI, Google and Microsoft Corp. Executives have argued that users prefer AI incorporated directly into features — something Apple has done with its writing tools, Genmoji emoji generator and notification summaries — rather than standalone chat experiences.

Craig Federighi, senior vice president of software engineering, said in an interview with Tom’s Guide in June that it was never the company’s goal to release a chatbot. Apple didn’t want to send users to some chat experience to get the job done, he said.

But Apple risked falling further behind rivals without a chatbot of its own. Samsung Electronics Co., Google and many Chinese smartphone makers already have conversational AI deeply embedded in their operating systems. With ChatGPT surpassing 800 million weekly active users in October, such tools have become increasingly essential.

OpenAI is poised to become Apple’s competitor, adding even more pressure. The creator of ChatGPT wants to develop its software into an AI operating system. It is also working on new devices under the direction of former Apple design chief Jony Ive.

The AI company has hired away several dozen Apple engineers in recent months, a move that has angered the iPhone maker’s executives and raised concerns that OpenAI could pose a threat to its underlying business.

Like ChatGPT and Google Gemini, Apple’s chatbot will let users search the web for information, create content, generate images, summarize material, and analyze uploaded files. It will also use personal data to complete tasks, making it easier to find specific files, songs, calendar events and text messages.

Unlike third-party chatbots running on Apple devices, the planned offering is designed to analyze open windows and on-screen content to take actions and suggest commands. It will also be able to control device features and settings, allowing it to make phone calls, set timers and launch the camera.

More importantly, Siri will be integrated into all of the company’s core apps, including Mail, Music, Podcasts, TV, the Xcode programming software, and Photos. This will allow users to do much more with just their voice. For example, they can ask Siri to find a photo based on a description of its contents and edit it with specific preferences, such as cropping and color changes. Or a user can ask Siri, from within the Mail app, to write a message to a friend about upcoming calendar plans.
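Apple has not said how Campos will plug into individual apps. For context, the existing mechanism third-party developers use to expose in-app actions to Siri is Apple’s App Intents framework; the sketch below shows how a photos-style app might declare a “find by description” action. Whether the new assistant reuses this plumbing is an assumption, and the type name and in-memory caption index are invented purely for illustration.

```swift
import AppIntents
import Foundation

// Hypothetical example: exposing a "find photo by description" action to Siri
// via the App Intents framework. Names and the stand-in caption index are
// illustrative; a shipping app would search its own photo library.
struct FindPhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Find Photo"

    @Parameter(title: "Description of the photo")
    var query: String

    // Stand-in for a real photo index.
    private static let captions = [
        "sunset at the beach",
        "birthday cake with candles",
        "dog in the snow",
    ]

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Match the spoken description against the captions.
        let matches = Self.captions.filter {
            $0.localizedCaseInsensitiveContains(query)
        }
        return .result(dialog: "Found \(matches.count) photo(s) matching \"\(query)\".")
    }
}
```

Declaring intents like this is how apps already make their actions discoverable by Siri and Shortcuts; a deeper Campos integration with Apple’s own apps would presumably go further, but the article does not describe the underlying interface.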

Campos could also let Apple retire its Spotlight function. That feature lets users search for content on their devices and view limited information such as sports scores and weather details.

One point of discussion is how much the chatbot will be allowed to remember about its users. ChatGPT and other conversational AI tools can retain an extensive memory of past conversations, allowing them to draw on prior exchanges and personal details when completing a request. Apple is considering sharply limiting this capability in the interest of privacy.

The chatbot will have a user interface designed by Apple, but will rely heavily on custom AI models developed by the Google Gemini team — an arrangement first reported by Bloomberg News last year.

The iOS 26.4 version of Siri, which arrives before the true chatbot, will rely on a Google-developed system known internally as Apple Foundation Model version 10. That software runs on 1.2 trillion parameters, a measure of AI complexity.

Campos, however, will go far beyond those capabilities. The chatbot will run on a more advanced version of the custom Google model, roughly equivalent to Gemini 3 and known internally as Apple Foundation Model version 11.

In a potential policy change for Apple, the two partners are discussing hosting the chatbot directly on Google servers running powerful chips known as TPUs, or Tensor Processing Units. The more immediate Siri updates, by contrast, will run on Apple’s own Private Cloud Compute servers, which rely on high-end Mac chips for processing.

Apple is paying Google about $1 billion annually for access to the models. The company may also turn to Google technology to enhance existing Apple Intelligence features. Bloomberg first reported last June that Apple was considering using external models to fix its AI problems.

Apple is designing Campos so that its underlying models can change over time. This means the company will have the flexibility to move away from the Google-powered system in the future if it wishes. Apple has also tested the chatbot with Chinese AI models, indicating plans to eventually deploy the feature in that country, where Apple Intelligence is not yet available.

Both the next Siri upgrade and Campos will include a feature called World Knowledge Answers, which was first reported by Bloomberg in September. It will provide citations alongside summarized web responses, similar to Perplexity and ChatGPT.

Signs of Apple’s shift toward chatbots have emerged in recent months. Last year, the company developed an app internally called Veritas, which turned the new Siri engine into a text-based chatbot interface. The app was only for testing and is not planned to be released publicly.

The strategic pivot follows an Apple leadership change. Longtime AI chief John Giannandrea was relieved of his role in December, with Federighi tightening control over Apple’s AI efforts. The company has also appointed Amar Subramaniam as vice president of AI, reporting to Federighi. He previously helped lead engineering for Gemini at Google.

