Tuesday, July 16, 2024

Apple Ventures Outside the Park for Generative AI

Without its powerful LLMs, Apple tried to make the best of the fait accompli of OpenAI’s dominance.

Think of the Faraday cage as a box that prevents any electromagnetic pinging of whatever’s inside the enclosure. It’s named after the 19th-century English scientist Michael Faraday, who demonstrated the physics involved.

On June 11, Elon Musk used his favorite broadcast medium, X, to say he hates Apple’s new partnership with OpenAI so much that anyone visiting his companies will be expected to leave their iPhones in Faraday cages at the door.

That partnership, as was widely anticipated, was revealed at the opening of Apple’s annual developer jamboree, the Worldwide Developer Conference (WWDC). The iPhone maker also announced a slew of other artificial intelligence (AI) features as it sought to answer the question in many people’s minds—from the common iPhone user on the street to the Wall Street financial analyst—as to what Apple had in store after Google, Samsung, and Microsoft had brought out many of their AI products.

Apple didn’t disappoint. Musk’s angry posts could perhaps be written off as sour grapes, since his new AI company, xAI, offers its chatbot, Grok, versus OpenAI’s ChatGPT, which Apple says will be “deeply integrated” into the iPhone and other products. However, Musk’s comments reflect the dire reality of the war among the world’s biggest tech companies to win the ongoing AI race. In some ways, Apple, as the planet’s biggest direct-to-consumer, hi-tech products company, has the most at stake, given its privacy-first stance, which is part of its appeal.

That makes it even more significant that a company famous for its sophisticated vertical integration across products and services, software and hardware, has chosen to look outside for something as critical as a core part of its AI strategy. This will undoubtedly have a long-term bearing on every user of any Apple product.

Musk’s criticism underscores this departure: He says it’s “absurd” that a company as phenomenally capable as Apple, with virtually limitless resources, couldn’t build its own large language models (LLMs). He adds that the partnership means users can’t know what happens when an iPhone user’s data goes to OpenAI for processing on the cloud. This cloud processing is how Siri, Apple’s intelligent assistant, can return more useful results than what’s possible on-device, even with Apple’s powerful processors.

Apple has introduced Private Cloud Compute, built with its own processors and software. Whether this protection applies to data sent to OpenAI is unclear. And, of course, it’s still unclear why Apple—with a billion iPhones out there, not to mention iPads, MacBooks, and Watches—didn’t build LLMs of its own. It’s possible that the highly secretive company is developing one, but at a stage where Apple isn’t confident it is ready for its customers.

Two analysts that Forbes India spoke to ahead of WWDC offered their views on the missing Apple LLMs: In short, Apple didn’t have the type of data that Google had or that OpenAI scraped from the Internet.

As to the partnership with OpenAI, “given the expectations of Apple at this time, there simply wasn’t the time to wait and do it right themselves and then bring something to market,” says Dipanjan Chatterjee, vice president and principal analyst at Forrester Research. Others were ahead. “And I think this is a sharp departure from what Apple has done before.”

“Apple is never known to be the first. They’re often known to be the best,” he points out. He was speaking in the context of Apple’s “maniacal obsession” with the customer experience: The company doesn’t want you to care about what’s under the hood, only about how useful a product is. The iPhone itself, but also the iPod before it and the Apple Watch after, are examples of how Apple “upended categories”, he says.

This time, the anticipation that Apple would soon reveal its AI roadmap had built up so much—and indeed investors had begun to get more than a little restless—that Apple’s silence up until this year’s WWDC “was deafening”, he says. And this time, what’s under the hood has also caught much attention.

Apple is integrating ChatGPT access into experiences within iOS 18, iPadOS 18, and macOS Sequoia—its upcoming operating system software updates—the company said in its June 10 press release. When needed, Siri can tap into ChatGPT: Users are asked before any questions, documents, or photos are sent to ChatGPT, and Siri then presents the answer directly. Privacy protections are built in for users accessing ChatGPT—their IP addresses are obscured, and OpenAI won’t store requests. According to Apple, ChatGPT’s data-use policies apply to users who choose to connect their accounts.

ChatGPT, based on OpenAI’s latest model, GPT-4o, will be available in iOS 18, iPadOS 18, and macOS Sequoia later this year. iPhone, iPad, and Mac users can access it for free without creating an account, and ChatGPT subscribers can connect their accounts to access paid features within their Apple devices and apps.

As companies rush to commercialize generative AI, Apple’s somewhat delayed, and perhaps more measured, entry onto the stage will likely rekindle the conversation around privacy and how technology ought to develop around it.

With its OpenAI partnership, Apple is doing nothing short of staking its reputation. At WWDC’s opening event, CEO Tim Cook, after introducing the idea of “Apple Intelligence” and its relevance and usefulness to individual users, quickly turned to privacy, Chatterjee points out. “It’s very much in the conversation.”

The nature of generative AI has created an ever-growing multiplicity of data, exacerbating the problems around data privacy. “So, Cook is very quick to pivot to this idea” that with the usefulness comes concerns of privacy, the analyst says. Apple’s answer for now is a tightly guarded private environment. “They are going to work through the mechanics of what’s needed to keep this private and secure. And I think it’s just going to be a process,” he says.
