WWDC 2024


Apple and OpenAI currently have the most misunderstood partnership in tech

He isn’t using an iPhone, but some people talk to Siri like this.

On Monday, Apple premiered “Apple Intelligence” during a wide-ranging presentation at its annual Worldwide Developers Conference in Cupertino, California. However, the heart of its new tech, an array of Apple-developed AI models, was overshadowed by the announcement of ChatGPT integration into its device operating systems.

Since rumors of the partnership first emerged, we’ve seen confusion on social media about why Apple didn’t develop a cutting-edge GPT-4-like chatbot internally. Despite Apple’s year-long development of its own large language models (LLMs), many perceived the integration of ChatGPT (and opening the door for others, like Google Gemini) as a sign of Apple’s lack of innovation.

“This is really strange. Surely Apple could train a very good competing LLM if they wanted? They’ve had a year,” wrote AI developer Benjamin De Kraker on X. Elon Musk has also been grumbling about the OpenAI deal—and spreading misinformation about it—saying things like, “It’s patently absurd that Apple isn’t smart enough to make their own AI, yet is somehow capable of ensuring that OpenAI will protect your security & privacy!”

While Apple has developed many technologies internally, it has also never been shy about integrating outside tech when necessary, whether through acquisitions or built-in clients—in fact, Siri was originally developed by an outside company. But OpenAI has recently been the source of a string of tech controversies, so it’s understandable that some people question why Apple made this particular call—and what it might entail for the privacy of their on-device data.

“Our customers want something with world knowledge some of the time”

While Apple Intelligence largely relies on Apple’s own in-house LLMs, the company also realized that there may be times when some users want to use what it considers the current “best” existing LLM—OpenAI’s GPT-4 family. In an interview with The Washington Post, Apple CEO Tim Cook explained the decision to integrate OpenAI first:

“I think they’re a pioneer in the area, and today they have the best model,” he said. “And I think our customers want something with world knowledge some of the time. So we considered everything and everyone. And obviously we’re not stuck on one person forever or something. We’re integrating with other people as well. But they’re first, and I think today it’s because they’re best.”

The proposed benefit of Apple integrating ChatGPT into various experiences within iOS, iPadOS, and macOS is that it allows AI users to access ChatGPT’s capabilities without the need to switch between different apps—either through the Siri interface or through Apple’s integrated “Writing Tools.” Users will also have the option to connect their paid ChatGPT account to access extra features.

As an answer to privacy concerns, Apple says that before any data is sent to ChatGPT, the OS asks for the user’s permission, and the entire ChatGPT experience is optional. According to Apple, requests are not stored by OpenAI, and users’ IP addresses are hidden. Apparently, communication with OpenAI servers happens through API calls similar to using the ChatGPT app on iOS, and there is reportedly no deeper OS integration that might expose user data to OpenAI without the user’s permission.
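To make that flow concrete, here is a minimal sketch, in Swift, of what a permission-gated hand-off to an external model could look like. It is purely illustrative: the type names, the endpoint, and the request shape are assumptions rather than Apple’s or OpenAI’s actual implementation. The point is only that the user is asked before any network call happens.

    import Foundation

    // Hypothetical sketch; not Apple's real code or OpenAI's real endpoint.
    enum ExternalModelError: Error {
        case userDeclined
    }

    struct ExternalModelClient {
        // Placeholder URL; the real endpoint has not been published.
        let endpoint = URL(string: "https://chatgpt.example/v1/responses")!

        /// Asks for consent, then forwards the prompt only if the user agrees.
        func complete(prompt: String,
                      askPermission: () async -> Bool) async throws -> String {
            // 1. The OS prompts the user before any data leaves the device.
            guard await askPermission() else {
                throw ExternalModelError.userDeclined
            }

            // 2. Only the prompt itself is sent; no account identifiers are
            //    attached unless the user has linked a ChatGPT account.
            var request = URLRequest(url: endpoint)
            request.httpMethod = "POST"
            request.setValue("application/json", forHTTPHeaderField: "Content-Type")
            request.httpBody = try JSONEncoder().encode(["prompt": prompt])

            let (data, _) = try await URLSession.shared.data(for: request)
            return String(decoding: data, as: UTF8.self)
        }
    }

The IP-address masking Apple describes would happen at the network layer, between the device and OpenAI’s servers, so it wouldn’t show up in app-level code like this at all.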

We can only take Apple’s word for it at the moment, of course, and solid details about Apple’s AI privacy efforts will emerge once security experts get their hands on the new features later this year.

Apple’s history of tech integration

So you’ve seen why Apple chose OpenAI. But why look to outside companies for tech? In some ways, Apple building an external LLM client into its operating systems isn’t too different from what it has previously done with streaming video (the YouTube app on the original iPhone), Internet search (Google search integration), and social media (integrated Twitter and Facebook sharing).

The press has positioned Apple’s recent AI moves as Apple “catching up” with competitors like Google and Microsoft in terms of chatbots and generative AI. But playing it slow and cool has long been part of Apple’s M.O.—not necessarily introducing the bleeding edge of technology but improving existing tech through refinement and giving it a better user interface.



iPadOS 18 adds machine-learning wizardry with handwriting, math features

WWDC 2024 —

Also coming: new SharePlay features and a new “tab bar” for first-party apps.

  • The Calculator app is finally coming to iPad. (Samuel Axon)

  • You’ll be able to write out expressions with the Apple Pencil and see them solved in real time. (Samuel Axon)

CUPERTINO, Calif.—After going into detail about iOS 18, Apple took a few moments in its WWDC 2024 keynote to walk through some of the changes coming to the iPad in iPadOS 18.

There are a few minor UI changes and new features across Apple’s first-party apps, including a new floating tab bar. The bar expands into a sidebar when you want to dig deeper, and you can customize it to include the things you interact with most. Additionally, SharePlay makes it easier to share your screen and remotely control another person’s iPad.

But the big news is that the Calculator app we’ve all used on the iPhone is finally coming to the iPad, after years of the iPad having no first-party calculator app at all. The iPad version can do some things the iPhone version can’t, thanks to the Apple Pencil: a feature called Math Notes lets you write out expressions like you would on a piece of paper, and the app solves them live as you scribble—plus various other cool live-updating math features. (These new Math Notes features work in the Notes app, too.)
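Apple hasn’t said how Math Notes works under the hood, but conceptually it splits into two steps: recognizing the handwritten strokes as a text expression (the machine-learning-heavy part) and then re-evaluating that expression every time the recognizer updates its guess. As a rough sketch of the second step only, and assuming recognition has already produced a plain string, Foundation’s NSExpression can evaluate simple arithmetic:

    import Foundation

    // Rough sketch: assumes handwriting has already been recognized as text.
    // NSExpression is a real Foundation API; the surrounding flow is invented.
    func solve(recognizedExpression: String) -> Double? {
        // NSExpression raises an exception on malformed input, so a real
        // implementation would validate the string first.
        let expression = NSExpression(format: recognizedExpression)
        return expression.expressionValue(with: nil, context: nil) as? Double
    }

    // As the recognizer refines its guess stroke by stroke, re-solve and redraw.
    for guess in ["8 * 12", "8 * 12 + 4", "8 * 12 + 45"] {
        if let value = solve(recognizedExpression: guess) {
            print("\(guess) = \(value)")
        }
    }

The real feature presumably does much more than this, but the “solve as you write” behavior is essentially re-evaluation on every recognition update.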

Apple didn’t use the word AI here, but this is surely driven by machine learning in some way. Doubly so for a new handwriting feature called Smart Script, which refines and improves your handwriting as you go, tweaking letters to make them more legible when you’re writing very quickly to take notes. It uses machine learning to analyze your handwriting, so these adjustments are meant to match your normal script. That means you can scribble as quickly and recklessly as you want during a conference or a day of classes, but ostensibly, it will be legible at the end of the day.

Not everyone’s a big Pencil user—for some of us, handwriting long ago took a back seat to typing—but Apple is aggressively selling these kinds of flashy features for those who want that experience.

The release date for iPadOS 18 hasn’t been announced yet, but it will likely arrive in September or October alongside iOS 18 and the new iPhone models that will probably be announced then.

Listing image by Samuel Axon



Apple’s new Vision Pro software offers an ultrawide virtual Mac monitor

WWDC 2024 —

visionOS 2 offers iterative improvements and refinements, plus new developer APIs.

A Mac virtual monitor in visionOS 2.

Samuel Axon

CUPERTINO, Calif.—Apple kicked off the keynote for its annual developer conference by announcing a new version of visionOS, the operating system that runs on the company’s pricey but impressive Vision Pro mixed reality headset.

The updates in visionOS 2 are modest, not revolutionary—mostly iterative changes, quality-of-life improvements, and some features that were originally expected in the first version of visionOS. That’s not too surprising given that visionOS just went out to users four months ago.

Vision Pro users hoping for multiple virtual Mac monitors will be disappointed to learn that feature isn’t planned this time around, but Apple is adding the next-best thing: a larger, higher-resolution single virtual display, including a huge, wraparound ultrawide monitor mode that Apple says is equivalent to two 4K monitors.

There’s one major machine learning-driven feature: You will soon be able to convert 2D images into 3D spatial ones in the Photos app, even photos you took years and years ago, long before iPhones could take spatial photos. (Apple also announced that a new Canon DSLR camera will get a spatial photo lens, as another option for taking new spatial photos.)

Other notable improvements include support for using travel mode on trains instead of just airplanes and a simple finger gesture to open the home screen so you don’t have to invoke Siri or reach up to press a physical button on the headset.

A lot of the improvements that will lead to better apps come in the form of new developer APIs that will facilitate apps that really take advantage of the spatial features rather than just being flat 2D windows floating around you—something we noted as a disappointment when we shared our impressions of the device. Some APIs help create shared spatial experiences with other Vision Pro users who aren’t in the same room as you. One of those, TabletopKit, is focused on creating apps that sit on a 2D surface, like board and card games.

There will also be new enterprise-specific APIs for things like surgical training and manufacturing applications.

Finally, Apple says Vision Pro is going international. It will go on sale in China (including Hong Kong), Japan, and Singapore on June 13 and in Australia, Canada, France, Germany, and the UK on June 28.

There was no specific release date named for visionOS 2.



What to expect at WWDC 24: Big iOS changes, more Vision Pro, and so much AI

WWDC 2024 —

There might not be new hardware, but Apple could make up for it with software.

The logo for WWDC24.

Apple

Apple’s annual developer conference, WWDC, kicks off in Cupertino, California, next week. As always, it will start with a livestream keynote on Monday morning at 10 am Pacific, 1 pm Eastern. We’ll be in attendance reporting on the event, so let’s take stock of what we expect to see.

But first, let’s note something we don’t think we’ll see: Due to some peculiarities about Apple’s upgrade cycles, as well as a push toward the M4, we’re not actually expecting any major hardware announcements at WWDC this year.

That’s OK, though, because it looks like it’s going to be a big one for software news. iOS has seen relatively modest updates in the past couple of years, but that’s about to change.

AI in the spotlight

Most of the rumors leading up to WWDC have centered on Apple’s plans to announce a slew of generative AI features for its platforms. Part of that is because AI is the hot topic right now, so anything related to it is bound to get coverage. But according to leaks reported by Bloomberg, The Information, and others, it also looks like Apple is going to make a conscious effort to reposition itself as a leader in AI.

Apple was already doing neat things with machine learning in iOS and elsewhere, like features that make image editing easier, smart recommendations, and more. But there have been major new developments in models lately that allow for many new options, as we’ve seen from others like OpenAI, Google, and Microsoft.

We don’t know many details about exactly what Apple will do here beyond it being a focus. The company has published several papers related to new large language model chatbots, major Siri improvements, image generation, and more, but it’s hard to tell which of those will become user-facing features.

Possibilities include auto-generated summaries in apps like Mail, new ways to block ads or interact with websites in Safari, GitHub Copilot-like code editing assistance in Xcode, clip art generation for iWork documents, more conversational and larger-scope answers from Siri, new image editing features, expanded accessibility features, new transcription capabilities, and more.

Apple has reportedly been in talks with companies like OpenAI and Google (it even sounds like a deal has already been reached with OpenAI) about augmenting Siri and other parts of the iOS or macOS experience with an external AI chatbot. Apple has reportedly experimented with its own chatbot, but it’s unlikely that one would be far enough along to be a strong alternative to the likes of ChatGPT. At a minimum, expect Apple to partner with at least one company (probably OpenAI) as a provider for out-of-scope answers to queries asked of Siri or in Spotlight.

There have been rumblings that Apple could offer users a choice of multiple AI providers or launch an AI App Store, but we don’t know for sure how it will all take shape.

iOS and iPadOS 18

iOS 18 (and its close sibling, iPadOS 18) will roll out later this year alongside new iPhones, likely in September or October. But WWDC is the first time we’ll get a look at the major features Apple has planned.

Apple typically announces most new iOS features during the WWDC keynote, but it might save a couple that are related to as-yet-unannounced iPhone hardware for later.

The rumor mill this year points to an overhaul of both Control Center and Settings, plus the aforementioned inclusion of numerous new machine learning, LLM, or image generation features. One rumored example of how AI could be used in iOS described a new home screen that allows users to quickly recolor app icons to create a consistent color palette across their phone. Apple might even allow users to place icons wherever they want, addressing the irritating “wobble mode” home screen management that we’ve criticized in our iOS reviews for years.

Expect big new features for Messages, too, like new text effects and formatting options. There’s also a strong possibility that Apple will go into detail about RCS support in iOS. Generative AI could allow users to create custom emojis or stickers, too.

There were also a few rumors that Apple will make some visual changes to iOS, borrowing a bit from the visual language we saw in visionOS this spring.

Oh, and one more thing: iPadOS is finally getting a calculator app. We’re not sure why that took so long, but there it is.



WWDC 2024 starts on June 10 with announcements about iOS 18 and beyond

WWDC —

Speculation is rampant that Apple will make its first big moves in generative AI.

The logo for WWDC24.

Apple

Apple has announced dates for this year’s Worldwide Developers Conference (WWDC). WWDC24 will run from June 10 through June 14 at the company’s Cupertino, California, headquarters, but everything will be streamed online.

Apple posted about the event with the following generic copy:

Join us online for the biggest developer event of the year. Be there for the unveiling of the latest Apple platforms, technologies, and tools. Learn how to create and elevate your apps and games. Engage with Apple designers and engineers and connect with the worldwide developer community. All online and at no cost.

As always, the conference will kick off with a keynote presentation on the first day, which is Monday, June 10. You can be sure Apple will use that event to at least announce the key features of its next round of annual software updates for iOS, iPadOS, macOS, watchOS, visionOS, and tvOS.

We could also see new hardware—it doesn’t happen every year, but it has of late. We don’t yet know exactly what that hardware might be, though.

Much of the speculation among analysts and commentators concerns Apple’s first move into generative AI. There have been reports that Apple may work with a partner like Google to include a chatbot in its operating system, that it has been considering designing its own AI tools, or that it could offer an AI App Store, giving users a choice between many chatbots.

Whatever the case, Apple is playing catch-up with some of its competitors in generative AI and large language models even though it has been using other applications of AI across its products for a couple of years now. The company’s leadership will probably talk about it during the keynote.

After the keynote, Apple usually hosts a “Platforms State of the Union” talk that delves deeper into its upcoming software updates, followed by hours of developer-focused sessions detailing how to take advantage of newly planned features in third-party apps.
