Apple held their annual Worldwide Developers Conference, the flagship event of the year, on June 10th. Here’s our day-one roundup of the major announcements from the event, with a focus on actionable insights for developers and clients as you build your strategy for the coming year and think about how to leverage the new features and functionality rolling out in iOS 18, iPadOS 18, watchOS 11, visionOS 2, and macOS Sequoia.


Apple Intelligence

The rumors were spot on, and Apple unveiled their long-awaited Apple Intelligence system at WWDC 2024. Apple Intelligence is threaded throughout the entire iOS and iPadOS user experience, rather than exposed through a single integration point, making it incredibly powerful and robust. And, also as expected, Apple built it on an architecture centered squarely on privacy and security.

Apple Intelligence works as a hybrid system: it runs operations on-device for lightning-fast, offline processing of time-sensitive tasks, and hands off more complex requests to cloud-based processing. Both paths place a strong emphasis on security and privacy, ensuring that operations, whether running locally or in the cloud, are kept private and not folded into wider data models.

Some feature integrations, such as the text-based writing tools and content generation, come “for free” and are automatically available to developers using the default text-entry controls. The new Genmoji feature is also supported through the existing attributed-string APIs, which allow rich text content to be built and rendered.
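To make that concrete, here is a minimal sketch of the “for free” case: a screen built on the standard SwiftUI text-entry control, which should pick up the system writing tools automatically on iOS 18 with no additional integration work. The view and property names are placeholders of our own.

```swift
import SwiftUI

// A note-editing screen built on the standard TextEditor control.
// Because it uses the default text-entry control, the system-provided
// writing tools (proofread, rewrite, summarize) are expected to appear
// automatically on iOS 18, with no extra code on our side.
struct NoteEditorView: View {
    @State private var noteText: String = ""

    var body: some View {
        TextEditor(text: $noteText)
            .padding()
            .navigationTitle("New Note")
    }
}
```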

For more elaborate Apple Intelligence integrations, a combination of the existing App Intents, Spotlight, and SiriKit APIs is leveraged; these have now been expanded to cover the powerful new use cases.

Apple Intelligence invokes actions in your app through the App Intents API, which is already used by the Shortcuts and Siri experiences within iOS. The API is being expanded with new supported intent types for a far greater variety of integration possibilities. So, if you want Apple Intelligence to take users directly into your app and let them get tasks done seamlessly, such as booking a flight, paying a bill, or making a reservation, leverage the App Intents API to expose those actions.
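As a rough illustration of what exposing such an action looks like, here’s a minimal App Intent for a hypothetical flight-booking task. The intent name, parameters, and booking call are placeholders of our own, not Apple sample code.

```swift
import AppIntents

// A hypothetical intent exposing a "book a flight" action so the system
// (Shortcuts, Siri, and now Apple Intelligence) can trigger it for the user.
struct BookFlightIntent: AppIntent {
    static var title: LocalizedStringResource = "Book a Flight"
    static var description = IntentDescription("Books a flight to a destination city.")

    @Parameter(title: "Destination")
    var destination: String

    @Parameter(title: "Departure Date")
    var departureDate: Date

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app this would call into the booking service, e.g.:
        // try await BookingService.shared.book(to: destination, on: departureDate)
        return .result(dialog: "Your flight to \(destination) has been booked.")
    }
}
```

Intents declared this way are discoverable by the system, which is exactly the hook Apple Intelligence builds on.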

The Spotlight API is also used to provide Apple Intelligence with contextual information about what’s happening inside your app, allowing it to make inferences about what might be relevant to the user in the moment. For example, a mapping app might register the city the user is currently viewing with the Spotlight API. Then, if the user asks Apple Intelligence to “share this location with my friend”, the system knows which location is on screen and can pass it along to the app that completes the share.
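For the mapping example above, donating the on-screen city via Core Spotlight might look roughly like this. The identifiers and attribute choices are assumptions on our part, and the exact hooks Apple Intelligence consumes should become clearer as the week’s sessions land.

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Donate the city the user is currently viewing so the system has
// context about what is on screen.
func donateCurrentCity(name: String, latitude: Double, longitude: Double) {
    let attributes = CSSearchableItemAttributeSet(contentType: .item)
    attributes.title = name
    attributes.latitude = NSNumber(value: latitude)
    attributes.longitude = NSNumber(value: longitude)

    let item = CSSearchableItem(
        uniqueIdentifier: "city-\(name)",
        domainIdentifier: "com.example.maps.viewed-cities",  // hypothetical identifier
        attributeSet: attributes
    )

    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error {
            print("Spotlight indexing failed: \(error)")
        }
    }
}
```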

Apple Intelligence was the biggest announcement of the event by far, and we’ve got many more questions about the deeper details! We’ll be keeping an eye on the sessions coming up through the week to learn more about the integrations and how developers and clients should be approaching them in their own apps.


iOS 18 and iPadOS 18

Outside of the many Apple Intelligence additions, iOS and iPadOS continue to evolve in their own exciting ways, with a few specific items for developers to plan for in their release roadmaps.

Most importantly, the first thing users see before they even launch your app - your app icon - is getting a set of updates. iOS 18 now allows users to switch between dark and light mode icons. They can also customize the general icon color palette by providing a custom tint color that gets applied to all the icons on their home screen. Designers should prepare to provide updated icon assets for dark mode, as well as tintable versions that can be used for any color setting the user chooses.

Control Center widgets can also now be developed, allowing users to interact with and control your app’s content without even launching the app. Developers can create controls that users install into Control Center or onto the Lock Screen, granting the ability to do things like adjust settings in a car, change the temperature at home, turn smart devices on or off, and more, all within one consolidated and customizable control panel.
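Based on what was shown, a control is declared much like a widget and triggers an App Intent. Here’s a hedged sketch of a hypothetical smart-home button-style control; the type names follow the announced WidgetKit controls API, but details may shift as the betas settle, and the intent and identifiers are our own placeholders.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent that flips a smart plug on or off.
struct ToggleSmartPlugIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Smart Plug"

    func perform() async throws -> some IntentResult {
        // In a real app, call into the smart-home service here.
        return .result()
    }
}

// A Control Center control that triggers the intent with one tap.
struct SmartPlugControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.smartplug.control") {
            ControlWidgetButton(action: ToggleSmartPlugIntent()) {
                Label("Smart Plug", systemImage: "powerplug")
            }
        }
    }
}
```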

And speaking of smart devices, iOS 18 also offers a new device pairing experience for integrating with smart accessories without needing broad, general access to the user’s Bluetooth or Wi-Fi settings. This should make it far easier for users to connect to custom-built hardware, and it enhances the privacy and security of the experience by removing the need to grant full Bluetooth and Wi-Fi access the app may not otherwise need.

For any apps interacting with the user’s contacts, an updated Contacts API will let users grant access to a specific subset of contacts rather than their entire list. Limiting access to the complete list helps prevent apps from monitoring changes across all of a user’s contacts, and it should also make users more comfortable granting access by giving them the same fine-grained control previously introduced for the photos and location APIs.
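At a practical level, this likely surfaces as a new authorization state for apps to handle. A hedged sketch of checking for it follows; the limited case is new in iOS 18, and the exact handling should be confirmed against the updated documentation.

```swift
import Contacts

// Check how much of the contacts database the user has shared with us.
func describeContactsAccess() {
    switch CNContactStore.authorizationStatus(for: .contacts) {
    case .authorized:
        print("Full access to the contacts list")
    case .limited:
        // New in iOS 18: the user shared only a chosen subset of contacts.
        print("Access to a user-selected subset of contacts")
    case .denied, .restricted:
        print("No access to contacts")
    case .notDetermined:
        print("Access not requested yet")
    @unknown default:
        print("Unknown authorization state")
    }
}
```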

And one of the most fundamental navigation components in iOS, the tab bar, is getting an overhaul in iPadOS 18. The new tab bar looks similar to its visionOS equivalent and allows app content to stretch edge-to-edge across the screen without giving up real estate to a large tab bar running its full length. The tab bar also animates into a sidebar when the user needs to expand it, supports deeper sections within the bar, and lets users customize those sections by dragging them around. iPad apps will need to be updated and redesigned to take advantage of this new paradigm, and we’ll see whether a similar update makes it to the iPhone in future SDK updates.
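In SwiftUI terms, this maps to the refreshed TabView API. A minimal sketch of what adopting it might look like is below; the tab names are placeholders, and the exact modifiers may evolve through the betas.

```swift
import SwiftUI

// A hypothetical iPad app structure using the new floating tab bar,
// which adapts into a sidebar when expanded.
struct LibraryRootView: View {
    var body: some View {
        TabView {
            Tab("Home", systemImage: "house") {
                Text("Home feed")
            }
            Tab("Search", systemImage: "magnifyingglass") {
                Text("Search")
            }
            // Sections appear as groups once the bar expands into a sidebar.
            TabSection("Collections") {
                Tab("Favorites", systemImage: "heart") {
                    Text("Favorites")
                }
                Tab("Archive", systemImage: "archivebox") {
                    Text("Archive")
                }
            }
        }
        // New in iOS/iPadOS 18: lets the tab bar expand into a sidebar.
        .tabViewStyle(.sidebarAdaptable)
    }
}
```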


visionOS 2

Apple’s newest hardware platform will also host an exciting batch of updates with the visionOS 2 release.

The platform in general continues to evolve, with a set of new gestures that make it easier to navigate the interface, an improved content creation pipeline for spatial video, and a planned rollout to a number of new countries.

Powerful new frameworks are also being introduced that will allow brand new classes of apps to be developed. Most excitingly, more volume- and object-based APIs are becoming available that allow developers to analyze the user’s surroundings, identify physical objects, and attach virtual objects and ornamentations to them. This is the perfect technology for interactive experiences like training courses, educational overlays, and hands-on maintenance assistants. Enterprise APIs are also being introduced, reinforcing Apple’s positioning of the Vision Pro as a device commercial customers can use to get real work done.

And outside of the enterprise applications, new frameworks like TabletopKit can be leveraged to build collaborative gaming experiences such as virtual board games, tying multiple user avatars together into a unified experience.


watchOS 11

Apple’s smallest platform is also getting its fair share of big updates.

Live Activities are coming to the Apple Watch through integration with the Smart Stack. Developers who have already brought this functionality to the Dynamic Island have a big head start: watchOS can reuse the leading and trailing portions of that Live Activity treatment on the small screen. A completely custom watchOS variant of a Live Activity can also be designed and built by leveraging the new “small” style.
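From the announcements, the opt-in appears to be a single modifier on the Live Activity configuration plus an environment check for the new small family. Here’s a hedged, abbreviated sketch; the attributes, views, and layouts are our own placeholders, and the API surface may shift during the beta.

```swift
import ActivityKit
import SwiftUI
import WidgetKit

// Hypothetical attributes for a delivery-tracking Live Activity.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var minutesRemaining: Int
    }
}

struct DeliveryLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: DeliveryAttributes.self) { context in
            DeliveryActivityView(minutesRemaining: context.state.minutesRemaining)
        } dynamicIsland: { context in
            DynamicIsland {
                DynamicIslandExpandedRegion(.center) {
                    Text("\(context.state.minutesRemaining) min")
                }
            } compactLeading: {
                Image(systemName: "shippingbox")
            } compactTrailing: {
                Text("\(context.state.minutesRemaining)m")
            } minimal: {
                Image(systemName: "shippingbox")
            }
        }
        // Opt in to the new watchOS Smart Stack presentation.
        .supplementalActivityFamilies([.small])
    }
}

struct DeliveryActivityView: View {
    // Lets the layout adapt between the iPhone and Apple Watch sizes.
    @Environment(\.activityFamily) private var activityFamily
    let minutesRemaining: Int

    var body: some View {
        switch activityFamily {
        case .small:
            // Compact layout for the watch Smart Stack.
            Text("\(minutesRemaining) min")
        default:
            // Existing lock screen / Dynamic Island layout.
            Label("Arriving in \(minutesRemaining) minutes", systemImage: "shippingbox")
        }
    }
}
```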

Interactive widgets can also now be developed, giving users a richer, more interactive experience directly on their Apple Watch. These widgets are built with the same API used on iOS and macOS, so developers will feel right at home building them for watchOS. Contextual metadata can also be specified, allowing widgets to be surfaced to the user in a personalized way depending on current conditions, time of day, and so on.
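Because the widget API is shared, interactivity works the same way it does on iOS: embed a button or toggle backed by an App Intent. A small sketch of a widget view follows; the intent and task data are hypothetical placeholders.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical intent that marks the user's next task as done.
struct CompleteTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Complete Task"

    func perform() async throws -> some IntentResult {
        // Update the shared task store here.
        return .result()
    }
}

// Widget content with an interactive button; the same view code
// runs on iOS, macOS, and now watchOS.
struct NextTaskWidgetView: View {
    var body: some View {
        VStack {
            Text("Walk the dog")
            Button(intent: CompleteTaskIntent()) {
                Label("Done", systemImage: "checkmark.circle")
            }
        }
    }
}
```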

And the new double-tap gesture API allows developers to specify the action that should be triggered when users perform the double-tap gesture introduced with watchOS 10.1 on the Apple Watch Series 9 and Ultra 2.
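In SwiftUI this appears to be a one-line opt-in: mark the control that should respond to double tap as the primary action. A hedged sketch, assuming the modifier behaves as announced:

```swift
import SwiftUI

// A watchOS view where the double-tap gesture triggers the pause button.
struct WorkoutControlsView: View {
    var body: some View {
        Button("Pause Workout") {
            // Pause the workout session here.
        }
        // New in watchOS 11: routes the double-tap gesture to this control.
        .handGestureShortcut(.primaryAction)
    }
}
```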


Developer Tools

Apple’s developer tools are also taking a monumental leap forward this year with Xcode 16 and Swift 6. Two AI-powered tools, Predictive Code Completion and Swift Assist, stand to be especially game-changing.

Predictive Code Completion makes interactive suggestions for code snippets as developers work within Xcode. It works completely offline, is trained specifically on Swift and Apple SDKs, and generates fast suggestions in real time. It uses its knowledge of the specific codebase, data models, and architecture to make suggestions based on the fields within those models, allowing it to intelligently auto-complete method names such as “eventsFilteredByStartDate” when it detects that an event model contains a field called “startDate”, for example.
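To make that example concrete, given a model like the hypothetical one below, the completion engine is designed to propose field-aware helpers of roughly this shape as you start typing. The code itself is just our illustration of the example Apple gave, not output from the tool.

```swift
import Foundation

// A simple event model; the "startDate" field is what a field-aware
// completion would key off of.
struct Event {
    let name: String
    let startDate: Date
}

extension Array where Element == Event {
    // The kind of helper Predictive Code Completion might suggest
    // after typing "eventsFiltered..." in a file like this.
    func eventsFilteredByStartDate(after date: Date) -> [Event] {
        filter { $0.startDate > date }
    }
}
```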

Taking it a step further, Swift Assist, coming later this year, allows larger portions of code to be generated from broad intent statements such as “build me a list view of recipe rows”, “give me a data set of common travel destinations”, “convert my list to a grid”, and “improve the accessibility of this view”. It can be used to gain insights into the code, make sweeping changes, and generate substantial features, all with natural-language text input. This is incredibly powerful and promises to reduce development time as apps are being built.

And, as with any Apple product, both of these AI-powered experiences treat privacy as a critical feature. Neither of them shares developer code, keeping code and sensitive information secure and out of shared data models. Protecting proprietary code and data is a top priority for us, and we’re excited to see Apple taking this approach, making it much easier for us to begin leveraging these tools without the unknowns associated with other similar tools.

Outside of the Apple Intelligence integration, a host of other exciting features are coming to Xcode and Swift, including safer concurrent programming through compile-time data-race safety; the new Swift Testing framework; updates to SwiftUI that allow gesture recognizers and animations to be shared between SwiftUI, UIKit, and AppKit; and updates to SwiftData for tracking data model history, indexing and flagging unique data attributes, and configuring alternate data stores.
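Of those, Swift Testing is the easiest to picture today. Here’s a small example of the new macro-based style; the function under test is a trivial placeholder of our own.

```swift
import Testing

// A trivial function under test, standing in for real app logic.
func add(_ a: Int, _ b: Int) -> Int { a + b }

@Suite("Math helpers")
struct MathTests {
    @Test("Addition combines two values")
    func addition() {
        #expect(add(2, 3) == 5)
    }

    @Test("Addition with negative numbers")
    func negatives() {
        #expect(add(-2, 2) == 0)
    }
}
```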


Personalization and Customization

A key theme this year was the overarching goal of offering greater personalization and customization in service of users.

On the personalization front, Apple is taking every opportunity to intelligently surface content that is specifically relevant to the user: prioritizing notifications, exposing personalized health insights and vitals on Apple Watch, intelligently featuring app content within the Photos app, and more. These experiences are automatically tailored to the user based on the system’s knowledge of them and their behaviors, without the user needing to take any manual action to trigger them.

Examples of customization controls include the new ability to fully customize and reposition home screen icons, switch between dark and light versions of icons or tint them with a custom color, customize the controls within Control Center, and resize volumetric experiences in visionOS. These don’t trigger automatically; they are controls offered to users so they can shape the experience in a way that works perfectly for them.

In general, we should be taking every opportunity to use our insights about the user to automatically personalize their experience and tailor it specifically to them. If they don’t even realize it’s happening, that’s a huge success, and makes that experience particularly magical. And where we can’t do it automatically, we should at least offer users the controls that allow them to customize their experience and make it specifically their own.


Next Steps

We’re only one day in, and there’s already a lot to love! Initial beta versions of the updated tools are already available today, with public betas scheduled in July, and production releases ready in the fall. We’ll be tuning into sessions and working with the new tools and frameworks over the coming week, and are excited to see what else we learn over that time. We’ll also bring additional details and insights to our clients as we learn more and begin to chart development roadmaps over the coming year.

It’s an exciting time to be an Apple developer!


Image credit: Apple
