Apple kicked off its annual WWDC event on June 5th. As rumored and eagerly anticipated, the curtain was finally pulled back on the long-awaited Vision Pro! This is a tremendously exciting device, and it introduces a variety of entirely new paradigms to the Apple ecosystem.


Vision Pro

As a “hybrid” device, the Vision Pro straddles the line between augmented reality and virtual reality. Activities like watching movies and experiencing virtual environments can be fully immersive, while other activities like running apps and collaborating with others can be overlaid virtually onto the user’s actual surroundings.

Interactions with the Vision Pro are controlled purely by the eyes, hands, and voice, with no need for additional controllers or input devices. Steve Jobs famously declared that a stylus had no place on the iPhone, and that more natural finger-based gestures were the ideal interaction paradigm. Vision Pro builds on that same philosophy and extends it into all three dimensions, keeping user comfort and simple interactions at the center of the experience.

Apple sees the Vision Pro as the start of a brand new type of computing experience. Where the Mac defined “personal computing” and the iPhone defined “mobile computing,” the term “spatial computing” was used repeatedly to describe the new paradigm offered by the device and visionOS.

Virtual reality experiences can be relatively isolating, so it’s interesting to see the care Apple is taking to break down those social walls with Vision Pro. Strikingly, with “EyeSight,” the device’s front-facing display renders a 3D persona of the wearer’s eyes, giving the illusion that the front of the device is a pure pane of glass. When the wearer is in a more fully immersive experience, the persona rendering goes away and is replaced by an amorphous visual indicating that they’re focused on their content. And when other people draw near, the wearer sees a visual glimpse of them approaching in the display. Both of these technologies are meant to bring awareness and connectedness to people on either side of the display, and a more “human” connection to the experience. The device’s Digital Crown also allows wearers to manually adjust the immersiveness of each experience.

The promise of fully three-dimensional interactive experiences is the most tantalizing, but a key component of the initial Vision Pro experience will be allowing two-dimensional apps to run within the virtual environment. As Apple explained, with Vision Pro, “your entire world is a canvas for apps.” iPhone and iPad apps can run directly within the visionOS experience, and users will also be able to interact directly with their Mac within the virtual space. Even if you don’t have a need for a fully 3D experience, updating your iPhone and iPad apps to target visionOS should be an easy win and a great introduction to the platform.
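
As a rough illustration, here’s a minimal sketch of the kind of standard SwiftUI structure that carries over: a plain WindowGroup scene like this presents as a 2D window in the visionOS Shared Space once the visionOS destination is added to the Xcode project. The app and view names here are hypothetical.

```swift
import SwiftUI

// A standard SwiftUI entry point: the same WindowGroup scene that drives
// an iPhone or iPad app presents as a window in the visionOS Shared Space
// once the visionOS destination is added to the Xcode project.
@main
struct NewsReaderApp: App {
    var body: some Scene {
        WindowGroup {
            StoriesView()
        }
    }
}

struct StoriesView: View {
    var body: some View {
        NavigationStack {
            List(1..<6, id: \.self) { index in
                Text("Story \(index)")
            }
            .navigationTitle("Today")
        }
    }
}
```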

Vision Pro will be available in early 2024 with a price tag of $3,499. We’re incredibly eager to get our hands on it and start exploring the possibilities!


Developing for Vision Pro

Vision Pro and visionOS development is closely linked to iOS and iPadOS development, and the core tools, including Xcode, Swift, SwiftUI, ARKit, and RealityKit, are exactly the same. SDKs and documentation are expected to be available at the end of the month, along with an update to the Human Interface Guidelines tailored to the spatial computing environment.

We plan to start by compiling existing iOS and iPadOS apps for visionOS and building an understanding of how those 2D experiences can be brought into 3D. This will be a fantastic way for clients and brands to build their presence in the visionOS App Store. Components from UIKit and SwiftUI will also provide visionOS-specific behaviors like hover effects automatically, meaning that, as always, developers will find great efficiencies and apps will feel most at home when native frameworks are used to build visionOS apps.
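
To make that concrete, here’s a small, hypothetical SwiftUI sketch: standard controls like Button are expected to pick up visionOS hover feedback for free, while custom views can opt in with the hoverEffect modifier, which already exists today for iPad pointer interactions.

```swift
import SwiftUI

struct PlaybackControls: View {
    var body: some View {
        HStack(spacing: 24) {
            // Standard controls pick up hover feedback automatically
            // when the user looks at them on visionOS.
            Button {
                // Start playback.
            } label: {
                Label("Play", systemImage: "play.fill")
            }

            // Custom, non-control views can opt in explicitly; the same
            // modifier drives pointer hover effects on iPadOS today.
            Image(systemName: "heart")
                .padding()
                .contentShape(Circle())
                .hoverEffect(.highlight)
        }
    }
}
```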

Apple also announced a simulator that will allow developers to test visionOS experiences within a virtual world, even before having the actual hardware in hand. It will support simulating a variety of environments, including different times of day. Amazingly, Macs will also be directly accessible within the virtual world, allowing developers to work inside Xcode on their real computer while wearing the Vision Pro.


Widgets

Widgets are coming to iPadOS and macOS! Specifically, the new widget functionality introduced at last year’s WWDC is being extended to the larger-screen platforms. Interestingly, macOS Sonoma also adds support for running iPhone widgets within macOS. This is a great opportunity to bring these experiences to macOS, and a great incentive to build a widget for any iOS app that doesn’t currently have one. Widgets are also becoming interactive, allowing controls like buttons and toggles to be used directly within a widget without launching the app.
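
As a sketch of how this looks in code (with hypothetical names throughout), an interactive widget pairs a SwiftUI Button with an App Intent that the system runs when the button is tapped:

```swift
import AppIntents
import SwiftUI

// A hypothetical App Intent that the system runs when the widget's
// button is tapped, without launching the full app.
struct MarkTaskDoneIntent: AppIntent {
    static var title: LocalizedStringResource = "Mark Task Done"

    func perform() async throws -> some IntentResult {
        // Update shared app state here, e.g. mark today's task complete.
        return .result()
    }
}

// Inside the widget's view, an interactive control is simply a SwiftUI
// Button paired with that intent.
struct DailyTaskWidgetView: View {
    var body: some View {
        VStack(spacing: 8) {
            Text("Daily task")
                .font(.headline)
            Button("Mark done", intent: MarkTaskDoneIntent())
        }
    }
}
```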

Adding support for these new features will require a bit of development work, and even existing widgets that don’t use them will still need minor updates to account for the new padding and layout behaviors that allow widgets to work across the new platforms. So, expect to do a little work to make your widgets shine on the new platforms and OS versions!
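
Our current understanding, shown here as a hedged sketch with placeholder types, is that widget views will declare their background through the container background API so the system can adapt it per platform, and that a configuration can opt out of the new default content margins when a design truly needs edge-to-edge content:

```swift
import SwiftUI
import WidgetKit

struct TaskEntry: TimelineEntry {
    let date: Date
}

struct TaskProvider: TimelineProvider {
    func placeholder(in context: Context) -> TaskEntry { TaskEntry(date: .now) }

    func getSnapshot(in context: Context, completion: @escaping (TaskEntry) -> Void) {
        completion(TaskEntry(date: .now))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<TaskEntry>) -> Void) {
        completion(Timeline(entries: [TaskEntry(date: .now)], policy: .never))
    }
}

struct TaskWidgetEntryView: View {
    let entry: TaskEntry

    var body: some View {
        Text("3 tasks remaining")
            // The widget declares its background through the container
            // background API so the system can adapt it to each platform.
            .containerBackground(.background, for: .widget)
    }
}

struct TaskWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "TaskWidget", provider: TaskProvider()) { entry in
            TaskWidgetEntryView(entry: entry)
        }
        .configurationDisplayName("Tasks")
        // Only opt out of the system-provided margins if the design
        // genuinely needs edge-to-edge content.
        .contentMarginsDisabled()
    }
}
```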


TipKit

An unexpected addition to the iOS experience is the introduction of the TipKit API. TipKit allows developers to embed “tooltip” popovers into apps to help explain how different pieces of app functionality work. We always strive to craft app designs that are intuitive and simple, and great supplementary onboarding experiences help orient new users. But for especially complex or hidden features, TipKit tooltips will be a fantastic way to help users get their bearings in a familiar way.
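
Since the SDK isn’t in hand yet, this is only a sketch of how we expect adoption to look based on what’s been shown: a tip is declared as a lightweight type, configured once at launch, and anchored to a view with a popover modifier. All names here are illustrative.

```swift
import SwiftUI
import TipKit

// A hypothetical tip describing a hidden gesture in the app.
struct FavoriteTip: Tip {
    var title: Text { Text("Save as a Favorite") }
    var message: Text? { Text("Swipe a story to keep it at the top of your feed.") }
}

struct FeedView: View {
    private let favoriteTip = FavoriteTip()

    var body: some View {
        List {
            Text("Top Stories")
                // Anchors the tooltip popover to this row until the user
                // dismisses it or its display rules are satisfied.
                .popoverTip(favoriteTip)
        }
        .task {
            // Tips are configured once, typically at app launch.
            try? Tips.configure()
        }
    }
}
```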


watchOS 10

watchOS is getting a handful of visual upgrades, and existing watchOS apps will likely need some adjustments to take full advantage of them. As always, SwiftUI will make it especially easy to adopt these changes. watchOS 10 introduces a new navigation bar height, as well as dynamic expanding and collapsing animations during scrolling. Apps can also take deeper advantage of full-screen designs, with the ability to provide full-screen tint colors and controls in a variety of new layouts.
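
Here’s a small sketch of what we expect adoption to look like (the view name is hypothetical, and the exact modifiers may shift once the SDK lands): a vertically paging TabView with a full-screen background tint that the system can show behind the content.

```swift
import SwiftUI

// A watchOS 10-style layout: vertically paging tabs with a full-screen
// background tint that the system can show behind the content.
struct WorkoutRootView: View {
    var body: some View {
        TabView {
            Text("Session")
            Text("Metrics")
        }
        .tabViewStyle(.verticalPage)
        .containerBackground(Color.blue.gradient, for: .tabView)
    }
}
```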

New Core Motion APIs will also provide access to higher-fidelity motion and accelerometer data, which will be especially handy for creating motion-based watch experiences such as golf swing analyzers and tennis trackers.
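
We haven’t seen the new interfaces yet, so as a baseline, here’s the standard Core Motion pattern such an experience builds on today (the class name and sampling rate are illustrative); the new APIs are expected to deliver similar samples at much higher rates.

```swift
import CoreMotion

// Today's baseline: stream device-motion updates from CMMotionManager.
// This sketch shows the general consumption pattern rather than the new
// higher-fidelity interfaces themselves.
final class SwingSampler {
    private let motionManager = CMMotionManager()

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 100.0 // 100 Hz
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion else { return }
            // Rotation rate and user acceleration are the raw ingredients
            // for swing-speed and impact-detection heuristics.
            let rotation = motion.rotationRate
            let acceleration = motion.userAcceleration
            print(rotation.x, acceleration.x)
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```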


Privacy and Security

As always, privacy and security are central to Apple’s technology strategy, and a few interesting new features will help developers protect their users’ privacy. The new Calendar add-only permission allows an app to add events to the user’s calendar without being able to read it. A new photo picker will also let users grant access to individual photos, or to their whole library, through an easy-to-use default system prompt.
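
As a sketch of how adopting the add-only permission might look (the write-only request method name reflects the announced EventKit additions and should be treated as an assumption until the SDK ships):

```swift
import EventKit

// Sketch of adopting the new add-only Calendar permission.
final class EventPublisher {
    private let store = EKEventStore()

    func addReminderEvent(titled title: String, at date: Date) async throws {
        // Requests permission to create events without exposing the
        // user's existing calendar contents to the app. (Assumed API.)
        guard try await store.requestWriteOnlyAccessToEvents() else { return }

        let event = EKEvent(eventStore: store)
        event.title = title
        event.startDate = date
        event.endDate = date.addingTimeInterval(60 * 60)
        event.calendar = store.defaultCalendarForNewEvents
        try store.save(event, span: .thisEvent)
    }
}
```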

Privacy nutrition labels are also getting two helpful updates. First, “privacy manifests” will give developers a complete understanding of the third-party SDKs in their apps and exactly how each of them operates, allowing them to provide accurate privacy nutrition information for every SDK in the app, even without access to the code within. Second, signatures for third-party SDKs will allow developers to verify that SDKs are coming from validated and trusted developer sources.


Swift and Xcode 15

Swift is also getting a suite of exciting improvements. Swift Macros promise to simplify development and allow powerful functionality to be dropped in with simple adornments to blocks of code. SwiftData leverages Swift Macros to beautifully integrate Core Data-like functionality into SwiftUI, including data persistence and modeling, migrations, and iCloud integration. Animations continue to be enhanced, including the ability to add keyframed animations to view components. In addition, Swift C++ interoperability will allow powerful C++ code to be integrated directly into Swift, with no need for an additional bridging layer.
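
Here’s a minimal SwiftData sketch, with hypothetical model and view names, showing the pieces we’re most excited about: the @Model macro generating the persistence machinery, a model container wired into the scene, and @Query keeping a SwiftUI view in sync with stored data.

```swift
import SwiftData
import SwiftUI

// The @Model macro generates the persistence machinery that previously
// required hand-built Core Data classes.
@Model
final class Trip {
    var name: String
    var startDate: Date

    init(name: String, startDate: Date) {
        self.name = name
        self.startDate = startDate
    }
}

// The model container wires persistence into the SwiftUI scene.
@main
struct TripPlannerApp: App {
    var body: some Scene {
        WindowGroup {
            TripListView()
        }
        .modelContainer(for: Trip.self)
    }
}

struct TripListView: View {
    @Environment(\.modelContext) private var context
    @Query(sort: \Trip.startDate) private var trips: [Trip]

    var body: some View {
        NavigationStack {
            List(trips) { trip in
                Text(trip.name)
            }
            .navigationTitle("Trips")
            .toolbar {
                Button("Add") {
                    // Inserting into the context persists the new model.
                    context.insert(Trip(name: "New Trip", startDate: .now))
                }
            }
        }
    }
}
```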

Xcode is also getting some great updates. Swift Macros bring simpler UI component previews to the experience. Integrated source control tools give the ability to make per-line commits and prepare pull requests right within the Xcode editor. And a more robust unit and UI testing experience allows tests to be run with recorded video output and consolidated reports for each test.
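
For example, the new preview syntax (shown here with placeholder content) replaces the old PreviewProvider boilerplate with a single macro:

```swift
import SwiftUI

// A preview declared with the new #Preview macro; the trailing closure
// simply returns the view to render in the Xcode canvas.
#Preview("Empty state") {
    Text("No trips yet")
        .padding()
}
```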


Next Steps

We’re only one day in, and there’s already so much exciting news! WWDC sessions will continue throughout the week, and we’ll be deeply tuned in. Excited about Vision Pro? Interested in building a new app for it, or even bringing an existing app into the third dimension? Reach out and let’s chat about it!


