Apple Vision Pro


Vision Pro Dev Kit Applications Will Open in July

Apple says it will give developers the opportunity to apply for Vision Pro dev kits starting sometime in July.

In addition to releasing a first round of developer tools last week, including a software ‘Simulator’ of Vision Pro, Apple also wants to give developers a chance to get their hands on the headset itself.

The company indicates that applications for a Vision Pro development kit will open starting in July, and developers will be able to find details here when the time comes.

There’s no telling how many of the development kits the company plans to send out, or exactly when they will start shipping, but given Apple’s culture of extreme secrecy you can bet selected developers will be locked down with strict NDAs regarding their use of the device.

The Vision Pro developer kit isn’t the only way developers will be able to test their apps on a real headset.

Developers will also be able to apply to attend ‘Vision Pro developer labs’:

Apply for the opportunity to attend an Apple Vision Pro developer lab, where you can experience your visionOS, iPadOS, and iOS apps running on Apple Vision Pro. With direct support from Apple, you’ll be able to test and optimize your apps and games, so they’ll be ready when Apple Vision Pro is available to customers. Labs will be available in six locations worldwide: Cupertino, London, Munich, Shanghai, Singapore, and Tokyo.

Our understanding is that applications for the developer labs will also open in July.

Additionally, developers will also be able to request that their app be reviewed by Apple itself on visionOS, though this is restricted to existing iPhone and iPad apps, rather than newly created apps for visionOS:

If you currently have an iPad or iPhone app on the App Store, we can help you test it on Apple Vision Pro. Request a compatibility evaluation from App Review to get a report on your app or game’s appearance and how it behaves in visionOS.

Vision Pro isn’t planned to ship until early 2024, but Apple wants to have third-party apps ready and waiting for when that time comes.



Vision Pro’s Modular Design Invites Apple’s Massive Third-party Accessory Ecosystem

One thing that didn’t get much attention during the announcement of Apple Vision Pro is the headset’s modular design. With straightforward connections and magnetic mounting, the company’s robust ecosystem of third-party accessory makers will no doubt be scurrying to offer options.

In a perhaps surprising move, Apple built Vision Pro with modularity in mind. This is surely a recognition by the company that one size does not fit all when it comes to an XR headset.

When I tried Vision Pro for myself earlier this month, I found that ergonomics were one of the few places where it didn’t feel like Apple was raising the bar. But considering the modular design of the headset, it seems likely there will be options to choose from.

Not only did the headband of my demo unit have an ‘M’ on it (suggesting Apple itself is probably making S, M, and L sizes), but the way each piece of the headset attaches together makes it appear that the door is wide open for third-party accessories.

First there is the quick-release headstrap, which is easily disconnected with a simple pull of the orange tab.

Image courtesy Apple

And luckily the way the speakers are mounted means they’ll always be there no matter which strap you’re using.

Image courtesy Apple

Then there’s the facepad which is magnetically attached, meaning third-party facepads can make use of those magnetic attachment points.

Image courtesy Apple

The same goes for the prescription lens inserts; although Zeiss has been named as the official maker of prescription lenses for Vision Pro, any lens maker should be able to make lenses that clip in magnetically.

Image courtesy Zeiss

The only question that’s up in the air is the headset’s battery, which attaches with a curious rotating connector.

Image courtesy Apple

It’s unclear if Apple has made this connector proprietary in some way that’s difficult for third parties to couple with. If Apple didn’t go out of its way to prevent them from doing so, we’ll likely see additional battery options, like larger-capacity batteries and even battery headstraps that avoid a tether running down to your pocket.

Apple has one of the most robust third-party accessory ecosystems of any consumer electronics brand—estimated at tens of billions in annual revenue. The company is also pretty good about providing detailed resources and guidelines for accessory makers, including full diagrams of products for accessories that require precise fitting, and it’s very likely this will eventually extend to Vision Pro.

Be it iPhone cases or MacBook keyboard covers, it’s not uncommon for the company’s third-party accessory makers to race to be the first on the market with an accessory for the newest Apple product, and you can bet there will be at least a few gunning for that finish line when Vision Pro launches early next year.

Image courtesy Apple

The thing I’m most looking forward to is third-party headstraps. While the one that comes with Vision Pro is nice from the standpoint of the materials and tightening mechanism, I still almost always prefer a more rigid strap, which should be possible given the modularity of the headset as we know it today.



How Cities Are Taking Advantage of AR Tech and How Apple’s Vision Pro Could Fuel Innovation

Apple unveiled its first mixed reality headset, the Vision Pro, at this year’s Worldwide Developers Conference (WWDC) on June 5, 2023. The company’s “first spatial computer” will enable users to interact with digital content like never before by leveraging a new 3D interface to deliver immersive spatial experiences.

The Vision Pro marks a new era for immersive technologies, and it can potentially be used to bolster efforts in using such technologies to improve communities.

How the Vision Pro Headset Can Strengthen Efforts to Transform Orlando

Cities around the world are starting to apply new technologies to help improve their communities. City, University of London, for instance, has launched an initiative that will bring about the UK’s largest AR, VR, and metaverse training center. London has also been mapped in 3D, allowing locals and visitors to have an immersive view of the city.

In 2021, Columbia University started a project called the “Hybrid Twins for Urban Transportation”, which creates a digital twin of New York’s key intersections to help optimize traffic flows.

Using New Technologies to Enhance Orlando’s Digital Twin Initiative

With Orlando, Florida designated as the metaverse’s MetaCenter, new MR headsets like Apple’s Vision Pro can help create radical changes that bolster the city’s digital twin efforts, which can accelerate Orlando’s metaverse capabilities.

Apple Vision Pro

In an interview with ARPost, Tim Giuliani, the President and CEO of the Orlando Economic Partnership (OEP), shared that emerging technologies like the digital twin enable them to showcase the region to executives who are planning to relocate their companies to Orlando.

Moreover, the digital twin helps local leaders ensure that the city has a robust infrastructure to support its residents, thus positively impacting the city’s economy and prosperity.

The digital twin’s physical display is currently housed at the OEP’s headquarters in downtown Orlando. However, Giuliani shared that AR headsets can make it more accessible.

“We can use the headset’s technology to take our digital twin to trade shows or whenever it goes out to market to companies,” said Giuliani. According to Giuliani, utility companies and city planners can use the 3D model to access a holographic display when mapping out proposed infrastructure improvements. Stakeholders can also use it to create 3D models using their own data for simulations like climate change and infrastructure planning.

He added that equipment like the Vision Pro can help make VR, AR, and 3D simulation more widespread. According to Giuliani, while the Vision Pro is the first to come out, other new devices will follow in the coming years, and competition will turn them into consumer devices.

“Apple’s announcement cements the importance of the MetaCenter. The Orlando region has been leading in VR and AR and 3D simulation for over a decade now. So, all the things that we have been saying of why we are the MetaCenter, this hardware better positions us to continue leading in this territory,” he told us.

Leveraging the Vision Pro and MR to Usher in New Innovations

Innovate Orlando CEO and OEP Chief Information Officer David Adelson noted that aside from companies, ordinary individuals who aren’t keenly interested in immersive tech for development or work can also use devices like the Vision Pro to help Orlando with its effort to become the MetaCenter.

“These new devices are one of the hardware solutions that this industry has been seeking. Through these hardware devices, the software platforms, and simulation market that has been building for decades, will now be enabled on a consumer and a business interface,” said Adelson.

Adelson also shared that Orlando has been leading in the spatial computing landscape and that the emergence of a spatial computing headset like the Vision Pro brings this particular sector into the spotlight.

How can businesses leverage the new Vision Pro headset and other MR technologies to usher in new developments?

According to Giuliani, businesses can use these technologies to provide a range of services, such as consulting services, as well as help increase customer engagement, cut costs, and make informed decisions faster.

“AR can be a powerful tool to provide remote expertise, and remote assistance with AR helps move projects forward and provide services that would otherwise require multiple site visits. This is what we are taking advantage of with the digital twin,” said Giuliani.

Giuliani also noted that such technologies can be a way for companies to empower both employees and customers by enhancing productivity, improving services, and fostering better communication.

Potential Drawbacks of Emerging Technologies

Given that these are still relatively new pieces of technology, it’s possible that they’ll have some drawbacks. However, according to Adelson, these can be seen as a positive movement that can potentially change the Web3 landscape. Giuliani echoes this sentiment.

“We like to focus on the things that can unite us and help us move forward to advance broad-based prosperity and this means working with the new advancements created and finding ways to make them work and facilitate the work we all do,” he told us.



Apple Releases Vision Pro Development Tools and Headset Emulator

Apple has released new and updated tools for developers to begin building XR apps on Apple Vision Pro.

Apple Vision Pro isn’t due out until early 2024, but the company wants developers to get a jump-start on building apps for the new headset.

To that end, the company announced today that it has released the visionOS SDK, an updated Xcode, Simulator, and Reality Composer Pro, which developers can access at the visionOS developer website.

While some of the tools will be familiar to Apple developers, tools like Simulator and Reality Composer Pro are newly released for the headset.

Simulator is the Apple Vision Pro emulator, which aims to give developers a way to test their apps before getting their hands on the headset. The tool effectively acts as a software version of Apple Vision Pro, allowing developers to see how their apps will render and behave on the headset.

Reality Composer Pro is aimed at making it easy for developers to build interactive scenes with 3D models, sounds, and textures. From what we understand, it’s sort of like an easier (albeit less capable) alternative to Unity. However, developers who already know or aren’t afraid to learn a full-blown game engine can also use Unity to build visionOS apps.

Image courtesy Apple

In addition to the release of the visionOS SDK today, Apple says it’s still on track to open a handful of ‘Developer Labs’ around the world where developers can get their hands on the headset and test their apps. The company also says developers will be able to apply to receive Apple Vision Pro development kits next month.



A Concise Beginner’s Guide to Apple Vision Pro Design & Development

Apple Vision Pro has brought new ideas to the table about how XR apps should be designed, controlled, and built. In this Guest Article, Sterling Crispin offers up a concise guide for what first-time XR developers should keep in mind as they approach app development for Apple Vision Pro.

Guest Article by Sterling Crispin

Sterling Crispin is an artist and software engineer with a decade of experience in the spatial computing industry. His work has spanned product design and the R&D of new technologies at companies like Apple, Snap Inc, and various other tech startups working on face computers.

Editor’s Note:  The author would like to remind readers that he is not an Apple representative; this info is personal opinion and does not contain non-public information. Additionally, more info on Vision Pro development can be found in Apple’s WWDC23 videos (select Filter → visionOS).

Ahead is my advice for designing and developing products for Vision Pro. This article includes a basic overview of the platform, tools, porting apps, general product design, prototyping, perceptual design, business advice, and more.

Overview

Apps on visionOS are organized into ‘scenes’, which are Windows, Volumes, and Spaces.

Windows are a spatial version of what you’d see on a normal computer. They’re bounded rectangles of content that users surround themselves with. These may be windows from different apps or multiple windows from one app.

Volumes are things like 3D objects, or small interactive scenes. Like a 3D map, or small game that floats in front of you rather than being fully immersive.

Spaces are fully immersive experiences where only one app is visible. A Space could be full of many Windows and Volumes from your app, or, like a VR game, the system goes away and fully immersive content surrounds you. You can think of visionOS itself as a Shared Space where apps coexist together and you have less control, whereas Full Spaces give you the most control and immersiveness but don’t coexist with other apps. Spaces have immersion styles: mixed, progressive, and full, which define how much or how little of the real world the user sees.
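As a rough sketch of how these scene types map to code, a visionOS app declares them in its App body. This is a minimal illustration based on Apple’s announced SwiftUI API; the scene identifiers and placeholder views are my own, not Apple’s.

```swift
import SwiftUI

@main
struct ExampleApp: App {
    var body: some Scene {
        // A Window: a bounded 2D panel in the Shared Space.
        WindowGroup(id: "main") {
            Text("Hello, visionOS")
        }

        // A Volume: a bounded region for 3D models or small scenes.
        WindowGroup(id: "globe") {
            Text("3D content goes here")
        }
        .windowStyle(.volumetric)

        // A Space: a fully immersive experience; the immersion style
        // controls how much of the real world remains visible.
        ImmersiveSpace(id: "immersive") {
            Text("Immersive content goes here")
        }
        .immersionStyle(selection: .constant(.mixed),
                        in: .mixed, .progressive, .full)
    }
}
```

An app can declare any mix of these; the system decides how they coexist based on whether the user is in the Shared Space or a Full Space.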

User Input

Users can look at the UI and pinch, like the Apple Vision Pro demo videos show. But you can also reach out and tap on windows directly, sort of like it’s actually a floating iPad, or use a Bluetooth trackpad or video game controller. You can also look and speak in search bars. There’s also a Dwell Control for eyes-only input, but that’s really an accessibility feature. For a simple dev approach, your app can just use events like a TapGesture. In this case, you won’t need to worry about where these events originate from.
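A minimal sketch of that input-agnostic approach: a standard SwiftUI TapGesture fires the same way no matter how the tap was produced. The view and counter here are illustrative, not from Apple’s samples.

```swift
import SwiftUI

struct TapCounterView: View {
    @State private var taps = 0

    var body: some View {
        Text("Taps: \(taps)")
            .padding()
            // Fires regardless of input source: gaze-and-pinch,
            // a direct touch on the window, or a trackpad click.
            .gesture(TapGesture().onEnded { taps += 1 })
    }
}
```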

Spatial Audio

Vision Pro has an advanced spatial audio system that makes sounds seem like they’re really in the room by considering the size and materials of your room. Using subtle sounds for UI interaction and taking advantage of sound design for immersive experiences is going to be really important. Make sure to take this topic seriously.

Development

If you want to build something that works between Vision Pro, iPad, and iOS, you’ll be operating within the Apple dev ecosystem, using tools like Xcode and SwiftUI. However, if your goal is to create a fully immersive VR experience for Vision Pro that also works on other headsets like Meta’s Quest or PlayStation VR, you have to use Unity.

Apple Tools

For Apple’s ecosystem, you’ll use SwiftUI to create the UI the user sees and the overall content of your app. RealityKit is the 3D rendering engine that handles materials, 3D objects, and light simulations. You’ll use ARKit for advanced scene understanding, like if you want someone to throw virtual darts and have them collide with their real wall, or do advanced things with hand tracking. But those rich AR features are only available in Full Spaces. There’s also Reality Composer Pro, which is a 3D content editor that lets you drag things around a 3D scene and make media-rich Spaces or Volumes. It’s like diet-Unity that’s built specifically for this development stack.
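The glue between SwiftUI and RealityKit is the RealityView container, which loads RealityKit entities into a SwiftUI hierarchy. A minimal sketch, where "Scene" is a hypothetical asset name (for example, one exported from a Reality Composer Pro project bundled with the app):

```swift
import SwiftUI
import RealityKit

struct ModelView: View {
    var body: some View {
        // RealityView's make closure runs asynchronously,
        // so entities can be loaded from the bundle here.
        RealityView { content in
            if let entity = try? await Entity(named: "Scene") {
                content.add(entity)
            }
        }
    }
}
```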

One cool thing with Reality Composer is that it’s already full of assets, materials, and animations. That helps developers who aren’t artists build something quickly and should help to create a more unified look and feel to everything built with the tool. Pros and cons to that product decision, but overall it should be helpful.

Existing iOS Apps

If you’re bringing an iPad or iOS app over, it will probably work unmodified as a Window in the Shared Space. If your app supports both iPad and iPhone, the headset will use the iPad version.

To customize your existing iOS app to take better advantage of the headset, you can use the Ornament API to make little floating islands of UI in front of, or beside, your app to make it feel more spatial. Ironically, if your app is using a lot of ARKit features, you’ll likely need to ‘reimagine’ it significantly to work on Vision Pro, as ARKit has been upgraded a lot for the headset.
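As a sketch of what an ornament looks like in code, the SwiftUI modifier attaches secondary UI just outside the window’s bounds. The button labels and layout here are placeholders:

```swift
import SwiftUI

struct OrnamentedView: View {
    var body: some View {
        Text("Main window content")
            // An ornament: a small control strip that floats
            // just outside the bottom edge of the window.
            .ornament(attachmentAnchor: .scene(.bottom)) {
                HStack {
                    Button("Back") { }
                    Button("Forward") { }
                }
                .padding()
                .glassBackgroundEffect()
            }
    }
}
```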

If you’re excited about building something new for Vision Pro, my personal opinion is that you should prioritize how your app will provide value across iPad and iOS too. Otherwise you’re losing out on hundreds of millions of users.

Unity

You can build to Vision Pro with the Unity game engine, which is a massive topic. Again, you need to use Unity if you’re building to Vision Pro as well as a Meta headset like the Quest or PSVR 2.

Unity supports building Bounded Volumes for the Shared Space, which exist alongside native Vision Pro content, and Unbounded Volumes for immersive content that may leverage advanced AR features. Finally, you can also build more VR-like apps, which give you more control over rendering but seem to lack support for ARKit scene understanding like plane detection. The Volume approach gives RealityKit more control over rendering, so you have to use Unity’s PolySpatial tool to convert materials, shaders, and other features.

Unity support for Vision Pro includes tons of interactions you’d expect to see in VR, like teleporting to a new location or picking up and throwing virtual objects.

Product Design

You could just make an iPad-like app that shows up as a floating window, use the default interactions, and call it a day. But like I said above, content can exist in a wide spectrum of immersion, locations, and use a wide range of inputs. So the combinatorial range of possibilities can be overwhelming.

If you haven’t spent 100 hours in VR, get a Quest 2 or 3 as soon as possible and try everything. It doesn’t matter if you’re a designer, or product manager, or a CEO, you need to get a Quest and spend 100 hours in VR to begin to understand the language of spatial apps.

I highly recommend checking out Hand Physics Lab as a starting point and overview for understanding direct interactions. There are a lot of subtle things it does which imbue virtual objects with a sense of physicality. And the YouTube VR app that was released in 2019 looks and feels pretty similar to a basic visionOS app; it’s worth checking out.

Keep a diary of what works and what doesn’t.

Ask yourself: ‘What app designs are comfortable, or cause fatigue?’, ‘What apps have the fastest time-to-fun or value?’, ‘What’s confusing and what’s intuitive?’, ‘What experiences would you even bother doing more than once?’ Be brutally honest. Learn from what’s been tried as much as possible.

General Design Advice

I strongly recommend the IDEO-style design thinking process; it works for spatial computing too. You should absolutely try it out if you’re unfamiliar. There’s the Design Kit with resources, and this video which, while dated, is a great example of the process.

The road to spatial computing is a graveyard of utopian ideas that failed. People tend to spend a very long time building grand solutions for the imaginary problems of imaginary users. It sounds obvious, but instead you should try to build something as fast as possible that fills a real human need, and then iteratively improve from there.

Continue on Page 2: Spatial Formats and Interaction »



Apple Vision Pro Will Have an ‘Avatar Webcam’, Automatically Integrating with Popular Video Chat Apps

In addition to offering immersive experiences, Apple says that Vision Pro will be able to run most iPad and iOS apps out of the box with no changes. For video chat apps like Zoom, Messenger, Discord, and others, the company says that an ‘avatar webcam’ will be supplied to apps, making them automatically able to handle video calls between the headset and other devices.

Apple says that on day one, all suitable iOS and iPadOS apps will be available on the headset’s App Store. According to the company, “most apps don’t need any changes at all,” and the majority should run on the headset right out of the box. Developers will be able to opt out of having their apps on the headset if they’d like.

For video conferencing apps like Zoom, Messenger, Discord, and Google Meet, which expect access to the front camera of an iPhone or iPad, Apple has done something clever for Vision Pro.

Instead of a live camera view, Vision Pro provides a view of the headset’s computer-generated avatar of the user (which Apple calls a ‘Persona’). That means that video chat apps that are built according to Apple’s existing guidelines should work on Vision Pro without any changes to how the app handles camera input.

How Apple Vision Pro ‘Persona’ avatars are represented | Image courtesy Apple

Personas use the headset’s front cameras to scan the user’s face to create a model; the model is then animated according to head, eye, and hand inputs tracked by the headset.

Image courtesy Apple

Apple confirmed as much in a WWDC developer session called Enhance your iPad and iPhone apps for the Shared Space. The company also confirmed that apps asking for access to a rear-facing camera (i.e. a photography app) on Apple Vision Pro will get only black frames with a ‘no camera’ symbol. This alerts the user that there’s no rear-facing camera available, but also means that iOS and iPad apps will continue to run without errors, even when they expect to see a rear-facing camera.
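For apps that want to handle this gracefully rather than show black frames, the standard AVFoundation device lookup already behaves sensibly: on hardware with no rear camera it returns nil. A hedged sketch (the helper function is my own, not an Apple API):

```swift
import AVFoundation

// Returns false on a device with no rear camera, letting an
// iPad app fall back to other behavior instead of assuming
// the capture device exists.
func rearCameraAvailable() -> Bool {
    AVCaptureDevice.default(.builtInWideAngleCamera,
                            for: .video,
                            position: .back) != nil
}
```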

There are potentially other reasons that video chat apps like Zoom, Messenger, or Discord might not work with Apple Vision Pro right out of the box, but at least as far as camera handling goes, it should be easy for developers to get video chats up and running using a view of the user’s Persona.

It’s even possible that ‘AR face filters’ in apps like Snapchat and Messenger will work correctly with the user’s Apple Vision Pro avatar, with the app being none the wiser that it’s actually looking at a computer-generated avatar rather than a real person.

Image courtesy Apple

In another WWDC session, the company explained more about how iOS and iPad apps behave on Apple Vision Pro without modification.

Developers can expect up to two inputs from the headset (the user can pinch each hand as its own input), meaning any apps expecting two-finger gestures (like pinch-zoom) should work just fine, but three fingers or more won’t be possible from the headset. As for apps that require location information, Apple says the headset can provide an approximate location via Wi-Fi, or a specific location shared via the user’s iPhone.
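A sketch of why two-point gestures carry over: a standard SwiftUI magnification gesture only needs two simultaneous inputs, which on Vision Pro map to one pinch per hand. The view itself is illustrative, not from Apple’s documentation:

```swift
import SwiftUI

struct ZoomableImage: View {
    @State private var scale: CGFloat = 1.0

    var body: some View {
        Image(systemName: "photo")
            .scaleEffect(scale)
            // A two-point gesture: each hand's pinch supplies one
            // input, so magnification works unmodified. Gestures
            // requiring three or more touches won't be possible.
            .gesture(MagnificationGesture().onChanged { scale = $0 })
    }
}
```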

Unfortunately, existing ARKit apps won’t work out of the box on Apple Vision Pro. Developers will need to use a newly upgraded ARKit (and other tools) to make their apps ready for the headset. This is covered in the WWDC session Evolve your ARKit app for spatial experiences.



Cloudhead Games CEO: Apple Vision Pro is an AR Headset Wearing VR Clothes

Cloudhead Games is one of the most successful and senior VR studios in the industry. In this Guest Article, studio head Denny Unger shares his thoughts on Apple’s entrance into the space.

Guest Article by Denny Unger

Denny Unger is CEO and CCO at Cloudhead Games. Based in British Columbia and founded in 2012 Cloudhead’s pioneering approach to VR gave rise to broadly adopted movement standards including Snap Turns and Teleportation. Working closely with Valve, Sony, and Meta, Cloudhead is best known for their title Pistol Whip and has shipped four popular VR titles (Pistol Whip, Valve’s Aperture Hand Labs, Call of the Starseed, and Heart of the Emberstone).

So let’s get the obvious over first: Apple Vision Pro is Apple’s first-generation attempt at AR glasses using a Mixed Reality VR headset. AVP is a development platform also serving an enthusiast demographic. Make no mistake, this no-compromise MR device appears to get many things right for AR, at a premium cost. Will Cloudhead Games be buying one to better understand Apple’s approach? Heck yes. AVP will give developers a powerful foundation and ecosystem with which to develop AR apps for a future ‘glasses form-factor’ device in that mythical 5–10 year window. And to the victor, the spoils of a smartphone-replacing device.

No doubt (and if rumors are true) there were many debates at Apple HQ about VR. Whether or not to open the device up to VR studios and successful titles. Whether or not to include controllers to support legacy VR titles. Whether to allow users to full-dive into Virtual Reality, freely move around, and be active in the medium. But in an effort to sharpen their messaging, and to command a dominating lead within the AR space, VR and its many benefits were expertly omitted on nearly every level. Do I understand the strategy to strike a different chord as an XR business owner? Absolutely. Does it frustrate me as a VR-centric studio owner? You bet it does.

Image courtesy Apple

I question why the AVP didn’t maximize its potential, leveraging almost a decade of know-how from the VR community working within this space. Why not set a vision for a future device that would accommodate both AR and VR as complementary mediums? Apple could have embraced a dual launch strategy with a rich and proven catalog of best-selling VR games, perfectly tuned to onboard a completely new audience to XR. Apple could have expanded into VR’s recent success, growth, and competition within the current market. In their recent presentation, VR is essentially reduced to a gimmick, the thing you lightly touch the edges of, instead of a complementary and equally important medium. Unity engine support is promised, but with no plans for motion control support, Apple has cut out any possibility of porting most of the existing or future VR catalog to its platform.

Hand-tracking is a logical affordance for AR-based spatial computing, and no doubt some experiences will work well with that design philosophy. However, it is important to point out that most VR games built over the last 10 years (and many more in production) are not compatible with, nor will they ever be “portable” to, hand-tracking-only design. Inputs and haptics are incredibly important to Virtual Reality as a major tenet in reinforcing immersion and tactile interaction with virtual objects: buttons pushed, triggers pulled, vibrational feedback experienced, objects held, thrown, or touched, alternative movement schemes supported. There is a comfort in understanding the topological landscape of a controller and a physical touchpoint within the virtual environments themselves. When introducing users to a radically different medium like VR, convention and feedback matter. And over the last 50 years in gaming, input has evolved to encourage a suite of highly refined game design standards, creating a particular kind of muscle memory in the gaming population. Say what you will about which genres remain popular in this $450 billion industry, but it does strain belief to think we’ll all be playing with finger guns in the latest and greatest shooter.

I know what some are likely to say: “there will be new innovative standards and we’ll look back on controllers as a crutch.” But I would push back and say, hand-tracked or not, moving away from future haptic devices and innovation is a backwards step in XR design. Even smartphone games utilize basic haptics, because touch is foundational to the human experience.

In the aftermath of the AVP launch some would argue that VR is not yet mainstream and that Apple did the right thing by ignoring it. I would argue that VR turned a significant mainstream corner when Quest 2 outsold Xbox, when Sony reentered the market with PSVR 2, and when Google teamed up with Samsung to work on what’s next, and on it goes. Over its 10-year rebirth, the last three years of VR have seen hockey-stick levels of growth. OEMs have increased investments, and significant indicators keep coming, with more titles earning revenues north of $20 million. Fully immersive VR is a legitimized medium not because I say it is but because people like it, and are willing to part with their hard-earned money to experience it.

Image courtesy Apple

I hope Apple is more inclusive of VR over time, but the Apple Vision Pro appears to be a VR headset pretending not to be a VR headset. Because of this strategy it represents a unique opportunity for Apple’s competitors to double down on supporting Virtual Reality at a more affordable entry point. Sure, they can all wage the 5–10 year war for a smartphone replacement, but why in the world would one ignore an equally compelling revenue stream within a blended MR ecosystem? Maybe because it took too long to go mainstream? Sorry all, we had to learn a few things along the way, but I’m happy to say that after 10 years, the trail ahead has never been this clear.



Zuckerberg Gives His First Reaction to Apple’s Vision Pro

Meta founder and CEO Mark Zuckerberg hasn’t been shy about addressing the elephant in the room: with Apple Vision Pro, the Cupertino tech giant is officially entering a market that, up until now, Meta has basically owned. In a meeting with Meta employees, Zuckerberg said that while Apple Vision Pro “could be the vision of the future of computing […] it’s not the one that I want.”

As reported by The Verge, Zuckerberg seems very confident in the company’s XR offerings, and is less impressed with Apple’s design tradeoffs. During a companywide meeting, Zuckerberg said that with Vision Pro, Apple has “no kind of magical solutions” and that they haven’t bypassed “any of the constraints on laws of physics that our teams haven’t already explored and thought of.” He calls that “the good news.”

Largely, Zuckerberg says Apple is making some telling design tradeoffs, as its higher resolution displays, advanced software, and external battery come alongside a $3,500 price tag—or seven times more than Meta’s upcoming Quest 3 mixed reality standalone.

Photo by Road to VR

But it’s also about ethos. Zuckerberg says the companies’ respective headsets represent a divide in company philosophy, as Apple products are typically developed to appeal to high income consumers. “We innovate to make sure that our products are as accessible and affordable to everyone as possible, and that is a core part of what we do. And we have sold tens of millions of Quests,” he said.

“More importantly, our vision for the metaverse and presence is fundamentally social. It’s about people interacting in new ways and feeling closer in new ways,” Zuckerberg continued. “Our device is also about being active and doing things. By contrast, every demo that they showed was a person sitting on a couch by themself. I mean, that could be the vision of the future of computing, but like, it’s not the one that I want.”

The Meta chief echoed some of these statements on the Lex Fridman podcast, where he spoke about his opinions on Apple Vision Pro, noting that Apple’s mixed reality headset offers a “certain level of validation for the category.” Because Vision Pro will cost so much though, Zuckerberg maintains Quest 3 will benefit overall as people inevitably gravitate toward the cheaper, more consumer-friendly option.

Here’s Zuckerberg’s full statement, sourced from the companywide address:

Apple finally announced their headset, so I want to talk about that for a second. I was really curious to see what they were gonna ship. And obviously I haven’t seen it yet, so I’ll learn more as we get to play with it and see what happens and how people use it.

From what I’ve seen initially, I’d say the good news is that there’s no kind of magical solutions that they have to any of the constraints on laws of physics that our teams haven’t already explored and thought of. They went with a higher resolution display, and between that and all the technology they put in there to power it, it costs seven times more and now requires so much energy that now you need a battery and a wire attached to it to use it. They made that design trade-off and it might make sense for the cases that they’re going for.

But look, I think that their announcement really showcases the difference in the values and the vision that our companies bring to this in a way that I think is really important. We innovate to make sure that our products are as accessible and affordable to everyone as possible, and that is a core part of what we do. And we have sold tens of millions of Quests.

More importantly, our vision for the metaverse and presence is fundamentally social. It’s about people interacting in new ways and feeling closer in new ways. Our device is also about being active and doing things. By contrast, every demo that they showed was a person sitting on a couch by themself. I mean, that could be the vision of the future of computing, but like, it’s not the one that I want. There’s a real philosophical difference in terms of how we’re approaching this. And seeing what they put out there and how they’re going to compete just made me even more excited and in a lot of ways optimistic that what we’re doing matters and is going to succeed. But it’s going to be a fun journey.



Apple Vision Pro Debrief on the Voices of VR Podcast

Apple’s announcement of Vision Pro is reverberating throughout the industry. Beyond just a new headset, the company’s entrance into the space introduces new ideas that are now being discussed around the tech-sphere. To dig further into what Apple Vision Pro means for the XR industry more broadly, I spoke with host Kent Bye on the Voices of VR podcast.

Kent Bye has been consistently documenting the XR space since 2014 through his prolific podcast, Voices of VR, which now spans more than 1,200 episodes.

Over the years I’ve had the fortune of joining Bye on the podcast during pivotal moments in the XR industry. With the long-awaited release of Apple Vision Pro, it was once again time for a check-in; you can listen here to episode #1,217 of the Voices of VR podcast.

Beyond my previously published hands-on impressions with the headset, our discussion on the podcast covers some of the broader implications of Apple Vision Pro, including how the company’s ecosystem plays a major role in the value of the headset, whether or not the headset’s ergonomics are aligned with its use-case vision, and the ways in which Apple’s entrance into the space feels like a reboot of the industry at large.

Bye also interviewed several others for their takes and impressions of Apple Vision Pro. You can check out episode #1,216 to hear from Sarah Hill, CEO of Healium, and Raven Zachary, COO of ARound; episode #1,218 with Ian Hamilton, Editor at UploadVR; and episode #1,219 with Scott Stein, Editor at CNET.

Voices of VR is a listener-supported podcast; if you like what you hear, you can support Bye’s work on Patreon.



Apple Vision Pro to Support One of VR’s Most Prominent Social Apps

Apple unveiled Vision Pro on Monday, its long-awaited standalone headset capable of both virtual and augmented reality. While the Cupertino tech giant seems to be emphasizing Vision Pro’s AR capabilities thanks to its color passthrough cameras, it’s also going to pack one of VR’s most prominent social apps, Rec Room.

Apple’s launch of Vision Pro is still a good bit away—it’s coming first to the US in early 2024 at the hefty price of $3,500. Still, what apps the Fruit Company will allow on the undoubtedly very curated Vision App Store will be telling.

As first noted by UploadVR, among them will be the hit social VR game Rec Room, which so far shares cross-compatibility with SteamVR, Meta Quest, Meta PC VR, PSVR, PlayStation 4/5, Xbox, iOS, Android, and standard monitors via Steam.

Rec Room was the only native VR app shown during the part of the keynote discussing third-party apps, which are coming to the headset via Apple’s integration of the Unity game engine.

Notably, Vision Pro doesn’t offer any sort of motion controller, instead relying on hand and eye-tracking, and voice input. In the past, Rec Room has primarily targeted motion controllers for VR input, however the app is also set to bring both full-body avatars and new hand models to the platform, which will seemingly do away with the game’s wristless mitten-hands.



Meta Quest 3 and Apple Vision Pro: A Tale of Two Headsets

Both Apple and Meta revealed highly anticipated headsets in the last seven days. The hype is just about the only thing they have in common. The two headsets have different audiences, different affordances and requirements, and drastically different price points. As such, they illuminate two drastically different approaches to the XR space.

The Meta Quest 3

The Meta Quest 3 has been the subject of rumors, speculation, leaks, and outright fabrication probably since before the Quest 2 came out. But, it was officially announced last week and is set to launch this fall.

The headset has been relatively easy to speculate on, seeing as it is the fourth iteration of flagship headsets in a familiar and built-out hardware and software ecosystem. As expected, the Quest 3 appears to be largely a blend of the successful elements of the Quest 2 and some of the more advanced features that we saw from the Quest Pro.

From fairly early on, the Quest Pro was largely seen as a dev kit and a testing ground for future products. Pro hallmarks coming to the Quest 3 include color passthrough and the Pro controllers. The Quest 3 is significantly less expensive than the Quest Pro, if slightly more expensive than the Quest 2, coming in at $499.

Following the announcement, most of what we have to go on regarding what kind of punch the Quest 3 packs still comes from speculation. Where previous Quest headsets were announced at Connect – Meta’s biggest annual event – the Quest 3 was announced through a relatively low-key social media campaign on the morning of the Meta Gaming Showcase.

Meta Quest 3 headset

As anticipated as the Quest 3 has been, it might not be the most hyped headset of all time – or even the most hyped headset to come out this week. Meta’s seemingly hasty announcement of the Quest 3 seems to have been timed to come out before Apple’s announcement of the Vision Pro.

The Apple Vision Pro

Apple’s entry into XR has long been awaited – if only because the company has repeatedly put off announcing a product. So, after all of the wait, what can the headset actually do?

The headset’s 12 cameras, five sensors, and six mics enable controller-free tracking, voice interactions, and impressive audio. Connected to a power source, it can run all day, but a battery pack will let it run on its own for two hours. Apple reported over 10 million pixels per eye, but did not release familiar resolution metrics or field-of-view.

The headset is capable of handling 3D assets sent via messages, and it constructs a 3D avatar that can be used in video calls. Developers with early access include JigSpace, and the headset’s software is compatible with Unity.

All of that said, currently available material doesn’t show a lot of actual XR content. The headset arguably looks more like what industry insiders call an “AR viewer” – that is, rather than providing spatial content, it provides a spatial interface for 2D content. That also made up the bulk of Apple’s demonstrations of the headset – and their justification for its massive price tag.

“If you purchased a new state-of-the-art TV, surround-sound system, powerful computer with multiple high-definition displays, a high-end camera, and more, you still would not have come close to what Vision Pro delivers,” Apple’s Technology Development Group VP Mike Rockwell said in the WWDC event that introduced the device. “Apple Vision Pro starts at $3,499.”

Apple Vision Pro headset

That raises an interesting question for the XR community. An Apple Vision Pro might be able to replace all of those things. But, could it replace a Meta Quest?

New Headsets and the XR Ecosystem

Both the Quest 3 and the Apple Vision Pro had XR experts excited. Many aren’t necessarily viewing the two headsets as in competition, but are rather thinking about how the headsets’ introduction will affect the general XR ecosystem.

“The Quest 3 has finally been announced, which is great for everyone in the industry,” 3lbXR and 3lb Games CEO Robin Moulder said in a talk at this year’s Augmented World Expo. “Next week is going to be a whole new level when Apple announces something – hopefully.” 

On the morning of the third day of AWE, the Quest 3 had been announced just an hour or so earlier and the Apple announcement had yet to be aired. However, a number of speakers presented the same idea – this is an exciting time for XR and Apple is contributing to that momentum rather than creating that momentum on its own.

A Bushel of Apples in XR…

One of the XR community’s biggest hopes regarding Apple’s entrance into the market is that XR will be catapulted to the mainstream by Apple’s gravitas. One of the main concerns is that Apple tends to play by their own rules, so a wave of new Apple XR users might be disruptive (or at least, not additive) to XR’s established ecosystem. Not all share this concern.

“If Apple does announce something, they’ll do a lot of education,”  Khronos Group President Neil Trevett said during a panel at AWE. “It will progress how people use the tech whether they use open standards or not.”

Now that we’ve heard from Apple, it’s worth wondering how much education will be involved. The Vision Pro seems largely intent on offering a spatial view of Apple’s apps rather than on incorporating existing XR solutions and uses.

“Apple is not seeking to build a wide XR ecosystem with a cheaper device, like Meta with the Quest line-up,” XR analyst Tom Ffiske wrote on the headset. “The company instead seeks to monetize an already-lucrative subsection of its audience with high-margin subscriptions and software, accessible at a $3,499 price tag.”

Make no mistake: at least for now, Apple Vision Pro is another Apple device within the Apple hardware ecosystem. It is not a new XR device within anything like a metaverse ecosystem.

Apple Vision Pro

“Apple Reality Pro’s biggest advantage is integration into the Apple ecosystem,” VRDirect founder and CEO Rolf Illenberger said in an email shared with ARPost. “The all-important ability to go from the iPhone, to iPad, to Apple Watch with a similar user experience and low barrier to entry… That’s where we believe the potential exists to create believers in AR.”

Mesmerise Head of Platform Michael Hoffman is of the same opinion: “Compatibility with other iOS devices will also help the headset integrate more seamlessly into daily life.”

…Or a Rush on Pears?

The next question is whether Apple can bring people into XR even if they never buy a Vision Pro. Meta (ignoring its various evils) did a lot to introduce people to the XR universe through incredibly low price points. This is not a game that Apple is playing at all – there probably won’t be too many Vision Pro headsets sitting under Christmas trees this year.

That doesn’t mean that Vision Pro doesn’t have a future. After all, this is the first iteration and it hasn’t even hit the market yet.

“The wire may prevent some users from purchasing the product, and the price tag will scare away many people,” Mytaverse CTO and co-founder Jamie Lopez said in an email shared with ARPost. “But Apple has a long history of lowering prices and making new hardware easier to use. Time will tell how Vision Pro changes the world.”

Still, the announcement may have brought people’s attention to XR in the shorter term. If the sticker shock scares them away from the Vision Pro but doesn’t scare them away from XR, they may find that a lot of what the Vision Pro announcement demoed is already possible on far more affordable headsets.

#WWDC23 pic.twitter.com/FVB2iu5zy0

— XREAL 👓 (@XREAL_Global) June 6, 2023

XREAL (formerly Nreal) made this connection on Twitter, and (potentially unpopular opinion) I didn’t see Apple showing the Vision Pro doing a whole lot of things that the XREAL Air can’t do – and the XREAL Air is literally a tenth of the price.

To be clear, Apple claims extraordinarily high resolution for the Vision Pro. This allows users to do things like read emails and webpage text in-headset. This is still a bit of a tall order for products like XREAL.

Advancing Next-Generation Inputs

“By the time we have the Apple headset and the new Quest 3, everybody is going to be freaking out about how great hand tracking is and moving into this new world of possibilities,” said Moulder.

Hand-tracking does play a huge part in the Vision Pro announcement, as the Vision Pro doesn’t have controllers. While it can connect to other devices like Bluetooth keyboards, the headset uses a combination of gaze tracking, hand tracking, and voice commands – which was a first sign that the headset might not be particularly robust for applications like gaming.

Meta, on the other hand, has been experimenting with inputs like voice and gesture controls since the Quest 2, but is by no means ditching the controllers. As cool as gesture recognition is, more robust applications like games and enterprise applications require more nuanced controls – and for now, that still means buttons.

For what it’s worth, Moulder advocated for an input system that uses one controller for things like conjuring menus and one free hand for fine interactions. I would like to see a system like that work with applications like Nanome, which offers intuitive interactions that would be great with hand tracking but is complex enough that controllers are still the only way to go.

“Current hand tracking technology does not meet the needs that 6DoF controllers can provide, which consumer AR glasses don’t,” Nanome co-founder and CEO Steve McCloskey told ARPost in a recent in-platform interview.

Time Will Tell

Will flocks of people buy the Apple Vision Pro? Will those that don’t, pick up headsets like the Quest 3 or even consumer AR glasses? Do Quest 3 and Apple Vision Pro exist in the same market, or was Zuckerberg wrong to time his announcement in a way that forced the two headsets into an artificial narrative? All I know is, it’s going to be an interesting couple of years.



Apple to Open Locations for Devs to Test Vision Pro This Summer, SDK This Month

Ahead of the Apple Vision Pro’s release in ‘early 2024’, the company says it will open several centers in a handful of locations around the world, giving some developers a chance to test the headset before it’s released to the public.

It’s clear that developers will need time to start building Apple Vision Pro apps ahead of its launch, and it’s also clear that Apple doesn’t have heaps of headsets on hand for developers to start working with right away. In an effort to give developers the earliest possible chance to test their immersive apps, the company says it plans to open ‘Apple Vision Pro Developer Labs’ in a handful of locations around the world.

Starting this summer, the Apple Vision Pro Developer Labs will open in London, Munich, Shanghai, Singapore, Tokyo, and Cupertino.

Apple also says developers will be able to submit a request to have their apps tested on Vision Pro, with testing and feedback being done remotely by Apple.

Image courtesy Apple

Of course, developers still need new tools to build for the headset in the first place. Apple says devs can expect a visionOS SDK and updated versions of Reality Composer and Xcode by the end of June to support development on the headset. That will be accompanied by new Human Interface Guidelines to help developers follow best practices for spatial apps on Vision Pro.

Additionally, Apple says it will make available a Vision Pro Simulator, an emulator that allows developers to see how their apps would look through the headset.

Developers can find more info when it’s ready at Apple’s developer website. Closer to launch, Apple says Vision Pro will be available for the public to test in stores.
