Meta has introduced the Segment Anything Model, which aims to set a new bar for computer-vision-based ‘object segmentation’—the ability for computers to understand the difference between individual objects in an image or video. Segmentation will be key for making AR genuinely useful by enabling a comprehensive understanding of the world around the user.
Object segmentation is the process of identifying and separating objects in an image or video. With the help of AI, this process can be automated, making it possible to identify and isolate objects in real-time. This technology will be critical for creating a more useful AR experience by giving the system an awareness of various objects in the world around the user.
The Challenge
Imagine, for instance, that you’re wearing a pair of AR glasses and you’d like to have two floating virtual monitors on the left and right of your real monitor. Unless you’re going to manually tell the system where your real monitor is, it must be able to understand what a monitor looks like so that when it sees your monitor it can place the virtual monitors accordingly.
But monitors come in all shapes, sizes, and colors. Sometimes reflections or partial occlusion make it even harder for a computer-vision system to recognize them.
Having a fast and reliable segmentation system that can identify each object in the room around you (like your monitor) will be key to unlocking tons of AR use-cases so the tech can be genuinely useful.
Computer-vision-based object segmentation has been an ongoing area of research for many years, but one of the key issues is that in order to help computers understand what they're looking at, you need to train an AI model by giving it lots of images to learn from.
Such models can be quite effective at identifying the objects they were trained on, but they will struggle with objects they haven't seen before. That means one of the biggest challenges for object segmentation is simply having a large enough set of images for the systems to learn from, and collecting those images and annotating them in a way that makes them useful for training is no small task.
SAM I Am
Meta recently published work on a new project called the Segment Anything Model (SAM). It’s both a segmentation model and a massive set of training images the company is releasing for others to build upon.
The project aims to reduce the need for task-specific modeling expertise. SAM is a general segmentation model that can identify any object in any image or video, even for objects and image types that it didn’t see during training.
SAM allows for both automatic and interactive segmentation, letting it identify individual objects in a scene with simple inputs from the user. SAM can be 'prompted' with clicks, boxes, and other prompts, giving users control over what the system is attempting to identify at any given moment.
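To make the point-prompt idea concrete, here's a toy sketch of the concept. This is not SAM's actual model or API, just a self-contained illustration of what a click prompt resolves to: the click picks a pixel, and the system returns the contiguous region belonging to the same object.

```python
def point_prompt_segment(labels, click):
    """Toy 'point prompt': flood-fill the connected region of pixels
    sharing the clicked pixel's label, returning a set of (row, col).
    In SAM, a learned model predicts this mask for real images; here
    we pretend a per-pixel label map already exists."""
    rows, cols = len(labels), len(labels[0])
    r0, c0 = click
    target = labels[r0][c0]
    mask, stack = set(), [(r0, c0)]
    while stack:
        r, c = stack.pop()
        if 0 <= r < rows and 0 <= c < cols and (r, c) not in mask \
                and labels[r][c] == target:
            mask.add((r, c))
            stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return mask

# A pretend per-pixel label map: 1 = "monitor", 0 = background.
scene = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
mask = point_prompt_segment(scene, (1, 1))  # the user clicks the monitor
print(len(mask))  # 4 pixels selected
```

A real promptable model does far more than a flood fill, of course, but the interaction loop is the same: a lightweight user signal in, an object mask out.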
It’s easy to see how this point-based prompting could work great if coupled with eye-tracking on an AR headset. In fact that’s exactly one of the use-cases that Meta has demonstrated with the system:
Part of SAM’s impressive abilities come from its training data, which contains a massive 11 million images and over 1 billion identified object shapes (masks). It’s far more comprehensive than contemporary datasets, according to Meta, giving SAM much more experience in the learning process and enabling it to segment a broad range of objects.
Image courtesy Meta
Meta calls the SAM dataset SA-1B, and the company is releasing the entire set for other researchers to build upon.
Meta hopes this work on promptable segmentation, and the release of this massive training dataset, will accelerate research into image and video understanding. The company expects the SAM model can be used as a component in larger systems, enabling versatile applications in areas like AR, content creation, scientific domains, and general AI systems.
Varjo hasn’t been hibernating over the winter but they’ve definitely had a very active spring. ARPost typically reports on the Finnish XR company’s groundbreaking hardware and software developments, but the company also helps develop and distribute XR experiences and solutions ranging from operas to flight simulations.
An Opera Produced Entirely Through XR
The Finnish National Opera and Ballet (FNOB) spent two years producing the opera Turandot with Sweden’s Malmö Opera. The ambitious international production involved designing complex sets and orchestrating intricate scene transitions. FNOB has commented before that Varjo is the only headset manufacturer that matches their high-quality requirements.
Varjo partners with FNOB and Sweden’s Malmö Opera – Digital Twin vs real-world view comparison
FNOB, which has been gradually implementing XR production tools over the last three years, started with two things: the in-house Unreal Engine-driven “XR Stage” visualization tool, and a 3D model of the main stage created by ZOAN – a 3D content agency that uses the same hardware to bring employees into their virtual headquarters.
“Our artists were only open to using virtual tools if models would be photorealistic and it was an intuitive user experience,” FNOB Production and Technical Director Timo Tuovila said in a release shared with ARPost. “We have been able to create a digital twin of our stage that actually is true to life, matching the expectations of our ambitious artistic and technical teams.”
The virtual stage was used collaboratively between the two opera houses to virtually design and redesign sets – a process that would normally have taken tremendous time, resources, and materials. It is estimated that using XR preproduction instead saved over $82,000 and 1,500 hours of production time, not to mention enhanced crew safety.
Varjo partners with FNOB and Sweden’s Malmö Opera for Turandot
This is the eighth FNOB production using XR preproduction, but this is the first time that they – or anybody – have used the technology at every step from proof-of-concept to final production, according to the release. It would be interesting to see XR also being used in recording and distributing live content at this level.
Twinmotion Programs Come to Varjo Devices
The virtual production of Turandot is a highly artistic example of using Varjo for architecture and design. That use case is about to get a big leg up as the company recently announced support for its hardware on Twinmotion. Users of the real-time visualization platform’s most recent update will be able to view its high-fidelity models on Varjo devices.
According to the hardware manufacturer, Twinmotion works with all of its headsets, including the Aero – the closest thing the company offers to an entry-level headset. The Aero, priced below the company’s other offerings but still within the almost exclusively enterprise range, comes without mandatory software subscriptions, making compatibility with other solutions crucial.
Advancing Brain Health With machineMD
Aero also plays an important part in a partnership with Swiss medical device company machineMD. The partnership’s goal is the development of “neos™” – a proposed device that would use eye-tracking technology for earlier diagnosis of brain disorders. In addition to helping specialists, the device could also make such diagnostics more accessible to doctors who aren’t neuro specialists.
“As a neuro-ophthalmologist, I use the eye as a window to the brain,” said machineMD Medical Director Professor Mathias Abegg in a blog post announcing the partnership. “The Varjo Aero provides me with the most powerful and precise view through this window.”
machineMD’s solution will be based on Varjo Aero to perform comprehensive eye exams for the diagnosis of brain disorders
Between the advancement of the technology and healthy financial support, machineMD expects neos to be ready by the end of this year. Of course, Varjo is also excited to be a part of the important work which could have far-reaching benefits in the brain health world.
“VR-based eye tracking in combination with ophthalmology and neuroscience is opening up important new avenues for researchers and the larger medical community,” co-founder and CTO of Varjo, Seppo Aaltonen, said in the post. “A rare window into the brain is possible with the Varjo Aero headset and we are proud to partner with machineMD to make this technology a reality.”
Hardware Built for Flight Simulation
Varjo is also a leading player in the simulation world, particularly automotive and flight simulation. In fact, one of its premier headsets, the XR-3, was recently released in a specially-tooled “Focal Edition” with cockpit simulations in mind. The variable-passthrough headset has a specially calibrated focal plane to optimize the display of the user’s immediate surroundings.
More recently, Varjo partnered with Leonardo – an aerospace, defense, and security developer. The relationship is intended to “enhance the capability of Leonardo’s aircraft training devices.” The above-mentioned XR-3 Focal Edition headset is already being used.
Varjo and Leonardo partner in developing and deploying immersive solutions for flight training
“Leonardo simulation and training experience, coupled with advanced Varjo technology, will allow our products to increase pilots’ training experience, bringing it into a more immersive environment, both within the specific customer training pipelines and within the scope of International Flight Training School,” said Leonardo Aircraft Division’s Head of Simulation and Training Systems Giuseppe Pietroniro.
Jumping Off of Virtual Cliffs
Simulation has a special place in the consumer space as well, where it allows individuals to experience convincing replications of activities that are costly, dangerous, or both. A recent experience offered by Varjo and Red Bull recreates diving from an 80 ft cliff.
“The VR cliff diving experience is really something that you cannot miss,” real-life cliff diver Orlando Duque said of the activation. “It places you right there in the location, in the middle of the action. It’s probably the closest thing to the real deal.”
Varjo and Red Bull partnership – Mixed reality watersports experience “Water – Breaking the Surface”
The experience isn’t coming to a living room near you anytime soon. In addition to using the XR-3 rather than the more consumer-available Aero, the experience is currently only being offered as part of an exhibit at the Swiss Museum of Transport. The complete exhibit also uses AR technology and virtual production to replicate activities like surfing.
While available materials don’t mention plans for a more widely available version, it sure sounds like it would be a step up from Richie’s Plank Experience.
A Virtually Imagined Real World
Varjo is still pushing the limits of extended reality technology – and that means that a lot of its experiences and hardware aren’t available to just anybody. But while only some people get to put on the headset and jump off of a cliff, the technology is benefiting more and more people through the ramifications of its work in design, defense, and the arts.
There are a million things you could do to enhance your life if you had the power to seamlessly augment the world around you with digital and interactive information. And while we’re far from the “seamless” part of that reality, we’re getting early glimpses of how the world could be better with this power.
Take, for example, this project which uses augmented reality to turn a regular electronic drum set into a fully functional rhythm ‘game’ that provides real practice for drum players.
This isn’t a concept—it’s a live demo running on a Quest Pro headset, powered by the VR drumming app Paradiddle. While the app already allows players to play a fully virtual drum set of their dreams, or align virtual drums with their real ones, in the future this AR mode will be added to the app to give drummers a view of their real drums while retaining all the benefits of the digital overlay.
And what benefits are those? Well, practicing existing techniques is obvious, but imagine learning entirely new songs in this interactive way, complete with gamified metrics for how well you’re doing and how quickly you’re improving. And how about turning down a song’s speed until you get the fundamentals, then slowly cranking it up until you’re hitting every note?
While we might think of drum lessons as a fairly niche use-case for AR, it’s easy to imagine how similar systems could apply to almost any instrument. And what do you know—there’s already a similar project for piano players! Somebody give me one for the sax because I’ve been saying I’d pick it up for years!
There’s still a lot of work to be done to make experiences like these easy enough that anyone can use them, but there’s a real possibility that the future of ‘rhythm games’ could actually teach players how to play real instruments at a high level.
VR Skater, the skateboarding sim for PC VR headsets, is making its way to PSVR 2 this summer.
First launched on Steam Early Access in 2021, VR Skater offers up a unique way of sidestepping the fact that most people don’t have tracked feet (or skateboard peripherals) in VR yet.
In it, you move your motion controllers in the same way you might move your feet, a locomotion concept that avoids the need for a dedicated skateboard peripheral, like the old board controller for Tony Hawk: Ride (2009).
Customizing at the shop | Image courtesy Deficit Games, Perp Games
In VR Skater, you push your board forward by moving your hand in a skiing motion, and use precise controller motions to execute a wide variety of flip tricks, grabs, grinds, slides and manuals.
Developer Deficit Games and publisher Perp Games say the urban skating sim offers up seven environments as well as the Mega Ramp, which will test your mettle by launching you across a giant chasm.
In addition to an online leaderboard, it also lets you earn XP, medals and trophies, and includes a VR Skater shop, where you can exchange XP for grip tape, trucks, wheels and board artwork.
The studios haven’t mentioned a precise launch date beyond “summer 2023,” although in the meantime you can wishlist VR Skater on PSVR 2 here.
Joy Way, the studio behind STRIDE and AGAINST, revealed a new VR game which seems to combine the web-slinging action of Spider-Man with the demon-slaying melee carnage of DOOM.
Called Dead Hook, the studio calls the upcoming Quest title an “explosive mix of roguelike and shooter genre with brutal combat and captivating storytelling.”
Prior to Dead Hook, Joy Way released a game called Outlier on Steam Early Access for PC VR, which was then cancelled shortly thereafter. At the time, the studio cited “overestimated demand” as a reason for pulling the plug on the alien-centric roguelike. Joy Way says it has since reworked the mechanics, storyline, and overall gameplay of Outlier to create Dead Hook.
In Dead Hook, you take on the role of Adam Stone, a mercenary, smuggler, and thief. You explore the elder planet Resaract, collect legendary weapons, and customize your character with what the studio says are “100 buffs and permanent upgrades to make each run unique.”
Enemies include regular and elite elders, which try to stop you in the air and on the ground as they defend their tombs, the studio says. Joy Way also says there are bosses, which have “multiple phases, making each encounter feel tense and exciting.”
The roguelike shooter is also set to have a story. In it, you’ll “uncover the secrets of Resaract and AI duality, facing tough choices and unexpected twists along the way,” the studio says.
Halfbrick Studios announced that Fruit Ninja VR 2 is finally set to leave early access on SteamVR headsets, as the full version will be available starting today on Quest and PC VR headsets.
Released in Early Access on PC VR headsets in late 2021, the game (now at version 1.9.2) offers up multiple ways to get into the fruit-slicing groove: you can attack classic, zen and arcade modes, along with a rhythm mode, letting you slice away to the beat.
In addition to a social hub, a multiplayer mode is also available which lets you interact with the environment and challenge other ninjas around the globe. This thankfully includes cross-play.
And while lesser fruit ninjas may have only studied the blade, you’ll also be able to master the bow too across multiple modes. Check out the trailer below to see some of the action:
Fruit Ninja VR 2 launches today on the Quest Store for Quest, Quest 2 and Quest Pro. At the time of writing it isn’t live quite yet, although we expect it in the next few hours.
This comes along with a price increase from $20 to $25 for the PC version, which Halfbrick says in a Steam update will bring it in line with Quest Store pricing.
A version is also said to launch on the Pico Store, however we haven’t seen a listing for that yet, so stay tuned.
In the meantime, you can see all of the 1.9.2 patch notes here, which details a number of bug fixes and performance optimizations the studio has made in preparation for the Quest launch today.
Mojo Vision, a company once noted for its work on smart contact lenses, has raised $22.4 million in a new Series A investment round which it will use in a pivot to develop and commercialize micro-LED display technology for consumer, enterprise, and government applications.
The funding round is led by existing investors NEA and Khosla Ventures, with participation from other investors including Dolby Family Ventures, Liberty Global Ventures, Fusion Fund, Drew Perkins, Open Field Capital, and Edge.
Prior to the pivot, the company had amassed $205 million in outside investment, with its most recent round, in January 2022, bringing the company $45 million.
Its new focus is on AR/VR, automotive, light field, large-format, and other applications that require high-performance micro-LED displays. Mojo’s prototype smart contacts made use of its own in-house displays, which at the time included a monochrome display capable of over 14,000 pixels per inch (ppi).
Now the company is developing its own High Performance Quantum Dot (HPQD) technology to make a “very small, very bright, very efficient RGB pixel,” the company says in a press statement.
The company is boasting a number of advances in its proprietary technology, including dynamic displays with up to 28,000ppi, efficient blue micro-LED devices at sub-μm scale, high efficiency quantum dot ink for red and green, high brightness at 1M+ nits, and a display system that incorporates an optimized CMOS backplane, wafer-to-wafer bonding, and custom micro-lens optics.
Mojo Vision’s new CEO, Dr. Nikhil Balram, is said to bring semiconductor and display technology expertise to the company:
“The market opportunity in the display industry is big – over $100 billion. Sometimes in order to do something very big, you have to start very small. That is exactly what we are doing at Mojo,” said Balram. “We started by developing the world’s smallest, densest dynamic micro-LED display, and now we are applying that innovation to power the next generation of displays. Mojo is combining breakthrough technology, leading display and semiconductor expertise, and an advanced manufacturing process to commercialize micro-LEDs for the most demanding hardware applications.”
“This round of funding will enable us to deliver our breakthrough monolithic micro-LED technology to customers and help bring high-performance micro-LEDs to market,” concluded Balram.
Following last month’s release of the Half-Life 2: Episode 1 VR mod, the Source VR Mod team is set to release Episode 2 on April 6th. A launch trailer shows how the episode has been fully adapted to be played in VR.
After years of starts and stops on various attempts to turn Half-Life 2 into a fully playable VR game, the Source VR Mod Team released the Half-Life 2 VR Mod to major acclaim last year. Since then the team has followed up with an equally well received VR mod for Episode 1.
Less than a month later, the team is set to release the Half-Life 2: Episode 2 VR Mod on April 6th, including a full set of VR features like hands-on weapons, comfort options, real ladder climbing, and—of course—a crowbar you can actually swing.
The Source VR Mod Team released a launch trailer for the game ahead of release, showing the classic Episode 2 action that players can now relive in VR.
The Half-Life 2: Episode 2 VR Mod is free, but requires that players own the original Episode 2 game in order to play. The mod supports all SteamVR headsets like Index, Vive, and Quest via Oculus Link.
Image courtesy Source VR Mod Team
All three of these Half-Life 2 VR mods are built by volunteers who make up the Source VR Mod Team. If you want to support their current and future work, they accept contributions on their Ko-fi page.
The long-awaited sequel to Vertigo is here, bringing with it another dose of its distinctly Half-Life-esque flair and patently strange yet captivating universe. Does Vertigo 2 outdo the original? No need to leave you in suspense since you already read the headline. Quick answer: Yes. For the long answer, read on.
Vertigo 2 Details:
Available On: SteamVR
Release Date: March 31st, 2023
Price: $30
Developer: Zach Tsiakalis-Brown
Publisher: Zulubo Productions
Reviewed On: Quest 2 via PC Link
Gameplay
Like the first in the series, you’re again tasked with linearly fighting your way home through a robot and alien-infested science facility, however the sequel puts a host of new worlds and lifeforms between you and your version of Earth. You don’t need to play the original Vertigo to get lost in the weird and expansive narrative of Vertigo 2, though I would suggest it—if only for natural access to the narrative and about four more hours of blasting.
Even if you played Vertigo Remastered in 2020 like me though, you may still have absolutely no idea what the hell is going on in the sequel. The franchise’s brand of absurdist sci-fi kitsch gets a new layer of narrative density this time around, one that may be too thick and convoluted for most. Whether you choose to engage with it or not really doesn’t change the fact that the underlying game is undoubtedly a triumph over the original, and many other such VR shooters to boot.
I wouldn’t hesitate to call it PC VR’s best game of 2023 so far, which is doubly impressive since it was basically made by a single person, Zach Tsiakalis-Brown. Seriously, for the magnitude of the experience, Vertigo 2’s credit screen is the shortest I’ve ever seen.
Courtesy Zulubo Productions
Granted, we don’t factor in a team’s size or budget—only the end product—but it bears mentioning just the same that this game, which is so solid and clever, was built by a very (very) small team without the sort of AAA budget we’ve seen squandered on experiences half this good.
While paying tribute to some of gaming’s greats, Vertigo 2 is a VR native through and through. Its 14 collectible weapons feature unique reload mechanics, all of which were designed with VR users in mind. The user-friendly emphasis on weapons means you won’t be faffing about with real-world actions like racking gun slides or manipulating charge handles, which are better suited to realistic combat sims with a much slower pace of gameplay. It’s not long before you find that a room of weirdos will magically zap into existence, hell-bent on setting you back to the last save point; realism simply isn’t a concern here.
Reloading typically requires you to eject a spent magazine (or pod of some sort) with a controller button press, grab a fresh magazine from your left hip holster, and insert it into the mag well. Usually, you’ll only have three such magazines immediately at your disposal, as ammo regenerates automatically but takes time; in the meantime, a little counter sits where a magazine should be.
This means that although you’ll find yourself sticking to a number of more effective weapons along your adventure, both large-scale fights and boss battles will have you relying on weaker guns like your starter pistol as you wait for your favorites to become operational. Additionally, auto-recharging ammo means you won’t need to constantly hoover up loot around the level, save the odd health syringe or bomb you’ll find stashed around periodically.
Developing the muscle memory to rapidly reload, shoot, and change to a new weapon takes time, which can definitely add a measure of unforeseen difficulty in a firefight. Still, the wheel-style gun inventory system is accessible enough to eventually let you build that skill and put it to good use as the mixture and number of baddies increases.
Thankfully, you can upgrade a number of guns in your arsenal, which, somewhat like Half-Life: Alyx, is only possible at one-off synthesizer points that you encounter on your one-way trip through the game’s 18 chapters.
It’s a tried-and-true method of forcing you to explore levels completely, because modding stations might be underwater, in a cave passageway that leads to nowhere, or hidden behind a bunch of filing cabinets. It’s not a terribly deep upgrade system, but it’s enough to keep those starter guns relevant as you progress through the arsenal of bigger and badder weapons.
Like the first in the series, Vertigo 2 is all about big and wild boss fights, of which there are 10 new encounters. I won’t spoil any of them for you, although they’re mostly what you’d expect, i.e. bespoke battles in medium-size arenas that require you to use the environment and your most powerful weapons to your advantage. Although pretty standard fare, bosses are both distinct and varied enough to keep your interest, with attack patterns that you’ll have to decrypt, likely after a death or two.
Courtesy Zulubo Productions
Vertigo 2, however, goes a step further by tossing in a very wide assortment of baddies that mix and match as you traverse the multiverse. What’s more, you’ll need to intimately acquaint yourself with all of their weak points as you head for the game’s end, as you’ll encounter a miasma of all of the multiverse’s baddies at once.
One thing Vertigo 2 lacks is a wide set of puzzles. The ones that are there are good, although I really wish there were more. Still, it’s more about shooting, bosses, enemy variety, and a weird story, and that’s fine by me.
The game’s lengthy and frankly astoundingly varied campaign took me around 10 hours to complete on the normal difficulty, although you could spend longer exploring every nook and cranny for weapon upgrades and easter eggs, or playing on a higher difficulty, where enemies are tougher to defeat.
Immersion
The game’s infectious cartoon style is back on display, this time offering up much more fine-tuned environments that are massive in size and variability. While humanoid character models are a little stiff (and maybe overly avatar-y), enemy models and animations are all really well done, which accounts for 99 percent of your encounters anyway.
Courtesy Zulubo Productions
Outside of its excellent, sweeping musical score, one of my favorite bits about Vertigo 2 is the constant change in player expectations.
Once you think you know what the deal is with Vertigo 2, you’ll find an alien trying to rent you a boat, a war between robots where you have to choose sides, or an interdimensional space opera that gets thicc. Level design slowly becomes equally unpredictable, as you’ll be whisked away at any moment to a new world, a new mission, and ultimately a new revelation about why you’re stuck in such an odd universe. It’s all stupid wacky, and I love it.
You may find yourself challenged with having to ostensibly sweep out a five-floor facility looking for a single puzzle piece, but have the mission completely changed halfway through. In another instance, you scurry up to what must be another boss battle, only to find the thing eaten by something much larger and terrifying. And it does it all without ever breaking the fourth wall. Your mission might be straightforward, or it might be completely derailed at any moment.
Meanwhile, Vertigo 2 unabashedly pays tribute to the Half-Life series, and many others in the process. You’ll find VR-ified health regen stations throughout most of the science-y levels mixed in with mobile versions of the wall-mounted syringes—definitely Half-Life inspired. Stick it in your arm, juice up, and keep going. You’ll immediately attune yourself to its audible beep too.
That said, character voiceovers range from professional to mediocre, which means you’ll probably want the subtitles on at all times so you don’t miss a word. Unfortunately, I found this out after the first cutscene, which definitely required subtitles to be anywhere near understandable, since it’s between an alien with a thick Spanish accent and another one with its own Yoda-esque idiolect.
As a side note, the game also includes a number of recording options for when you want to capture in-game video, including a third-person view and smoothed first-person view for a more polished and stable capture. Basically, all VR games should have those options considering how useful they are for recording in-game footage and screenshots. There’s even a smartphone that is basically just hotkeyed to Steam’s F12 screengrab function, making in-game shots in VR so much easier. Here’s my Insta-friendly selfie, starter pistol in hand.
Image captured by Road to VR
Comfort
Vertigo 2 is a big and varied place, and it includes a number of things you should look out for if you’re sensitive to VR-induced motion sickness. Seasoned VR users and people not adversely affected by artificial locomotion shouldn’t have a problem playing through even the most challenging bits from a comfort standpoint, though there are periodic bits of forced movement that may or may not jibe with your comfort level.
You’ll be forced to spring through the air on jumping pads, go on fast-moving vehicles that aren’t controlled by the player character, and strafe around at a near constant whilst shooting, which introduces lateral movement that some might feel uncomfortable with.
The game does, however, make full use of a hybrid locomotion system, with both smooth locomotion and teleportation presented as viable movement options during gameplay. If you’d rather not use teleportation, you can also toggle a jump button in the menu settings, although this isn’t advisable if you’re at all sensitive.
In an interview ahead of Apple’s upcoming Worldwide Developers Conference event, CEO Tim Cook talks about the potential of XR and why elements of it may be “even better than the real world.”
In an interview by GQ’s Zach Baron, Apple CEO Tim Cook explained that he first joined Apple—which at the time was nearly bankrupt—because Steve Jobs convinced him the company could really change the world.
And change the world it has, with products like the iPhone that have fundamentally altered the way much of the world goes about its daily business.
The next shot the company is rumored to take has a chance to do more than change the world—it could change everyday reality itself.
While Apple remains secretive about its plans for an XR device—which is rumored to be revealed at WWDC in June—Cook said in the interview that in some ways the technology could be “even better than the real world.”
“If you think about the technology itself with augmented reality, just to take one side of the AR/VR piece, the idea that you could overlay the physical world with things from the digital world could greatly enhance people’s communication, people’s connection,” Cook told GQ. “It could empower people to achieve things they couldn’t achieve before.”
“We might be able to collaborate on something much easier if we were sitting here brainstorming about it and all of a sudden we could pull up something digitally and both see it and begin to collaborate on it and create with it. And so it’s the idea that there is this environment that may be even better than just the real world—to overlay the virtual world on top of it might be an even better world,” said Cook. “And so this is exciting. If it could accelerate creativity, if it could just help you do things that you do all day long and you didn’t really think about doing them in a different way.”
When prompted about the company's criticism of Google Glass around the time the device was introduced back in 2013—saying that head-worn devices would feel too invasive—Cook suggested he may have changed his mind on that point.
“My thinking always evolves. Steve [Jobs] taught me well: never to get married to your convictions of yesterday. To always, if presented with something new that says you were wrong, admit it and go forward instead of continuing to hunker down and say why you’re right.”
Just as Apple was skeptical of Google Glass, Cook knows Apple will always be in a similar boat when launching new products.
“Pretty much everything we’ve ever done, there were loads of skeptics with it,” Cook said. “If you do something that’s on the edge, it will always have skeptics.” When entering new markets, Cook said he considers a handful of questions: “Can we make a significant contribution, in some kind of way, something that other people are not doing? Can we own the primary technology? I’m not interested in putting together pieces of somebody else’s stuff. Because we want to control the primary technology. Because we know that’s how you innovate.”
For users of the Fectar AR and VR content creation platform, creating XR content with hand tracking has just become simpler and easier.
Launched in 2020, Fectar is “the multi-sided platform that makes the metaverse accessible for everyone, everywhere.” Focused on creating AR and VR spaces for education, training, onboarding, events, and more, and aimed at non-technical users, the company provides a cross-platform, no-code AR/VR building tool.
Last week, Fectar integrated Ultraleap hand tracking into its AR and VR content creation platform, allowing users to build VR training experiences with hand tracking from the outset.
AR and VR Content Creation With Integrated Ultraleap Hand Tracking
Ultraleap was founded in 2019 when Leap Motion was acquired by Ultrahaptics, and the two companies were rebranded under the new name. Ultraleap’s hand tracking and mid-air haptic technologies allow XR users to engage with the digital world naturally – with their hands, and without touchscreens, keypads, and controllers.
Thanks to the Ultraleap integration, Fectar's users will now be able to create and share immersive VR experiences that use hands rather than VR controllers. According to Ultraleap, this makes interaction more intuitive, improves training outcomes, lowers the barrier to adoption, and makes the experiences more accessible.
Non-Technical People Can Develop Immersive Experiences
The new addition to the AR and VR content creation platform is a strategic decision for Fectar. The company’s target clients are non-technical content creators. They don’t need to know how to code to create VR apps and tools, including training programs.
This is, in fact, one of the most frequent use cases of the Fectar AR and VR content creation platform. “We want our customers to be able to create world-class VR training experiences,” said Fectar CTO and founder, Rens Lensvelt, in a press release. “By introducing Ultraleap hand tracking to our platform we’re giving them an opportunity to level up their programs by adding an intuitive interaction method.”
VR Programs and Tools – the Future of Collaborative Work and Training
Virtual reality content has expanded beyond games and entertainment applications. VR is now part of education and training, medicine, business, banking, and virtually any other kind of work.
This is why an AR and VR content creation platform for non-technical users, like Fectar, is so successful. Companies worldwide want to create their own training and collaborative VR tools, without hiring developers.
“The combination of Ultraleap and Fectar provides people with the right tools they need to develop the best education or training programs – and makes it easy to do so. We already know that enterprise VR programs improve productivity by 32%,” said Matt Tullis, Ultraleap VP of XR. “By making that experience even more natural with hand tracking, Fectar customers can expect to see their VR training ROI increase even further.”
XREAL Games, the Budapest-based studio behind Zero Caliber VR (2018), announced that its upcoming co-op shooter Gambit! is finally set to release on Quest 2 and PC VR headsets next week.
Update (March 31st, 2023): After a long wait, XREAL Games today announced that Gambit! is now set to launch on April 6th, releasing simultaneously on SteamVR headsets and Quest 2.
The cross-platform game is said to offer “hours to plunder, 4-player co-op, dozens of guns, a myriad of attachments, skins, masks, deathmatch, tournament ladders, minigames, leaderboards, climbing, graffiti, hidden rewards, the GNOP, bossfights, free updates, dedicated customer support, and so on.”
Check out the latest gameplay trailer below:
Original Article (February 9th, 2022): Gambit! was first revealed around a year and a half ago, promising a campaign-driven adventure full of guns, gangs, mayhem, and rock and roll. Although the team originally hoped to launch in 2021, it now says the game is coming in Q2 2022.
Back then we only had a brief teaser to go on (seen below), however the game’s more recent Steam page shows off a bit more of what to expect, including a few new images.
XREAL says Gambit! will feature "20+ hours" of gameplay spanning nine levels across three chapters.
The four-player co-op missions are also said to include “dozens of guns, a myriad of attachments, skins, masks, deathmatch, tournament ladders, minigames, leaderboards, climbing, graffiti, hidden rewards, the GNOP, bossfights, free updates, dedicated customer support, and so on.”
We're still waiting for the big gameplay reveal, although with the start of Q2 coming on April 1st, we may be getting a peek sometime soon. In the meantime, we'll be keeping an eye on the game's Twitter for more info as it arrives.