Snap Inc.

Snap Partner Summit 2023 Details Changes Coming to Snapchat and Beyond

Snap’s annual Partner Summit is the company’s opportunity to showcase its working relationships with other brands. That includes the experiences that come out of those partnerships, as well as the hardware and software updates that drive them. The event covered a lot of ground, but we’ll be looking specifically at AR-related updates and announcements.

Some of the announcements are already available for Snapchatters to explore, while others are coming soon. Even the parts of the summit that may seem boring to the average end user help us understand where the platform is going in the coming months and years. And this year’s event is extra special because it was held in person for the first time since 2019.

Snap Map and Bitmoji Features

Bitmoji, the 3D avatar system used by Snapchatters for their profiles as well as in games and messages, is constantly expanding, including through new virtual fashion partnerships, and this year is no different.

Digital fashions inspired by the Marvel Cinematic Universe will be available soon. The avatar system itself will also be updated to allow for “realistic dimensions, shading, and lighting,” according to Vice President of Product Jack Brody.

“Bitmoji style has changed quite a bit, and they continue to evolve,” said Brody.

Snap Partner Summit 2023 - Jack Brody showing 3D Snap Map

Brody also announced that the Snap Map is getting more updates, including more 3D locations and tags to help users find popular locations from their Snapchat communities. Users who access the app with Verizon +Play will also be getting new options for games and puzzles in calls with Snapchat’s connected lenses.

Camera Kit Integrations

Snap’s Head of Global AR Product Strategy and Product Marketing Carolina Arguelles Navas took to the stage to talk about recent and upcoming partnerships, including some that affect apps and experiences outside of Snapchat itself through its Camera Kit offering.

For example, Snap lenses can now be used in Microsoft Teams and in the NFL app. The LA Rams’ SoFi Stadium even uses Snap Lenses on its Infinity Screen to show the audience overlaid with augmented reality effects.

Navas also discussed Snap’s ongoing partnership with Live Nation, bringing custom AR lenses to over a dozen concerts this year including Lollapalooza in Paris and The Governor’s Ball in New York. She also announced a new partnership with Disguise, a company that specializes in real-time interactive visuals for live events.

Snap is also partnering with individual artists. The first to be announced is KYGO, a DJ, with more artist partnerships to be announced throughout the year.

More Opportunities for Brands

Until now, Camera Kit has been the main way that other companies were able to use Snap’s technology. However, Jill Popelka announced a new division, Augmented Reality Enterprise Services (ARES), of which she is the head.

Snap AR Enterprise Services (ARES)

“We all know the shopping experience today, whether online or in-store, presents a lot of options,” said Popelka. “We’ve already seen how our AR advancements can benefit shoppers and partners.”

The “AR-as-a-Service” model currently consists of two main offerings. Shopping Suite brings together Snap’s virtual try-on and sizing recommendation solutions, while the Enterprise Manager helps companies keep track of their activations including through analytics.

Popelka also announced a new “Live Garment” feature that generates a wearable 3D garment from a 2D photo of a garment uploaded into a lens.

Commercial Hardware

Popelka also introduced two new hardware offerings from Snap to commercial partners: AR mirrors and AR-enabled vending machines.

AR mirrors are already making their way into clothing stores to make virtual try-on even easier for shoppers, including those who don’t have Snapchat. Some partners have even experimented with incorporating AR games that shoppers can play to unlock in-store rewards. Retailers are also using the opportunity specifically to engage with younger audiences.

Snap Partner Summit 2023 - Jill Popelka showing AR mirrors

Snap currently has its AR mirror in a Men’s Wearhouse store.

“[Men’s Wearhouse is] proud to launch digital partnerships and store innovations specifically geared toward how high school students want to shop and prepare for prom,” Tailored Brands President John Tighe said in a release shared with ARPost. “We are excited to offer these younger customers experiences in-store and online to make the shopping experience easier. Everyone deserves to look and feel their best on prom night.”

Snap also partnered with Coca-Cola to create a prototype of an AR vending machine, displayed on a screen and controlled with hand gestures.

AR-enabled Coca-Cola vending machine - Snap

It might be a while before you see either of these devices in a store near you, but keep an eye out all the same.

App Updates

The standard app is getting some AR updates too, mainly related to the company’s work with AI. When Snapchatters capture a photo or video, the app will recommend lenses that might match the scene. AI will also recommend lenses for reacting to Snapchat memories and produce a new generation of lenses available to users.

Keep Exploring Snapchat

There really was a lot in the Partner Summit that wasn’t detailed here. So, if you use Snapchat for more than just AR, keep checking the app to see even more changes coming in the next few months.

Ray Tracing Comes to Snap Lens Studio

One of the most powerful recent breakthroughs in graphics and rendering is coming to mobile AR thanks to a recent update to Snap’s Lens Studio. We’re talking about ray tracing.

What Is Ray Tracing?

Ray tracing is a rendering technique that helps to bring digital assets to life in the environment around them – whether that environment is digital or viewed in augmented reality. Recent examples in gaming include convincingly reflective surfaces like water, believable dynamic shadows, and improved light effects.
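
To make the idea concrete, here is a minimal sketch of the core operation in plain Python: cast a ray, find where it hits a surface, and mirror it about the surface normal to get a reflection. This is only a conceptual illustration; the scene, names, and numbers are made up and have nothing to do with Snap’s actual renderer.

```python
# A tiny, self-contained ray-sphere intersection plus mirror reflection.
# Conceptual illustration only -- not Snap's renderer, and the scene is made up.
from dataclasses import dataclass
import math


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def sub(self, o: "Vec3") -> "Vec3":
        return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)

    def dot(self, o: "Vec3") -> float:
        return self.x * o.x + self.y * o.y + self.z * o.z

    def scale(self, s: float) -> "Vec3":
        return Vec3(self.x * s, self.y * s, self.z * s)


def hit_sphere(origin: Vec3, direction: Vec3, center: Vec3, radius: float):
    """Distance along the ray to the nearest hit on the sphere, or None on a miss."""
    oc = origin.sub(center)
    a = direction.dot(direction)
    b = 2.0 * oc.dot(direction)
    c = oc.dot(oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # the ray never touches the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None


def reflect(direction: Vec3, normal: Vec3) -> Vec3:
    """Mirror the incoming direction about a unit surface normal: r = d - 2(d.n)n."""
    return direction.sub(normal.scale(2.0 * direction.dot(normal)))


# Fire a ray straight down the z-axis at a unit sphere sitting at z = -3.
origin, direction = Vec3(0, 0, 0), Vec3(0, 0, -1)
center, radius = Vec3(0, 0, -3), 1.0
t = hit_sphere(origin, direction, center, radius)
if t is not None:
    hit = Vec3(origin.x + direction.x * t,
               origin.y + direction.y * t,
               origin.z + direction.z * t)
    normal = hit.sub(center).scale(1.0 / radius)  # unit normal on a sphere
    print("hit point:", hit, "reflected direction:", reflect(direction, normal))
```

A production renderer repeats this kind of intersection test millions of times per frame and follows each reflected ray further into the scene, which is exactly why the technique gets expensive, as discussed below.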

The technique can be fairly computing-heavy, which can be a problem depending on the program and how it is accessed. For example, when some existing games are updated to use ray tracing, users accessing that game on an older or less fully-featured computer or console may have to turn the feature off to avoid problematic latency.

Fortunately, ray tracing is being developed at the same time as new computing and connectivity methods like cloud and edge computing. These advancements allow the heavy lifting of advanced computing techniques to take place off of the device, allowing older or less fully-featured devices to run more high-level experiences smoothly.

While the Snap releases detailing the update didn’t mention Lens Cloud, it’s likely that this feature is behind it. Lens Cloud, introduced at the 2022 Snap Partner Summit alongside the first announcement of ray tracing, provides improved off-device storage and compute, among other advancements.

The Road to Lens Studio

If you closely follow Snap, you’ve known for almost a year that this was coming. Snap also discussed ray tracing at the fifth annual Lens Fest in December. There we learned that the update has been in the hands of select developers for a while now, and they’ve been working with Snap partners to create experiences pioneering the software.

The news announced yesterday is that the feature is now in Lens Studio, meaning that any Lens creator can use it. We also have a new demonstration of the technology: a Lens created with Snap partner Tiffany & Co.

Snap ray tracing - Tiffany & Co

Tiffany & Co. has likely been so involved in the development and showcasing of Snap’s ray tracing at least in part because the jewelry the company is known for provides both a great challenge for and an excellent demonstration of the technology. However, Snap is already looking forward to the feature finding other use cases.

“Now, Lenses that feature AR diamond jewelry, clothing and so much more can reach ultra-realistic quality,” Snap said in the announcement.

The principal use case presented by Snap in the announcement is virtual try-on for clothing retail, like the Tiffany & Co. Lens. However, it is likely only a matter of time before the new feature finds its way into other kinds of AR experiences as well.

What’s Next?

Ray tracing is likely to be a topic yet again at the upcoming Snap Partner Summit in April, and ARPost will be there to hear about it. The online event doesn’t have the same energy as Lens Fest, but as we saw here, the Partner Summit is often the first look at Snap’s developing software offerings. We always look forward to seeing what they’ll roll out next.

How Different XR Companies Approach Cloud Services

XR hardware is on the move. But, software is important too. The bigger your XR needs are, the larger your software needs are. So, more and more XR providers are offering cloud services in addition to their hardware and platform offerings. But, what is the cloud, anyway?

Generally, “the cloud” refers to remote servers that do work off of a device. This allows devices to become smaller while running more robust software. For example, some of the cloud services that we’ll look at are cloud storage solutions. Cloud storage is increasingly important because 3D assets can take up a lot of space. Others run computations on the cloud.
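
As a rough illustration of that offloading pattern, the sketch below hands a heavy job to a remote server and falls back to a cheaper on-device path if the network is unavailable. The endpoint URL, payload shape, and function names are all hypothetical; no specific vendor’s API is being shown.

```python
# A sketch of the off-device ("cloud") pattern: the device hands heavy work to a
# remote server and keeps only a lightweight fallback locally. The endpoint URL
# and payload are placeholders, not any vendor's real API.
import requests

RENDER_SERVICE = "https://example.com/api/render"  # hypothetical endpoint


def render_locally(asset_id: str) -> dict:
    # Cheap on-device fallback for when the network is unavailable.
    return {"asset": asset_id, "quality": "low", "rendered_on": "device"}


def render_in_cloud(asset_id: str) -> dict:
    """Try the remote service first; fall back to the device if the call fails."""
    try:
        resp = requests.post(RENDER_SERVICE, json={"asset": asset_id}, timeout=2.0)
        resp.raise_for_status()
        return resp.json()  # e.g. a link to a pre-rendered or streamed result
    except requests.RequestException:
        return render_locally(asset_id)


print(render_in_cloud("jacket_tryon_v2"))
```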

Other solutions make up “local clouds.” These are networks of devices managed from a central portal all on location. This kind of solution is usually used by organizations managing a large number of devices from one central computer.

Varjo’s Reality Cloud

“Cloud” takes on yet another meaning for Varjo. For Varjo clients, a lot of the management and IT solutions that make up cloud services for other developers are handled through software subscriptions bundled with almost all Varjo hardware. Varjo’s “Reality Cloud” allows users to join XR meetings that include remotely present coworkers and virtual assets.

Varjo Reality Cloud - XR cloud services

“Varjo Reality Cloud is our platform that will allow the ultimate science fiction dream – photo-realistic teleportation – to come true,” CTO Urho Konttori said in a launch event last summer. “What this means, in practice, is true virtual teleportation – sharing your reality, your environment, with other people in real time so that others can experience your world.”

At the beginning of this year, Varjo announced that XR content will soon stream through Reality Cloud services as well. Just like streaming other forms of media, XR streaming aims to provide more content to smaller devices by hosting that content remotely and serving it to users on demand.

“These scalability opportunities that the cloud provides are significantly meaningful when we talk about XR deployment in the corporate world,” Konttori told ARPost in January. “We are now at the level that we are super happy with the latency and deployments.”

In a recent funding announcement, Varjo also revealed the latest development in its cloud services: Patrick Wyatt, a C-suite veteran, has been appointed the company’s new CPO and “will be the primary lead for Varjo’s software and cloud development initiatives.” As this article was being written, Varjo further expanded its cloud with Unreal and Unity engine integrations.

CloudXR From NVIDIA

XR streaming is already a reality on other cloud platforms. NVIDIA offers CloudXR, which streams XR content to Android and Windows devices. (Remember that Android isn’t a hardware manufacturer but an operating system. While almost all non-Apple mobile devices run Android, it is also the backbone of many XR headsets.)

NVIDIA CloudXR - XR cloud services

According to NVIDIA, “CloudXR lets you leverage NVIDIA RTX-powered servers with GPU virtualization software to stream stunning augmented and virtual reality experiences from any OpenVR application. This means you can run the most complex VR and AR experiences from a remote server across 5G and Wi-Fi networks to any device, while embracing the freedom to move—no wires, no limits.”

This can be a “pure” cloud application, but it can also be an “edge” application that does some lifting on the device and some remotely. While NVIDIA promotes their cloud services for use cases like location-based experiences and virtual production, edge computing is being embraced by enterprises who may want to keep sensitive content offline.

RealWear’s New Cloud Services

Enterprise XR hardware manufacturer RealWear recently launched its own cloud. This is the last kind of cloud discussed above: the solution allows IT specialists to “easily control and manage their entire RealWear device fleet from one easy-to-use interface.” That includes content, but it also includes managing updates.

If you own one headset, you know that installing software and updates can be a chore. Now, imagine owning a dozen headsets, or even a hundred or more. Putting on each headset individually to add content and install updates quickly becomes unscalable. The RealWear Cloud also allows real-time tech support, which wouldn’t be possible otherwise.

RealWear Cloud

The RealWear Cloud also allows data analysis across headsets. This is vital in enterprise applications which may be tracking items as they move through a supply chain or tracking employees as they move through tasks or training modules. Handling this data for an individual on an individual headset is possible but, again, becomes unbearable at scale sans cloud.

Cloud Storage in Lens Studio

As for cloud storage, Snapchat recently announced a solution in a Lens Studio update that gives creators up to 25MB of remote storage. While the file size is still capped per asset (you can’t have one 25MB asset), it drastically increases the abilities of Lens Creators working with large or complex models.

Snap Lens Cloud

“Prior to the launch of Remote Assets, if a project was over the Lens size limit, you only had two options: either remove the asset if it wasn’t critical to the experience or resize the image to lower its RAM usage and re-submit,” reads the release. “Now you can utilize our Lens Cloud service to host assets of larger sizes outside of the Lens, and then load them in at run time.”
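
To make the “load them in at run time” idea concrete, here is a generic sketch of a remote-asset loader that fetches large files on first use, caches them, and stays under a total budget like the 25MB figure mentioned above. This is an illustration in Python, not Lens Studio’s actual scripting API; the names and limits are used only to mirror the article.

```python
# A generic remote-asset loader: fetch a large file only when it is first needed,
# cache it, and stay under a total budget (the article's 25MB figure is used here
# purely as an example). Not Lens Studio's actual scripting API.
import requests

MAX_REMOTE_BYTES = 25 * 1024 * 1024  # illustrative total budget
_cache: dict[str, bytes] = {}


def load_remote_asset(url: str) -> bytes:
    """Download an asset on first use; reuse the cached copy afterwards."""
    if url in _cache:
        return _cache[url]
    resp = requests.get(url, timeout=5.0)
    resp.raise_for_status()
    data = resp.content
    already_used = sum(len(blob) for blob in _cache.values())
    if already_used + len(data) > MAX_REMOTE_BYTES:
        raise ValueError("loading this asset would exceed the remote storage budget")
    _cache[url] = data
    return data
```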

This is significant because Snap Lenses run on mobile devices that not only have limited space but also share their computing power with a slew of non-XR applications. At least, until Snapchat makes a consumer version of Spectacles.

“At first, we were just building for the phone and porting to the glasses,” Lens Creator Alex Bradt told me when I got to demo Snap’s Spectacles at AWE. “Now we’re like, ‘what can we actually do with these that will solve problems for people that they didn’t know they had?’”

Parents and Partners

Not all XR companies offer their own cloud services. For example, Magic Leap has had a partnership with Google Cloud for the past year now. Likewise, Autodesk offers its XR cloud services through a partnership with Amazon.

Similarly, ThinkReality cloud services are offered through parent company Lenovo. A similar relationship exists between Azure and Microsoft’s MR hardware.

Partnerships like these help each company get the most out of their existing offerings without needing to build services from the ground up. As enterprises explore entering XR, these offerings also help them integrate into cloud services offered by suppliers that they may already be working with, like Microsoft, Google, Amazon, or Lenovo.

Your Forecast: Cloudy

Right now, a lot of XR cloud services serve industry, where they are doing very impactful things. That doesn’t mean that people with just one headset (or a phone) shouldn’t be taking note. Developments in XR cloud services (for enterprise or for consumer applications) are making smoother, faster, lighter-weight, and more robust XR applications possible for everyone.

Snap Celebrates the Fifth Annual Lens Fest

Snap Lens Fest took place on December 6 and 7. The annual event is a celebration of the Snap Lens creator community, as well as an opportunity for the company to announce initiatives and software offerings. The event is also home to the Lens Fest Awards.

If you’re a Snapchat user, Lens Fest gives you an insight into what’s happening behind the lens and what’s coming next. If you aren’t a Snapchat user, you should still pay attention as Snap’s design tools are used by other organizations to develop their AR tools and games as well.

Welcome Back!

Four Snap leads used the keynote to set the stage for the rest of Lens Fest (as well as to make the first major announcements). First, Snap Chief Technology Officer Bobby Murphy presented Lens Studio – Snap’s developer suite – as an organic collaboration between the company and the users of the platform.

Lens Studio - Snap Lens Fest 2022

“We’re excited to be here to celebrate you, the global Lens developer community,” said Murphy. “We’re excited to continue developing Lens Studio along with you … There is so much opportunity ahead of us.”

To Murphy, this two-way development of Lens Studio is a major part of the development of AR as a whole. This draws on Snap’s longstanding position that AR is the future of immersive tech. It would come to be an ongoing theme for later presentations as well.

“The best and most engaging AR filters add to the world rather than replace it,” said Murphy. “Over time, we see the potential for wearable technology, like our Spectacles, to make it even more accessible.”

Updates and Opportunities

Following Murphy, Software Engineering Senior Manager Trevor Stephenson discussed some of the big updates to Lens Studio in the past year, including ray tracing. The feature, which was announced at Snap’s Partner Summit in April, is already in the hands of select partners but is coming to the platform more publicly next year.

Next, Joe Darko, Global Head of AR Developer Relations, spoke about learning and development opportunities culminating in a new “Lensathon.” The remote opportunity opened on the first day of Lens Fest and continues through the end of January. Following the event, a total of $200,000 will be awarded, including $40,000 to the top project.

The Future of Creator Monetization?

Finally, Director of AR Platform Partnerships and Ecosystems Sophia Dominguez took the virtual stage. She teased early experiments with creators to build AR items and assets available to Snap users in exchange for tokens – but didn’t suggest a release date. The coming feature, which was promoted as a creator monetization option, tied back into Murphy’s opening themes.

“We believe that as more developers like you establish businesses, we move closer to our wearable future,” said Dominguez. “We’re committed to pushing the AR industry forward alongside you.”

More on the Economic Future of AR

For the next Lens Fest session, a non-Snapchatter took the stage – Mike Boland of ARtillery Intelligence. He discussed the trajectory of AR as a market, specifically for advertising.

Boland likened AR to the early internet, saying that it will meet and even exceed all of our expectations provided that we remember that this kind of development takes time. Boland also said that, historically, emerging technologies have done well during economic slowdowns, as companies cut legacy ad spending while continuing to look for the next big thing.

According to Boland, we’re already seeing signs of AR maturing as a market, such as the shift from the selfie cam to the world-facing cam. While Boland said that AR glasses are “years away,” he also pointed out that AR glasses only have world-facing cameras. He also pointed to the shift to more productive and informative AR lenses as a further sign of maturity.

“In addition to fun and whimsical lenses, we see an increase in practical lenses,” said Boland. “We’ll still see lots of fun and games in AR just like we do on the web today.”

“What’s New in Lens Studio”

The next Lens Fest session took a closer look at the more near-term future. Lens Studio Product Marketing Manager Leigh Brown and Product Manager Charmain Lee presented updates to Snap Lens Cloud and collision mesh software.

Lens Cloud was announced at the Partner Summit and allows lens creators to store assets remotely so that running a lens is less of a technical task for the device. An impending update to Lens Cloud will allow users to edit the live version of their lens by changing which assets are in the Lens Cloud version of the project.

Another coming update to Lens Studio can automatically make a collision mesh of both virtual objects and the physical world. There will also be new filters for finding a mesh that will provide just the right collision.

The Lens Fest Awards

This is the fifth annual Lens Fest, but only the second annual Lens Fest Awards. The event recognized 50 finalists across five categories with one winner in each. Hosting the Awards was Snap’s European AR Developer Relations Lead, Oscar Falmer. Once again, judges came from across Snap, though the categories were different this year.

“We’re thrilled to be here today to celebrate the year’s most creative lenses and the developers who build them,” said Falmer. “All great AR begins with the creativity of Lens developers and creators.”

Play

The first category recognized lenses “that use gaming or entertainment to enhance how we experience the world” and the award went to Table Trenches: Operation Living Room by DB Creations. The multiplayer title uses scans of a player’s environment to create a reactive map for a tower-defense-style strategy game.

The Lens Fest Awards - Table Trenches

“Thanks so much to everyone who helped us make this game a reality,” said DB Creations co-founder Dustin Kochensparger. “I can’t wait to show you what we’re working on next.”

Fashion

The Fashion category recognized lenses that “revolutionize the world of personal style.” The award went to Vishal Yadav’s Flux Fashion, a lens that allows users to customize a virtual garment using colors sourced from their physical environments.

The Lens Fest Awards - Vishal Yadav’s Flux Fashion

Yadav, also a nominee in the Wellness category, expressed gratitude for the recognition of his lens, saying, “It means a lot to me.”

Education

The Education category “celebrates lenses that raise awareness for important causes or foster knowledge through AR.” The award went to Inna Horobchuk for Sky Map. Sky Map is an interactive annotated map of stars and constellations – something that took a whole dedicated app when this XR journalist started writing.

“I’m super excited that people from all around the globe can engage with my lens and learn about stars and constellations in AR,” said Horobchuk, who was also a nominee for the Wellness category.

Wellness

The Wellness category celebrated “lenses that contribute to physical and mental well-being” and was awarded to Soft Drink Info by Wasim Ghole.

The lens displays nutrition information for a number of popular sodas and energy drinks. Ghole thanked Snap for recognizing the lens, and for providing the tools to create and distribute it.

Moonshot

The final Lens Fest Award category “highlights creators who have seen the limitless potential of AR and have challenged themselves to do something that has never been done before.” The award went to Dennis Rossiev’s Imaginary Friends, which allows users to turn scans of objects in their environment into cartoon companions.

The Lens Fest Awards - Dennis Rossiev Imaginary Friends

“With machine learning, I was able to build a lens that I’ve been dreaming about for so long,” said Rossiev. In addition to having been a nominee in two other categories this year, Rossiev also won in an “originality” category at last year’s Lens Fest Awards.

The Camera That Keeps on Giving

For an outfit that still calls itself a “camera company,” Snap is leaning more into AR than ever before. Dedicated specifically to Lenses, Lens Fest is a necessarily AR-focused event. If the company stays consistent, the next Partner Summit should come in a few months to key us into other elements of the company’s strategy.
