Development

Star Citizen still hasn’t launched, but it’s already banning cheaters

pre-cheating —

Developer bans “over 600” players for exploiting an item duplication glitch.

Enlarge / For an unreleased game, Star Citizen still has some really pretty ships…

RSI

At this point in Star Citizen’s drawn-out, 11-plus-year development cycle, we’re usually reminded of the game when it hits some crowdfunding microtransaction milestone or updates its increasingly convoluted alpha development roadmap. So last week’s announcement that developer Cloud Imperium Games (CIG) has banned over 600 cheaters from its servers is a notable reminder that some people are actually enjoying—and exploiting—the unpolished alpha version of the game.

Shortly after the May release of Star Citizen’s Alpha 2.23.1 update, players started noticing that they could easily make extra money by storing a freight ship, selling their cargo, and then returning to the ship to find the cargo ready to be sold a second time. As knowledge of this “money doubling” exploit spread, players reported that the price of basic in-game resources saw significant inflation in a matter of days.
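For readers curious how this class of duplication bug can arise, here is a purely illustrative sketch (hypothetical code, not CIG's actual implementation): if storing a ship snapshots its cargo by value, selling the live cargo never touches the snapshot, so restoring the snapshot resurrects already-sold goods.

```typescript
// Illustrative sketch of the bug class described above (hypothetical,
// not CIG's code): the store operation deep-copies the hold, detaching
// the snapshot from the live cargo records.

type Cargo = { id: number; sold: boolean };

// Buggy store: snapshot by value, so later sales never reach the copy.
function storeShip(hold: Cargo[]): Cargo[] {
  return hold.map((c) => ({ ...c }));
}

// Sell everything in the hold that isn't flagged as sold yet.
function sell(hold: Cargo[]): number {
  let earned = 0;
  for (const c of hold) {
    if (!c.sold) {
      c.sold = true;
      earned += 100;
    }
  }
  return earned;
}

const hold: Cargo[] = [{ id: 1, sold: false }, { id: 2, sold: false }];
const snapshot = storeShip(hold); // stale copy -- the bug
const first = sell(hold);         // sells the live cargo
const dup = sell(snapshot);       // sells the "same" cargo again

// Fix sketch: track sales against one authoritative record per cargo id,
// so a restored snapshot cannot be sold twice.
const soldIds = new Set<number>();
function sellOnce(hold: Cargo[]): number {
  let earned = 0;
  for (const c of hold) {
    if (!soldIds.has(c.id)) {
      soldIds.add(c.id);
      earned += 100;
    }
  }
  return earned;
}
```

The fix amounts to keeping a single server-side source of truth for each item rather than trusting whatever state a restored snapshot carries.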

Now, Cloud Imperium Games Senior Director of Player Relations Will Leverett has written that the developer has investigated “multiple exploits within Star Citizen that compromised stability and negatively impacted the in-game economy.” In doing so, CIG says it “identified and suspended over 600 accounts involved in exploitative behaviors while also removing the illicitly gained aUEC [in-game currency] from the Star Citizen ecosystem.”

A ban for “over 600” players may not seem that notable when games like Dota 2 and World of Warcraft routinely announce ban waves that include tens of thousands of players. Still, it’s a reminder that at least a small portion of the game’s more than 5.2 million backers are actively playing the alpha so much that they’re willing to cheat to see more of what the game has to offer.

“From zero, in two evenings, I did make about [200 million aUEC] just to buy ships that [are] unavailable for me, to try it in full!” user ZeroInsideOut wrote on the game’s forums. “There [are] many things in Star Citizen [which] I would like to try and test, but I am short of money.”

It’s getting late for “early access” bugs

Leverett wrote that exploits like these should be expected in Star Citizen “at this stage of development”—a stage that we hasten to once again point out is now part of well over a decade of active development. Finding and squashing these kinds of bugs “early” is all part of the game’s crowdfunded development plan and “one of the benefits of open development and working closely with our community,” Leverett wrote.

“We’ve gained valuable insights through your issue council reports, and we thank you for that,” he continued. “However, once an exploit is identified and confirmed, continued abuse for personal gain will not be tolerated and will result in action on our part.”

However, some players feel that the “open development” process failed to find this significant issue quickly enough. Commenter Nitebird took CIG to task for “allow[ing] exploits reported during [Public Test Universe] to go to live despite many people confirming the issue in [the Issue Council] and urging CIG to pause to fix it. The patch is ruined regardless for many people… What is IC good for than to prevent this?”

Enlarge / A fan-designed Star Citizen development roadmap summary, posted in early 2023.

CIG launched an important “Persistent Universe” update for Star Citizen over a year ago and announced late last year that spin-off Squadron 42 had reached “feature complete” status. Despite those signs of progress, though, there’s still no target date for Star Citizen to finally transition from its extended alpha to that fabled “Version 1.0” release.

CIG founder and CEO Chris Roberts said in a March update that the development team “is hard at work, heads down, driving toward the finish line” and that the leadership team has now “spent significant time looking at what Star Citizen 1.0 means and what it would take to get there.” That includes the planned introduction of long-sought key features like base building and crafting that were apparently not a priority during the game’s first 11-plus years of development work.

“As that roadmap [for a 1.0 release] comes together and becomes validated, we look forward to sharing with you both its vision and executional plan later this year,” Roberts wrote.


After decades of Mario, how do developers bridge a widening generation gap?

Enlarge / A prototype wonder effect—featuring Mario’s head turned into blocks that could be eaten by enemies—didn’t make it into the final game.

Nintendo

In a game industry that seems to engage in periodic layoffs as a matter of course, it’s often hard for even popular game franchises to maintain continuity in their underlying creative teams from sequel to sequel. Then there’s the Mario series, where every person credited with the creation of the original Super Mario Bros. in the 1980s ended up having a role in the making of Super Mario Bros. Wonder just last year.

In a recent interview with Ars Technica, Wonder producer Takashi Tezuka said it wasn’t that tough to get that kind of creative continuity at Nintendo. “The secret to having a long-tenured staff is that people don’t quit,” he said. “For folks who have been there together for such a long time, it’s easy for us to talk to each other.”

That said, Tezuka added that just getting a bunch of industry veterans together to make a game runs the risk of not “keeping up with the times. Really, for me, I have a great interest in how our newer staff members play, what they play, what they think, and what is appealing to them. I think it’s very interesting the things we can come up with when these two disparate groups influence each other to create something.”

Young and old

For Super Mario Bros. Wonder, the development team solicited literally thousands of ideas for potential game-changing Wonder Effects and badges from across Nintendo. In doing so, the game was able to incorporate the viewpoints of people with a wide variety of histories and memories of the series, Tezuka told Ars.

  • Super Mario Bros. Wonder Producer Takashi Tezuka.

    Nintendo

  • Super Mario Bros. Wonder Director Shiro Mouri.

    Nintendo

“Among our staff, there are folks who actually maybe haven’t played some of the [older] game titles we’re talking about,” he said. “So I think there was some familiarization for those folks with some of those titles. And maybe there was some inspiration drawn from those titles that I’m not aware of.”

For a series as long-running as Mario, though, even some of the relatively “younger” development cohort can have a deep history with the series. Super Mario Bros. Wonder Director Shiro Mouri, who joined Nintendo in 1997, recalled playing the original Super Mario Bros. back in elementary school, and being “so moved and awed by the secrets and mysteries I discovered in that game.” The Wonder Effects in Wonder were an explicit attempt to recapture that feeling of being young and discovering new things for the first time, which can be difficult in such an established series.

Mouri also drew some parallels between Yoshi’s Island—where Yoshi could sometimes turn into a vehicle—and Wonder transformation effects that could turn the player into slime or a spiky ball, for instance. “That’s not to say that we drew [direct] inspiration from [Yoshi’s Island] or anything, but I think… providing surprises has always been a theme throughout our philosophy,” he said.


Treedis Transforms Physical Spaces Into Hybrid Experiences With a New Augmented Reality App

Augmented reality (AR) transforms how we view the world and do things. Since its first introduction in the 1960s, it has rapidly developed and been used extensively in fashion, marketing, the military, aviation, manufacturing, tourism, and many others.

Consumers are increasingly becoming adept at using augmented reality apps to try on products, learn new things, and discover information about their surroundings. Research shows that 56% of shoppers cite AR as giving them more confidence about a product’s quality, and 61% prefer to shop with retailers with AR experiences.

Aside from its impact on brands, AR is also transforming how companies operate internally by introducing better ways to perform jobs, train employees, and develop new designs.

No-Code Platform for Creating Your Own Immersive Experience

Creating AR experiences is no walk in the park. Firms that want to implement their own augmented reality apps must either work with talented in-house app builders or buy from third-party app builders, with costs ranging from tens to hundreds of thousands of dollars.

Treedis platform

Treedis makes the process simple with its Software-as-a-Service platform, which helps users create immersive experiences using a no-code drag-and-drop visual editor. Users can create digital, virtual reality, and augmented reality dimensions of their digital twin with just a single scan.

Digital twins are immersive, interactive, and accurate 3D models of physical spaces. They’re a digital replica of devices, people, processes, and systems whose purpose is to create cost-effective simulations that help decision-makers make data-driven choices.

Powered by Matterport technology, Treedis helps companies create these immersive experiences for retail, training, marketing, onboarding, games, and more.

Enhancing Digital Twins With an Augmented Reality App

According to Treedis CEO Omer Shamay, the Treedis augmented reality app helps you “view enhanced versions of your digital twins within their physical counterparts.” You can visualize any changes or modifications in real time and view all the 3D objects, tags, directions, and content in the digital twin.

“Any changes made to your digital twin will be instantly visible in AR, ensuring seamless collaboration and communication across your team,” Shamay adds.

The platform helps 3D creators and enterprises create an immersive and powerful digital experience for their users, so they can fully harness the benefits of AR solutions without huge developmental costs or challenges.

It can be used extensively for creating unique shopping experiences that incorporate elements of virtual commerce and gamification features. It’s ideal for developing immersive learning experiences to help learners grasp concepts better through physical interaction with their environment. The app can also be used to provide indoor navigation for guiding visitors to different access points and key locations within a space.

Treedis augmented reality app

The app is already available for Treedis’ enterprise users and promises to be “an accessible app with low prices and an easy-to-use AR solution,” according to Shamay.

With AR becoming more accessible, it won’t be long before more brands and firms adopt the technology and provide better, more enhanced experiences to their audiences.


Wonderland Engine Is Here to Make WebXR Development Faster and Easier

WebXR development is increasingly popular. Developers want to create content that users can enjoy without having to install apps or check the compatibility of their devices.

One of the companies working to advance immersive technologies, Wonderland GmbH, based in Cologne, Germany, has announced a giant leap forward: the release of Wonderland Engine 1.0.0, a WebXR development platform already vouched for by top content creators.

Wonderland Engine 1.0.0 – Bringing Native XR Performance to WebXR Development

What is special about the new engine launched by Wonderland? Its first benefit is the ability to mimic native XR performance. Before its launch, Wonderland Engine 1.0.0 passed the test of content creators.

WebXR development platform Wonderland Engine editor vr website with browser

Vhite Rabbit XR and Paradowski Creative, two companies creating XR games, used the engine to develop content. The Escape Artist, an upcoming title by Paradowski Creative, was created with Wonderland Engine 1.0.0, and its developers say that it matches native games in terms of polish and quality.

“We’re excited to announce this foundational version of Wonderland Engine, as we seek to bridge the gap between native XR app development and WebXR,” said the CEO and founder of Wonderland, Jonathan Hale, in a press release shared with ARPost. “We see a bright future for the WebXR community, for its developers, hardware, support, and content.”

Top Features of Wonderland Engine 1.0.0

The developers who choose Wonderland GmbH’s WebXR development platform to create content will be able to use the following:

  • Full 8th Wall integration – complete integration of 8th Wall AR tracking features such as face tracking, image tracking, SLAM, and VPS;
  • Runtime API rewrite – better code completion, static checks for bugs before running the code, and complete isolation for integration with other libraries;
  • Translation tools – necessary for the localization of WebXR content;
  • Benchmarking framework – to check for content performance on various devices.

Developers can find the complete list of features and bug fixes on the official release page.

According to the company, Wonderland Engine users can launch their first running app into the browser in less than two minutes. With a bit of experience, users can build a multi-user environment that supports VR, AR, and 3D in 10 minutes, as demonstrated in this video.

The XR Development Platform Is Optimized for VR Browsers

To demonstrate its commitment to helping content creators, Wonderland GmbH is optimizing the tool specifically for the most popular VR browsers: Meta Quest Browser, Pico Browser, and Wolvic.

Wonderland Engine WebXR meta browser

Wonderland Engine-based apps support any headset that has a browser available, including any headset released in the future. Apps created with Wonderland Engine can also run on mobile devices through the browser as Progressive Web Apps (PWAs), which also allows them to run offline.

Apart from the two game development companies mentioned above, the company is also working with various content creators.

“It was crucial to bring the whole ecosystem with us to test and validate the changes we made. This resulted in a highly reliable base to build upon in upcoming versions,” Hale said. “By making it easier to build XR on the web we hope to attract developers and content creators to WebXR. We see WebXR truly being able to rival native apps and offer consumers a rich world of rapidly accessible content to enjoy.”

Meet the Wonderland Team at AWE USA 2023

The creators of Wonderland Engine 1.0.0 will present the WebXR development platform at AWE USA 2023 (use ARPost’s discount code 23ARPOSTD for 20% off your ticket), which is taking place in Santa Clara, CA between May 31 and June 2.

The company is one of the sponsors of the event and will also be present at the event in booth no. 605.


Croquet for Unity: A New Era for Multiplayer Development With “No Netcode” Solution

Croquet, the multiplayer platform for web and gaming, which took home the WebXR Platform of the Year award at this year’s Polys WebXR Awards, recently announced Croquet for Unity.

Croquet for Unity is an innovative JavaScript multiplayer framework for Unity – a platform for creating interactive, real-time 3D content – that simplifies development by eliminating multiplayer code and server setup. It connects developers with the distinct global architecture of the Croquet Multiplayer Network. The framework was demonstrated at GDC last week, and an early access beta arrives in April 2023.

Effortless Networking for Developers

Croquet for Unity eliminates the need for developers to write and maintain networking code. By employing Croquet’s Synchronized Computation Architecture, server-side programming and traditional servers become unnecessary.

Users connect through the Croquet Multiplayer Network, which consists of Reflectors—stateless microservers located across four continents—that guarantee smooth and uniform experiences for gamers.

Synchronizing Computation for Flawless Multiplayer

At its essence, Croquet focuses on synchronizing not only the state but also its progression over time. By harmonizing computation, Croquet eliminates the need to transmit the outcomes of intricate computations like physics or AI.

It also eliminates the necessity for particular data structures or sync indicators for designated objects. As a result, crafting multiplayer code becomes akin to creating single-player code, with the full game simulation executing on-device.
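The synchronized-computation idea can be sketched in a few lines (a hypothetical illustration, not Croquet's actual API): when every client applies the same reflector-ordered events to the same pure, deterministic step function, replicas converge without ever exchanging state.

```typescript
// Hedged illustration of synchronized computation (not Croquet's real
// API): state is computed locally on each client, never transmitted;
// only the ordered event log goes over the wire.

type SimEvent = { tick: number; player: string; dx: number };
type SimState = Map<string, number>; // player -> position

// Pure step: identical inputs yield identical outputs on every client.
function step(state: SimState, events: SimEvent[]): SimState {
  const next = new Map(state);
  for (const e of events) {
    next.set(e.player, (next.get(e.player) ?? 0) + e.dx);
  }
  return next;
}

// A client replays the reflector-ordered event log from the start.
function replay(eventLog: SimEvent[][]): SimState {
  return eventLog.reduce(step, new Map<string, number>());
}

// The same log, delivered to two clients, produces identical state.
const eventLog: SimEvent[][] = [
  [{ tick: 0, player: "a", dx: 2 }],
  [{ tick: 1, player: "b", dx: -1 }, { tick: 1, player: "a", dx: 3 }],
];
const clientA = replay(eventLog);
const clientB = replay(eventLog);
```

Because only the tiny event log crosses the network, the results of expensive computations like physics or AI never need to be transmitted, which is the bandwidth win the architecture is aiming for.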

Shared Virtual Computers for Perfect Sync

A shared virtual computer runs identically on all clients, providing perfect synchronization and giving each player a unique perspective. Lightweight reflectors can be positioned at the edge of the cloud or in a 5G network’s MEC, offering lower latency than older architectures.

In addition, synchronized calculations performed on each client will replace traditional server computations, resulting in reduced bandwidth and improved latency.

Unprecedented Shared Multiplayer Simulations

Croquet not only facilitates multiplayer development but also enables previously unfeasible shared multiplayer simulations. Examples include real-time interactive physics as a fundamental game feature, fully replicated non-player character behaviors, and sophisticated interactions between players while the game is live.

Due to bandwidth limits and intrinsic complexity, traditional networks are incapable of supporting these simulations.

“Innately Multiplayer” Games With No Netcode

“Multiplayer games are the most important and fastest-growing part of the gaming market. But building and maintaining multiplayer games is still just too hard,” said David A. Smith, founder and CTO of Croquet, in a press release shared with ARPost. “Croquet takes the netcode out of creating multiplayer games. When we say, ‘innately multiplayer,’ we mean games are multiuser automatically from the first line of code and not as an afterthought writing networking code to make it multiplayer.”

Croquet’s goal is to simplify developing multiplayer games, making it as easy as building single-player games. By removing netcode creation and administration, developers can concentrate on improving player experiences while benefiting from reduced overall creation and distribution costs, a speedier time to market, and enhanced player satisfaction.

Opening Doors for Indie Developers

Croquet for Unity is designed for a wide range of game developers, but it is especially advantageous for small, independent developers, who often find it harder to create multiplayer games because they lack in-house networking and backend expertise.

Secure Your Spot on the Croquet for Unity Beta Waitlist

Developers can sign up for the Beta Waitlist to access the Croquet for Unity beta, launching in April. The Croquet for Unity package will be available in the Unity Asset Store for free upon commercial release, requiring a Croquet gaming or enterprise subscription and a developer API key for access to the global Croquet Multiplayer Network.


NVIDIA CloudXR 4.0 Enables Developers to Customize the SDK and Scale XR Deployment

In January, NVIDIA announced new products and innovations at CES 2023. At this year’s NVIDIA GTC, “the developer conference for the era of AI and the metaverse,” NVIDIA announced the latest release of CloudXR. Businesses can definitely look forward to boosting their AR and VR capabilities with the new NVIDIA CloudXR developments, enhanced to bring more flexibility and scalability for XR deployments.

The latest release augurs well for developers looking to improve the customer experience while using their apps whether on the cloud, through 5G Mobile Edge Computing, or corporate networks.

In CloudXR 4.0, new APIs allow flexibility in developing client apps and in using various distribution points to deliver XR experiences. Scalability across platforms is another plus, as broader options for the CloudXR interface are likewise made available. The new NVIDIA CloudXR also lets developers create custom user interfaces through the Unity plug-in architecture.

Among the benefits that developers can enjoy with the new NVIDIA CloudXR 4.0 are:

  • No Need for OpenVR or OpenXR Runtime – CloudXR Server API lets developers build CloudXR directly into their applications, although OpenVR API via the SteamVR runtime continues to be fully supported by the new version.
  • More Deployment Options With the Use of the Unity Plug-in – Developers can build on the Unity engine and create a full-featured CloudXR Client using Unity APIs.

NVIDIA CloudXR 4.0 - Unity Plug-in

  • Reduced Lag and Delay Through L4S Technology – The new NVIDIA CloudXR release reduces lag in interactive cloud-based video streaming via a convenient “togglable” implementation of L4S, an advanced 5G packet-delivery optimization.

More Immersive Experiences With the New NVIDIA CloudXR Developments

The new NVIDIA CloudXR developments now make it possible to provide more immersive high-fidelity XR experiences to the users. Developers and businesses can offer high-performance XR streaming to their customers through the most accessible platforms and devices. They can now customize their applications to give the kind of XR experiences their customers are looking for.

“At VMware we’re using NVIDIA CloudXR to enable our customers to stream high-fidelity XR experiences from platforms, like VMware Horizon, to standalone VR devices running VMware Workspace ONE XR Hub,” said VMware Director of Product Management, Matt Coppinger, in a press release shared with ARPost. “This gives our customers the power of a graphics workstation along with the mobility of a standalone VR device.” 

With CloudXR 4.0, developers are able to improve integrations and consequently the overall performance of their apps and solutions.

NVIDIA also revealed strategic partnerships with tech companies like Ericsson and Deutsche Telekom to ensure that integrations, specifically of the L4S, into the new NVIDIA CloudXR developments, are implemented seamlessly.

The availability of high-bandwidth, low-latency networks for optimal streaming performance is also assured through these alliances. Head of Deutsche Telekom’s Edge Computing, Dominik Schnieders, reiterates that the company believes CloudXR with L4S optimization is a critical component of streaming XR for both enterprises and consumers on the public 5G network.

Most Requested Features on the New NVIDIA CloudXR Developments

The new version of CloudXR brings together more of the most requested features, among them generic controller support, callback-based logging, and flexible stream creation. This demonstrates great responsiveness to the needs of the XR developer community and is perceived as a significant improvement in the distribution of enterprise XR software.


NVIDIA and Autodesk Bring Collaborative XR Experiences to the Cloud


XR technology is evolving quickly. Today, millions of people use AR and VR as they go through their daily lives. While many of the popular use cases of AR and VR are still in the realm of gaming and entertainment, other industries are finding practical use cases unique to their sectors.

Developments in extended reality are expanding from innovating hardware to elaborating experiences through advanced technologies and accessible systems. In October, tech giants NVIDIA and Autodesk announced the official launch of NVIDIA CloudXR and Autodesk VRED on Amazon Web Services (AWS), a cloud computing platform for users to run their choice of applications and software.

The joint NVIDIA-Autodesk release is available as a “Quick Start” deployment on AWS, giving virtually any user access to Autodesk VRED on the powerful NVIDIA CloudXR infrastructure. A collaborative environment for designing and manipulating high-fidelity immersive XR experiences in the cloud speeds up the design workflows of industry professionals. In addition, it makes extended reality environments more accessible, accelerating the adoption of XR technologies.

Bolstering the Future of Accessible XR Technologies

The world’s first virtual reality (VR) machine was built in 1956 (and patented in 1961)—the Sensorama was a movie booth incorporating 3D, audio, and video with a vibrating seat for an immersive viewing experience.

Inspired by the Sensorama came the development of the world’s first VR headset in 1961. The Headsight headset was built for military operations, complete with motion tracking technology. By 1968, the world witnessed the creation of the first augmented reality headset. Invented by Ivan Sutherland, a Harvard professor, the Sword of Damocles set the blueprint for generating present-day immersive AR experiences.

The long and exciting evolution of XR has yet to reach its turning point: becoming accessible for mainstream use. The general public has yet to have firsthand experience of using extended reality technologies.

The World Economic Forum states that user experience is pivotal to the mainstream success of many technologies, including XR and the metaverse. For now, the target demographic is strongly engaged in 2D platforms, sharing and communicating content in 2D format. Web3 developers have yet to devise a solution for users to relay their immersive experiences to one another.

The Significance of Collaboration for Globally Immersive XR Experiences

The joint decision of NVIDIA and Autodesk to launch their technologies as a “Quick Start” option on AWS is a step forward toward closing the gap between extended reality technologies and mainstream use. Users can now execute NVIDIA CloudXR and Autodesk VRED to create high-quality and immersive XR experiences, anytime, anywhere.

NVIDIA Autodesk VRED on AWS

Autodesk VRED is a 3D visualization solution that professionals in the architecture, engineering, and construction (AEC) industries are familiar with. VRED users design dynamic presentations and interactive environments with real-time 3D assets.

NVIDIA CloudXR is based on NVIDIA RTX technology, delivering seamless streaming of extended reality experiences across various networks—on the cloud, from data centers, or mobile data networks.

Anyone can easily access these technologies via AWS Quick Start. VRED users can maximize designing and streaming immersive XR experiences with the support of NVIDIA CloudXR with dedicated NVIDIA RTX graphic cards and virtual workstation platforms.

Transformative Partnerships to Scale XR Across Industries

The collaborative effort between Autodesk and NVIDIA did not come out of the blue. In fact, NVIDIA has been sealing partnership deals with various tech and automotive firms to scale extended reality in industrial action.

For instance, NVIDIA collaborated with automaker BMW to showcase a digital twin of the brand’s car assembly system. This summer, NVIDIA and Autodesk also collaborated with Lenovo and Varjo to bring the Porsche Mission R to life with an AR and MR demo.

Germany-based infrastructure company Siemens engaged with NVIDIA to leverage extended reality technologies and the metaverse for the production and manufacturing industries. NVIDIA Omniverse enables digital twin design and simulation of workflows in factories.

Autodesk also collaborated with game developer Epic Games to streamline workflows and tools for AEC designers. In fact, XR headsets manufacturer Varjo worked with Autodesk VRED for AR/VR headset support and remote collaboration through its Reality Cloud platform.

The recent Autodesk University event welcomed industry professionals to discover more of the CloudXR Quick Start option. Featured courses were led by David Randle, the Global Head of GTM for Spatial Computing at AWS.
