No more ungainly break-out boxes to contend with; thankfully PSVR 2 connects to PlayStation 5 via a single USB-C cable. But if you think you’ll be able to plug that seemingly standard cable into a VR-ready PC to play Half-Life: Alyx like you might with standalones such as Quest 2, Vive XR Elite or Pico 4, you’ll be sorely disappointed. PSVR 2 won’t work as a PC VR headset, and according to the developer behind unofficial conversion software iVRy Driver, you shouldn’t buy one in the hope that it ever will.
Plug an original PSVR into a computer and the PC thinks it’s an additional monitor. That was the starting point back in late 2016 for many to begin cobbling together unofficial support for PC VR games. One such go-to staple for PSVR-to-PC conversion is iVRy Driver for SteamVR, an ongoing project created by indie studio Mediator Software.
But what about PSVR 2? In a Reddit thread discussing the topic, Mediator Software says you should save your cash if you want to buy a PSVR 2 specifically for PC VR gaming:
Quantum computing has immense potential, but also immense complexity. While zealots claim it will cure cancer and save the planet, critics warn those promises are far from being fulfilled.
One of their key challenges lies at the very heart of the field: quantum bits, or “qubits.” These information units are the quantum analog of binary bits in classical computers. To make quantum computers useful, the qubits have to be reliably controlled and manufactured at scale.
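In the standard textbook notation (a convention, not anything specific to a particular vendor’s hardware), a qubit’s state is a superposition of the two classical values:

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

so a measurement returns 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$. Controlling a qubit means steering $\alpha$ and $\beta$ precisely, at scale, without the environment disturbing them.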
It’s a requirement that still confounds the world’s leading computer scientists. The likes of IBM and Google have made impressive strides by building qubits into their quantum chips, which have to be kept at temperatures near absolute zero to preserve their delicate quantum states.
One issue with this approach is that it requires million-dollar refrigerators. Another is that just a single atom in the wrong place on the chip can cause computing mistakes.
Oxford Ionics, a startup based in the UK, takes a different approach. The company uses a proprietary technology called Electronic Qubit Control (EQC) to control the qubits. The system applies different voltages and currents to a traditional microchip, which create magnetic fields in the surrounding space.
The quantum bits in this system are individual atoms. In their natural state, these atoms don’t tend to stay still long enough to perform a computation. To stabilize them, one of their electrons is removed to make an ion. These ions carry an electrical charge, which enables the electromagnetic field to “trap” them less than a hair’s width above a chip.
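For context, the textbook version of this kind of trap (a radio-frequency, or Paul, trap; Oxford Ionics’ exact electrode and control design is proprietary, so treat this as the generic picture) applies a rapidly oscillating voltage to electrodes around the ion. Near the trap centre the potential looks like

$$\Phi(x, y, t) = \frac{V_0 \cos(\Omega t)}{2 r_0^2}\,\bigl(x^2 - y^2\bigr),$$

which pushes the ion back and forth in alternating directions. Averaged over the fast oscillation, an ion of charge $q$ and mass $m$ feels an effective confining pseudopotential

$$\psi(r) \approx \frac{q^2 V_0^2}{4\, m\, \Omega^2 r_0^4}\, r^2,$$

a harmonic well whose minimum is the point where the ion ends up hovering.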
Dr Chris Ballance, who co-founded Oxford Ionics in 2019, compares the effect to toys that use magnets to suspend objects in the air.
“This gives us the best of both worlds: we have a chip that can be made just like a normal computer processor and which can run at room temperature, and we have perfect qubits made from single ions hovering above the chip,” Ballance tells TNW. “Not building the qubits means we can’t build them wrong. Nature guarantees each individual atom is perfectly identical to any other.”
Ballance (right) and Tom Harty founded Oxford Ionics after earning PhDs in quantum computing from Oxford University. Credit: Oxford Ionics
Unlike other “trapped-ion” exponents, Oxford Ionics doesn’t rely on lasers to control qubits. According to Ballance, laser-controlled devices are effective for small systems, but extremely difficult to fabricate and integrate at chip scale. They also become error-prone as the size of the processor and the number of qubits grow.
In tests, the Oxford Ionics system has shown seemingly superior results. The technology currently holds a range of records for quantum computing performance, speed, and error rates, and Ballance’s research was also cited in the scientific release that accompanied this year’s Nobel Prize in Physics.
These achievements have caught the eyes of investors. Last week, Oxford Ionics announced that it had raised £30 million in Series A funding, which will be used to grow the team and bring the tech to market.
Ballance is now looking forward to solving real-world problems.
“Over the next few years, we are entering the discovery phase of quantum computing: up to now we have not had quantum computers that solve problems we can’t solve any other way — now we do!”
Ballance doesn’t expect to integrate Oxford Ionics’ tech into general-purpose chips. Instead, he envisions the company’s quantum chips running in parallel with classical semiconductors.
“Think GPUs alongside CPUs,” he says.
It will likely still take years for killer apps to emerge, but Oxford Ionics could push quantum computing closer to the mainstream.
In 2021, Microsoft won a United States Army defense contract worth up to $22 billion which would support the development of an Integrated Visual Augmentation System (IVAS), a tactical AR headset for soldiers based on HoloLens 2. Now Congress has rejected the Army’s request for $400 million to buy as many as 6,900 more of the AR combat goggles this year, a Bloomberg report maintains.
The rejection cites rocky tests conducted last year. Testing was done over a three-week period ending June 18th, during which the Army assessed Microsoft’s IVAS with a cadre of 70 infantry soldiers, who were tasked with using the device during three 72-hour combat scenarios.
Complaints included “mission-affecting physical impairments,” with more than 80 percent of soldiers experiencing headaches, eyestrain and nausea after less than three hours using the goggles.
Softening the blow somewhat, lawmakers have earmarked $40 million to develop a new IVAS model, Army spokesman David Patterson said in an email obtained by Bloomberg.
This comes only a few weeks after the Army awarded a $125 million “task order” for the development of a new model, dubbed version 1.2, which is said to include software improvements for better reliability and reduced power demand.
The 1.2 version task order is said to provide “improvements based on completed test events” which aim at developing a “lower profile Heads-Up Display with distributed counterweight for improved user interface and comfort.”
In the meantime, the Army will be using its first batch of 5,000 goggles for training—only a small fraction of the max 121,000 devices, spares and support services stipulated in the $22 billion deal.
Ioanna is a writer at SHIFT. She likes the transition from old to modern, and she’s all about shifting perspectives.
Finland clocked a 75% increase in wind power capacity last year, boosting the country’s renewable energy cred.
According to the latest statistics from the Finnish Wind Power Association (FWPA), 2022 was a record year for green power. Specifically, 437 new wind turbines were put into operation, adding 2,430MW of capacity. What’s more, wind power covered 14.1% of the country’s electricity consumption, up from 9.3% in 2021, a year in which 141 turbines were installed.
As a result, Finland now has a total of 1,393 wind turbines with a combined capacity of 5,677MW, nearly 43% of which was added in 2022 alone. Some 47% of the total wind power is domestically owned, and the majority of turbines have a capacity of between 3MW and 4.99MW.
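The figures hang together arithmetically; here’s a quick check using only the numbers quoted above:

```python
# Sanity check of the Finnish Wind Power Association figures cited above.
added_mw = 2_430            # capacity added in 2022
total_mw = 5_677            # total capacity at the end of 2022

previous_mw = total_mw - added_mw          # implied end-2021 capacity: 3,247MW
growth = added_mw / previous_mw            # year-on-year growth in capacity
share_of_total = added_mw / total_mw       # 2022 additions as a share of the new total

print(f"Capacity growth in 2022: {growth:.0%}")                 # ~75%, the headline increase
print(f"Share of capacity added in 2022: {share_of_total:.0%}") # ~43%
```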
Notably, the projects completed last year brought over €2.9 billion worth of investments into the country. This makes wind power one of the most funded sectors in the Nordic nation.
“No other industry currently brings as many annual investment euros to Finland as wind power. Wind power also brings vitality to many small municipalities, where investment targets may otherwise be few,” Anni Mikkonen, FWPA’s CEO, noted.
“In addition to investments, wind power is now increasing our country’s energy self-sufficiency at a really good pace — just when new and affordable electricity production is most needed. No other electricity generation can be built in Finland as quickly and as cost-effectively right now,” she added.
According to FWPA, the future of Finnish wind energy is looking brighter and brighter. Approximately 1,000MW of power capacity will be completed this year, over 1,200MW in 2024, and around 1,000MW in 2025 — when wind power is projected to cover at least 28% of Finland’s electricity consumption.
If this pace is kept up, the country will not only strengthen its energy self-sufficiency, but also increase its competitive advantage in the industry — in effect attracting more capital to its wind projects and promoting local companies active in the field.
MeetKai has been around since 2018, but some of its first publicly enjoyable content hit the streets only a few months ago. Now, the company is releasing a suite of software solutions and developer tools to help the rest of us build the metaverse.
“The purpose of the Times Square activation and campaign was really to test things out in the browser,” CEO and co-founder, James Kaplan, said in a video call. “With 3D spaces, there’s a question of whether the user views it as a game, or as something else.”
MeetKai Metaverse Editor – Los Angeles Chargers
Those insights have informed their subsequent outward-facing work with the Chargers, but the company has also been working on some more behind-the-scenes products that were just released at CES.
“We’re moving from an innovation technology company to a product company,” co-founder and Executive Chairwoman, Weili Dai, said in the call. “Technology innovation is great, but show me the value for the end user. That’s where MeetKai is.”
Build the Metaverse With MeetKai
At CES, MeetKai announced three new product offerings: MeetKai Cloud AI, MeetKai Reality, and MeetKai Metaverse Editor. The first is more in line with the company’s history as a conversational AI service provider. The latter two are tools for creating digital twins and for building and editing virtual spaces, respectively.
“The biggest request that we get from people is that they want to build their own stuff, they don’t just want to see the stuff that we made,” said Kaplan. “So, we’ve been trying to say ‘how do we let people build things?’ even when they’re not engineers or artists.”
The new tools can be used individually to create internal or outward-facing projects. For example, a user could choose to create an exact digital twin of a physical environment with MeetKai Reality, or create an entirely new virtual space with MeetKai Editor.
However, some of the most interesting projects come when the tools are used together. One example is an agricultural organization with early access to the products, which used the two tools to create a digital twin of real areas of its premises and then used the Editor for simulation and training use cases.
“AI as an Enabling Tool”
The formula for creating usable yet robust tools was to combine conventional building tools like scanning and game engines with some help from artificial intelligence. Seen that way, these products look less like a deviation from the company’s history and more like a continuation of what it has been doing all along.
MeetKai Cloud AI – Avatar sample
“We see AI as an enabling tool. That was our premise from the beginning,” said Kaplan. “If you start a project and then add AI, it’s always going to be worse than if you say, ‘What kinds of AI do we have or what kinds of AI can we build?’ and see what kind of products can follow that.”
So the first hurdle is building the tools and the second hurdle is making the tools usable. Most companies in the space either build tools which remain forever overly complex, or they make tools that work but have limited potential because they were only designed for one specific use or for use within one specific environment.
“The core technology is AI and the capability needs to be presented in the most friendly way, and that’s what we do,” said Weili. “The AI capability, the technology, the innovation has to be leading.”
The company’s approach to software isn’t the only way they stand out. They also have a somewhat conservative approach when it comes to the hardware that they build for.
“I think 2025 is going to be the year that a lot of this hardware is going to start to level up. … Once the hardware is available, you have to let people build from day one,” said Kaplan. “Right now a lot of what’s coming out, even from these big companies, looks really silly because they’re assuming that the hardware isn’t going to improve.”
A More Mature Vision of the Metaverse
This duo has a lot to say about the competition. But, fortunately for the rest of us, it isn’t all bad. As they’ve made their way around CES, they’ve made one more observation that might be a nice closing note for this article. It has to do with how companies are approaching “the M-word.”
“Last CES, we saw a lot of things about the metaverse and I think that this year we’re really excited because a lot of the really bad ideas about the metaverse have collapsed,” said Kaplan. “Now, the focus is what brings value to the user as opposed to what brings value to some opaque idea of a conceptual user.”
Kaplan sees our augmented reality future as a mountain range: the path doesn’t just go straight up. We reach apparent summits only to encounter steep valleys between us and the next peak. Where most companies climb one peak at a time, Kaplan and Weili are trying to plan a road across the whole chain, which means designing “in parallel.”
“The moment hardware is ready, we’re going to leapfrog … we prepare MeetKai for the long run,” said Weili. “We have partners working with us. This isn’t just a technology demonstration.”
How MeetKai Climbs the Mountain
This team’s journey along that mountain road might be more apparent than we realize. After all, when we last talked to them and “metaverse” was the word on everyone’s lips, they appeared with a ready-made solution. Now as AI developer tools are the hot thing, here they come with a ready-made solution. Wherever we go next, it’s likely MeetKai will have been there first.
Concrete has been described as the most destructive material on Earth. After water, it’s the most used substance in the world, with twice the usage of steel, wood, plastics, and aluminium combined. Producing the cement that binds it is widely estimated to account for around 8% of global CO2 emissions.
Cement makers urgently need to reduce this footprint. To meet the requirements of the Paris Agreement on climate change, the industry needs to cut emissions by at least 16% by 2030. At the same time, the sector faces growing demand from rapid urbanization and population growth.
It’s a foreboding problem. But engineers believe that graphene offers a solution.
First isolated at the University of Manchester in 2004, graphene is a 2D material that offers a unique combination of strength, flexibility, lightness, and conductivity. These properties caught the eye of Nationwide Engineering, a British construction business.
The firm’s memorably-acronymed R&D subsidiary, NERD (Nationwide Engineering Research and Development), was tasked with turning the “wonder material” into a new additive: Concretene.
The substance has already formed floor slabs in the UK. Credit: Concretene
Concretene consists of graphene that’s produced at Manchester University. Small quantities of the liquid formulation are added during the concrete mixing process.
The graphene provides both mechanical support and an active surface for the chemical reactions that occur during the cement hydration and hardening.
“Very low dosages of the material, in some cases less than 0.01%, are required to deliver substantial performance gains,” Alex McDermott, the co-founder of Concretene, tells TNW.
“This means that Concretene is commercially viable, with wholesale costs in line with existing additives already used in the concrete industry.”
EA’s Codemasters is getting ready to launch its first Quest-native title next week, a VR version of popular arcade racing sim GRID Legends.
GRID Legends for Quest is set to launch on January 12th, which is quite the surprise since we haven’t heard anything of it before finding the listing on the Quest Store. The game is currently available for pre-order, priced regularly at $30, but on pre-order sale for $27.
The game is said to include the full single-player story mode, a race creator for designing custom races, and online multiplayer, which unfortunately doesn’t include cross-play with flatscreen versions of the game.
The only info available for now comes from the store listing, which includes a trailer and a few images showing off some gameplay stills and the garage.
From the listing, it’s clear GRID Legends is going to be a hefty download at an expected 31.1GB, so make sure to have plenty of storage room available.
This isn’t EA’s or Codemasters’ first VR title, although it is their first Quest-native game, collectively speaking. The studios launched PC VR support for F1 2022, bringing the beloved racing sim franchise to VR for the first time. EA’s Motive Studios also launched the well-received Star Wars: Squadrons arcade dogfighter to SteamVR and PSVR in October 2020.
EA’s Respawn Entertainment holds the honor of the parent company’s first Quest-native title with Medal of Honor: Above and Beyond, which launched on Quest 2 and SteamVR headsets in December 2020.
Critically acclaimed VR survival game Song in the Smoke (2021) is getting a full overhaul when it launches on PSVR 2, something developer 17-BIT says will rival the visual quality of the game’s PC VR version.
Song in the Smoke launched on Quest, PC VR and PSVR last October—just in time to win our PSVR Game of the Year in 2021. We were impressed by its well-studied crafting depth, expressive art style, and harrowing encounters with the primeval world’s mosh of beastly creatures.
And although it wasn’t a given, since PSVR 2 won’t be backwards compatible with original PSVR titles, developer 17-BIT says Song in the Smoke is definitely headed to the upcoming headset, and with more visual panache.
“It wasn’t a light upgrade – it was a ton of work and up-rezzing of so many visual systems,” creative director Jake Kazdal tells Edge in its 380th edition. “It stands alone, even compared to the highest-end version possible on PC VR – it’s honestly not even close.”
The studio says it wants to bring the PSVR 2 version of the game to players as a free upgrade to the original game on PSVR, although Kazdal admits they “still need to figure out the logistics of that.”
Dubbed Song in the Smoke: Rekindled, the game is said to be “more than a remaster,” as it’s set to feature what the studio says in a recent tweet “tons of polish and additional features driven by user requests and feedback,” calling it an “ultimate edition.”
The studio says we’re due to learn more soon. It’s not clear whether Song in the Smoke: Rekindled is destined for launch-day status for the PSVR 2 when it hits shelves February 22nd, 2023. We’ll be keeping our eye on 17-BIT in the meantime, and also adding it to our growing list of all PSVR 2 games announced to date.
Among Us has been a hit game for a while now. Among Us VR is a more recent phenomenon. Get your tasklist ready, memorize the map, warm up your button-smacking hand, and trust no one as we pilot the Skeld II through a trial run.
A Brief History of Among Us VR
Game studio Innersloth released the original Among Us in 2018 as a free app game with some optional paid customization features. The immensely social game sees a team of travelers piloting a ship (the Skeld II) through space only to find that some among them are “Impostors” who sabotage the ship and kill crewmates.
Playing as an Impostor, players try to blend in with the crewmates while destroying the ship’s functions and/or murdering enough of the real team to take over. Playing as a crewmate, players need to keep the ship flying and stay alive long enough to determine which of them are Impostors.
Among Us VR is no simple port (though unofficial attempts in the form of amateur mods on social VR platforms have existed for some time). To make the game immersive, Innersloth partnered with XR game studio Schell Games, known for titles like Until You Fall, and the I Expect You to Die series.
Announced at Meta Connect and launched the following month, Among Us VR is currently available for $10 on SteamVR and the Quest Store. For this review, I played on my Quest 2.
Among Us VR is meant for players 13 and up. Violence is cartoony but graphic and inescapable. The title is also necessarily social so the effort at protecting young players is nice, even though it doesn’t work at all ever. The gameplay is complex when executed correctly by mature players – and equally complex when operated in ways a mature player doesn’t expect.
Navigating Menus, the Tutorial, and Online Gameplay
When first booting up Among Us VR, players are prompted to enter a birth date. You can lock this in on your headset, or you can choose to require a birth-date entry on each boot. That means that to play the game you have to be at least old enough to know how to lie about your birthday. The title is intended for players 13 and up. It is played by children 6 and up.
The main menu is simple. Your standard settings options are there, as are your customization options. Change the color of your crewmate and trick it out with little hats. Some hats are free, and some hat packs are available for purchase. (Yes, I do have the tiny crewmate hat only available to people who pre-ordered the game.)
The two largest buttons that dominate the main menu are to play online and learn how to play. The learn-how-to-play option is an offline tutorial that takes you through several aspects of gameplay without other users running around murdering you.
Learning the Ropes
Because the tutorial is representative of so many aspects of gameplay, and to respect the privacy of online players, all of the screenshots in this review were taken in the offline tutorial or provided by Schell Games.
The tutorial takes you through life as a crewmate, solving tasks, pushing buttons, reporting bodies, and getting murdered. Then, you experience the afterlife (dead crewmates can’t vote, communicate with the living, or report bodies, but they can still complete tasks). You also get to play as the Impostor, climbing through vents, sabotaging the ship, and killing crewmates.
Unfortunately, the tutorial is limited to two rooms on a fairly large map. It also doesn’t include all of the tasks that you’ll need to complete when playing a full game. However, it’s still a nice introduction.
The controls are smooth. All of the tasks could theoretically be hand-tracked, but movement is controlled with the controllers, so they’re a must-have. There’s also a button to bring up the ship map and do some other basic commands. The controller layout isn’t overly complex or challenging, and all major controls are spelled out in the tutorial and are changeable in settings.
Movement is smooth, and your view goes into a sort of tunnel vision while you’re moving to prevent motion sickness. If you’ve read my reviews before, you know I can get motion sickness pretty bad pretty quick, but I find Among Us VR to be pretty comfortable. Also, because everything is controller-based, you can play sitting down.
Taking It Online
There are two main options for playing Among Us VR online, one for smaller and shorter games, and one for longer and more populated matches. A shorter game might only have five players including one Impostor, while the longer games have more crewmates and more Impostors. Other than that, the gameplay is the same.
There’s no formal breakdown of how a game plays out in terms of round length or anything like that. But, there is a sort of structure. Here’s how it plays out, as I understand it:
The Impostors can murder one crewmate and sabotage one ship component per round. A round culminates in an “emergency meeting” called when a body is discovered. All of the players converge on the cafeteria to try to decide who the Impostors are, followed by a round of voting, during which the players vote out one player – who may or may not be the Impostor.
There are a few gameplay elements that make things a little trickier. For example, Impostors can still fix sabotages and report bodies. This helps them make it look like they’re really part of the team. Further, fixing sabotages usually requires standing still and facing a wall for a few seconds – a prime opportunity to get murdered by an Impostor.
Now, About My Crewmates…
The first time that I played Among Us VR, I was definitely the oldest person on deck by probably twenty years. I’m no autumn rooster, but I was definitely surrounded by spring chicks.
When this eventually became apparent, I became an immediate subject of suspicion. I felt a bit like Robin Williams in Hook when the Lost Boys rally against the only adult on their island in Neverland. I managed to survive the game, but only to see the Impostor take the ship. I wonder if this dynamic didn’t make things more interesting.
One crewmate shouted so loudly and so consistently that he knew who the Impostor was during the first round of voting that the rest of us all thought he was casting suspicion off of himself. We voted him off immediately only to find at the end of the game that he had been telling the truth.
I’ve been writing about VR since this particular Impostor was eating dirt in daycare. But Among Us doesn’t care. That’s part of the beauty of the game. I chose not to trust my crewmate. Sure he was young, and sure he got a crewmate to change color in the lobby because “nobody likes purple” but – when push came to shove – I underestimated him and it cost us the ship.
If you would rather play Among Us VR with adults, I have a sneaking suspicion that younger players favor shorter matches. I’m sure that the time of day that you play makes a big difference too. But, we’ve already seen how well I understand children.
Fun for (Almost) Any Age
All things considered, Among Us VR is great fun at a great price. So what, there are grade schoolers online? The game is VR, but it’s also a game with simple mechanics built on a social framework. Maybe in an update developers should acknowledge the “age problem” and have separate lobbies for different ages. In the meantime, grow up and play your little video game.
OVER’s Map2Earn Beta program makes creating 3D world maps more accessible to users with smartphones. Using the OVER app, they can now take photos of any physical location and generate an OVRMap. As a result, they can take part in OVER’s global mapping initiative, while also getting the opportunity to earn rewards.
Paving the Way for Richer AR Experiences
According to OVER, Map2Earn is the company’s “biggest project yet.” It introduces more accurate data collection and takes geolocalization capabilities to the next level.
The accuracy of GPS systems in outdoor spaces is limited to about 6 meters. However, with Map2Earn Beta, OVER has increased the localization accuracy to 20 centimeters.
This broadens the horizon for newer and more immersive AR experiences and use cases. For instance, users can have superimposed AR experiences on existing real-world structures or accurate geolocation of assets in indoor and multi-floor settings.
“When we think about the future of AR, we imagine an augmented world with contextualized 3D experiences that seamlessly merge with the physical world,” said Diego Di Tommaso, COO and co-founder of OVER, in a press release shared with ARPost. “In order to achieve this, we need a system to precisely locate the observer in space – we need geolocalization accuracy. That’s why we built Map2Earn, a system to precisely relocate you in space using computer vision that goes far beyond what can be achieved with GPS only. Such a system will finally enable the creation of the AR use cases that, as of right now, we can only just dream about.”
Its Alpha testing phase, which involved more than 1,200 maps created by 300 early users, was a success. Now, the Beta version is accessible to all users via the OVER app (available for both iOS and Android).
How the Map2Earn Beta Program Works
The Map2Earn Beta program is backed by an intuitive UX that guides users through the capture process. Users, or – as OVER calls them – the mappers, will film each OVRLand location to generate three assets (see the sketch after this list):
The location’s 3D point cloud, which delivers a precise visual reference of the location the user wants to augment through AR;
Relocalization algorithms, which use the point cloud to locate the observer and create a more immersive AR experience;
A NeRF (neural radiance fields)-generated digital twin of the mapped area, which creates a simulation of the mapped location.
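OVER hasn’t published how these pieces fit together internally, but conceptually the three assets form a simple pipeline. The sketch below is purely illustrative Python; none of the function names are OVER’s actual API or algorithms.

```python
# Illustrative-only sketch of the Map2Earn asset pipeline described above.
# These are hypothetical placeholders, not OVER's real API.

def build_point_cloud(frames):
    """Photogrammetry-style step: recover a 3D point cloud from overlapping frames."""
    ...

def build_relocalizer(point_cloud):
    """Return something that estimates a camera pose by matching a new image
    against the point cloud (the sub-20cm localization step)."""
    ...

def train_nerf(frames):
    """Fit a neural radiance field to the frames, yielding a viewable digital twin."""
    ...

def map_location(frames):
    point_cloud = build_point_cloud(frames)       # asset 1: 3D point cloud
    relocalizer = build_relocalizer(point_cloud)  # asset 2: relocalization
    digital_twin = train_nerf(frames)             # asset 3: NeRF digital twin
    return point_cloud, relocalizer, digital_twin
```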
The user-creator will own the 3D structure of the locations that they’ve mapped using their smartphone. As of this writing, users can view the digital twin via a virtual drone fly-through. However, OVER will be working toward making these locations freely explorable.
A New Way to Explore the Metaverse
With the Map2Earn Beta program, OVER and its users can build up-to-date, Web3-based 3D maps of significant locations. Also, the scope of the program covers both indoor and outdoor spaces.
“OVER’s vision is to create a Web3-based, community-owned, up-to-date 3D map of the most important indoor and outdoor locations in the world – the likes of which we have not seen before,” said Di Tommaso. “OVRMaps have a fundamental importance for AR. They are the portal to the AR metaverse, without which there is no way to reliably and coherently augment the physical world.”
The Map2Earn production release is scheduled for late January, when users will be able to mint their 3D maps as NFTs. Then, they can trade these assets via the OVER marketplace, as well as other decentralized marketplaces, such as OpenSea. This is one of the ways through which users can earn.
In the future, OVER will be launching a direct incentivization program. Through this, users will be able to access the so-called open-to-buy orders and acquire the maps of their desired locations.
To access the Beta program, download the OVER app on Google Play or the App Store and follow the directions, which you can find under “Map2Earn”.
Earlier this month Valve changed the longstanding format for displaying which VR headsets are supported on a game’s Steam Store page. The company says the change was made to ‘keep up with the growing VR market’.
Earlier this month some folks were alarmed to see that the ‘VR Support’ section on the right side of a game’s Steam store page—which showed the headsets and playspaces a game supported—had been removed, seemingly leaving only ‘Tracked Motion Controller Support’ to indicate that an app supported VR.
As Valve tells Road to VR, however, the information was not removed but merely reorganized and streamlined—and it seems it may have taken a bit for the changes to correctly proliferate across store pages.
“We decided to organize things a bit differently, as we found the old system wasn’t keeping up very well with the growing VR market,” a Valve spokesperson tells us. “You can now find this info in System Requirements. We also added flags for VR Only, VR Supported, and tracked motion controllers to the Features section. The changes are also aimed at giving developers more control and flexibility.”
So now instead of a game listing all supported headsets and/or VR platforms on the right side of the page, developers can choose to show ‘VR Only’ or ‘VR Supported’. Meanwhile, further down in the System Requirements section, developers can additionally specify which headsets or playspaces are supported under the ‘VR Support’ prefix.
Looking at several examples shows how this works in practice.
Half-Life: Alyx, for instance, lists ‘VR Only’ and ‘Tracked Controller Support’ on the right side of the page (and still prominently includes a notice that the game requires a VR headset). In its System Requirements we see ‘VR Support: SteamVR’, indicating that the game affirms support for all SteamVR headsets.
Dirt Rally 2 uses ‘VR Supported’ on the right side of the page, and under System Requirements we see ‘VR Support: SteamVR or Oculus PC’ (indicating that the game supports both the SteamVR and native Oculus PC runtimes). Notably the game does not list ‘Tracked Controller Support’ on the right side, meaning players cannot use VR controllers with the game but must use another input like keyboard or traditional controller instead.
While we don’t have any inside knowledge as to exactly why Valve decided to change this longstanding system, the reasons they gave do make sense from the outside. The previous system confusingly listed some specific headsets (ie: ‘Valve Index’, ‘Oculus Rift’ and ‘HTC Vive’) lumped right alongside a whole platform of headsets (ie: ‘Windows Mixed Reality’)—while ignoring more modern headsets like those from Pico or Pimax. Making this change streamlines things for Valve who would otherwise have to track and add all new SteamVR headsets as they come to market.
And further, the distinction between ‘Standing’ and ‘Room-scale’ playspace sizes has become much less important over the years; very few games require a room-scale space, even though most technically support it. That left the previous ‘Play Area’ section of the store page as something of a needless remnant (except for games that only support ‘Seated’ play).
That said, there’s no doubt the change feels like it’s coming out of nowhere. And with Valve’s minimal apparent interest in VR in the last few years, it raises questions as to ‘why now?’
In the spring of 2010, physicist Jari Kinaret received an email from the European Commission. The EU’s executive arm was seeking pitches from scientists for ambitious new megaprojects. Known as flagships, the initiatives would focus on innovations that could transform Europe’s scientific and industrial landscape.
“I was not very impressed,” the 60-year-old tells TNW. “I thought they could find better ideas.”
As it happened, Kinaret had an idea of his own: growing graphene. He decided to submit the topic for consideration.
That proposal laid the foundation for the Graphene Flagship: the largest-ever European research program. Launched in 2013 with a €1 billion budget, the project aimed to bring the “wonder material” into the mainstream within 10 years.
On the eve of that deadline, TNW spoke to Kinaret about the project’s progress over the past decade — and his hopes for the next one.
Graphene arrives in Europe
Scientists have pursued the single sheet of carbon atoms that constitute graphene since 1859, but its existence wasn’t confirmed until 2004. The big breakthrough was sparked by a strikingly simple product: sticky tape.
Andre Geim and Konstantin Novoselov, two physicists at the University of Manchester, would regularly hold “Friday night experiments,” where they’d explore outlandish ideas. At one such session, adhesive tape was used to extract tiny flakes from a lump of graphite. After repeatedly separating the thinnest fragments, they created flakes that were just one atom thick.
The researchers had isolated graphene — the first two-dimensional material ever discovered.
The researchers donated graphite, tape, and a graphene transistor to the Nobel Museum. Credit: Gabriel Hildebrand
The science world was abuzz with excitement. Graphene was the thinnest known material in the universe, the strongest ever measured, more pliable than rubber, and more conductive than copper.
In 2010, Geim and Novoselov won a Nobel Prize for their discovery. The award committee envisioned endless applications: touch screens, light panels, solar cells, satellites, meteorology, and, err, virtually invisible hammocks for cats.
The hypothetical hammock would weigh just 0.77 mg and support a 4 kg cat. Credit: Royal Swedish Academy of Sciences.
Kinaret recognized the potential. Three years later, he was heading an EU drive to take graphene from the lab to the market.
Hype versus reality
Commercializing graphene was never going to be straightforward. Studies suggest that innovations typically take between five and seven decades to evolve from inventions to products with significant market shares. Evolution would be slow — but observers were already impatient.
As the Flagship’s director, Kinaret had to manage such starry-eyed expectations. At talks, he’d frequently refer to the Gartner hype cycle, a depiction of how new technologies evolve.
The timeline starts with a breakthrough that sparks media excitement. In graphene’s case, reporters were soon claiming the material was set to replace silicon.
“Graphene cannot replace silicon,” says Kinaret. “Graphene is a semi-metal; it’s not a semiconductor.”
When reality fails to meet such inflated expectations, interest wanes and investment shrinks. Gartner describes this stage as the “trough of disillusionment.” Graphene appears to have exited this perilous period, partly thanks to the EU’s long-term commitment.
The backers that remain tend to be more practical and persistent. Now, their target is mainstream adoption.
Initially, many commercial partners were frugal with their investments. One very large European company had a budget of only €20,000 for 30 months — “just enough to buy coffee for the people working on it, but not really enough to do anything substantial,” Kinaret recalls.
To increase their involvement, the Flagship needed their trust, which was challenging as rival brands would have to work together. Nokia, for instance, would have to collaborate with Ericsson.
“One dimension of trust that people needed was to trust this is for real,” says Kinaret. “The other is that participants needed to trust each other.”
The Flagship’s current membership suggests that trust has now been secured. The proportion of companies has grown from 15% to roughly 50% today. The other half are either research organizations or universities.
Kinaret describes the growth of industrial engagement as the Flagship’s key development.
“That’s something that we promised, and it’s something we have delivered,” he says.
From lab to fab
Around 100 products have emerged from the Graphene Flagship. The vast majority are business-to-business technologies, such as thermal coating for racing cars and eco-friendly packaging for electronic devices. Consumer products have been slower to commercialize.
Kinaret spotlights a few of his favorites. One is Qurv, a Spanish spinoff that makes graphene-based sensors, which cars can use to detect pedestrians in fog and rain.
“There are detectors today that do the same thing, but they can cost about $500 each,” says Kinaret. “The graphene detectors could cost about $1 each. That would be a total game changer in that business.”
Qurv’s wide-spectrum image sensors could enhance computer vision. Credit: The Graphene Flagship
Another highlight for Kinaret is Inbrain Neuroelectronics. The startup is developing graphene-based implants that can monitor brain signals and treat neurological disorders.
The devices could eventually stimulate the brain to control tremors caused by Parkinson’s disease. Traditional electrodes can achieve this, but they’re far stiffer than highly-flexible graphene.
“The brain is like a lump of jelly — it keeps moving around,” says Kinaret. “If you put a stiff electrode there, it results in scar formation.”
Kinaret is also excited about the prospects for fundamental science. In 2018, Graphene Flagship partners revealed that over 2,000 materials can exist in a 2D form. Not all of them are stable, but a number of them are the focus of active research.
Some researchers are exploring what can be achieved by stacking the substances in multi-layers.
“You can grow them so there is a very specific twist angle between the different layers, which means they’re slightly misaligned. This misalignment angle is a very important new parameter,” says Kinaret.
“By tuning this misalignment angle, you can make materials that are superconducting and that have very interesting optical properties. This has only been explored for roughly four years, in terms of basic research, and it’s still quite far from applications. But it offers interesting possibilities for the future.”
Mission accomplished?
Kinaret is proud of the Flagship’s achievements. He believes the initiative has surpassed its targets by significant margins.
The data appears to support his claims. The European Commission aims to turn every €10 million that’s invested into one patent application. The Flagship, says Kinaret, has beaten that requirement more than tenfold. The targets for scientific publications, he adds, have been surpassed by a similar factor.
Kinaret’s research targets potential applications. Credit: Graphene Flagship
There are still challenges to overcome. In electronics, for instance, high-quality graphene has to be transferred from the substrate on which it’s grown and onto the system where it’s used. The Flagship can do that well manually, but automating the process on an industrial scale has proven more difficult.
Nonetheless, Kinaret reminds the team they should remain positive.
“Engineers are typically short-term optimists and long-term pessimists,” he says. “They expect progress to be much faster initially than it turns out to be, but in the end, they underestimate the impacts of new technologies.”
In the future, Kinaret expects Europe to become a graphene powerhouse. The Flagship has given the continent a head start over the US in the race toward the mainstream.
He admits, however, that laypeople still ask what graphene is and can do.
“If we get to a situation where a surprised ‘what?’ has been replaced by ‘so what?’ because it’s become ubiquitous or mainstream… then we’ll have made it.”