The next show from friends at VR Games Showcase is here, with more than 30 minutes of trailers and info on upcoming VR games and updates for Quest, PSVR 2, and PC VR. Catch the action right here at 12PM ET.
Watch the full VR Games Showcase stream below, starting at 12PM ET:
VR Games Showcase is presented and produced by VR industry veterans Jamie Feltham and Zeena Al-Obaidi.
The Spring 2025 show promises the following:
Major Reveals And Updates: Get an exclusive glimpse of Hitman: World of Assassination on PS VR2, see the latest from Flat2VR Studios, and much more
Studios Big And Small: Learn what’s next from some of VR’s most beloved veteran studios, and meet newcomers working on innovative titles for everyone
Updates For Social Experiences: Catch up with existing multiplayer games and experiences with new updates and surprise drops
Meta will host Niantic, Creature, and Resolution to “showcase real-world examples of developers who are already leveraging our latest Passthrough Camera API” at GDC next week.
Join our experts as they share their hands-on experience and insights from developing and launching Spaceship Home, a groundbreaking Mixed Reality experience on Meta Quest. You’ll gain valuable guidance on designing immersive MR experiences that captivate and inspire.
Next, we’ll give you an exclusive look at some of the exciting upcoming roadmap items that will revolutionize MR development. Discover how these new features will make it easier to create stunningly realistic and engaging MR experiences.
Finally, we’ll showcase real-world examples of developers who are already leveraging our latest Passthrough Camera API to take their apps to the next level. See how they’re using this powerful tool to create seamless and interactive MR experiences that blur the lines between reality and fantasy.
Don’t miss this opportunity to learn from the experts and stay ahead of the curve in MR development. Join us for an informative and inspiring session that will help you unlock the full potential of Mixed Reality!
Takeaway: Attendees will gain insights from developers that have already deployed MR on how to make their games more playable and engaging across a wide variety of rooms. They will also be introduced to the new Passthrough Camera API, which supports deploying real-time, customized computer vision in games.
The summit listing comes a week after a Meta support page went live describing a “headset camera” permission which “allows an app to access the real-time passthrough camera feed from the front of your headset”.
Meta announced the Passthrough Camera API at Connect 2024 back in September, saying it would “enable all kinds of cutting-edge MR experiences”.
The Meta support page that emerged last week gives three examples of how Quest apps could leverage the passthrough view:
• Object recognition. Developers can create apps that recognise and use specific objects within your real environment. For example, a digital board game that incorporates physical game pieces or boards.
• Location recognition. Developers can create experiences that respond differently depending on where the camera feed shows you are located. For example, indoors or outdoors, at a famous landmark, or in a specific type of room.
• Other machine learning functionality. Developers are able to run custom machine learning models against data from the real-time camera feed. This could be used for retexturing/shading, games involving participants who are not wearing headsets, person/animal detection, or any number of custom industrial/training use-cases.
While headsets like Quest 3 and Apple Vision Pro use cameras to let you see the real world, today only the system software gets raw access to these cameras. Third-party developers can use passthrough as a background, sure, but they don’t actually get access to it. They instead get higher-level data derived by the system, such as hand and body skeletal coordinates, a 3D mesh of your environment with bounding boxes for furniture, and limited object tracking capabilities. That means they can’t run their own computer vision models, which severely limits the augmentation capabilities of these headsets.
The exception is that on visionOS 2, Apple is now giving enterprise companies raw access to Vision Pro’s passthrough cameras for non-public internal apps, but this requires a special licence from Apple and is restricted to “in a business setting only”.
Immersed now says mass production of Visor will begin “after summer”.
Who Is Immersed? What Is Visor?
Since 2020, Immersed has offered a free app, now available on Meta Quest, Apple Vision Pro, Vive Focus, and Pico, that shows your PC monitor in VR and lets you spawn additional fully virtual monitors, up to five in total if you pay. The Immersed app supports Windows, Mac, and Linux.
Visor is a new headset fully designed around this use case, a lightweight streamlined device rather than a generalized headset for gaming.
Like Apple Vision Pro, it has a tethered battery and is primarily intended to be used seated. The battery also contains the Wi-Fi and Bluetooth antenna.
Immersed says Visor features the XR2+ Gen 2 chipset, 4K micro-OLED displays, and color passthrough, as well as eye tracking and hand tracking for a Vision Pro-style gaze-and-pinch input system.
When it first announced Visor back in 2023, Immersed said the headset would ship some time in 2024. But after revealing the actual design and demoing barely functional units at the September 2024 event that we attended, Immersed’s founder Renji Bijoy admitted that general preorders wouldn’t ship until April at the earliest.
In a new blog post, Immersed told customers that a “handful” of Founder’s Edition units are already “out in the wild”, and that it expects to ship the rest “on the order of weeks, not quarters”.
Once the Founder’s Edition units have been fulfilled, Immersed says it will “shift our focus to mass production” of general preorders “in the following quarter(s)”, which it says means “after summer”.
The company ascribes this delay to “a few small but important refinements needed to achieve the level of performance we’re targeting—such as enhancing the buttons, headstrap, nose piece, and soft materials”.
Immersed has demonstrated that Visor is a working device with system software, not a mockup. But given its continually slipping shipping timeline, we continue to recommend against preordering Visor and suggest waiting for reviews of mass-produced units, however long that takes.
Quantum Threshold is an accessibility-focused VR roguelike shooter that turns your wheelchair into your greatest weapon.
Created by Finnish studio Vaki Games, Quantum Threshold promises a cyberpunk-themed narrative with wheelchair-based locomotion designed specifically for seated play. In a post-apocalyptic world overrun by the rogue AI “Techno Wraiths,” you play a surviving wheelchair user seeking an escape across “ever-evolving” levels.
Both traversal and combat use an upgradeable wheelchair, and you can unlock “game-breaking” weapons and modules inspired by Risk of Rain 2 and Left 4 Dead. Loot is randomized each run, enemies gradually evolve, and permanent upgrades are also obtainable.
“We wanted to prove that seated VR can deliver the same adrenaline rush—no compromises. Your wheelchair isn’t a limitation; it’s your greatest weapon,” said Teemu Jyrkinen, Creative Director at Vaki Games in a prepared statement.
Quantum Threshold arrives on May 22 on the Meta Quest platform and PC VR. The studio also confirmed that a limited-time closed beta sign-up will open at a later date for select platforms.
The latest update to Waltz of the Wizard on PS VR2 shows that Aldin Dynamics’ particular approach to “natural magic” works incredibly well on consoles.
While Quest owners have enjoyed hand tracking in Waltz of the Wizard for almost five years, the feature just launched for PS VR2 headsets, and Aldin’s game is the first to use it. My colleague David Heaney recently compared tracking quality between Quest 3 and PS VR2, noting subtle differences between the two.
I wanted to see what this latest version of the game felt like on PS VR2. Just how immersed can I feel without the PS VR2 controllers in my hands?
PS VR2’s Natural Magic
I decided to do a deep dive through as much of Waltz of the Wizard as I could reach on a livestream, using only my hands and voice to control the game. We’ve reported in the past on the studio’s development of “natural” magic: Aldin is leaning into the idea that when you’re in a VR headset, you’re actually a wizard with the ability to speak spells or move anywhere you want with a gesture.
Playing this again not only brought back memories of my experiences with the original legacy demo, but also showcased just how far the game has come since its debut.
Voice Input On PS VR2
Upon stepping back into that familiar wizard’s tower, I was immediately captivated by the sight of subtitles dynamically forming in midair as I spoke, albeit with a few amusing inaccuracies. This experience encouraged me to use my voice more frequently, and I was pleasantly surprised to see Skully react in real time to some of my questions.
As players progress deeper, they gain the ability to conjure objects from thin air simply by speaking them into existence. The hand tracking elevates the experience, making spellcasting through gestures feel powerful, and when combined with Aldin Dynamics’ established technology, the result is an astonishingly immersive and magical journey.
On one hand, this experience on PS VR2 enhanced immersion and now provides a similar experience to my time playing on Quest 3. On the other, I faced several issues that I had not seen on the Quest. During testing, my hands frequently froze mid-air, a frustrating glitch that disrupted gameplay. In those moments, I was forced to awkwardly block the headset cameras to recalibrate the hand tracking. This is an unwelcome workaround that I hope gets fixed on PS VR2.
Grabbing and holding objects introduced additional challenges. Precision is critical in a magical sandbox where spell and item manipulation is core to the experience, yet I experienced numerous missed grabs and premature disconnections while using the game’s telepath locomotion system. At times, movement would just halt altogether, breaking the enchanting immersion crafted by the developers. That said, as I continued to play, the locomotion system grew more intuitive, providing a seamless method of navigation in a controller-less hand-tracked environment.
Despite these momentary technical shortcomings, I emerged from my play session genuinely impressed with the update. Aldin Dynamics and Sony have made commendable strides with this first foray into hand tracking on PlayStation VR2. I still prefer the Quest version due to its more stable hand tracking and overall polish, but PS VR2 players now have a fully playable hand-tracked game with voice input, and it remains an enjoyable experience.
Tracking both players’ eyes and hands opens up exciting possibilities for the system going forward, and I’m eager to see which titles will adopt this technology next. As developers continue to refine their use of hand tracking, and platforms improve their implementations too, I believe this overall experience will significantly improve.
You can find Waltz of the Wizard on most VR gaming platforms.
One of Phasmophobia’s oldest maps received an overhaul today.
Now available on all platforms, co-op horror game Phasmophobia has introduced some notable changes to the Bleasdale Farmhouse in its latest update. This rework includes a new layout, new areas, and a “total visual overhaul” with a larger dining room than before. New fragile items have been added for ghosts to mess with, and you can also explore a new tearoom and trophy room.
Comparison video provided by Kinetic Games
You can find the full patch notes here, which also list a known issue currently being looked into for VR users. “Players on PlayStation 5 and PS VR2 may experience visual desync when opening doors in Bleasdale. Example: The door may visually remain closed upon opening,” states the developer.
Following last October’s launch on PlayStation VR2 and flatscreen consoles, Kinetic Games recently outlined its upcoming roadmap for 2025. Alongside a similar overhaul for the Grafton Farmhouse map, this includes plans for a ‘Chronicle’ that revamps how you record evidence, a player character overhaul, a brand-new small map, and seasonal events.
Meta’s Orion AR glasses prototype is expensive to make—like $10,000 per pair, expensive. Orion’s most pricey component is undoubtedly its custom silicon carbide waveguide lenses, although Meta says it sees a pathway to “significantly reduce the cost” of that key component in the future.
Silicon carbide has been around for a while, mostly used as a substrate for high-power chips owing to its better power efficiency and lower heat output. It is much more difficult to manufacture than silicon, though, with challenges stemming from its material properties, crystal growth process, and fabrication complexity.
Electric vehicles are leading the way in decreasing costs; however, the material is still far from price parity with its cheap and plentiful silicon-based equivalents. Another use case could be quantum computing, although that comes with its own unique challenges separate from what Meta hopes to do with the material.
It’s not silicon carbide’s better power efficiency and lower heat output that Meta is after, though. It’s the material’s high refractive index, which makes it ideal for clear, wide field-of-view (FOV) waveguides suitable for AR glasses, like the class-leading 70-degree FOV seen in Orion. And the difference between conventional multi-layered glass waveguides and Orion’s silicon carbide-based waveguides is, for the few that have tried it, night and day.
Image courtesy Meta
“Wearing the glasses with glass-based waveguides and multiple plates, it felt like you were in a disco,” says Optical Scientist Pasqual Rivera in a blog post. “There were rainbows everywhere, and it was so distracting—you weren’t even looking at the AR content. Then, you put on the glasses with silicon carbide waveguides, and it was like you were at the symphony listening to a quiet, classical piece. You could actually pay attention to the full experience of what we were building. It was a total game changer.”
Many of the world’s top electric vehicle manufacturers have adopted chips based on silicon carbide in recent years, which has helped drive the price down. Giuseppe Calafiore, Reality Labs’ AR Waveguides Tech Lead, notes “there’s an overcapacity [thanks to EVs] that didn’t exist when we were building Orion. So now, because supply is high and demand is low, the cost of the substrate has started to come down.”
Notably, silicon carbide wafers used in EVs aren’t optical-grade, as they prioritize electrical performance over optical clarity, so co-opting any EV chip surplus is out of the question. Still, Reality Labs’ Director of Research Science Barry Silverstein sees a path forward:
“Suppliers are very excited by the new opportunity of manufacturing optical-grade silicon carbide—after all, each waveguide lens represents a large amount of material relative to an electronic chip, and all of their existing capabilities apply to this new space. Filling your factory is essential, and scaling your factory is the dream. The size of the wafer matters, too: The bigger the wafer, the lower the cost—but the complexity of the process also goes up. That said, we’ve seen suppliers move from four-inch to eight-inch wafers, and some are working on precursors to 12-inch wafers, which would yield exponentially more pairs of AR glasses.”
Image courtesy Meta
“The world is awake now,” adds Silverstein. “We’ve successfully shown that silicon carbide can flex across electronics and photonics. It’s a material that could have future applications in quantum computing. And we’re seeing signs that it’s possible to significantly reduce the cost. There’s a lot of work left to be done, but the potential upside here is huge.”
This wouldn’t be the first time XR headsets have directly benefitted from larger, more consumer-oriented industries taking the lead. In the early 2010s, small, low-cost displays developed for smartphones were a key driver in kickstarting the consumer VR headset revolution. For example, if you’ve ever cracked open an Oculus Rift DK2, released in 2014, you’ll find a Galaxy Note 3 display panel at its core—Samsung branding and all.
That’s not to mention a host of other components lifted from the smartphone parts bin over the years, including inertial measurement units (IMUs), camera sensors, and battery technology. The parallels are there, although leveraging the silicon carbide gains spurred by the EV boom won’t be nearly as straightforward for AR glasses.
While suppliers are eyeballing optical-grade silicon carbide, it’s still a niche within a niche that will take years to scale up, and that is effectively one of the main reasons Meta can’t productize Orion today. That said, Meta is using Orion as an “internal developer kit” of sorts as it hopes to produce a pair of consumer AR glasses sometime before 2030, priced somewhere near “phone, laptop territory,” Meta CTO Andrew Bosworth revealed in September.
Still, with such massive potential for consumer appeal, these puzzle pieces will fit together somehow. Companies like Meta, Apple, Google, Microsoft, and Qualcomm all hope to own their own slices of the next dominant mobile computing platform, which aims to replace smartphones entirely.
Apple is planning a “feature-packed release” for visionOS 3.0, reports Bloomberg’s Mark Gurman, who says that Vision Pro’s operating system, and not new XR hardware, is going to be a focus at this year’s Worldwide Developers Conference (WWDC).
Despite its high price and premium appeal, Apple hasn’t slowed down software updates for Vision Pro, which launched in February 2024 for $3,500.
Now, Gurman reports Apple is gearing up to showcase visionOS 3.0 at WWDC, which typically takes place in June.
“All signs are pointing to the company’s Vision Products Group shifting its resources to other form factors,” Gurman maintains. “But Apple can’t just let the Vision Pro die out. It has invested too much and needs to keep churning out the device’s visionOS updates (the third edition will be a pretty feature-packed release, I’m told).”
There’s no indication yet what visionOS 3.0 could contain, although if it’s anything like visionOS 2.0, which was announced at WWDC 2024 last June, developers will likely be able to go hands-on as soon as it’s announced.
That said, information is still thin. One possible candidate for visionOS 3.0 could address the headset’s lack of motion controllers; Gurman reported last month Apple is currently working with Sony to adopt PSVR 2’s Sense Controllers as Vision Pro’s officially supported motion controller.
As for hardware reveals (or the lack thereof) at WWDC 2025, Gurman echoes previous claims made late last year by Apple supply chain analyst Ming-Chi Kuo, who reported that multiple Vision Pro follow-ups are currently planned.
Gurman notes that Apple is planning a headset containing a new M-series chip (possibly M5), as well as cheaper versions of the headset. Contrary to Kuo’s report, which maintains an upgraded M5 version of Vision Pro is coming this year, Gurman claims we won’t see a follow-up headset from Apple in 2025.
visionOS 3 will be a “feature-packed release”, Bloomberg’s Mark Gurman reports.
Gurman has a relatively strong track record of reliably reporting on Apple’s product plans years in advance, and successfully described many details of Apple Vision Pro’s hardware and software before it was officially revealed.
In the latest edition of his weekly newsletter, Gurman writes that visionOS 3, the third major version of the headset’s operating system, “will be a pretty feature-packed release, I’m told”.
Since launching Vision Pro and visionOS in February of last year, Apple has regularly updated the operating system to add new features and improvements.
The first feature update, Spatial Personas, arrived in visionOS 1.1 in April. It extended the system’s realistic avatars from a 2D rectangular container into true 3D space, a step change for telepresence technology.
In September, Apple released visionOS 2, the first major update.
visionOS 2 brought:
• Hand gestures for opening the main menu and control center
• The ability to turn any photo into a spatial photo
• Improved hand tracking and scene understanding
• WebXR enabled by default in Safari
• The ability to AirPlay your iPhone or iPad to a window
• A Bora Bora virtual environment
• The ability to see your physical keyboard, plus mouse support
• Guest user improvements
• Static 3D object tracking
• Train support for Travel Mode
• Live Captions
• New developer features, and raw camera access for enterprise
Then, visionOS 2.2 in December brought Wide and Ultrawide modes for Mac Virtual Display, letting you view your Mac on a virtual Wide aspect ratio screen, or even an enveloping panoramic Ultrawide screen. The Ultrawide mode has 10K horizontal resolution, as if you have two 5K monitors side by side, made possible thanks to foveation.
Further, with visionOS 2.2 the audio from your Mac is now routed to Vision Pro, whereas previously it still played through the Mac.
The next significant update for Apple Vision Pro will be visionOS 2.4, set to release in April. It will bring a Spatial Gallery app, an iPhone app for remote installs, and a new iPhone/iPad-driven guest flow.
visionOS 2 was announced at WWDC 2024, Apple’s yearly software and developer event, with the first beta released later the same day. As such, we expect Apple to announce visionOS 3, and launch the beta, at WWDC 2025, expected to take place in June, three months from now.
Cards & Tankards, the free-to-play social VR card game, confirmed a new expansion will launch later this month.
Developed by Divergent Realities, Cards & Tankards is a social collectible card game that’s available for flatscreen platforms and VR headsets. The studio announced that the upcoming ‘Ashes of Ur-Enku’ expansion will introduce 50 new cards, diamond animated variants, a wide range of new cosmetics, a refreshed season pass, balancing changes, and rank resets.
“Set in the untamed lands of Ur-Enku, this expansion follows the Plundering Guild as they embark on a daring fire-hunting expedition, a forbidden technique that uncovers ancient Dwarven archives hidden beneath decades of overgrown brush. These archives chronicle experiments with Aether, a powerful primordial force—and tragically, only one explorer will emerge alive,” explained the studio.
Launched nearly two years ago, Cards & Tankards has received a considerable range of post-launch updates. The most recent is December’s Version 2.4 with wide-ranging fixes, while Version 2.3 rebalanced over 50% of the game’s cards. Last March brought the ‘Crusade of Sun & Stone’ expansion, and the studio also introduced a ‘Ranked Mode’ back in late 2023.
Cards & Tankards is available now on the Meta Quest platform, Steam, and Google Play, while the Ashes of Ur-Enku expansion arrives on March 27.