
The second half of Lost Records: Bloom & Rage released on April 15, reaching a stunning climax that will return to haunt Swann and her friends 27 years after the events that transpired during the summer of 1995.
A lone Microsoft employee is unofficially working on a native SteamVR driver to bring Windows MR headsets back to life.
In October, Microsoft started rolling out Windows 11 24H2, the latest major version of its PC operating system which removed support for Windows MR headsets. This means you can no longer use Acer, Asus, Dell, HP, Lenovo, or Samsung PC VR headsets, not even on Steam, since Windows MR had its own runtime and only supported SteamVR through a shim.
Now, software developer Matthieu Bucchianeri says he’s working on a native SteamVR driver for Windows MR headsets, which he calls “Oasis”. The driver would add direct SteamVR support, just like a Valve Index, HTC Vive, or Bigscreen Beyond.
Bucchianeri is a very experienced developer, having worked on the PS4 and original PlayStation VR at Sony, Falcon 9 and Dragon at SpaceX, and HoloLens and Windows MR at Microsoft, where he currently works on Xbox. At Microsoft he contributed to OpenXR, and in his spare time he developed OpenXR Toolkit and VDXR, Virtual Desktop’s OpenXR runtime. He was also an outspoken critic of Meta’s previous OpenXR strategy.
Bucchianeri says that his upcoming Oasis driver is the result of “deep reverse-engineering” alongside “a combination of luck and perseverance”. He claims that while his work isn’t breaching intellectual property laws, he won’t be releasing the source code to avoid accidentally breaching NDAs “and other obligations”.
Currently the driver is only confirmed to work with Nvidia GPUs, since AMD controls its VR direct mode more strictly, but Bucchianeri is in talks with AMD about this.
Bucchianeri plans to release his Oasis native SteamVR driver for Windows MR headsets in fall for free. If the project succeeds, it could bring a wave of ultra-affordable PC VR headsets, although deep discounts didn’t help these headsets gain widespread adoption the first time around.
Adobe is making a visionOS app “powered by Premiere” for natively editing Spatial Video on Apple Vision Pro.
Apple announced the app during WWDC25 and gave no further details, but the company did share a short clip of the upcoming app in action, showing it being used to add text in front of a horse.
Spatial Video is Apple’s term for stereoscopic 3D video using the Apple HEVC Stereo Video Profile format of MV-HEVC. Spatial videos can be captured by all iPhone 16 models, the iPhone 15 Pro models, Apple Vision Pro, and Canon’s EOS R7 camera with its upcoming spatial lens attachment.
Spatial videos can be viewed on Apple Vision Pro, Pico 4 Ultra, Meta Quest headsets via Meta’s phone app, or any VR headset if you convert the file to SBS 3D.
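The SBS conversion mentioned above is conceptually simple: decode the two eye views out of the MV-HEVC container (various tools handle that step), then pack them into a single double-width frame. A toy sketch of the packing step, with tiny 2x2 “frames” standing in for real decoded images (our illustration, not any particular tool’s implementation):

```python
def pack_side_by_side(left, right):
    """Pack two equal-sized eye views (2D lists of pixels) into one
    full-width side-by-side (SBS) frame: left eye on the left half,
    right eye on the right half of each row."""
    if len(left) != len(right) or any(
        len(a) != len(b) for a, b in zip(left, right)
    ):
        raise ValueError("eye views must have identical dimensions")
    return [row_l + row_r for row_l, row_r in zip(left, right)]

# Toy 2x2 'views' stand in for decoded video frames.
left = [["L"] * 2 for _ in range(2)]
right = [["R"] * 2 for _ in range(2)]
sbs = pack_side_by_side(left, right)
print(sbs[0])  # ['L', 'L', 'R', 'R']
```

A real pipeline would do this per frame on decoded image buffers and re-encode the result; VR video players then split each frame back into per-eye halves.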
There are already visionOS apps on the App Store for editing spatial videos, such as SpatialCut, so we’ll be curious to see how Adobe’s compares, and whether it integrates with Adobe’s Creative Cloud.
You can also edit spatial videos on a Mac, using Apple’s Final Cut Pro.
At WWDC25 Apple also announced visionOS 26, set to be the biggest Vision Pro software update yet, and you can read all about it in our article:
Will games like Assassin’s Creed Mirage & Resident Evil Village come to Vision Pro later this year?
Apple Vision Pro can theoretically run almost all iPad apps by default, unless developers opt out.
So far, developers of AAA games available on iPad have chosen to opt out, and this may be down to performance limitations. While Vision Pro uses the same M2 chip as iPads that can run these games, developers have reported that the added overhead of visionOS and all its features leaves the headset unable to handle demanding titles.
One issue developers have run into is the lower available memory of visionOS. But in visionOS 26 Apple is increasing the platform’s memory limit, and the company claims this enables “high-end iPad games” on Vision Pro.
One of the only currently available AAA titles is the NBA 2K series.
However, Apple did not announce any specific “high-end iPad games” newly coming to the platform. We’ll keep an eye out for any arriving later this year, when visionOS 26 actually arrives as a stable release.
For more about visionOS 26, read our article covering the many changes set to arrive in Apple’s biggest Vision Pro software update yet:
One of the biggest pieces of Vision Pro news from today’s WWDC is that the headset will be getting support for PSVR 2 motion controllers and a Logitech motion stylus with the release of visionOS 26. Now we’re learning more of the details, including the fact that developers will be able to publish Vision Pro apps that ‘require’ motion controllers, and those which make them ‘optional’. The change is a surprising shift compared to the hand-tracking-only approach that Apple launched the headset with.
In a recorded developer session released at WWDC 2025, Apple went into more detail about upcoming support for motion controllers on VisionOS 26.
One of the interesting things we learned is that developers will be able to publish Vision Pro apps with an ‘optional’ or ‘required’ motion controller designation. Developers will make this choice, and it will show up in the App Store so users know what to expect.
This of course opens the door to Vision Pro content which works exclusively with motion controllers. While it would be nice if hand-tracking was supported in every single app, the reality is that most existing VR games (ie: on platforms like Quest and PC VR) are built specifically for motion controllers, and would need to be significantly redesigned to support hand-tracking.
In VisionOS 26, Apple will allow developers to publish Vision Pro apps which ‘require’ motion controllers, meaning developers will be able to publish their existing VR content on Vision Pro with significantly less effort than if hand-tracking was required.
When building apps for motion controllers on Vision Pro, developers will be able to choose between two different tracking modes: Predicted and Continuous.
The Predicted mode will offer the lowest apparent latency by estimating where the controller will end up in future frames based on its present movement. This mode will likely be best for games with lots of player motion.
The Continuous mode will offer the highest precision by not estimating future position. This will prevent ‘overshoot’ when a user suddenly changes the direction of the motion controller. However, this will come at the cost of higher apparent latency. This mode will likely be best for apps that want optimal precision, like art and productivity.
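The trade-off between the two modes can be illustrated with a toy linear extrapolator (our own sketch, not Apple’s API): a predicted mode estimates where the controller will be one render latency ahead based on its recent velocity, while a continuous mode reports only measured poses, which are precise but slightly stale.

```python
def predict_position(p_prev, p_curr, dt_sample, latency):
    """Linearly extrapolate the next position from the last two samples.

    p_prev, p_curr: positions at the last two tracking samples
    dt_sample: time between samples (s); latency: render latency to hide (s)
    """
    velocity = (p_curr - p_prev) / dt_sample
    return p_curr + velocity * latency

# Controller moving at 1 m/s along one axis, sampled every 10 ms,
# with 20 ms of render latency to hide.
prev, curr = 0.00, 0.01
predicted = predict_position(prev, curr, dt_sample=0.01, latency=0.02)
print(round(predicted, 3))  # 0.03 -> roughly where the hand will actually be
# A continuous-style mode would report 0.01: precise, but 20 ms behind.
# If the hand reverses direction right now, the prediction overshoots
# toward 0.03 while the true position heads back - hence the precision cost.
```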
Despite talking about these two tracking modes, as far as we know, Apple has yet to share any specific info on how much latency developers can expect when using motion controllers on Vision Pro.
For more on VisionOS 26, check out our overview of major changes here.
The post Vision Pro Will Allow ‘Optional’ and ‘Required’ Designations for Apps Using Motion Controllers appeared first on Road to VR.
Pickle Pro, from Resolution Games, is the first announced Apple Vision Pro game to use the PS VR2 Sense controllers.
If you missed the news, earlier today Apple announced and released the first developer beta of visionOS 26, which includes support for the PlayStation VR2 Sense controllers.
Resolution Games is a veteran XR studio behind dozens of top titles across all major headsets. Apple contracted the studio to build Game Room for Vision Pro’s launch and the Gears & Goo tower defense game that released earlier this year, both on its Apple Arcade subscription service. Resolution has also ported its flagship cross-platform title Demeo to visionOS.
The studio’s next Vision Pro game is called Pickle Pro, and it’s the first visionOS title to announce support for PS VR2 Sense controllers.
Resolution says Pickle Pro is “the ultimate pickleball training and competition app”, with “hyper-real physics”.
The announcement comes just over three months after Resolution added pickleball to its cross-platform VR/MR game Racket Club, and the studio says Pickle Pro will use the same physics engine.
Further, it says Pickle Pro’s bots are trained on thousands of hours of gameplay data from Racket Club, using machine learning to make them “smarter, more realistic, and more challenging with every match”.
There’s no release date yet for Pickle Pro, but visionOS 26 is set to arrive later this year.
Soon, Apple Vision Pro owners will be playing games with officially supported tracked controllers from Sony in their hands.
The controllers, of course, will be sold separately, as Apple was sure to note in its WWDC 2025 press announcement. The feature is coming to visionOS 26, set to arrive roughly one and a half years after the headset launched.
Though still gated behind a $3,500 entry fee for the headset itself, plus the cost of the additional hardware, direct support for PlayStation VR2’s controllers opens up a relatively straightforward path for porting the work of numerous VR developers from the default input system on Quest, PC VR, and both generations of PlayStation VR. At least, it’s likely to be a more direct path than building for a gaze-and-pinch, hand-tracking-only interface.
Until now, most VR game developers have considered building for Apple’s platform a non-starter, given the combination of few players and the absence of the haptics and low-latency tracking that controllers like these provide. PlayStation VR2’s controllers feature resistive triggers and advanced haptics, available for the first time in standalone VR, though we haven’t yet confirmed whether Vision Pro will support those features. Many developers will still be put off by the limited player base on Apple headsets, of course. Some, though, will see this as a major indication that Apple is looking to create a real market for future VR and mixed reality games. Overall, direct integration of Sony’s tracked controllers starts to change the game for Apple and prospective creators.
I recently tested Ping Pong Club on Apple Vision Pro and found a game that works as well as it can with hand tracking. Unfortunately, the title falls apart the moment you try to get competitive, because current-generation hand tracking is still quite limited. Third-party manufacturers have rolled their own versions of the open source ALVR software for PC VR streaming, enabling players to use Surreal Touch controllers with PC VR games. This essentially offers a route to play a top-tier competitive game like ElevenVR streamed from a nearby PC.
A small number of highly skilled development groups like Resolution Games, Schell Games and Triband have been able to secure development agreements on Apple devices for exclusive titles featuring the company’s breakthrough gaze-and-pinch interface. In fact, Resolution just announced it would be making Pickle Pro for “training and competition” in pickleball on Apple Vision Pro with compatibility for the PlayStation VR2 controllers.
“Thanks to our shared physics engine and the deep data pool from our award-winning game Racket Club, Pickle Pro is shaping up to be the most authentic pickleball simulator and training app out there,” Resolution Games’ Tommy Palm said in a prepared statement. “All our bots are trained on real gameplay data—from beginners to pros—making them smarter, more realistic, and more challenging with every match. Without the thousands of hours players have put into Racket Club, the machine learning bots in Pickle Pro simply wouldn’t be possible. We’re very excited about the potential here.”
A Resolution representative declined to say whether other controller-dependent games in their library would be making the jump to Vision Pro.
Vision Pro owners have, meanwhile, used the brute force of open source to connect PC VR games to the headset. Many developers with limited time to spread across platforms, as well as cash-strapped potential buyers, are waiting on Valve’s entry into the standalone VR market, even as Meta forges ahead with big-budget home-grown Quest titles like Batman: Arkham Shadow and Deadpool VR. None of these trends change because Apple suddenly supports an official tracked controller solution, but it does start to change what could be on the horizon.
I reached out to one of the developers of ElevenVR to ask them about the new Apple feature and whether it changes how they approach multi-platform development.
“We’d love to be on every device possible but there is a huge consideration to effort and time and people it can reach – which is why we aren’t on every headset possible,” the developer explained over direct message. “If there is a hardware combo that can support the game we’d love to be on it if we have enough time to prioritize it and it makes sense to do so…as each additional platform takes away from others.”
The developer pointed out players have already requested a version of the game installable on phones so they can be used as a spectating camera. The dev said there’s no timeline for the release of a spectator camera for iPhone, and was 100 percent clear that there’s no guarantee they’ll be bringing their work to iPhone or Apple Vision Pro. The developer also noted it’s believed Sony has sold more PlayStation VR2 headsets, which come bundled with the controllers, than Apple has shipped Vision Pro devices, none of which include controllers in the box.
We’re reaching out to developers to see whether Apple’s latest feature changes any of their plans and we’d love to see some discussion in the comments below digging into the considerations at hand.
Overall, what does the VR market look like in about a year to a year and a half? Come Christmas 2026, what options do people have to buy headsets and what games will be available for them? And what will be the dominant input method on those devices?
It’ll take time to have an effect, but a lot of developers are about to realize the VR landscape has changed in a pretty dramatic way.
VisionOS 26 is Apple’s next major update to Vision Pro. Expected to be released to the public later this year, Apple says developers will be able to start testing the new version starting today. The update adds some major changes, including support for PSVR 2 motion controllers and a spatial stylus from Logitech.
Apple previewed VisionOS 26 today during WWDC 2025. The new version technically follows VisionOS 2, but adopts the company’s new version naming scheme, which appends the upcoming year to the software name rather than a version number; hence VisionOS 26, most of whose lifecycle will fall in 2026.
The new software update for Vision Pro brings some major changes and enhancements.
Since the day Vision Pro was announced (and Apple confirmed it would not ship with motion controllers), developers of existing XR content have been asking for motion controllers for more precise and immersive control of virtual content.
Now that wish is finally coming true. Rather than making its own motion controllers though, Apple is partnering with Sony to add support for PSVR 2 Sense controllers on Vision Pro in VisionOS 26.
Interestingly, Vision Pro is also getting official support for a new Logitech Muse motion stylus, which looks similar to the existing Logitech MX Ink (which is compatible with Quest).
Since day one, Vision Pro has supported gamepads like Xbox and PlayStation controllers for playing flatscreen content. Now VisionOS 26 is adding ‘breakthrough’ for gamepads, which means users will be able to see the controller in their hands even when their view is otherwise occluded by a fully virtual scene. This is similar to the way the headset allows both people and keyboards to ‘breakthrough’ the virtual view so users can talk to people in the room with them and type more easily.
VisionOS 26 adds support for widgets. Similar to widgets on other Apple devices, they are small applets that provide glanceable information. On Vision Pro, however, users will be able to anchor them around their home and they will purportedly always stay in place any time you put on the headset. This includes being able to mount them to walls.
Users could previously place app windows or spatial content around their home, which would stay in place. However, those windows would be rearranged any time the user recentered their headset. Widgets, on the other hand, should always stay where they’re placed.
From what we understand, widgets on VisionOS 26 will support both custom-made widgets for the platform (including panoramas and spatial photos), as well as existing widgets built for iOS and iPadOS.
Apple’s ‘Persona’ avatars are based on a scan of the user’s head which is taken from the headset itself. They are already the most lifelike real-time avatars available in a headset, but Apple is giving them an impressive visual upgrade in VisionOS 26.
The new Persona update will improve the appearance of skin, hair, and eyelashes, according to Apple, greatly reducing the ‘ghostly’ appearance of the current version of avatars.
In VisionOS 26, Apple is making it easy for developers to build co-located experiences. That means two Vision Pro users in the same physical space can share a synchronized experience where virtual content appears in the same physical location for both users.
With SharePlay and FaceTime, remote users can also join these experiences, allowing a combination of physically co-located and remote participants.
Spatial photos are getting… more spatial in VisionOS 26. The prior version allows users to turn their 2D photos into 3D photos (or what Apple calls ‘spatial photos’), but this adds stereoscopic depth, not parallax. That means if you lean left or right, the image just moves with you, instead of making it appear like you can look around objects in the scene.
In VisionOS 26, the spatial conversion will now attempt to add real volume to the photo, offering both stereo depth and parallax. Apple is calling this a ‘spatial scene’, to indicate the addition of parallax.
Given that the photo was originally captured in 2D, the conversion will have to make guesses about what the scene should look like from different angles. While there will certainly be limitations to those guesses, we’ve been really impressed with Apple’s first attempt at automatic conversion of 2D photos to spatial photos, so we’ll be interested to see how well the process works for spatial scenes.
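Conceptually, adding parallax means re-projecting pixels sideways by a disparity that is inversely proportional to their estimated depth, which opens ‘holes’ behind foreground objects; those holes are exactly what a generative conversion must invent. A toy one-row sketch of the re-projection step (our illustration, not Apple’s algorithm):

```python
import math

def reproject_row(colors, depths, shift):
    """Shift each pixel horizontally by shift/depth (nearer pixels move
    more). Returns the reprojected row plus the positions of the
    disoccluded 'holes' a generative model would need to fill in."""
    width = len(colors)
    out = [None] * width                 # None marks a hole
    out_depth = [math.inf] * width
    for x in range(width):
        nx = x + int(shift / depths[x])  # disparity is inverse to depth
        if 0 <= nx < width and depths[x] < out_depth[nx]:
            out[nx] = colors[x]          # nearest surface wins the pixel
            out_depth[nx] = depths[x]
    holes = [i for i, c in enumerate(out) if c is None]
    return out, holes

# A near object ('X', depth 1) in front of a far backdrop ('.', depth 4).
colors = [".", ".", "X", "X", ".", "."]
depths = [4.0, 4.0, 1.0, 1.0, 4.0, 4.0]
row, holes = reproject_row(colors, depths, shift=2)
print(row, holes)  # ['.', '.', None, None, 'X', 'X'] [2, 3]
```

The foreground object moved two pixels while the backdrop stayed put, leaving two holes where the object used to be; filling those plausibly is where the generative guessing comes in.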
Apple says that a new ‘Spatial Browsing’ mode is coming to Safari in VisionOS 26. This will purportedly work on any webpage that can normally work with Safari’s existing ‘Reader’ view.
In addition to converting the page’s text for easier reading, the Spatial Browsing view will turn images on the page into ‘spatial scenes’ (volumetric photos) as covered above. Apple hopes this will make web browsing feel more immersive.
Web developers will also be able to define an immersive background for their website, so Vision Pro users can have a relevant immersive scene that appears around them when browsing a specific webpage.
While Vision Pro has long supported spatial (3D) video playback, and Apple Immersive Video (Apple’s own wide field-of-view immersive video format), VisionOS 26 will support other common immersive video formats.
That includes 180°, 360°, and, as we understand it, arbitrary wide field-of-view video content, either monoscopic or stereoscopic.
Apple says this will mean wide field-of-view footage from popular action cams like GoPro and Insta360 will play seamlessly on Vision Pro.
Developers will also be able to play all supported formats in their apps, or stream them from the web.
Apple says that hand-tracking on VisionOS 26 will be able to run up to 90Hz which should result in a noticeable reduction in latency, compared to the 60Hz tracking of the current VisionOS 2.
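The arithmetic behind that reduction is straightforward: a fresh hand pose arrives every 1000/rate milliseconds, so the worst-case sampling delay drops by about a third.

```python
# Worst-case interval between fresh hand poses at each tracking rate.
for hz in (60, 90):
    interval_ms = 1000 / hz
    print(f"{hz} Hz -> new pose every {interval_ms:.1f} ms")
# 60 Hz -> new pose every 16.7 ms
# 90 Hz -> new pose every 11.1 ms
```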
There’s lots of smaller tweaks and improvements coming to VisionOS 26, but these are the biggest of the bunch.
VisionOS 26 is available starting today as a developer beta. Apple has not yet said whether the public beta and full release will come alongside the public betas and releases for its other platforms like iOS and macOS.
The post Apple Reveals Major Vision Pro Updates Coming in VisionOS 26, Developer Beta Available Starting Today appeared first on Road to VR.
visionOS 26 will also bring support for “macOS spatial rendering”.
According to Apple, this will let you “use the power of your Mac to render and stream immersive content directly to Vision Pro”.
While the company hasn’t yet said how this will work, it sounds like it will be the Apple equivalent of wireless PC VR, leveraging the power of your Mac for rendering.
The open-source tool ALVR already lets you use Apple Vision Pro as a SteamVR headset, but SteamVR doesn’t support macOS, having dropped support for the platform in 2020.
As well as supporting Apple’s own desktop operating system, macOS spatial rendering sounds like it will be deeply integrated into the company’s technology ecosystem.
Apple Vision Pro’s M2 chip is more powerful than what’s available in any other standalone headset, but that chip is only the base of Apple’s lineup, with the M2 Pro, M2 Max, and M2 Ultra offering progressively greater performance. And of course, the M2 series has since been superseded by M3 and M4, with Apple’s M3 Max offering around 3x the multithreaded CPU performance and more than 5x the GPU performance, and the M3 Ultra supporting up to 512GB of unified memory.
Thus, macOS spatial rendering could allow for rendering significantly higher fidelity virtual experiences with deeper simulation and interactivity, leveraging this extra power.
As well as its potential for enthusiast prosumer gaming, we expect enterprise companies will have particular interest in macOS spatial rendering, allowing them, for example, to render photorealistic 1:1 recreations of products and components they’re developing. Thus, macOS spatial rendering could be significant competition for companies like Varjo.
For more about visionOS 26, read our article covering the many changes set to arrive in Apple’s biggest Vision Pro software update yet.
At WWDC today, Apple announced the headlining features of visionOS 26, its next big OS release for Vision Pro. Among them is a revamped spatial photos feature that ought to make them even more immersive.
Vision Pro launched with the ability to view spatial photos, captured either with the headset itself or with iPhone 16, 15 Pro and Pro Max. These spatial photos created a sense of depth and dimensionality by combining stereo capture and applying depth mapping to the image.
Now, Apple says it’s applied a new generative AI algorithm to create “spatial scenes with multiple perspectives, letting users feel like they can lean in and look around,” essentially ‘guessing’ at details not actually captured on camera.
With visionOS 26, Vision Pro users will be able to view spatial scenes in the Photos app, Spatial Gallery app, and Safari. The company says developers will also be able to use the Spatial Scene API to add the feature into their apps.
To show off the new AI-assisted spatial photos feature, real-estate marketplace Zillow says it’s adopting the Spatial Scene API in the Zillow Immersive app for Vision Pro, which lets users see spatial images of homes and apartments.
Apple’s visionOS 26 is slated to arrive sometime later this year, although the company says testing is already underway.
The post Spatial Photos on Vision Pro Are Getting a Volumetric Upgrade for Greater Immersion appeared first on Road to VR.
Apple’s next big Vision Pro update, coming in visionOS 26 later this year, is adding the ability for developers to create co-located AR experiences so users can interact with the same app, 3D model, and more.
There probably aren’t a lot of friends and families out there that have one Vision Pro between them, let alone two, making the feature most likely something enterprise can get behind, as well as the most ‘pro’ of prosumers.
It’s not just a co-location update, though. Apple says the upcoming visionOS 26 feature will let users do everything from watching the latest blockbuster movie in 3D and playing a spatial game together to collaborating with coworkers, whether near or far.
As Apple is wont to do, it’s also integrating the feature with FaceTime, so you can add users from across the globe. FaceTime has had Personas since Vision Pro was released in early 2024, letting users interact using a lifelike avatar across all of the company’s ecosystem of devices.
To boot, Apple showed off the feature with Dassault Systèmes, the engineering and 3D design company, which is using its own 3DLive app to connect remote colleagues and let them visualize 3D designs in AR (seen above).
Apple’s biggest competitor Meta has allowed developers to create co-located apps for consumers since late 2024, when the company released its Horizon OS v71 update, which enabled local multiplayer setups through nearby headset discovery.
This story is breaking, as we’re still learning about the feature. We’re currently at WWDC, so check back for more on all things Vision Pro.
The post Vision Pro Update to Finally Let Developers Make Co-located AR Experiences appeared first on Road to VR.
Apple today announced at WWDC that Vision Pro is getting spatialized Widgets, coming along when visionOS 26 drops later this year.
On basically all of Apple’s devices, Widgets are designed to offer up personalized and useful info at a glance.
Now Apple says Vision Pro is getting spatial Widgets too, which will let you place a variety of these mini-apps around your house, where they reappear every time you put on Vision Pro.
Apple says Widgets in visionOS 26 are “customizable, with a variety of options for frame width, color, and depth. Beautiful new widgets — including Clock, Weather, Music, and Photos — all offer unique interactions and experiences.”
Essentially, you’ll be able to decorate your space with things like spatial photos, clocks with distinctive face designs, a calendar with your events, and quick access to music playlists and songs so you can, say, keep your favorite track in a specific part of your room.
Notably, Apple says developers will be able to create their own widgets using WidgetKit. There’s no word on exactly when visionOS 26 releases, although the company says we can expect it sometime later this year.
This story is breaking. We’re currently at WWDC today, and will report back when we learn more about all things Vision Pro.
The post Vision Pro’s Next Big Update Will Add Anchored Widgets That Live Around Your House appeared first on Road to VR.
Apple just announced visionOS 26 for Apple Vision Pro at WWDC25.
visionOS 26 will bring PlayStation VR2 Sense controller & Logitech Muse stylus support, much more realistic Personas, spatial Widgets, volumetric Spatial Scenes, local SharePlay, and much more.
While you might have expected the next version of Vision Pro’s operating system to be visionOS 3, Apple has switched to a unified year-based naming system for all of its operating systems. While visionOS 26 will launch later this year, most of its lifecycle will span through 2026.
Here’s everything coming in visionOS 26:
As rumored for months now, visionOS 26 will add native support for the PlayStation VR2 Sense controllers, which Sony will sell separately from the headset.
Apple says this will bring “a new class of games” to Vision Pro.
PS VR2 controller support will include 6DoF positional tracking, capacitive finger touch detection, and “vibration support”. It’s unclear whether precision haptics or the unique resistive triggers of the PS VR2 Sense controllers will be supported.
In visionOS 26, Apple says Personas, the platform’s face-tracked realistic avatars, have been “transformed to feel more natural and familiar” thanks to “industry-leading volumetric rendering and machine learning technology”.
Apple claims these new more realistic Personas have “striking expressivity and sharpness, offering a full side profile view, and remarkably accurate hair, lashes, and complexion”. The company has also expanded the eyewear options for your Persona to include over 1000 variations of glasses.
Apple says the new Personas are still generated in a matter of seconds via holding the headset up to let it scan your face.
visionOS 26 will bring persistent Widgets to the platform. You position these widgets in your physical space, and they reappear every time you put on the headset.
Apple says visionOS Widgets are customizable with options for “frame width, color, and depth”.
Built-in widgets will include Clock, Weather, Music, and Photos, and developers will be able to build their own using WidgetKit.
Since the launch of Apple Vision Pro the headset has been able to capture and display 3D photos, which Apple calls Spatial Photos, and visionOS 2 added the ability to convert any 2D image into a Spatial Photo using machine learning.
visionOS 26 is set to go much further. It introduces Spatial Scenes, which leverage “a new generative AI algorithm and computational depth to create spatial scenes with multiple perspectives, letting users feel like they can lean in and look around”.
Spatial Scenes can be viewed in Photos app, Spatial Gallery app, and Safari, and developers will be able to add them to their visionOS apps using a new Spatial Scene API.
Currently, it’s not possible to build visionOS apps that let multiple Vision Pro headsets automatically see the same objects and interfaces in the same locations in the same physical space, a feature known as colocation on Quest and Pico.
Dassault Systèmes 3DLive app.
visionOS 26 will bring this capability to Vision Pro, leveraging the existing SharePlay technology and APIs that today let Vision Pro owners share experiences remotely.
visionOS 26 adds support for the upcoming Logitech Muse accessory, a spatial stylus for Apple Vision Pro.
Apple says Logitech Muse “enables precise input and new ways to interact with collaboration apps like Spatial Analogue”.
It looks to be very similar to the existing Logitech MX Ink spatial stylus for Meta’s Quest headsets.
visionOS 26 is set to bring significant upgrades to Safari on Vision Pro.
The new Safari will allow web developers to embed 3D models in web pages, and the user can “manipulate” these 3D models directly in the page.
As a user, you’ll also be able to “transform articles on Safari, hide distractions, and reveal spatial scenes that come alive as they scroll”.
visionOS 26 will add native support for traditional 2D 180° and 360° video, not just Apple’s own 3D Apple Immersive Video format.
Apple says this allows you to easily watch content captured on affordable 180 and 360 cameras from companies like Insta360, GoPro, and Canon.
This news is breaking, and this article will be updated as Apple reveals more details of visionOS 26.
Apple ‘Personas’ on Vision Pro are already the most lifelike real-time avatars you can find on any headset today, but in the next version of visionOS, they’re taking another step forward.
Apple today announced that its Persona avatars for Vision Pro will get a major visual upgrade with the launch of visionOS 26, due out later this year.
Personas on Vision Pro are generated on-device after users take a short scan of their face using the headset. Once generated, the avatar is used for social experiences like FaceTime.
They’re the most lifelike real-time avatars available on any headset today. Although they impressively capture subtle motion from the user, they have always felt somewhat blurry or ghostly.
VisionOS 26 promises a big visual update that will greatly reduce that ghostly look, and present a more complete view of the user’s head, including a “full side profile view.” Apple is also promising more realistic hair and lashes, and more than 1,000 variations of glasses, so glasses-wearers can find something that looks just right.
Although visionOS 26 will be available as a developer beta starting today, it isn’t yet clear if the Personas upgrade will be available in the first version, or roll out in later versions of the beta.
Beyond the visual upgrade to Personas, visionOS 26 will also make improvements to how social experiences work on the headset. New developer tools will allow for the creation of co-located virtual experiences; meaning two headset users in the same physical space will be able to see a shared virtual experience that’s visually anchored in the same space for both. That same system will allow for remote participants to join as Persona avatars, making for a mixture of in-person headset users and remote participants in the same virtual experience.
The post Vision Pro is Getting a Major Visual Upgrade to Its ‘Persona’ Avatars appeared first on Road to VR.