Meta’s WorldGen AI system generates trimesh 3D worlds from text prompts, though the company doesn’t think it’s ready for Horizon Worlds yet.
Meta first teased that its Horizon Worlds creation tools would get the ability to AI-generate entire 3D worlds back in May, when announcing the related AssetGen 2.0 model. Then, in June, the company revealed that this feature would be called Environment Generation, teased example generations, and said it would launch “very soon”.
Environment Generation launched in August, but it was (and remains) only capable of generating a very specific kind of island, a very limited scope compared to the goal of generic world creation.
What Is Horizon Worlds Desktop Editor?
Horizon Worlds Desktop Editor is a flatscreen Windows PC application Meta released in early access in February, alongside deprecating the in-VR creation tools of Horizon Worlds.
The editor offers the ability to import 3D assets, images, and sound files, place them in a 3D landscape, and implement game logic and other functionality using TypeScript, a popular offshoot of JavaScript. These worlds are then immediately playable and multiplayer-capable in Horizon Worlds.
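To give a sense of what that scripting looks like, here's a minimal sketch of a Horizon Worlds TypeScript component. The module and event names follow Meta's published scripting documentation as best I recall them, so treat the specifics as approximate rather than a copy-paste reference.

```typescript
// Rough sketch of a Horizon Worlds script component (API names approximate).
import * as hz from 'horizon/core';

class WelcomeSign extends hz.Component<typeof WelcomeSign> {
  // Properties exposed in the Desktop Editor's property panel.
  static propsDefinition = {
    message: { type: hz.PropTypes.String, default: 'Welcome!' },
  };

  start() {
    // React when a player walks into the trigger this script is attached to.
    this.connectCodeBlockEvent(
      this.entity,
      hz.CodeBlockEvents.OnPlayerEnterTrigger,
      (player: hz.Player) => {
        console.log(`${player.name.get()} entered: ${this.props.message}`);
      }
    );
  }
}

hz.Component.register(WelcomeSign);
```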
In the US, UK, Canada, EU, Australia, and New Zealand, creators can also AI-generate 3D meshes, textures, skyboxes, sound effects, ambient audio, and TypeScript.
You can download Horizon Worlds Desktop Editor from Meta’s website.
At Connect 2025 in September, Meta teased an overhaul of its Horizon Worlds creation tools, called Horizon Studio, which hasn’t yet launched. The tease depicted an AI Assistant capable of generating just about anything a creator wants, including entire worlds, specific assets, custom NPCs, and specific gameplay mechanics, in a matter of seconds or minutes. But it’s unclear whether what Meta was showing was notional or representative of real technology it was waiting to deploy.
That brings us to WorldGen, the new AI system Meta has now published a paper on.
Meta describes it as “a state-of-the-art end-to-end system for generating interactive and navigable 3D worlds from a single text prompt”, leveraging a chain of 2D and 3D techniques, rather than being a single model.
“WorldGen is built on a combination of procedural reasoning, diffusion-based 3D generation, and object-aware scene decomposition. The result is geometrically consistent, visually rich, and render-efficient 3D worlds for gaming, simulation, and immersive social environments.”
To be clear, this is not producing a Gaussian splat like World Labs’ Marble, nor an interactive video stream like Google DeepMind’s Genie 3.
Meta’s WorldGen instead creates a layout of conventional trimesh 3D assets, making its output fully compatible with traditional game engines and rendering pipelines. It also includes a navmesh for collision detection and NPC traversal.
Here’s the underlying sequence WorldGen goes through after you input a prompt, according to Meta:
(2) Reconstruction
1. Image-to-3D base model
2. Navmesh-based scene generation
3. Initial scene texture generation

(3) Decomposition
1. Part extraction with accelerated AutoPartGen for scenes
2. Data curation for scene decomposition

(4) Refinement
1. Image enhancement
2. Mesh refinement model
3. Texturing model
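To make the hand-off between those stages a little more concrete, here's a purely illustrative TypeScript sketch of how a pipeline like this chains together. The types and function names below are placeholders of my own, not anything from Meta's paper, and the assumption that the prompt is first turned into a reference image comes from the "image-to-3D" wording in stage (2).

```typescript
// Illustrative data flow only – placeholder types, not Meta's WorldGen code.
type TriMesh = { vertices: number[][]; triangles: number[][] };
type NavMesh = TriMesh;                         // walkable-surface mesh for NPCs and collision
type TexturedMesh = { mesh: TriMesh; texture: Uint8Array };

interface BaseScene { mesh: TriMesh; navmesh: NavMesh; roughTexture: Uint8Array }
interface FinalScene { parts: TexturedMesh[]; navmesh: NavMesh }

// Hypothetical stage functions mirroring the list above.
declare function promptToReferenceImage(prompt: string): Promise<Uint8Array>;
declare function reconstruct(image: Uint8Array): Promise<BaseScene>;              // (2) Reconstruction
declare function decompose(scene: BaseScene): Promise<TriMesh[]>;                 // (3) Decomposition
declare function refine(parts: TriMesh[], navmesh: NavMesh): Promise<FinalScene>; // (4) Refinement

async function generateWorld(prompt: string): Promise<FinalScene> {
  const referenceImage = await promptToReferenceImage(prompt); // from the single text prompt
  const base = await reconstruct(referenceImage);
  const parts = await decompose(base);
  return refine(parts, base.navmesh);
}
```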
So why isn’t WorldGen rolling out in Horizon Worlds Desktop Editor, or at least being announced as a launch feature for Horizon Studio?
Meta says it’s not satisfied with the fact that WorldGen currently only produces 50×50 meter spaces, and that it takes a long time to do so. The company says it’s working to address both limitations.
It seems like a greatly upgraded future version of WorldGen will be necessary to deliver on the promise of Horizon Studio that Meta teased at Connect, and given the rate of advancement in AI, it’s very possible that the company will be able to achieve exactly that sometime in 2026.
AI can bring real-world objects into VR as 3D assets in seconds, with Meta’s new SAM 3D Objects model setting a new standard for quality.
It has been possible for years now to generate a 3D model of a real-world object by capturing dozens of images of it from surrounding angles, leveraging traditional photogrammetry techniques. Epic’s RealityScan, for example, takes around 15–45 minutes of cloud processing, while Apple offers an on-device Object Capture API for iPhone Pro models that takes around 5 minutes.
But over the past year or so, advanced AI models have emerged that can produce 3D assets from a single image in a matter of seconds. And while they don’t offer the same quality as photogrammetry, that quality has steadily improved with each new model release, mirroring the overall rapid advancement of AI.
EchoTheReality on SideQuest, which uses an old AI model from 2024.
For an example of how this applies to VR, Takahiro “Poly” Horikawa published a Quest app on SideQuest earlier this year that uses hand tracking to let you frame a specific real-world object and take a photo of it, leveraging Meta’s passthrough camera API. This image is then provided to Stability AI’s Stable Fast 3D API, based on the TripoSR model, and the result is spawned as a virtual object beside the image capture spot.
TripoSR is now 18 months old, though. And a few days ago, Meta launched SAM 3D Objects, the new state-of-the-art model for generating 3D assets from a single image.
Meta SAM 3D Objects
You can test out SAM 3D Objects for free in your web browser on the Meta AI Demos page. Just provide it with an image and you’ll be able to select which object you want to convert to a 3D model. Seconds later, you’ll see a 3D view where you can pan around the object with your mouse or finger.
Meta’s site isn’t designed for mobile screens, so you’ll probably want to use a PC, laptop, tablet, or VR headset. Also note that the model is only designed for inanimate objects, not people or animals.
This free public demo does not let you download the 3D model. But SAM 3D Objects is open source, available on GitHub and Hugging Face. That means developers should be able to host it on a cloud computing platform that offers GPUs, and use it to provide the experience of that EchoTheReality demo but with higher quality output.
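As a rough sketch of what that could look like in practice, here's how a web or Quest app might post a captured image to a self-hosted SAM 3D Objects service and get a mesh back. The endpoint URL and response format are assumptions of mine, since Meta ships the model as code and weights rather than a hosted API.

```typescript
// Sketch only: assumes you've wrapped SAM 3D Objects in your own HTTP service
// on a GPU cloud instance; the endpoint and response shape here are made up.
async function imageToMesh(imageBlob: Blob): Promise<ArrayBuffer> {
  const form = new FormData();
  form.append('image', imageBlob, 'capture.jpg');

  const response = await fetch('https://your-gpu-host.example.com/sam3d/objects', {
    method: 'POST',
    body: form,
  });
  if (!response.ok) {
    throw new Error(`3D generation failed: ${response.status}`);
  }
  // Assume the service returns a binary glTF (.glb) ready to load into an engine.
  return response.arrayBuffer();
}
```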
Social VR platforms, for example, could let you conduct show-and-tell for objects in your real room in a matter of seconds. Or decorate your home space with items you crafted in the real world. Meta has no announced plan to add this to Horizon Worlds, but it would seem like a natural future step, complementing the Hyperscape worlds it just launched.
Marble, an AI model from World Labs, can turn a single image into a volumetric scene that you can view in WebXR in a matter of minutes.
World Labs was founded by Fei-Fei Li, one of the pioneers of modern AI, best known for creating the ImageNet dataset that helped enable the rapid advancement of computer vision over the past 15 years.
As with almost all of the remarkable advancements in 3D reconstruction over the past few years, Marble generates Gaussian splats, fitting thousands of semitransparent colored blobs (Gaussians) in 3D space, so that arbitrary viewpoints can be rendered realistically in real-time. And both its variety of supported input types and the speed of its output are, to date, unprecedented.
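For the technically curious, the way those blobs become pixels is the standard front-to-back alpha compositing used by Gaussian splatting renderers in general, as introduced in the original 3D Gaussian Splatting work rather than anything Marble-specific: the Gaussians covering a pixel are sorted by depth and blended as

```latex
C = \sum_{i=1}^{N} c_i \, \alpha_i \prod_{j=1}^{i-1} \left(1 - \alpha_j\right)
```

where c_i is each Gaussian's color and α_i its opacity contribution at that pixel after projection into 2D.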
While other splat generation systems like Meta’s Horizon Hyperscape and Varjo Teleport require hundreds of input frames and hours of processing, in its simplest mode Marble can generate splats from a single input image or text prompt in a matter of minutes.
For more advanced outputs, if you pay for the $20/month subscription, Marble can take multiple images, a short video, or even a 3D structure as input, the last of these via a tool World Labs calls Chisel.
Chisel lets you lay out a scene with crude 3D shapes, as you would in a game editor, and then use a text prompt to turn it into a detailed volumetric scene.
With the subscription, Marble outputs support interactive editing, expanding, and the ability to combine multiple worlds together. And you can export as a high-quality traditional 3D mesh, though this takes multiple hours of conversion time.
Because of the unique capability set of Marble, World Labs describes it as a “first-in-class generative multimodal world model”.
On the Marble web app you can generate your own scenes for free, and view the output in VR via WebXR using the web browser of your headset.
Testing Marble with a single image of the Steam Dev Days 2014 VR room.
Trying out Marble on Quest 3 and Apple Vision Pro by turning a single image of the Steam Dev Days 2014 VR room into a volumetric scene, I found the quality to be noticeably inferior to Meta’s Hyperscape worlds and Varjo Teleport, more akin to Niantic Scaniverse. While the areas drawn directly from your input image are relatively detailed, the further you move away from them, the more of the typical Gaussian splat visual artifacts you’ll see.
And of course, the elephant in the room is that details beyond the image frame are hallucinated, so they will be very different from what was actually there behind the camera unless you provide multiple input images.
Still, all this aside, the ability to generate volumetric scenes in minutes from a single image or sentence is remarkable, and that you can then edit them with a combination of an editor UI and natural language even more so.
Further, the ability to then export these scenes as traditional 3D worlds, with geometric steerability via Chisel, seems like it could have huge potential for VR developers to build environments for their interactive apps and games.
You can try out Marble at marble.worldlabs.ai. Note that if you don’t pay, any scenes you create will be publicly listed. You’ll need the $20/month subscription to create a private scene, alongside unlocking the advanced creation, editing, and export features.
Demeo x Dungeons & Dragons: Battlemarked is a mostly natural crossover and a fitting evolution for the VR tabletop RPG. Read on for our full review.
Resolution Games created something special with the original Demeo, offering a compelling social VR experience with the turn-based dungeon crawler. Though it gradually evolved through post-launch updates, the initial release was rather bare, and it’s a testament to the concept’s replayability that I’d keep coming back for more. Four years later, Demeo x D&D takes its potential even further.
The Facts
What is it?: An official crossover between Demeo and Dungeons & Dragons that supports up to four players with cross-platform multiplayer.
Platforms: Quest, PC VR, PS VR2 (Reviewed on Quest 3)
Release Date: Out now
Developer/Publisher: Resolution Games
Price: $29.99
Demeo x D&D delivers that same moreish strategy with a more refined package, boasting two sizable campaigns that took my party roughly six hours each to beat. There’s considerably more here than what the original Demeo offered at launch, so you’ll be busy for a fair while. Using Wizards of the Coast’s famous Forgotten Realms setting across Neverwinter and Icewind Dale is an undeniably great fit.
For the unfamiliar, Demeo emulates the tabletop experience by giving you figurines for each character that you can physically move across tile-based maps; hand-tracking controls remain supported on Quest, though controllers offer better precision. Co-location is also pleasingly available on Meta’s headsets, letting your whole party sit around the same digital board together.
This time, Resolution’s swapped the basement setting for a more modern second-floor room. You’ve got that same freedom to change your board positioning with minimal fuss, while artwork for the first game’s campaigns gives your background environment some nostalgic decoration. Getting up close with each map shows crisp visuals on Quest 3, bringing the digital tabletop fantasy to life well.
Screenshot taken by UploadVR on Quest 3
Like before, movement and skills work well with this turn-based strategy card battler. You have two action points per turn, and some ability cards, like healing potions and poison antidotes, can be used freely as long as you’ve got points remaining, since turns automatically end when you run out. Movement, attacks, and more powerful abilities each require one point, forcing you to consider every move carefully. Watching your carefully planned strategies pay off feels quite rewarding, though you still need to roll the dice to land a hit. Crits and fails haven’t gone anywhere.
Six character classes are currently available with unique moves, offering familiar choices between a fighter, paladin, sorcerer, rogue, ranger, and bard. It’s well balanced, as each class comes with its own strengths and drawbacks; setting off fireballs as the sorcerer never gets old with crowd control, nor does the paladin smiting his foes into oblivion in delightfully over-the-top fashion.
They could benefit from a greater range of voice lines, though; there are only so many times a bard can use the same vicious mockery insults before they get stale, even with decent voice acting. I’d like to see some wider options for character creation, too. Much of the joy in D&D comes from creating your own heroes, but right now, you’re stuck with limited cosmetic adjustments using the existing base character for each class.
Comfort
Demeo x Dungeons & Dragons: Battlemarked uses a third-person, tabletop perspective, making this a comfortable game to play for newcomers or anyone susceptible to motion sickness. As such, many common comfort options aren’t here because they aren’t necessary.
Moving across the board is done by hitting one of your controller’s triggers and pulling yourself to a location, while rotating the board requires doing this with both hands. Hand tracking support is available exclusively on Quest, though I found controllers to be more precise throughout. Steam and PS5 also have optional flatscreen modes.
Out of the few options here, a vignette can be activated while moving. Quest also supports mixed reality, letting you play off a digital board while viewing your real-world surroundings.
Battles remain challenging, though usually not overwhelmingly so, as they could sometimes feel before; enemies don’t spawn nearly as often in Demeo x D&D. You often need to find the way out or defeat the boss, collecting gold along the way to buy new cards from the local bazaar or after individual stages in longer dungeons. My co-op partner and our two hirelings – we each controlled one of these extra characters – only really struggled as we reached the first campaign’s end.
It’s worth clearing out all the enemies and completing side quests, as you’ll gradually earn XP that unlocks new abilities; each time you level up, you pick one of three options from three separate categories. Those categories are based on your chosen class and the primary/secondary abilities you hold proficiency in, such as strength or constitution. This delivers useful upgrades like extra hit points, reduced damage from specific attacks, healing when you kill enemies, and more.
Screenshot taken by UploadVR on Quest 3
You can also reuse previous characters, giving some nice continuity to these otherwise standalone campaigns. What’s slightly annoying is that hirelings don’t level up with you, which leaves you disadvantaged in a campaign’s later stages if you’re playing solo or without a full team of four. Part of your party stays stuck at level one, and that gradually feels more unbalanced as you progress, so I’d love to see Resolution address this in a future update. Days after launch, I’m also encountering connection issues that keep interrupting games even after the hotfix. They’re infrequent enough not to be a major issue, though no less annoying when they do happen. At least you can jump back into a session easily enough.
This is roughly the extent of D&D’s gameplay influence here, since ability checks are mostly limited to one-off actions that only have a marginal impact. In battles, that usually means avoiding obstacles or traps, while outside of combat there are a few choices with NPCs – usually tied to side quests – on how best to deal with enemies. You can’t choose a specific character to handle checks either, meaning you’re stuck using the party leader or whoever activated an event. That’s perfectly fine for traps, but in story situations, continuously failing rolls can get frustrating when another party member is proficient in the required check.
How Does It Compare On Steam & PS VR2?
For the majority of this review, my co-op partners hosted a game in flatscreen mode on Steam while I joined via Quest 3 natively. However, I’ve dived in a couple of times on both PS VR2 and SteamVR as well, connecting to the latter with my Quest 3.
Minus the Quest-specific features (mixed reality, hand tracking, and co-location support), I can’t say I noticed much difference between playing on PS VR2 and Steam beyond a perceived resolution increase. Everything works well, and for PC VR I encountered no issues with either Virtual Desktop or Steam Link via Quest 3.
For reference, my desktop uses an Intel Core i9-12900 16-core processor (up to 5.1GHz), 32GB of Corsair Vengeance DDR5-5200 RAM, and a 16GB Nvidia GeForce RTX 4070 Ti Super. You can find the minimum and recommended specs on the Steam page to learn more.
Battlemarked largely sticks to the original Demeo’s established mechanics with the appropriate Dungeons & Dragons set dressing, which feels fitting enough and builds upon the original game well. But there are moments where I believe Resolution could take slightly better advantage of what such a crossover can provide.
I’m not expecting Baldur’s Gate 3 levels of branching narrative, but Dungeons & Dragons is all about theater of the mind. A good DM won’t just let anything fly; an even better one will give you choices while subtly guiding you on a certain path. Choosing a DM-less system is understandable given the base game it’s working from, though I’d love to see more meaningful story choices beyond some side quests. What’s here is a deliberately simplified take on Wizards of the Coast’s tabletop hit, though I’m still having a great time with friends.
Screenshot taken by UploadVR on Quest 3
Demeo x D&D is a great way to introduce newcomers to the Forgotten Realms, and it’s highly enjoyable for veteran players of both games too. Returning to these iconic locations in a new way continues to intrigue me, scratching an itch I’ve had since leaving my regular Dungeons & Dragons campaign two years ago. Progress saves as you advance, and reaching each chapter’s end ultimately feels worth it for that sense of accomplishment.
I can only hope it’ll be a similar story when Resolution Games begins releasing additional campaigns via future DLC. Given the lengthier nature of Embers of Chaos and Crown of Frost, I’m hopeful for what comes next. Unlocking lengthier missions as one-shot dungeons upon completing them is also a welcome touch for those of us after something a little briefer.
Demeo x Dungeons & Dragons: Battlemarked – Final Verdict
Demeo x Dungeons & Dragons: Battlemarked is a fitting evolution that’s both newcomer-friendly and expands upon the original Demeo well. This crossover packs more expansive campaigns, better difficulty balancing with enemy spawns, lovely visuals, and a greater story focus that better complements these gameplay systems.
I do wish this offered a little more gameplay freedom to better fit D&D. Further narrative freedom would leave your decisions feeling more impactful, leveling up hirelings would help solo players, and I’d love a more expansive custom character creator. Still, Demeo x D&D gets a strong recommendation from me and if you enjoyed Resolution’s older hit, you’ll feel right at home here.
UploadVR uses a 5-Star rating system for our game reviews – you can read a breakdown of each star rating in our review guidelines.