This AI Tool Can Texture 3D Scenes Automatically

SceneTex generates textures accurately. 

By now, we've seen plenty of AI tools that can generate 3D models. However, empty meshes are not enough for a good-looking scene. So meet SceneTex – a method that generates high-quality textures for 3D indoor scenes based on text prompts. 

If you have an indoor scene already, you can describe the style you want to see, and SceneTex will put some textures on top. What makes it interesting is not only the time you can save on texturing but also how well the textures fit the models. 

As you can see from the comparisons in the video, similar tools tend to blend textures so that one object's texture stretches onto a neighboring object. According to the creators of SceneTex, their solution doesn't have this problem, or at least it is far less prominent than in competing apps.

So how does it work? First, the target mesh is projected to a viewpoint via a rasterizer, and an RGB image is rendered with the proposed multiresolution texture field module. Each rasterized UV coordinate is taken as input to sample UV embeddings from a multiresolution texture.
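To give a rough idea of what "sampling UV embeddings from a multiresolution texture" means, here is a minimal PyTorch sketch. It is not the authors' code: the class name, grid resolutions, and feature dimensions are all illustrative assumptions. The idea is to keep several learnable 2D feature grids at different resolutions and look each rasterized UV coordinate up in all of them.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiResTextureField(nn.Module):
    """Illustrative multiresolution texture field (hypothetical, simplified).

    Stores learnable 2D feature grids at several resolutions and samples
    them at rasterized UV coordinates, concatenating the per-pixel results
    into one UV embedding.
    """
    def __init__(self, resolutions=(32, 64, 128, 256), feat_dim=8):
        super().__init__()
        self.grids = nn.ParameterList([
            nn.Parameter(torch.randn(1, feat_dim, r, r) * 0.01)
            for r in resolutions
        ])

    def forward(self, uv):
        # uv: (B, H, W, 2) rasterized UV coordinates in [0, 1]
        grid = uv * 2.0 - 1.0  # grid_sample expects coordinates in [-1, 1]
        feats = [
            F.grid_sample(g.expand(uv.shape[0], -1, -1, -1), grid,
                          mode="bilinear", align_corners=True)
            for g in self.grids
        ]
        return torch.cat(feats, dim=1)  # (B, num_grids * feat_dim, H, W)

# Usage sketch: sample embeddings for one 512x512 viewpoint
field = MultiResTextureField()
uv = torch.rand(1, 512, 512, 2)
uv_embeddings = field(uv)  # (1, 32, 512, 512)
```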

After that, the UV embeddings are mapped to RGB values via a cross-attention texture decoder. Finally, a Variational Score Distillation (VSD) loss is computed on the latent features to update the texture field.
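The following sketch shows, in simplified form, what those last two steps could look like; it is an assumption-laden illustration, not the paper's implementation. The toy decoder lets per-pixel UV embeddings attend to a small set of learned texture tokens before projecting to RGB, and the VSD term is reduced to its common form: the texture is nudged to close the gap between a pretrained diffusion model's noise prediction and that of a fine-tuned copy. All names, shapes, and the weighting are hypothetical.

```python
import torch
import torch.nn as nn

class CrossAttnTextureDecoder(nn.Module):
    """Toy cross-attention decoder (illustrative only): per-pixel UV
    embeddings act as queries over learned texture tokens, and the
    attended features are projected to RGB."""
    def __init__(self, embed_dim=32, num_tokens=64):
        super().__init__()
        self.tokens = nn.Parameter(torch.randn(1, num_tokens, embed_dim))
        self.attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
        self.to_rgb = nn.Linear(embed_dim, 3)

    def forward(self, uv_emb):
        # uv_emb: (B, C, H, W) -> flatten pixels into a query sequence
        B, C, H, W = uv_emb.shape
        q = uv_emb.flatten(2).transpose(1, 2)   # (B, H*W, C)
        kv = self.tokens.expand(B, -1, -1)      # (B, num_tokens, C)
        out, _ = self.attn(q, kv, kv)
        rgb = torch.sigmoid(self.to_rgb(out))   # (B, H*W, 3)
        return rgb.transpose(1, 2).reshape(B, 3, H, W)

def vsd_loss(latents, noise_pred_pretrained, noise_pred_finetuned, weight):
    """Simplified VSD-style objective: its gradient w.r.t. `latents` equals
    weight * (pretrained prediction - fine-tuned prediction), which is then
    backpropagated into the texture field that rendered the latents."""
    grad = weight * (noise_pred_pretrained - noise_pred_finetuned)
    target = (latents - grad).detach()  # treat the shifted latents as a fixed target
    return 0.5 * ((latents - target) ** 2).sum()
```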

You can find more technical details in the paper here. The tool's code is also available, so you can potentially add its features to your own projects.

Image credit: ByteDance

Image credit: Jingxiang Sun et al.

Image credit: Yawar Siddiqui et al.

Also, join our 80 Level Talent platform and our Telegram channel, follow us on Instagram, Twitter, and LinkedIn, where we share breakdowns, the latest news, awesome artworks, and more.
