Texture map

Introduction

An ongoing project that turns multi-view image data into a continuous rendering model.

Implemented on the HUMBI dataset with Prof. Hyun Soo Park at UMN; the project is not yet complete.

Method

  1. Generate an SMPL mesh reconstruction (a realistic 3D model of the human body) from multi-view images

[Figure: tex1]
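
A minimal sketch of how an SMPL mesh can be instantiated once pose and shape parameters have been fitted, assuming the smplx library is used (the model path is a placeholder and the multi-view fitting itself is not shown):

```python
import torch
import smplx

# Load a neutral SMPL body model (path to the downloaded model files is a placeholder).
model = smplx.create("models/smpl", model_type="smpl", gender="neutral")

# Shape (betas) and pose parameters would come from fitting to the multi-view images;
# zeros give the mean body shape in the rest pose.
betas = torch.zeros(1, 10)         # shape coefficients
body_pose = torch.zeros(1, 69)     # 23 body joints x 3 axis-angle parameters
global_orient = torch.zeros(1, 3)  # root orientation

output = model(betas=betas, body_pose=body_pose, global_orient=global_orient)
vertices = output.vertices.detach().numpy()[0]  # (6890, 3) mesh vertices
faces = model.faces                             # (13776, 3) triangle indices
print(vertices.shape, faces.shape)
```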

  2. Generate IUV maps

    [Figure: tex2]

    x, y: the spatial coordinates of collected points on the image

    I: the patch index that indicates which of the 24 surface patches the point is on

    UV: coordinates in the UV space. Each surface patch has a separate 2D parameterization

    I tried two approaches:

  • Mapping:

    Use the UV map provided by DensePose to get UV coordinates for each vertex on the SMPL model, then project the UV coordinates onto each view using the camera parameters. However, this requires distinguishing vertices on the front side from vertices on the back side to get a valid projection (see the projection sketch after this list).

    [Figure: tex3]

  • Use DensePose to generate an IUV map for each view directly from the multi-view images.

[Figure: tex4]
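
A rough sketch of the first (mapping) approach, assuming standard pinhole camera parameters K, R, t per view and per-vertex UV coordinates taken from the DensePose SMPL UV layout; the variable names are illustrative, and the front/back test below is only an approximation of true visibility:

```python
import numpy as np

def project_vertices(vertices, K, R, t):
    """Project SMPL vertices (N, 3) into pixel coordinates with a pinhole camera."""
    cam = vertices @ R.T + t          # world -> camera coordinates
    proj = cam @ K.T                  # apply intrinsics
    pix = proj[:, :2] / proj[:, 2:3]  # perspective divide -> (N, 2) pixel positions
    return pix, cam

def front_facing(normals_world, cam, R):
    """Treat a vertex as front-facing if its normal points toward the camera."""
    normals_cam = normals_world @ R.T                        # rotate normals into camera frame
    view_dirs = -cam / np.linalg.norm(cam, axis=1, keepdims=True)
    return np.sum(normals_cam * view_dirs, axis=1) > 0.0

# vertex_uv: (N, 2) per-vertex UV coordinates from the DensePose SMPL parameterization.
# For each camera view:
#   pix, cam = project_vertices(vertices, K, R, t)
#   visible = front_facing(vertex_normals, cam, R)
#   correspondences = np.hstack([pix[visible], vertex_uv[visible]])  # rows of (x, y, u, v)
# A z-buffer test would handle self-occlusion more accurately than the normal test above.
```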

  3. Unwrap the texture for each view of the multi-view data

[Figure: tex5]
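
A minimal sketch of unwrapping one view into a DensePose-style texture atlas, assuming the 24 part patches are tiled into a 4 x 6 grid at 200 x 200 texels per patch (an assumed resolution) and that the IUV map stores U and V as values in 0-255:

```python
import numpy as np

PATCH_RES = 200              # assumed per-patch texture resolution
GRID_ROWS, GRID_COLS = 4, 6  # 24 DensePose parts tiled into a 4 x 6 atlas

def unwrap_view(image, iuv):
    """image: (H, W, 3) RGB view; iuv: (H, W, 3) with channels (I, U, V), I in 0..24."""
    atlas = np.zeros((GRID_ROWS * PATCH_RES, GRID_COLS * PATCH_RES, 3), dtype=np.uint8)
    coverage = np.zeros(atlas.shape[:2], dtype=bool)

    fg = iuv[:, :, 0] > 0              # pixels that fall on one of the 24 parts
    part = iuv[fg, 0].astype(int) - 1  # 0-based part index
    u = (iuv[fg, 1] / 255.0 * (PATCH_RES - 1)).astype(int)
    v = (iuv[fg, 2] / 255.0 * (PATCH_RES - 1)).astype(int)

    # Place each part's (u, v) samples into that part's tile of the atlas.
    ty = (part // GRID_COLS) * PATCH_RES + v
    tx = (part % GRID_COLS) * PATCH_RES + u
    atlas[ty, tx] = image[fg]
    coverage[ty, tx] = True
    return atlas, coverage
```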

  4. Integrate the view-specific textures to form a complete texture of the subject

[Figure: tex6]
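
One plausible way to integrate the per-view textures into a single atlas, assuming the `unwrap_view` sketch above: average the texels observed by several views and keep single-view texels as they are.

```python
import numpy as np

def merge_textures(atlases, masks):
    """atlases: list of (H, W, 3) uint8 per-view textures; masks: matching boolean coverage maps."""
    acc = np.zeros(atlases[0].shape, dtype=np.float64)
    count = np.zeros(atlases[0].shape[:2], dtype=np.float64)

    for atlas, mask in zip(atlases, masks):
        acc[mask] += atlas[mask]
        count[mask] += 1.0

    covered = count > 0
    merged = np.zeros_like(acc)
    merged[covered] = acc[covered] / count[covered][:, None]  # per-texel average over views
    return merged.astype(np.uint8), covered
```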

  5. Fill in the gaps and artifacts in the texture (an inpainting sketch follows this list)

  6. Wrap the texture onto the SMPL mesh model
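
For the gap-filling step, one common option is classical image inpainting over the texels that no view observed, e.g. with OpenCV as sketched below; a learned inpainting model could be substituted.

```python
import cv2
import numpy as np

def fill_texture_gaps(texture, coverage_mask, radius=5):
    """Inpaint texels that were not observed in any view.

    texture: (H, W, 3) uint8 merged texture; coverage_mask: (H, W) bool, True where observed.
    """
    hole_mask = (~coverage_mask).astype(np.uint8) * 255
    return cv2.inpaint(texture, hole_mask, inpaintRadius=radius, flags=cv2.INPAINT_TELEA)
```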

TODO

  1. Fill in gaps and generate the missing parts of the texture
  2. Generate transition textures between two adjacent views (a simple cross-blending sketch follows)
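
For the second TODO item, a possible starting point (purely illustrative, not the decided method) is a linear cross-blend between the textures unwrapped from two adjacent views:

```python
import numpy as np

def blend_adjacent_views(tex_a, tex_b, mask_a, mask_b, alpha):
    """Blend two adjacent-view textures; alpha in [0, 1] moves from view A to view B.

    Where only one view observed a texel, that view's color is used directly.
    """
    tex_a = tex_a.astype(np.float64)
    tex_b = tex_b.astype(np.float64)
    both = mask_a & mask_b

    out = np.zeros_like(tex_a)
    out[mask_a] = tex_a[mask_a]
    out[mask_b] = tex_b[mask_b]
    out[both] = (1.0 - alpha) * tex_a[both] + alpha * tex_b[both]
    return out.astype(np.uint8)
```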

I plan to finish the task by the end of this semester.