[16-889] Learning for 3D Vision, Spring 2022

Assignment 1

 

Excuse the visual artifacts in the gifs due to compression!

1. Practicing with Cameras

1.1. 360-degree Renders (5 points)

360-degree render of the cow mesh

 
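A minimal sketch of how such a 360-degree render can be produced with PyTorch3D: orbit the camera around the mesh by sweeping the azimuth angle and save one frame per view. The cow `.obj` path, image size, camera distance, and light position are assumptions, not necessarily the values used above.

```python
import imageio
import numpy as np
import torch
from pytorch3d.io import load_objs_as_meshes
from pytorch3d.renderer import (
    FoVPerspectiveCameras, HardPhongShader, MeshRasterizer, MeshRenderer,
    PointLights, RasterizationSettings, look_at_view_transform,
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
mesh = load_objs_as_meshes(["data/cow.obj"], device=device)  # path is an assumption

renderer = MeshRenderer(
    rasterizer=MeshRasterizer(raster_settings=RasterizationSettings(image_size=256)),
    shader=HardPhongShader(device=device),
)
lights = PointLights(device=device, location=[[0.0, 0.0, -3.0]])

frames = []
for azim in np.linspace(0, 360, num=60, endpoint=False):
    # Orbit the camera around the mesh at a fixed distance and elevation.
    R, T = look_at_view_transform(dist=3.0, elev=0.0, azim=azim)
    cameras = FoVPerspectiveCameras(R=R, T=T, device=device)
    image = renderer(mesh, cameras=cameras, lights=lights)
    frames.append((image[0, ..., :3].cpu().numpy() * 255).astype(np.uint8))

imageio.mimsave("cow_360.gif", frames, fps=15)
```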

1.2 Re-creating the Dolly Zoom (10 points)

Dolly zoom effect of the cow mesh

 
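The dolly zoom keeps the subject the same apparent size while the field of view changes. A sketch of the idea, reusing `renderer`, `mesh`, `lights`, and `device` from the sketch above; the subject width and FOV range are assumptions:

```python
import numpy as np
import torch
from pytorch3d.renderer import FoVPerspectiveCameras

# Keep the apparent size fixed: width = 2 * dist * tan(fov / 2) stays constant,
# so as the field of view widens, the camera must move closer.
subject_width = 3.0  # assumed world-space extent of the cow to keep framed
frames = []
for fov in np.linspace(5, 120, num=30):
    dist = subject_width / (2 * np.tan(np.radians(fov) / 2))
    T = torch.tensor([[0.0, 0.0, dist]], dtype=torch.float32)
    cameras = FoVPerspectiveCameras(fov=float(fov), T=T, device=device)
    image = renderer(mesh, cameras=cameras, lights=lights)
    frames.append((image[0, ..., :3].cpu().numpy() * 255).astype(np.uint8))
```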

2. Practicing with Meshes

2.1 Constructing a Tetrahedron (5 points)

Constructing a tetrahedron requires 4 vertices and 4 triangular faces.
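A sketch of one way to build such a mesh in PyTorch3D (the specific vertex coordinates and the flat grey vertex texture are just one convenient choice):

```python
import torch
from pytorch3d.renderer import TexturesVertex
from pytorch3d.structures import Meshes

# 4 vertices of a regular tetrahedron.
verts = torch.tensor(
    [[1.0, 1.0, 1.0], [1.0, -1.0, -1.0], [-1.0, 1.0, -1.0], [-1.0, -1.0, 1.0]]
)
# 4 triangular faces, each indexing into `verts`.
faces = torch.tensor([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]], dtype=torch.int64)

textures = TexturesVertex(verts_features=torch.ones_like(verts)[None] * 0.7)
tetra = Meshes(verts=[verts], faces=[faces], textures=textures)
```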

Constructed tetrahedron mesh

 

2.2 Constructing a Cube (5 points)

Constructing a cube requires 8 vertices and 12 triangular faces (each of the 6 square faces is split into 2 triangles).
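A sketch of the cube construction under the same assumptions as the tetrahedron sketch above:

```python
import torch
from pytorch3d.renderer import TexturesVertex
from pytorch3d.structures import Meshes

# 8 corners of a cube centered at the origin.
verts = torch.tensor(
    [
        [-1, -1, -1], [1, -1, -1], [1, 1, -1], [-1, 1, -1],   # back face
        [-1, -1,  1], [1, -1,  1], [1, 1,  1], [-1, 1,  1],   # front face
    ],
    dtype=torch.float32,
)
# 12 triangles: each square face split into 2 triangles.
faces = torch.tensor(
    [
        [0, 1, 2], [0, 2, 3],  # back
        [4, 6, 5], [4, 7, 6],  # front
        [0, 4, 5], [0, 5, 1],  # bottom
        [3, 2, 6], [3, 6, 7],  # top
        [0, 3, 7], [0, 7, 4],  # left
        [1, 5, 6], [1, 6, 2],  # right
    ],
    dtype=torch.int64,
)
cube = Meshes(
    verts=[verts],
    faces=[faces],
    textures=TexturesVertex(verts_features=torch.ones_like(verts)[None] * 0.7),
)
```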

Constructed cube mesh

 

3. Re-texturing a mesh (10 points)

The colors used were: color1 = [0.0, 1.0, 0.0] and color2 = [1.0, 0.0, 1.0].
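A sketch of the re-texturing, assuming the standard approach of blending the two colors linearly along the z-axis of the cow and attaching the result as a per-vertex texture (`mesh` is the cow mesh from the earlier sketch):

```python
import torch
from pytorch3d.renderer import TexturesVertex

color1 = torch.tensor([0.0, 1.0, 0.0])   # green
color2 = torch.tensor([1.0, 0.0, 1.0])   # magenta

verts = mesh.verts_packed()               # (V, 3) cow vertices
z = verts[:, 2]
alpha = ((z - z.min()) / (z.max() - z.min()))[:, None]   # (V, 1) in [0, 1]

# Linearly blend between the two colors along the z-axis.
colors = alpha * color2.to(verts.device) + (1 - alpha) * color1.to(verts.device)  # (V, 3)
mesh.textures = TexturesVertex(verts_features=colors[None])
```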

Retextured cow mesh

 

4. Camera Transformations (20 points)

 
The relative transforms used for the four renders (in order):

1. R_relative = [[0, 1, 0], [1, 0, 0], [0, 0, 1]], T_relative = [0, 0, 0]
2. R_relative = [[0, 0, 1], [0, 1, 0], [1, 0, 0]], T_relative = [3, 0, 3]
3. R_relative = [[1, 0, 0], [0, 1, 0], [0, 0, 1]], T_relative = [0, 0, 3]
4. R_relative = [[1, 0, 0], [0, 1, 0], [0, 0, 1]], T_relative = [0.5, 0.5, 0]

An additional arbitrary view:

R_relative = [[0.8529, -0.1504, 0.5000], [0.2952, 0.9288, -0.2241], [-0.4307, 0.3388, 0.8365]], T_relative = [-3.0, 2.0, 3.0]

R_relative and T_relative describe a rigid transformation (rotation and translation only) of the camera with respect to the world origin.
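A sketch of how such a relative pose can be composed with a base camera and turned into a PyTorch3D camera. The base pose (looking at the cow from z = 3) and the composition order are assumptions and may differ from the starter code; `device` is reused from the earlier sketches.

```python
import torch
from pytorch3d.renderer import FoVPerspectiveCameras

# Assumed base camera pose.
R_0 = torch.eye(3)
T_0 = torch.tensor([0.0, 0.0, 3.0])

# Example: the first relative transform from the list above.
R_relative = torch.tensor([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
T_relative = torch.tensor([0.0, 0.0, 0.0])

# Compose the relative transform with the base pose (one common convention).
R = R_relative @ R_0
T = R_relative @ T_0 + T_relative
cameras = FoVPerspectiveCameras(R=R[None], T=T[None], device=device)
```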

 

5. Rendering Generic 3D Representations

5.1 Rendering Point Clouds from RGB-D Images (10 points)

The left and center gifs correspond to the two separate point clouds. The gif on the right shows the merged point cloud.
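A sketch of the unprojection step that turns a masked RGB-D image into a colored world-space point cloud. It assumes a standard pinhole intrinsic matrix K and OpenCV-style extrinsics (X_cam = R X_world + T), which differ from PyTorch3D's own camera convention; the function name and arguments are illustrative.

```python
import torch
from pytorch3d.structures import Pointclouds

def depth_to_pointcloud(rgb, depth, mask, K, R, T):
    """Unproject a masked RGB-D image into a world-space point cloud.

    rgb: (H, W, 3), depth: (H, W), mask: (H, W) bool,
    K: (3, 3) pinhole intrinsics, [R | T]: world-to-camera extrinsics.
    """
    H, W = depth.shape
    v, u = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
    u, v, z = u[mask].float(), v[mask].float(), depth[mask]

    # Back-project pixels to camera coordinates: X_cam = z * K^-1 [u, v, 1]^T
    pix = torch.stack([u, v, torch.ones_like(u)], dim=-1)        # (N, 3)
    cam_pts = (torch.linalg.inv(K) @ pix.T).T * z[:, None]       # (N, 3)

    # Camera-to-world: X_world = R^T (X_cam - T)
    world_pts = (cam_pts - T) @ R                                 # (N, 3)
    return Pointclouds(points=[world_pts], features=[rgb[mask]])
```

Merging two such point clouds is then just a concatenation of their points and features before constructing a single `Pointclouds` object.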

 

5.2 Parametric Functions (10 points)

Points sampled from the parametric surface of a torus. The gifs were created with 100 (left) and 500 (right) samples per parameter, respectively.
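A sketch of the sampling, using the standard torus parameterization; the radii, sample count, and position-based coloring are assumptions:

```python
import math
import torch
from pytorch3d.structures import Pointclouds

R_major, r_minor = 1.0, 0.4          # assumed torus radii
n = 100                              # samples per parameter (100 or 500 above)

theta = torch.linspace(0, 2 * math.pi, n)
phi = torch.linspace(0, 2 * math.pi, n)
theta, phi = torch.meshgrid(theta, phi, indexing="ij")

# Parametric torus: sweep a circle of radius r around a circle of radius R.
x = (R_major + r_minor * torch.cos(theta)) * torch.cos(phi)
y = (R_major + r_minor * torch.cos(theta)) * torch.sin(phi)
z = r_minor * torch.sin(theta)

points = torch.stack([x, y, z], dim=-1).reshape(-1, 3)
colors = (points - points.min()) / (points.max() - points.min())  # position-based coloring
torus_pc = Pointclouds(points=[points], features=[colors])
```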

 

5.3 Implicit Surfaces (15 points)

The implicit function of the torus is evaluated on a voxel grid; running marching cubes on this grid extracts a clean mesh, as shown below.
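A sketch of this pipeline, assuming the PyMCubes package (`mcubes`) for marching cubes; the radii, grid bounds, and resolution are assumptions:

```python
import torch
import mcubes        # PyMCubes; assumed marching-cubes package
from pytorch3d.structures import Meshes

R_major, r_minor = 1.0, 0.4
min_v, max_v, res = -1.6, 1.6, 64

coords = torch.linspace(min_v, max_v, res)
X, Y, Z = torch.meshgrid(coords, coords, coords, indexing="ij")

# Implicit torus: F(x, y, z) = (sqrt(x^2 + y^2) - R)^2 + z^2 - r^2
voxels = (torch.sqrt(X ** 2 + Y ** 2) - R_major) ** 2 + Z ** 2 - r_minor ** 2

# Extract the zero level set as a triangle mesh.
verts, faces = mcubes.marching_cubes(voxels.numpy(), 0.0)
verts = torch.tensor(verts, dtype=torch.float32)
# Rescale vertices from voxel indices back to world coordinates.
verts = verts * (max_v - min_v) / (res - 1) + min_v
faces = torch.tensor(faces.astype(int), dtype=torch.int64)
torus_mesh = Meshes(verts=[verts], faces=[faces])
```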

 

Rendering a mesh is easier because a point cloud must first be converted into 3D primitives (spheres, cubes) before it can be rasterized, incurring additional compute and memory overhead. Since a mesh comes with face information, the shading and blending step interpolates color/feature values within each face; for a point cloud, this step must be repeated for the primitive generated for every point. As a result, rendering a large mesh is faster than rendering a large point cloud with the same number of vertices.

6. Do Something Fun (10 points)

6.1 Adding material properties

One can add material properties to manipulate the rendering of a mesh. Increasing the shininess component produces sharper specular highlights on the mesh.

A mesh rendered with a material with specular color = [0.5, 0.5, 1.0] and shininess = 800
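A sketch of how this can be done with PyTorch3D's Materials, reusing `renderer`, `mesh`, and `device` from the earlier sketches; the camera pose and light position are assumptions:

```python
import torch
from pytorch3d.renderer import (
    FoVPerspectiveCameras, Materials, PointLights, look_at_view_transform,
)

R, T = look_at_view_transform(dist=3.0, elev=0.0, azim=30.0)
cameras = FoVPerspectiveCameras(R=R, T=T, device=device)
lights = PointLights(device=device, location=[[0.0, 0.0, -3.0]])

# Phong material with a bluish specular term and a tight, mirror-like highlight.
materials = Materials(
    device=device,
    specular_color=[[0.5, 0.5, 1.0]],
    shininess=800.0,
)

# The renderer forwards `materials` through to the Phong shading step.
image = renderer(mesh, cameras=cameras, lights=lights, materials=materials)
```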

 

6.2 Rendering surface normal maps

To render surface normal maps of meshes, one has to write a custom shader while leaving the rasterization step unchanged. The shader retrieves the face corresponding to each pixel (assuming no blending) and computes that face's normal by averaging its vertex normals.
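A minimal sketch of such a shader, assuming PyTorch3D's standard shader interface (it receives the rasterizer's fragments and the mesh); the background color and the [0, 1] remapping for display are assumptions:

```python
import torch
import torch.nn as nn

class NormalShader(nn.Module):
    """Minimal shader: color each pixel by its face's normal (no blending)."""

    def forward(self, fragments, meshes, **kwargs):
        verts_normals = meshes.verts_normals_packed()        # (V, 3)
        faces = meshes.faces_packed()                        # (F, 3)
        # Per-face normal: average of the face's three vertex normals.
        face_normals = verts_normals[faces].mean(dim=1)      # (F, 3)
        face_normals = nn.functional.normalize(face_normals, dim=-1)

        pix_to_face = fragments.pix_to_face[..., 0]          # (N, H, W): nearest face per pixel
        images = face_normals[pix_to_face]                   # (N, H, W, 3)

        # Map normals from [-1, 1] to [0, 1] for display; white background.
        images = (images + 1.0) / 2.0
        images[pix_to_face < 0] = 1.0
        return images
```

The shader drops in for the Phong shader in the usual renderer, e.g. `MeshRenderer(rasterizer=rasterizer, shader=NormalShader())`.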

Skull mesh with rendered surface normal map

 

(Extra Credit) 7. Sampling Points on Meshes (10 points)

From left to right: 10, 100, 1000, 10000 uniformly sampled points from the cow mesh (shown below)
Original cow mesh
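A sketch of the uniform sampling procedure: pick faces with probability proportional to their area, then sample uniform barycentric coordinates inside each chosen triangle. The function name is illustrative; PyTorch3D also provides `sample_points_from_meshes` for the same purpose.

```python
import torch
from pytorch3d.structures import Pointclouds

def sample_points(mesh, num_samples):
    """Uniformly sample points on a mesh surface (area-weighted faces + random barycentrics)."""
    verts = mesh.verts_packed()                        # (V, 3)
    faces = mesh.faces_packed()                        # (F, 3)
    tris = verts[faces]                                # (F, 3, 3)

    # 1. Pick faces with probability proportional to their area.
    areas = 0.5 * torch.cross(tris[:, 1] - tris[:, 0],
                              tris[:, 2] - tris[:, 0], dim=-1).norm(dim=-1)
    face_idx = torch.multinomial(areas, num_samples, replacement=True)

    # 2. Sample uniform barycentric coordinates inside each chosen triangle.
    u, v = torch.rand(num_samples), torch.rand(num_samples)
    flip = (u + v) > 1.0
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    w = 1.0 - u - v

    chosen = tris[face_idx]                            # (num_samples, 3, 3)
    points = (u[:, None] * chosen[:, 0] + v[:, None] * chosen[:, 1]
              + w[:, None] * chosen[:, 2])
    return Pointclouds(points=[points])
```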