
0️⃣ Before

Note

Please read the README file first. The following information may not be accurate.

We are using the following:

  • WebGL (Web Graphics Library), which, as MDN Web Docs defines it: "WebGL [...] is a JavaScript API [...] to create 3D and 2D graphics within any compatible web browser without the use of plug-ins".
  • HTML (HyperText Markup Language).
  • TypeScript, a strongly typed programming language that builds on JavaScript.
  • Webpack: "Webpack is a module bundler. Its main purpose is to bundle JavaScript files for usage in a browser [...]"

1️⃣ Basics

Note

The GPU excels at parallel tasks / multi-tasking and uses Shaders to control the Rendering Pipeline. Different graphics APIs (OpenGL, WebGL, DirectX, Vulkan, etc.) allow us to communicate with the GPU.

A simplified view of the Rendering Pipeline:

  • Vertices: the coordinates of each vertex.
  • Vertex shader: transforms each vertex's position (and passes along color or texture data) on the GPU.
  • Rasterize: determines which pixels fall within the perimeter of the vertices and need to be colored.
  • Fragment shader: computes the color of each of those pixels.
  • Final display: the finished image is presented on the canvas.

The web page (index.html) contains a <canvas> element. In main.ts, you get that canvas element and check whether your browser supports the WebGL2 context.

Multiple ways exist to render a triangle/point on screen; here is one:
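As an illustrative sketch (the function and constant names here are our own, and the actual main.ts may differ), the whole pipeline for one triangle can look like this:

```typescript
// Minimal single-triangle render, assuming a WebGL2 context `gl`
// obtained elsewhere, e.g. canvas.getContext("webgl2").
const VERTEX_SRC = `#version 300 es
in vec2 position;
void main() {
  gl_Position = vec4(position, 0.0, 1.0);
}`;

const FRAGMENT_SRC = `#version 300 es
precision mediump float;
out vec4 fragColor;
void main() {
  fragColor = vec4(1.0, 0.5, 0.2, 1.0); // constant orange
}`;

// One triangle in clip space: [x, y] per vertex.
const TRIANGLE_VERTICES = new Float32Array([
   0.0,  0.5,
  -0.5, -0.5,
   0.5, -0.5,
]);

function drawTriangle(gl: any): void {
  // Compile both shaders and link them into a program.
  const program = gl.createProgram();
  for (const [type, src] of [
    [gl.VERTEX_SHADER, VERTEX_SRC],
    [gl.FRAGMENT_SHADER, FRAGMENT_SRC],
  ]) {
    const shader = gl.createShader(type);
    gl.shaderSource(shader, src);
    gl.compileShader(shader);
    gl.attachShader(program, shader);
  }
  gl.linkProgram(program);
  gl.useProgram(program);

  // Upload the vertices and describe their layout (2 floats per vertex).
  gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
  gl.bufferData(gl.ARRAY_BUFFER, TRIANGLE_VERTICES, gl.STATIC_DRAW);
  const loc = gl.getAttribLocation(program, "position");
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

  // Rasterize the triangle and run the fragment shader for each pixel.
  gl.drawArrays(gl.TRIANGLES, 0, 3);
}
```

In a real project you would also check compile/link status with gl.getShaderParameter and gl.getProgramParameter before drawing.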

This process is a bare-bones render. You can extend it with more complex shaders, or with further functions and steps to automate the process.

2️⃣ Animation 🔁

To add movement or animation, we have to call a function each frame using requestAnimationFrame(). Animation is used to add movement, size changes, rotation, and other real-time effects.

Delta time (dt) is calculated using performance.now(). It gives the time difference between the current frame and the previous one in ms (milliseconds).
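A hypothetical sketch of such a loop (the names deltaSeconds and frame are our own; the actual main.ts may differ):

```typescript
// performance.now() returns a timestamp in milliseconds.
function deltaSeconds(previousMs: number, nowMs: number): number {
  return (nowMs - previousMs) / 1000; // convert ms to seconds
}

let last = performance.now();

function frame(now: number): void {
  const dt = deltaSeconds(last, now); // time elapsed since the previous frame
  last = now;
  // update(dt); render();            // advance the scene by dt seconds
  (globalThis as any).requestAnimationFrame?.(frame); // schedule the next frame (browser API)
}

// In the browser, start the loop with: requestAnimationFrame(frame);
```

Scaling movement by dt keeps animation speed independent of the frame rate.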

3️⃣ Matrices

Note

Matrices, vectors, and quaternions are tricky and time-consuming to understand. Libraries such as glMatrix are available to ease the process.

In class.ts, the Vector3 class is used to store 3 position coordinates [x, y, z]. Quaternions are used for rotation to avoid gimbal lock, while matrices are still used to transform world and camera space, handle scaling, translate objects, and more.
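An illustrative sketch of such a class (the real class.ts may expose different methods):

```typescript
// Minimal 3-component vector storing [x, y, z].
class Vector3 {
  constructor(public x: number, public y: number, public z: number) {}

  // Component-wise addition, e.g. moving a position by an offset.
  add(v: Vector3): Vector3 {
    return new Vector3(this.x + v.x, this.y + v.y, this.z + v.z);
  }

  // Uniform scaling, e.g. velocity * deltaTime.
  scale(s: number): Vector3 {
    return new Vector3(this.x * s, this.y * s, this.z * s);
  }

  // Euclidean length, handy for normalizing directions.
  length(): number {
    return Math.hypot(this.x, this.y, this.z);
  }
}
```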

4️⃣ A world in 3D

With 3D objects, vertices tend to overlap. So, indices are used to specify the drawing order, which avoids storing multiple vertices at the same coordinates.
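A small sketch of indexed data for a quad (names are our own): without indices, the two triangles would need 6 vertices, two of them duplicated; with indices, 4 are enough.

```typescript
// Four unique corners, [x, y] each.
const QUAD_VERTICES = new Float32Array([
  -0.5, -0.5, // 0: bottom left
   0.5, -0.5, // 1: bottom right
   0.5,  0.5, // 2: top right
  -0.5,  0.5, // 3: top left
]);

// Each triple is one triangle; corners 0 and 2 are reused.
const QUAD_INDICES = new Uint16Array([
  0, 1, 2, // first triangle
  0, 2, 3, // second triangle
]);

// These would be uploaded to an ELEMENT_ARRAY_BUFFER and drawn with
// gl.drawElements(gl.TRIANGLES, QUAD_INDICES.length, gl.UNSIGNED_SHORT, 0);
```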

5️⃣ Textures

Tip

PNG files have their origin at the top, while WebGL sets it at the bottom.
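One common way to reconcile the two origins is to flip the image vertically at upload time (a sketch; a project may instead flip the [u,v] coordinates):

```typescript
// UNPACK_FLIP_Y_WEBGL makes WebGL flip images vertically on upload,
// so the PNG's top-left origin matches WebGL's bottom-left UV origin.
function flipYOnUpload(gl: any): void {
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
}
```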

Vertex color is replaced by fragColor = texture(sampler, textureCoord) in the fragment_shader. The sampler decides the displayed color for each of the texture's pixels (blending might happen), and multiple samplers can be used. textureCoord represents the coordinates of each pixel, called [u, v] (also written [x, y]). textureCoord is an attribute; it needs to be passed in/out from the vertex_shader.

Texture blending is possible by multiplying one texture() call by another.
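A hypothetical fragment shader combining the two points above (the identifiers sampler, sampler2, and textureCoord are assumptions; the project's shader may name them differently):

```typescript
// Fragment shader source, stored as a string for later compilation.
const BLEND_FRAGMENT_SRC = `#version 300 es
precision mediump float;

uniform sampler2D sampler;   // first texture
uniform sampler2D sampler2;  // second texture
in vec2 textureCoord;        // [u, v], passed in from the vertex shader
out vec4 fragColor;

void main() {
  // Multiplying two texture() lookups blends the textures.
  fragColor = texture(sampler, textureCoord) * texture(sampler2, textureCoord);
}`;
```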

WebGL requires mipmaps for the default minification filter. They can be auto-generated with gl.generateMipmap(gl.TEXTURE_2D) or set manually.

6️⃣ Mipmaps

Mipmaps are LODs (levels of detail): pre-downscaled copies of a texture, all addressed by the same UV coordinates running from [0, 0] (bottom left) to [1, 1] (top right). Each mipmap level halves the pixel dimensions of the previous one, so a higher mipmap level gives a smaller, more pixelated result.

The NEAREST filter considers only the closest mipmap level, which looks most like the original, while LINEAR takes both the level just below and just above the computed one. Therefore, for a computed mipmap level of 4.43, NEAREST uses level 4, while LINEAR blends levels 4 and 5.
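The arithmetic above can be sketched as small helpers (names are our own; the GPU does this selection in hardware):

```typescript
// Each mipmap level halves the previous one, down to a 1-pixel floor.
function mipSize(baseSize: number, level: number): number {
  return Math.max(1, Math.floor(baseSize / 2 ** level));
}

// NEAREST-style selection: snap the computed level to the closest integer.
function nearestLevel(computed: number): number {
  return Math.round(computed);
}

// LINEAR-style selection: the two adjacent levels that get blended.
function linearLevels(computed: number): [number, number] {
  return [Math.floor(computed), Math.ceil(computed)];
}
```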

7️⃣ Texture Arrays or Texture Atlas

First, we used a Texture Atlas: a way to store all our textures in a single image file. However, this requires us to manage the [u, v] position of each texture, in order to know its location within the atlas. The problem with a Texture Atlas is that at high mipmap levels, downscaled textures can 'bleed' their pixels onto neighboring textures.
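Managing those [u, v] positions can be sketched as a helper (a hypothetical name, assuming a square atlas laid out as a uniform grid of tiles):

```typescript
// Compute the UV sub-rectangle of one tile in a grid-layout atlas.
// Tile 0 is the first tile of the first row.
function atlasUV(
  tileIndex: number,
  tilesPerRow: number,
): { u0: number; v0: number; u1: number; v1: number } {
  const size = 1 / tilesPerRow; // each tile's extent in UV space
  const col = tileIndex % tilesPerRow;
  const row = Math.floor(tileIndex / tilesPerRow);
  const u0 = col * size;
  const v0 = row * size;
  return { u0, v0, u1: u0 + size, v1: v0 + size };
}
```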

For a Texture Atlas we use gl.texImage2D(Target, Mipmap_Level, Internal_Format, Width, Height, Border, Format, Type, Source), because we only upload one 2D image.

A solution that avoids texture bleeding is Texture Arrays. This technique uses a function called texStorage3D to create a "pile of textures" in 3D. Its arguments are a Target (TEXTURE_2D_ARRAY), Mipmap_Levels (e.g. 1), a sized Internal_Format (e.g. RGBA8), Width (128), Height (128), and Images_Count.

Then, we can add the textures one by one using texSubImage3D, specifying each one's depth. Depth is a new coordinate alongside [u, v]: it lets us pick the image at the specified depth from our "pile of images".
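Putting the two calls together (a sketch assuming 128x128 RGBA images and a single mip level; the project's actual arguments may differ):

```typescript
// Allocate a texture array and copy one image into each depth layer.
function uploadTextureArray(gl: any, images: any[]): void {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D_ARRAY, texture);

  // Storage for the whole "pile" up front:
  // target, mip levels, sized internal format, width, height, image count.
  gl.texStorage3D(gl.TEXTURE_2D_ARRAY, 1, gl.RGBA8, 128, 128, images.length);

  images.forEach((image, depth) => {
    gl.texSubImage3D(
      gl.TEXTURE_2D_ARRAY, // target
      0,                   // mip level
      0, 0, depth,         // x, y, depth offsets: which layer to fill
      128, 128, 1,         // width, height, layer count
      gl.RGBA, gl.UNSIGNED_BYTE,
      image,
    );
  });
}

// Memory the allocation above reserves: 4 bytes per RGBA8 pixel.
function textureArrayBytes(width: number, height: number, layers: number): number {
  return width * height * 4 * layers;
}
```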

Warning

⚠ Image loading is asynchronous, so preload your images before uploading them with texSubImage3D, to reduce loading time.

Note

You can still use a Texture Atlas and divide it into individual textures to make a Texture Array.

8️⃣ Loading a model

Models are stored in .gltf and .bin files.

As the Blender 3.3 Manual states : "glTF™ (GL Transmission Format) is used for transmission and loading of 3D models in web and native applications. glTF reduces the size of 3D models and the runtime processing needed to unpack and render those models. This format is commonly used on the web, and has support in various 3D engines such as Unity3D, Unreal Engine 4, and Godot."

Model binary data is stored in a .bin file, and the .gltf file states where to find that data. So, we need a way to read that data, as in model.ts. In addition, we create an interface called GLTF to ease the process of locating/storing the data (in buffers).
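An illustrative subset of that interface (the real GLTF interface in model.ts likely covers more of the glTF JSON layout, and accessorByteRange is a hypothetical helper; it also ignores the accessor's own byteOffset for brevity):

```typescript
// A few of the fields a glTF JSON file provides for locating data.
interface GLTF {
  buffers: { uri: string; byteLength: number }[];
  bufferViews: { buffer: number; byteOffset?: number; byteLength: number }[];
  accessors: { bufferView: number; count: number; type: string }[];
}

// Locate the byte range of an accessor's data inside its .bin buffer.
function accessorByteRange(
  gltf: GLTF,
  accessorIndex: number,
): { offset: number; length: number } {
  const accessor = gltf.accessors[accessorIndex];
  const view = gltf.bufferViews[accessor.bufferView];
  return { offset: view.byteOffset ?? 0, length: view.byteLength };
}
```

With the range in hand, the vertex data can be sliced out of the loaded .bin ArrayBuffer and uploaded to the GPU.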

9️⃣ Lighting 💡

Warning

⚠ TODO: Explain this part. It is a tad complex.