
Depth Testing

Unless we check the depth of our primitives, some faces that should be hidden from view might not be. This can produce unexpected results, as we saw in Figure 1-15. Enabling depth testing is easy and involves calling this:
gl.enable(gl.DEPTH_TEST);
We will also clear the depth buffer in our setupWebGL function:
gl.clear(gl.COLOR_BUFFER_BIT|gl.DEPTH_BUFFER_BIT);
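To build intuition for what these two calls accomplish, here is a toy model of a depth buffer in plain JavaScript. It is an illustrative sketch, not WebGL API code: the function names are invented, and it models the default depth comparison (gl.LESS), where a fragment is kept only if it is nearer than the value already stored for that pixel.

```javascript
// Toy depth buffer. Clearing with gl.DEPTH_BUFFER_BIT resets every
// entry to the far value, 1.0 -- this sketch does the same.
function createDepthBuffer(size) {
  return new Float32Array(size).fill(1.0);
}

// The default depth test (gl.LESS): pass if the incoming fragment is
// nearer (smaller depth) than the stored value, and record its depth.
function depthTest(buffer, index, fragmentDepth) {
  if (fragmentDepth < buffer[index]) {
    buffer[index] = fragmentDepth; // fragment passes; store new depth
    return true;
  }
  return false; // fragment is hidden behind earlier geometry; discard it
}

const depth = createDepthBuffer(4);
depthTest(depth, 0, 0.8); // true  -- nearer than the cleared value 1.0
depthTest(depth, 0, 0.9); // false -- farther than the stored 0.8
```

This is why clearing the depth buffer each frame matters: stale depth values from the previous frame would wrongly hide new fragments.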
In Figure 1-16, you can see a more expected result.
Figure 1-16. After enabling the depth test, everything looks as it should
In this chapter we have shown how to color a 3D mesh. In Chapter 3, we will come back to this last example and apply texture and lighting to it.
Summary
In this chapter, we have made great strides going from a blank canvas to a moving 3D object. Even though this was the first chapter, in a lot of ways it was a tough one because we needed to introduce so many new concepts at once. So congratulations on making it this far; now we can build upon our new skills in the forthcoming chapters. In the next chapter, we will dive into the details of the OpenGL Shading Language (GLSL) and start exploring the capabilities of vertex and fragment shaders. We’re just getting started with what WebGL can do!
CHAPTER 2

Shaders 101
In this chapter, we will be covering the GL Shading Language (GLSL) in depth. Topics that we will cover include:

- an overview of the WebGL graphics pipeline
- the difference between fixed functionality and modern-day programmable shaders
- the role of vertex shaders and fragment shaders within GLSL
- how to create and use shaders within a WebGL application
- a detailed overview of GLSL, including its primitive types and built-in functions
- examples of procedural fragment shaders
Graphics Pipelines
A graphics pipeline consists of the steps that an image goes through from initial definition to final screen rendering. These steps are performed in a predefined order, and each component of the pipeline can be either fixed in functionality or programmable.
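The ordered stages described above can be sketched as a chain of functions. This is a toy model in plain JavaScript, not how a GPU actually executes (real rasterization interpolates fragments across triangles), and all the names here are invented for illustration:

```javascript
// "Vertex shader" stage: transform each vertex; here, a uniform scale.
function toyVertexShader(vertex, uniforms) {
  return { x: vertex.x * uniforms.scale, y: vertex.y * uniforms.scale };
}

// Fixed stage between the shaders: real rasterization would generate
// many fragments per triangle; this toy version passes vertices through.
function rasterize(vertices) {
  return vertices;
}

// "Fragment shader" stage: decide the color of each fragment.
function toyFragmentShader(fragment) {
  return { r: fragment.x, g: fragment.y, b: 0.0 };
}

// The pipeline runs the stages in a fixed order, but the shader
// stages themselves are programmable.
function runPipeline(vertices, uniforms) {
  const transformed = vertices.map(v => toyVertexShader(v, uniforms));
  const fragments = rasterize(transformed);
  return fragments.map(toyFragmentShader);
}

const out = runPipeline([{ x: 0.5, y: 0.25 }], { scale: 2.0 });
// out[0] is { r: 1, g: 0.5, b: 0 }
```

The key point the sketch captures is the division of labor: the order of stages is fixed by the pipeline, while the per-vertex and per-fragment logic is supplied by the programmer.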
Fixed Functionality or Programmable Shaders
The more traditional graphics pipeline has a fixed implementation. The initial image definition would be the set of vertex location points and information associated with these points, such as color, a normal vector, and texture coordinates. With fixed functionality, operations are done in a set order. You can disable some elements, such as lighting or texturing, but not modify how the underlying lighting or texturing calculations are done. The graphics pipeline of OpenGL before version 2.0 used fixed functionality only.
Fixed functionality, as its name suggests, is quite rigid. It allows for quicker and easier generation of images because lighting formulas and shading are already built into the system. However, it limits what we can accomplish because we cannot override these settings. OpenGL fixed functionality had separate pipeline steps for vertex transformations and lighting. This is now all done within the vertex shader (VS) and fragment shader (FS). Similarly, texture application, color summation, fog, and alpha testing were all discrete steps. Now these components are done within the FS.
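To make the contrast concrete, here is a minimal shader pair of the kind this chapter develops: the transform work of the old fixed vertex stage lives in the vertex shader, and the per-fragment coloring lives in the fragment shader. The names aPosition and uMVPMatrix are illustrative choices, not fixed by WebGL, and the two shaders are compiled as separate units:

```glsl
// Vertex shader: the fixed transform stage, now programmable.
attribute vec3 aPosition;
uniform mat4 uMVPMatrix;
void main() {
  gl_Position = uMVPMatrix * vec4(aPosition, 1.0);
}

// Fragment shader (separate compilation unit): texturing, fog, and so
// on would be computed here; this one just outputs a solid orange.
precision mediump float;
void main() {
  gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);
}
```

Because both stages are ordinary programs, any lighting or texturing formula can be replaced entirely, which is exactly what fixed functionality forbade.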
A high-level view of how the WebGL API and the programmable and nonprogrammable components of the pipeline interact is shown in Figure 2-1.
B. Danchilla, Beginning WebGL for HTML5
© Brian Danchilla 2012
