
Sean Figel

CST 325
Computer Graphics
12/15/2018
CST 325 Final Paper

Deconstruction of the Genesis Torpedo from Star Trek II

The scene begins with a single sphere in world space, lit by a directional light to the right of the sphere, presented to the viewer. Shading has been applied to the sphere so that the surface on the right is fully lit, while the center and left sides of the sphere receive only the ambient light added in the shader. This lighting could be recreated using raytracing similar to what we did in module 2.
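A minimal sketch of how that split between a fully lit side and an ambient-only side could be computed per surface point, assuming a simple Lambert diffuse term plus a constant ambient term (the vector helpers, light direction, and constants below are illustrative assumptions, not values taken from the scene):

    // Sketch: ambient + Lambert diffuse shading at a point on the sphere.
    type Vec3 = { x: number; y: number; z: number };

    const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
    const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
    const normalize = (v: Vec3): Vec3 => {
      const len = Math.sqrt(dot(v, v));
      return { x: v.x / len, y: v.y / len, z: v.z / len };
    };

    // Directional light coming from the right of the sphere (assumed direction).
    const lightDir = normalize({ x: 1, y: 0, z: 0 });
    const ambient = 0.15;          // constant ambient contribution
    const diffuseStrength = 0.85;  // scales the Lambert term

    // Shade a point on a sphere centered at `center`.
    function shadeSpherePoint(point: Vec3, center: Vec3): number {
      const normal = normalize(sub(point, center));
      // Lambert term: fully lit where the normal faces the light,
      // falling to zero on the far side, leaving only ambient.
      const lambert = Math.max(0, dot(normal, lightDir));
      return Math.min(1, ambient + diffuseStrength * lambert);
    }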

As the scene progresses, the view perspective moves towards the sphere by changing the z-axis variable associated with the view perspective. During this translation of the camera matrix, the torpedo object that began outside of clip space starts to move towards the sphere: its position in world space is gradually translated until it reaches the intersection point found by casting a ray from the torpedo object to a point on the sphere. During the translation towards the intercept point, the torpedo is also rotated around that point before continuing towards it. This could have been achieved by creating an origin at the intercept point to use as a rotation point for the torpedo.
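One way to find that impact point is the standard ray-sphere intersection test from the raytracing module, and then move the torpedo a fraction of the remaining distance toward it each frame. The sketch below assumes a normalized ray direction and a simple linear step; the names and the step size are placeholders:

    // Sketch: ray-sphere intersection for the impact point, plus a per-frame
    // step that moves the torpedo toward that point.
    type Vec3 = { x: number; y: number; z: number };

    const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
    const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
    const scale = (v: Vec3, s: number): Vec3 => ({ x: v.x * s, y: v.y * s, z: v.z * s });
    const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

    // Nearest intersection of a ray with a sphere, or null if the ray misses.
    // Assumes `dir` is normalized (so the quadratic's a term is 1).
    function raySphereIntersect(origin: Vec3, dir: Vec3, center: Vec3, radius: number): Vec3 | null {
      const oc = sub(origin, center);
      const b = 2 * dot(oc, dir);
      const c = dot(oc, oc) - radius * radius;
      const disc = b * b - 4 * c;
      if (disc < 0) return null;             // ray misses the sphere
      const t = (-b - Math.sqrt(disc)) / 2;  // nearest hit along the ray
      if (t < 0) return null;
      return add(origin, scale(dir, t));
    }

    // Move the torpedo a fixed fraction of the remaining distance each frame.
    function stepTorpedo(torpedoPos: Vec3, impactPoint: Vec3, speed = 0.05): Vec3 {
      return add(torpedoPos, scale(sub(impactPoint, torpedoPos), speed));
    }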

Upon impact, the point of interception produces both a point light source at the interception point and a slightly transparent sphere that expands outward. As the sphere expands, the point light loses its illumination and the sphere becomes more transparent.
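That fade-out could be driven by a single normalized time value that scales the blast sphere up while driving both the light intensity and the sphere's opacity down. A hedged sketch of that idea; every constant here is a guess rather than a value from the scene:

    // Sketch: one normalized time value t in [0, 1] drives the whole impact flash.
    interface ImpactFlash {
      lightIntensity: number; // point light at the interception point
      sphereRadius: number;   // expanding blast sphere
      sphereAlpha: number;    // 1 = opaque, 0 = fully transparent
    }

    function impactFlashAt(t: number): ImpactFlash {
      const k = Math.min(1, Math.max(0, t));
      return {
        lightIntensity: 1 - k,        // light dies off as time passes
        sphereRadius: 0.1 + 2.0 * k,  // sphere grows outward
        sphereAlpha: 0.6 * (1 - k),   // starts slightly transparent, fades out
      };
    }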
The interception point then changes again as particle objects begin to expand outward, each following a vector that moves away from the point of interception. The interception point also has a different texture applied to differentiate it from the rest of the planet texture.
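A particle system like that can be as simple as spawning each particle at the interception point with a random outward direction, then advancing it along its own vector every frame. A small sketch under those assumptions (all names and constants are illustrative):

    // Sketch: particles spawned at the interception point, each moving outward.
    type Vec3 = { x: number; y: number; z: number };

    interface Particle {
      position: Vec3;
      velocity: Vec3; // points away from the interception point
    }

    function spawnParticles(impact: Vec3, count: number, speed = 0.5): Particle[] {
      const particles: Particle[] = [];
      for (let i = 0; i < count; i++) {
        // Uniform random direction on the unit sphere.
        const theta = Math.random() * 2 * Math.PI;
        const z = Math.random() * 2 - 1;
        const r = Math.sqrt(1 - z * z);
        particles.push({
          position: { ...impact },
          velocity: { x: r * Math.cos(theta) * speed, y: r * Math.sin(theta) * speed, z: z * speed },
        });
      }
      return particles;
    }

    function updateParticles(particles: Particle[], dt: number): void {
      for (const p of particles) {
        p.position.x += p.velocity.x * dt;
        p.position.y += p.velocity.y * dt;
        p.position.z += p.velocity.z * dt;
      }
    }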

As the fire begins to move across the surface of the sphere, it looks like this effect could be done through Phong shading. The edge of the light could be calculated based on the z value of the pixel in relation to the point of impact of the torpedo: pixels that are further from the interception point stay black and only begin to brighten the planet texture as the flame front approaches, until they reach the point that the next flame texture has expanded to.
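Rather than a full Phong model, the visible effect could also be approximated per surface point by comparing its distance from the impact point against the current radius of the fire front: points well ahead of the front stay black and brighten as the front reaches them. This is my own reading of the effect, not the film's actual method, and all names and constants are placeholders:

    // Sketch: brightness driven by distance from the impact point relative to
    // the radius of the expanding fire front.
    type Vec3 = { x: number; y: number; z: number };

    function distance(a: Vec3, b: Vec3): number {
      return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
    }

    // Returns a brightness multiplier for the planet texture at `point`.
    function fireBrightness(point: Vec3, impact: Vec3, frontRadius: number, bandWidth = 0.5): number {
      const d = distance(point, impact);
      if (d <= frontRadius) return 1;             // flame has already overtaken this point
      const ahead = d - frontRadius;              // how far the point is ahead of the front
      return Math.max(0, 1 - ahead / bandWidth);  // brightens as the front approaches
    }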

As the camera rotates around the planet, the depth of the craters on the texture could be created using a shadow texture on the planet to provide increased depth as the angle of the view perspective changes. When the fire moves to overtake the camera, the texture of the planet is changed by pushing its color towards white, with the fire effect perhaps being created using vectors that carry a gradient color from white at the starting point of the vector to a dark orange at the end of the vector.
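That gradient could be a simple linear interpolation between white at the start of each fire vector and dark orange at its end; a short sketch, with the RGB endpoint values as guesses:

    // Sketch: linear blend from white (t = 0, start of the vector) to dark
    // orange (t = 1, end of the vector).
    type Color = [number, number, number];

    const white: Color = [1.0, 1.0, 1.0];
    const darkOrange: Color = [0.6, 0.25, 0.0];

    function fireStreakColor(t: number): Color {
      const k = Math.min(1, Math.max(0, t));
      return [
        white[0] + (darkOrange[0] - white[0]) * k,
        white[1] + (darkOrange[1] - white[1]) * k,
        white[2] + (darkOrange[2] - white[2]) * k,
      ];
    }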

Once the view perspective has changed from a perspective projection to an orthographic projection of the planet, the rendering changes from a simple sphere object to more jagged polygons. As the view perspective moves through the scene, the view volume changes to limit what the rasterization process needs to project into clip space.
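The two projections differ only in the matrix handed to the vertex stage. A sketch of both in column-major layout (the convention OpenGL-style APIs such as WebGL expect); the parameter values that would be passed in are left as assumptions:

    // Sketch: perspective vs. orthographic projection matrices (column-major).
    function perspective(fovY: number, aspect: number, near: number, far: number): number[] {
      const f = 1 / Math.tan(fovY / 2);
      return [
        f / aspect, 0, 0, 0,
        0, f, 0, 0,
        0, 0, (far + near) / (near - far), -1,
        0, 0, (2 * far * near) / (near - far), 0,
      ];
    }

    function orthographic(l: number, r: number, b: number, t: number, near: number, far: number): number[] {
      return [
        2 / (r - l), 0, 0, 0,
        0, 2 / (t - b), 0, 0,
        0, 0, -2 / (far - near), 0,
        -(r + l) / (r - l), -(t + b) / (t - b), -(far + near) / (far - near), 1,
      ];
    }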

Looking towards some of the edges in the scene, there are some artifacts present that could have been caused by the shading algorithm: some pixels may be matching cases in the shader where the pixel should be shadowed but instead receive a lighter texture, producing a slight flicker on some of the polygons.
During the scene there also seems to be some portion of the code where, once the view matrix reaches a specific point in world space, a triangle-shaped section of pixels is removed from the object closest to the camera. I think this could have been accomplished using a variation of the Painter's Algorithm: by removing the object that is closest to the view perspective, the next closest object becomes visible through the opening created. Since the view perspective and view volume are moving very rapidly through world space, the geometric processing for the cliffs and mountain ranges allows them to be more jagged, as they can be created with fewer polygons to save processing time.
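The Painter's Algorithm itself is just a back-to-front depth sort before drawing, so dropping the nearest object from the sorted list would be enough to expose whatever sits behind it. A sketch under that reading, with illustrative names:

    // Sketch: back-to-front depth sort (Painter's Algorithm), with an option
    // to skip the object nearest the camera so the next one shows through.
    interface Drawable {
      depth: number;     // distance from the camera along the view direction
      draw: () => void;
    }

    function paintScene(objects: Drawable[], dropNearest = false): void {
      // Farthest first, so nearer objects are painted over them.
      const ordered = [...objects].sort((a, b) => b.depth - a.depth);
      if (dropNearest) ordered.pop();  // remove the object closest to the camera
      for (const obj of ordered) obj.draw();
    }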

The water filling up the lower sections of the planet could be created using a plane with a simple light blue texture, slowly increasing the y-axis value of the plane in world space as the scene continues, with a maximum limit on the y value for the plane.
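Raising the plane can be a per-frame increment on its world-space y value, clamped at the chosen maximum. A minimal sketch; the starting height, rate, and limit are guesses:

    // Sketch: the water plane rises each frame until it reaches its maximum.
    let waterY = -5.0;               // starting height of the water plane
    const waterRisePerSecond = 0.2;  // how quickly the water climbs
    const waterMaxY = 0.0;           // highest the water is allowed to reach

    function updateWater(dt: number): void {
      waterY = Math.min(waterMaxY, waterY + waterRisePerSecond * dt);
    }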

There are also different textures presented on the sections of land that are not covered by the water plane, which could be created by using scanline rendering on the frame. During this, the ambient light is increased as the planet-creation scene continues, to give the effect that light is being diffused more and more as the atmosphere forms on the planet. The skybox texture also changes from black with white dots, simulating space, to a blue color by slowly decreasing the transparency of the blue layer, giving the viewer a progressive ambient lighting effect as well.
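Both of those gradual changes could share the same clock: one normalized value ramps the ambient term up while the opacity of a blue layer over the starfield skybox increases. A sketch under that assumption, with guessed constants:

    // Sketch: a single normalized time value drives both the growing ambient
    // light and the blue layer fading in over the starfield skybox.
    interface AtmosphereState {
      ambientLevel: number;  // ambient light term fed to the shader
      skyBlueAlpha: number;  // opacity of the blue color over the black starfield
    }

    function atmosphereAt(t: number): AtmosphereState {
      const k = Math.min(1, Math.max(0, t));
      return {
        ambientLevel: 0.1 + 0.5 * k,  // more diffuse light as the atmosphere forms
        skyBlueAlpha: k,              // stars disappear as the blue layer becomes opaque
      };
    }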

When the scene transitions to the final scene, there is a jarring jump where the view perspective is moved to a specific point in world space. This movement in world space happens just after the 52 second mark in the clip.
This could have been created by coding a specific view matrix value for the view perspective to take at a specific time in the clip. The objects within world space are the same between the frames before and after the jump, so the only thing that changes is the camera's location within world space.
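Such a cut could be nothing more than a hard-coded camera keyframe: once the clip's clock passes the chosen time, the view position snaps to a new world-space point. A sketch with placeholder positions (only the 52 second mark comes from the clip):

    // Sketch: a hard camera cut implemented as a keyframe on the clip's clock.
    type Vec3 = { x: number; y: number; z: number };

    const cutTimeSeconds = 52;                           // just after the 52 second mark
    const positionBeforeCut: Vec3 = { x: 0, y: 2, z: 10 };   // placeholder
    const positionAfterCut: Vec3 = { x: 40, y: 5, z: -15 };  // placeholder

    function cameraPositionAt(timeSeconds: number): Vec3 {
      return timeSeconds < cutTimeSeconds ? positionBeforeCut : positionAfterCut;
    }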

The snow might be created by using a second camera to provide a different perspective view of the scene and then deriving a minimum and maximum value for the pixels within that view: if a pixel in clip space is above a specific value, the pixel is colored white. This fits with the fact that there seems to be a straight line through all of the mountains above a specific point in the scene.
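The simplest version of that snow line is a height threshold: any surface point above a chosen value is colored white, which produces exactly the kind of straight line across the mountains described above. A sketch; the threshold value is a guess:

    // Sketch: snow as a hard height threshold. Points above the snow line are
    // colored white; everything else keeps the terrain texture color.
    type Color = [number, number, number];

    const snowLineY = 8.0;              // illustrative threshold
    const snowColor: Color = [1, 1, 1];

    function terrainColor(worldY: number, baseColor: Color): Color {
      return worldY > snowLineY ? snowColor : baseColor;
    }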

As the camera moves away from the planet, the field of view begins to change, creating the look of zooming out over the water plane. The solid color of the water plane texture is then used to transition the scene back to a view outside of the planet and to a perspective projection again. The sphere in world space now has a texture of clouds and land masses, and a directional light illuminates the surface of the sphere from the bottom right of world space, with the calculation of the illumination being done through ray tracing. The view matrix during this is being translated and rotated slowly to provide the effect of moving away from the planet and out into space.
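The pull-back could combine a slow widening of the field of view with a gentle rotation of the view matrix around the planet. A sketch of just the two animated parameters; the starting values and rates are guesses:

    // Sketch: the two animated camera parameters for the pull-back shot.
    // A widening field of view reads as a zoom out, while a slow yaw rotation
    // drifts the view around the planet.
    let fieldOfViewRadians = Math.PI / 6;  // start fairly narrow
    let cameraYawRadians = 0;

    const fovWidenPerSecond = 0.02;
    const yawPerSecond = 0.01;
    const maxFieldOfView = Math.PI / 2.5;

    function updatePullBack(dt: number): void {
      fieldOfViewRadians = Math.min(maxFieldOfView, fieldOfViewRadians + fovWidenPerSecond * dt);
      cameraYawRadians += yawPerSecond * dt;
    }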
