CHAPTER 5 : Interpolation and Character Animation

Syllabus
Interpolation and Character Animation : Trigonometry : The Trigonometric Ratios, Inverse Trigonometric Ratios, Trigonometric Relationships, The Sine Rule, The Cosine Rule, Compound Angles, Perimeter Relationships. Interpolation : Linear Interpolant, Non-Linear Interpolation, Trigonometric Interpolation, Cubic Interpolation, Interpolating Vectors, Interpolating Quaternions. Curves : Circle, Bezier, B-Splines. Analytic Geometry : Review of Geometry, 2D Analytic Geometry, Intersection Points, Point in Triangle, and Intersection of Circle with Straight Line.

Syllabus Topic : Trigonometry

5.1 Trigonometry

- Trigonometry is concerned with the measurement of three-sided polygons, i.e. triangles. Trigonometric functions occur in transforms, vectors, geometry, quaternions and interpolation, so in this section we review the basic features the reader should be aware of.
- Trigonometry measures angles, and the two units used for angular measurement are degrees and radians. The degree derives from defining one complete rotation as 360°. The radian is the angle subtended by a circular arc whose length equals the radius of the circle; its measure does not depend on any arbitrary constant.
- The perimeter of a circle is 2πr, so 2π radians correspond to one complete rotation, and 1 radian corresponds to 180°/π, which is approximately 57.3°. Keep in mind the following relationships between radians and degrees :
π/2 = 90°, π = 180°, 3π/2 = 270°, 2π = 360°

Syllabus Topic : The Trigonometric Ratios

5.1.1 The Trigonometric Ratios

- Triangles possess some built-in properties, such as the ratios of their sides and the associated angles. If such a ratio is known in advance, triangle problems become easy to solve: the ratios yield unknown lengths and unknown angles.
- The trigonometric ratios are abbreviated as sin, cos, tan, cosec, sec and cot.
- Fig. 5.1.1 shows a right-angled triangle, where
O = opposite, H = hypotenuse, A = adjacent
The trigonometric ratios are given by :
sin(β) = O/H
cos(β) = A/H
tan(β) = O/A
cosec(β) = 1/sin(β) = H/O
sec(β) = 1/cos(β) = H/A
cot(β) = 1/tan(β) = A/O

Fig. 5.1.1 : A right-angled triangle

- The limits of sin and cos are ±1, and tan has limits ±∞. The signs of the functions in the four quadrants are as follows :

Quadrant : 1st  2nd  3rd  4th
sin :       +    +    −    −
cos :       +    −    −    +
tan :       +    −    +    −

5.1.2 Example

- Fig. 5.1.2 shows a triangle with one known angle, 50°, and the hypotenuse, 10.

Fig. 5.1.2 : Triangle with unknown a and b

Game Programming (MU B.Sc. Comp. - Sem. V)

The triangle's other sides are calculated as follows :
sin(50°) = a/10, so a = 10 sin(50°) = 10 × 0.766 = 7.66
cos(50°) = b/10, so b = 10 cos(50°) = 10 × 0.64279 = 6.4279

Syllabus Topic : Inverse Trigonometric Ratios

5.1.3 Inverse Trigonometric Ratios

- Every angle has an associated ratio, and functions are needed to convert one into the other. The functions sin, cos and tan convert angles into ratios, and the inverse functions sin⁻¹, cos⁻¹ and tan⁻¹ convert ratios into angles. For instance, sin(45°) = 0.707, and sin⁻¹(0.707) = 45°.
- The functions sin and cos are cyclic, but the inverse functions return angles over a specific range.

Syllabus Topic : Trigonometric Relationships

5.1.4 Trigonometric Relationships

- There is a close relationship between the functions sin and cos :
cos(β) = sin(β + 90°)
- The theorem of Pythagoras can be used to derive further formulae, such as :
tan(β) = sin(β)/cos(β)
sin²(β) + cos²(β) = 1
1 + tan²(β) = sec²(β)
1 + cot²(β) = cosec²(β)

Fig. 5.1.3 : An arbitrary triangle

Syllabus Topic : The Sine Rule

5.1.5 The Sine Rule

- The sine rule relates the side lengths and the angles of a triangle. In Fig. 5.1.3, side a is opposite angle A, side b is opposite angle B, etc.
The sine rule states :
a/sin(A) = b/sin(B) = c/sin(C)

Syllabus Topic : The Cosine Rule

5.1.6 The Cosine Rule

- The cosine rule expresses the Pythagorean relationship for the arbitrary triangle depicted in Fig. 5.1.3. In fact, there are three versions :
a² = b² + c² − 2bc cos(A)
b² = c² + a² − 2ca cos(B)
c² = a² + b² − 2ab cos(C)
- There are also three further relationships :
a = b cos(C) + c cos(B)
b = c cos(A) + a cos(C)
c = a cos(B) + b cos(A)

Syllabus Topic : Compound Angles

5.1.7 Compound Angles

- Two sets of compound trigonometric relationships cover the addition and subtraction of two different angles, and multiples of the same angle. The most common relationships are :
sin(A + B) = sin(A) cos(B) + cos(A) sin(B)
sin(A − B) = sin(A) cos(B) − cos(A) sin(B)
cos(A + B) = cos(A) cos(B) − sin(A) sin(B)
cos(A − B) = cos(A) cos(B) + sin(A) sin(B)
tan(A + B) = (tan(A) + tan(B)) / (1 − tan(A) tan(B))
tan(A − B) = (tan(A) − tan(B)) / (1 + tan(A) tan(B))
sin(2β) = 2 sin(β) cos(β)
cos(2β) = cos²(β) − sin²(β)
cos(2β) = 2 cos²(β) − 1
cos(2β) = 1 − 2 sin²(β)
sin(3β) = 3 sin(β) − 4 sin³(β)
cos(3β) = 4 cos³(β) − 3 cos(β)
cos²(β) = ½(1 + cos(2β))
sin²(β) = ½(1 − cos(2β))

Syllabus Topic : Perimeter Relationships

5.1.8 Perimeter Relationships

- Lastly, referring again to Fig. 5.1.3, the following relationships combine the angles with the perimeter of the triangle :
s = ½(a + b + c)
sin(A) = (2/(bc)) √(s(s − a)(s − b)(s − c))
sin(B) = (2/(ca)) √(s(s − a)(s − b)(s − c))
sin(C) = (2/(ab)) √(s(s − a)(s − b)(s − c))

Syllabus Topic : Interpolation

5.2 Interpolation

- Interpolation is a collection of methods used to solve many computer graphics problems. An interpolant changes one number into another: for example, to change 2 into 5 we can simply add 3, but this is not very useful on its own.
- The genuine function of an interpolant is to change one number into another in, say, 10 equal steps.
Thus if we begin with 3 and repeatedly add 0.3, we generate the sequence 3, 3.3, 3.6, 3.9, 4.2, 4.5, 4.8, 5.1, 5.4, 5.7, 6.0. These numbers can be used for operations such as translation, scaling and rotation of an object, for moving the virtual camera, or for changing the position, colour or brightness of a virtual light source.
- We need a formula that repeats this kind of interpolation for arbitrary numbers. In this section we derive one, and also find ways to control the spacing between the interpolated values.
- In animation we move objects, and often change an object's speed gradually, so interpolants are very useful. Let's start with the simplest interpolant, the linear interpolant.

Syllabus Topic : Linear Interpolant

5.2.1 Linear Interpolant

- A linear interpolant creates equal spacing between the interpolated values for equal changes in the interpolating parameter.
- In the above example the increment 0.3 was computed by subtracting the first number from the second and dividing the result by 10, i.e. (5 − 2)/10 = 0.3. This method works, but it is not a flexible form, so we state the problem differently. Given two numbers n₁ and n₂, representing the start and final values of the interpolant, we want an interpolated value controlled by a parameter t that varies between 0 and 1. When t = 0 the result is n₁, and when t = 1 the result is n₂. A solution is :
n = n₁ + t(n₂ − n₁)   ...(5.2.1)
for if n₁ = 2, n₂ = 4 and t = ½, then
n = 2 + ½(4 − 2) = 3   ...(5.2.2)
which is the half-way point. Moreover, when t = 0, n = n₁, and when t = 1, n = n₂, which shows we have a sound interpolant. It can be expressed differently :
n = n₁ + t(n₂ − n₁)   ...(5.2.3)
n = n₁ + tn₂ − tn₁   ...(5.2.4)
n = n₁(1 − t) + n₂t   ...(5.2.5)
- It is possible to design an interpolant that takes arbitrary portions of n₁ and n₂, but it would produce arbitrary results.
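Equation (5.2.5) can be sketched in a few lines of Python; the function name `lerp` and the 10-step loop are illustrative, not part of the text :

```python
def lerp(n1, n2, t):
    """Linear interpolant of equation (5.2.5): returns n1 when t = 0, n2 when t = 1."""
    return n1 * (1.0 - t) + n2 * t

# The worked example: n1 = 2, n2 = 4, t = 0.5 gives the half-way point.
print(lerp(2, 4, 0.5))  # 3.0

# Eleven equally spaced values between 3 and 6, matching the 0.3-step sequence.
steps = [round(lerp(3, 6, i / 10), 1) for i in range(11)]
print(steps)  # [3.0, 3.3, 3.6, 3.9, 4.2, 4.5, 4.8, 5.1, 5.4, 5.7, 6.0]
```

Note that computing each value directly from t, rather than repeatedly adding 0.3, avoids accumulating floating-point error.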
- This interpolant is used widely in graphics software. For example, consider the task of moving an object between two locations (x₁, y₁, z₁) and (x₂, y₂, z₂). The interpolated position of the moving object is given by :
x = x₁(1 − t) + x₂t
y = y₁(1 − t) + y₂t
z = z₁(1 − t) + z₂t, for 0 ≤ t ≤ 1   ...(5.2.6)

Syllabus Topic : Cubic Interpolation

5.2.2 Cubic Interpolation

- A cubic (Hermite) interpolant blends between two points while respecting a tangent vector at each point. Fig. 5.2.2 shows how the two points (0, 0) and (1, 1) are to be joined by a curve that reacts to the start and end tangent vectors. At the start point the tangent vector is [−5 0]ᵀ, and at the end point it is [0 −5]ᵀ. The x and y interpolants are given as follows :

x = [t³ t² t 1] [ 2 −2  1  1 ] [ 0 ]
                [−3  3 −2 −1 ] [ 1 ]
                [ 0  0  1  0 ] [−5 ]
                [ 1  0  0  0 ] [ 0 ]

y = [t³ t² t 1] [ 2 −2  1  1 ] [ 0 ]
                [−3  3 −2 −1 ] [ 1 ]
                [ 0  0  1  0 ] [ 0 ]
                [ 1  0  0  0 ] [−5 ]

These expand to :
x = −7t³ + 13t² − 5t
y = −7t³ + 8t²

Fig. 5.2.3 : A Hermite curve between two points with tangent vectors

- When these polynomials are plotted over the range 0 ≤ t ≤ 1, we get the curve shown in Fig. 5.2.3.

Syllabus Topic : Interpolating Vectors

5.2.3 Interpolating Vectors

- So far we have interpolated between pairs of numbers. In this section we use interpolants for vectors. A vector consists of magnitude and direction, and when we interpolate between two vectors, both quantities should be preserved.
- For example, if we simply interpolated the x and y components of the vectors [2 3]ᵀ and [4 7]ᵀ, the intermediate vectors would track the change of orientation but would ignore the change in magnitude.
- To preserve both, we must first understand how the interpolation should operate. Fig. 5.2.4 shows two unit vectors V₁ and V₂ separated by an angle θ. The interpolated vector V is defined as a proportion of V₁ plus a proportion of V₂ :

Fig. 5.2.4 : Vector V is derived from a parts of V₁ and b parts of V₂

V = aV₁ + bV₂   ...(5.2.28)

- Let's define the values of a and b such that they are a function of the separating angle θ.
- Vector V makes an angle tθ with V₁ and an angle (1 − t)θ with V₂, and it is obvious from Fig. 5.2.4, using the sine rule, that
a/sin((1 − t)θ) = b/sin(tθ)   ...(5.2.29)
and moreover
m = a cos(tθ)   ...(5.2.30)
n = b cos((1 − t)θ)   ...(5.2.31)
m + n = 1   ...(5.2.32)
- From equation (5.2.29),
b = a sin(tθ)/sin((1 − t)θ)   ...(5.2.33)
and substituting (5.2.30), (5.2.31) and (5.2.33) into (5.2.32),
a cos(tθ) + a sin(tθ) cos((1 − t)θ)/sin((1 − t)θ) = 1
Solving, we get
a = sin((1 − t)θ)/sin(θ)
b = sin(tθ)/sin(θ)
- Hence, the final interpolant is
V = (sin((1 − t)θ)/sin(θ)) V₁ + (sin(tθ)/sin(θ)) V₂   ...(5.2.34)
- To see how it works, take a simple example: interpolate between the two unit vectors [1 0]ᵀ and [−√2/2 √2/2]ᵀ, where θ, the angle between them, is 135°. Equation (5.2.34) interpolates the x and y components separately :
Vx = (sin((1 − t)135°)/sin(135°)) × 1 + (sin(t135°)/sin(135°)) × (−√2/2)
Vy = (sin((1 − t)135°)/sin(135°)) × 0 + (sin(t135°)/sin(135°)) × (√2/2)
- The interpolating curves are depicted in Fig. 5.2.5(a), and the positions of the interpolated vectors, with a trace, in Fig. 5.2.5(b).
- Two observations on equation (5.2.34) :
1. If the angle θ between the vectors is not given, it can be calculated using the dot product.
2. The range of θ is 0 < θ < 180°, because when θ = 180° the denominator sin(θ) collapses to zero. To confirm this, equation (5.2.34) can be repeated for θ = 179°; Fig. 5.2.5 shows the result, and tells us that the interpolant works normally over this range. One more degree, however, and it fails.
- Until now we have only considered unit vectors. Now let's see how the interpolant responds to vectors of different magnitudes. For example, take the following input vectors :
V₁ = [2 0]ᵀ, V₂ = [0 1]ᵀ

Fig. 5.2.5 : (a) Curves of the two parts of (5.2.34); (b) interpolating between two vectors 179° apart

The Fig.
5.2.6 shows the interpolation with separating angle θ = 90°; observe how the initial length of V₁ decays from 2 to 1 over the 90°.

Fig. 5.2.6 : Interpolating between the two vectors [2 0]ᵀ and [0 1]ᵀ

Syllabus Topic : Interpolating Quaternions

5.2.4 Interpolating Quaternions

- The same interpolant works with quaternions. The interpolated quaternion q for two given quaternions q₁ and q₂ is :
q = (sin((1 − t)θ)/sin(θ)) q₁ + (sin(tθ)/sin(θ)) q₂   ...(5.2.35)
- As with interpolating vectors, if the angle θ between the two quaternions is not given, it is obtained from the dot product formula :
cos(θ) = (q₁ · q₂)/(|q₁||q₂|) = (s₁s₂ + x₁x₂ + y₁y₂ + z₁z₂)/(|q₁||q₂|)
- If unit quaternions are used,
cos(θ) = s₁s₂ + x₁x₂ + y₁y₂ + z₁z₂
- Now let's interpolate between a pair of quaternions. Say quaternions q₁ and q₂ rotate 0° and 90° about the z-axis respectively :
q₁ = [cos(0°/2), [0, 0, sin(0°/2)]]
q₂ = [cos(90°/2), [0, 0, sin(90°/2)]]
which become
q₁ = [1, [0, 0, 0]]
q₂ = [0.7071, [0, 0, 0.7071]]
- First, find the value of θ :
cos(θ) = 0.7071 + 0 + 0 + 0, so θ = 45°
- With t = ½, the interpolated quaternion is :
q = (sin(22.5°)/sin(45°)) [1, [0, 0, 0]] + (sin(22.5°)/sin(45°)) [0.7071, [0, 0, 0.7071]]
q = 0.541196 [1, [0, 0, 0]] + 0.541196 [0.7071, [0, 0, 0.7071]]
q = [0.541196, [0, 0, 0]] + [0.382683, [0, 0, 0.382683]]
q = [0.923879, [0, 0, 0.382683]]
- The interpolated quaternion is itself a unit quaternion, as the square root of the sum of the squares of its elements is 1. It should rotate a point about the z-axis half-way between 0° and 90°, i.e. 45°. Let's check with a simple example.
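The arithmetic above can be sketched in Python before moving on to the rotation check; the flat `(s, x, y, z)` tuple layout and the function name are illustrative choices, not from the text :

```python
import math

def slerp_quaternion(q1, q2, t):
    """Spherical linear interpolation between two unit quaternions,
    each stored as (s, x, y, z), following equation (5.2.35)."""
    s1, x1, y1, z1 = q1
    s2, x2, y2, z2 = q2
    # For unit quaternions, the 4D dot product gives cos(theta).
    theta = math.acos(s1 * s2 + x1 * x2 + y1 * y2 + z1 * z2)
    a = math.sin((1 - t) * theta) / math.sin(theta)
    b = math.sin(t * theta) / math.sin(theta)
    return (a * s1 + b * s2, a * x1 + b * x2, a * y1 + b * y2, a * z1 + b * z2)

q1 = (1.0, 0.0, 0.0, 0.0)                       # 0 degrees about the z-axis
half = math.radians(45)                         # half of 90 degrees
q2 = (math.cos(half), 0.0, 0.0, math.sin(half)) # 90 degrees about the z-axis
q = slerp_quaternion(q1, q2, 0.5)
print(q)  # approximately (0.923879, 0, 0, 0.382683), matching the worked value
```

The result agrees with [0.923879, [0, 0, 0.382683]] above, and is still a unit quaternion.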
- Take the point (1, 0, 0) and subject it to the standard quaternion rotation operation :
P′ = qPq⁻¹
- To reduce the arithmetic, substitute a = 0.923879 and b = 0.382683. So,
q = [a, [0, 0, b]] and q⁻¹ = [a, [0, 0, −b]]
P′ = [a, [0, 0, b]] × [0, [1, 0, 0]] × [a, [0, 0, −b]]
P′ = [0, [a, b, 0]] × [a, [0, 0, −b]]
P′ = [0, [a² − b², 2ab, 0]]
P′ = [0, [0.7071, 0.7071, 0]]
- Thus, (1, 0, 0) is rotated to (0.7071, 0.7071, 0), which is correct: a rotation of 45° about the z-axis.

Syllabus Topic : Curves

5.3 Curves

- In this section we examine the mathematics behind 2D curves.

Syllabus Topic : The Circle

5.3.1 The Circle

- The equation of a circle is
x² + y² = R²   ...(5.3.1)
where R is the radius. This form is not convenient for drawing the curve; to determine the coordinates of any point on the circumference in terms of some parameter, two functions are required :
x = R cos(t)
y = R sin(t), 0 ≤ t ≤ 2π   ...(5.3.2)
- As the parameter t varies over the range 0 to 2π, the complete circumference is traced out; a portion of the circle is isolated by selecting an appropriate range of t.

5.3.2 The Ellipse

- The equation for an ellipse is
x²/R²maj + y²/R²min = 1   ...(5.3.3)
where Rmaj is the major radius and Rmin is the minor radius.
- The parametric form is given as follows :
x = Rmaj cos(t)
y = Rmin sin(t), 0 ≤ t ≤ 2π   ...(5.3.4)

Syllabus Topic : Bezier Curves

5.4 Bezier Curves

- Bezier's name is associated with the theory of polynomial curves and surfaces.

5.4.1 Bernstein Polynomials

- Bezier curves employ Bernstein polynomials, given by :
Bᵢⁿ(t) = C(n, i) tⁱ(1 − t)ⁿ⁻ⁱ   ...(5.4.1)
- Here C(n, i) is shorthand for the number of selections of i different items from n distinguishable items when the order of selection is ignored, and equals
C(n, i) = n!/((n − i)! i!)   ...(5.4.2)
- For example, 3! is shorthand for 3 × 2 × 1.
- When equation (5.4.2) is evaluated for different values of i and n, we get the pattern of numbers shown in Table 5.4.1, known as Pascal's triangle. The pattern describes the coefficients found in binomial expansions.

Table 5.4.1 : Pascal's triangle
n = 0 : 1
n = 1 : 1 1
n = 2 : 1 2 1
n = 3 : 1 3 3 1
n = 4 : 1 4 6 4 1
n = 5 : 1 5 10 10 5 1
n = 6 : 1 6 15 20 15 6 1

- For example, the expansion of (x + a)ⁿ for different values of n is :
(x + a)⁰ = 1
(x + a)¹ = 1x + 1a
(x + a)² = 1x² + 2ax + 1a²
(x + a)³ = 1x³ + 3ax² + 3a²x + 1a³
(x + a)⁴ = 1x⁴ + 4ax³ + 6a²x² + 4a³x + 1a⁴
which reveals Pascal's triangle as the coefficients of the polynomial terms.
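The coefficient generator of equation (5.4.2) can be sketched with Python's standard `math.comb`; the function name `pascal_row` is an illustrative choice :

```python
import math

def pascal_row(n):
    """Row n of Pascal's triangle: the coefficients C(n, i) of (x + a)^n."""
    return [math.comb(n, i) for i in range(n + 1)]

for n in range(5):
    print(pascal_row(n))
# [1]
# [1, 1]
# [1, 2, 1]
# [1, 3, 3, 1]
# [1, 4, 6, 4, 1]
```

Each row is the element-wise sum of shifted copies of the row above, which is the recurrence exploited later in the recursive Bezier formula.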
- Pascal also recognised other qualities in these numbers: they give the odds governing combinations. For example, consider a family of 6 children and the probability of any girl–boy combination. First, take the sum of the numbers in the 6th row of Pascal's triangle :
1 + 6 + 15 + 20 + 15 + 6 + 1 = 64
- The 1 at the beginning and end of the 6th row corresponds to the probability of getting 6 boys or 6 girls, i.e. 1 in 64. The next number, 6, corresponds to the next most likely combinations: 5 boys and 1 girl, or 5 girls and 1 boy, i.e. 6 in 64. The centre number, 20, applies to 3 boys and 3 girls, for which the probability is 20 in 64.
- Thus, the C(n, i) term in equation (5.4.1) is a generator for Pascal's triangle. The powers of t and (1 − t) in equation (5.4.1) emerge as shown in Table 5.4.2 for different values of n and i. Combining the two sets of results gives the complete Bernstein polynomial terms, shown in Table 5.4.3. The important property of these terms is that they sum to unity, which is an essential feature of any interpolant :
((1 − t) + t)ⁿ = 1   ...(5.4.3)
- Because of this, we can use the binomial expansion of (1 − t) and t as an interpolant. When n = 2 we obtain the quadratic form
(1 − t)² + 2t(1 − t) + t²   ...(5.4.4)

Table 5.4.2 : Expansion of the terms tⁱ and (1 − t)ⁿ⁻ⁱ
n = 1 : (1 − t), t
n = 2 : (1 − t)², t(1 − t), t²
n = 3 : (1 − t)³, t(1 − t)², t²(1 − t), t³
n = 4 : (1 − t)⁴, t(1 − t)³, t²(1 − t)², t³(1 − t), t⁴

Table 5.4.3 : The Bernstein polynomial terms
n = 2 : (1 − t)², 2t(1 − t), t²
n = 3 : (1 − t)³, 3t(1 − t)², 3t²(1 − t), t³

Fig. 5.4.1 : The graphs of the quadratic Bernstein polynomial terms

- Fig. 5.4.1 shows the graphs of the three polynomial terms of equation (5.4.4). The (1 − t)² graph starts at 1 and falls to 0; the t² graph starts at 0 and rises to 1; the 2t(1 − t) graph starts at 0, reaches a maximum of 0.5, and returns to 0.
- Consequently, the central polynomial term has no influence at the end-points, where t = 0 and t = 1.
- The three terms can be used to interpolate between a pair of values, as follows :
V = V₁(1 − t)² + 2t(1 − t) + V₂t²   ...(5.4.5)
- If V₁ = 1 and V₂ = 3 we get the curve shown in Fig. 5.4.2. But there is nothing preventing us from multiplying the middle term 2t(1 − t) by any arbitrary number Vc :
V = V₁(1 − t)² + Vc 2t(1 − t) + V₂t²   ...(5.4.6)

Fig. 5.4.2 : Bernstein interpolation between the values 1 and 3
Fig. 5.4.3 : Bernstein interpolation between the values 1 and 3 with Vc = 3
Fig. 5.4.4 : Bernstein interpolation between the values 1 and 3 for different values of Vc

- For example, with Vc = 3 we get the graph shown in Fig. 5.4.3; the value of Vc determines the shape of the curve between the two values. Graphs for different values of Vc are depicted in Fig. 5.4.4. Something interesting happens when Vc is set midway between V₁ and V₂: when V₁ = 1, V₂ = 3 and Vc = 2, we get linear interpolation between V₁ and V₂, as shown in Fig. 5.4.5.

Fig. 5.4.5 : Linear interpolation by a quadratic Bernstein interpolant

5.4.2 Quadratic Bezier Curves

- Quadratic Bezier curves are generated by using Bernstein polynomials to interpolate between the x, y (and z) coordinates associated with the start- and end-points forming the curve. For instance, a 2D quadratic Bezier curve between the points (1, 1) and (4, 3) is drawn using the following equations :
x = 1(1 − t)² + xc 2t(1 − t) + 4t²
y = 1(1 − t)² + yc 2t(1 − t) + 3t²   ...(5.4.7)
- The important thing to note is the values of (xc, yc): they are completely up to us. The control vertex position determines how the curve moves between (1, 1) and (4, 3).
- A Bezier curve has interpolating and approximating qualities: the interpolating feature ensures that the curve passes through the end-points, while the approximating feature describes how the curve passes close to the control point.
- To demonstrate this, in Fig. 5.4.6 set xc = 3 and yc = 4; the curve intersects the end-points but misses the control point.
- Bezier curves have two further important features: the convex hull property, and the end slopes of the curve. The convex hull property means the curve is always contained within the polygon connecting the end and control points; here the curve lies inside the triangle formed by the vertices (1, 1), (3, 4) and (4, 3). The slope of the curve at (1, 1) equals the slope of the line connecting the start point to the control point (3, 4), and the slope at (4, 3) equals the slope of the line connecting the control point (3, 4) to the end-point (4, 3).
- Two further points to note :
o The control point (xc, yc) can be positioned anywhere; there are no restrictions.
o Adding a z-coordinate to the start, control and end points creates 3D curves.

Fig. 5.4.6 : Quadratic Bezier curve between the points (1, 1) and (4, 3), with (3, 4) as the control vertex

5.4.3 Cubic Bernstein Polynomials

- Quadratic curves are rather simple; to build a complex curve with peaks and valleys, a large number of them would have to be joined. A cubic curve supports one peak and one valley, which simplifies the construction of more complex curves. With n = 3 in equation (5.4.3), we get the terms :
((1 − t) + t)³ = (1 − t)³ + 3t(1 − t)² + 3t²(1 − t) + t³   ...(5.4.8)
which can be used as a cubic interpolant :
V = V₁(1 − t)³ + Vc1 3t(1 − t)² + Vc2 3t²(1 − t) + V₂t³   ...(5.4.9)
- The cubic curve also has the sum-to-unity, convex hull and slope properties.
- Fig. 5.4.7 depicts the graphs of the four polynomial terms. There are now two control values, Vc1 and Vc2, which can be set to any value, independent of the values chosen for V₁ and V₂.

Fig. 5.4.7 : The cubic Bernstein polynomial curves

- To demonstrate, consider blending between the values 1 and 3, with Vc1 and Vc2 set to 2.5 and −2.5 respectively. Fig. 5.4.8 shows the resulting blending curve.

Fig. 5.4.8 : The cubic Bernstein polynomial through the values 1, 2.5, −2.5, 3

- Now associate the blending polynomials with x and y coordinates :
x = x₁(1 − t)³ + xc1 3t(1 − t)² + xc2 3t²(1 − t) + x₂t³
y = y₁(1 − t)³ + yc1 3t(1 − t)² + yc2 3t²(1 − t) + y₂t³   ...(5.4.10)
- Evaluating (5.4.10) with the following points :
(x₁, y₁) = (1, 1), (xc1, yc1) = (2, 3), (xc2, yc2) = (3, −2), (x₂, y₂) = (4, 3)
we get the cubic Bezier curve shown in Fig. 5.4.9, which is attracted towards the two control points without passing through them. To check the consistency of the Bernstein polynomials, set the values :
(x₁, y₁) = (1, 1), (xc1, yc1) = (2, 1.666), (xc2, yc2) = (3, 2.333), (x₂, y₂) = (4, 3)
where (xc1, yc1) and (xc2, yc2) are points one-third and two-thirds between the start and final values, just as in the quadratic case the single control point was the central point between the start and end values. The result is the linear interpolation shown in Fig. 5.4.10.

Fig. 5.4.9 : A cubic Bezier curve
Fig. 5.4.10 : A cubic Bezier line

- Mathematicians have devised an elegant way of abbreviating equations (5.4.7) and (5.4.10). Equation (5.4.7) gives the three polynomial terms for creating a quadratic Bezier curve, and (5.4.10) the four polynomial terms for creating a cubic Bezier curve. Quadratic equations are known as second-degree equations, and cubics as third-degree equations.
- In the original Bernstein formulation,
Bᵢⁿ(t) = C(n, i) tⁱ(1 − t)ⁿ⁻ⁱ   ...(5.4.11)
where n is the degree of the polynomial, and i, which takes values between 0 and n, creates the individual polynomial terms. These terms are then used to multiply the coordinates of the end and control points. If these points are stored as a vector P, a point p(t) on the curve is written as
p(t) = C(n, i) tⁱ(1 − t)ⁿ⁻ⁱ Pᵢ, for 0 ≤ i ≤ n   ...(5.4.12)
or
p(t) = Σᵢ₌₀ⁿ C(n, i) tⁱ(1 − t)ⁿ⁻ⁱ Pᵢ   ...(5.4.13)
or
p(t) = Σᵢ₌₀ⁿ Bᵢⁿ(t) Pᵢ   ...(5.4.14)
- For example, a point p(t) on a quadratic curve is represented by
p(t) = 1t⁰(1 − t)² P₀ + 2t¹(1 − t)¹ P₁ + 1t²(1 − t)⁰ P₂   ...(5.4.15)
- You will find equations (5.4.13) and (5.4.14) used in more advanced texts to describe Bezier curves.

5.4.4 A Recursive Bezier Formula

- Equation (5.4.13) explicitly shows the polynomial terms needed to construct the blending terms. With the help of recursive functions, it is possible to find a new formulation that guides us towards an understanding of B-splines. To begin, we express C(n, i) in terms of lower terms; since the coefficients of any row in Pascal's triangle are the sum of the two coefficients immediately above, we can write
Recay Equation (5.4.7), that describes the 3 terms related with a quadratic Bernstein polynomia}, These are expanded to 0-240) 2-290 64.18) and the product of two matrices id written as: ee [een]}-2 2 0 6.4.19) ite 0 0 ~ This means that (5.4.9) can be expressed as given below 1-217 [M1 Vz [te1]][-2 2 0]+1 Ve (5.4.20) 1 0 oJ Ly, and (5.4.10) as i =t1 Pi pit) = [0t1}}-2 2 Of+]} Pe (5.4.21) 1 0 oJ Lp, ~The point p(t) is any point on the curve, and P, is the start point, P, is the control point and P, is the end point. The same improvement is used cubic B’ezier curve, which has the following matrix formulation: 13-3170 3-6 30 mi= 5 Gs “a (6420) ee is - The drawback of Bezier curves is when an end or control vertex is repositioned, the whole curve is modified. This drawback is overcome by B-splines curve. ‘Game Programming (MU B.Sc. Comp.-Semv) 5.28 I Linear Interpolation ‘The following intepolant is used to interpolate between two values that is Vo and V, Vi) = Vo(1-H+V, for OStS1 for OStS1 ‘The linear blending function is B; 1). It's subscript i is used to reference values in the not vector. To calculate the influence of the three values on any interpolated value V(1) the blending function is used as follows : 1 VO = By Vo+B) V+ BL OV, Let's try to understand the Bending funtion, take a specific value of B, (I the value of tis ess than value of tor grester than value of tthe function B, (1) must be zero. _ When ty $ 1S ty, the function must return a value reflecting the proportion of V, that influences V(t). ‘Throughout the duration; 15. 1 os: 0 ° 1 2 3 4 5 Fig. $5.3 : Four corve segments forming a B-spline curve = Wcan be proved by differcatiating the basis functions : . a3 46-3 By) =e (55.7) . 
B₁′(t) = (9t² − 12t)/6   ...(5.5.8)
B₂′(t) = (−9t² + 6t + 3)/6   ...(5.5.9)
B₃′(t) = 3t²/6   ...(5.5.10)
- When we evaluate equations (5.5.7)–(5.5.10) for t = 0 and t = 1, we get the values −0.5, 0, 0.5, 0 and 0, −0.5, 0, 0.5 respectively, which confirms that the slope at the end of one curve segment matches the slope at the start of the next.
- The third level of curve continuity, C², ensures that the rate of change of slope at the end of one basis curve matches that of the following curve. This can be confirmed by differentiating equations (5.5.7)–(5.5.10) :
B₀″(t) = −t + 1   ...(5.5.11)
B₁″(t) = 3t − 2   ...(5.5.12)
B₂″(t) = −3t + 1   ...(5.5.13)
B₃″(t) = t   ...(5.5.14)

Table 5.5.1 : Continuity properties of cubic B-splines
          t = 0   t = 1
B₀(t)  :  1/6     0
B₁(t)  :  4/6     1/6
B₂(t)  :  1/6     4/6
B₃(t)  :  0       1/6
B₀′(t) :  −0.5    0
B₁′(t) :  0       −0.5
B₂′(t) :  0.5     0
B₃′(t) :  0       0.5
B₀″(t) :  1       0
B₁″(t) :  −2      1
B₂″(t) :  1       −2
B₃″(t) :  0       1

5.5.3 Non-Uniform B-Splines

- Uniform B-splines are constructed from curve segments whose parameter spacing occurs at equal intervals. With the support of a knot vector, non-uniform B-splines provide extra shape control and the option of drawing periodic shapes.

5.5.4 Non-Uniform Rational B-Splines

- Non-uniform rational B-splines (NURBS) combine the advantages of non-uniform B-splines and rational polynomials: they support periodic shapes such as circles, and they exactly describe the curves associated with the conic sections. They also describe the geometry used in modelling computer animation characters.
- NURBS surfaces also have a patch formulation and play a very important role in surface modelling in computer animation and CAD.

Syllabus Topic : Analytic Geometry

5.6 Analytic Geometry

- In this section we study elements of Euclidean geometry. The main focus is the definition of straight lines in space and 3D planes, and how points of intersection are computed. We also examine the role of parameters in describing lines and line segments, and their intersection.
Syllabus Topic : Review of Geometry

5.6.1 Review of Geometry

- We know that parallel lines don't meet, and that the internal angles of a triangle sum to 180°, but this holds only in specific situations: such rules break down as soon as the surface or space becomes curved. So let's review some rules and observations that apply to shapes drawn on a flat surface.

5.6.1.1 Angles

- By definition, 360° or 2π (radians) measure one revolution. Fig. 5.6.1 shows examples of adjacent/supplementary angles (sum to 180°), opposite angles (equal), and complementary angles (sum to 90°).

Fig. 5.6.1 : Examples of adjacent/supplementary, opposite and complementary angles

5.6.1.2 Intercept Theorems

- Fig. 5.6.2 and Fig. 5.6.3 show scenarios involving intersecting lines and parallel lines, which give rise to the following observations :
- First intercept theorem (5.6.1) : two rays drawn from a common point are cut by a pair of parallel lines into segments of equal ratio.

Fig. 5.6.2 : First intercept theorem

- Second intercept theorem (5.6.2) : the segments the parallels cut from the rays are in the same ratio as their distances from the common point.

Fig. 5.6.3 : Second intercept theorem

5.6.1.3 Golden Section

- The golden section represents a classical ideal for the ratio between the height and width of an object, and is used in art and architecture. Its origins stem from the interaction between a circle and a triangle, which gives rise to the relationship :
b = (a/2)(√5 − 1) ≈ 0.618a   ...(5.6.3)
- The rectangle depicted in Fig. 5.6.4 has the proportions
height = 0.618 × width
- Interestingly, the most widely observed rectangle, the television screen, bears no relation to this ratio.

Fig. 5.6.4 : A rectangle with height and width in the golden section

5.6.1.4 Triangles

- There are rules relating the interior and exterior angles of a triangle, and these rules are used throughout geometric problem solving. Fig. 5.6.5 contains two diagrams identifying interior and exterior angles.

Fig. 5.6.5 : Relationship between interior and exterior angles

- From Fig.
5.6.5 we know that the interior angles sum to 180°, and that each exterior angle of a triangle equals the sum of the two opposite interior angles :
α + β + γ = 180°   ...(5.6.4)
α′ = β + γ, β′ = α + γ, γ′ = α + β

5.6.1.5 Centre of Gravity of a Triangle

- A median is a straight line joining a vertex of a triangle to the mid-point of the opposite side. When all three medians are drawn, they intersect at a common point: the triangle's centre of gravity. The centre of gravity divides all three medians in the ratio 2 : 1.

Game Programming (MU B.Sc. Comp. - Sem. V)

Fig. 5.6.6 : The three medians of a triangle intersect at its centre of gravity
Fig. 5.6.7 : Intersection of the three medians of a triangle at its centre of gravity

5.6.1.6 Isosceles Triangle

- The isosceles triangle shown in Fig. 5.6.8 has two equal sides of length l and equal base angles α. With base c, the triangle's height and area are given as follows :
h = (c/2) tan(α), A = (c²/4) tan(α)   ...(5.6.8)

Fig. 5.6.8 : An isosceles triangle with two equal sides l and equal base angles α

5.6.1.7 Equilateral Triangle

- An equilateral triangle has three equal sides of length l and three equal angles of 60°. The triangle's height and area are given as follows :
h = (√3/2) l, A = (√3/4) l²   ...(5.6.9)

5.6.1.8 Right Triangle

- For a right triangle with legs a and b and hypotenuse c, the altitude (to the hypotenuse) and area are given as follows :
h = ab/c, A = ab/2   ...(5.6.10)

5.6.1.9 Thales' Theorem

- The theorem of Thales states that the right angle of a right triangle lies on the circumcircle over the hypotenuse.

5.6.1.10 Theorem of Pythagoras

- The theorem of Pythagoras, already seen in earlier topics, is given by :
c² = a² + b²   ...(5.6.11)
- From this we can show that
sin²(α) + cos²(α) = 1   ...(5.6.12)

5.6.1.11 Quadrilaterals

- Quadrilaterals have four sides; the rectangle, square, trapezoid, parallelogram and rhombus are examples. The sum of the interior angles of a quadrilateral is 360°. The most familiar shapes are the square and the rectangle.
5.6.1.12 Trapezoid

- Fig. 5.6.9 shows a trapezoid; it has one pair of parallel sides a and b, a distance h apart. The mid-line m and area A are given by:
  m = (a + b)/2,  A = m h   ...(5.6.13)

Fig. 5.6.9 : A trapezoid with one pair of parallel sides

5.6.1.13 Parallelogram

- A parallelogram is created from two pairs of intersecting parallel lines. It has equal opposite sides and equal opposite angles.
- The height and diagonal lengths are given as follows:
  h = b sin(α)   ...(5.6.14)
  d₁ = √(a² + b² − 2ab cos(α)),  d₂ = √(a² + b² + 2ab cos(α))   ...(5.6.15)
- The area is given by:
  A = a h   ...(5.6.16)

5.6.1.14 Rhombus

- A rhombus is a parallelogram with four sides of equal length a. The area is given as follows:
  A = a² sin(α) = a h   ...(5.6.17)

5.6.1.15 Regular Polygon (n-gon)

- Fig. 5.6.10 shows part of a regular n-sided polygon with outer radius R_o, inner radius R_i and edge length a_n. The area can be related to a_n, R_o and R_i for the different polygons.

Fig. 5.6.10 : Part of a regular polygon showing the inner and outer radii and the edge length

5.6.1.16 Circle

- The circumference and area of a circle are as follows:
  C = πd = 2πr   ...(5.6.18)
  A = πr² = πd²/4   ...(5.6.19)
  where d = 2r.
- The area between two concentric circles is known as an annulus, and its area is given by:
  A = π(R² − r²) = (π/4)(D² − d²)
  where D = 2R and d = 2r.
- The area of a sector of a circle is given by:
  A = (α/2) r²   (α is in radians)   ...(5.6.20)
- The area of a segment of a circle is given as follows:
  A = (r²/2)(α − sin(α))   (α is in radians)
- The area of an ellipse with major and minor radii a and b is given by:
  A = πab   ...(5.6.21)

Syllabus Topic : 2D Analytical Geometry

5.7 2D Analytical Geometry

In this section we will study the familiar descriptions of geometric elements and methods for computing intersections.

5.7.1 Equation of a Straight Line

- The equation of a line with slope m and y-axis intercept c is y = mx + c, as shown in Fig. 5.7.1. This is called the normal form.

Fig.
5.7.1 : The normal form of the straight line is y = mx + c

- Given two points (x₁, y₁) and (x₂, y₂), we can state:
  (y − y₁)/(y₂ − y₁) = (x − x₁)/(x₂ − x₁)   ...(5.7.1), (5.7.2)
  which provides
  m = (y₂ − y₁)/(x₂ − x₁),  c = y₁ − m x₁
- While these equations have their uses, there is a more general form that is much more suitable:
  ax + by + c = 0   ...(5.7.3)

5.7.2 The Hessian Normal Form

- Fig. 5.7.2 depicts a line whose orientation is controlled by a unit normal vector n = [a b]ᵀ. If P(x, y) is any point on the line, then p is its position vector, p = [x y]ᵀ, and d is the perpendicular distance from the origin to the line.

Fig. 5.7.2 : The orientation of a line controlled by a normal vector n and distance d

- Thus,
  d/‖p‖ = cos(α)   ...(5.7.4)
  d = ‖p‖ cos(α)
- The dot product n·p is
  n·p = ‖n‖ ‖p‖ cos(α) = ax + by   ...(5.7.5)
  which means
  ax + by = d ‖n‖   ...(5.7.6)
- As ‖n‖ = 1, we can write
  ax + by − d = 0   ...(5.7.7)
  where
  (x, y) - a point on the line,
  a, b - the components of a unit vector normal to the line,
  d - the perpendicular distance from the origin to the line.
- The distance d is positive when the normal vector points away from the origin, otherwise it is negative.
- Because the Hessian normal form involves a unit normal vector, we can include the vector's direction cosines within the equation:
  x cos(α) + y sin(α) − d = 0   ...(5.7.8)
  where α is the angle between the perpendicular and the x-axis.

5.7.3 Space Partitioning

- The Hessian normal form offers a useful method of partitioning space into two zones: points above the line in the partition that includes the normal vector, and points in the opposite partition.
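The Hessian normal form, and the partition test that follows from the sign of ax + by − d, can be sketched in code. A minimal sketch (the helper names `hessian_from_points` and `side_of_line` are mine, not from the text); points are plain (x, y) tuples:

```python
import math

def hessian_from_points(p1, p2):
    """Return (a, b, d) such that a*x + b*y - d = 0, with (a, b) a unit normal."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = math.hypot(dx, dy)
    a, b = -dy / length, dx / length      # left-hand unit normal of the direction p1 -> p2
    d = a * p1[0] + b * p1[1]             # signed perpendicular distance from the origin
    return a, b, d

def side_of_line(line, p):
    """Sign of a*x + b*y - d: > 0 on the normal's side, < 0 opposite, 0 on the line."""
    a, b, d = line
    return a * p[0] + b * p[1] - d

# Line through (0, 1) and (1, 0), i.e. x + y - 1 = 0 scaled by 1/sqrt(2).
line = hessian_from_points((0, 1), (1, 0))
print(side_of_line(line, (1, 1)))   # positive: same side as the normal
print(side_of_line(line, (0, 0)))   # negative: opposite side
```

The returned expression is the point's signed perpendicular distance from the line, which is exactly the space-partitioning test used later for clipping and point-in-triangle queries.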
- Given the equation
  ax + by − d = 0   ...(5.7.9)
  a point (x, y) on the line satisfies the equation, but if we substitute another point (x₁, y₁) that lies in the partition in the direction of the normal vector, it creates the inequality
  ax₁ + by₁ − d > 0   ...(5.7.10)
- Equally, a point (x₂, y₂) in the partition opposite to the direction of the normal vector creates the inequality
  ax₂ + by₂ − d < 0   ...(5.7.11)
- This space-partitioning feature of the Hessian normal form is used in clipping lines against polygonal windows.

5.7.4 The Hessian Normal Form (HNF) from Two Points

- Given two points (x₁, y₁) and (x₂, y₂), we can calculate the values of a, b and d for the Hessian normal form as follows:
  (y − y₁)/(x − x₁) = Δy/Δx,  where Δx = x₂ − x₁ and Δy = y₂ − y₁   ...(5.7.12)
- Hence,
  (y − y₁) Δx = (x − x₁) Δy
  x Δy − y Δx − (x₁ Δy − y₁ Δx) = 0   ...(5.7.13)
- This is the general equation of a straight line. For the Hessian normal form, we divide by ±√(Δx² + Δy²):
  [x Δy − y Δx − (x₁ Δy − y₁ Δx)] / ±√(Δx² + Δy²) = 0   ...(5.7.14)
- For example, for the two points (0, 1) and (1, 0) we have Δx = 1 and Δy = −1. Using Equation (5.7.14):
  x(−1) − y(1) − (0 × (−1) − 1 × 1) = 0
  −x − y + 1 = 0
  This is the general equation of the line; converting it into the Hessian normal form gives
  (−x − y + 1) / ±√2 = 0   ...(5.7.15)
- The choice of sign in the denominator selects between the two possible directions of the normal vector, and the sign of d.

Syllabus Topic : Intersection Points

5.8 Intersection Points

5.8.1 Intersection Point of Two Straight Lines

- Consider two line equations of the type
  a₁x + b₁y + c₁ = 0
  a₂x + b₂y + c₂ = 0   ...(5.8.1)
- The intersection point (x_i, y_i) is given as follows:
  x_i = (c₂b₁ − c₁b₂)/(a₁b₂ − a₂b₁),  y_i = (c₁a₂ − c₂a₁)/(a₁b₂ − a₂b₁)   ...(5.8.2)
- If the denominator is zero, the equations are linearly dependent, which indicates that there is no unique intersection.

5.8.2 Intersection Point of Two Line Segments

- The edges of shapes and objects are represented by line segments.
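Equation (5.8.2) translates directly into code, and the same idea extends to bounded segments using the parametric form discussed next. A minimal sketch (function names are mine, not from the text):

```python
def line_intersection(l1, l2):
    """Intersection of a1*x + b1*y + c1 = 0 and a2*x + b2*y + c2 = 0 (Eq. 5.8.2).

    Returns None when the determinant is zero (parallel / dependent lines)."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None
    return ((c2 * b1 - c1 * b2) / det, (c1 * a2 - c2 * a1) / det)

def segment_intersection(p1, p2, p3, p4):
    """Intersection of segments p1-p2 and p3-p4 via parameters t and s in [0, 1]."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    den = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if den == 0:
        return None                    # parallel segments
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / den
    s = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / den
    if 0 <= t <= 1 and 0 <= s <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None                        # the infinite lines cross outside both segments

print(line_intersection((1, 1, -1), (1, -1, 0)))             # (0.5, 0.5)
print(segment_intersection((0, 0), (2, 2), (0, 2), (2, 0)))  # (1.0, 1.0)
```

Note that `segment_intersection` rejects intersections of the carrier lines that fall outside either segment, which is exactly the role of the constraint 0 ≤ t, s ≤ 1.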
Let's see how to compute the intersection of two 2D line segments.
- Consider two line segments given by their end points (P₁, P₂) and (P₃, P₄). If we place position vectors at these points, the following vector equations identify the point of intersection:
  P = P₁ + t(P₂ − P₁)
  P = P₃ + s(P₄ − P₃)   ...(5.8.3)
- Here the parameters t and s vary between 0 and 1. At the point of intersection:
  P₁ + t(P₂ − P₁) = P₃ + s(P₄ − P₃)   ...(5.8.4)
- Writing this in components:
  x₁ + t(x₂ − x₁) = x₃ + s(x₄ − x₃)
  y₁ + t(y₂ − y₁) = y₃ + s(y₄ − y₃)   ...(5.8.5), (5.8.6)
- Solving the two equations for t and s gives:
  t = [(x₃ − x₁)(y₄ − y₃) − (y₃ − y₁)(x₄ − x₃)] / [(x₂ − x₁)(y₄ − y₃) − (y₂ − y₁)(x₄ − x₃)]
  s = [(x₃ − x₁)(y₂ − y₁) − (y₃ − y₁)(x₂ − x₁)] / [(x₂ − x₁)(y₄ − y₃) − (y₂ − y₁)(x₄ − x₃)]   ...(5.8.7)
  The segments intersect only if both t and s lie in [0, 1].

Syllabus Topic : Points Inside a Triangle

5.9 Points Inside a Triangle

- We often need to test whether a point is inside, outside, or touching a triangle. We can perform this test in two ways.
- The first way is based on finding the area of a triangle.

5.9.1 Area of a Triangle

- Assume that the triangle is built anticlockwise from the points (x₁, y₁), (x₂, y₂) and (x₃, y₃). The area of the triangle is given as follows:
  A = ½[(x₂ − x₁)(y₃ − y₁) − (x₃ − x₁)(y₂ − y₁)]
- Simplifying,
  A = ½[x₁(y₂ − y₃) + x₂(y₃ − y₁) + x₃(y₁ − y₂)]
  which can be written as the determinant
      | x₁ y₁ 1 |
  A = ½ | x₂ y₂ 1 |   ...(5.9.1)
      | x₃ y₃ 1 |
- After finding the area of the triangle we check whether the point is inside it. If the test point is P, we check the following conditions:
  1. If the area of triangle (P₁, P₂, P) is positive, then point P must be to the left of the line (P₁, P₂).
  2. If the area of triangle (P₂, P₃, P) is positive, then point P must be to the left of the line (P₂, P₃).
  3. If the area of triangle (P₃, P₁, P) is positive, then point P must be to the left of the line (P₃, P₁).

5.9.2 Hessian Normal Form

- The Hessian normal form can also be used to verify whether a point is inside, outside, or touching a triangle by representing the triangle's edges and testing which partition the point lies in.
- If the normal vector of each edge points towards the inside of the triangle, then any point inside the triangle gives a positive result when tested against each edge equation.
- There is no need to ensure that the normal vector is a unit vector in the following calculations.

Fig. 5.9.1 : If the point of interest is always to the left, the boundary is traversed in an anticlockwise sequence

- Consider Fig. 5.9.2: a triangle is formed by the points (1, 1), (3, 1) and (2, 3). With reference to Equation (5.7.14) we calculate the three line equations:
  1. The line between (1, 1) and (3, 1):
     x(0) − y(2) − (1 × 0 − 1 × 2) = 0
     −2y + 2 = 0   ...(5.9.2)

Fig. 5.9.2 : The triangle created by three line equations expressed in the Hessian normal form

     Multiplying Equation (5.9.2) by −1 to reverse the normal vector:
     2y − 2 = 0   ...(5.9.3)
  2. The line between (3, 1) and (2, 3):
     x(2) − y(−1) − (3 × 2 − 1 × (−1)) = 0
     2x + y − 7 = 0   ...(5.9.4)
     Multiplying Equation (5.9.4) by −1 to reverse the normal vector:
     −2x − y + 7 = 0   ...(5.9.5)
  3. The line between (2, 3) and (1, 1):
     x(−2) − y(−1) − (2 × (−2) − 3 × (−1)) = 0
     −2x + y + 1 = 0   ...(5.9.6)
     Multiplying Equation (5.9.6) by −1 to reverse the normal vector:
     2x − y − 1 = 0
- Thus, the triangle's three line equations are:
  2y − 2 = 0
  −2x − y + 7 = 0
  2x − y − 1 = 0
- Here we are interested in the sign of the left-hand expressions:
  2y − 2
  −2x − y + 7
  2x − y − 1
- We can test any arbitrary point (x, y):
  1. If all three expressions are positive, the point is inside the triangle.
  2. If one expression is negative, the point is outside the triangle.
  3. If one expression is zero, the point is on an edge.
  4. If two expressions are zero, the point is on a vertex.
- For example, consider the point (2, 2). The three expressions given above are all positive, which confirms that the point is inside the triangle. The point (3, 3) is outside the triangle, which is confirmed by two positive results and one negative.
- Finally, the point (2, 3), which is a vertex, gives one positive result and two zero results.

Syllabus Topic : Intersection of a Circle with a Straight Line

5.10 Intersection of a Circle with a Straight Line

- It is possible to calculate a circle's intersection with a straight line directly. We start with the circle's equation and the normal form of the line equation:
  x² + y² = r²  and  y = mx + c
- If we substitute the line equation into the circle's equation, we get the two intersection points:
  x₁,₂ = [−mc ± √(r²(1 + m²) − c²)] / (1 + m²)
  y₁,₂ = [c ± m√(r²(1 + m²) − c²)] / (1 + m²)
- Real intersection points exist only when r²(1 + m²) ≥ c²; when the discriminant is zero, the line is tangent to the circle.

Review Questions

Q.1  Explain trigonometry and the trigonometric ratios.
Q.2  What are the sine and cosine rules?
Q.3  What is interpolation?
Q.4  Explain the linear interpolant.
Q.5  Explain non-linear interpolation.
Q.6  Explain trigonometric interpolation.
Q.7  Explain cubic interpolation.
Q.8  How do we interpolate quaternions?
Q.9  Explain the Bezier curve.
Q.10 Explain quadratic Bezier curves.
Q.11 Explain B-splines.
Q.12 What are uniform B-splines?
Q.13 Explain non-uniform B-splines.
Q.14 Explain the following terms with respect to geometry:
     a. Angles  b. Intercept theorems  c. Golden section  d. Triangles
     e. Centre of gravity of a triangle  f. Isosceles triangle  g. Equilateral triangle
     h. Right triangle  i. Thales theorem  j. Theorem of Pythagoras
Q.15 Write a short note on 2D analytical geometry.
Q.16 Write a short note on intersection points.
Q.17 How will you decide whether a point is inside or outside a triangle?

Chapter Ends...
CHAPTER 6 : Introduction to Rendering Engines

Syllabus

Introduction to Rendering Engines : Understanding the current market Rendering Engines. Understanding AR, VR and MR, Depth Mappers, Mobile Phones, Smart Glasses, HMDs.

Syllabus Topic : Understanding the Current Market Rendering Engines

6.1 Understanding the Current Market Rendering Engines

A few of the current market rendering engines are discussed in this section.

Fig. 6.1.1 : Understanding the Current Market Rendering Engines

> 1. 3Delight

- 3Delight is a 3D rendering program designed for maximum compatibility with industry standards, as it uses the RenderMan shading language. Its integration into popular 3D modelling packages such as Maya makes it a well-known choice for feature-film effects.
- In Maya, users can choose between different rendering algorithms (REYES and path tracing), giving filmmakers greater flexibility without the need to purchase another application.
- As is to be expected, this renderer supports physically based materials; other supported standards include OpenEXR and OpenVDB. For single users, 3Delight is free. 3Delight is a strong choice for visual effects work because it integrates well with other programs.

> 2. Arion

- Arion is an unbiased 3D renderer that permits rendering light sources separately, for complete flexibility in post-production. It delivers results at high speed because it supports both CPU and GPU based rendering. The GPU mode, however, requires compatible NVIDIA hardware.
- For realistic skin, Arion features sub-surface scattering that is both physically based and highly configurable. Both the sun and sky simulations are also physically based.
- To add to the realism, physical lens effects such as ISO, f-stop and film response can be used in this renderer. AOVs can be efficiently composited thanks to its OpenEXR support. Its strong output options make Arion one of the best renderers on this list.

> 3. Arnold

- Arnold is a fast CPU based ray-tracing renderer that was developed with the VFX studio Sony Pictures Imageworks. It is capable of interactive rendering, meaning that when you make changes during rendering, they are immediately incorporated into the ongoing computation.
- This, in turn, accelerates the process of developing the desired look. With this renderer you can create almost any render pass imaginable, giving you a great deal of control over the final look in the compositing stage.
- Being one of the favoured renderers in the effects industry, Arnold supports volumetrics. Furthermore, its material editor is node based, making the workflow intuitive. Numerous presets are available out of the box, and additional nodes are written by the active user community. Its extensive range of features makes Arnold outstanding among renderers for effects work.

> 4. Artlantis

- Artlantis is a 3D rendering package that has been developed with an eye on the needs of architects and designers. It provides a wide choice of presets, such as indoor/outdoor lighting, for a fast turnaround. The materials in Artlantis are physically oriented in order to achieve photorealistic results. Likewise, among the advanced shaders are emissive materials that broaden the range of ways scenes can be lit.
- The built-in render manager in this package enables you to distribute the computing load evenly among your local PCs, reducing the overall rendering time.
- Artlantis is available in one edition for high-resolution still images and another that can create high-resolution 3D renderings, iVisit360 panoramas, VR objects, and animations. Because of its many output formats, Artlantis is an excellent renderer for designers who need to impress their clients.

> 5. Clarisse

- Though not a granddaddy of rendering software, Clarisse can already claim numerous blockbuster credits, such as the recent Star Wars films.
- The reason behind this success is that Clarisse is not just a fast renderer; it also encourages artists to set up their scenes cleverly and cost-effectively. For example, scenes are built, lit and rendered from assets. These assets can be adjusted in third-party software, which eases the integration of Clarisse into a pipeline.
- The rendering capabilities of Clarisse are very advanced. For example, it supports the integration of rendered images and sequences from other renderers through deep-image output. To achieve this, Clarisse outputs depth data for each pixel.
= Artlantis is accessible ina rendition for high-determination still pictures and another that ccan create high-determination 3D renderings, iVisit360 displays, VR Objects, and arrangements, Because of the many output formats, Artlantis is a magnificent 3D rendering software for designers who need to inspire their clients. > 5. Clarisse ~ Despite the fact that not a granddaddy of 3D rendering software, Clarisse would already be able to demonstrate various blockbuster credits for itself, similar to the new Star Wars films. ~ The explanation behind this achievement Clarisse is that it isn't just a quick 3D rendering software, yet it additionally urges specialists to set up their scenes cleverly and cost- successfully. For example, scenes are fabricated, lit and rendered from resources. These benefits can be adjusted in outsider software which encourages the pipeline mix of Clarisse, ‘The rendering capacities of Clarisse are exceptionally best in class. For example, it encourages the reconciliation of rendered pictures and sequences from other 3D rendering Software by profound picture output. To separate it, Clarisse outputs depth data for each Pixel, Inroducton to Rendering Engng, = You can Jeam Clarisse for nothing with the Personal Learning Edition. The renders mae with the PLE are restricted to 2560x1440 determination and may not be utilized f, business purposes. Clarsse is the best 3D rendering software for business studios thay endeavor to streamline their generation pipeline. % 6 Corona ~ Corona is an (un)biased photorealistic 3D renderer, that has turned out to be mainstream, because of the convenience. Following the saying "toning it down would be best," Corona offers a less jumbled UI to chop down setup times. ~ Despite the fact that a CPU based solution , the intuitive rendering ability of this 3p) rendering software can likewise accelerate work processes. This 3D rendering software jg adaptable with respect to authenticity. 
- Phenomena that require a lot of processing power, such as caustics, can be disabled selectively. In addition, materials can be created that bend to the artistic vision rather than physical reality. In this way, a material can be blue when viewed directly yet have a different colour when reflected or refracted. Because of its great usability, Corona is deservedly a highly popular renderer.

> 7. FelixRender

- FelixRender is cloud-based 3D rendering software. It frees professionals who lack the resources from the limitations of their own hardware by providing an interface that permits easy upload of scene files. Important application fields include architectural visualization and advertising.
- Renders are completed lightning-fast and are ready for download. For security, all uploaded files are 256-bit encrypted. Under the hood, it uses Maxwell Render, an unbiased high-end renderer.
- Furthermore, FelixRender provides social-network features such as sharing scenes, materials, and assets with colleagues. To populate large and complex scenes, FelixRender adds its own kind of 3D model instancing to the bundle. FelixRender is a convenient cloud rendering service for professionals.

> 8. FurryBall

- Another very interesting biased/unbiased GPU renderer is FurryBall. It offers a live dynamic preview, meaning you do not have to wait for final renders to see the effects of your changes, allowing a considerably faster workflow.
- The developers target the needs of computer-animated movies and effects rendering. Hence, this renderer is also available as a plug-in for Maya, Cinema 4D, and 3ds Max.
- The plug-ins likewise support all the special components and third-party modules of the host programs that control particles, fluids, and hair, for example Shave and a Haircut. If you want to test-drive this renderer, download the free version, which includes 20 hours of rendering. FurryBall is an excellent choice if you are not happy with the built-in standard renderers of Maya, Cinema 4D, or 3ds Max.

> 9. Guerilla Render

- Guerilla Render is an unbiased renderer that already has many feature films to its name. To help validate a shot before the final render, the program features progressive rendering inside its viewport.
- Thanks to physically based rendering, it is easy to produce photorealistic stills and sequences. Among the rendering features you can find subsurface scattering. The lighting system of Guerilla Render is highly flexible, allowing, for instance, per-light AOVs. Moreover, the program offers full OpenEXR support, which eases the compositing stage by selecting layers and objects in the final rendered image file.
- The software is free for single workstations. The free version of Guerilla Render is the ideal renderer for young professionals and independent artists.

> 10. Iray

- Iray is a biased GPU renderer by NVIDIA. Built on NVIDIA's CUDA technology, Iray is aimed at anyone who does not have expert knowledge of rendering.
- Consequently, this renderer is well suited to creators who require real-time results in photorealistic quality at a reasonable cost. Relevant fields include design, engineering, art, and advertising.
- Although developed for real-time use cases, this renderer offers many of the features you would expect from a CPU solution,
such as instancing, caustics, subsurface scattering and volumetrics.
- To accomplish its superior photorealism, Iray uses physically based materials, capturing their appearance properties, the light output of surfaces, and the scattering and absorption properties of volumes.
- Light sources can be isolated in rendering to gain full control of the final composite. Its usability makes Iray the ideal renderer for achieving convincing results in no time.

> 11. Keyshot

- Keyshot is a renderer like no other on this list. It does not tie into 3D modelling packages. Rather, it is a pure standalone solution that aims at the most user-friendly rendering process possible.
- Keyshot has a comprehensive library of physically accurate materials that are assigned to the 3D file by drag and drop. The whole procedure happens in real time, i.e. while you set up your scene with cameras, materials, and lights, the program constantly refreshes the rendered image.
- This feature of Keyshot enables you to work considerably faster than with many other solutions. Its ease of use and real-time rendering make this renderer an excellent choice for designers and modellers who want to showcase their work without spending time learning complex new software.

> 12. Blender

- Blender is easily the most versatile entry on this list, and in many respects it compares favourably to top digital content creation tools such as Maya, Cinema 4D, and 3ds Max. To this day it remains one of the best open-source development projects ever conceived.
- Blender is fully featured, offering a complete range of modelling, sculpting, animation, surfacing, painting, and rendering tools.
- The software is good enough to have produced numerous excellent short films and is used by several professional studios. Blender was criticized early on for having a confusing interface, but don't let outdated complaints steer you away.
- The software was given a thorough overhaul a while ago and emerged with a fresh interface and a feature set that aims for parity with the best. While you don't often see Blender in Hollywood effects pipelines, where Autodesk and Houdini are deeply entrenched, Blender has steadily carved out a niche in motion graphics and visualization, much like where Cinema 4D excels.

Syllabus Topic : Understanding AR, VR and MR

6.2 Understanding AR, VR and MR

6.2.1 Augmented Reality

- Creating an Augmented Reality experience is a multi-stage process, from the development of the 3D models to the technology that powers the image tracking and depth sensing. Each step is an iterative process.
- 3D artists use several software packages, such as SketchUp, Cinema 4D, Blender and many more, to create 3D models.
- The procedure begins with a rough sketch, which then goes through a series of approvals that help refine the idea and lead to the finished image. After the image is accepted, the modelling begins. The next stage is to apply the model's skin, called the texture map. The texture map gives realism to the model and can be styled to fit the requirements of the task. After the 3D model is complete, it goes through the rendering process for a complete AR experience.

6.2.1.1 Applications of AR

> 1. Medical

- Augmented Reality gives hidden information to the surgeon, such as the pulse rate, the condition of the patient's organs, blood pressure and so forth.
- Augmented Reality can help by enabling a surgeon to peer inside a patient, combining one source of images, such as an X-ray, with another, such as video. It can likewise improve viewing of a foetus inside the mother's womb.

> 2. Military

- Augmented Reality can be used to display the actual battlefield. Arcane built an animated terrain that could be used for military intervention with the assistance of augmented reality.
- For instance, soldiers could use special indicators to mark different kinds of objects and warn each other of potential threats.

> 3. Marketing

- The car industry was an early adopter of AR for various promotional purposes. Companies printed special flyers that were recognized by webcams; the system would build a 3D model that was then displayed on the screen.
- A similar approach was used for marketing purposes in other fields. Another marketing scheme was developed for shoes, where the client wears a special pair of socks; when the client walks in front of a camera, he can see his own image on screen wearing the pair of shoes he wishes to try or purchase.

> 4. Navigation

- Augmented Reality can be used in various navigation devices. Information can be shown on the windshield: instrument readings, terrain, destination directions, weather, traffic and road conditions, and alerts to potential hazards in the path.
- It can also allow bridge watch-standers to continuously monitor vital data, such as a ship's heading and speed, while moving through the bridge or performing other tasks.

> 5.
Entertainment

- Augmented Reality is used in the field of entertainment for pools, football fields, race tracks and other sports environments (such as swimming), for virtual advertisements and product placements. For instance, the Fox-Trax system was used for ice hockey, and there are AR games such as Quake, Game City and many more.

> 6. Education and Training

- Embedded markers can be used as part of educational materials; when scanned by an AR device, they produce information in multimedia format, which helps the student learn about different subjects.

> 7. Office

- Cooperation in office spaces is an area where AR may be valuable, by enabling collaboration among colleagues in a workforce through meetings involving both real and virtual participants.
- Such efforts include brainstorming and trade-off sessions using shared visualization with the help of smart digital whiteboards, touch-screen tables and shared drawing spaces.

6.2.2 Virtual Reality

- Our eyes are a certain width apart, which means that when a person looks at an object, each eye sees a slightly different perspective of it. The brain then processes these two pictures to form a 3D image. VR devices work in the same way.
- The first part of the system reproduces the left and right eyes' points of view with two side-by-side pictures. This is known as 'stereoscopic imaging'.
- The second part of the system is to use a high-resolution OLED display that is large enough to create an immersive field of vision. When these are combined, a whole new world unfolds before the eyes.
- The immersive effect depends on the device's ability to track head movement with no lag or latency as the user looks around the virtual world. This requires several sensors, including gyroscopes, accelerometers and more.
- Virtual Reality headsets track the exact position of the user's head over the 'six degrees of freedom', i.e. both directions of pitch, yaw and roll.

6.2.2.1 Applications of VR

Fig. 6.2.2 : Applications of VR

> 1. Modeling, designing and planning

- Virtual Reality permits the user to see virtual objects in real time and space in a virtual environment; for example, the models developed by the Fraunhofer Institute, Virtual Design and Virtual Kitchen: tools for interior designers who can visualize their sketches.
- Using these tools you can change the texture, colour and position of objects, observing them at once to see how the whole surroundings would look.

> 2. Training and education

- The utilization of VR has expanded in civil industries as well, because simulators are safe and have a lower operating cost than actual flight training. In the same manner, astronauts are trained using virtual reality, as it is more efficient and they can also rehearse hazardous tasks in space without any fear.
- Another application is medicine, where VR permits the study and practice of performing surgeries on different parts of the body. Another case is virtual classes, where students located anywhere can learn from virtual teachers.

> 3. Tele-presence and tele-operation

- Tele-presence technology allows the user to perform medical operations in a distant, widely different environment with the assistance of VR user interfaces.
- The nano-controller project demonstrates a different kind of tele-presence in remote, widely different working conditions.
- This framework, which uses an HMD and force-feedback handling, gives a scientist looking through a microscope the chance to see, feel and manipulate something at the far end of the operating environment.
> 4. Cooperative working

- By making use of a network-based virtual environment, it is possible for remote users to collaborate with each other.
- Information is shared for cooperative working using high-bandwidth links. For example, the CO-CAD desktop system permits a group of engineers to work together within a shared virtual workspace.

> 5. Entertainment

- The use of virtual reality has increased greatly in the field of entertainment due to constant improvement in hardware power and falling hardware prices.

6.2.3 Mixed Reality

- Mixed Reality is a blend of Virtual Reality and Augmented Reality. It is also known as hybrid reality.
- Mixed Reality devices work by making a 3D map of the user's environment and scanning the physical surroundings, so that the device knows how and where to place digital content into that space sensibly, while enabling the user to interact with it using gestures.
- MR experiences bring digital content into the user's real-time environment, enabling the user to interact with it. The visualizations can behave like real-world objects and interact with the surroundings around them, thanks to the use of transparent lenses, understanding of the physical environment, and sound.
- For instance, imagine aliens smashing through the roof and firing rockets and weapons at the user. It is not only breathtaking but also a thrilling experience.

6.2.3.1 Applications of MR

> 1. Industries

- The first MR system was developed by Steven Feiner in 1993 at Columbia University. The system guides users through basic maintenance and repair operations.
- Another distributed, collaboration-oriented MR framework combines mobile phones and PC terminals connected to a server, and uses a context-recognition structure to render the information in a form fitting the client and the situation.
- One more application, the Magic Book, was used to build 3D models of items on the pages of a book. Different views of the models can be seen by moving the book and viewing it from various angles.

> 2. Medicine

- Mixed Reality is very useful for visualizing medical images, such as CT scans and MRIs of different parts of a patient's body. It helps guide the surgeon while operating on the patient. These medical images are displayed on monitors in the operating room. It has been observed that the accuracy ratio is increased by using these technologies.

> 3. Collaboration

- Collaboration is another area of application for Mixed Reality. Here, for instance, a remote expert gives guidance to an operator in the field.
- The operator uses mixed reality to capture the situation and forward it to the expert, who analyses it and guides the operator. In this manner, the visual display of the expert's mixed-reality framework becomes the joint communication channel.
- For medical applications, a framework was proposed in which the patient's body parts are imaged by internal cameras and observed by the doctor on the high-resolution display of a PDA while he operates from his place.
reality can be used as a part of games which help to build competition and additionally coordinated effort among the players. For instance, the MR variant of the Quake computer game, the Human Pacman game, the Moon Lander game, and so forth.

Syllabus Topic : Mobile Phones

6.3 Mobile Phones

- Phones are using AR, VR and MR; for example, the game Pokémon Go. Apple has ARKit, and Google has ARCore.
- In augmented reality the following are the major platforms:
1. Project Tango
2. ARKit
3. ARCore

- Project Tango was a serious push on mobile devices. It had the relevant sensors and computation kit, but it was a bulky phone. For better quality and standards, a lot of data gathering related to the environment is needed, so advanced phones with standard-sized cameras and sensors are needed.
- Apple launched ARKit, which permits you to create supreme augmented-reality experiences for iPhone and iPad. They have added a Maps app to navigate around your local city.
- ARKit apps will work on the iPhone 6s and any newer iPhone running iOS 11.
- For AR you have to get an app first. Examples of AR apps are Pokémon Go and IKEA's app, used to visualize how furniture will look in the home before buying it.
- The applications of mobile phones in AR, VR and MR are:
1. Used for gaming.
2. Used to view furniture in the home before buying it.
3. Simulation for car driving.
4. Seeing the menu on the plate before ordering.
5. Exploring hypermedia news stories that are located at the places to which they refer, and getting a guided campus tour that overlays models of earlier buildings.
6. Used to interact with remote users.
7.
Used in the military for augmentation of a battlefield scene.

Game Programming (MU B.Sc. Comp. - Sem-V) : Introduction to Rendering Engines

The following are the mobile devices which support AR, VR and MR:

Company | VR phones
Apple | iPhone 6, iPhone 6 Plus, iPhone 6s, iPhone 6s Plus, iPhone 7, iPhone 7 Plus
Asus | Padfone Infinity, Padfone X, ZenFone, ZenFone AR, ZenFone Selfie, ZenFone Zoom
Blu | Life Pure XL, Pure XL, Vivo 4.8 HD, Life One X2
BQ | Aquaris E5s, Aquaris M5
Coolpad | MAX
Elephone | P9000, R9, Vowney
Gionee | S6s, M6, M6 Plus
Google | Nexus 6, Pixel Phone, Pixel XL Phone
HTC | 10, 10 Evo, Bolt, Butterfly S, Desire Eye, One 801e, One 802w, One A9, One E8, One E9, One E9 Plus, One E9s, One M8, One M9, One M9 Plus, One Max 803s, One ME, One X9
Huawei | Ascend Mate 2, G7 Plus, GX8 LTE, Honor 6, Honor 7, Honor 8, Mate 8, Mate 9, Mate S, Nexus 6P, Nova, P8
Hyundai | Aero Plus
iNew | V3
Infinix | X552 Zero 3
iOcean | X8
Jiayu | G4 Flagship, G6 32GB, S3, S3+
Karbonn | Quattro L52 VR, Titanium S19
Kogan | Agora 4G, Agora 6 Plus
Lava | Xolo Black
LeEco | X502
Lenovo | IdeaPhone K910, K4 Note, K920 Vibe Z2 Pro, Lemon X3 Lite, Vibe K4
LG | G4, G5, K Series Escape 3, K Series Phoenix 2 GoPhone, K10, K8, K8V, Q Series Q10, Stylus 2, Stylus 3, V10, V20, X Series X Cam
Linshof | i8
LYF | Earth 2, Water 10, Water 7
Meitu | M4s, M6
Meizu | M1 Note, M3 Note, M3s, M3x, MX5, MX5e, Pro 5 M576
Micromax | E484 Canvas 6 Pro, Yureka S TD-LTE Dual SIM
Motorola | DROID Maxx, DROID Maxx 2, DROID Turbo XLTE, DROID Ultra Maxx

Syllabus Topic : Smart Glasses

6.4 Smart Glasses

- The Canadian researcher Steven Mann is an inventor of smart glasses. He works mainly in the wearable computing area. Smart glasses are wearable computers that add information to what the wearer sees.
These are computerized, internet-connected glasses with a transparent heads-up display.
- Smart glasses gather information from internal and external sources. Smart glasses include all the features of smartphones: they support GPS, Bluetooth and Wi-Fi. Activity-tracking functionalities, like calories burned and distance walked, are also added to smart glasses.
- Smart glasses use various display techniques, like curved mirror, laser technologies, waveguide or light-guide based technologies, Virtual Retinal Displays, and Technical Illusions castAR.

Fig. 6.4.1 : Smart Glasses

- Smart glasses with one display can display information and are smart in sensing, actuation and processing.
- But their disadvantages are: they cannot produce 3D content, they are not capable of creating a virtual or diminished reality, and they have not fully exhausted the possibilities of augmented reality.
- Smart glasses with two displays are used for augmented reality, virtual reality and diminished reality.

The uses and advantages of smart glasses are as follows:
1. They have a camera to capture images and to record video.
2. They are hands free.
3. They are used for personal use.
4. They show a clear sight.
5. They are used for navigation.
6. They also notify us about events.
7. They work like memory assistance.

6.4.1 Applications of Smart Glasses

Fig. 6.4.2 : Applications of Smart Glasses (1. Medical, 2. Safety, 3. Education, ...)

> 1. Medical

- Smart glasses are used in the field of medicine to track medicine consumption. For the hearing impaired they provide subtitles.
- They are also used to distract from pain in physical therapy. They have a software-adjustable seeing aid, and lenses to measure blood sugar.

> 2. Safety

Smart glasses are used to issue warnings of danger as well as for accident detection and reaction. Police use them for video and audio streaming, and governments use them for surveillance.

> 3.
Education

- With the help of smart glasses you can see living history, and one can augment the professor. Sophisticated simulations are used for training purposes.
- In physics you can experience virtual objects. They are also used for virtual classrooms.

> 4. Productivity

- One can stream video to colleagues, instructors, experts or trainees, and one can easily watch instructions during work. They are also used for real-time translation.
- They help to direct warehouse employees and to augment construction sites with models. One can easily monitor employee movement via smart glasses.

> 5. Sports

In sports, smart glasses are used for performance measurement and comparison. Communication can be done easily using smart glasses.

6.4.2 Disadvantages of smart glasses

1. Data inaccuracy.
2. As the battery drains out quickly, they need frequent charging.
3. They are not feasible for prescription-eyewear users.
4. Lack of availability.
5. Smart glasses are expensive.
6. They lead to lack of privacy and to accidents.

Syllabus Topic : HMD's

6.5 HMD's

- Head-Mounted Display (HMD) gadgets are used to view individual data in a way that no other display can. HMDs are used as distant data sources; the displayed video can likewise be made responsive to head and body movements, duplicating the manner in which we see, navigate through, and investigate the world.
- This extraordinary capability suits applications such as virtual reality for creating artificial environments, therapeutic visualization as a guide in surgeries, military vehicles for viewing sensor imagery, airborne workstation applications reducing size, weight and power over ordinary displays, aircraft simulation and training, and fixed- and rotary-wing flight display applications.

In a few applications, for example:

1. The medicinal HMD is used exclusively as a hands-off data source.
2.
To really reap the rewards of the HMD as a component of a flight application, however, it must be part of a Visually Coupled System (VCS) that incorporates the HMD, a head-position tracker, and a graphics engine or video source. As the pilot turns his/her head, the tracker transfers the orientation data to the mission computer, which refreshes the displayed data accordingly. This gives the pilot a horde of ongoing information that is linked to head orientation.

3. In a fixed-wing fighter, a rocket's sensor can be slaved to the pilot's head line of sight, enabling the pilot to designate targets from the forward viewable pathway of the aircraft. In a helicopter, the pilot can point sensors, for example forward-looking infrared (FLIR), and fly during the night.

Fig. 6.5.1 : Head Mounted Display

- An HMD consists of one or more image sources, collimating optics, and a means to mount the assembly on the head. The current VR HMDs consist of 4 basic parts:
1. A display image source
2. An optical system
3. A head mount
4. A position tracker

- The display image sources are usually Liquid Crystal Displays (LCD) or Cathode Ray Tubes (CRT). These sources present an image to the viewer. In case of binocular viewing, two displays are needed, that is, one for each eye.
- The optical system permits placing the screen close to the eyes and head for compactness. The image is magnified so that it appears big, and it also provides a wide field of view (FOV).
- The head mount is the base for mounting the components. The position tracker monitors the head position of the user. This information is used by the computer to generate images matching the user's head position and orientation.

Q.1 Write a short note on current market rendering engines.
Q.2 What is AR? Explain its applications.
Q.3 What is VR? Explain its applications.
Q.4 What is MR? Explain its applications.
Q.5 Write a short note on smart glasses.
Q.6 Write a short note on mobile phones.
Q.7 Write a short note on Head-Mounted Displays.

Chapter Ends...

CHAPTER 7

Unity Engine : Multi-platform Publishing, VR + AR

Syllabus

Unity Engine: Multi-platform publishing, VR + AR: Introduction and working in Unity, 2D, Graphics, Physics, Scripting, Animation, Timeline, Multiplayer and Networking, UI, Navigation and Path finding, XR, Pi

Syllabus Topic : Introduction and Working in Unity

7.1 Introduction and Working in Unity

The Unity Editor is used to create 2D and 3D games, apps and experiences.

7.1.1 Steps for Unity Installation

- Unity is free to download; you can download it from its official archive at https://unity3d.com/get-unity/download/archive.
1. Download the latest Unity editor setup from the link given above. (The Unity download archive page lists every released version of Unity.)
2. Click the Download (Win) button; a drop-down menu will appear; select the choice appropriate for you (e.g. Unity 2018.2.0, Unity 2018.1.x and so on).
3. The installer makes use of a download assistant which helps you download the components of the Unity editor which you want. Check the components you want to install and uncheck the components you don't want to install, then click Next to continue.
4. Choose the components to install and click Next.
5.
If you don't know which component to install, leave the default selection and click Next. Then follow the installer instructions.
6. Let the installer download and install Unity on your PC. After that, launch the Unity game engine.

Syllabus Topic : 2D

7.2 2D

- After launching Unity, the very first screen that appears is the "Hello! Sign into your Unity Account" screen.
- Then, if you want, you can create a Unity login ID and log in, or simply click Work Offline.
- After that, go to the Projects tab and click New. The following form will appear; fill in the details, and make sure to select the 2D radio button for 2D projects. Then click Create Project.
- After that, a screen will appear showing the 2D environment for our project. Go ahead and try to explore.

(a) Hierarchy : In the Hierarchy you will be adding objects, cameras, light sources and things like that to your scene.
(b) Assets : In Assets, materials for the game are stored, like images, fonts, scripts, scenes, text files, XML files, music, sounds, videos etc.
(c) Scene : In the Scene view you can see what's going on in the scene itself. Assets can be added here by dragging them, and you can even make changes here.
(d) Inspector : In the Inspector we can modify, add and remove components and the properties of the objects in the scene.
(e) Play Buttons : The Play buttons are used for running the game in the editor to check that your game is working.
(f) Console Tab : Here all the output messages, errors, warnings and debug messages are shown.

Syllabus Topic : Graphics

7.3 Graphics

7.3.1 Adding and Managing Assets

- In the screenshot given below you can see a basketball image.
- Drag that image into your Unity Assets folder as shown.
(c) Play Buttons : In field Play Buttons used for running game in the editor to check you game working, (6) Console Tab : Here all the output messages, errors, warnings ‘and debug messages an shown, ‘Syllabus Tople : Graphics 7.3 Graphics 7 Adding and Managing Assets =~ Inthe screenshot given below you can see a basketball image. = Drag that image into your unity assets folder as shown, Programming (Mt SSS ONES Comp. Sem)_ 7 Unity Engine: Mult-pat. Publ. VA+ AR = Till now we have just added the ball into our project but notin the scene. To add it drag it from the asset view to the scene view. This is stored as sprites when we are working in 2D. It is the way in which unity remember that we are working with the image to use in 2D. Afier adding the image to the scene you might observed the change in inspector wi options like Sprite Renderer and Transform they are important components. Transform is used to define the postion of the image in the game world as every obj. has position rotation and scale property try playing with them will give you bee, understanding, Sprite Renderer is used to control how the sprite is handled on the scene. Now if you clic, the play button to test you will see the image you have added with a blue background, The background is added by the camera if you want to change the background you can simply click the main camera and change the background color from the inspector. Programming (MU Be. Conp, 9 _Unity Engine: Muliplat. Pub VR+ AR, o_o 5 4 _Physles a ae payin 2D game deelopment ines thse pins: Vectors ah In unity we have two classes named Vector2 and Vector3.They are containers for numeric values (qostly float values, Vector? contain 2 values and Vector3 contain 3 values. While working in 2D use of Vector? is more often than of Vector3, Physics In 2D game ‘development 2, Rigid-bodies and Forces 3, Colliders and Colisions Fig. 
7.4.1 : Physics in 2D game development

- While writing code in Unity, instead of defining the two positions X and Y separately, we can specify them using a Vector2, which contains both variables.

For example, declaring an instance:

    Vector2 locate;

Assigning values to the locate vector, where it is assumed ballX and ballY are the position coordinates of the ball, which are already declared:

    locate = new Vector2(ballX, ballY);

- Vectors are best used when we are working with values that are intended to change the location, rotation or direction of something.

> 2. Rigid Bodies

- In your project, select your sprite ball and then click on the Add Component button. Then go to the Physics 2D component and select Rigidbody 2D.
- You will notice that a new row of Rigidbody 2D properties is added below the Sprite Renderer in the Inspector.
- At this point, if you run the game you will notice that the ball falls down, due to the gravity feature of the rigid body. Try playing with it: if you set the gravity to 0 it won't fall. This feature is useful for applying forces to a rigid body.

> 3. Colliders and Collisions

- The rigid-body feature is not able to detect collisions with other objects. To detect collisions we have to add box colliders. For this, again go to Add Component -> Physics 2D -> Box Collider 2D. Then you will notice a green boundary around the ball sprite — that is the box-collider boundary — and a new row of Box Collider 2D properties (Material, Is Trigger, Used By Effector, Used By Composite, Auto Tiling, Offset, Size, Edge Radius) is added in the Inspector.
- Then click on the Edit Collider option.
This is how you add collider to your game objects or sprites to detect the coltisig between them, Syllabus Tople : Scripting 75 Scripting In Unity C# scripts are used most widely by the developers. In we write the logic to mo the game object and like that feature required in game are defined in scripts programming it properly Let's create a script for that right click in the assets area and goto Create -> C# Script y Programming (MU BSe.Comp.-Som) 7-19 _Unity Engine: Mult-pat. Publ. VA + AR ‘That will create a new [ile with name newBehaviourScript rename that to Move and press center. The ball and any object in the game is referred as game object in the unity, Now double click on it then a editor will open having default predefined code in that. You will see that your base class automatically inherits from the base class MonoBehaviour. 15.1 MonoBehaviour MonoBehaviour is a class that all script will inherits it property. Such as MonoBheaviour contains the definition ofthe position of the gameObject(gameObject transform position x/y/2). = MonoBehaviour also contains definition for the Start) and Update() methods, (a) Start() method This is run in the script once when the gameObject is initialized or it became active.Example we can make use of start method to set the initial value of bullet count for agun game object. (b) Update() method Unity call this method 60 times per second (ie. 60 frames per second). In this the main functionality of code take place, Example detecting input, adding forces, adding score, spawning enemies or bullets ete = See the bellow code screenshot and also go through the comments. 1 using Systen, Collections; 2 using Systen.Coltections.Generic; 3 using UnityEngine; 4 5 public class Move : HonoBehaviour { to a gun r of bullets body2d for k to this picture can get a better Miinenever Aideating Public int bullet: Wectaring the dullet © Unty Engine: PUbLVR « ay private Rigiddody20 gun, Jideclaring a Rigi@ady2d instance. 
void Start({ bullet = 5 = GetComponent(): v an ac 1c 2 Update(){ Af (Input.GetNouseButtonDown (0)) Z/1f the left mouse button 3s pressed... Instantiate(bullet); s/Create a dullet sprite. We'll be dealing with //Thais later on. ‘Syllabus Topic : Animation 7.6 Animation | = In Unity you can perform animation on Game Objects and characters, These animation | axe reargetable and have full control on animation at runtime. Unity also provide Humanoid animation for one character model onto another. The following image shows? view of rypical Animation State Machine in Animator Window. Programm SSCL BSe.Comp.- Sem) Unity Engine: Multiplat. Publ. VR + AR, 7.6.1 Animation workflow = Unity’s animation system is based on Animation Clips that contains the change in object position, rotation and other properties over tmne > Animation clips are organized in structured flowchart known as Animator Controller. This controller acts like a sat that heeps the track of which clip should currently ge or blend together, be playing, and when the animations should (a) (b) Fig. 7.6.1 : How the various parts of the animation system connect together You can see in the above diagram the animated clips are imported from the external source. In the image motion captured humanoid animation is imported. In the Animator Controller animation clips are placed and arranged. The states are shown as nodes connected by lines. Animator Controller is placed as an asset in the project window. The rigged character model's configuration is mapped to Unity's common Avatar format and the mapping is stored as an Avatar asset, While animating a character model , it has an Animator component attached you can ste in the Inspector the Animator Component which has both the Animator Controller and the Avatar assigned. These are used together to animate the model Legacy animation system Unity had s legacy animation system before Unity 4. 
If you need to use when working with older content created before Unity 4 you can use Legacy animation system. Animation from external sources ‘You can import the animation from external sources same way as regular 3D files. The following diagram shows the imported FBX 3D Asset having an animation titled ‘Run’ | Vso re Be Co SemV)_7-17__Unity Engine: Muli-plat Pt R+AR > {i Animator Controllers > fi Bouncing Cube > Prefabs > Gi Robot Arm > Scenes > Gi Scripts = In some cases object to be animated isin same file or in other cases it is in separate file. Sometimes you have a library of animations which is to be used on various different ‘models in your scene. = When you import muhiple animations, the animations can every exist as separate files within your project folder, or you can take out multiple animation clips from a single FBX fle if exported as takes from Motion builder or with a plugin / script for Maya, Max or other 3D packages. () Importing animation files First import the animation in Unity before using it. can import native 3D Studio Max (.max), Maya (amb or ma), and Cinema 4D (.c4d) files, and also generic X fils. lonally, if these imported clips contains lots of bones with lots of keyframes, the atity of information can look tremendously complex. For example, the following image of a humanoid running animation looks like inthe Animation window: Game Programming (MU B.Sc. Comp. -Sem-V)_7:18_Uniy Engine: Mutat. Pub, yp eae rrr eS +n = Tomake the view select the specific bones you want to examine. The Animation windoy then nl spas th keynes orcas forts bones. ~ Limiting the view to just the selected bones Animation window offers a read-only view of the Animation data or viewing imported Animation keyframes. create a new empty Animation Clip in Unity to edit tis dat, after that select, copy and paste the Animation data from the imported Animation Clip into your new, writable Animation Clip. Fig, 7.6.2: Selecting keyframes from an imported clip. 
16.2 Using the Animation View ‘To open the Animation window go to Window > Animation. In the Animation view you can preview and edit Animation Clips for animated GameObjects. (@) Viewing Animations on a GameObject ‘The Animation window is connected with the Project window, Hierarchy window, the Scene view, and the Inspector window. Like the Inspector, the Animation window shows the timeline and Keyframes of the Animation for the currently selected GameObject or Animation Clip Asset. - Youcan select a GameObject using the Hierarchy window or the Scene View, or select an Animation Clip Asset using the Project Window. 76.3 The Animated Properties List = The following image shows the Animation view (left). It shows the Animation used by the currently selected GameObject, and its child GameObjects if they are also controlled by this Animation = At the right side The Scene view and Hierarchy view shows the Animations attached to The following image show an emply clip and the Animation view shows list of the _Aaimated properties, but there is no property in empty clip. Dipiaet | ciret v,... ~ Whea you start the animation different properties within the clip will appear in anima window. If there are multiple child objects controlled by the animation window then the also add hierarchical sub-lists of each child object's animated properties. ~ Inthe example above, a variety of part of the Robot Arm's GameObject hierarchy are ; ‘nimated inside the same animation clip. All the properties are folded and unfolded ; disclose the exact values recorded at each keyframe. ~ The value fields demonstrate the interpolated value ifthe playback heads (the white lin: is between keyframes. ~ You can edit these fields directly. If modifications are done when the playback head over a keyframe, the keyframe’s values are modified. If modifications are done when playback head is between keyframes a new keyframe is created at that point with the ne value that you entered. Se. Comp. 
7.6.4 The Animation Timeline

- At the right side of the Animation view is a timeline for the current clip. The keyframes for each animated property become visible in this timeline. There are two modes of the timeline view: Dopesheet and Curves.
- To toggle between these modes, click Dopesheet or Curves at the bottom of the animated-property list area. These offer two alternate views of the animation timeline and keyframe data.

7.6.4.1 Dopesheet Timeline Mode

This mode provides the more compact view and permits you to view each property's keyframe sequence in an individual horizontal track. This gives you a simple overview of the keyframe timing for many properties. In Dopesheet mode, the Animation window shows the keyframe positions of all animated properties in the animation clip.

7.6.4.2 Curves Timeline Mode

- Curves mode displays how the values of every animated property change over time, as a resizable graph. All selected properties appear overlaid in the same graph view.
- This mode permits you to control the viewing and editing of the values, and how they are interpolated between keyframes — for example, the curves for the rotation data of four selected GameObjects within one animation clip.

7.6.4.3 Fitting Your Selection to the Window

- Sometimes the value ranges of different properties differ greatly. For instance, a spinning, bouncing cube's rotation value may go from 0 to 360 degrees, while its bouncing Y-position value may vary only between 0 and 2.
- When you view both curves at the same time, the animation curve for the position values is very difficult to make out, because the view will be zoomed out to fit the 0-360 range of the rotation values within the window.
- To zoom the view to the currently selected keyframes, press F on the keyboard. This is a quick way to focus and re-scale the window on a part of the animation timeline for easier editing.
- Click on individual properties in the list and press F on the keyboard to automatically re-scale the view to fit the range of that value.
- Using the drag handles at each end of the view's scrollbar sliders, you can manually adjust the zoom of the Curves window. For example, when the Animation window is zoomed in to view the bouncing Y-position animation, the beginning of the yellow rotation curve is still visible, but it now extends beyond the top of the view.
- To show all the keyframes in the clip, press A on the keyboard to fit and re-scale the window. This is useful for viewing the whole timeline while preserving your current selection.

7.6.5 Playback and frame navigation controls

- Use the playback controls at the top left of the Animation view to control playback of the Animation Clip.
- The controls, from left to right, are: Preview mode (toggle on/off), Record mode (toggle on/off), Move playback head to the beginning of the clip, Move playback head to the previous keyframe, Play Animation, Move playback head to the next keyframe, Move playback head to the end of the clip.
~ Ifyou want to focus on the animation on a specific GameObject, and stl be able to seley ‘and manipulate other GameObjects in the Scene then you can lock the window. ~The following image shows the lock button in rectangle. 7.6.6 Creating a New Animation Clip ~ You need an Animator Component attached to GameObjects in Unity for animation, This Animator Component have to reference an Animator Controller , which in tum consists of references to one or more Animation Clips. ~ To create a new Animation Clip for the chosen GameObject, ensure that Animation Window is visible. Ifthe GameObject does not so far have any Animation Clips assigned, there is a create button in the centre of the Animation Window timeline area. Click the Create button. You will then be prompted for saving your new empty Animation Clip anywhere in your Assets folder. = After creating the animation clip many things happen automatically: 1, Anew Animator Controller asset is created 2. The new created clip will be added into the Animator Controller as the default state 3, An Animator Component is added to the GameObject being animated 4. The Animator Component is having a new Animator Controller assigned to it yg Programming (MU BSo.Comp.-Sem-V) 7-25 Unity Engine: Mult-plat. Publ. VA + AR ‘The effect of this automatic sequence is that all the necessary elements of the animation system are set up for you, and you can now start animating the objects. 4 Adding Another Animation Clip If you want to create a new animation clip on an object then select create new clip from menu. You will be again prompted o save. 7.6.6.2 How it Fits Together ‘The above steps automaticaly set up the related components and references; it will be helpful to know which parts must be connected together. 
= A GameObject have to have an Animator component = The Animator component have to have an Animator Controller asset assigned = The Animator Controller asset have to have one or more Animation Clips assigned ~ The following diagram describes how these parts are assigned, starting from the new animation clip created in the Animation Window: (MU B.Sc. Comp. -Sem-V) _7-26 ine: Mult-plat. Put VR + AR, — In the following image you can see that a GameObject selected (“Cube”) that is not Up ti) ‘now animated. It is a cube, with no Animator component. ~ Inthe image the Animation, Hierarchy, Project and Inspector windows are arranged sige. by-side for clarity. - When you click on the create button in the Animation view a new animation clip is created. Unity will request to pick the name & location to save this new Animation Clip and creates an Animator Controller asset with the same name as the selected GameObject It will then add an Animator component to the GameObject, and attaches the assets up Properly. ~The following image shows After creating a new clip new assets created in the project window, and the Animator Component assigned in the Inspector window and the new clip assigned as the default state in the Animator Window Game Programming (MU B.Sc. Comp,-Sem:V) 7-27 Unity Engine: Mult-plat. Publ. VA + AR 161 Animating a GameObject ‘There are two methods for animating the gameobjects in animation window, they are rd Mode and Preview Mode, (6) Record Mode In record mode Key frames are automatically created at the playback head when you modify the animated property like rotate, move. Press the button next to preview. The animation will be tinted red when itis in record mode @ Animation z 5 Fig. : 7.6.3: The Animation Window in Record mode (0) Preview Mode In this mode altering your animated GameObject do not create the keyframes automatically. You have to create the keyframes manually every time when you modify your GameObject. 
The following image shows that the Animation window timeline is tinted blue when in preview mode.

Fig. 7.6.4 : The Animation Window in Preview mode

7.6.8 Recording Keyframes

To record keyframes for the chosen GameObject, click on the Animation Record button. You will enter Animation Record Mode, and the changes done to the GameObject are recorded into the Animation Clip. The following image is an example of the Animation window in record mode.

- It is possible for you to manipulate the properties of the GameObject in animation record mode. In case you move, rotate or scale a GameObject, the matching keyframes are added for those properties in the animation clip.
- In the Transform properties, the .x, .y, and .z properties are special: curves for all three are added at the same time.
- By clicking on the Add Property button you can add animatable properties to the current GameObject. When you click this button, it shows you a list of the GameObject's animatable properties. These match the properties listed in the Inspector.

7.7 Timeline

You can click anywhere on the Animation window timeline to move the playback head to that frame, and preview or modify that frame in the Animation Clip. The numbers in the timeline are shown as seconds and frames, so 1:30 means 1 second and 30 frames. This timeline bar in the Animation window shares the same name as, but is separate from, the Timeline window.

7.7.1 Creating Keyframes in Preview Mode

In preview mode, animated properties are tinted blue in the Inspector window. When you see this blue tint, it means these values are being driven by the keyframes of the animation clip currently being previewed in the Animation window.
In preview mode, animated fields are tinted blue in the Inspector.

- If you modify any of these blue-tinted properties while previewing (such as rotating a GameObject that has its rotation property animated, as in the above screenshot), the GameObject is now in a modified animation state. This is indicated by a change of the tint of the Inspector field to a pink colour. Because you aren't in record mode, your modification is not yet saved as a keyframe.
- For example, in the screenshot below, the rotation property has been modified to have a Y value of -90. This modification has not yet been saved as a keyframe in the animation clip.

A modified animated property in preview mode. This change has not yet been saved as a keyframe.

- In this modified state, you must manually create a keyframe to "save" this modification. If you move the playback head, or switch your selection away from the animated GameObject, you will lose the modification.

7.7.2 Manually Creating Keyframes

There are three different ways to manually create a keyframe when you have modified a GameObject in preview mode:
1. Right-click the property label of the property you have modified. It permits you to add a keyframe for just that property, or for all animated properties.
2. You can also add a keyframe by clicking the Add Keyframe button in the Animation window.
3. You can add a keyframe (or keyframes) by using the hotkeys K or Shift-K. Hotkey K is "Key all animated": it adds a keyframe for all animated properties at the current position of the playback head in the Animation window. Shift-K adds a keyframe only for those animated properties which have been modified.

Syllabus Topic : Timeline

7.8 Timeline

To create cinematics, cutscenes and gameplay sequences, use the Timeline Editor window. The following image shows a cinematic in the Timeline Editor window. The tracks and clips are saved to the project.
The references to GameObjects are saved to the scene.

- The Timeline Editor window saves Timeline Assets and Timeline instances.

7.8.1 Timeline Asset

The tracks and the clips are saved by the Timeline Asset. The Timeline Editor saves recorded animation as children of the Timeline Asset.

7.8.2 Timeline Instance

You cannot directly add a Timeline Asset to the scene. You have to create an instance to animate the GameObjects in the scene with a Timeline Asset.

- The Timeline Editor window gives an automatic method of creating a Timeline instance while creating a Timeline Asset.
- If a playable GameObject which is associated with a Timeline Asset is selected in the scene, then its bindings show in the Timeline Editor window and in the Playable Director component (Inspector window).

7.8.3 Reusing Timeline Assets

- You can reuse the same Timeline Asset with many Timeline instances. For instance, suppose you have created a Timeline Asset named Victory Timeline with the music, animation, and particle effects that play when the main game character (Player) is victorious.
- You can reuse the Victory Timeline Timeline Asset to animate a new game character (Enemy) in the same scene; you can make a new Timeline instance for the secondary game character, as shown in the following images.

7.8.4 Difference between the Animation Window and the Timeline Window

7.8.4.1 The Timeline Window

- You can create cinematic content, audio sequences and gameplay sequences in the Timeline window. You can animate many different GameObjects within the same sequence, for example a scripted sequence where a character interacts with scenery.
- Multiple tracks are available in the Timeline window, and each track contains multiple clips that can be trimmed, moved or blended between. The Timeline window is used for creating complex animated sequences.
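A Timeline instance is driven at runtime by the Playable Director component mentioned above. The following is only a hedged sketch of triggering a Timeline from script: the class name `CutscenePlayer`, the trigger logic, and the "Player" tag are illustrative assumptions, not from the text.

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Sketch: playing a Timeline instance from a script via its
// Playable Director component (which binds the Timeline Asset
// to the GameObjects in the scene).
public class CutscenePlayer : MonoBehaviour
{
    public PlayableDirector director;   // assign in the Inspector

    void OnTriggerEnter(Collider other)
    {
        // Play the Timeline instance when the player enters the trigger.
        if (other.CompareTag("Player"))
            director.Play();
    }
}
```

Attach such a script to a trigger volume and drag the GameObject holding the Playable Director into the `director` field; `PlayableDirector.Stop()` and `Pause()` are available for the reverse operations.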
The Timeline window, showing many different types of clips arranged in the same sequence.

7.8.4.2 The Animation Window

- You can create individual animations by using the Animation window. You can also view the imported animation clips. Animation Clips contain animation for a single GameObject, or a single hierarchy of GameObjects.
- The Animation window is useful for animating separate items in your game: a sliding door, a swinging pendulum, or a spinning coin. Only one Animation Clip is shown at one time in the Animation window.
- The timeline is a part of the user interface in the Animation window, but it is separate from the Timeline window.

The Animation window, shown in dopesheet mode, showing a hierarchy of objects (in this case, a robot arm with numerous moving parts) animated together in a single animation clip.

7.8.5 Creating a Timeline Asset and Timeline Instance

- The Timeline Editor window offers an automatic method of creating a Timeline instance at the time of creating a new Timeline Asset. The Timeline Editor window also creates all the essential components.
- To create a new Timeline Asset and Timeline instance, follow these steps:
1. First select the GameObject from your scene that you want the gameplay-based sequence to use.
2. Open the Timeline Editor window by clicking on menu: Window > Timeline Editor. If the GameObject does not yet have a Playable Director component attached to a Timeline Asset, a message in the Timeline Editor window prompts you to click the Create button, for example: "To begin a new timeline with Enemy, create a Director component and a Timeline asset."
3. Now click the Create button; a dialog box prompts you for the name and location of the Timeline Asset being created.
4. Click Save.

7.8.5.1 Recording Basic Animation with an Infinite Clip

1. Create an empty Animation track for the GameObject to animate.
2. Click on the Record button for the empty Animation track in the Track list to enable Record mode.

When a track is in Record mode, the clip area of the track is shown with a "Recording..." message and the Record button blinks. If you change any animatable property in Record mode, a key is set at the location of the Timeline Playhead.

Creating Animation

Move the Timeline Playhead to the location of the first key, and perform one of the following:
1. In the Inspector window, right-click the name of the property and select Add Key. This adds an animation key for the property without changing its value. A white diamond appears in the Infinite clip to show the position of the key.
2. In the Inspector window, alter the value of an animatable property of the GameObject. This adds an animation key for the property with its changed value. A white diamond appears in the Infinite clip.
3. In the Scene view, rotate, move, or scale the GameObject to add a key. This automatically adds a key for the properties you alter. A white diamond appears in the Infinite clip.

- Now move the Playhead to a different location on the timeline and alter the animatable properties of the GameObject. At each location, the Timeline Editor window adds a white diamond to the Infinite clip for any modified properties, and adds a key to its connected animation curves.
- Right-click the name of an animatable property when you are in Record mode to perform keying operations, for example jumping to the next or previous keys, setting a key without changing its value, removing keys, and so on. For instance, right-click Position and choose Add Key from the context menu.
- After finishing the animation, click on the blinking Record button to disable Record mode.
- To edit the keys when an Infinite clip appears as a dopesheet in the Timeline Editor window, use the Curves view to edit keys, or you may double-click the Infinite clip and edit the keys with the Animation window.
- The recorded clip is saved into the Timeline Asset in the project. The Timeline Editor window saves the key animation from the Infinite clip as a source asset.
- The name of the source asset is "Recorded", and it is saved as a child of the Timeline Asset in the project, as shown in the following image.

Syllabus Topic : Multiplayer and Networking

7.9 Multiplayer and Networking

7.9.1 Multiplayer Overview

There are two types of users for the Networking feature:
1. Users who make a multiplayer game with Unity. Such users should start with the NetworkManager or the High Level API.
2. Users who build a network framework or a more advanced multiplayer game. Such users should start with the NetworkTransport API.

7.9.1.1 High Level Scripting API

With Unity's networking "high-level" scripting API (HLAPI) you do not have to worry about the low-level implementation details. This API gives you access to commands that cover most of the general requirements for multiuser games. The HLAPI allows you to:
1. Control the networked state of the game by using a Network Manager.
2. Operate client-hosted games.
3. Serialize data using a serializer.
4. Send and receive network messages.
5. Send networked commands from clients to the server.
6. Make Remote Procedure Calls (RPCs) from the server to clients.
7. Send networked events from the server to clients.

7.9.1.2 Engine and Editor Integration

- Unity's networking is integrated into the engine and the editor, allowing you to work with components and visual aids to build your multiplayer game. For networked objects it has the NetworkIdentity component.
- For scripts it has NetworkBehaviour; for object transforms it has configurable automatic synchronization, along with automatic synchronization of script variables. It gives support for locating the networked objects in Unity scenes.

7.9.1.3 Internet Services

Internet services are also offered by Unity for game support; they include a Matchmaking service, a Relay server, listing and joining available matches, creation and advertising of matches, playing games on the internet with no dedicated server, and message routing.

7.9.1.4 NetworkTransport Real-time Transport Layer

- It has a real-time transport layer that provides an optimized UDP-based protocol, a multi-channel design to avoid head-of-line blocking issues, support for a variety of levels of Quality of Service (QoS) per channel, and a flexible network topology that supports peer-to-peer or client-server architectures.

7.9.1.5 Setting up a Multiplayer Project

For a multiplayer project you need a Network Manager, a user interface (for players to find and join games), networked Player Prefabs, and multiplayer-aware scripts and GameObjects. You also need to understand the relationship between client, server and host, and the authority over GameObjects and actions.

7.9.2 The Network Manager

- The Network Manager manages the networking aspects of a multiplayer game. Only one Network Manager should be active in your Scene at a time. There is a built-in NetworkManager component in Unity.
- It manages all the features of a multiplayer game. You can write your own network manager by using a script for your custom requirements.

Fig. 7.9.1 : The Network Manager Component

A user interface for players to find and join games

Most multiplayer games provide the players with a way to find, create and join individual game instances (matches). This part of the game is commonly called the lobby; sometimes it has extra features like chat. If you are using the Network Manager,
Fig. 7.9.2 : Example of the TANKS networking demo

Unity provides a default user interface, called the NetworkManagerHUD. It is useful at an early stage, as it permits you to easily create matches and test your game without implementing your own UI.

Unity's built-in Network Manager HUD, shown in MatchMaker mode.

7.9.2.1 Networked Player GameObjects

You have to create a GameObject that represents the player in your game. Make the GameObject a Prefab, define what the player can do, and assign the Prefab to the Player Prefab field.

Fig. 7.9.4 : The network manager with a "Player Car" prefab assigned to the Player Prefab field

7.9.2.2 Multiplayer-aware Scripts

- When you write scripts for a multiplayer game, you need to think about the different contexts that the scripts run in. If you have written a script for the player Prefab, it should allow the owner of that player instance to control it, and should not allow other people to control it. You have to check whether the server or the client has authority over what the script does.
- It may happen that you want your script to run on both client and server, or you may want to replicate the movement of a GameObject. So it is important to decide which parts of your script should be active in which situations.

7.9.2.3 Using the Network Manager

- The Network Manager is a component for managing the networking aspects of a multiplayer game.
- The Network Manager features include: game state management, spawn management, Scene management, debugging information, matchmaking and customization.

7.9.2.4 Getting Started with the Network Manager

- First create an empty GameObject in your starting Scene and add the NetworkManager component.
The added Network Manager component will look as follows:

Fig. 7.9.5 : The Network Manager as seen in the Inspector window

7.9.3 Game State Management

There are two ways to manage the game state: one is to use the Network Manager HUD, and the second is to write your own UI that lets the player start the game. The Network Manager HUD automatically tells the Network Manager which mode to start in. When you write a script, you have to call one of the following three methods:
1. NetworkManager.StartClient()
2. NetworkManager.StartServer()
3. NetworkManager.StartHost()

The network address and port settings in the Network Manager component.

The game can be started in any mode by using the Network Address and Network Port properties, so it is necessary to set the address and port in the property settings.

7.9.4 Spawn Management

- The Network Manager manages the spawning of networked GameObjects through the Player Prefab slot.
- Assign this slot with your player Prefab. When you have a player Prefab set, a player GameObject is automatically spawned from that Prefab for every user in the game.

Fig. 7.9.6 : The "Spawn Info" section of the Network Manager component

- In the Registered Spawnable Prefabs option you can add Prefabs. It is also possible to register prefabs from code, with the ClientScene.RegisterPrefab() method.
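The three start methods and ClientScene.RegisterPrefab() can be combined in a small script. The following is a minimal sketch under stated assumptions: the class name `SimpleNetStarter` and the field `bulletPrefab` are illustrative, not names used in the text.

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Sketch only: "SimpleNetStarter" and "bulletPrefab" are
// illustrative assumptions, not names from the text.
public class SimpleNetStarter : MonoBehaviour
{
    public GameObject bulletPrefab;   // a prefab with a NetworkIdentity component

    void Start()
    {
        // Same effect as adding the prefab to the
        // Registered Spawnable Prefabs list in the Inspector.
        ClientScene.RegisterPrefab(bulletPrefab);
    }

    void OnGUI()
    {
        // A bare-bones stand-in for the Network Manager HUD:
        // each button picks one of the three start modes.
        var manager = NetworkManager.singleton;
        if (GUILayout.Button("Host"))   manager.StartHost();
        if (GUILayout.Button("Server")) manager.StartServer();
        if (GUILayout.Button("Client")) manager.StartClient();
    }
}
```

In a shipping game you would replace the OnGUI buttons with your own lobby UI, but the same three NetworkManager calls remain the entry points.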
7.9.5 Customizing Player Instantiation

- The NetworkManager.OnServerAddPlayer() implementation is used by the Network Manager to spawn player GameObjects.
- You can customize the way player GameObjects are created by overriding this virtual function. The following code shows an example of the default implementation:

public virtual void OnServerAddPlayer(NetworkConnection conn, short playerControllerId)
{
    var player = (GameObject)GameObject.Instantiate(playerPrefab, playerSpawnPos, Quaternion.identity);
    NetworkServer.AddPlayerForConnection(conn, player, playerControllerId);
}

7.9.6 Start Positions

- Use the Network Start Position component to control where players are spawned. Attach a Network Start Position component to a GameObject in the Scene, and position the GameObject where you want one of the players to start.
- You can add as many start positions to your Scene as you wish. The method GetStartPosition() is used to get a start position for the player.

7.9.7 Scene Management

In many games there is more than one Scene. At the very least, there is usually a title screen or starting menu Scene in addition to the Scene where the game is actually played. The Network Manager automatically manages Scene state and Scene transitions for a multiplayer game. There are two slots in the NetworkManager Inspector for Scenes: the Online Scene and the Offline Scene. Dragging Scene assets into these slots activates networked Scene management.

When a server or host is started, the Online Scene is loaded, and the name of the Scene is stored in the networkSceneName property. When the network is stopped, the Offline Scene is loaded. This allows the game to automatically return to a menu Scene when disconnected from a multiplayer game. By calling NetworkManager.ServerChangeScene() you can change Scenes while the game is active. It makes all the currently connected clients change Scene too, and updates networkSceneName so that new clients also load the new Scene.
The game state management functions such as NetworkManager.StartHost() or NetworkManager.StopClient() can cause Scene changes when networked Scene management is active. The change destroys all the GameObjects in the previous Scene.

You should normally make sure the Network Manager persists between Scenes; otherwise the network connection is broken upon a Scene change. To do this, ensure the Don't Destroy On Load box is checked in the Inspector. However, it is also possible to have a separate Network Manager in each Scene with different settings, which may be helpful if you wish to control incremental Prefab loading, or different Scene transitions.

7.9.8 Customization

In the NetworkManager class there are virtual functions which you can customize by creating your own derived class that inherits from NetworkManager. When implementing these functions, be aware of the functionality that the default implementations provide. For example, in OnServerAddPlayer(), the function NetworkServer.AddPlayerForConnection() has to be called to activate the player GameObject for the connection.

using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.Networking.Match;
public class CustomManager : NetworkManager
{
    // Server callbacks
    public override void OnServerConnect(NetworkConnection conn)
    {
        Debug.Log("A client connected to the server: " + conn);
    }

    public override void OnServerDisconnect(NetworkConnection conn)
    {
        NetworkServer.DestroyPlayersForConnection(conn);
        if (conn.lastError != NetworkError.Ok)
        {
            if (LogFilter.logError)
            {
                Debug.LogError("ServerDisconnected due to error: " + conn.lastError);
            }
        }
        Debug.Log("A client disconnected from the server: " + conn);
    }

    public override void OnServerReady(NetworkConnection conn)
    {
        NetworkServer.SetClientReady(conn);
        Debug.Log("Client is set to the ready state (ready to receive state updates): " + conn);
    }

    public override void OnServerAddPlayer(NetworkConnection conn, short playerControllerId)
    {
        var player = (GameObject)GameObject.Instantiate(playerPrefab, Vector3.zero, Quaternion.identity);
        NetworkServer.AddPlayerForConnection(conn, player, playerControllerId);
        Debug.Log("Client has requested to get his player added to the game");
    }

    public override void OnServerRemovePlayer(NetworkConnection conn, PlayerController player)
    {
        if (player.gameObject != null)
            NetworkServer.Destroy(player.gameObject);
    }

    public override void OnServerError(NetworkConnection conn, int errorCode)
    {
        Debug.Log("Server network error occurred: " + (NetworkError)errorCode);
    }

    public override void OnStartHost()
    {
        Debug.Log("Host has started");
    }

    public override void OnStartServer()
    {
        Debug.Log("Server has started");
    }

    public override void OnStopServer()
    {
        Debug.Log("Server has stopped");
    }

    public override void OnStopHost()
    {
        Debug.Log("Host has stopped");
    }

    // Client callbacks
    public override void OnClientConnect(NetworkConnection conn)
    {
        base.OnClientConnect(conn);
        Debug.Log("Connected successfully to server, now to set up other stuff for the client...");
    }

    public override void OnClientDisconnect(NetworkConnection conn)
    {
        StopClient();
        if (conn.lastError != NetworkError.Ok)
        {
            if (LogFilter.logError)
            {
                Debug.LogError("ClientDisconnected due to error: " + conn.lastError);
            }
        }
        Debug.Log("Client disconnected from server: " + conn);
    }
    public override void OnClientError(NetworkConnection conn, int errorCode)
    {
        Debug.Log("Client network error occurred: " + (NetworkError)errorCode);
    }

    public override void OnClientNotReady(NetworkConnection conn)
    {
        Debug.Log("Server has set client to be not-ready (stop getting state updates)");
    }

    public override void OnStartClient(NetworkClient client)
    {
        Debug.Log("Client has started");
    }

    public override void OnStopClient()
    {
        Debug.Log("Client has stopped");
    }

    public override void OnClientSceneChanged(NetworkConnection conn)
    {
        base.OnClientSceneChanged(conn);
        Debug.Log("Server triggered scene change and we've done the same, do any extra work here for the client...");
    }
}

The Inspector for the Network Manager provides the ability to change some connection parameters and timeouts. Some parameters have not been exposed there, but can be changed through code using UnityEngine.Networking:

public class CustomManager : NetworkManager
{
    // Set custom connection parameters early, so they are not too late to be enforced
    void Start()
    {
        customConfig = true;
        connectionConfig.MaxCombinedReliableMessageCount = 40;
        connectionConfig.MaxCombinedReliableMessageSize = 800;
        connectionConfig.MaxSentMessageQueueSize = 2048;
        connectionConfig.IsAcksLong = true;
        globalConfig.ThreadAwakeTimeout = 1;
    }
}

Syllabus Topic : UI

7.10 UI

The UI system permits us to create user interfaces quickly and naturally. In this section we will see the major features of Unity's UI system.

7.10.1 Canvas

The Canvas is the area inside which all the UI elements are placed. It is a GameObject with a Canvas component on it, and all UI elements must be children of such a Canvas. You can create an image in a Canvas by clicking on menu GameObject > UI > Image.

- You can see the Canvas area as a rectangle in the Scene View. The Canvas uses the EventSystem object for the Messaging System.

7.10.1.1 Draw Order of Elements

UI elements are drawn in the same order they appear in the Hierarchy.
The first child is drawn first, then the second, and so on. If two UI elements overlap, the later one will appear on top of the earlier one.

- To alter which element becomes visible on top of other elements, reorder the elements in the Hierarchy by dragging them. You can also control the order from scripting, by using these methods on the Transform component: SetAsFirstSibling, SetAsLastSibling, and SetSiblingIndex.

7.10.1.2 Render Modes

The Render Mode setting is used on the Canvas to render in screen space or world space:
1. Screen Space - Overlay
2. Screen Space - Camera
3. World Space

Fig. 7.10.1

1. Screen Space - Overlay

- This render mode places UI elements on the screen, rendered on top of the scene. If the screen is resized or changes resolution, the Canvas will automatically change size to match.

Fig. 7.10.2 : UI in screen space overlay canvas

2. Screen Space - Camera

- In this mode the Canvas is placed a given distance in front of a particular Camera, and that camera renders the UI elements. The Camera settings affect the appearance of the UI; the Field of View controls the distortion of perspective.
- The Canvas will automatically change size if you resize the screen, alter the resolution, or the camera frustum changes.

Fig. 7.10.3 : UI in screen space camera canvas

3. World Space

The Canvas behaves as any other object in the scene in this render mode. The Rect Transform is used to set the size of the Canvas manually. This is useful for UIs that are meant to be a part of the world. This is also called a diegetic interface.

Fig. 7.10.4 : UI in world space canvas

7.10.1.3 Basic Layout

As we have seen, the Image is created using the menu GameObject > UI > Image.

Fig. 7.10.5 : The anchor presets, anchor and position fields in the Inspector

> 1. The Rect Tool
- For layout purposes, all the UI elements are represented as rectangles.
- In the Scene View, you can manipulate this rectangle using the Rect Tool in the toolbar. By using the Rect Tool you can move, resize and rotate UI elements.

Fig. 7.10.6 : Toolbar buttons with Rect Tool selected

- The Rect Tool uses the current pivot mode and space, set in the toolbar. When working with UI it's usually a good idea to keep those set to Pivot and Local.

Fig. 7.10.7 : Toolbar buttons set to Pivot and Local

> 2. Rect Transform

- The Rect Transform component is used for all UI elements. It has rotation, position and scale, just like regular Transforms, but it also has a width and height for specifying the dimensions of the rectangle.

> 3. Resizing Versus Scaling

- The Rect Tool changes the local scale of ordinary objects, but when used on an object with a Rect Transform it changes the width and the height, keeping the local scale unchanged. This resizing does not affect font sizes, borders on sliced images, and so on.

> 4. Pivot

- Rotation, size, and scale modifications happen around the pivot; thus the position of the pivot affects the result of a rotation, resizing, or scaling. If the toolbar Pivot button is set to Pivot, the pivot of a Rect Transform can be moved in the Scene View.

> 5. Anchors

Rect Transforms include a layout concept called anchors. Anchors are the four small triangular handles in the Scene View, and anchor information is shown in the Inspector.

If the parent of a Rect Transform is also a Rect Transform, the child Rect Transform can be anchored to the parent Rect Transform in a variety of ways. For example, the child can be anchored to the center of the parent, or to one of the corners.

- The following diagram shows the UI element anchored to the center of the parent; the element has maintained a fixed offset to the center.
- The following diagram shows the UI element anchored to the lower right corner of the parent; the element has maintained a fixed offset to the lower right corner.
- The anchoring permits the child to stretch together with the width or height of the parent. The following diagram shows a UI element with its left corners anchored to the lower left corner of the parent and its right corners anchored to the lower right corner; the corners of the element have maintained fixed offsets to their respective anchors.
- The positions of the anchors are defined in fractions of the parent rectangle width and height. 0.0 (0%) corresponds to the left or bottom side, 0.5 (50%) to the middle, and 1.0 (100%) to the right or top side.
- The following diagram shows a UI element with its left corners anchored to a point a certain percentage from the left side of the parent, and its right corners anchored to a point a certain percentage from the right side of the parent rectangle.

> 6. Anchor Presets

- You can find the Anchor Presets button in the Inspector at the upper left corner of the Rect Transform component. Clicking the button shows the Anchor Presets dropdown. You can select some common anchoring options from here. The horizontal and vertical anchoring are independent.
- The Anchor Presets button displays the currently selected preset option if there is one. If the anchors on either the horizontal or vertical axis are set to different positions than any of the presets, the custom option is shown.
- The Rect Transform also has number fields for position and size which are not handled in the Scene View. What the fields display depends on whether the anchors are together: if all the anchor handles are together, the fields displayed are Pos X, Pos Y, Width and Height. The Pos X and Pos Y values show the position of the pivot relative to the anchors. If the anchors are separated, the fields can change to Left, Right, Top and Bottom.

7.10.1.4 Visual Components

This section covers the new Components that can be created.
> 1. Text

- The Text component is also called a Label. It has a Text area to enter the text which will be displayed.
- You can set the font style and font size of the text, align the text, set the horizontal and vertical overflow which control what happens if the text is larger than the width or height of the rectangle, and use a Best Fit option that makes the text resize to fit the available space.

> 2. Image

- An Image has a Rect Transform component and an Image component. You can apply a sprite to the Image component under the Target Graphic field, and also set its colour in the Color field.
- A material can be applied to the Image component. The Image Type field describes how the applied sprite will appear, such as Sliced, Tiled, and Filled.

> 3. Raw Image

- A Raw Image takes a texture. A Raw Image is used only if necessary; otherwise an Image will be appropriate in the majority of cases.

> 4. Mask

- A Mask is not a visible UI control; rather it alters the appearance of a control's child elements. The mask limits the child elements to the shape of the parent.
- Therefore, if the child is bigger than the parent, only the piece of the child that fits within the parent will be visible.

> 5. Effects

- Different effects can be applied to the components, like a simple drop shadow or outline.

7.10.2 Interaction Components

- The majority of interaction components have a few things in common. They are selectable, and have at least one UnityEvent which is invoked when the user interacts with the component.
- The UI system catches and logs any exceptions. The following table shows the interaction components of the UI and their descriptions.

Button — A Button has an OnClick UnityEvent to define what it will do when clicked.

Toggle — A Toggle has an Is On checkbox that determines whether the Toggle is currently on or off.
It also has an OnValueChanged UnityEvent that defines what it will do when the value is changed.

Toggle Group
- A Toggle Group can be used to group a set of Toggles that are mutually exclusive (for example, "Choose a character").

Slider
- A Slider has a decimal number Value that the user can drag between a minimum and a maximum value. It can be either horizontal or vertical. It also has an OnValueChanged UnityEvent that defines what it will do when the value is changed.

Scrollbar
- A Scrollbar has a decimal number Value between 0 and 1. When the user drags the scrollbar, the value changes accordingly. Scrollbars are often used together with a Scroll Rect and a Mask to create a scroll view. The Scrollbar has a Size value between 0 and 1 that determines how big the handle is as a fraction of the whole scrollbar length; the Scroll Rect component sets this automatically. The Scrollbar can be either horizontal or vertical. It also has an OnValueChanged UnityEvent that defines what it will do when the value is changed.

Dropdown
- A Dropdown has a list of options to select from. A text string and optionally an image can be specified for each option, and these can be set either in the Inspector or dynamically from code. It has an OnValueChanged UnityEvent that defines what it will do when the currently chosen option is changed.

Input Field
- An Input Field is used to make the text of a Text Element editable by the user. It has a UnityEvent to define what it will do when the text content is changed, and another to define what it will do when the user has finished editing it.
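These UnityEvents can be wired up from code as well as in the Inspector. The following is only a sketch (the field references and log messages are illustrative, not from this book), showing listeners being attached for a Button, a Slider and an Input Field:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class UIEventHookup : MonoBehaviour
{
    // Assign these references in the Inspector (illustrative names).
    public Button button;
    public Slider slider;
    public InputField inputField;

    void Start()
    {
        // Button: OnClick is invoked when the button is clicked.
        button.onClick.AddListener(() => Debug.Log("Button clicked"));

        // Slider: OnValueChanged passes the new value to the listener.
        slider.onValueChanged.AddListener(v => Debug.Log("Slider: " + v));

        // Input Field: one event fires while the text changes,
        // another when the user has finished editing.
        inputField.onValueChanged.AddListener(t => Debug.Log("Typing: " + t));
        inputField.onEndEdit.AddListener(t => Debug.Log("Finished: " + t));
    }
}
```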
Scroll Rect (Scroll View)
- A Scroll Rect is usually used to scroll a large image or panel of content. A Scroll Rect can be used when content that takes up a lot of space needs to be displayed in a small area; the Scroll Rect provides the functionality to scroll over this content.
- Usually a Scroll Rect is combined with a Mask in order to create a scroll view, where only the scrollable content inside the Scroll Rect is visible. It can additionally be combined with one or two Scrollbars that can be dragged to scroll horizontally or vertically.

7.10.3 Animation Integration

- The Animation transition mode requires an Animator Component attached to the controller element. You can set this up automatically by clicking Auto Generate Animation; it creates an Animator Controller with the states already set up, which will need to be saved.
- This controller stores the animations for the controller's transitions, and these can be customized if preferred. The transition states are: Normal Trigger (Normal), Highlighted Trigger (Highlighted), Pressed Trigger (Pressed) and Disabled Trigger (Disabled).
- For example, if a Button element is selected with an Animator Controller attached, you can edit the animations for each of the button's states by opening the Animation window (Window > Animation).
- There is an Animation Clip pop-up menu from which you select the preferred clip: "Normal", "Highlighted", "Pressed" or "Disabled".
- The Normal state is set by the values on the button element itself and can be left empty. For all other states, the most common configuration is a single keyframe at the start of the timeline. The transition animation between states will be handled by the Animator.
- For example, to change the width of the button in the Highlighted state, select the Highlighted state from the Animation Clip pop-up menu and, with the play head at the start of the timeline:
1. Select the Record button.
2. Change the width of the button in the Inspector.
3. Exit record mode.
- Change to Play mode to see how the button grows when highlighted.

7.10.4 Auto Layout

- The auto layout system offers ways to place elements in nested layout groups, for instance horizontal groups, vertical groups, or grids. It also permits elements to be automatically sized according to the contained content; for example, a button can dynamically resize to exactly fit its text content plus some padding.
- The auto layout system is built on top of the basic Rect Transform layout system. It can optionally be used on some or all elements.

7.10.4.1 Understanding Layout Elements

- The auto layout system consists of layout elements and layout controllers. A layout element is a Game Object with a Rect Transform and optionally other components as well.
- A layout element has certain knowledge about which size it should have. Layout elements do not directly set their own size; other components that function as layout controllers use the information the layout elements offer to calculate a size for them. The layout controller components are the Content Size Fitter and the Layout Group components. A layout element has properties that define its own: minimum width, minimum height, preferred width, preferred height, flexible width and flexible height.

7.10.4.2 Layout Element Component

- By adding a Layout Element component to a Game Object you can override the minimum, preferred, or flexible size: tick the checkbox for the property you desire to override and then specify the value you desire to override it with. The overridable properties are Min Width, Min Height, Preferred Width, Preferred Height, Flexible Width and Flexible Height.

7.10.4.3
Understanding Layout Controllers

- Layout controllers control the sizes and positions of one or more layout elements, that is, Game Objects with Rect Transforms on them. A layout controller can control its own layout element or child layout elements.
- A component that functions as a layout controller may also itself function as a layout element at the same time.

7.10.4.4 Content Size Fitter

- The Content Size Fitter functions as a layout controller and controls the size of its own layout element. To see the auto layout system in action, add a Content Size Fitter component to a Game Object with a Text component.
- If you set either the Vertical Fit or the Horizontal Fit to Preferred, the Rect Transform will adjust its width and/or height to fit the Text content.

> Aspect Ratio Fitter
- The Aspect Ratio Fitter functions as a layout controller that controls the size of its own layout element. It can adjust the height to fit the width or vice versa, and it can make the element fit inside its parent or envelope its parent. The Aspect Ratio Fitter does not take layout information such as minimum size and preferred size into account.

7.10.4.5 Layout Groups

- A Layout Group controller controls the sizes and positions of its child layout elements. For example, a Horizontal Layout Group places its children beside each other, and a Grid Layout Group places its children in a grid. A layout group does not control its own size; instead it functions as a layout element itself that can be controlled by other layout controllers.

7.10.4.6 Driven Rect Transform Properties

- The Rect Transform has driven properties. For example, a Content Size Fitter with its Horizontal Fit property set to Minimum or Preferred drives the width of the Rect Transform on the same Game Object.
- The width will show as read-only, and a small info box at the top of the Rect Transform will tell you that one or more properties are driven by the Content Size Fitter. Driven Rect Transform properties are changed by the driving component, which controls the size or placement of the layout element.

7.10.5 Rich Text

- Rich Text is supported by the UI system as well as by the legacy GUI system. The Text, GUIText, GUIStyle, and TextMesh classes have a Rich Text setting that instructs Unity to look for markup tags within the text.

> Markup format
- The markup system is inspired by HTML. The text is enclosed inside a pair of matching tags:

They are <b>not</b> laughing

- The tags are delimited by the angle-bracket characters < and >. The text inside a tag represents its name; the ending tag has the same name as the starting tag, with the slash character / added. The information enclosed in the tags is displayed, but not the tags themselves. The b tag used in the example above applies boldface to the word "not", so the text will appear onscreen as:

They are not laughing

- A marked-up section of text (including the tags that enclose it) is referred to as an element.

> Nested elements
- You can apply more than one style to a section of text by nesting elements one inside another:

We are <b><i>definitely not</i></b> amused

> Tag parameters
- A few tags permit variation of the text. For example, the color tag needs to know which color to apply, and this information is added by using a parameter:

Trees are <color=green>green</color> in the environment.

- We have the b tag for boldface, the i tag for italic, the size tag to set the size and the color tag to set the color.

> material
- This tag is useful for text meshes and renders a section of text with a material given by the parameter. The value is an index into the text mesh's array of materials, as shown by the Inspector:
They are <material=2>texturally</material> laughing

> quad
- This tag is used for text meshes and renders an image inline with the text. It takes parameters that give the material to use for the image, the image height in pixels, and a further four that denote a rectangular area of the image to display.

> Editor GUI
- In the editor GUI system, Rich Text is disabled by default. You can enable it using a custom GUIStyle: first set the richText property to true, then pass the style to the GUI function in question.

Syllabus Topic : Navigation and Path Finding

7.11 Navigation and Path Finding

- Navigation is used to move the characters around the game world. The navigation mesh is created automatically from the scene geometry. Dynamic obstacles are used to change the navigation of a character at runtime, and off-mesh links are used to create specific actions like jumping down or opening a door.
- The Unity navigation system consists of the NavMesh, the NavMesh Agent component, the Off-Mesh Link and the NavMesh Obstacle.
- In the navigation system you have to define the walkable area for the character. Walkable locations are created and connected into a surface lying on top of the scene geometry, called the navigation mesh. The surface is stored as convex polygons. To find the path between two locations, the start and end locations are first mapped to their nearest polygons. The sequence of polygons that leads from the start polygon to the destination polygon is called a corridor; the character follows the path by steering along this corridor while avoiding obstacles.

> Creating a NavMesh
- To create a NavMesh, click menu: Window > Navigation. There are 4 steps for building a NavMesh for your scene:
1. Select the scene geometry that forms walkable surfaces and the obstacles that should affect the navigation.
2. Check Navigation Static to include the selected objects in the NavMesh baking process.
3. Adjust the bake settings to match your agent size:
- Agent Radius defines how close the agent center can get to a wall or a ledge.
- Agent Height defines how low the spaces are that the agent can reach.
- Max Slope defines how steep the ramps are that the agent can walk up.
- Step Height defines how high the obstacles are that the agent can step on.
4. Click Bake to build the NavMesh.

- The blue overlay in the diagram is the NavMesh on top of the underlying level geometry. It is visible whenever the Navigation window is open.
- After baking is done you will find a NavMesh asset file inside a folder with the same name as the scene the NavMesh belongs to. For example, if your scene named First Level is in the Assets folder, the NavMesh is at Assets > First Level > NavMesh.asset.
- You can mark an object as Navigation Static in the Navigation window, or use the Static dropdown at the top of the Inspector (alongside Lightmap Static, Occluder Static, Batching Static, Occludee Static and Off Mesh Link Generation); the latter is convenient if you do not want to keep the Navigation window open.

7.11.1 NavMesh Building Components

- The NavMesh building components are not part of the standard Unity install; you download them from the Unity Technologies GitHub and install them separately. There are 4 components to use with NavMesh: NavMesh Surface, NavMesh Modifier, NavMesh Modifier Volume and NavMesh Link.

> Installation of NavMesh building components
1. Download Unity 5.6 or later and install it.
2. Download the repository from the NavMesh Components page on the Unity Technologies GitHub by clicking the green download button; you can also clone it by clicking Clone.
3. Using Unity, open the NavMesh Components project. Alternatively, copy the contents of the Assets/NavMeshComponents folder into an existing project.

- For using the NavMesh Surface component, navigate to GameObject > AI > NavMesh Surface. It will create an empty GameObject with the component attached.
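Once the NavMesh Components package described above has been imported, the surface can also be baked from a script rather than with the Bake button. A minimal sketch, assuming the package's NavMeshSurface class (which it places in the UnityEngine.AI namespace):

```csharp
using UnityEngine;
using UnityEngine.AI;

public class RuntimeBaker : MonoBehaviour
{
    // The surface created via GameObject > AI > NavMesh Surface,
    // assigned in the Inspector (illustrative field name).
    public NavMeshSurface surface;

    void Start()
    {
        // Collects the scene geometry according to the surface's
        // settings and builds the NavMesh; this is the scripted
        // equivalent of pressing Bake.
        surface.BuildNavMesh();
    }
}
```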
Fig. 7.11.1 : A NavMesh Surface component open in the Inspector window

- You can now adjust the settings of the NavMesh Surface component to filter the input geometry on a broad scale.

> Advanced settings

Fig. 7.11.2 : The NavMesh Surface Advanced settings panel

- The Advanced Debug Visualization options (Input Geometry, Regions, Raw and Simplified Contours, Polygon Meshes) reveal the intermediate stages of building the NavMesh.

Fig. : The NavMesh Surface Inspector with Debug Visualization options

- The settings under Debug Visualization are used to diagnose any problems encountered during the NavMesh building process.
- The individual tickboxes show each step of the NavMesh building process, including input scene voxelization (Input Geometry), region splitting (Regions), contour generation (Contours) and the NavMesh polygons (Polygon Meshes).

7.11.2 NavMesh Modifier

- NavMesh Modifiers are not in the standard Unity install. For using the NavMesh Modifier component, navigate to GameObject > AI > NavMesh Modifier. In Fig. 7.11.4, the platform at the bottom right has a modifier attached to it, with its Area Type set to Lava.

Fig. 7.11.4 : A NavMesh Modifier component open in the Inspector window

7.11.3 NavMesh Modifier Volume

- A NavMesh Modifier Volume is used to modify walkable surfaces within a volume. For using the NavMesh Modifier Volume component, navigate to GameObject > AI > NavMesh Modifier Volume.
Fig. 7.11.4 : A NavMesh Modifier Volume component open in the Inspector window

7.11.4 NavMesh Link

- The NavMesh Link component is not in the standard Unity install. For using the NavMesh Link component, navigate to GameObject > AI > NavMesh Link.

Fig. 7.11.5 : A NavMesh Link component open in the Inspector window

> Connecting multiple NavMesh Surfaces together
- You can see two different surfaces in the image which are connected by the NavMesh Link.

7.11.5 Creating a NavMesh Agent

- First we create a character which can move and navigate. For building the agent we use a NavMesh Agent component and a simple script.

7.11.6 Creating the Character

1. Create a cylinder by clicking on: GameObject > 3D Object > Cylinder.
2. Keep the default cylinder dimensions (height 2 and radius 0.5); they are good for a humanoid-shaped agent.
3. Add a NavMesh Agent component by clicking on: Component > Navigation > NavMesh Agent.

Now write the script for navigation and path finding:

// MoveTo.cs
using UnityEngine;
using UnityEngine.AI;
using System.Collections;

public class MoveTo : MonoBehaviour
{
    public Transform goal;

    void Start()
    {
        NavMeshAgent agent = GetComponent<NavMeshAgent>();
        agent.destination = goal.position;
    }
}

Now, to build a simple setup which permits you to send your character to a destination given by another Game Object, with a Sphere as the destination to move to:
1. Create a new C# script (MoveTo.cs) and replace its contents with the above script.
2. Assign the MoveTo script to the character you have just created.
3. Create a sphere; this will be the destination the agent will move to.
4.
Move the sphere away from the character to a position that is close to the NavMesh surface.
5. Select the character, locate the MoveTo script, and assign the Sphere to the Goal property.
6. Press Play; you should see the agent navigating to the location of the sphere.

7.11.7 Creating an Off-mesh Link

Off-Mesh Links are used for describing, for example, a jump from an upper platform to the ground.
1. Create two cylinders by clicking on: GameObject > 3D Object > Cylinder.
2. To make the cylinders easy to work with, scale them to (0.1, 0.5, 0.1).
3. Move the first cylinder to the edge of the top platform, close to the NavMesh surface.
4. Place the second cylinder on the ground, close to the NavMesh, at the place where the link should land.
5. Select the first cylinder and add an Off-Mesh Link component to it: in the Inspector select Add Component and choose Navigation > Off Mesh Link.
6. Assign the leftmost cylinder to the Start field and the rightmost cylinder to the End field.

7.11.8 Navigation Areas and Costs

Areas which have lower cost are preferred during path finding. Unity uses the A* algorithm to calculate the shortest path on the NavMesh from these costs.

Syllabus Topic : Publishing

7.12 Publishing

Unity provides support to publish our project to any platform we require. To publish your project, follow these steps.

Go to File and then click on Build Settings (you can also use Build & Run).

The Build Settings window will appear.

Let's say we want to publish for Android. Select the platform as Android, then click on Player Settings. You will notice that the Inspector now looks similar to the screen below.
The Player Settings in the Inspector show the Company Name, Product Name, Default Icon, Default Cursor and Cursor Hotspot, followed by the Settings for Android panels: Resolution and Presentation, Icon, Splash Image, Other Settings, Publishing Settings and XR Settings (Virtual Reality Supported, ARCore Supported, Vuforia Augmented Reality).

Then click on Other Settings and some options will appear. In the Identification section, fill in the Package Name, which looks like com.Atharva.MyFirstProject, and set the Minimum API Level for your application (i.e. the lowest version of Android on which your project will run properly). This section also lists Version, Bundle Version Code and Target API Level, and the Configuration section lists options such as Scripting Runtime Version (.NET 3.5 Equivalent), Scripting Backend and Api Compatibility Level.

Then click on Build if you want to save the output file on your computer, or else Build And Run to directly run it on your Android device (but for that your device must be connected to your computer, with USB debugging enabled on the device).

A Save As window will then appear, similar to the one given below. Name the output file and browse to the location where you want to store your project output. Your .apk output file is then ready to launch.

Review Questions

Q. 1 Write down the steps in Unity installation.
Q. 2 How do you add an Asset in Unity?
- Write a short note on Physics in Unity.
- Write a short note on scripting.
- How do you perform animation in Unity?
- How do you create keyframes in preview mode?
- How do you create keyframes manually?
Q. 10 Write a short note on multiplayer and networking.
- What are the different Render Mode settings used in Canvas?
- Explain the basic layout for an image in Unity.
- Explain the visual components of the UI.
- Write the steps to create a NavMesh.
- Explain publishing in Unity.

Chapter Ends.

CHAPTER 8
Scripting

Syllabus

Scripting : Scripting Overview, Scripting Tools and Event Overview.

Syllabus Topic : Scripting Overview

8.1 Scripting Overview

Here we will study how objects created in Unity are controlled using scripts.

8.1.1 Creating and Using Scripts

- The components attached to GameObjects control the behaviour of those GameObjects. Unity permits you to create your own components by using scripts. Scripts help to trigger game events, modify component properties over time and respond to user input in any way you wish.
- Unity supports two programming languages: 1) C# and 2) UnityScript. In addition to these, many other .NET languages can be used with Unity if they can compile a compatible DLL.

8.1.2 Creating Scripts

- To create a script, click on the Create menu at the top left of the Project panel, or select Assets > Create > C# Script (or JavaScript) from the main menu. The script is created in the folder selected by you in the Project panel. You name the script file when prompted; the script file name is used to create the initial text inside the file.

8.1.3 Anatomy of a Script File

- When you double-click a script Asset in Unity, it will be opened in a text editor. By default Unity will use Visual Studio, but you can select any editor you like from the External Tools panel in Unity's preferences (go to Unity > Preferences).
- The initial contents of the file will look something like this:

using UnityEngine;

public class First : MonoBehaviour
{
    // Use this for initialization
    void Start () {
    }

    // Update is called once per frame
    void Update () {
    }
}

- A script in Unity creates a connection with the internal workings of Unity by implementing a class which derives from the built-in class called MonoBehaviour.
- A class is like a blueprint for creating a new component type that can be attached to GameObjects. Every time you add a script component to a GameObject, it generates a new instance of the object defined by the blueprint. The name of the class and the name of the script file must be the same.
- As you can see, there are two functions inside the class: Start() and Update().

> Start()
- The Start() function is called by Unity before gameplay begins. Initialization takes place in Start().

> Update()
- The Update() function contains the code which handles the frame update for the GameObject: triggering actions, movement, and responding to user input, basically anything that needs to be handled over time during gameplay.
- A UnityScript script works a little differently from a C# script. The Update and Start functions have the same meaning, but the class is not explicitly declared. The script itself is understood to define the class; it implicitly derives from MonoBehaviour and takes its name from the filename of the script asset.

> Controlling a GameObject
- To activate the code of your script you have to create an instance of the script and attach it to a GameObject. You can attach a script by dragging the script asset onto a GameObject in the Hierarchy panel, or onto the Inspector of the GameObject that is currently selected.
- You can also use the Scripts submenu on the Component menu, which contains all the scripts available in the project, including the scripts created by you. The script instance looks much like any other component in the Inspector.
- After attaching the instance, your script will start working when you press Play and run the game. To check this, put the following code in the Start function:

// Use this for initialization
void Start()
{
    Debug.Log("I am alive!");
}

- Debug.Log is a command used to print a message to Unity's console output.
- If you press Play now, you should see the message at the bottom of the main Unity editor window and in the Console window (menu: Window > Console).

8.1.4 Variables and the Inspector

- While writing a script you are creating your own new type of component. Like the properties of other components, your component's properties can be edited in the Inspector.

using UnityEngine;
using System.Collections;

public class MainPlayer : MonoBehaviour
{
    public string myName;

    // Use this for initialization
    void Start()
    {
        Debug.Log("I am alive and my name is " + myName);
    }

    void Update()
    {
    }
}

- The above code creates an editable field in the Inspector labelled "My Name" on the Main Player (Script) component.
- The Inspector label created by Unity has a space wherever a capital letter occurs in the variable name. This label is only used for display purposes, so always use the variable name within your code. If you edit the name and then press Play, you will see that the console message contains the text you entered.
- In Unity, variables are public by default and shown in the Inspector; you have to mark variables private explicitly to hide them. In C# you must declare a variable as public to see it in the Inspector; in UnityScript a variable is hidden like this:

#pragma strict

private var invisibleVar : int;

function Start()
{
}
Accessing Components Many times a script require accessing the other components which are attached to the same GameObject. As you know that Component is an instance of a class, so first obtain the reference of the Component instance which you want to work with. You can use GetComponent function for this. So, you need to assign the Component object to a variable. In C# is done using the following syntax : void Start() Rigidbody rb = GetComponent (); In Unity Script, the syntax is subtly different : function Start() var rb = GetComponent.< Rigidbody > (); } Game Programming (MU BSc. Comp.-Sem-V) 8-6 After getting a reference to a Component instance set the values of its properties in Inspector: void Start() Rigidbody rb = GetConponent() 5 // Change the aass of the object's Rigidbody. rb.nass = 16f; ~The feature of calling functions on Component instances is not available in the Inspector: void Start) Rigidbody rb = GetConponent(); 11 bdd a force to the Rigidbody. rb.AddForce(Vector3.up * 1éf); 8.1.5.2 Accessing Other Objects = Other objects are sometimes operating in isolation, Scripts keep the track of other objects. For example, a pursuing enemy might want to identify the position of the player. There ‘are many ways in Unity to retrieve other objects, each suitable to certain situations. 8.1.6 Linking GameObjects with Variables = To find a associated GameObject the simple way is to add a public GameObject variable to the script: public class { public GaneObject players // Other variables and functions... ¥ This variable will be visible in the Inspector like any other: a [Cheney None (Game Object. ‘Sem-V) Seri (MU BSc. 
- Now drag an object from the Scene or Hierarchy panel onto this variable to assign it. As with any other object, the component-access variables and the GetComponent function are available for it, so you can use code like the following:

public class Enemy : MonoBehaviour
{
    public GameObject player;

    void Start()
    {
        // Start the enemy ten units behind the player character.
        transform.position = player.transform.position - Vector3.forward * 10f;
    }
}

- If you declare a public variable of a Component type in a script, you can drag any GameObject that has that Component attached onto it. This accesses the Component directly rather than the GameObject itself:

public Transform playerTransform;

- Linking objects together with variables is most useful when you are dealing with individual objects that have permanent connections. An array variable can be used to link several objects of the same type, but the connections must still be made in the Unity editor rather than at runtime. It is often convenient to locate objects at runtime instead, and Unity provides two basic ways to do this, as given below.

1. Finding child GameObjects

- Sometimes a game Scene makes use of many GameObjects of the same type, for example enemies, obstacles and waypoints. These objects may need to be tracked by a particular script that supervises or reacts to them (for example, every waypoint may need to be accessible to a path-finding script). Linking these GameObjects with variables is possible, but the design process becomes tedious if every new waypoint has to be dragged to a variable on a script. Similarly, if a waypoint is deleted, it is a pain to have to remove the variable reference to the missing GameObject. To avoid this, you can manage a set of GameObjects by making them all children of one parent GameObject.
The child GameObjects can be easily retrieved using the parent's Transform component:

public class WaypointManager : MonoBehaviour
{
    public Transform[] waypoints;

    void Start()
    {
        waypoints = new Transform[transform.childCount];
        int i = 0;

        foreach (Transform t in transform)
        {
            waypoints[i++] = t;
        }
    }
}

- By using the Transform.Find function you can also locate a specific child object:

transform.Find("Gun");

- This is useful when an object has a child that can be added and removed during gameplay. A weapon that can be picked up and put down is a good example of this.

2. Finding GameObjects by Name or Tag

- If you have some information about an object, it is easy to locate it anywhere in the Scene hierarchy. By using the GameObject.Find function you can retrieve an individual object:

GameObject player;

void Start()
{
    player = GameObject.Find("MainHeroCharacter");
}

- You can also locate an object or a set of objects by their tag, using the functions GameObject.FindWithTag and GameObject.FindGameObjectsWithTag:

GameObject player;
GameObject[] enemies;

void Start()
{
    player = GameObject.FindWithTag("Player");
    enemies = GameObject.FindGameObjectsWithTag("Enemy");
}

8.1.7 Event Functions

- When you run a program in Unity, Unity passes control to a script intermittently by calling certain functions that are declared within it.
- When a function finishes executing, control is passed back to Unity. Such functions are called event functions, as they are activated by Unity in response to events that happen during gameplay. Each event function is identified by its name.
- The MonoBehaviour class reference gives the full list of these functions. Let's see some important events.
Fig. 8.1.1

1. Regular Update Events

- A game is built from animation frames, and in games programming changes to the position, state and behaviour of objects are made just before each frame is rendered. Such code is written in the Update function. Update is called before the frame is rendered and before animations are calculated :

void Update()
{
    float distance = speed * Time.deltaTime * Input.GetAxis("Horizontal");
    transform.Translate(Vector3.right * distance);
}

- The physics engine also updates in discrete time steps, in a similar way to the frame rendering, and the FixedUpdate event function is called just before each physics update. Since physics updates and frame updates do not occur at the same frequency, you will get more accurate results if you place physics code in FixedUpdate rather than Update :

void FixedUpdate()
{
    Vector3 force = transform.forward * driveForce * Input.GetAxis("Vertical");
    rigidbody.AddForce(force);
}

- You can also make changes after the Update and FixedUpdate functions have been called for all objects in the scene and after all animations have been calculated. An example is where a camera should stay trained on a target object; the adjustment to the camera's orientation must be made after the target object has moved. You can use the LateUpdate function in such situations :

void LateUpdate()
{
    Camera.main.transform.LookAt(target.transform);
}

2. Initialization Events

- Initialization code should always be called before any updates occur during gameplay. The Start function is called before the first frame or physics update on an object. The Awake function is called for each object in the scene at the time the scene loads. Although the Start and Awake functions of different objects are called in arbitrary order, all of the Awakes will have finished before the first Start is called. This means a Start function can make use of initializations previously carried out in the Awake phase.

3. GUI Events

- Unity also has a system for rendering GUI controls over the main action in the scene and responding to clicks on these controls.
- This code is placed in the OnGUI function, which is called periodically :

void OnGUI()
{
    GUI.Label(labelRect, "Game Over");
}

- You can also detect mouse events that occur over a GameObject as it appears in the scene. This can be used for targeting weapons or displaying information about the character currently under the mouse pointer. A set of OnMouseXXX event functions (e.g., OnMouseOver, OnMouseDown) is available to allow a script to react to user actions with the mouse. For instance, if the mouse button is pressed while the pointer is over a particular object, an OnMouseDown function in that object's script will be called if it exists.

4. Physics Events

- Collisions against an object are detected by the physics engine, which reports them by calling event functions on that object's script. The functions OnCollisionEnter, OnCollisionStay and OnCollisionExit are called as contact is made, held and broken. The corresponding functions OnTriggerEnter, OnTriggerStay and OnTriggerExit are called when the object's collider is configured as a Trigger (i.e., a collider that simply detects when something enters it rather than reacting physically). These may be called several times in succession if more than one contact is detected during the physics update, and so a parameter is passed to the function giving details of the collision (position, identity of the incoming object, and so forth) :

void OnCollisionEnter(Collision otherObj)
{
    if (otherObj.gameObject.tag == "Arrow")
    {
        ApplyDamage(10);
    }
}

8.2 Time and Frame Rate Management

- The Update function allows a script to monitor input and other events regularly and take suitable action, for example moving a character when the "forward" key is pressed.

- Keep in mind that the frame rate of a game is not constant, and neither is the length of time between Update function calls, which matters when handling time-based actions.

- For example, consider the task of moving an object forward steadily, one frame at a time.
- It might seem at first that you should shift the object by a fixed distance each frame :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    public float distancePerFrame;

    void Update()
    {
        transform.Translate(0, 0, distancePerFrame);
    }
}

// JS script example
var distancePerFrame: float;

function Update()
{
    transform.Translate(0, 0, distancePerFrame);
}

- Given a frame time of 10 milliseconds, the object will move forward by distancePerFrame one hundred times per second. But if the frame time increases to 25 milliseconds (due to CPU load, say), the object will only move forward forty times a second and so cover less distance in the same time.

- The solution is to scale the size of the movement by the frame time, which you can read from the Time.deltaTime property. The distance is then given by distancePerSecond rather than distancePerFrame. As the frame rate varies, the size of the movement step will vary accordingly, and thus the object's speed will be constant :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    public float distancePerSecond;

    void Update()
    {
        transform.Translate(0, 0, distancePerSecond * Time.deltaTime);
    }
}

// JS script example
var distancePerSecond: float;

function Update()
{
    transform.Translate(0, 0, distancePerSecond * Time.deltaTime);
}

8.3 Fixed Timestep

- For the accuracy and consistency of the simulation, Unity's physics system works to a fixed timestep. At the beginning of the physics update, Unity sets an "alarm" by adding the fixed timestep value onto the time when the last physics update finished.

- The physics system then performs computations until the alarm goes off. You can modify the size of the fixed timestep from the Time Manager, and you can read it from a script using the Time.fixedDeltaTime property. A smaller timestep results in more frequent physics updates and a more precise simulation, but at the cost of greater CPU load.
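As a rough illustrative sketch (the property names are from the Unity API, but the specific values and logging here are only for demonstration), the fixed timestep can be read and adjusted from a script :

```csharp
// Illustrative sketch: reading and adjusting the physics timestep from script.
using UnityEngine;

public class TimestepInfo : MonoBehaviour
{
    void Start()
    {
        // The default fixed timestep is 0.02 s, i.e. 50 physics updates per second.
        Debug.Log("Physics timestep: " + Time.fixedDeltaTime);

        // Halving the timestep doubles the physics update rate
        // (and roughly doubles the physics CPU cost).
        Time.fixedDeltaTime = 0.01f;
    }
}
```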
8.3.1 Maximum Allowed Timestep

- The fixed timestep creates a problem where a game makes heavy use of physics and the frame rate of the gameplay has become low. The main frame update processing has to be "squeezed" in between the regular physics updates, and if there is a lot of processing to do then several physics updates can take place during a single frame. Since the frame's time, object positions and other properties are frozen at the start of the frame, the graphics can get out of sync with the more frequently updated physics.

- CPU power cannot always be spared, but Unity provides an option to slow down physics time and let the frame processing catch up. In the Time Manager, the Maximum Allowed Timestep setting puts a limit on the amount of time Unity will spend processing physics and FixedUpdate calls during a given frame update.

- If a frame update takes longer than Maximum Allowed Timestep to process, the physics engine will "stop time" and let the frame processing catch up. After the frame update has finished, the physics resumes as though no time has passed since it was stopped. The result of this is that rigid bodies will not move perfectly in real time as they usually do but will be slowed slightly. However, the physics "clock" will track them as though they were moving normally.

- The slowing of physics time is usually not noticeable and is an acceptable trade-off against gameplay performance.

8.3.2 Time Scale

- For special effects such as "bullet-time", it is useful to slow the passage of game time so that animations and script responses happen at a reduced rate. Furthermore, you may sometimes want to freeze game time completely, as when the game is paused.

- Unity has a Time Scale property that controls how fast game time proceeds relative to real time. Set the scale to 1 to make game time match real time.
- If you set the scale to 2 then game time will pass twice as quickly, while a scale of 0.5 will slow gameplay down to half speed. A value of 0 will make time "stop" completely.

- Note that the time scale does not actually slow execution; it simply changes the time step reported to the Update and FixedUpdate functions via Time.deltaTime and Time.fixedDeltaTime. You can set the global time scale in the Time Manager, or from a script using the Time.timeScale property :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    void Pause()
    {
        Time.timeScale = 0;
    }

    void Resume()
    {
        Time.timeScale = 1;
    }
}

// JS script example
function Pause()
{
    Time.timeScale = 0;
}

function Resume()
{
    Time.timeScale = 1;
}

8.3.3 Capture Framerate

- When you want to record gameplay as a video, saving the screen images takes significant time, which reduces the framerate of the game.

- To overcome this problem, Unity provides the Capture Framerate property. When you set this property to a value other than zero, game time is slowed and the frame updates are issued at exact regular intervals.

- The interval between frames is equal to 1 / Time.captureFramerate, so if the value is set to 5.0 then updates occur every fifth of a second. With the demands on framerate effectively reduced, you have time in the Update function to save screenshots :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    // Capture frames as a screenshot sequence. Images are
    // stored as PNG files in a folder - these can be combined into
    // a movie using image utility software (eg, QuickTime Pro).

    // The folder to contain our screenshots.
    // If the folder exists we will append numbers to create an empty folder.
    string folder = "ScreenshotFolder";
    int frameRate = 25;

    void Start()
    {
        // Set the playback framerate (real time will not relate to game time after this).
        Time.captureFramerate = frameRate;

        // Create the folder.
        System.IO.Directory.CreateDirectory(folder);
    }

    void Update()
    {
        // Append filename to folder name (format is '0005 shot.png').
        string name = string.Format("{0}/{1:D04} shot.png", folder, Time.frameCount);

        // Capture the screenshot to the specified file.
        Application.CaptureScreenshot(name);
    }
}

// JS script example
// Capture frames as a screenshot sequence. Images are
// stored as PNG files in a folder - these can be combined into
// a movie using image utility software (eg, QuickTime Pro).

// The folder to contain our screenshots.
// If the folder exists we will append numbers to create an empty folder.
var folder = "ScreenshotFolder";
var frameRate = 25;

function Start()
{
    // Set the playback framerate (real time will not relate to game time after this).
    Time.captureFramerate = frameRate;

    // Create the folder.
    System.IO.Directory.CreateDirectory(folder);
}

function Update()
{
    // Append filename to folder name (format is '0005 shot.png').
    var name = String.Format("{0}/{1:D04} shot.png", folder, Time.frameCount);

    // Capture the screenshot to the specified file.
    Application.CaptureScreenshot(name);
}

8.4 Creating and Destroying GameObjects

- Some objects in a game are constant, but many are created and deleted during gameplay.

- In Unity, a GameObject can be created using the Instantiate function, which makes a new copy of an existing object :

// C# script example
public GameObject enemy;

void Start()
{
    for (int i = 0; i < 5; i++)
    {
        Instantiate(enemy);
    }
}

- The object whose copy you are creating must be available to the script, and Instantiate will copy all the Components present on the original.

- The Destroy function destroys an object after the frame update has finished, or optionally after a specified time delay :

void OnCollisionEnter(Collision otherObj)
{
    if (otherObj.gameObject.tag == "Missile")
    {
        Destroy(gameObject, 0.5f);
    }
}

- Note that the Destroy function can also destroy individual components without affecting the GameObject itself.
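As a small illustrative sketch of destroying a component rather than a whole object (the choice of the Rigidbody component here is only an example), you might write :

```csharp
// Illustrative sketch: Destroy can target a single component.
using UnityEngine;

public class RemovePhysics : MonoBehaviour
{
    void Start()
    {
        // Destroys only the Rigidbody component; the GameObject
        // and its other components remain in the scene.
        Destroy(GetComponent<Rigidbody>());
    }
}
```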
8.4.1 Coroutines

- When you call a function, it runs to completion before returning. This effectively means that any action taking place in a function must happen within a single frame update; a function call cannot be used to contain a procedural animation or a sequence of events over time. As an example, consider the task of gradually reducing an object's alpha (opacity) value until it becomes completely invisible. A common mistake is to write something like :

void Fade()
{
    for (float f = 1f; f >= 0; f -= 0.1f)
    {
        Color c = renderer.material.color;
        c.a = f;
        renderer.material.color = c;
    }
}

- The Fade function will not have the expected effect. For the fading to be visible, the alpha must be reduced over a sequence of frames so that the intermediate values are rendered. However, the function executes in its entirety within a single frame update; the intermediate values are never seen and the object disappears instantly.

- Situations like this can be handled by adding code to the Update function that executes the fade on a frame-by-frame basis, but it is often more convenient to use a coroutine for this kind of task. A coroutine is like a function that has the ability to pause execution and return control to Unity, but then continue where it left off on the following frame. In C#, a coroutine is declared as follows :

IEnumerator Fade()
{
    for (float f = 1f; f >= 0; f -= 0.1f)
    {
        Color c = renderer.material.color;
        c.a = f;
        renderer.material.color = c;
        yield return null;
    }
}

- It is essentially a function declared with a return type of IEnumerator and with a yield return statement included somewhere in the body. The yield return line is the point at which execution will pause and be resumed the following frame. Use the StartCoroutine function to set a coroutine running :

void Update()
{
    if (Input.GetKeyDown("f"))
    {
        StartCoroutine("Fade");
    }
}

- In UnityScript, any function that contains the yield statement is understood to be a coroutine, and the IEnumerator return type need not be declared explicitly :

function Fade()
{
    for (var f = 1.0; f >= 0; f -= 0.1)
    {
        var c = renderer.material.color;
        c.a = f;
        renderer.material.color = c;
        yield;
    }
}

- Also, a coroutine can be started in UnityScript by calling it as if it were a normal function :

function Update()
{
    if (Input.GetKeyDown("f"))
    {
        Fade();
    }
}

- You may have noticed that the loop counter in the Fade function maintains its correct value over the lifetime of the coroutine. In fact, any variable or parameter will be correctly preserved between yields.

- By default, a coroutine is resumed on the frame after it yields, but it is also possible to introduce a time delay using WaitForSeconds :

// C# script example
IEnumerator Fade()
{
    for (float f = 1f; f >= 0; f -= 0.1f)
    {
        Color c = renderer.material.color;
        c.a = f;
        renderer.material.color = c;
        yield return new WaitForSeconds(0.1f);
    }
}

// JS script example
function Fade()
{
    for (var f = 1.0; f >= 0; f -= 0.1)
    {
        var c = renderer.material.color;
        c.a = f;
        renderer.material.color = c;
        yield WaitForSeconds(0.1);
    }
}

- This can be used as a way to spread an effect over a period of time, but it is also a useful optimization.

- Many tasks in a game need to be carried out periodically, and the most obvious way to do this is to include them in the Update function. However, this function will typically be called many times per second.

- When a task doesn't need to be repeated quite so frequently, you can put it in a coroutine to get an update regularly but not every single frame. An example of this might be an alarm that warns the player if an enemy is nearby. The code might look something like this :

function ProximityCheck()
{
    for (var i = 0; i < enemies.Length; i++)
    {
        if (Vector3.Distance(transform.position, enemies[i].transform.position) < dangerDistance)
        {
            return true;
        }
    }

    return false;
}

- If there are many enemies, then calling this function every frame might introduce significant overhead. Instead, you could use a coroutine to call it every tenth of a second :

IEnumerator DoCheck()
{
    for (;;)
    {
        ProximityCheck();
        yield return new WaitForSeconds(0.1f);
    }
}

- This greatly reduces the number of checks carried out without any noticeable effect on gameplay.

8.4.2 Namespaces

- As projects become larger and the number of scripts increases, the likelihood of clashes between script class names grows ever greater.

- For example, when multiple programmers work on different aspects of the same game separately, two programmers may give the same name to different scripts, producing a clash when they combine their projects. Naming conventions, or renaming a clashing class, can avoid this, but both are troublesome.

- The C# language offers the namespaces feature, which solves this problem. A namespace is simply a collection of classes that are referred to using a chosen prefix on the class name.

- The following example shows two classes, Controller1 and Controller2, that are members of a namespace called Enemy :

namespace Enemy
{
    public class Controller1 : MonoBehaviour
    {
    }

    public class Controller2 : MonoBehaviour
    {
    }
}

- In code, these classes are referred to as Enemy.Controller1 and Enemy.Controller2, respectively. You can also use multiple bracketed namespace sections around classes wherever they occur, even if those classes are in different source files.

- To avoid writing the namespace prefix repeatedly, simply add a using directive at the top of the file :

using Enemy;

- This line indicates that wherever the class names Controller1 and Controller2 are found, they should be taken to mean Enemy.Controller1 and Enemy.Controller2, respectively.
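As a small hypothetical sketch (the Spawner class and its member are inventions for illustration, building on the Enemy namespace above), a script that does not include the using directive refers to the classes by their fully qualified names :

```csharp
// Hypothetical sketch: referring to a namespaced class from another script
// without a 'using Enemy;' directive.
using UnityEngine;

public class Spawner : MonoBehaviour
{
    // Fully qualified reference to the class inside the Enemy namespace.
    public Enemy.Controller1 firstController;
}
```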
8.4.3 Attributes

- Attributes are markers that can be placed above a class, property or function in a script to indicate special behaviour. For example, the HideInInspector attribute can be added above a property declaration to prevent the property being shown in the inspector, even though it is public.

- In JavaScript, an attribute name begins with an "@" sign, while in C# it is enclosed in square brackets :

// JS script example
@HideInInspector
var strength: float;

// C# script example
[HideInInspector]
public float strength;

- Unity provides many attributes, which are listed in the script reference. There are also attributes defined in the .NET libraries that you can use in Unity code.

8.5 Execution Order of Event Functions

- In Unity scripting, a number of event functions are executed in a fixed order as a script executes. This execution order is described below.

8.5.1 First Scene Load

- The functions shown in the table are called when a scene starts (once for each object in the scene).

Awake : It is always called before any Start functions and also just after a prefab is instantiated.

OnEnable (only called if the object is active) : It is called just after the object is enabled. This happens when a MonoBehaviour instance is created, for example when a level is loaded or a GameObject with the script component is instantiated.

OnLevelWasLoaded : It is executed to inform the game that a new level has been loaded.

- Editor

Reset : Reset is called to initialize the script's properties when it is first attached to the object.

- Before the first frame update

Start : Start is called before the first frame update, but only if the script instance is enabled. For objects added to the scene, the Start function is called on all scripts before Update etc. are called for any of them. Naturally, this cannot be enforced when an object is instantiated during gameplay.
- In between frames

OnApplicationPause : When a pause is detected, OnApplicationPause is called at the end of the frame, effectively between the normal frame updates. One extra frame will be issued after OnApplicationPause is called, to permit the game to show graphics that indicate the paused state.

- Update Order

For keeping track of animations, camera positions, game logic, interactions etc., there are a few different events you can use, as shown in the table :

FixedUpdate : FixedUpdate is often called more frequently than Update. It can be called multiple times per frame if the frame rate is low, and it may not be called between frames at all if the frame rate is high. All physics calculations and updates occur immediately after FixedUpdate. When applying movement calculations inside FixedUpdate, you do not need to multiply your values by Time.deltaTime. This is because FixedUpdate is called on a reliable timer, independent of the frame rate.

Update : Update is called once per frame. It is the main workhorse function for frame updates.

LateUpdate : LateUpdate is called once per frame, after Update has finished. Any calculations performed in Update will have completed when LateUpdate begins. A common use for LateUpdate would be a following third-person camera. If you make your character move and turn inside Update, you can perform all camera movement and rotation calculations in LateUpdate. This will ensure that the character has moved completely before the camera tracks its position.

- Rendering

OnPreCull : This function is called before the camera culls the scene. Culling determines which objects are visible to the camera; OnPreCull is called just before culling takes place.

OnBecameVisible/OnBecameInvisible : It is called when an object becomes visible/invisible to any camera.
OnWillRenderObject : It is called once for each camera if the object is visible.

OnPreRender : It is called before the camera starts rendering the scene.

OnRenderObject : It is called after all regular scene rendering is done. You can use the GL class or Graphics.DrawMeshNow to draw custom geometry at this point.

OnPostRender : It is called after a camera finishes rendering the scene.

OnRenderImage : It is called after scene rendering is complete to allow post-processing of the image; see Post-processing Effects.

OnGUI : It is called multiple times per frame in response to GUI events. The Layout and Repaint events are processed first, followed by a Layout and keyboard/mouse event for each input event.

OnDrawGizmos : It is called for drawing gizmos in the scene view for visualization purposes.

- Coroutines

Normal coroutine updates are run after the Update function returns. A coroutine is a function that can suspend its execution (yield) until the given YieldInstruction finishes. Different uses of coroutines are as follows :

yield : The coroutine will continue after all Update functions have been called on the next frame.

yield WaitForSeconds : The coroutine will continue after a specified time delay, after all Update functions have been called for the frame.

yield WaitForFixedUpdate : The coroutine will continue after all FixedUpdate functions have been called on all scripts.

yield WWW : The coroutine will continue after a WWW download has completed.

yield StartCoroutine : It chains the coroutine, and will wait for the MyFunc coroutine to complete first.

8.5.2 When the Object Is Destroyed

OnDestroy : The OnDestroy function is called after all frame updates for the last frame of the object's existence.

- When Quitting

OnApplicationQuit : Called on all game objects before the application is quit. In the editor it is called when the user stops playmode.

OnDisable : Called when the behaviour becomes disabled or inactive.

- The memory needed to store an item is allocated from a central pool called the heap. When the item is no longer in use, the memory it once occupied can be reclaimed and used for something else.
- In the past, it was typically up to the programmer to allocate and release these blocks of heap memory explicitly with the appropriate function calls. Nowadays, runtime systems like Unity's Mono engine manage memory for you automatically.

- Incidentally, this is not necessarily how the data is really stored: the actual storage space for a large item is allocated from the heap, and a small "pointer" value is used to remember its location. From then on, only the pointer need be copied during parameter passing.

- Types that are stored directly and copied during parameter passing are called value types. These include booleans, integers, floats and Unity's struct types.

- Types that are allocated on the heap and then accessed via a pointer are called reference types, since the value stored in the variable merely "refers" to the real data. Examples of reference types include objects, strings and arrays.

8.5.5 Allocation and Garbage Collection

- The memory manager keeps track of the areas of the heap that it knows to be unused. When a new block of memory is requested, the memory manager selects an unused area to allocate the block from and then removes the allocated memory from its record of unused space.

- Subsequent requests are handled the same way until there is no free area large enough to allocate the required block size.

- At this point, the memory manager finds out which heap blocks are no longer in use: it searches the currently active reference variables and marks the blocks they refer to as "live".

- At the end of the search, any space between the live blocks is considered empty by the memory manager and can be used for subsequent allocations.

- The process of locating and freeing unused memory is known as garbage collection.

8.5.6 Optimization

- Garbage collection is automatic and invisible to the programmer, but the collection process actually requires significant CPU time behind the scenes.

- When used correctly, automatic memory management generally equals or beats manual allocation in overall performance.
- However, for this the programmer must avoid mistakes that trigger the collector more often than necessary and introduce pauses in execution. Some algorithms can be problematic for garbage collection; a notorious example is repeated string concatenation :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    string ConcatExample(int[] intArray)
    {
        string line = intArray[0].ToString();

        for (int i = 1; i < intArray.Length; i++)
        {
            line += ", " + intArray[i].ToString();
        }

        return line;
    }
}

// JS script example
function ConcatExample(intArray: int[])
{
    var line = intArray[0].ToString();

    for (i = 1; i < intArray.Length; i++)
    {
        line += ", " + intArray[i].ToString();
    }

    return line;
}

- The key point is that the new pieces are not added to the string in place. Each time around the loop, the prior contents of the line variable become dead and an entire new string is allocated to contain the original piece plus the new part at the end.

- Since the string gets longer with increasing values of i, the amount of heap space consumed also increases, and so it is easy to use up hundreds of bytes of free heap space every time this function is called.

- If you need to concatenate many strings together, a much better option is the Mono library's System.Text.StringBuilder class.

- Even repeated concatenation won't cause too much trouble unless it is called very often, and in Unity that typically implies the frame update. Something like :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    public GUIText scoreBoard;
    public int score;

    void Update()
    {
        string scoreText = "Score: " + score.ToString();
        scoreBoard.text = scoreText;
    }
}

// JS script example
var scoreBoard: GUIText;
var score: int;

function Update()
{
    var scoreText: String = "Score: " + score.ToString();
    scoreBoard.text = scoreText;
}

- ...will allocate new strings every time Update is called and create a constant drip of garbage.
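As a rough sketch of the StringBuilder approach mentioned above (the Append and ToString methods are from the standard .NET System.Text API), the concatenation loop can be rewritten so that only one string is allocated, at the end :

```csharp
// Sketch: building the list with StringBuilder avoids allocating
// a new intermediate string on every loop iteration.
using System.Text;

public class ConcatHelper
{
    public static string ConcatExample(int[] intArray)
    {
        StringBuilder line = new StringBuilder();
        line.Append(intArray[0]);

        for (int i = 1; i < intArray.Length; i++)
        {
            line.Append(", ");
            line.Append(intArray[i]);
        }

        // Only here is the final string actually allocated.
        return line.ToString();
    }
}
```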
- Most of that garbage can be avoided by updating the text only when the score changes :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    public GUIText scoreBoard;
    public string scoreText;
    public int score;
    public int oldScore;

    void Update()
    {
        if (score != oldScore)
        {
            scoreText = "Score: " + score.ToString();
            scoreBoard.text = scoreText;
            oldScore = score;
        }
    }
}

// JS script example
var scoreBoard: GUIText;
var scoreText: String;
var score: int;
var oldScore: int;

function Update()
{
    if (score != oldScore)
    {
        scoreText = "Score: " + score.ToString();
        scoreBoard.text = scoreText;
        oldScore = score;
    }
}

- Another potential problem occurs when a function returns an array value :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    float[] RandomList(int numElements)
    {
        var result = new float[numElements];

        for (int i = 0; i < numElements; i++)
        {
            result[i] = Random.value;
        }

        return result;
    }
}

// JS script example
function RandomList(numElements: int)
{
    var result = new float[numElements];

    for (i = 0; i < numElements; i++)
    {
        result[i] = Random.value;
    }

    return result;
}

- This type of function is neat and convenient when creating a new array filled with values. However, if it is called frequently then fresh memory will be allocated each time.

- Since arrays can be very large, the free heap space could get used up rapidly, resulting in frequent garbage collections.

- To avoid this problem, you can make use of the fact that an array is a reference type. An array passed into a function as a parameter can be modified within that function, and the results will remain after the function returns.
- The above function can be replaced with something like :

// C# script example
using UnityEngine;
using System.Collections;

public class ExampleScript : MonoBehaviour
{
    void RandomList(float[] arrayToFill)
    {
        for (int i = 0; i < arrayToFill.Length; i++)
        {
            arrayToFill[i] = Random.value;
        }
    }
}

// JS script example
function RandomList(arrayToFill: float[])
{
    for (i = 0; i < arrayToFill.Length; i++)
    {
        arrayToFill[i] = Random.value;
    }
}

- This simply replaces the existing contents of the array with new values. Although the initial allocation of the array must now be done by the calling code, the function will not produce any new garbage when it is called.

8.5.7 Requesting a Collection

- Although it is best to avoid allocation as much as possible, it cannot be eliminated completely. There are two main strategies you can use to minimize its interruption into gameplay :

8.5.7.1 Small Heap with Fast and Frequent Garbage Collection

- This strategy suits games with long periods of gameplay that need a smooth framerate. Such games typically allocate small blocks frequently, but these blocks are in use only temporarily. When this strategy is used on iOS, the heap size is about 200KB and a garbage collection takes about 5ms on an iPhone 3G. If the heap size grows to 1MB, the collection time will be about 7ms.

- It can therefore be beneficial to request a garbage collection at a regular frame interval. This generally makes collections happen more often than strictly necessary, but they are processed quickly and have minimal effect on gameplay :

if (Time.frameCount % 30 == 0)
{
    System.GC.Collect();
}

- Use this strategy with care and check the profiler statistics to ensure that it is actually reducing collection time for your game.

8.5.7.2 Large Heap with Slow but Infrequent Garbage Collection

- This strategy is used when allocations are relatively infrequent and can be handled during pauses in gameplay. The heap can then be made large, but not so large that your app gets killed by the OS because of low system memory.

- However, the Mono runtime avoids expanding the heap automatically if at all possible.
- You can enlarge the heap manually by preallocating some placeholder space at startup (i.e., an object that is allocated purely for its effect on the memory manager) :

void Start()
{
    var tmp = new System.Object[1024];

    // Make allocations in smaller blocks to avoid them being treated
    // in a special way, which is designed for large blocks.
    for (int i = 0; i < 1024; i++)
    {
        tmp[i] = new byte[1024];
    }

    // Release the reference.
    tmp = null;
}

- An adequately large heap should not get completely filled up between those pauses in gameplay that would accommodate a collection. When such a pause occurs, you can request a collection explicitly :

System.GC.Collect();

- Again, use this strategy with care and pay attention to the profiler statistics instead of just assuming it is having the desired effect.

8.5.8 Reusable Object Pools

- There are many objects in a game which are encountered over and over again, although only a small number will ever be in play at once.

- In such cases, it is often possible to reuse objects instead of destroying old ones and replacing them with new ones.

8.5.8.1 Platform Dependent Compilation

- Unity has a feature called Platform Dependent Compilation. This consists of some preprocessor directives that allow you to partition your scripts so that sections of code compile and execute only on supported platforms.

- This code also runs in the Unity Editor, which means you can compile code for your targeted platform and test it in the Unity editor.

8.5.9 Platform #define Directives

- The platform #define directives that Unity supports for your scripts are given in the table as follows :

UNITY_EDITOR_OSX : #define directive for Editor code on Mac OS X.

UNITY_EDITOR_WIN : #define directive for Editor code on Windows.

UNITY_STANDALONE_OSX : #define directive for compiling/executing code specifically for Mac OS X.

UNITY_EDITOR : #define directive for calling Unity Editor scripts from your game code.

UNITY_STANDALONE_LINUX : #define directive for compiling/executing code specifically for Linux standalone applications.

UNITY_XBOXONE : #define directive for executing Xbox One code.

UNITY_PS4 : #define directive for running PlayStation 4 code.
Use UNITY_1OS instead. { Deprecated. Use UNITY10S instead. Seline directive forthe Android platform. $efine directive forthe Tizen platform. ‘define directive for compiling/executing code for the iS platform. UNITY_STANDALONE UNITY_ASSERTIONS UNITY_TVOS UNITY_WSA ‘define directive for compiling/executing code for the Wii console, ‘define directive for compiling/executing code for any Standalone platform (Mac OS X, Windows or Linux). define directive for assertions control process. define directive for the Apple TV platform. i) fidefine directive for Universal Windows Platform. Additionally, NETFX_CORE is defined when compiling, Ct files against NET Core and using .NET scripting backend UNITY_WSA_10_0 define directive for Universal Windows Platform. Additionally WINDOWS_UWP is defined when compiling C# files against NET Core. UNITY_WINRT_10_0 UNITY_WEBGL Same as UNITY_WSA. Equivalent to UNITY_WSA_10_0 ‘define directive for WebGL. | | UNITY_ADS ‘define directive for calling Unity Ads methods from your game code, Version 5.2 and above. UNITY_FACEBOOK define directive for the Facebook platform (WebGL or ‘Windows standalone). UNITY_ANALYTICS ‘#tdefine directive for calling Unity Analytics methods from your game code. Version 5.2 and above. = From Unity 2.6.0 beyond version, you can compile code selectively. Based on the Available version of the Editor. Given a version number X.Y.2 (for example, 2.6.0), Unity depicts three global #define directives in the following formats ; UNITY_X, UNITY_X_Y and UNITY_X_Y.2Z. ‘Game Programming (MU Comp.-Sem-V) _8-33 Scripting UNITY_S ‘#define directive for the release version of Unity 5, exposed in every 5.X.Y release, = Anexample of #define directives exposed in Unity 5.0.1 are as follows : | UNITY_5_0 | #define directive for the major version of Unity 5.0, exposed in every 5.0.7 release. 
    UNITY_5_0_1  #define directive for the minor version of Unity 5.0.1.

-  From Unity version 5.3.4 onward, you can selectively compile code based on the earliest version of Unity required to compile or execute a given section of code. Given the same version format (X.Y.Z), Unity exposes one global #define in the format UNITY_X_Y_OR_NEWER, which can be used for this purpose.

-  The supported scripting-backend #define directives are:

    ENABLE_IL2CPP           Scripting backend #define for IL2CPP.
    ENABLE_DOTNET           Scripting backend #define for .NET.
    ENABLE_MONO             Scripting backend #define for Mono.
    NET_STANDARD_2_0        Defined when building scripts against the .NET Standard 2.0 API compatibility level on Mono and IL2CPP.
    ENABLE_WINMD_SUPPORT    Defined when Windows Runtime support is enabled on IL2CPP and .NET. See Windows Runtime Support for more details.
    NET_2_0_SUBSET          Defined when building scripts against the .NET 2.0 Subset API compatibility level on Mono and IL2CPP.
    NET_2_0                 Defined when building scripts against the .NET 2.0 API compatibility level on Mono and IL2CPP.
    NETFX_CORE              Defined when building scripts against .NET Core class libraries on .NET.
    NET_4_6                 Defined when building scripts against the .NET 4.x API compatibility level on Mono and IL2CPP.

-  The DEVELOPMENT_BUILD #define is used to identify whether your script is running in a player which was built with the "Development Build" option enabled.

-  You can also compile code selectively depending on the scripting back-end.

-  Now select the platform you want to test your precompiled code against, and click Switch Platform to tell Unity which platform you are targeting.

-  To test the directives, create a JS or C# script and copy/paste the following code:

    // JS script example
    function Awake()
    {
    #if UNITY_EDITOR
        Debug.Log("Unity Editor");
    #endif

    #if UNITY_IPHONE
        Debug.Log("Iphone");
    #endif

    #if UNITY_STANDALONE_OSX
        Debug.Log("Stand Alone OSX");
    #endif

    #if UNITY_STANDALONE_WIN
        Debug.Log("Stand Alone Windows");
    #endif
    }

    // C# script example
    using UnityEngine;
    using System.Collections;

    public class PlatformDefines : MonoBehaviour
    {
        void Start()
        {
    #if UNITY_EDITOR
            Debug.Log("Unity Editor");
    #endif

    #if UNITY_IOS
            Debug.Log("Iphone");
    #endif

    #if UNITY_STANDALONE_OSX
            Debug.Log("Stand Alone OSX");
    #endif

    #if UNITY_STANDALONE_WIN
            Debug.Log("Stand Alone Windows");
    #endif
        }
    }

-  Click Play Mode to test the code, and confirm that it works for the selected platform by checking for the relevant message in the Unity console. For instance, if you have selected iOS, the message "Iphone" should appear in the console.

-  You can also use the CONDITIONAL attribute in C#. This attribute is a cleaner and less error-prone way of stripping out functions. Note that it does not affect Unity's common callbacks such as Start(), Update(), LateUpdate(), FixedUpdate() and Awake(), because they are called directly from the engine and, for performance reasons, the engine does not take the attribute into account.

-  In addition to the basic #if compiler directive, you can also use a multiway test in C#:

    #if UNITY_EDITOR
        Debug.Log("Unity Editor");
    #elif UNITY_IOS
        Debug.Log("Unity iPhone");
    #else
        Debug.Log("Any other platform");
    #endif

8.5.10  Platform Custom #defines

-  It is possible to extend the built-in selection of #define directives by supplying your own. From the Player Settings, open the Other Settings panel and navigate to the Scripting Define Symbols text box.

-  Enter the symbol names that you want to define for a specific platform, separated by semicolons. You can then use these symbols as conditions for #if directives, just like the built-in ones.

8.5.11  Global Custom #defines

-  It is possible to define your own preprocessor directives for controlling the code. These preprocessor directives get included in your code at compile time.
-  To include them, add a text file with the extra directives to the Assets folder. The file extension is .rsp, and the file name depends on the language you use:

    C# (player and editor scripts)    /Assets/mcs.rsp
    UnityScript                       /Assets/us.rsp

-  For example, if you add the single line -define:UNITY_DEBUG to your mcs.rsp file, the #define directive UNITY_DEBUG exists as a global #define for C# scripts, excluding Editor scripts.

-  Every time you make changes to .rsp files, you need to recompile in order for them to take effect. You can do this by updating or reimporting a single script (.js or .cs) file.

8.5.11.1  Special Folders and Script Compilation Order

-  Some folder names are reserved in Unity to indicate that their contents have a particular purpose. Some of these folders affect the order of script compilation. These folder names are: Assets, Editor, Editor Default Resources, Gizmos, Plugins, Resources, Standard Assets, StreamingAssets.

-  There are four phases of script compilation. The phase in which a script is compiled is determined by its parent folder. This is important in cases where a script must refer to classes defined in other scripts.

-  The essential rule is that anything compiled in a phase after the current one cannot be referenced. Anything compiled in the current phase or an earlier phase is fully available.

-  Another case is when a script written in one language refers to a class written in another language. Here the rule is that the class being referenced must have been compiled in an earlier phase. The compilation phases are as follows:

    1.  Phase 1 : Runtime scripts in folders named Standard Assets, Pro Standard Assets and Plugins.
    2.  Phase 2 : Editor scripts in folders named Editor that are anywhere inside top-level folders called Standard Assets, Pro Standard Assets and Plugins.
    3.  Phase 3 : All other scripts that are not inside a folder called Editor.
    4.  Phase 4 : All remaining scripts (those that are inside a folder called Editor).

-  The importance of this phase ordering comes to light when a UnityScript file wants to reference a class defined in a C# file. To achieve this, you must place the C# file in a Plugins folder and the UnityScript file in a non-special folder. If you fail to do this, an error is thrown saying that the C# class cannot be found.

8.5.11.2  Script Compilation and Assembly Definition Files

-  By default, Unity automatically defines how scripts compile into managed assemblies. If you add more scripts to the project, the compilation time increases.

-  You can instead define and manage your own assemblies, based on the scripts inside a folder. To do this, separate the Project scripts into several assemblies with well-defined dependencies, in order to make sure that only the needed assemblies are rebuilt when a script is modified. This decreases the compilation time.

-  Think of each managed assembly as a single library inside the Unity Project.

    Fig. 8.5.1 : Script compilation

-  As shown in Fig. 8.5.1, the project is split into several assemblies. If only scripts in Main.dll are changed, the other assemblies do not recompile.

-  As Main.dll contains fewer scripts, it also compiles faster than Assembly-CSharp.dll. Likewise, a script modification in only Stuff.dll causes Main.dll and Stuff.dll to recompile.

8.5.12  How to Use Assembly Definition Files

-  Assembly definition files are Asset files that you create by going to Assets > Create > Assembly Definition. Their extension is .asmdef.

-  In the Unity project, add an assembly definition file to a folder to compile all the scripts in that folder into an assembly.
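On disk, the .asmdef asset created this way is a small JSON file. As a minimal sketch (the assembly name MyLibrary is a placeholder of our choosing), its contents look something like:

```json
{
    "name": "MyLibrary"
}
```

Further optional fields, such as references to other assemblies, can be added through the Inspector and are described under the File Format heading later in this chapter.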
-  In the Inspector, set the name of the assembly.

    Fig. 8.5.2 : Example Import Settings

-  You can also add references to other assembly definition files in the Project using the Inspector. To view the Inspector, click your Assembly Definition file and it should appear.

-  To add a reference, click the + icon under the References section and choose your file.

-  The references are used when compiling the assemblies in Unity, and they also define the dependencies between the assemblies.

-  Enable Test Assemblies in the Inspector to mark the assembly for testing. This adds references to nunit.framework.dll and UnityEngine.TestRunner.dll in the Assembly Definition file. When you mark assemblies for testing, keep in mind that:

    1.  Predefined assemblies like Assembly-CSharp.dll do not automatically reference Assembly Definition Files flagged for testing.
    2.  The assembly is not included in a normal build. To add such assemblies to a player build, use BuildOptions.IncludeTestAssemblies in your build script.

8.5.12.1  Multiple Assembly Definition Files Inside a Folder Hierarchy

-  Having multiple assembly definition files inside a folder hierarchy causes each script to be added to the assembly definition file with the shortest path distance. For example, given an Assets/ExampleFolder/MyLibrary.asmdef file and an Assets/ExampleFolder/ExampleFolder2/Utility.asmdef file:

-  Every script inside the Assets > ExampleFolder > ExampleFolder2 folder is compiled into the assembly defined by Assets/ExampleFolder/ExampleFolder2/Utility.asmdef.
-  Every file in the Assets > ExampleFolder folder that is not in the Assets > ExampleFolder > ExampleFolder2 folder is compiled into the assembly defined by Assets/ExampleFolder/MyLibrary.asmdef.

8.5.12.2  Assembly Definition Files are Not Build System Files

-  Although assembly definition files could be described as assembly build files, they do not support the conditional build rules commonly found in build systems.

-  This is also the reason why assembly definition files do not support the setting of preprocessor directives (defines): those are static at all times.

8.5.12.3  Backwards Compatibility and Implicit Dependencies

-  Assembly definition files are backwards compatible with Unity's existing Predefined Compilation System. The predefined assemblies always depend on every assembly definition file's assemblies. This is similar to how all scripts are dependent on all precompiled assemblies (plugins / DLLs) compatible with the active build target in Unity.

    Fig. 8.5.3 : Assembly dependencies

-  Fig. 8.5.3 illustrates the dependencies among assembly definition file assemblies, predefined assemblies (Assembly-CSharp.dll etc.) and precompiled assemblies (Plugins).

-  Unity gives priority to assembly definition files over the Predefined Compilation System.

-  This means that none of the special folder names from the predefined compilation have any effect inside an assembly definition file folder; Unity treats them as regular folders without any special meaning.

-  Use assembly definition files either for all the scripts in the Project, or for none of them. Otherwise, the scripts which do not use assembly definition files recompile every time an assembly definition file changes, which reduces the advantage of using assembly definition files.
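As a side illustration (this snippet is our own sketch, not from the text: it assumes the UnityEditor.Compilation API introduced in Unity 2017.3), editor code can enumerate the assemblies these definitions produce:

```csharp
using UnityEngine;
using UnityEditor;
using UnityEditor.Compilation;

// Editor-only sketch: log every assembly Unity will build for the player,
// together with the number of scripts compiled into it.
public static class AssemblyReport
{
    [MenuItem("Tools/List Player Assemblies")]
    static void ListPlayerAssemblies()
    {
        foreach (Assembly assembly in CompilationPipeline.GetAssemblies(AssembliesType.Player))
        {
            Debug.Log(assembly.name + " (" + assembly.sourceFiles.Length + " scripts)");
        }
    }
}
```

Running the menu item in the Editor makes it easy to check that a script really did land in the assembly you expected.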
8.6  API

-  You can get information about the assembly definitions and all the assemblies built by Unity from the UnityEditor.Compilation namespace, which contains a static CompilationPipeline class.

8.6.1  File Format

-  Assembly definition files are JSON files. They have the following fields:

    Field                          Type
    name                           string
    references (optional)          string array
    includePlatforms (optional)    string array
    excludePlatforms (optional)    string array

-  Keep in mind that the includePlatforms and excludePlatforms fields should not be used in the same assembly definition file.

-  To retrieve the platform names, use CompilationPipeline.GetAssemblyDefinitionPlatforms.

*  Examples

    MyLibrary.asmdef

    {
        "name": "MyLibrary",
        "references": [ "Utility" ],
        "includePlatforms": [ "Android", "iOS" ]
    }

    MyLibrary2.asmdef

    {
        "name": "MyLibrary2",
        "references": [ "Utility" ],
        "excludePlatforms": [ "WebGL" ]
    }

8.6.2  .NET Profile Support

-  A number of .NET profiles are supported by Unity, and each profile offers a different API surface for C# code that interacts with the .NET class libraries. To change the .NET profile, go to Edit > Project Settings > Player and use the API Compatibility Level option in the Other Settings section.

8.6.2.1  Legacy Scripting Runtime

-  Two profiles are supported by the legacy scripting runtime: .NET 2.0 Subset and .NET 2.0. Both are directly aligned with the .NET 2.0 profile from Microsoft.

-  The .NET Standard 2.0 profile is smaller than the .NET 4.x profile, it permits access to the class library APIs that the majority of Unity projects use, and it offers a set of portable APIs for multi-platform support. It is the ideal choice for size-constrained platforms, for example mobile.

-  By default, the majority of Unity projects should utilize the .NET Standard 2.0 profile.

8.6.2.2  Stable Scripting Runtime

-  Two profiles are supported by the stable scripting runtime: .NET Standard 2.0 and .NET 4.x.
-  The name of the .NET Standard 2.0 profile can be a bit confusing, as it is not related to the .NET 2.0 and .NET 2.0 Subset profiles from the legacy scripting runtime. Instead, Unity's support for the .NET Standard 2.0 profile matches the profile of the same name published by the .NET Foundation.

-  The .NET 4.x profile in Unity matches the .NET 4 series of profiles from the .NET Framework. Only use the .NET 4.x profile for compatibility with external libraries, or when you need functionality that is not present in .NET Standard 2.0.

8.6.2.3  Cross-Platform Compatibility

-  Unity aims to support the vast majority of the APIs in the .NET Standard 2.0 profile on all platforms.

-  As not all platforms support the .NET Standard, libraries intended for cross-platform compatibility should target the .NET Standard 2.0 profile.

-  The .NET 4.x profile adds a much larger API surface, including parts which may work on few or no platforms.

8.6.2.4  Managed Plugins

-  Managed code plugins are compiled outside of Unity, and they work with either the .NET Standard 2.0 profile or the .NET 4.x profile in Unity.

-  The following table shows the configurations Unity supports:

    Managed plugin        API Compatibility Level
    compiled against      .NET Standard 2.0    .NET 4.x
    .NET Standard         Supported            Supported
    .NET Framework        Limited              Supported
    .NET Core             Not Supported        Not Supported

8.6.2.5  Referencing Additional Class Library Assemblies

-  To access a part of the .NET class library API that is not referenced by default, the Project can tell the C# compiler in Unity to reference additional class library assemblies. The behaviour depends on which .NET profile the Project uses.

8.6.3  .NET Standard 2.0 Profile

-  If your project is using the .NET Standard 2.0 Api Compatibility Level, there is no need to take any additional steps to use any part of the .NET class library API.

-  If a part of the API is missing, it might not be included in .NET Standard 2.0.
-  The Project may need to use the .NET 4.x Api Compatibility Level instead.

8.6.4  .NET 4.x Profile

-  Unity references the following assemblies by default when using the .NET 4.x Api Compatibility Level:

    mscorlib.dll    System.Core.dll                     System.Xml.dll
    System.dll      System.Runtime.Serialization.dll    System.Xml.Linq.dll

-  You can reference other class library assemblies by using an mcs.rsp file. You add this file to the Assets directory of a Unity Project, and use it to pass extra command-line arguments to the C# compiler.

-  For instance, if a Project uses the HttpClient class, which is defined in the System.Net.Http.dll assembly, the C# compiler may produce this error message:

    The type 'HttpClient' is defined in an assembly that is not referenced.
    You must add a reference to assembly 'System.Net.Http, Version=4.0.0.0,
    Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'.

-  Adding an mcs.rsp file with the following line to the Project resolves this error:

    -r:System.Net.Http.dll

-  You should reference class library assemblies as explained in the example above. Do not copy them into the Project directory.

8.6.5  Switching between Profiles

-  Be careful when you use an mcs.rsp file to reference class library assemblies. If you change the API Compatibility Level from .NET 4.x to .NET Standard 2.0 while an mcs.rsp file similar to the one in the example above is present in the Project, then C# compilation will fail.

-  The System.Net.Http.dll assembly does not exist in the .NET Standard 2.0 profile, so the C# compiler is not able to locate it.

-  The mcs.rsp file can contain parts that are specific to the current .NET profile; if you change the profile, you may need to alter the mcs.rsp file accordingly.

8.6.5.1  Stable Scripting Runtime : Known Limitations

-  Although Unity supports a modern .NET runtime, it still has the following issues when using the .NET runtime:
1.  Code size

-  The stable scripting runtime has a larger .NET class library API than the legacy scripting runtime, which means that the code size is normally larger.

-  This size increase may be important, particularly on size-constrained and Ahead-of-Time (AOT) platforms. To moderate code size increases:

    1.  Choose the smallest .NET profile possible. The .NET Standard 2.0 profile is about half the size of the .NET 4.x profile, so use the .NET Standard 2.0 profile where feasible.

    2.  Enable Strip Engine Code in the Unity Editor Player Settings, by going to Edit > Project Settings > Player. This option statically analyzes the managed code in the Project and removes any unused code.

2.  Transport Layer Security (TLS)

-  Current Mono has TLS 1.2 support for various platforms. Unity supports TLS 1.0 in the .NET class libraries, and TLS works for the Mac Standalone player.

-  Where you require full TLS support, use UnityWebRequest or platform-specific native solutions. Unity is currently working on adding TLS 1.2 support for all .NET class library APIs on all platforms that Unity supports.

8.6.5.2  Generic Functions

-  There are some functions in the script reference, like the various GetComponent functions, that are listed with a variant which has a letter T or a type name in angle brackets after the function name:

    void FuncName<T>();

-  Such functions are called generic functions. Their significance for scripting is that you get to specify the types of the parameters and/or the return type when you call the function, and the type is then correctly inferred. In UnityScript:

    var obj = GetComponent.<Rigidbody>();

-  In C#, it can save many keystrokes and casts:

    Rigidbody rb = go.GetComponent<Rigidbody>();
    // ...as compared with:
    Rigidbody rb = (Rigidbody) go.GetComponent(typeof(Rigidbody));

-  Any function that has a generic variant listed on its script reference page allows this special calling syntax.

8.6.6  Scripting Restrictions

-  Some platforms have restrictions on what Unity supports. The following table shows the restrictions for each platform and scripting backend:

8.6.6.1  .NET 4.x Equivalent Scripting Runtime

    Platform (scripting backend)           Ahead-of-time   No        .NET Core class
                                           compile         threads   libraries subset
    Android (IL2CPP)                       Yes
    Android (Mono)
    iOS (IL2CPP)                           Yes
    PlayStation 4 (IL2CPP)                 Yes
    Standalone (IL2CPP)                    Yes
    Standalone (Mono)
    Switch (IL2CPP)                        Yes
    Universal Windows Platform (IL2CPP)    Yes
    Universal Windows Platform (.NET)                                Yes
    WebGL (IL2CPP)                         Yes             Yes
    WiiU (Mono)
    Xbox One (IL2CPP)                      Yes

8.6.7  Ahead-of-Time Compile

-  There are a few platforms which do not support runtime code generation. As a result, any managed code which depends upon just-in-time (JIT) compilation on the target device will fail.

-  Instead, all of the managed code must be compiled ahead-of-time (AOT). Frequently this distinction doesn't matter, but in a few specific cases AOT platforms require additional consideration.

*  System.Reflection.Emit

-  An AOT platform cannot execute any of the methods in the System.Reflection.Emit namespace. The rest of System.Reflection is acceptable, as long as the compiler can conclude that the code used via reflection needs to exist at runtime.

*  Serialization

-  AOT platforms may encounter issues with serialization and deserialization because of their use of reflection.

-  If a type or method is only used via reflection as part of serialization or deserialization, the AOT compiler cannot detect that code needs to be generated for that type or method.

8.6.8  Generic Virtual Methods

-  Generic methods require the compiler to do additional work to expand the code written by the developer into the code actually executed on the device.

-  For instance, we need different code for List<T> with an int or with a double.
-  In the presence of virtual methods, where behaviour is resolved at runtime instead of compile time, the compiler can easily require runtime code generation in places where this is not entirely clear from the source code.

-  Suppose we have the following code, which works exactly as expected on a JIT platform, printing "Message value: Zero" to the console once:

    using UnityEngine;
    using System;

    public class AOTProblemExample : MonoBehaviour, IReceiver
    {
        public enum AnyEnum
        {
            Zero,
            One,
        }

        void Start()
        {
            // Subtle trigger: the type of manager must be IManager,
            // not Manager, to trigger the AOT problem.
            IManager manager = new Manager();
            manager.SendMessage(this, AnyEnum.Zero);
        }

        public void OnMessage<T>(T value)
        {
            Debug.Log(string.Format("Message value: {0}", value));
        }
    }

    public class Manager : IManager
    {
        public void SendMessage<T>(IReceiver target, T value)
        {
            target.OnMessage(value);
        }
    }

    public interface IReceiver
    {
        void OnMessage<T>(T value);
    }

    public interface IManager
    {
        void SendMessage<T>(IReceiver target, T value);
    }

-  When you execute this code on an AOT platform with the IL2CPP scripting backend, the following exception occurs:

    ExecutionEngineException: Attempting to call method
    'AOTProblemExample::OnMessage<AOTProblemExample.AnyEnum>'
    for which no ahead of time (AOT) code was generated.
      at Manager.SendMessage[T] (IReceiver target, T value) [0x00000] in :0
      at AOTProblemExample.Start () [0x00000] in :0

-  Similarly, the Mono scripting backend gives a similar exception:

    ExecutionEngineException: Attempting to JIT compile method
    'Manager:SendMessage<AOTProblemExample.AnyEnum> (IReceiver,AOTProblemExample.AnyEnum)'
    while running with --aot-only.
      at AOTProblemExample.Start () [0x00000] in :0

-  The AOT compiler does not realize that it should generate code for the generic method OnMessage with a T of AnyEnum, so it blissfully moves on, skipping this method.

-  When that method is called and the runtime cannot find the correct code to execute, it produces this error message. To solve the issue, we can force the compiler to generate the correct code for us.
-  If we add the following method to the AOTProblemExample class:

    public void UsedOnlyForAOTCodeGeneration()
    {
        // IL2CPP needs this line. Note that we are calling directly
        // on the Manager, not on the IManager interface.
        new Manager().SendMessage(null, AnyEnum.Zero);

        // Include an exception so we can be sure to know if this method is ever called.
        throw new InvalidOperationException(
            "This method is used for AOT code generation only. Do not call it at runtime.");
    }

-  When the compiler encounters this explicit call to OnMessage with a T of AnyEnum, it generates the correct code for the runtime to execute. The method UsedOnlyForAOTCodeGeneration never needs to be called; it merely needs to be present for the compiler to observe it.

*  No threads

-  Threads are not supported on all platforms, so any managed code which uses the System.Threading namespace will fail at runtime on those platforms.

-  Moreover, a few parts of the .NET class libraries implicitly depend upon threads. A frequently used example is the System.Timers.Timer class, which depends on the support for threads.

Syllabus Topic : Scripting Tools and Event Overview

8.7  Scripting Tools and Event Overview

In this session we study the tools inside the Unity Editor, and tools supplied with Unity, that help you in developing your scripts.

*  Console Window

-  The Console Window (menu : Window > Console) shows errors, warnings and other messages generated by Unity. To help with debugging, you can also display your own messages in the Console using the Debug.Log, Debug.LogWarning and Debug.LogError functions.

    1.  The Clear button removes any messages generated from your code but keeps compiler errors. You can clear the console automatically whenever you run the game by enabling the Clear On Play option.
    2.  The Collapse option displays only the first instance of an error message that keeps recurring. This is useful for runtime errors, like null references, that are sometimes generated identically on each frame update.

    3.  The Error Pause option will cause playback to be paused whenever Debug.LogError is called from a script. This can be handy when you need to freeze playback at a particular point in execution and inspect the scene.

    4.  The Open Editor Log and Open Player Log items on the console tab menu access Unity's log files, which record details that may not be shown in the console.

8.7.1  Obsolete API Warnings and Automatic Updates

-  Unity also shows warnings about obsolete API calls in your code. Unity used to have "shortcuts" in MonoBehaviour and other classes for accessing common component types, so you could access a Rigidbody on the object using code like:

    // The "rigidbody" variable is part of the class and not declared in the user script
    Vector3 v = rigidbody.velocity;

-  These shortcuts have been deprecated, so you should now use code similar to:

    // Use GetComponent to access the component
    Rigidbody rb = GetComponent<Rigidbody>();
    Vector3 v = rb.velocity;

-  Unity displays a warning message when obsolete API calls are detected. When you double-click this warning message, Unity will try to upgrade the deprecated usage to the suggested equivalent automatically.

8.7.2  Adjusting the Line Count

-  To adjust the number of lines that a log entry displays in the list, click on the exclamation button, go to Log Entry, and select the number of lines.

-  This lets you set the granularity you need for the window, trading the amount of context per entry against how many entries fit in the window.
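As a quick way to see the message types described above in the Console, you can attach a throwaway component like the following sketch (the class name ConsoleDemo is our own):

```csharp
using UnityEngine;

// Sketch: emits one message of each severity so the Console's
// Collapse and Error Pause options can be observed in action.
public class ConsoleDemo : MonoBehaviour
{
    void Start()
    {
        Debug.Log("An ordinary message");
        Debug.LogWarning("Something looks suspicious");
        Debug.LogError("Something went wrong");
    }
}
```

With Error Pause enabled, the Debug.LogError call above will pause Play Mode as soon as it runs.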
8.7.3  Stack Trace Logging

-  You can state how much of the stack trace should be captured when a log message is printed to the console or to the log file.

    Fig. 8.7.1 : Log entry line count

-  This is helpful in situations where the error message is not very clear: by looking at the stack trace, you can deduce from which engine area the error arises. The following are the three options for stack trace logging:

    None          It will not print a stack trace.
    ScriptOnly    It will print only the managed stack trace.
    Full          It will print both the managed and the native stack trace.

A)  Log Files

-  At development time you may need to get information from the logs of the standalone player you have built, the target device, or the Editor. You refer to these files when you face a problem, to find out where the problem occurred.

-  On macOS, the Console.app utility is used for the player and Editor logs. On Windows, the Editor logs are placed in folders which are not shown in Windows Explorer by default; see below.

B)  Editor

-  To view the Editor log, select Open Editor Log from Unity's Console window.

    Linux      ~/.config/unity3d/CompanyName/ProductName/Player.log
    Windows    C:\Users\username\AppData\LocalLow\CompanyName\ProductName\output_log.txt

D)  iOS

-  On iOS, access the device log in Xcode using the GDB console or the Organizer Console. The Organizer Console is very useful for obtaining crash logs in cases where your application was not running under the Xcode debugger.

E)  Android

-  On Android, access the device log using the logcat console. Use the adb application found in the Android SDK/platform-tools directory with a trailing logcat parameter:

    $ adb logcat

-  You can inspect the logcat output using the Dalvik Debug Monitor Server (DDMS). You can start DDMS either from Eclipse or from inside the Android SDK/tools directory. DDMS also offers a number of other debug-related tools.
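As a convenience when reading Android logs, the logcat output can be restricted to Unity's own messages with a standard logcat tag filter (assuming adb is on your PATH and a device is attached):

```shell
adb logcat -s Unity
```

This suppresses entries from other processes, so Debug.Log output and Unity stack traces are easier to spot.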
8.7.4  Universal Windows Platform

    Windows    %USERPROFILE%\AppData\Local\Packages\TempState\UnityPlayer.log

WebGL

-  On WebGL, log output is written to the browser's JavaScript console.

(a)  Accessing log files on Windows

-  Log files are stored in a hidden location by default on Windows systems. First make the hidden folders visible in Windows Explorer: on Windows XP, use Tools > Folder Options... > View (tab). On Windows Vista/7, make the AppData folder visible in Windows Explorer using Tools > Folder Options... > View (tab). The Tools menu is hidden by default; press the Alt key once to make it visible.

(b)  Event System

-  The Event System is a system for sending events to objects in the application based on input. The input can be mouse, keyboard, touch, or custom input. The Event System consists of a few components that work together to send events.

(c)  Overview

-  If you add an Event System component to a GameObject, you will observe that it does not have much functionality exposed.

-  This is because the Event System itself is designed as a manager and facilitator of communication between Event System modules.

-  The main roles of the Event System are as follows:

    1.  To manage which GameObject is considered selected
    2.  To manage which Input Module is in use
    3.  To manage Raycasting (if needed)
    4.  To update all Input Modules as needed

*  Modules

-  The input module is the main module: it contains the main logic of how you want the event system to behave. Input modules are where the "business logic" of the Event System lives; they are used for managing event state, handling input, and sending events to scene objects.
It is common for Input Modules to utilize the Raycasters configured in the scene to compute what the pointing device is over. By default there are 3 Raycasters :

1. Graphic Raycaster - It is used for UI elements.
2. Physics 2D Raycaster - It is used for 2D physics elements.
3. Physics Raycaster - It is used for 3D physics elements.

If a 2D or 3D Raycaster is configured in your scene, it is feasible to have non-UI elements receive messages from the Input Module. Simply add a script that implements one of the event interfaces.

Messaging System

- In the new UI system, the messaging system is designed to replace SendMessage. The system aims to fix the issues in the current SendMessage, and it is pure C#.
- The system uses custom interfaces, implemented on a MonoBehaviour, to specify that the component is able to receive a callback from the messaging system.
- A target GameObject is specified when the call is made. The call is issued on all components of the GameObject that implement the given interface against which the call is to be issued.
- The messaging system permits you to pass custom user data and to specify how far through the GameObject hierarchy the event should propagate; i.e. should it just execute for the specified GameObject, or should it also execute on children and parents.
- The messaging system offers helper functions to search for and find GameObjects that implement a given messaging interface.
- The messaging system is not only used for the UI system; it can also be used for general game code.

After declaring the interface, implement it in a MonoBehaviour. Once implemented, it defines the functions that will be executed if the specified message is issued against this MonoBehaviour.

public interface ICustomMessageTarget : IEventSystemHandler
{
    void Message1();
    void Message2();
}

public class CustomMessageTarget : MonoBehaviour, ICustomMessageTarget
{
    public void Message1()
    {
        Debug.Log("Message 1 received");
    }

    public void Message2()
    {
        Debug.Log("Message 2 received");
    }
}

Now we have a script that can receive the message, so we need to issue the message. Typically the message is issued in response to an event.
For example, in the UI system we issue events like PointerEnter and PointerExit, and there are many other things that happen in response to user input into the application.

A static class exists to send the messages. It takes the target object as an argument for the message, some user specific data, and a function that maps to the particular function in the message interface you desire to target.

ExecuteEvents.Execute<ICustomMessageTarget>(target, null, (x, y) => x.Message1());

The above code executes the Message1() function on any component of the target GameObject that implements the ICustomMessageTarget interface.

8.7.5.1 Input Modules

- An Input Module is where the main logic of an event system can be configured and customized. There are 2 built-in Input Modules, one designed for Standalone and the other designed for Touch input. Every module receives and sends events as you would expect for the given configuration.
- The Event System and the business logic take place in the Input Module. When the Event System is active, it looks at which Input Modules are attached and passes update handling to that particular module.
- Input modules are designed and can be modified based on the input system that you wish to support, like mouse, touch, joystick, or motion controller. The built-in Input Modules are designed to support common game configurations, for example keyboard input, touch input, controller input, and mouse input.

8.7.5.2 Supported Events

Many events are supported by the Event System, and more can be added in users' custom input modules. The events supported by the Standalone Input Module and the Touch Input Module are given by interface, and are implemented on a MonoBehaviour by implementing the interface. Events will be called at the correct time if a valid Event System is configured.
Event | Description
IPointerEnterHandler - OnPointerEnter | This event is called when a pointer enters the object.
IPointerExitHandler - OnPointerExit | This event is called when a pointer exits the object.
IPointerDownHandler - OnPointerDown | This event is called when a pointer is pressed on the object.
IPointerUpHandler - OnPointerUp | This event is called when a pointer is released.
IPointerClickHandler - OnPointerClick | This event is called when a pointer is pressed and released on the same object.
IInitializePotentialDragHandler - OnInitializePotentialDrag | This event is called when a drag target is found; it can be used to initialize values.
IBeginDragHandler - OnBeginDrag | This event is called on the drag object when dragging is about to begin.
IDragHandler - OnDrag | This event is called on the drag object when a drag is happening.
IEndDragHandler - OnEndDrag | This event is called on the drag object when a drag finishes.
IDropHandler - OnDrop | This event is called on the object where a drag finishes.
IScrollHandler - OnScroll | This event is called when a mouse wheel scrolls.
IUpdateSelectedHandler - OnUpdateSelected | This event is called on the selected object each tick.
ISelectHandler - OnSelect | This event is called when the object becomes the selected object.
IDeselectHandler - OnDeselect | This event is called when the selected object becomes deselected.
IMoveHandler - OnMove | This event is called when a move event occurs (left, right, up, down).
ISubmitHandler - OnSubmit | This event is called when the submit button is pressed.
ICancelHandler - OnCancel | This event is called when the cancel button is pressed.

Raycasters

1. Graphic Raycaster - It is utilized for UI elements; it lives on a Canvas and searches within the Canvas.
2. Physics 2D Raycaster - It is utilized for 2D physics elements.
3. Physics Raycaster - It is utilized for 3D physics elements.

When a Raycaster is present and enabled in the scene, it will be utilized by the Event System when a query is issued from an Input Module.
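A component receives the events listed above simply by implementing the corresponding interfaces from UnityEngine.EventSystems. Below is a minimal sketch; the class name PointerEventLogger is illustrative :

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Attach to a UI element (or to a 3D object seen by a Physics Raycaster).
public class PointerEventLogger : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler, IPointerClickHandler
{
    public void OnPointerEnter(PointerEventData eventData)
    {
        Debug.Log("Pointer entered " + name);
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        Debug.Log("Pointer exited " + name);
    }

    public void OnPointerClick(PointerEventData eventData)
    {
        // eventData carries details such as which button was used.
        Debug.Log("Clicked " + name + " with " + eventData.button);
    }
}
```

No registration is needed: as long as an Event System and a suitable Raycaster exist in the scene, the callbacks fire automatically.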
8.7.6 Event System Manager

- This is a subsystem responsible for controlling all the other elements that compose eventing.
- It coordinates to find out which Input Module is at present active, which GameObject is at present considered 'selected', and a host of other high level Event System concepts.
- On every 'Update' the Event System gets the call, looks through its Input Modules and decides which Input Module should be used for this tick. It then hands the processing over to that module.
- The following is the Properties table for the component (the "Add Default Input Modules" button adds the default modules).

Properties

Property | Function
First Selected | The GameObject that was selected first.
Send Navigation Events | Should the EventSystem permit navigation events (move / submit / cancel).
Drag Threshold | The soft area for dragging, in pixels.

8.7.7 Graphic Raycaster

- The Graphic Raycaster is utilized to raycast against a Canvas. The Raycaster looks at all Graphics on the canvas and finds out if any of them have been hit.
- The Graphic Raycaster can be configured to ignore backfacing Graphics, and to be blocked by 2D or 3D objects that exist in front of it.
- A manual priority can also be applied if element processing should be forced to the front or back of the Raycasting.

Properties

Property | Function
Blocked Objects | Type of objects that will block graphic raycasts.
Blocking Mask | Type of objects that will block graphic raycasts.
Ignore Reversed Graphics | Should graphics facing away from the raycaster be considered?

8.7.8 Physics Raycaster

The Raycaster raycasts against 3D objects in the scene. This permits messages to be sent to 3D physics objects that implement event interfaces.

Properties

Property | Function
Depth | It gets the depth of the configured camera.
Event Mask | Logical AND of the Camera mask and eventMask.
Event Camera | It gets the camera that is used for this module.
Final Event Mask | Logical AND of the Camera mask and eventMask.

8.7.9 Physics 2D Raycaster

The 2D Raycaster raycasts against 2D objects in the scene.
- This permits messages to be sent to 2D physics objects that implement event interfaces. A Camera GameObject is needed; a Camera component gets added to the GameObject if the Physics 2D Raycaster is not added to a Camera GameObject.

Properties

Property | Function
Sort Order Priority | Priority of the raycaster based upon sort order.
Render Order Priority | Priority of the raycaster based upon render order.
Event Camera | The camera that will generate rays for this raycaster.
Priority | Priority of the raycaster relative to other raycasters.

8.7.10 Standalone Input Module

- The standalone input module is developed to work as you would anticipate a controller / mouse input to work. Events for dragging, button presses, and the like are sent in response to input.
- This module sends pointer events to components as a mouse or input device is moved around, and it uses the Graphic Raycaster and Physics Raycaster to compute which element is presently pointed at by a given pointer device.
- You can organize these raycasters to detect or ignore parts of your Scene, to suit your requirements. This module sends move events and submit or cancel events in reaction to input tracked by the Input manager. The module works for keyboard as well as controller input.

Properties

Property | Function
Horizontal Axis | Type the preferred manager name for the horizontal axis button.
Vertical Axis | Type the preferred manager name for the vertical axis.
Input Actions Per Second | Number of keyboard/controller inputs allowed per second.
Repeat Delay | Delay in seconds before the input actions per second repeat rate takes effect.
Force Module Active | Tick this checkbox to force this Standalone Input Module to be active.
Submit Button | Type the preferred manager name for the Submit button.
Cancel Button | Type the preferred manager name for the Cancel button.

Details

The module uses :
1. For navigation, it uses the vertical or horizontal axis for keyboard and controller.
2. For submit and cancel events, it uses the Submit / Cancel buttons.
3. It has a timeout between events to only permit a maximum number of events per second.

The flow for the standalone input module is as follows :

1. Send a Move event to the selected object if a valid axis from the input manager is entered.
2. Send a submit or cancel event to the selected object if a submit or cancel button is pressed.
3. Process mouse input :
   - If it is a new press :
     - Send PointerEnter event
     - Send PointerPress event
     - Cache the drag handler
     - Send BeginDrag event to the drag handler
     - Set the 'Pressed' object as Selected in the event system
   - If this is a continuing press :
     - Process movement
     - Send DragEvent to the cached drag handler
     - Handle PointerEnter and PointerExit events if the pointer moves between objects
   - If this is a release :
     - Send PointerUp event to the object that received the PointerPress
     - If the current hover object is the same as the PointerPress object, send a PointerClick event
     - Send a Drop event if there was a drag handler cached
     - Send an EndDrag event to the cached drag handler
   - Process scroll wheel events

8.7.11 Touch Input Module

- The TouchInputModule is obsolete and is designed for working with touch devices. It sends pointer events for touching and dragging in response to user input. This module also supports multitouch.
- It uses the Raycasters configured in the scene to compute what element is currently being touched. For every current touch a raycast is issued.
Details

The flow of the touch input module is given below. For each touch event :

- If it is a new press :
  - Send PointerEnter event (sent to every object up the hierarchy that can handle it)
  - Send PointerPress event
  - Cache the drag handler (the first element in the hierarchy that can handle it)
  - Send BeginDrag event to the drag handler
  - Set the 'Pressed' object as Selected in the event system
- If this is a continuing press :
  - Process movement
  - Send DragEvent to the cached drag handler
  - Handle PointerEnter and PointerExit events if the touch moves between objects
- If this is a release :
  - Send PointerUp event to the object that received the PointerPress
  - If the current hover object is the same as the PointerPress object, send a PointerClick event
  - Send a Drop event if there was a drag handler cached
  - Send an EndDrag event to the cached drag handler

8.7.12 Event Trigger

- The Event Trigger receives events from the Event System and calls registered functions for each event.
- The Event Trigger is used to state the functions you desire to be called for every Event System event. It is possible to assign multiple functions to a single event, and whenever the Event Trigger receives that event it will call those functions.

Events

You can add a new event in the Event Trigger by clicking the Add New Event Type button.

Review Questions

- Explain the start() and update() methods in a script.
- How to create variables in a script?
- How to link GameObjects to variables?
- Explain event functions in Unity.
- Explain time and frame rate management.
- How to create and destroy GameObjects?
- Explain namespaces.
- Explain the execution order of event functions.
- Explain automatic memory management.
- How to use assembly definition files?
- Explain generic virtual methods.
- How to define a custom message?
- What is an input module? List and explain the supported events.
- What are Raycasters? Explain in brief.
- Explain the standalone and touch input modules.

Chapter Ends...

CHAPTER 9 : XR

Syllabus

XR : VR, AR, MR, Conceptual Differences,
SDK, Devices

Syllabus Topic : XR : VR, AR, MR, Conceptual Differences, SDK, Devices

9.1 XR : Differences, SDK and Devices

9.1.1 Google VR

The Google VR technology supports the following VR platforms :
1. Google Daydream
2. Google Cardboard

- Google VR provides support to smartphones, standalone head mounted displays, head mounted viewers and controllers. Google VR provides an SDK for Unity to build VR content.

1. Google Daydream

- Google Daydream is included in the Google VR technologies. Google Daydream is a platform for Cardboard and high quality mobile VR. It is built for bite-size VR experiences and also for 360 video. It has Daydream-ready headsets, phones and controllers, and offers a high end Android mobile phone experience. Examples of Daydream-ready phones are :
  - Google Pixel 2
  - Samsung Galaxy S8 & S8+
  - Samsung Galaxy Note 8
  - LG V30

2. Google Cardboard

- Daydream permits more feature-rich experiences than Cardboard, but it works only with Daydream-ready devices: phones built for VR with high-resolution displays, smooth graphics and high-fidelity sensors for exact head tracking.
- Daydream phone applications are available with the Daydream View, Google's headset and motion controller for experiencing VR, utilizing a Daydream-ready phone.
- Daydream standalone HMDs built by the original equipment manufacturers are also supported by Daydream.

9.1.2 Hardware and Software Requirements for Google VR

The minimum hardware and software requirements for the Google VR SDK are as follows :

1. Cardboard
- Hardware : Devices running Android 4.1 or later with a gyroscope; devices running iOS 8 or later with a gyroscope; a viewing device compatible with Cardboard.
- Software : Unity Cardboard integration requires Android Lollipop or greater; iOS devices need the Google Cardboard application installed.

2. Daydream
- Hardware : Daydream compatible devices such as a VR headset and controller.
9.1.2.1 Recommended Hardware and Software Requirements

The recommended hardware and software requirements for the Google VR SDK are as follows :

1. Cardboard
- Hardware : Cardboard compatible devices; any iPhone 5 series or later phone for iOS; a Cardboard compatible viewing device.
- Software : Android 5.0 or later; iOS 8 or later, with the Google Cardboard application installed.

2. Daydream
- Hardware : Any device compatible with the Daydream specification.
- Software : Android 7 or later.

9.1.2.2 Supported APIs and SDKs

The following table shows the supported and not supported APIs and SDKs :

Supported APIs and SDKs | Not Supported APIs and SDKs
Android : OpenGL | XRSettings.showDeviceView
iOS : Metal, OpenGL | XRDevice.isPresent
 | XRStats.gpuTimeLastFrame

9.1.3 Google VR Controllers and Input Devices

- The Daydream controller is the input device for Daydream. Google Cardboard uses many types of third party controllers as input devices to take the input.
- The Daydream controller has 3 degrees of freedom, providing orientation (rotation) information. It has a dual axis click-or-touch pad along with 2 extra buttons.

The following table shows the features of mobile VR devices when working with Google VR in Unity :

Platform | OS | Rendering
Cardboard | Android Lollipop or later | OpenGL; Stereo Instanced Rendering; Stereo Rendering
Cardboard | iOS | OpenGL; Metal; Stereo Instanced Rendering; Stereo Rendering
Daydream | Android Nougat or later, using a compatible hardware controller | OpenGL; Stereo Instanced Rendering; Stereo Rendering

9.2 Vuforia

The following are a few important concepts related to the forms of tracking and marker types which are used in Vuforia applications.

9.2.1 Marker-Based Tracking

Markers are images or objects registered with the application in MR or AR. These objects act as data triggers in the application, showing the virtual content over the world position of the marker in the camera view.
- Marker-based tracking comprises 2D labels, QR codes, physical markers and Image Targets. The Image Target is the usual sort of marker utilized in game applications.

Common Image Target types :
1) Image Targets
2) Markerless tracking

1) Image Targets

- Image Targets are a particular kind of marker utilized in marker-based tracking. They are images you physically register with the application, and they act as triggers that display virtual content.
- For Image Targets, utilize images containing distinctive shapes with complex outlines. This makes it easier for image recognition and tracking algorithms to recognize them.

2) Markerless tracking

- Applications utilizing markerless tracking are more commonly location-based or position-based Augmented or Mixed Reality.
- This type of tracking depends on technologies such as GPS, accelerometer, gyroscope and more complex image processing algorithms to place virtual objects or information in the environment.
- The hardware and software then treat these objects as though they are anchored or associated with specific real-world locations or objects.

9.2.2 Hardware and Software Requirements for Vuforia

The following are the hardware and software requirements for the Vuforia SDK with Unity. Both mobile devices and see-through digital eyewear devices are supported by Vuforia.

Mobile devices

The following mobile platforms support Vuforia for application development :

Device | Device OS | Development OS | Unity Version
Android (1) | 4.1.x+ | Windows (2) 7+ | 2017.2+
iOS (2) | 9+ | OS X 10.11+ | 2017.2+
Windows (2) | 10 UWP | Windows 10 / OS X 10.11+ | 2017.2+

Optical see-through digital eyewear

Device | Device OS | Development OS | Unity Version
HoloLens | Current version | Windows (3) 7+ | 2017.2+
ODG R-7 | Android (1)(2) 4.0.3+ | Windows (1) 10 UWP / OS X 10.11+ | 2017.2+
Epson BT-200 | Android 4.0.3+ | Windows 7+ / OS X 10.11+ | 2017.2+

Vuforia tools

The following table shows some Vuforia tools that are supported on specific hardware :

Tool | Devices | OS Version
Calibration Assistant | Moverio BT-200, ODG R-7 | Android 4.0.3+
Object Scanner | Samsung Galaxy S8+, Galaxy S8, Galaxy S7, Galaxy S6 | Latest supported OS on the device

Virtual Reality SDK integration support

The following table shows a list of VR SDKs fully integrated into the Unity Editor, and the release version of the Editor where they were first integrated :

VR SDK | Version
Google VR SDK | Unity 2017.2
Cardboard Android SDK | Unity 2017.2
Windows Mixed Reality (HoloLens only) | Unity 2017.2

Graphics API support

The Graphics APIs that are supported by Vuforia are shown in the following table :

Android | iOS | Windows
OpenGL ES 2.0 | OpenGL ES 2.0 | DirectX 11 on Windows 10
OpenGL ES 3.x | OpenGL ES 3.x |
 | Metal (iOS 8+) |

9.2.3 Windows Mixed Reality

For Mixed Reality there is the Windows Mixed Reality (WMR) platform from Microsoft, built around the Windows 10 API, which allows applications to render digital content on holographic and immersive display devices.

1. Holographic and Immersive headsets

- Holographic devices, for example the Microsoft HoloLens, enable you to see the physical environment around you while wearing the headset, mixing the real world with virtual content.
- Windows Mixed Reality immersive headsets feature an opaque display to shut out the physical world and surround you in a 360 degree virtual environment.

9.2.3.1 HoloLens

The HoloLens device is used to run Augmented Reality and Mixed Reality applications.
- While creating applications, developers should consider the differences between HoloLens and Windows immersive devices.
- The applications and APIs used by Windows Mixed Reality applications are also used by HoloLens applications.
- HoloLens renders 3D virtual objects as holograms, creating the illusion that those objects are in the real-world environment.
- The user can interact with these objects via voice commands, gestures and gaze. The objects can also interact with real world surfaces. To holograms you can attach animations, audio and other components.

2. Immersive headsets

- Windows Mixed Reality (WMR) immersive headsets feature an opaque display to block out the physical world and surround you in a 360 degree virtual environment.
- Many Virtual Reality devices, such as the HTC Vive or Oculus Rift, use sensors built into the headsets and external cameras facing towards them to track user movements. This is known as outside-in tracking.
- The majority of WMR immersive headsets use inside-out tracking: instead of using external cameras in the environment, inside-out tracking devices use outward-facing cameras built into the headset for positional tracking.

9.2.3.2 Differences between Holographic and Immersive Devices

The table below presents the main features of holographic and immersive Windows Mixed Reality devices :

Holographic | Immersive
It has a see-through display; it permits you to see the physical environment while wearing the headset. | It has an opaque display that blocks out the physical environment while wearing the headset.
It uses world scale for tracking. | It uses stationary or room scale for tracking.
The tracking type used is inside-out. | The tracking type used is inside-out.

9.2.3.3 Differences between HoloLens and Immersive Devices

Difference | Details
Holograms do not replace reality | Unlike an immersive 3D environment, holograms are additive; the device draws them over the real world.
 | HoloLens applications augment and can interact with your physical environment.
Input paradigms are different | It uses gaze, gesture and voice as the primary input forms.
The HoloLens device is power sensitive | HoloLens is a mobile device, so it needs power management like a phone app. Ensuring you disable unused features and optimize for CPU utilization is more important than on other platforms.
Photo capture and video capture | Immersive Windows Mixed Reality headsets do not allow you to capture pictures and videos, whereas HoloLens permits this. On immersive headsets, using the photo and video capture APIs will capture from a webcam connected to your PC.
Spatial Mapping | HoloLens supports spatial mapping components and APIs. Windows Mixed Reality immersive headsets do not support these.
Tracking Space | Immersive devices work best for standing-scale (stationary) or room-scale experiences, while the HoloLens works at world scale. Users are not limited to a single location, and can walk far beyond their initial location. Applications need to deal with dynamic reference frames and world-locked content.

9.2.3.4 WMR Hardware and Software Requirements

Component | Minimum requirement | Recommended requirement
Operating System | Windows 10 Fall Creators Update (RS3) - Home, Pro, Business, Education | Windows 10 Fall Creators Update (RS3) - Home, Pro, Business, Education
RAM | 8 GB DDR3 dual channel (or better) | 8 GB DDR3 dual channel (or better)
Disk Space | At least 10 GB | At least 10 GB
CPU Processor | Intel Core i5 7200U (7th generation mobile), dual-core with Intel Hyper-Threading Technology enabled (or better) | Intel Core i5 4590 (4th generation), quad-core (or better); AMD Ryzen 5 1400 3.4 GHz (desktop), quad-core (or better)
Graphics Driver | Windows Display Driver Model (WDDM) 2.2 | Windows Display Driver Model (WDDM) 2.2
Graphics Card | Intel HD Graphics 620 (or greater) DX12-capable integrated GPU; if your model is discrete, NVIDIA MX150/965M (or greater) DX12-capable discrete GPU | NVIDIA GTX 960/1050 (or greater) DX12-capable discrete GPU; AMD RX 460/560 (or greater) DX12-capable discrete GPU; the GPU must be hosted in a PCIe 3.0 x4+ Link slot
Bluetooth (for controllers) | Bluetooth 4.0 | Bluetooth 4.0
USB connectivity | USB 3.0 Type-A or Type-C | USB 3.0 Type-A or Type-C
Graphics Display | HDMI 1.4 or DisplayPort 1.2 | HDMI 2.0 or DisplayPort 1.2
| Bluetooth (for controllers) | Bluctooth 4.0 [ Bluetooth 4.0 _ | Graphics Driver Windows Display Driver | Windows Display Driver Model (WDDM) 2.2 Intel® HD | NVIDIA GTX 960/1050 hhics 620 (or greater) | (or greater) DX12-capable capable integrated discrete. GPU. AMD RX {your model is | 460/560 (or greater) DX12- ct NVIDIA | capable discrete GPU. GPU XISO/965M_ (or greater) | must be hosted in a PCle 3.0 | DX12-capable discrete | x44 Link slot. | Gru. Model (WDDM) 2.2 Graphics Card F it — | USB connectivity USB 4.0 Type-A or Type-¢ | Graphics Display Port HDMI 14. or DisplayPort | HDMI 2.0c¢ DisplayPort 12 oo 12 _ = 93 Working on WMR _ a Windows Mixed Reality (WMR) 18 a Windows Applications, and it uses Visual Studio 2017 development environment, s0 there ty no need of an additional SDK installauon 9.3.1 Installation of Tools Tools you need to install ase Visual Studio 2017 and HoLens Emulator and Holographic Templates ~ To imstall the tools follow the Microsoft documentation instructons For detail information visit the Link hiips://developer microsoft com/en-us/window vimine GaliyAbsyclopment overvicw = when you will be done with the installation then setup a Unity Project and Publish it wi Windows Mixed Reality devices 932 Project Set Up Unity project for Windows Mixed Reality 1s same as the Unity project for other platforms with some notavle exceptions. To support the features of mixed reality there is a need | change the settings of camera, performance and Publishing 933 Camera Settings ~ For camera setting make sure that camera used to track the position of the HMD's 1s sc the Maincamera. SO check it by selecting the camera you are using and in the Inspector sce the Tag drop-down, ~ Unity eutomatically applies this tag tothe default Camera in the Scene. (MU BS Comp - Sem-v) 4 Camera Settings for HoloLens form the following steps forthe Hololens camera setting 11, For main camera set the Clear Flags property to Solid Color Instead of Skybox, 2. 
2. Set the Background color to Black.
3. Make sure the Camera's Transform position is set to (0, 0, 0).

Performance Settings

To maximize performance and to reduce power consumption, tune the performance settings. HoloLens uses the Fastest quality setting; set it under Edit > Project Settings > Quality.

Fig. : Setting quality settings for HoloLens

Publishing Settings

To get the important system capabilities for a Windows Mixed Reality application :
1. Go to Player Settings > Publishing Settings.
2. Check the box for each option you want to use from the Capabilities list.

Fig. 9.3.1 : System capability settings

The important capability settings for publishing Windows Mixed Reality applications are given in the following table :

Capability Setting | Immersive headset support | HoloLens support | Description
InternetClient | Y | Y | Gives access to your Internet connection for outgoing connections to the Internet.
InternetClientServer | Y | Y | Gives access to your Internet connection, including incoming unsolicited connections from the Internet. The app can send information to or from your computer through a firewall. WMR requires this for voice recognition.
MusicLibrary | Y | N | Gives access to your music library and playlists, including the capability to add, change, or delete files. This allows you to use audio recording functionality in your WMR app.
PicturesLibrary | Y | Y | Gives access to your picture library, including the capability to add, change, or delete files. This capability also includes picture libraries on HomeGroup computers, along with picture file types on locally connected media servers.
VideosLibrary | Y | N | Gives access to your video libraries, including the capability to add, change, or delete files. This capability also includes video libraries on HomeGroup computers, along with video file types on locally connected media servers.
Capability Setting | Immersive headset support | HoloLens support | Description
WebCam | — | Y | It permits you to use PhotoCapture and VideoCapture functionality in your app.
Microphone | Y | Y | It permits you to use voice recognition functionality in your app.
Bluetooth | Y | Y | Enables Bluetooth communication in your app. WMR requires this to permit you to use Windows Mixed Reality spatial controllers.
SpatialPerception | — | Y | It permits you to use Spatial Mapping in your app.

9.3.6 Exporting a Visual Studio Solution

- After creating the project it is time to test it, so you have to export your project to a Visual Studio solution. To deploy WMR applications, first build a Visual Studio solution with Unity :
  1. Go to File > Build Settings and select Universal Windows Platform from the Platform list.
  2. Click the Switch Platform button at the bottom left of the window to configure the Editor to build for Windows.
- The default settings work for the standard build for Windows Mixed Reality immersive headsets. If you want to build for HoloLens, change the Target Device setting to HoloLens.

Fig. 9.3.2 : Settings window showing Universal Windows Platform default settings

The following table lists the Build Settings available for the Universal Windows Platform and explains their usage :

Setting | Function
Target Device | Select Any Device to build for immersive headsets, and HoloLens to build for HoloLens. This is significant for optimization.
Build Type | Choose from D3D (Direct3D) or XAML. D3D provides faster results than XAML because there is no XAML layer in the app; this builds the app in 3D exclusive space, and you can't switch to a 2D XAML app or modify this after generation. XAML adds an XAML code layer over the app, which allows the user to switch from the 3D app and open a 2D app; XAML code can be modified after generation. The most common example of this is using a touch keyboard for the HoloLens.
SDK | Select the specific version of the Windows 10 SDK your app uses. This is set to Latest installed by default.
Visual Studio Version | Choose the specific version of Visual Studio to generate a solution (.sln) for. This is set to Latest installed by default, which is Visual Studio 2017 if you have it installed.
Build and Run on | The device your application runs on when you click the Build and Run button. This is set to Local Machine by default, and you should not need to change this.
Copy References | Copies UnityPlayer.dll, associated DLLs and data to the solution folder instead of referencing them directly from the installation folder. This requires extra space, but enables the solution to be portable (you can copy it to another machine and build it even if Unity is not installed on that machine).
Unity C# Projects | Includes a version of the scripting files from your project in the generated solution. This setting is only available when you set the Scripting Backend to .NET.
Development Build | Allows you to connect the built project to the Unity debugging and profiling features, and allows developers to test and debug the built project. Enabling this setting also makes the Autoconnect Profiler setting available.
Autoconnect Profiler | Automatically connects the Profiler to the build. This is only available when Development Build is enabled.
Scripts Only Build | This is only available when Development Build is enabled.

To build Windows Mixed Reality applications, it is important to configure the settings properly before building an app :
1. From the Build Settings window, click Player Settings > XR Settings.
2. Enable Virtual Reality Supported.
3. Click the + button on the Virtual Reality Devices list and select Windows Mixed Reality.
4. After finishing the configuration of the Build Settings, click the Build button. Build your project to a new folder and remember the location.

Perform the following steps to deploy your app in Visual Studio :

1. In Visual Studio, open the generated solution file (.sln) located inside the folder where you built your project.
2. Select the device to run the solution on, and change the target platform in the Visual Studio taskbar.
3. Click the dropdown arrow on the right of the Run button (marked by a green arrowhead) and a list of possible devices will become visible.

The following are the 4 options to run and test your Windows Mixed Reality application from Visual Studio :
1) Local Machine
2) Remote Machine
3) Device
4) HoloLens Emulator

1) Local Machine (immersive headsets only)

- Local Machine allows you to build your application and install it to the Mixed Reality Portal on your Windows 10 PC. Once built, your application automatically runs on your PC, and you can test it through your immersive headset.
- You can launch your application again at any time through the Start menu from the Mixed Reality Portal.

2) Remote Machine (HoloLens only)

Remote Machine prompts you to enter the IP address of the HoloLens or other headset you wish to deploy to. When you click Run with Remote Machine chosen, a dialog box asks you for a PIN for the device. To get this PIN :

1. Switch on your HoloLens and go to HoloLens Settings.
4) HoloLens Emulator (HoloLens only)
- HoloLens Emulator permits you to build the Visual Studio project and run the app on the HoloLens emulator. It allows you to test the app and simulate gestures and other inputs on your PC before deploying it to a HoloLens device.

How to include scripts in your generated solution (optional)
1. To include scripting, mark the Unity C# Projects checkbox. This includes the scripting files from your project in the generated solution and permits you to edit and debug your scripts without re-exporting from Unity; you only need to re-export when project settings or content change. This setting is only available when you set the Scripting Backend to .NET. You can use either the IL2CPP or .NET scripting backend for your application.
2. To change the Scripting Backend, go to Player Settings > Other Settings and select the relevant backend in the Configuration section.

WMR Input and Interaction Concepts
- Windows Mixed Reality uses HoloLens or immersive headsets for interaction. Immersive headsets support different types of input, such as spatial controllers. The limitation on HoloLens is that it can use only 3 forms of input, unless extra hardware is used by the application in addition to the headset.
- With Mixed Reality immersive headsets all types of input work. The primary means of interaction on HoloLens are Voice, Gesture and Gaze.

9.4 VR Devices
- Unity provides built-in support for a number of virtual reality devices. Device availability is on a per-platform basis; this means that not every device is supported on every platform.
- There are multiple devices available, so you choose the devices that are needed for your app. Only one device is active at a given time at runtime.
- You can switch between devices at runtime if you have selected multiple device support for your app.
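The "one active device, chosen from a list" behaviour described above can be sketched conceptually. This is a plain-Python toy model, not the Unity API: the function name, the preference list and the supported set are all illustrative assumptions.

```python
def load_first_available(preferred, supported):
    """Pick the first device from the app's preference list that the
    current platform actually supports -- mirroring how a VR SDK list
    is tried in order, with only one device active at a time."""
    for name in preferred:
        if name in supported:
            return name          # first match becomes the active device
    return "None"                # fall back to non-VR rendering

active = load_first_available(
    ["Oculus", "OpenVR", "None"],   # order chosen for the app
    {"OpenVR", "None"},             # what this platform supports
)
print(active)  # -> OpenVR
```

Because only the first supported entry wins, placing "None" last gives the app a graceful non-VR fallback when no headset is available.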
9.4.1 VR Device Information
- XRDevice.family is a string corresponding to the current XRSettings.loadedDeviceName. A family can have multiple models, which have different characteristics. For example, Oculus has DK2, Gear VR, etc. The model can be accessed with XRDevice.model.

9.5 Oculus

9.5.1 Getting Started (Windows)
- You can see the detailed documentation on https://developer.oculus.com/.

9.5.1.1 Getting Started (Gear VR)
- There is no need to install anything extra on your machine to deploy to Gear VR. Follow the instructions at https://resources.developer.samsung.com/050_Gear_VR_and_Gear_360.
- Ensure that you can deploy a Unity app to your Galaxy Note 5, S6, S6 Edge+, or S6 Edge. Perform the following steps :
1. Connect your Android device to your PC/Mac using a micro USB cable.
2. Create a new empty project (menu : File > New Project).
3. Switch your build platform to Android (menu : File > Build Settings).
4. Open the Player Settings (menu : Edit > Project Settings > Player). Select Other Settings and check the Virtual Reality Supported checkbox. Add Oculus to the Virtual Reality SDKs list.
5. Create the folder Plugins/Android/assets under your project's Assets folder.
6. Include an Oculus signature file in your project in the Plugins/Android/assets folder.
7. Build and run. Insert the device into your headset and see the skybox with head tracking.

9.5.1.2 Getting Started (OpenVR)
- To run OpenVR applications Steam is needed, so first install Steam and SteamVR. Once SteamVR is working correctly with your headset, add OpenVR to the list of supported SDKs.
- If you need extra functionality beyond Unity's built-in support, see https://www.assetstore.unity3d.com/en/#!/content/32647.

9.5.2 Input for Oculus
- The Oculus Rift has 3 inputs : one Oculus Remote and two Oculus Touch Controllers.
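These three inputs appear as separate joysticks, and a script typically scans the joystick-name list to see which of them are connected. The sketch below is plain Python, not Unity code: it only illustrates the name-matching logic, using the controller name strings the naming convention assigns.

```python
def detect_oculus_controllers(joystick_names):
    """Scan a joystick-name list (such as one returned by a
    GetJoystickNames-style call) and report which Oculus inputs
    are currently present."""
    found = {"left": False, "right": False, "remote": False}
    for name in joystick_names:
        if name == "Oculus Touch - Left":
            found["left"] = True
        elif name == "Oculus Touch - Right":
            found["right"] = True
        elif name == "Oculus Remote":
            found["remote"] = True
    return found

# A script can poll this periodically to react to controllers
# connecting or disconnecting.
print(detect_oculus_controllers(
    ["Oculus Touch - Left", "Oculus Touch - Right", "Oculus Remote"]))
```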
9.5.3 Naming Convention and Detection
- These 3 inputs behave as separate joysticks. Use the UnityEngine.Input class to read the axis and button values of these inputs.
- After configuration and connection of the Oculus hardware with the operating system, the Oculus Touch Controllers appear in the list returned by UnityEngine.Input.GetJoystickNames() as Oculus Touch - Left and Oculus Touch - Right, and the Oculus Remote appears as Oculus Remote.
- The following diagram shows the Oculus Remote :

Fig. 9.5.1 : Oculus Remote (the figure labels inputs such as Button.DpadUp)

9.6 OpenVR
- To run OpenVR applications Steam is needed. Once SteamVR is working properly with your headset, add OpenVR to the list of supported SDKs.
- OpenVR supports the Windows and macOS platforms.
- On the macOS platform, Unity OpenVR requires the Metal graphics API and a 64-bit application target. It does not support OpenGL. OpenVR supports macOS 10.11.6 or later, but is optimized for macOS 10.13 High Sierra or later.

9.6.1 Input for OpenVR Controllers
- The Unity VR subsystem presents VR controller inputs as separate joysticks. The UnityEngine.Input class is used to access the axis and button values.
- When you refer to a specific axis or button, the Unity integration does not refer to any particular hardware.
- The controllers supported by OpenVR are : Oculus Touch, HTC Vive, and Valve Knuckles Controllers.

9.6.2 Naming Convention and Detection
- The OpenVR controllers are named either OpenVR Controller - Left or OpenVR Controller - Right after configuration and connection. Access these names through the list returned by UnityEngine.Input.GetJoystickNames().
- These controllers appear in green in the SteamVR status menu when they are detected by Steam. It is mandatory to install both Steam and SteamVR on your machine to access this menu. The following diagram shows the SteamVR status menu :

Fig. 9.6.1 : SteamVR status menu

- Do periodic checking through script to see the presence of a controller in the list of joystick names.

9.7 Unity Input for HTC Vive Controllers
- The diagram below displays the different inputs available on HTC Vive controllers for use in VR applications.

Fig. 9.7.1 : HTC Vive controller inputs mapping (Image courtesy of developer.viveport.com)
(labels recoverable from the figure : 3 - System button, 4 - Status light, 6 - Tracking sensor, 7 - Trigger, 8 - Grip button)

9.7.2 Upgrading a Project that Contains the SteamVR Package
First enable VR support. To do this :
1. Open the Player Settings (menu : Edit > Project Settings > Player).
2. Select Other Settings and check the Virtual Reality Supported checkbox.
3. Use the Virtual Reality SDK list displayed below the checkbox to add OpenVR.
4. Enter Play Mode in the Editor to test the build.

Review Questions
Q. 1 Explain Google Daydream ?
Q. 2 Explain Google Cardboard ?
Q. 3 What is Vuforia ?
Q. 4 What is marker based tracking ?
Q. 5 Write a short note on Windows Mixed Reality ?
Q. 6 What is the difference between Holographic and Immersive Devices ?
Q. 7 What is the difference between HoloLens and Immersive Devices ?
Q. 8 Write a short note on Oculus ?
Q. 9 Write a short note on OpenVR ?

Chapter Ends

Total Marks : 75

Q. 1 Attempt any three of the following : (15 Marks)
A. What are function graphs ?
B. How to transform vectors ?
C. Explain Projection and Homogeneous Clip Space with the help of frustum, vertex projection, normalized coordinates, projection equation and normalized depth value ?
D. Explain trigonometry and trigonometric ratios ?
E. Explain the start() and update() methods in a script ?

Q. 2 Attempt any three of the following : (15 Marks)
A. Explain in brief the Theorem of Pythagoras in 2D ?
B. Explain roll, pitch and yaw quaternions ?
C. Explain the tessellation stage ?
D. What is the sine and cosine rule ?
E. What is Vuforia ?

Q. 3 Attempt any three of the following : (15 Marks)
A. Explain Polygonal Shapes ?
B. Write a short note on Matrices ?
C. Define frustum ?
D. What is MR ? Explain its applications ?
E.
Explain Generic Virtual Methods ?

Q. 4 Attempt any three of the following : (15 Marks)
A. Explain 3D coordinates ?
B. Explain homogeneous coordinates ?
C. What is primitive topology ?
D. What is VR ? Explain its applications ?
E. Write down the steps in Unity Installation ?

Game Programming (MU B.Sc. Comp. - Sem-V) M-2 Model Question Paper

Q. 5 Attempt any three of the following : (15 Marks)
A. Explain Euler's Rule ?
B. What is a Quaternion ? Explain addition, subtraction and multiplication of quaternions ?
C. What is the work of the geometry shader stage ?
D. What is interpolation ?
E. Explain Google Daydream ?

Q. 1(a) Multiple Choice Questions : (5 Marks)
1. The relation between vertices, faces and edges of a 3D polygon object is given as
(a) Vertices = faces - edges + 2  (b) Vertices = faces + edges + 2
(c) Vertices = faces - edges - 2  (d) Vertices = faces + edges - 2
Ans. : None of the options. (By Euler's formula, vertices - edges + faces = 2, so Vertices = edges - faces + 2.)
2. The basic building block in a 3D object model is
(a) Rectangle (b) Triangle (c) Polygon (d) Cube
Ans. : Polygon
3. Which one of the following is not a valid geometric transformation ?
(a) Scaling (b) Revolution (c) Rotation (d) Reflection
Ans. : Revolution

Game Programming (MU-B.Sc-Comp) A-2 Appendix

4. The API used in Unity 3D is
(a) OpenGL (b) Direct3D (c) OpenGL ES (d) Proprietary API
Ans. : OpenGL
5. The process of computing pixel color from a projected 3D triangle is known as
(a) Blending (b) Shading (c) Rasterization (d) Positioning
Ans. : Rasterization

Q. 1(b) 1. Mention the four co-ordinate systems used in the graphics pipeline. (1 Mark)
Ans. :
(a) World coordinate system
(b) Modelling coordinate system
(c) Device coordinate system
(d) Screen coordinate system
(e) Viewport coordinate system
(f) Object coordinate system
Q. 1(b) 2. What is the relation between a Quaternion and its inverse. (Section 2.8.4) (1 Mark)
Q. 1(b) 3. State the Pythagoras theorem for 3D. (Section 1.7.1) (1 Mark)
Q. 1(b) 4. Mention the use of interpolant in computer graphics. (Section 5.2) (1 Mark)
Q. 1(b) 5. Define the term Virtual Reality and give its application. (Sections 6.2.2 and 6.2.2.1) (1 Mark)

Q. 1(c) Fill in the blanks taking answers from the pool of values : (5 Marks)
[Controller Swapping, Double, Presenting, Animation, Stencil, Tessellation, Translation, Rigidbody]
1. Interchanging the roles of back buffer and front buffer is called ___. Ans. : Presenting
2. An 8-bit ___ buffer is always attached to the depth buffer. Ans. : Stencil
3. Subdividing the triangles of a mesh to add new triangles is called ___. Ans. : Tessellation
4. ___ components allow game objects to act under the control of the Physics Engine. Ans. : Rigidbody
5. Manipulating images and objects in a dynamic medium as moving images is called ___. Ans. : Animation

Q. 2(1) Define Lambert's law and explain its use in lighting calculation. (Sections 4.11.3, 4.11.4, 4.11.5, and 4.11.6) (5 Marks)

Q. 2(2) Explain in detail the stages in the rendering pipeline. (5 Marks)
Ans. : The Rendering Pipeline Processes :
1. Per-Vertex Operation
2. Primitive Assembly
3. Primitive Processing
4. Rasterization
5. Fragment Processing
6. Per-Fragment Operation
Fig. 1 - Q.2(2)
The Rendering Pipeline processes the data through several stages as follows :
1. Per-Vertex Operation
In this stage vertices are processed by the Vertex Shader. Each vertex is transformed by a space matrix, which changes the 3D coordinate system to a new coordinate system.
2. Primitive Assembly
The processed vertices are taken to primitive assembly, where primitives are constructed by connecting the vertices in a specific order.
3. Primitive Processing
Here the primitives that fall outside the view-volume are clipped and ignored in the next stage. In other words, clipping is performed in this stage.
4. Rasterization
At this stage pixels are tested to see whether they are inside the primitive's perimeter or not. If they are not inside the perimeter they are discarded, and if they are inside the perimeter they are taken to the next stage.
5.
Fragment Processing
A fragment is a set of pixels that approximates the shape of a primitive. When a fragment leaves the rasterization stage it goes to the fragment shader. The shader applies a color or a texture to the pixels within the fragment.
6. Per-Fragment Operation
Several tests, such as the pixel ownership test, scissor test, alpha test, stencil test and depth test, are performed on the fragment. Lastly the pixels are saved in the framebuffer. These are the pixels which you can see on the screen.

Q. 2(3) Describe any two 2D transformations in detail. (Section 2.2) (5 Marks)
Q. 2(4) Bring out the advantages of GPU architecture. (Section 2.14) (5 Marks)
Q. 2(5) Differentiate between supersampling and multisampling techniques. (Section 3.7) (5 Marks)
Q. 2(6) Write a short note on Direct3D feature levels. (Section 3.9) (5 Marks)
Q. 3(1) What are B-Splines ? State their types and advantages. (Section 5.5) (5 Marks)
Q. 3(2) Describe the steps in perspective projection. (Section 2.11) (5 Marks)
Q. 3(3) Explain the procedure of interpolating two vectors. (Section 5.2.3) (5 Marks)
Q. 3(4) Obtain the Hessian Normal form for a straight line. (Section 5.7.4) (5 Marks)
Q. 3(5) Describe the intersection points of two straight lines. (Section 5.8.1) (5 Marks)
Q. 3(6) Write a short note on Quaternions. (Sections 2.8.1 to 2.8.7) (5 Marks)

Q. 4(1) Explain the use of assets and the Asset Store in Unity 3D. (5 Marks)
Ans. : Use of assets :
An Asset is a representation of any item you can use in your game or Project. An Asset may come from a file created outside of Unity, such as a 3D Model, an audio file, an image, or any of the other file types that Unity supports. There are also some Asset types that you can create in Unity, such as a ProBuilder Mesh, an Animator Controller, an Audio Mixer, or a Render Texture.
Common types of assets are :
- Image files
- FBX and Model files
- Meshes and animations
- Audio files
- Standard Assets

Use of the Asset Store :
The Unity Asset Store is home to a growing library of free and commercial Assets created both by Unity Technologies and also members of the community. A wide variety of Assets is available, covering everything from Textures, Models and animations to whole Project examples, tutorials and Editor Extensions. You can access the Assets from a simple interface built into the Unity Editor which allows you to download and import Assets directly into your Project. Unity users can become publishers on the Asset Store, and sell the content they have created.

Q. 4(2) Define HMD and explain any two such devices. (Section 6.5) (5 Marks)
Q. 4(3) What is meant by specular lighting ? (Section 4.11.6) (5 Marks)
Q. 4(4) Explain the term MR and state its applications. (Sections 6.2.3 and 6.2.3.1) (5 Marks)

Q. 4(5) Describe how a material is associated with a game object in Unity 3D. (5 Marks)
Ans. : Materials define how a surface should be rendered, by including references to the Textures it uses, tiling information, color tints and more. The available options for a Material depend on which Shader the Material is using.
To create a new Material, use Assets > Create > Material from the main menu or the Project View context menu. By default, new materials are assigned the Standard Shader, with all map properties empty.
Once the Material has been created, you can apply it to an object and tweak all of its properties in the Inspector. To apply it to an object, just drag it from the Project View to any object in the Scene or Hierarchy.

Setting Material Properties
You can select which Shader you want any particular Material to use. Simply expand the Shader drop-down in the Inspector, and choose your new Shader. The Shader you choose will dictate the available properties to change. The properties can be colors, sliders, textures, numbers, or vectors. If you have applied the Material to an active object in the Scene, you will see your property changes applied to the object in real-time.
There are two ways to apply a Texture to a property :
1. Drag it from the Project View on top of the Texture square.
2. Click the Select button, and choose the texture from the drop-down list that appears.

Material parameters
The Standard Shader presents you with a list of material parameters. These parameters vary slightly depending on whether you have chosen to work in the Metallic workflow mode or the Specular workflow mode. Most of the parameters are the same across both modes, and this page covers all the parameters for both modes.

Rendering Mode
The first material parameter in the Standard Shader is Rendering Mode. This allows you to choose whether the object uses transparency, and if so, which type of blending mode to use.
- Opaque : Is the default, and suitable for normal solid objects with no transparent areas.
- Cutout : Allows you to create a transparent effect that has hard edges between the opaque and transparent areas. In this mode, there are no semi-transparent areas; the texture is either 100% opaque or invisible. This is useful when using transparency to create the shape of materials such as leaves, or cloth with holes and tatters.
- Transparent : Suitable for rendering realistic transparent materials such as clear plastic or glass. In this mode, the material itself will take on transparency values (based on the texture's alpha channel and the alpha of the tint colour); however, reflections and lighting highlights will remain visible at full clarity, as is the case with real transparent materials.
- Fade : Allows the transparency values to entirely fade an object out, including any specular highlights or reflections it may have. This mode is useful if you want to animate an object fading in or out.
It is not suitable for rendering realistic transparent materials such as clear plastic or glass, because the reflections and highlights will also be faded out.

Q. 4(6) Explain the following functions with example : Update() and FixedUpdate(). (5 Marks)
Ans. :
Update()
A game is rather like an animation where the animation frames are generated on the fly. A key concept in games programming is that of making changes to the position, state and behaviour of objects in the game just before each frame is rendered. The Update function is the main place for this kind of code in Unity. Update is called before the frame is rendered and also before animations are calculated.

void Update()
{
    float distance = speed * Time.deltaTime * Input.GetAxis("Horizontal");
    transform.Translate(Vector3.right * distance);
}

FixedUpdate()
The physics engine also updates in discrete time steps, in a similar way to frame rendering. A separate event function called FixedUpdate is called just before each physics update. Since physics updates and frame updates do not occur with the same frequency, you will get more accurate results from physics code if you place it in the FixedUpdate function rather than Update.

void FixedUpdate()
{
    Vector3 force = transform.forward * driveForce * Input.GetAxis("Vertical");
    rigidbody.AddForce(force);
}

Q. 5(1) (5 Marks)
Ans. : A data-parallel computation is a computation that has been parallelized by distributing the data among computing nodes. It can be contrasted with a task-parallel computation, in which the distribution of computing tasks is emphasized as opposed to the data.
One framework to establish data-parallelism is "single instruction, multiple data" (SIMD), in which multiple processors execute the same instructions on different pieces of data. This is the architecture used in GPUs, since it allows flow control to be shared among processors and thus allows more of the hardware to be devoted to execution.
In 3D graphics a scene is represented by a group of separate objects.
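The SIMD idea above, the same instruction applied across many independent data items, can be sketched in plain Python. This is a conceptual illustration, not GPU code: it applies one 2x2 transform to every vertex in a list, which is exactly the kind of per-vertex work a GPU parallelizes.

```python
def transform_vertices(vertices, m):
    """Apply the same 2x2 matrix to every vertex in the list.
    This is the data-parallel pattern: identical instructions over
    many independent data items (here sequentially; a GPU would
    process many vertices at once)."""
    (a, b), (c, d) = m
    return [(a * x + b * y, c * x + d * y) for (x, y) in vertices]

# A triangle referenced as three points in a vertex list.
triangle = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
scale2 = ((2.0, 0.0), (0.0, 2.0))   # uniform scaling matrix
print(transform_vertices(triangle, scale2))
# -> [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
```

No vertex depends on any other, which is why the loop body can be handed to as many processors as are available.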
The objects are geometric forms represented by a set of vertices and a particular type of graphics primitive. The graphics primitives indicate how the vertices are connected to create a shape. Graphics hardware can render a set of individual points, a series of line segments and groups of filled polygons. Many times the surface of a 3D model is represented by a list of triangles, each of which references three points in a list of vertices.
There is a dedicated graphics processor in modern 3D systems called the Graphics Processing Unit (GPU). The GPU executes instructions independently of the CPU. A command is sent by the Central Processing Unit (CPU) to the GPU; then the GPU performs the rendering operation while the CPU continues with other tasks. Such operations are called asynchronous operations.
The geometrical information is given to the rendering library of DirectX or OpenGL. Function calls are used to request the rendering operations. A call can return a considerable amount of time before the GPU has finished rendering the graphics. Normally no problem occurs because of the delay between the issuing of a rendering command and the completion of the rendering operation, but in some cases it is essential to know when drawing completes.
(In other words, to render a pixel on the screen you need to communicate with the GPU, and for this you need a medium; this medium is OpenGL. OpenGL is not a programming language, it is an application programming interface (API) whose purpose is to take data from the CPU to the GPU. Hence, as a computer graphics developer, your task is to send data to the GPU through OpenGL objects.)
An OpenGL extension is also available. It permits the program running on the CPU to discover when a specific set of rendering commands has finished executing on the GPU. However, such synchronization slows down a 3D graphics application, so it is ordinarily avoided when the performance of an application is important.
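The asynchronous command submission and the completion-signalling just described can be modelled in plain Python. This is a toy simulation with threads and a fence event, not a real graphics API: the "CPU" enqueues commands and continues immediately, a "GPU" worker drains the queue, and waiting on the fence is the only point of synchronization.

```python
import queue
import threading

commands = queue.Queue()
frame_done = threading.Event()
framebuffer = []

def gpu_worker():
    # The "GPU": executes commands in submission order, independently
    # of whatever the "CPU" thread is doing meanwhile.
    while True:
        cmd = commands.get()
        if cmd is None:                 # shutdown sentinel
            break
        if cmd == "FENCE":
            frame_done.set()            # signal: all prior commands finished
        else:
            framebuffer.append(cmd)     # "render" the command

gpu = threading.Thread(target=gpu_worker)
gpu.start()

# The "CPU" issues draw calls and returns immediately (asynchronous).
for cmd in ["draw triangle 0", "draw triangle 1"]:
    commands.put(cmd)
commands.put("FENCE")

frame_done.wait()   # block only at the point where we must know drawing completed
commands.put(None)
gpu.join()
print(framebuffer)
```

The cost of synchronization is visible in the model: `frame_done.wait()` stalls the CPU thread, which is why such fences are used sparingly in performance-sensitive applications.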
An application sends commands to a rendering library, for instance OpenGL, to communicate with the GPU; the library in turn sends commands to a driver that knows how to address the GPU in its native language. The interface to OpenGL is known as a Hardware Abstraction Layer (HAL) because it exposes a common set of functions that can be used to render a scene on any graphics hardware that supports the OpenGL design. The driver translates the OpenGL function calls into code that the GPU can understand. A 3D graphics driver usually implements OpenGL functions directly to minimize the overhead of issuing rendering commands. (The block diagram shown in Fig. 2 - Q.5(1) shows the communication between the CPU and GPU.)
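The layering just described, application calling a common HAL-style API, which a driver translates into device-specific commands, can be sketched conceptually. All class and method names below are hypothetical; the point is only that the application never sees the hardware-specific command strings.

```python
class Driver:
    """Toy 'driver': translates abstract rendering calls into
    device-specific command strings (stand-ins for native GPU code)."""
    def __init__(self, gpu_name):
        self.gpu_name = gpu_name
        self.command_stream = []

    def submit(self, opcode, *args):
        native = f"{self.gpu_name}:{opcode}{args}"
        self.command_stream.append(native)

class HAL:
    """Common set of functions usable over any driver underneath --
    the application is written once against this interface."""
    def __init__(self, driver):
        self.driver = driver

    def clear(self):
        self.driver.submit("CLEAR")

    def draw_triangles(self, count):
        self.driver.submit("DRAW_TRI", count)

api = HAL(Driver("gpu_a"))
api.clear()
api.draw_triangles(2)
print(api.driver.command_stream)
```

Swapping in a `Driver("gpu_b")` changes every native command emitted without touching the application code, which is exactly the portability the abstraction layer buys.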
