
HYDERABAD INSTITUTE OF TECHNOLOGY AND MANAGEMENT

VIRTUAL REALITY LAB

LAB MANUAL

YEAR: 2023-2024

COURSE CODE: 21PC6CO14

REGULATIONS: R22

CLASS: B.TECH

BRANCH:

SECTION: III-I SEM

TEAM OF INSTRUCTORS: (OPTIONAL)


HYDERABAD INSTITUTE OF TECHNOLOGY AND MANAGEMENT

Program Outcomes

PO1: Engineering knowledge: Apply the knowledge of mathematics, science, engineering fundamentals, and an engineering specialization to the solution of complex engineering problems.

PO2: Problem analysis: Identify, formulate, review research literature, and analyze complex engineering problems reaching substantiated conclusions using first principles of mathematics, natural sciences, and engineering sciences.

PO3: Design/development of solutions: Design solutions for complex engineering problems and design system components or processes that meet the specified needs with appropriate consideration for the public health and safety, and the cultural, societal, and environmental considerations.

PO4: Conduct investigations of complex problems: Use research-based knowledge and research methods including design of experiments, analysis and interpretation of data, and synthesis of the information to provide valid conclusions.

PO5: Modern tool usage: Create, select, and apply appropriate techniques,
resources, and modern engineering and IT tools including prediction and
modeling to complex engineering activities with an understanding of the
limitations.

PO6: The engineer and society: Apply reasoning informed by the contextual
knowledge to assess societal, health, safety, legal and cultural issues and
the consequent responsibilities relevant to the professional engineering
practice.
PO7: Environment and sustainability: Understand the impact of the professional engineering solutions in societal and environmental contexts, and demonstrate the knowledge of, and need for sustainable development.

PO8: Ethics: Apply ethical principles and commit to professional ethics and
responsibilities and norms of the engineering practice.

PO9: Individual and team work: Function effectively as an individual, and as a member or leader in diverse teams, and in multidisciplinary settings.

PO10: Communication: Communicate effectively on complex engineering activities with the engineering community and with society at large, such as, being able to comprehend and write effective reports and design documentation, make effective presentations, and give and receive clear instructions.

PO11: Project management and finance: Demonstrate knowledge and understanding of the engineering and management principles and apply these to one's own work, as a member and leader in a team, to manage projects and in multidisciplinary environments.

PO12: Life-long learning: Recognize the need for, and have the preparation
and ability to engage in independent and life-long learning in the
broadest context of technological change.
HYDERABAD INSTITUTE OF TECHNOLOGY AND MANAGEMENT

Program Specific Objectives

PSO1: Foundation of mathematical concepts: To use mathematical methodologies to solve problems by applying suitable mathematical analysis, data structures, and algorithms.

PSO2: Foundation of computer systems: The ability to interpret the fundamental concepts and methodology of computer systems, and to understand the functionality of the hardware and software aspects of computer systems.

PSO3: Foundations of AR/VR: The ability to grasp the methodologies of AR/VR systems; possess competent skills and knowledge of the software design process; and demonstrate familiarity and practical proficiency with a broad range of AR/VR concepts, providing new ideas and innovations toward research and technological change.
INDEX

S.No.   Week No.   Name of the Experiment
1       1          Importing 2D/3D assets and UI elements into Unity
2       2          Exploring transformations & animations of 2D/3D models
3       3          Assigning different materials & textures to models
4       4          Design and simulation of lighting, reflections and shadows in a model
5       5          Design and simulation of collision & physics system
6       6          Design and simulation of dynamic particles & sprites systems
7       7          Design and integration of 3D spatial audio and sound effects
8       8          Implementation of VR navigation system (UX)
9       9          Implementation of AR navigation system (UX)
10      10         Implementation of VR hand interaction system (UX)
11      11         Implementation of AR interaction system (UX)
12      12         Exploring rendering pipelines and post-processing systems
13      13         Optimization and exporting a VR software build
WEEK-1

Aim:- Importing 2D/3D assets and UI elements into Unity


Steps:-
1. Launch Unity:

Open Unity Hub and launch the Unity Editor by selecting the appropriate project or creating a new one.

In the Unity Editor, locate the Project window, which shows the contents of your Assets folder. It is usually docked at the bottom of the interface; if it is not visible, open it via Window > General > Project.

2. Prepare Assets:

Before importing assets into Unity, ensure that you have the necessary 2D/3D assets and UI elements ready. These assets could include images, textures, models, animations, sprites, audio files, and any other resources required for your project.

3. Import Assets:

To import assets, simply drag and drop them from your file explorer into the Project window in Unity. Alternatively, you can right-click in the Project window and choose "Import New Asset..." to browse and select assets from your computer.

4. Organize Assets:

After importing assets, organize them into folders within the Project window for better management.

Create folders such as Textures, Models, Sprites, Audio, Scripts, etc., and move the corresponding assets into
their respective folders.

5. Adjust Import Settings (Optional):

Depending on the type of asset, Unity provides various import settings in the Inspector window.

For example, for textures, you can adjust compression settings, texture type, size, etc., to optimize
performance and quality.

6. Preview Assets (Optional):

You can preview 3D models, animations, and textures directly within Unity by selecting them in the Project window.

Unity provides a built-in preview window where you can view and interact with assets before using them in
your scenes.

7. Import UI Elements:

To import UI elements, such as buttons, text fields, or panels, create a Canvas GameObject in your scene.

Then, import UI assets (e.g., sprites, images) and assign them to UI components like Image or Text.
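For reference, the same UI setup can also be created from a script. The following is a minimal sketch; the buttonSprite field and the object names are illustrative placeholders, and any imported sprite can be assigned to the field in the Inspector:

    using UnityEngine;
    using UnityEngine.UI;

    public class UiSetupExample : MonoBehaviour
    {
        // Assign any imported sprite in the Inspector (placeholder field).
        public Sprite buttonSprite;

        void Start()
        {
            // Create a screen-space Canvas with the components UI rendering needs.
            var canvasGo = new GameObject("Canvas", typeof(Canvas), typeof(CanvasScaler), typeof(GraphicRaycaster));
            canvasGo.GetComponent<Canvas>().renderMode = RenderMode.ScreenSpaceOverlay;

            // Create an Image as a child of the Canvas and assign the imported sprite.
            var imageGo = new GameObject("Image", typeof(Image));
            imageGo.transform.SetParent(canvasGo.transform, false);
            imageGo.GetComponent<Image>().sprite = buttonSprite;
        }
    }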

8. Verify Import:

Once assets are imported, verify that they appear correctly in the Project window and that there are no errors or issues reported in the Console window.

9. Save Project:

After importing assets, save your Unity project to ensure that all changes are preserved.
WEEK-2

Aim:- Exploring Transformation & animations of 2D/3D models


Steps:-
1. Launch Unity

- Open Unity Hub and create a new Unity project or open an existing one.

2. Import 2D/3D Models

- Import the 2D/3D models you want to use for the experiment into your Unity project.

3. Create a Scene

- Create a new scene or open an existing one where you want to explore the transformation and animations
of the models.

4. Add Models to the Scene

- Drag and drop the imported 2D/3D models from the Assets folder into the scene view to add them to the
scene.

5. Explore Transformations

- Select a model in the scene view and use the Transform tool in the Inspector window to manipulate its
position, rotation, and scale.

- Experiment with translating, rotating, and scaling the models to understand how these transformations
affect their appearance and behavior.

6. Animate Models

- Create animations for the models using Unity's animation system.

- Open the Animation window, create a new animation clip, and keyframe the desired properties to create
animations.

- Use Animator components and animation controllers to control and manage animations in your scene.

7. Experiment with Animation Parameters:

- Explore the use of animation parameters such as triggers, floats, bools, and integers to control animations
dynamically during runtime.

- Create scripts to manipulate animation parameters based on user input, game events, or other conditions.
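For example, a short script along these lines can drive Animator parameters from user input. It is a minimal sketch, and the parameter names "Speed" and "Jump" are assumptions that must match parameters defined in your Animator Controller:

    using UnityEngine;

    public class AnimationParameterExample : MonoBehaviour
    {
        private Animator animator;

        void Start()
        {
            animator = GetComponent<Animator>();
        }

        void Update()
        {
            // Drive a float parameter from the vertical input axis (assumed parameter "Speed").
            animator.SetFloat("Speed", Mathf.Abs(Input.GetAxis("Vertical")));

            // Fire a trigger parameter on a key press (assumed parameter "Jump").
            if (Input.GetKeyDown(KeyCode.Space))
                animator.SetTrigger("Jump");
        }
    }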

8. Test and Iterate:

- Test the transformations and animations of the models in the scene to ensure they behave as expected.

- Iterate on the design and implementation as needed, making adjustments to the models, animations, or
scripts to achieve the desired results.

9. Add Interactivity (Optional):

- Add interactivity to the scene by implementing user input controls, such as keyboard, mouse, or touch
input, to manipulate the models in real-time.

- Create scripts to respond to user input and update the transformations or animations of the models
accordingly.
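A minimal sketch of such an input script is shown below; attach it to any model in the scene. The E/Q key bindings are arbitrary choices for this example:

    using UnityEngine;

    public class ModelManipulator : MonoBehaviour
    {
        public float rotateSpeed = 90f; // degrees per second
        public float scaleStep = 0.1f;

        void Update()
        {
            // Rotate the model around the Y axis with the horizontal input axis (A/D or arrow keys).
            transform.Rotate(0f, Input.GetAxis("Horizontal") * rotateSpeed * Time.deltaTime, 0f);

            // Scale the model up with E and down with Q.
            if (Input.GetKeyDown(KeyCode.E))
                transform.localScale += Vector3.one * scaleStep;
            if (Input.GetKeyDown(KeyCode.Q))
                transform.localScale -= Vector3.one * scaleStep;
        }
    }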
10. Optimize and Polish:

- Optimize the scene by optimizing the models, animations, and scripts to improve performance and
efficiency.

- Polish the scene by adding visual effects, sound effects, lighting, and other enhancements to make the
experience more immersive and engaging.

11. Document Findings:

- Document your findings, observations, and any challenges encountered during the experiment. Note
down any insights or learnings gained from exploring transformations and animations of 2D/3D models in
Unity.

12. Share Results:

- Share your results with peers, instructors, or colleagues to gather feedback and insights. Present your
findings in a clear and concise manner, including screenshots, videos, or demonstrations if applicable.
WEEK-3
Aim:- Assigning different materials & Textures to models
Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Import Models:

- Import the models you want to assign materials and textures to into your Unity project.

3. Create or Open Scene:

- Create a new scene or open an existing one where you want to work with the models.

4. Add Models to Scene:

- Drag and drop the imported models from the Assets folder into the scene view to add them to the scene.

5. Prepare Materials & Textures:

- Import or create the materials and textures you want to assign to the models. Ensure they are compatible
with Unity.

6. Assign Materials:

- Select a model in the scene view.

- In the Inspector window, locate the Mesh Renderer component.

- Click on the small circle next to the Material property to open the Material Picker window.

- Choose the desired material from the list or create a new one.

- Repeat this process for each model in the scene, assigning different materials as needed.
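Materials can also be assigned at runtime from a script. The following is a minimal sketch, assuming the target object has a MeshRenderer and that a material is dragged onto the public field in the Inspector; the "_Color" property assumes the Built-in Standard shader (URP's Lit shader uses "_BaseColor" instead):

    using UnityEngine;

    public class MaterialAssigner : MonoBehaviour
    {
        // Drag any material from the Project window onto this field.
        public Material newMaterial;

        void Start()
        {
            // Replace the material on this object's renderer.
            var meshRenderer = GetComponent<MeshRenderer>();
            meshRenderer.material = newMaterial;

            // Tint the assigned material ("_Color" is the Standard shader's main color property).
            meshRenderer.material.SetColor("_Color", Color.red);
        }
    }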

7. Apply Textures:

- If the materials require textures, drag and drop the textures onto the corresponding slots in the material
properties.

- Adjust the texture settings as necessary to achieve the desired appearance.

8. Adjust Material Properties:

- Fine-tune the material properties such as color, transparency, shininess, and emission to achieve the
desired visual effects.

- Experiment with different settings to create unique looks for each model.

9. Preview and Test:

- Enter Play mode to preview how the models look with the assigned materials and textures.

- Make any necessary adjustments to the materials or textures based on the preview results.

10. Save and Iterate:

- Save your project and iterate on the design by making further adjustments to the materials, textures, or
models as needed.

- Test different combinations to find the most suitable appearance for your scene.
11. Document Findings:

- Document your findings, observations, and any challenges encountered during the process of assigning
materials and textures to models.

- Note down any insights or learnings gained from experimenting with different material properties and
texture settings.

12. Share Results:

- Share your results with peers, instructors, or colleagues to gather feedback and insights. Present your
findings in a clear and concise manner, including screenshots or visual demonstrations if applicable.
WEEK-4

Aim:- Design and simulation of Lighting, reflections and shadows in a model

Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Import Model:

- Import the model you want to design and simulate lighting, reflections, and shadows for into your Unity
project.

3. Create or Open Scene:

- Create a new scene or open an existing one where you want to work with the model.

4. Add Model to Scene:

- Drag and drop the imported model from the Assets folder into the scene view to add it to the scene.

5. Prepare Lighting Setup:

- Assess the lighting requirements for your scene based on the desired mood, atmosphere, and realism.

- Choose the appropriate type of lighting sources such as directional lights, point lights, spotlights, or area lights.

6. Adjust Light Properties:

- Select the lighting sources in the scene view.

- In the Inspector window, adjust the light properties such as intensity, color, range, and angle to achieve the
desired lighting effects.
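The same properties are exposed in code on the Light component. A minimal sketch, assuming the script is attached to a GameObject that has a Light:

    using UnityEngine;

    public class LightController : MonoBehaviour
    {
        void Start()
        {
            var sceneLight = GetComponent<Light>();
            sceneLight.color = Color.white;
            sceneLight.intensity = 1.5f;
            sceneLight.range = 10f;                  // used by point and spot lights
            sceneLight.shadows = LightShadows.Soft;  // enable soft shadows on this light
        }
    }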

7. Simulate Reflections:

- Enable real-time reflections in your scene to simulate the reflection of light off surfaces.

- Utilize reflection probes or screen space reflection effects to create realistic reflections on materials and
objects.

8. Create Shadows:

- Enable shadows for the lighting sources in your scene to simulate the blocking of light by objects.

- Adjust the shadow properties such as resolution, distance, and softness to control the appearance of shadows.

9. Fine-tune Lighting Setup:

- Experiment with different lighting configurations, positions, and intensities to achieve the desired visual
impact.

- Pay attention to the interaction between lighting, reflections, and shadows to create a cohesive and
immersive environment.

10. Preview and Test:

- Enter Play mode to preview how the lighting, reflections, and shadows affect the appearance of the
model in real-time.

- Make any necessary adjustments to the lighting setup based on the preview results.
11. Save and Iterate:

- Save your project and iterate on the design by making further adjustments to the lighting, reflections, or
shadows as needed.

- Test different lighting scenarios to find the most suitable setup for your scene.

12. Document Findings:

- Document your findings, observations, and any challenges encountered during the process of designing and
simulating lighting, reflections, and shadows in the model.

- Note down any insights or learnings gained from experimenting with different lighting techniques and
effects.

13. Share Results:

- Share your results with peers, instructors, or colleagues to gather feedback and insights. Present your findings
in a clear and concise manner, including screenshots or visual demonstrations if applicable.
WEEK-5

Aim:- Design and simulation of collision & physics system


Steps:-

1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Import Model or Create Objects:

- Import the model you want to use for collision and physics simulation into your Unity project.

- Alternatively, create simple geometric objects or primitives within Unity to serve as colliders and
dynamic objects.

3. Set Up Scene:

- Create a new scene or open an existing one where you want to work with the collision and physics
system.

4. Add Objects to Scene:

- Place the models or objects you imported or created into the scene view to set up the environment for
collision and physics simulation.

5. Configure Colliders:

- Add collider components to the objects in the scene to define their collision boundaries.

- Choose the appropriate type of collider (e.g., box collider, sphere collider, capsule collider) based on the
shape of the object.

6. Set Up Physics Material (Optional):

- If desired, create and assign physics materials to the colliders to control properties such as friction, bounciness, and the friction/bounce combine modes.

7. Configure Rigidbody Components:

- Add Rigidbody components to the objects that require dynamic physics behavior, such as gravity, forces,
and collisions.

- Adjust the Rigidbody properties such as mass, drag, and angular drag to control the object's response to
physics forces.

8. Apply Forces or Impulses (Optional):

- Use scripts or built-in Unity functions to apply forces or impulses to the Rigidbody objects to simulate
interactions such as pushing, pulling, or launching.
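For instance, the short sketch below applies a one-off impulse to a Rigidbody when a key is pressed and logs collisions to the Console; attach it to an object that has a Rigidbody and a collider:

    using UnityEngine;

    public class ImpulseExample : MonoBehaviour
    {
        public float launchForce = 5f;
        private Rigidbody rb;

        void Start()
        {
            rb = GetComponent<Rigidbody>();
        }

        void Update()
        {
            // Launch the object upward with an impulse when Space is pressed.
            if (Input.GetKeyDown(KeyCode.Space))
                rb.AddForce(Vector3.up * launchForce, ForceMode.Impulse);
        }

        void OnCollisionEnter(Collision collision)
        {
            // Log collisions so the physics interactions can be observed.
            Debug.Log("Collided with " + collision.gameObject.name);
        }
    }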

9. Set Up Constraints (Optional):

- Configure constraints on the Rigidbody objects to restrict their movement along specific axes or to limit
rotation.

10. Test Collision and Physics Interactions:


- Enter Play mode to test the collision and physics interactions between objects in the scene.

- Observe how objects react to collisions, gravity, and applied forces, and make adjustments as needed.

11. Fine-tune Physics Parameters:

- Experiment with different physics settings and parameters to achieve the desired behavior and realism in
the simulation.

- Adjust properties such as gravity strength, friction coefficients, and collision detection modes to optimize
the physics system.

12. Preview and Test:

- Continuously preview and test the collision and physics interactions to ensure they meet the requirements
and expectations of the simulation.

13. Save and Iterate:

- Save your project and iterate on the design by making further adjustments to the collision and physics
system as needed.

- Test different scenarios and edge cases to validate the robustness and accuracy of the simulation.

14. Document Findings:

- Document your findings, observations, and any challenges encountered during the process of designing
and simulating collision and physics systems.

- Note down any insights or learnings gained from experimenting with different collision shapes, physics
materials, and Rigidbody properties.

15. Share Results:

- Share your results with peers, instructors, or colleagues to gather feedback and insights. Present your
findings in a clear and concise manner, including screenshots or visual demonstrations if applicable.
WEEK-6
Aim:- Design and simulation of Dynamic Particles & Sprites systems
Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Create or Import Particle System:

- Create a new Particle System GameObject or import a pre-made particle system asset into your Unity
project.

3. Set Up Particle Emitter:

- Configure the properties of the particle emitter to define the emission rate, shape, size, and direction of the
particles.

- Experiment with different emission shapes such as sphere, cone, box, or mesh to achieve the desired
particle distribution.

4. Adjust Particle Parameters:

- Fine-tune the particle parameters such as lifetime, speed, size, color, rotation, and opacity to customize
the appearance and behavior of the particles.

- Use curves and gradients to create variations in particle properties over time or distance.

5. Add Forces and Turbulence:

- Apply forces and turbulence to the particle system to simulate dynamic effects such as wind, gravity,
vortex, or attraction.

- Adjust the strength and direction of the forces to control the movement and behavior of the particles.
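Most of these properties can also be driven from code through the particle system's modules. A minimal sketch, assuming the script sits on the Particle System GameObject; the values are arbitrary examples:

    using UnityEngine;

    public class ParticleTuner : MonoBehaviour
    {
        void Start()
        {
            var ps = GetComponent<ParticleSystem>();

            // Slow the particles and shorten their lifetime.
            var main = ps.main;
            main.startSpeed = 2f;
            main.startLifetime = 3f;

            // Raise the emission rate.
            var emission = ps.emission;
            emission.rateOverTime = 50f;

            // Apply a constant sideways force to approximate wind.
            var force = ps.forceOverLifetime;
            force.enabled = true;
            force.x = new ParticleSystem.MinMaxCurve(1.5f);
        }
    }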

6. Integrate Sprite Sheets (Optional):

- If using sprite-based particles, import sprite sheets or individual sprites into your Unity project to serve as
particle textures.

- Configure the particle material to use the sprite texture and adjust the UV mapping settings as needed.

7. Animate Sprites (Optional):

- If desired, create animations for the sprite textures to add dynamic movement or effects to the particles.

- Use Unity's Animation or Animator components to create sprite animations and control playback.
8. Set Up Collision Detection:

- Enable collision detection for the particle system to interact with other objects in the scene.

- Configure collision parameters such as bounce, friction, and trigger events to control the outcome of
collisions.

9. Fine-tune Particle System Behavior:

- Experiment with different settings and parameters to achieve the desired visual effects and interactions.

- Iterate on the design by adjusting particle properties, forces, and collision settings to enhance realism and
aesthetic appeal.

10. Preview and Test:

- Enter Play mode to preview how the dynamic particles and sprites system behaves in real-time.

- Observe the movement, appearance, and interactions of the particles, and make adjustments as needed.

11. Save and Iterate:

- Save your project and iterate on the design by making further adjustments to the particle system as
needed.

- Test different scenarios and configurations to explore the full range of possibilities offered by dynamic
particles and sprites.

12. Document Findings:

- Document your findings, observations, and any challenges encountered during the process of designing
and simulating dynamic particles and sprites systems.

- Note down any insights or learnings gained from experimenting with different particle parameters,
forces, and collision settings.

13. Share Results:

- Share your results with peers, instructors, or colleagues to gather feedback and insights. Present your
findings in a clear and concise manner, including screenshots or visual demonstrations if applicable.
WEEK-7

Aim:- Design and integration of 3D Spatial audio and sound effects


Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Set Up Scene:

- Create a new scene or open an existing one where you want to integrate 3D spatial audio and sound
effects.

3. Import Audio Assets:

- Import the audio files or sound effects you want to use into your Unity project. Ensure they are in a
compatible format such as WAV or MP3.

4. Organize Audio Assets:

- Organize the audio assets in the project folder structure to maintain a clear and manageable hierarchy.

5. Configure Audio Listener:

- Ensure there is an Audio Listener component in the scene to capture and process audio for the player's
perspective.

6. Set Up Audio Sources:

- Add Audio Source components to the GameObjects in the scene that emit sound or require spatial audio
effects.

- Adjust the properties of the Audio Source such as volume, pitch, and spatial blend to control the audio
playback.

7. Enable 3D Spatialization:

- Enable 3D spatialization for the Audio Sources to simulate realistic positional audio effects based on the
listener's position and orientation.

- Adjust the spatial blend property to control the balance between 2D and 3D audio spatialization.
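These settings are exposed on the AudioSource component in code as well. A minimal configuration sketch, assuming the script is attached to a GameObject with an AudioSource and an assigned clip:

    using UnityEngine;

    public class SpatialAudioSetup : MonoBehaviour
    {
        void Start()
        {
            var source = GetComponent<AudioSource>();

            source.spatialBlend = 1f;                           // 0 = 2D, 1 = fully 3D positional audio
            source.rolloffMode = AudioRolloffMode.Logarithmic;  // natural distance attenuation
            source.minDistance = 1f;                            // full volume inside this radius
            source.maxDistance = 20f;                           // attenuated out to this radius
            source.dopplerLevel = 1f;                           // pitch shift for moving sources
            source.loop = true;
            source.Play();
        }
    }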
8. Adjust Audio Settings:

- Configure additional audio settings such as rolloff distance, doppler level, and spread to fine-tune the
spatial audio effects and optimize performance.

9. Implement Sound Effects:

- Integrate sound effects into the scene to enhance immersion and interactivity. Consider using audio
triggers or events to synchronize sound playback with specific actions or events in the game.

10. Design Ambient Audio:

- Design ambient audio layers or background music to create atmosphere and mood in the scene. Use audio
loops or crossfading techniques to seamlessly transition between different audio tracks.

11. Test and Iterate:

- Enter Play mode to test the spatial audio and sound effects in the scene. Move the listener around to
observe how the audio changes based on position and orientation.

- Make adjustments to the audio settings, spatialization parameters, and sound effects based on feedback
and testing results.

12. Fine-tune Audio Balance:

- Balance the audio levels of different sources and effects to ensure they complement each other and
contribute to the overall audio experience.

- Pay attention to the mix of background music, ambient audio, sound effects, and dialogue to create a
cohesive and immersive soundscape.

13. Preview and Test:

- Continuously preview and test the spatial audio and sound effects to ensure they meet the requirements
and expectations of the scene.

- Consider testing on different audio playback devices and environments to ensure compatibility and
consistency.

14. Save and Iterate:

- Save your project and iterate on the design by making further adjustments to the audio settings and
integration as needed.

- Test different scenarios and configurations to explore the full potential of 3D spatial audio and sound
effects.

15. Document Findings:

- Document your findings, observations, and any challenges encountered during the process of designing
and integrating 3D spatial audio and sound effects.

- Note down any insights or learnings gained from experimenting with different audio techniques and
effects.
16. Share Results:

- Share your results with peers, instructors, or colleagues to gather feedback and insights. Present your
findings in a clear and concise manner, including audio samples or demonstrations if applicable.

WEEK-8

Aim:- Implementation of VR navigation system (UX)


Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Set Up Scene:

- Create a new scene or open an existing VR scene where you want to implement the navigation system.

3. Import VR SDK:

- Ensure you have the appropriate VR SDK installed and configured in your Unity project. Common VR
SDKs include Oculus, SteamVR, and Google VR.

4. Add VR Player Controller:

- Add a VR Player Controller GameObject to the scene to represent the player's virtual presence.

- Configure the player controller to support the desired input methods (e.g., VR controllers, keyboard,
mouse).

5. Design Navigation Interface:

- Design a user-friendly navigation interface that allows players to move and interact within the VR
environment.

- Consider using a combination of teleportation, locomotion, or point-and-click navigation methods for intuitive interaction.

6. Implement Teleportation:

- Implement teleportation functionality to allow players to move between locations in the VR environment.

- Place teleportation nodes or markers at strategic locations within the scene to indicate valid teleportation
destinations.
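A simplified teleportation sketch is shown below. It uses a plain Physics.Raycast from a pointer transform; in a real project the pointer transform and the button query would come from your VR SDK's controller APIs, so the "Fire1" binding and the field names here are placeholders:

    using UnityEngine;

    public class SimpleTeleporter : MonoBehaviour
    {
        public Transform playerRig;     // root of the VR player rig (placeholder)
        public Transform pointer;       // controller or gaze transform used for aiming (placeholder)
        public LayerMask teleportMask;  // layers marked as valid teleport surfaces

        void Update()
        {
            // "Fire1" stands in for the SDK's teleport button.
            if (Input.GetButtonDown("Fire1") &&
                Physics.Raycast(pointer.position, pointer.forward, out RaycastHit hit, 20f, teleportMask))
            {
                // Move the whole rig so the player appears at the hit point.
                playerRig.position = hit.point;
            }
        }
    }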

7. Set Up Locomotion System:


- If using locomotion-based movement, implement a locomotion system that provides smooth and
comfortable movement within the VR space.

- Experiment with different locomotion techniques such as smooth locomotion, blink teleportation, or arm
swinging to find the most suitable option.

8. Integrate Point-and-Click Navigation:

- If applicable, integrate point-and-click navigation functionality to enable players to interact with objects
or elements within the VR environment.

- Implement raycasting or collision detection to determine the target location of the player's click or
interaction.

9. Test Navigation Controls:

- Enter Play mode to test the navigation controls and user experience within the VR scene.

- Ensure that the navigation system is responsive, intuitive, and comfortable for players to use.

10. Optimize for Comfort and Performance:

- Optimize the navigation system to minimize motion sickness and discomfort for players, especially
during extended VR sessions.

- Consider factors such as movement speed, acceleration, field of view, and visual feedback to enhance
comfort and performance.

11. Iterate Based on Feedback:

- Gather feedback from playtesting sessions and user evaluations to identify areas for improvement in the
VR navigation system.

- Iterate on the design and implementation based on user feedback to enhance usability and user
satisfaction.

12. Document Design Decisions:

- Document the design decisions, implementation details, and considerations made during the development
of the VR navigation system.

- Note down any challenges encountered and solutions implemented to address them.
WEEK-9

Aim:- Implementation of AR navigation system (UX)


Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Set Up Scene:

- Create a new scene or open an existing AR scene where you want to implement the navigation system.

3. Import AR SDK:

- Ensure you have the appropriate AR SDK installed and configured in your Unity project. Common AR
SDKs include ARCore and ARKit.

4. Add AR Session and AR Session Origin:

- Add an AR Session GameObject and an AR Session Origin GameObject to the scene to manage the AR
session and anchor objects in the real world.

5. Design Navigation Interface:

- Design a user-friendly navigation interface that overlays AR elements onto the real-world environment.

- Consider using visual cues such as arrows, waypoints, or markers to guide users to their destinations.

6. Implement Wayfinding System:

- Implement a wayfinding system that calculates routes and provides directions to users within the AR
environment.

- Use AR raycasting or plane detection to determine the user's location and orientation relative to the
environment.
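With AR Foundation, plane raycasting looks roughly like the sketch below. It assumes an ARRaycastManager component on the AR Session Origin, and the waypoint prefab is a stand-in for your own wayfinding markers:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class ARWaypointPlacer : MonoBehaviour
    {
        public ARRaycastManager raycastManager;  // on the AR Session Origin
        public GameObject waypointPrefab;        // marker dropped on detected planes

        private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

        void Update()
        {
            if (Input.touchCount == 0) return;
            Touch touch = Input.GetTouch(0);
            if (touch.phase != TouchPhase.Began) return;

            // Raycast against detected planes at the touch position.
            if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
            {
                // Place a waypoint marker at the first hit pose.
                Pose pose = hits[0].pose;
                Instantiate(waypointPrefab, pose.position, pose.rotation);
            }
        }
    }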
7. Integrate Location Services:

- Integrate location-based services (GPS, Wi-Fi, etc.) to determine the user's current location and
destination within the AR navigation system.

- Use geospatial data or mapping APIs to retrieve information about points of interest, landmarks, and other
relevant locations.

8. Overlay Navigation UI:

- Overlay navigation UI elements onto the AR scene to display route information, directions, and points of
interest to users.

- Ensure the UI elements are clear, legible, and easy to understand in the context of the user's surroundings.

9. Implement Gestural Interaction:

- Implement gestural interaction methods to allow users to interact with the AR navigation system using
gestures or touch input.

- Design intuitive gestures for actions such as zooming, panning, and selecting waypoints or destinations.

10. Test AR Navigation Controls:

- Enter Play mode to test the AR navigation controls and user experience within the AR scene.

- Ensure that the navigation system is responsive, accurate, and user-friendly in various real-world
environments.

11. Optimize for Real-world Conditions:

- Optimize the AR navigation system to perform reliably in different lighting conditions, environments,
and device specifications.

- Adjust rendering settings, lighting conditions, and object occlusion to enhance visibility and usability in
the real world.

12. Iterate Based on Feedback:

- Gather feedback from user testing sessions and usability studies to identify areas for improvement in the
AR navigation system.

- Iterate on the design and implementation based on user feedback to enhance usability and user
satisfaction.

13. Document Design Decisions:

- Document the design decisions, implementation details, and considerations made during the development
of the AR navigation system.

- Note down any challenges encountered and solutions implemented to address them.

14. Share Results:


- Share your results and findings with peers, instructors, or colleagues to gather feedback and insights.
Present your navigation system in action, highlighting its features and capabilities.

WEEK-10

Aim:- Implementation of VR hand interaction system (UX)


Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Set Up Scene:

- Create a new scene or open an existing VR scene where you want to implement the hand interaction
system.

3. Import VR SDK:

- Ensure you have the appropriate VR SDK installed and configured in your Unity project. Common VR
SDKs include Oculus, SteamVR, and Google VR.

4. Add VR Player Controller:

- Add a VR Player Controller GameObject to the scene to represent the player's virtual presence.

- Configure the player controller to support hand tracking or hand-held controllers for interaction.

5. Enable Hand Tracking:

- If using hand tracking, enable hand tracking functionality provided by the VR SDK or third-party plugins.

- Configure hand tracking settings such as hand pose estimation, finger tracking, and gesture recognition.

6. Design Interaction Elements:

- Design virtual interaction elements such as buttons, switches, levers, and objects that players can interact
with using their hands.
- Ensure that interaction elements are visually distinct, responsive to user input, and intuitively designed for
hand manipulation.

7. Implement Hand Interactions:

- Implement hand interaction scripts and logic to handle user input and interactions with virtual objects.

- Use physics-based interactions or custom scripting to simulate realistic hand-object interactions such as
grabbing, pushing, pulling, and throwing.
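A bare-bones, SDK-agnostic grab sketch is shown below. It assumes the hand object carries a kinematic Rigidbody plus a trigger collider, grabbable objects have Rigidbodies, and "Fire1" stands in for the controller's grip button:

    using UnityEngine;

    public class SimpleGrabber : MonoBehaviour
    {
        private Rigidbody hovered;  // object currently within reach
        private Rigidbody held;     // object currently grabbed

        void OnTriggerEnter(Collider other)
        {
            if (other.attachedRigidbody != null)
                hovered = other.attachedRigidbody;
        }

        void OnTriggerExit(Collider other)
        {
            if (other.attachedRigidbody == hovered)
                hovered = null;
        }

        void Update()
        {
            // Grab: parent the object to the hand and let the hand drive it.
            if (Input.GetButtonDown("Fire1") && hovered != null)
            {
                held = hovered;
                held.isKinematic = true;
                held.transform.SetParent(transform);
            }
            // Release: detach and hand the object back to the physics system.
            if (Input.GetButtonUp("Fire1") && held != null)
            {
                held.transform.SetParent(null);
                held.isKinematic = false;
                held = null;
            }
        }
    }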

8. Provide Visual Feedback:

- Provide visual feedback to users to indicate when their hands are within reach of interactive objects and
when interactions are detected.

- Use highlighting, animation, or particle effects to convey object states and interaction outcomes.

9. Optimize for Performance:

- Optimize the hand interaction system to maintain smooth performance and responsiveness, especially in
complex scenes with multiple interactive objects.

- Minimize rendering overhead, physics calculations, and script execution to ensure a consistent user
experience.

10. Test Hand Interactions:

- Enter Play mode to test the hand interaction controls and user experience within the VR scene.

- Verify that hand interactions are accurate, responsive, and comfortable for users to perform various tasks
and manipulations.

11. Iterate Based on Feedback:

- Gather feedback from playtesting sessions and user evaluations to identify areas for improvement in the
hand interaction system.

- Iterate on the design and implementation based on user feedback to enhance usability and user
satisfaction.

12. Document Design Decisions:

- Document the design decisions, implementation details, and considerations made during the development
of the VR hand interaction system.

- Note down any challenges encountered and solutions implemented to address them.

13. Share Results:

- Share your results and findings with peers, instructors, or colleagues to gather feedback and insights.
Present your hand interaction system in action, highlighting its features and capabilities.
WEEK-11

Aim:- Implementation of AR interaction system (UX)


Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Set Up Scene:

- Create a new scene or open an existing AR scene where you want to implement the interaction system.

3. Import AR SDK:

- Ensure you have the appropriate AR SDK installed and configured in your Unity project. Common AR
SDKs include ARCore and ARKit.

4. Add AR Session and AR Session Origin:

- Add an AR Session GameObject and an AR Session Origin GameObject to the scene to manage the AR
session and anchor objects in the real world.

5. Enable AR Interaction Features:

- Enable AR interaction features provided by the AR SDK or third-party plugins, such as plane detection,
object tracking, and image recognition.

- Configure interaction settings to define how users will interact with AR elements in the scene.

6. Design Interactive Elements:


- Design virtual interactive elements such as buttons, menus, 3D objects, and information overlays that
users can interact with in the AR environment.

- Ensure that interactive elements are visually appealing, responsive to user input, and intuitively designed
for AR interaction.

7. Implement Interaction Logic:

- Implement interaction scripts and logic to handle user input and interactions with virtual objects in the AR
environment.

- Use event triggers, raycasting, or physics-based interactions to detect user interactions and respond
accordingly.
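As an illustration, the sketch below selects a virtual object by raycasting from a screen tap. It assumes the placed AR objects carry colliders; the color change is simply a visible stand-in for your own interaction response:

    using UnityEngine;

    public class TapSelector : MonoBehaviour
    {
        void Update()
        {
            if (Input.touchCount == 0) return;
            Touch touch = Input.GetTouch(0);
            if (touch.phase != TouchPhase.Began) return;

            // Cast a ray from the camera through the touch point into the scene.
            Ray ray = Camera.main.ScreenPointToRay(touch.position);
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                // Highlight whatever object was tapped, if it has a Renderer.
                var hitRenderer = hit.collider.GetComponent<Renderer>();
                if (hitRenderer != null)
                    hitRenderer.material.color = Color.yellow;
            }
        }
    }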

8. Provide Visual Feedback:

- Provide visual feedback to users to indicate when interactive elements are within reach and when
interactions are detected.

- Use animation, sound effects, or UI feedback to convey object states, selection, and interaction outcomes.

9. Optimize for Performance:

- Optimize the AR interaction system to maintain smooth performance and responsiveness, especially in
AR scenes with complex 3D models and interactions.

- Minimize rendering overhead, object occlusion, and processing time to ensure a seamless user experience.

10. Test AR Interactions:

- Enter Play mode to test the AR interaction controls and user experience within the AR scene.

- Verify that AR interactions are accurate, responsive, and intuitive for users to perform various tasks and
interactions.

11. Iterate Based on Feedback:

- Gather feedback from playtesting sessions and user evaluations to identify areas for improvement in the
AR interaction system.

- Iterate on the design and implementation based on user feedback to enhance usability and user
satisfaction.

12. Document Design Decisions:

- Document the design decisions, implementation details, and considerations made during the development
of the AR interaction system.

- Note down any challenges encountered and solutions implemented to address them.

13. Share Results:

- Share your results and findings with peers, instructors, or colleagues to gather feedback and insights. Present your AR interaction system in action, highlighting its features and capabilities.
WEEK-12

Aim:- Exploring rendering pipelines and post-processing systems


Steps:-
1. Launch Unity:

- Open Unity Hub and create a new Unity project or open an existing one.

2. Set Up Scene:

- Create a new scene or open an existing scene where you want to explore rendering pipelines and post-processing
effects.

3. Select Rendering Pipeline:

- Choose the rendering pipeline you want to explore: the Built-in Render Pipeline, the Universal Render Pipeline (URP), or the High Definition Render Pipeline (HDRP).

- Select the rendering pipeline from the project settings or create a new project with the desired pipeline preset.

4. Configure Rendering Settings:

- Configure rendering settings specific to the selected rendering pipeline, such as shadow quality, texture compression,
and render scale.

- Adjust rendering settings based on your project requirements and target platform.

5. Enable Post-processing Stack:

- If using the Built-in Render Pipeline, enable the Post-processing Stack package in the Package Manager to access
post-processing effects.

- For URP or HDRP, post-processing support is included by default; ensure it is configured in the project settings.
6. Add Post-processing Volume:

- Add a Post-processing Volume GameObject to the scene to apply post-processing effects globally or locally within
specific areas.

- Configure the Post-processing Volume settings to control the intensity, blending, and priority of post-processing
effects.

7. Explore Post-processing Effects:

- Experiment with different post-processing effects available in the Post-processing Stack or URP/HDRP, such as
bloom, color grading, ambient occlusion, and depth of field.

- Adjust the parameters of each post-processing effect to see how they affect the visual appearance of the scene.
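Post-processing parameters can also be adjusted from code. The sketch below targets URP's Volume system (the API differs for the Built-in pipeline's Post-processing Stack) and assumes the scene Volume's profile already contains a Bloom override:

    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;

    public class BloomTweaker : MonoBehaviour
    {
        public Volume volume;  // the global post-processing Volume in the scene

        void Start()
        {
            // Fetch the Bloom override from the volume profile, if present.
            if (volume.profile.TryGet<Bloom>(out var bloom))
            {
                bloom.intensity.overrideState = true;
                bloom.intensity.value = 2f;  // exaggerate bloom so the effect is easy to see
            }
        }
    }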

8. Compare Rendering Pipelines:

- Compare the visual quality and performance of different rendering pipelines by switching between them in the
project settings.

- Evaluate the rendering features, optimizations, and compatibility of each pipeline with your project requirements.

9. Test Performance Impact:

- Measure the performance impact of post-processing effects and rendering pipeline settings on frame rate and
resource usage.

- Use profiling tools and performance metrics to identify bottlenecks and optimize rendering performance.

10. Capture Results:

- Capture screenshots or record video footage of the scene with different rendering pipeline configurations and post-processing effects applied.

- Document your observations, preferences, and findings for each rendering pipeline and post-processing setup.

11. Iterate Based on Feedback:

- Gather feedback from peers, instructors, or colleagues on the visual quality and performance of the scene with
different rendering pipelines and post-processing effects.

- Iterate on the scene design and rendering settings based on feedback to achieve the desired visual aesthetics and
performance balance.

12. Document and Share Learnings:

- Document your exploration process, including the rendering pipelines tested, post-processing effects applied, and
performance results obtained.

- Share your learnings, insights, and recommendations with others through presentations, reports, or online forums to
contribute to the community's knowledge.
WEEK-13

Aim:- Optimization and exporting a VR software build


Steps:-
1. Finalize VR Scene:

- Ensure that your VR scene is complete, including all necessary assets, interactions, and UI elements.

2. Optimize Assets:

- Optimize 3D models, textures, and other assets to reduce polygon count, texture resolution, and file size
without sacrificing visual quality.

- Use LOD (Level of Detail) models, texture compression, and mesh simplification techniques to improve
performance.

3. Adjust Lighting and Effects:

- Fine-tune lighting settings and effects to balance visual quality and performance in the VR scene.

- Optimize real-time lighting, shadows, reflections, and post-processing effects for optimal VR rendering.

4. Implement Performance Enhancements:

- Implement performance enhancements such as occlusion culling, frustum culling, and object pooling to
optimize runtime performance.

- Identify and eliminate performance bottlenecks through profiling and optimization techniques.
5. Test on Target Hardware:

- Test the VR scene on target VR hardware devices (e.g., Oculus Rift, HTC Vive) to ensure compatibility
and performance.

- Identify any performance issues or compatibility issues specific to the target hardware platform.

6. Adjust VR Settings:

- Adjust VR settings such as rendering resolution, refresh rate, and field of view to optimize performance
and comfort for users.

- Fine-tune VR locomotion mechanics, input controls, and comfort settings based on user feedback.

7. Enable VR Build Support:

- Enable VR build support in Unity for the target VR platform (e.g., Oculus VR, SteamVR) through project
settings.

- Configure VR SDK settings and input mappings for seamless integration with the VR hardware.

8. Build VR Software:

- Build the VR software project for the target VR platform using the appropriate build settings and platform-specific configurations.

- Generate a standalone executable or package file that can be installed and run on the target VR hardware
device.
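Builds can also be produced from an editor script using BuildPipeline. The sketch below is a minimal example; the scene path and output path are placeholders for your own project layout, and the Android target stands in for a Quest-style device (use StandaloneWindows64 for PC VR):

    using UnityEditor;  // editor-only: place this script in an "Editor" folder

    public static class VRBuildExample
    {
        [MenuItem("Build/Build VR Player")]
        public static void BuildPlayer()
        {
            var options = new BuildPlayerOptions
            {
                scenes = new[] { "Assets/Scenes/Main.unity" },  // placeholder scene path
                locationPathName = "Builds/MyVRApp.apk",        // placeholder output path
                target = BuildTarget.Android,
                options = BuildOptions.None
            };
            BuildPipeline.BuildPlayer(options);
        }
    }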

9. Test VR Software Build:

- Test the VR software build on the target VR hardware device to ensure functionality, performance, and
user experience.

- Verify that all VR interactions, UI elements, and features work as intended in the final build.

10. Optimize Build Size:

- Further optimize the VR software build size by removing unnecessary assets, textures, and code modules.

- Compress and package assets efficiently to minimize download and installation time for end users.

11. Package and Distribute:

- Package the optimized VR software build into a distributable format (e.g., APK for Oculus Quest, EXE
for PC VR) ready for distribution.

- Distribute the VR software build through official app stores, online platforms, or direct downloads,
following platform-specific guidelines.

12. Document Optimization Process:

- Document the optimization process, including techniques used, performance improvements achieved, and
lessons learned.
- Create documentation or tutorials for future reference and sharing with others in the VR development
community.

13. Collect Feedback and Iterate:

- Collect feedback from users and stakeholders on the VR software build, focusing on performance,
usability, and overall experience.

- Iterate on the VR software build based on feedback, addressing any issues or areas for improvement
identified during testing and usage.
