Minor Project-I Report
On
Data Compression in Backbone Network
Bachelor of Technology
in
COMPUTER SCIENCE & ENGINEERING
Submitted By
Mohammad Saif
Roll No. 0208CS223D09
PREFACE
Minor Project-I is an integral part of the Bachelor of Engineering programme, and every student has to undertake the Minor Project in the 5th Semester while studying at the Institute.
This report concerns our practical Minor Project-I during the 5th Semester, i.e. the third year of the B.Tech. course. We have taken our Minor Project-I on Data Compression in Backbone Network. During this Minor Project-I, we learned many new things about the technology and its practical implementation. This Minor Project-I proved to be a milestone in our knowledge of the present environment. Every day and every moment was an experience in itself, an experience which theoretical study cannot provide.
ACKNOWLEDGEMENT
It is my pleasure to be indebted to the various people who directly or indirectly contributed to the development of this work and who influenced my thinking, behaviour and actions during the course of study.
I express my sincere gratitude to our principal, Dr. Ajay Kumar Lala, Principal, Gyan Ganga College of Technology, Jabalpur, for providing me an opportunity to undergo Minor Project-I in Data Compression in Backbone Network.
I am thankful to Dr. Vimmi Pandey, HOD, Department of Computer Science and Engineering, for her support, cooperation and motivation during the Minor Project, and for her constant inspiration, presence and blessings. I would also like to express my gratitude to the entire faculty of my department for their support and suggestions.
Thank you to my mentor, Prof. Pankaj Jain, Project Guide, Department of Computer Science and Engineering, for seeing our potential and pushing us to do our best. Your unwavering belief in our abilities has motivated us to achieve more than we ever thought possible.
I also extend my sincere appreciation to all faculty members of the Department of Computer Science and Engineering, GGCT, who provided valuable suggestions and precious time in accomplishing my Minor Project-I report.
Lastly, I would like to thank the Almighty and my parents for their moral support, and my friends with whom I shared my day-to-day experience and received many suggestions that improved the quality of my work.
Mohammad Saif
0208CS223D09
DECLARATION
I, Mohammad Saif, Roll No. 0208CS223D09, B.Tech (Semester V) of Gyan Ganga College of Technology, Jabalpur, hereby declare that the Minor Project-I Report entitled "Data Compression in Backbone Network" is an original work and the data provided in the study is authentic to the best of my knowledge. This report has not been submitted to any other Institute for the award of any other degree.
Mohammad Saif
(Roll No. 0208CS223D09)
This is to certify that the above statement made by the candidate is correct to the best of our knowledge.
Approved by:
Certificate
This is to certify that the Minor Project-I report entitled "Data Compression in Backbone Network", submitted by "Mohammad Saif", is for the partial fulfillment of the requirement for the award of the degree of Bachelor of Technology in the Department of Computer Science & Engineering from Rajiv Gandhi Proudyogiki Vishwavidyalaya, Bhopal (M.P.).
TABLE OF CONTENTS
4. DURATION
4.1 Timeline
6. DESIGN TECHNIQUES
7. TIER ARCHITECTURE
8. SOFTWARE PROCESS MODELS
10. DATABASE
11. SCREENSHOTS
13. CONCLUSION
14. REFERENCES
INTRODUCTION
The intended audience for data compression in a backbone network typically includes
network administrators, IT professionals, and decision-makers within an organization.
Here are the key stakeholders and their specific interests:
1. Network Administrators:
Responsibility: Network administrators are responsible for managing and
maintaining the backbone network infrastructure.
Interest: Network administrators are interested in data compression
because it can help optimize bandwidth usage, reduce congestion, and
improve overall network performance. They need to implement and
configure compression techniques to ensure efficient data transfer across
the backbone.
2. IT Professionals:
Responsibility: IT professionals, including network engineers and
technicians, are involved in the design, implementation, and
troubleshooting of network systems.
Interest: IT professionals are concerned with the technical aspects of data
compression, including selecting the appropriate compression algorithms,
integrating compression into network protocols, and addressing any
compatibility or interoperability issues that may arise.
1.4 Team Architecture:-
1. First, the user opens the platform and selects a file; the selected file then goes to our algorithm for processing.
2. During processing, the original file is converted into an encrypted string.
3. Each character of the string is then converted into linear binary form.
4. Further, we pack this linear binary stream into a 2D array (matrix) of 1920*1080.
5. We then iterate through the matrix and check whether each bit is 1 or 0: a 0 bit is represented by black, and a 1 bit by white.
6. Through this we produce a binary image at 1920*1080 pixel resolution.
7. If any of the string remains after a frame is filled, we repeat the previous steps for the remaining data.
8. Once the entire encrypted string has been converted into binary images, we place an RGB red pixel (about 1 pixel) as a stop mark.
9. Finally, if the image count is greater than 1, the images are rendered into a video.
2. Problem Statement
Description:-
⮚ Our web portal, developed with HTML, CSS, and React, offers users a seamless file selection experience with an emphasis on user-friendliness.
⮚ File uploads and processing are efficiently managed by Python, using Firebase for temporary storage. During processing, Python applies unique color-coding for data encryption and employs blockchain technology to ensure data integrity.
⮚ Users receive images, simplifying and enhancing the security of data sharing.
⮚ Python efficiently restores data from uploaded images, preserving both security and data integrity.
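The restoration step in the last point can be sketched as the inverse of the bit-to-frame mapping: read the 0/1 pixels back into bits, regroup them into bytes, and decode. The names below are illustrative assumptions; the real pipeline would also decrypt the recovered string and detect the red stop mark, which this sketch ignores by treating trailing zero bytes as padding.

```python
# Hypothetical sketch of restoring text from a black/white frame.
def frame_to_bits(frame):
    """Flatten a frame (list of rows of 0/1 pixel values) into a bit list."""
    return [bit for row in frame for bit in row]


def bits_to_text(bits):
    """Regroup bits into bytes (MSB first) and decode, skipping zero padding."""
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        if byte:  # crude padding rule for this sketch only
            out.append(byte)
    return out.decode("utf-8")
```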
2.1.2 Selection of Product:-
1. User-Friendly Interface:
Implement an intuitive and user-friendly interface for both administrators
and end-users to facilitate easy interaction with the data compression
system.
2. Efficient Configuration and Settings:
Design the system to allow straightforward configuration of compression
settings, making it easy for users to adapt the compression process to
their specific needs.
3. Comprehensive Documentation and Training:
Provide detailed documentation and training resources to guide users
through the installation, configuration, and usage of the data compression
solution.
4. Real-Time Feedback and Progress Indicators:
Include real-time feedback mechanisms and progress indicators during
compression and decompression processes, allowing users to monitor and
understand the system's status.
5. Error Handling and User Assistance:
Implement effective error handling mechanisms and provide clear user assistance in case of issues, ensuring users can easily troubleshoot and resolve any problems they encounter.
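Point 4 above (real-time feedback during compression) can be illustrated with a chunked compression loop that reports progress through a callback. This is a simplified sketch using Python's standard `zlib` module; the function and parameter names are assumptions, not the project's actual interface.

```python
import zlib


def compress_with_progress(data, chunk_size, on_progress):
    """Compress data in chunks, invoking on_progress(done, total) per chunk."""
    comp = zlib.compressobj()
    out = bytearray()
    total = len(data)
    for start in range(0, total, chunk_size):
        out += comp.compress(data[start:start + chunk_size])
        on_progress(min(start + chunk_size, total), total)
    out += comp.flush()  # emit any buffered compressed bytes
    return bytes(out)
```

A UI layer could wire `on_progress` to a progress bar; the callback receives monotonically increasing byte counts it can turn into a percentage.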
3.2 Objective:-
1. Efficient Data Management:
Develop a data compression solution to optimize storage space,
facilitating efficient data management and reducing storage-related costs.
2. Enhanced Data Transfer Efficiency:
Improve data transfer efficiency across the organization's network by
implementing advanced compression algorithms and minimizing
bandwidth usage.
3. Scalability and Adaptability:
Create a scalable solution that adapts to the organization's growing data
volumes and evolving network demands while maintaining high
compression efficiency.
4. Security and Compliance:
Ensure the security of compressed data during transmission and storage,
implementing robust encryption measures to comply with industry
standards and regulations.
5. User Experience Improvement:
Enhance the end-user experience, particularly in real-time applications, by implementing data compression that minimizes perceptible delays and maintains data integrity.
4. Duration:-
The hardware interface for the data compression project involves defining
specifications to ensure compatibility and optimal performance. This includes
specifying processor requirements, RAM considerations, compatibility with
various storage devices and network interfaces, integration with security hardware,
and support for external devices. Additionally, considerations for power,
temperature, and peripheral device integration are outlined to ensure the data
compression system operates efficiently and seamlessly interfaces with the
organization's hardware infrastructure. The goal is to establish a clear framework
for selecting or designing hardware components that meet the specific needs of the
data compression solution.
The software interface for the data compression project encompasses the
interactions between the data compression system and software components. This
includes compatibility with operating systems such as Windows, Linux, and
macOS, as well as integration with common network protocols (TCP/IP, HTTP,
FTP). The software interface must ensure seamless interaction with existing
applications, databases, and file systems within the organization. Additionally, the
system should support standard compression and decompression libraries and
provide APIs for integration with third-party software. A user-friendly interface is
crucial, allowing administrators and end-users to configure settings, monitor
compression processes, and access documentation easily. The software interface
plays a vital role in ensuring the smooth integration, usability, and effectiveness of
the data compression solution within the broader software ecosystem.
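As one concrete instance of the "standard compression and decompression libraries" the software interface should support, Python's built-in `zlib` module exposes a minimal API that a wrapper layer could build on. This is an illustrative sketch, not the project's actual API.

```python
import zlib


def compress(data, level=6):
    """Compress bytes with zlib; level 1 is fastest, 9 is smallest."""
    return zlib.compress(data, level)


def decompress(blob):
    """Restore the original bytes from a zlib-compressed blob."""
    return zlib.decompress(blob)


payload = b"backbone traffic " * 100
blob = compress(payload)
assert decompress(blob) == payload  # lossless round trip
```

Equivalent wrappers over `gzip`, `bz2`, or `lzma` would fit behind the same two-function interface, which is what makes third-party integration via a small API practical.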
6. Design Techniques:-
Design techniques for the data compression project involve a modular approach,
breaking the system into independent modules for easy maintenance and
scalability. Careful selection of compression algorithms, including both lossless
and lossy, is essential for optimizing performance. Implementing parallel
processing enhances compression and decompression speeds, while dynamic
compression settings adapt to varying data types. Error detection and correction
mechanisms, along with robust security measures like encryption, ensure data
reliability and protection. Optimizing buffer management, cache, and load
balancing contributes to efficient resource utilization. Adaptive compression
techniques adjust to real-time data characteristics, and a user-friendly interface,
thorough documentation, and training materials enhance usability. Testing
strategies, optimization for real-time applications, and scalability planning further
contribute to a comprehensive and effective design.
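The parallel-processing idea mentioned above can be sketched by compressing independent chunks concurrently and concatenating the results on decompression. This is a simplified scheme under assumed names, not the project's actual design; per-chunk compression trades a little ratio for speed and easy parallelism.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor


def parallel_compress(data, chunk_size=64 * 1024):
    """Split data into fixed-size chunks and compress them concurrently."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(zlib.compress, chunks))


def parallel_decompress(blobs):
    """Decompress each chunk and join them back into the original bytes."""
    return b"".join(zlib.decompress(b) for b in blobs)
```

Because each chunk is self-contained, decompression can also proceed in parallel, and a damaged chunk only loses its own span of data, which complements the error-detection goals above.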
7. Tier Architecture:-
9. Design
11. Screenshots
Login and sign-up page:-
13. Conclusion:-
1. Requirements Gathering:
Activities: Engage with stakeholders, including IT professionals, network
administrators, and end-users, to gather detailed requirements for the
data compression solution.
Outcome: A comprehensive understanding of the project's goals,
technical specifications, and user expectations.
2. System Design and Architecture:
Activities: Conduct a detailed design phase, outlining the system
architecture, selecting appropriate compression algorithms, and defining
the overall structure of the solution.
Outcome: System design documents, architectural blueprints, and a clear
roadmap for development.
3. Development and Testing:
Activities: Implement the designed solution in accordance with the
established architecture. Conduct rigorous testing, including unit testing,
integration testing, and performance testing, to ensure the reliability and
efficiency of the compression algorithms.
Outcome: Functional and tested data compression solution, ready for
deployment.
4. Deployment and Training:
Activities: Roll out the data compression solution in the production
environment. Provide training sessions for system administrators, IT
professionals, and end-users on how to use and maintain the system
effectively.
Outcome: Deployed and operational data compression solution, along
with a trained user base.
5. Optimization and Continuous Improvement:
Activities: Monitor the performance of the data compression solution in
real-world scenarios. Collect feedback from users and stakeholders to
identify areas for improvement. Implement optimizations and updates as
necessary.
Outcome: An optimized and continuously improving data compression
system that aligns with evolving needs and technological advancements.
14. References
Technology Stack:-
Python:
https://youtu.be/pRhtjx0dw_k?si=lv6LKFGaYjIuF1IN
Frontend:
https://youtube.com/playlist?list=PLu0W_9lII9agiCUZYRsvtGTXdxkzPyItg&si=pUgVTypp3KNll9hV
Backend:
https://youtube.com/playlist?list=PLB97yPrFwo5hrMS7symkj4IW4v3xa_kjZ&si=O7Nd2EEXYFG8dpqm