
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)

Sponsored by: S.K.P.V.V. Hindu High Schools Committee, Vijayawada – 520001

INDEX

S. No. | Title of the Paper | Authors | Page No.

1. A Study And Survey Of Big Data Using Data Mining Techniques | R. Varun Teja, Siva Prasad Guntakala | 1
2. A Survey: Internet of Things (IOT) Technologies, Applications | Reddy Yuvaraju Daara Babu, Siva Prasad Guntakala | 6
3. A Study Of Clustering Algorithms In Recent Years | Chiranjeevi Revu, Siva Prasad Guntakala | 10
4. The Prediction Of Disease Using Machine Learning | Salavadi Prasanna Kumar, Siva Prasad Guntakala | 17
5. An Overview Of Biometric Security Technology | Seela Anitha, Siva Prasad Guntakala | 21
6. An Efficient Approach for Secured Data Transmission between IoT and Cloud | S. Ganesh, Siva Prasad Guntakala | 27
7. Impact of Social Networking on Indian Youth - A Survey | Settipalli Venkata Surya, Siva Prasad Guntakala | 33
8. Analysis Of Methods And Application In Machine Learning | Shaik Ashraf, Siva Prasad Guntakala | 40
9. A Review On Artificial Intelligence In Pharma Industry | Shaik Basheer Ahmed, Siva Prasad Guntakala | 46
10. Overview Of Machine Learning Vs Deep Learning | Shaik Bushra Tabassum, Siva Prasad Guntakala | 53
11. An Overview Of Machine Learning Applications In Defence | Shaik Hidaytulla, Siva Prasad Guntakala | 59
12. A Study on Machine Learning Applications in Media | Imamuddin Shaik, Siva Prasad Guntakala | 65
13. Literature Review of Blockchain Technology | Shaik Imran, Siva Prasad Guntakala | 70
14. A Survey On E-Payments In India | Shaik Karimulla, Siva Prasad Guntakala | 75
15. The Study On Public Acceptance Of UPI And Digital Payments | Shaik Mohiddin Basha, Siva Prasad Guntakala | 79
16. High-Resolution Motion-Compensated Imaging Photoplethysmograph for Remote Heart Rate Monitoring | Shaik Nagoor Bhi, Siva Prasad Guntakala | 83
17. A Study on a Comprehensive Review on Machine Learning in the Health Care Industry | Sk Nagur Shareef, Siva Prasad Guntakala | 87
18. A Study on Deep Learning: A Comprehensive Overview on Techniques & Applications | Shaik Nazeerbee, Siva Prasad Guntakala | 91
19. Effective Techniques In Cloud Computing And Migration To Cloud Process | Sk. Shaheena, Siva Prasad Guntakala | 101
20. An Overview On Chatbot Technology In Recent Years | S. Anil Kumar, Siva Prasad Guntakala | 106
21. A Study on Marketing Strategies in Life Insurance Services | Vineetha Thota, Siva Prasad Guntakala | 111
22. Hybrid Deep Neural Network and Long Short-Term Memory Network for Predicting of Sunspot Time Series | Nagamani Tippala, Siva Prasad Guntakala | 117
23. Study of Wireless Communication for Substation Automation | T. Sai Lakshmi Niharika, Siva Prasad Guntakala | 120
24. A Survey Of Data Mining Techniques For Cyber Security | Mudraboina Ganesh, Siva Prasad Guntakala | 129
25. A Study On Green Cloud Computing | Udayagiri Ram Kumar, Siva Prasad Guntakala | 135
26. A Review of Recent Developments in Driver Drowsiness Detection Systems | Udumula Asha Latha, Siva Prasad Guntakala | 142
27. Social Media Users In India: A Futuristic Approach | Udumula Rohith, Siva Prasad Guntakala | 147
28. A Study On Cyber Security In India: An Evolving Concern For National Security | Ummadi Sai Durga, Siva Prasad Guntakala | 152
29. An Overview Of Data Visualization Of Machine Learning | Ummadi Sarath Kumar, Siva Prasad Guntakala | 159
30. A Study On Mobile OS And Their Advances In Recent Years | V. Bhanu Sri, Siva Prasad Guntakala | 164
A Study And Survey Of Big Data Using Data Mining Techniques

R. Varun Teja, Student, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: varunteja9246@gmail.com
Siva Prasad Guntakala, Assistant Professor, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: gspkbn@gmail.com
ABSTRACT:

In the current age of digitalization, we engage with a diverse range of data. Google, Microsoft, and Amazon handle immense volumes of data, processing substantial amounts on a daily basis. This necessitates the development of techniques to optimize technology for efficient data processing. The concept of Big Data has emerged as a solution, encompassing innovative methods and technologies for analyzing large, intricate datasets generated exponentially from various sources at varying rates. Data mining techniques have proven invaluable in the realm of Big Data analytics, as grappling with such vast datasets poses significant challenges for applications. Big Data analytics involves the skill of extracting valuable insights from such extensive datasets. This article offers a comprehensive survey that covers the significance, obstacles, and applications of Big Data across different domains, along with diverse approaches employed for analyzing Big Data using Data Mining methods. The outcomes of this survey provide valuable insights to researchers regarding the primary trends in Big Data research and analysis across various analytical domains.

KEYWORDS:
Analytics for Massive Data Sets, Large-Scale Data Analysis, Methods in Data Mining

I. INTRODUCTION

In the digital age, analysts are faced with an abundance of available data. The term "Big Data" refers to a collection of datasets, including unstructured, semi-structured, and structured forms, characterized by their vast volume, complexity, and rapid growth. These datasets present challenges in terms of capture, management, processing, and analysis using conventional database software tools and technologies. The data come in various formats such as text, video, images, audio, webpage log files, blogs, tweets, location information, and sensor data. Extracting meaningful insights from these extensive datasets necessitates intelligent and scalable analytics services, programming tools, and applications.

Data mining, also known as Knowledge Discovery in Databases (KDD), is an analytical process utilized in various fields to uncover significant relationships among variables within large datasets. The analysis of swift and massive streaming data has the potential to yield new valuable knowledge and theoretical concepts. Big Data has the capacity to assist organizations in enhancing their operations and making quicker and more informed decisions.

II. DATA MINING DEFINITION

Data mining is the process of discovering meaningful patterns, relationships, and insights within large datasets using various techniques such as statistical analysis, machine learning, and pattern recognition. It involves extracting valuable information from data to support decision-making, prediction, and optimization across various fields like business, science, and research.

TYPES OF DATA MINING

There are several types of data mining:

1. Classification: This involves categorizing data into predefined classes or groups based on specific attributes. It's used for tasks like spam detection, customer segmentation, and medical diagnosis.

2. Clustering: Clustering involves grouping similar data points together based on their inherent characteristics. It's useful for market segmentation, anomaly detection, and recommendation systems.

3. Regression: Regression aims to establish a relationship between variables, predicting a continuous numerical value based on other input variables. It's employed in areas like sales forecasting and trend analysis.

4. Association Rule Mining: This focuses on identifying relationships or patterns among variables in large datasets, often used in retail for market basket analysis (e.g., people who buy X are likely to buy Y).

5. Anomaly Detection: Anomaly detection seeks to find rare and unusual patterns that deviate from the norm. It's crucial for fraud detection, network security, and quality control.

6. Text Mining: Text mining involves extracting useful information from text data, analyzing sentiments, categorizing documents, and more.

7. Time Series Analysis: This type deals with data collected at regular time intervals, aiming to understand patterns and make predictions based on time-based trends.

8. Sequential Pattern Mining: This focuses on finding sequential patterns or sequences of events, commonly used in analyzing customer behaviors and web clickstreams.

9. Spatial Data Mining: This pertains to geographical data, analyzing patterns and relationships based on geographic location.

10. Web Mining: Web mining involves extracting knowledge and patterns from web-related data, such as web page content, links, and user interactions.

These types of data mining techniques are employed based on the specific goals and characteristics of the data being analyzed.
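To make the distinction between these task types concrete, here is a minimal sketch in Python, assuming scikit-learn and its bundled Iris data (an illustrative toolset, not something this survey prescribes), that runs a classifier, a clusterer, and a regressor on the same toy data.

```python
# Minimal illustration of three data mining task types (scikit-learn assumed).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 1. Classification: assign each observation to a predefined class.
clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("classification accuracy:", clf.score(X_test, y_test))

# 2. Clustering: group observations without using the labels at all.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((labels == k).sum()) for k in range(3)])

# 3. Regression: predict a continuous value (petal width from the other measurements).
reg = LinearRegression().fit(X[:, :3], X[:, 3])
print("regression R^2:", reg.score(X[:, :3], X[:, 3]))
```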
III. DATA MINING TECHNIQUES

In order to ensure meaningful data mining results, it is necessary to understand the data being processed. Data mining approaches are usually affected by several factors, such as noisy data that include null values and atypical values (i.e., outliers). According to the changing nature of the data to be mined, extensions have been introduced to data mining: spatial data mining, for mining spatial data; web usage mining and web content mining, for mining users' behaviors and specific topics over the web respectively; graph mining, for mining data in networks; and, recently, big data mining, which is an evolved branch of big data analytics catering to different types of data.

1. Predictive Data Mining:
The predictive task uses specific variables or values in the data set to predict unknown or future values of other variables of interest. Several approaches have been proposed for prediction, as follows:

● Classification:
The task is to determine the category to which a new observation belongs. This is achieved through a model established as a function of the values of other attributes, utilizing a training dataset with accurately labeled observations. The classification process automatically assigns records to predefined classes. For instance, it might classify credit card transactions as legitimate or fraudulent, or categorize news stories as finance, entertainment, sports, and so on. Numerous techniques have emerged for classification, with decision tree-based methods, neural networks, support vector machines (SVM), the naive Bayes classifier, and k-nearest neighbor (KNN) being the most widely used approaches. Another notable technique is Repeated Incremental Pruning to Produce Error Reduction (RIPPER), which generates a detection model using a collection of "if ... then ..." rules. This approach is particularly useful for classifying objects and constructing rules to identify malicious executables in future instances.

● Regression:
In the realm of predictive data mining, regression stands as the counterpart to classification. Regression is a supervised mining function that revolves around predicting a numerical target. During the training phase of a regression model, the target value is assessed as a function of the predictors associated with each data point. This process generates a model encapsulating the relationship between the target value and the predictors, which can then be applied to other datasets containing unknown target values. In summary, regression constitutes a predictive data mining approach aimed at estimating numerical target values by analyzing the relationship between predictors and targets, with techniques like GLM being pivotal in this process.

● Classifier Ensembles:
Classifier ensembles introduce the concept of amalgamating multiple individual classifiers to enhance their performance collectively. These individual classifiers can be constructed using various classification methodologies, each demonstrating distinct levels of accuracy in classifying instances. One approach within the realm of classifier ensembles is bagging, which stands for bootstrap aggregating. Bagging generates an ensemble of models by constructing them from multiple bootstrap replicate samples. To sum up, classifier ensembles leverage the power of combining multiple classifiers to achieve improved performance. Techniques like bagging and random forest employ diverse methodologies, while rotation forest employs feature extraction to create effective ensembles.
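As a concrete illustration of the bagging idea, the sketch below (scikit-learn assumed; the dataset is synthetic and not from this paper) builds an ensemble of decision trees from bootstrap replicates and compares it with a single tree and a random forest.

```python
# Bagging: train many trees on bootstrap replicates of the data and vote.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = [
    ("single tree", DecisionTreeClassifier(random_state=0)),
    ("bagging", BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)),
    ("random forest", RandomForestClassifier(n_estimators=50, random_state=0)),
]
for name, model in models:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```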


2. Descriptive Data Mining:
Descriptive models are designed to examine historical events within the data, offering insights that can guide future approaches. These models delve into past occurrences, mining historical data to decipher the factors contributing to past successes or failures. By quantifying relationships within the data, descriptive models have the capacity to classify entities, such as grouping customers into segments based on their attributes. This distinction sets them apart from predictive models, which focus on predicting the behavior of individual entities, such as customers.
Several approaches have stemmed from the realm of descriptive models, including:

● Association Rules Mining:
This is an approach for exploring relationships of interest between variables in huge databases. Considering groups of transactions, it discovers rules that forecast the existence of an item depending on the existence of other items in the transaction. It is applied to guide the positioning of products inside stores in such a way as to increase sales, to investigate web server logs in order to deduce information about visitors to websites, or to study biological data to discover new correlations. Examples of association rules mining techniques are Frequent Pattern (FP) Growth and Apriori. Apriori explores rules satisfying support and confidence values that are greater than a predefined minimum threshold value.
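A minimal, self-contained sketch of the support and confidence idea behind Apriori is shown below; it is plain Python over a made-up basket dataset, not code or data taken from this paper.

```python
# Toy Apriori-style pass: count itemset support and derive single-item rules
# that clear minimum support and confidence thresholds.
from itertools import combinations
from collections import Counter

transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
min_support, min_confidence = 0.4, 0.7
n = len(transactions)

item_counts = Counter(item for t in transactions for item in t)
pair_counts = Counter(pair for t in transactions for pair in combinations(sorted(t), 2))

frequent_pairs = {p: c / n for p, c in pair_counts.items() if c / n >= min_support}
for (a, b), support in frequent_pairs.items():
    for lhs, rhs in [(a, b), (b, a)]:
        confidence = support / (item_counts[lhs] / n)   # support(pair) / support(lhs)
        if confidence >= min_confidence:
            print(f"{lhs} -> {rhs}  support={support:.2f}  confidence={confidence:.2f}")
```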
● Clustering:
Cluster analysis is one of the unsupervised learning techniques; it collects similar objects together into groups that are clearly different from the objects in other groups. Examples include grouping related documents or emails, or proteins and genes having similar functionalities. Many types of clustering techniques have been introduced, such as non-exclusive clustering, where a data item may belong to multiple clusters, and fuzzy clustering, which considers a data item to be a member of all clusters with different weights ranging from 0 to 1. Hierarchical (agglomerative) clustering, on the other hand, creates a set of nested clusters arranged in the form of a hierarchical tree. K-means is the most famous clustering algorithm; it uses a partitioning approach to separate the data items into a pre-determined number of clusters, each having a centroid, such that the data items in a cluster are closest to its centroid. The K-medoids algorithm is a clustering algorithm related to K-means which chooses actual data points as centers.
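The sketch below (scikit-learn assumed, synthetic blob data) shows the centroid-based partitioning just described: K-means assigns every point to exactly one of a pre-determined number of clusters and exposes the resulting centroids.

```python
# K-means: partition points around k centroids chosen to minimise squared distance.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.8, random_state=42)
km = KMeans(n_clusters=3, n_init=10, random_state=42).fit(X)

print("centroids:\n", np.round(km.cluster_centers_, 2))
print("inertia (sum of squared distances to centroids):", round(km.inertia_, 2))
print("first 10 cluster assignments:", km.labels_[:10])
```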

3. Optimization Data Mining:
Optimization involves the process of identifying the most efficient and effective alternatives, considering specified constraints. This is achieved by maximizing desired factors and minimizing undesired ones. Genetic algorithms stand as well-known tools for optimization and search problems, employing a method akin to simulated evolution to breed computer-generated solutions. This evolutionary process begins with a population of randomly generated individuals. With each successive generation, the fitness of each individual is evaluated, guiding their selection for the next iteration. The algorithm concludes either when a predetermined maximum number of generations is reached or when an acceptable fitness level is attained for the population.
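The evolutionary loop just described can be sketched compactly: random population, fitness-based selection, crossover and mutation. The plain-Python example below uses a made-up one-dimensional maximisation problem purely for illustration.

```python
# Tiny genetic algorithm maximising f(x) = -(x - 3)^2 + 9 over [0, 6].
import random

def fitness(x):
    return -(x - 3.0) ** 2 + 9.0

population = [random.uniform(0, 6) for _ in range(20)]
for generation in range(50):
    # Evaluate fitness and keep the better half as parents (selection).
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    # Breed children by averaging two parents (crossover) plus noise (mutation).
    children = []
    while len(children) < 10:
        a, b = random.sample(parents, 2)
        child = (a + b) / 2 + random.gauss(0, 0.1)
        children.append(min(max(child, 0.0), 6.0))
    population = parents + children

best = max(population, key=fitness)
print(f"best x = {best:.3f}, fitness = {fitness(best):.3f}")
```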


Data mining techniques also play a crucial role in data preprocessing. This involves tasks such as cleansing data by eliminating outliers through clustering methods and refining data by applying regression techniques to reduce noise. Sampling, a statistical technique, is frequently employed in data preprocessing before deploying most data mining methods, because working with the complete dataset of interest can be prohibitively expensive and time-consuming. By utilizing sampling, a representative subset is selected for analysis, facilitating more efficient and manageable processing.

IV. EVOLUTION TO BIG DATA ANALYTICS TECHNIQUES

The term 'Big Data' made its debut in 1998 within a Silicon Graphics (SGI) slide deck by John Mashey titled "Big Data and the Next Wave of InfraStress". The connection between Big Data and data mining was evident from the outset. The first book mentioning 'Big Data' emerged in 1998, authored by Weiss and Indurkhya, and the first academic paper with 'Big Data' in its title was written by Diebold in 2000.

The term 'Big Data' stems from the vast amount of data generated daily. Usama Fayyad's presentation at the KDD BigMine'12 Workshop highlighted staggering data statistics from internet usage. For instance, Google handles over 1 billion queries daily, Twitter produces more than 250 million tweets daily, Facebook witnesses more than 800 million updates per day, and YouTube registers over 4 billion daily views. The data generated annually is estimated to be in the order of zettabytes, growing around 40% annually. Mobile devices are also contributing significantly to this data surge, with major companies like Google, Apple, Facebook, Yahoo, and Twitter exploring this data for useful patterns to enhance user experiences.

Analyzing massive volumes of data empowers analysts, researchers, and business users to make more informed and timely decisions using previously hidden, inaccessible, or unusable data. However, the substantial increase in data volume rendered traditional data mining algorithms inadequate. Consequently, research has focused on enhancing data mining techniques to accommodate Big Data, giving rise to the field of big data analytics.

Big data analytic techniques encompass various data mining functions, with association rules mining and classification tree analysis being of paramount importance. This section delves into the core data mining tasks that have transitioned to big data analytics. It elucidates the enhancements introduced to these techniques to facilitate their adaptation to big data, addressing the "V" dimensions of big data through such modifications. Table 1 serves as a comprehensive summary of this analysis, outlining the evolution of data mining tasks to big data analytics. Techniques are categorized based on their data mining task, presenting their current status in terms of being adapted to big data analytics and the specific dimension of big data they address. The subsequent subsections delve into the enhancements made to different data mining techniques, catering to the dimensions of big data and fostering their evolution into big data analytic techniques.

V. CONCLUSION

Over the past decade, there has been an exponential surge in data capacity and complexity, prompting extensive research in the realm of big data technology. This paper endeavors to provide a comprehensive overview of recent literature reviews year by year, focusing on the domain of Big Data and its analysis through various analytics approaches.

In particular, text analytics has emerged as a prominent facet within the landscape of Big Data. Regarded as the next generation of data analysis, text analytics has transitioned from a specialized field to a mainstream approach for extracting valuable insights from the vast pool of opinions shared across social media platforms. Meanwhile, advancements in machine vision, multi-lingual speech recognition, and rules-based decision engines have propelled the growth of video, audio, and image analytics techniques. These advancements have been driven by the availability of real-time data containing rich image and video content. As a result, these techniques hold significant potential to address a range of economic, political, and social challenges.

The continuous evolution of big data technology, coupled with innovative analytics approaches, is shaping a landscape where meaningful insights can be extracted from diverse data sources, paving the way for informed decision-making across various domains.

VI. BIBLIOGRAPHY

G. Siva Prasad, M.C.A., M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh, and has 10 years of experience in teaching and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.

VII. REFERENCES

1. T. Li and L. Long, "Imaging examination and quantitative detection and analysis of gastrointestinal diseases based on data mining technology," Journal of Medical Systems, vol. 44, no. 1, pp. 1-15, 2020.
2. C. Zuo, "Defense of computer network viruses based on data mining technology," International Journal on Network Security, vol. 20, no. 4, pp. 805-810, 2018.
3. Wang Zhao-Yi, Huang Zheng-De, Yang Ping, Ren Ting, and Li Xin-Hui, "Regularity of wind-dispelling medication prescribed by Li Dong-yuan: a data mining technology-based study," Digital Chinese Medicine, vol. 3, no. 1, pp. 20-33, 2020.
4. P. Wang, Y. Zhang, and H. Yang, "Research on economic optimization of m cluster based on chaos sparrow search algorithm," Computational Intelligence and Neuroscience, vol. 2021, no. 3, Article ID 5556780, 18 pages, 2021.
5. Puneet Singh Duggal and Sanchita Paul, Big Data Analysis: Challenges and Solutions.


6. Wei Fan and Albert Bifet, Mining Big Data: Current Status, and Forecast to the Future, SIGKDD Explorations, Volume 14, Issue 2, 2012.
7. S. Vikram Phaneendra and E. Madhusudhan Reddy, Big Data - solutions for RDBMS problems - A survey, IEEE/IFIP Network Operations & Management Symposium (NOMS 2010), Osaka, Japan, Apr 19-23, 2013.
9. Sagiroglu, S. and Sinanc, D., Big Data: A Review, International Conference on Collaboration Technologies and Systems (CTS), pp. 42-47, 20-24 May 2013.
10. Richa Gupta, Sunny Gupta and Anuradha Singhal, Big Data: Overview, IJCTT, Vol. 9, Number 5, March 2014.
11. Suthaharan, Shan, "Big data classification: problems and challenges in network intrusion prediction with machine learning," ACM SIGMETRICS Performance Evaluation Review 41.4 (2014): 70-73.
12. Li, Deren, and Shuliang Wang, "Concepts, principles and applications of spatial data mining and knowledge discovery," Proceedings of the International Symposium on Spatio-Temporal Modeling (STM'05), Beijing, China, 2005.
13. Zaki, Mohammed J., and Wagner Meira Jr, "Data Mining and Analysis: Fundamental Concepts and Algorithms," Cambridge University Press, 2014.
14. Washio, Takashi, and Hiroshi Motoda, "State of the art of graph-based data mining," ACM SIGKDD Explorations Newsletter 5.1 (2003): 59-68.
15. Mohammed J. Zaki, Limsoon Wong, Data Mining Techniques, August 9, 2003, WSPC/Lecture Notes.
16. Arinto Murdopo, "Distributed Decision Tree Learning for Mining Big Data Streams," July 2013.
17. A Min Tjoa, Iman Paryudi, Ahmad Ashari, "Performance Comparison between Naïve Bayes, Decision Tree and k-Nearest Neighbor in Searching Alternative Design in an Energy Simulation Tool," IJACSA (International Journal of Advanced Computer Science and Applications), vol. 4, no. 11, 2013.
19. Lionel Fugon, Jérémie Juban and George Kariniotakis, "Data mining for wind power forecasting," European Wind Energy Conference, Brussels, Belgium, April 2008.


A Survey: Internet of Things (IOT) Technologies, Applications

Reddy Yuvaraju Daara Babu, Student, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: reddyyuvaraj1@gmail.com
Siva Prasad Guntakala, Assistant Professor, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: gspkbn@gmail.com

ABSTRACT

The Internet of Things (IoT) has exerted a profound and pervasive influence on our daily lives, from the start of the day to its end. Within this vast network of interconnected devices, data collection and sharing take place, encompassing information about operational environments and patterns of usage. This intricate amalgamation of the physical and digital realms epitomizes the potent concept of IoT, forging a nexus between tangible objects and the realm of information technology and culminating in actionable intelligence and the creation of tangible value. This paper expounds upon IoT's implications, particularly concerning home automation, sensor integration, and the ensuing elevation of comfort, and upon how invaluable, tailored insights can be extracted by subjecting IoT-generated data to analytics. The Internet of Things as a concept wasn't officially named until 1999, but one of the first examples of an IoT device is from the early 1980s: a Coca-Cola machine located at Carnegie Mellon University.

I. INTRODUCTION

The Internet of Things is influencing our lifestyle enormously, from the moment the day begins to the moment it ends. IoT is an immense network of connected devices. These devices gather and share data about the environment in which they are operated and how they are used. The pre-eminent concept is that the Internet of Things describes a network of physical objects, known as "things", and it is all done using sensors, which are embedded in every physical device. In recent times it has been getting more attention due to the advancement of wireless technology. IoT transforms these objects from conventional to smart by exploiting underlying technologies such as embedded devices, wireless sensor networks, automation protocols, and applications.

The Internet of Things (IoT) stands as a revolutionary paradigm in the modern technological landscape. It transcends traditional boundaries, imbuing inanimate entities with the ability to communicate, share data, and collaborate through the seamless interconnection of an extensive network of devices, objects, and systems. This convergence of the physical and digital realms has engendered a transformative shift, enabling real-time data exchange and interaction on an unprecedented scale. As IoT continues to weave itself into the fabric of our daily lives, its far-reaching implications touch upon diverse domains, from smart homes and industrial automation to healthcare, transportation, and beyond. This introduction sets the stage to explore the multifaceted dimensions of the Internet of Things and its profound impact on the way we perceive and interact with the world around us.

II. APPLICATIONS

IoT applications are used in many ways: they help businesses simplify and improve automation and processes while providing relevant information, and they help drive the new business needed to develop products and services. There are many applications for IoT devices, and they are often divided into the following areas:


● Smart homes and consumer IoT
● Industrial IoT
● Smart cities
● Transportation

A. Smart homes and consumer IoT
A smart home refers to a convenient home setup where appliances and devices can be automatically controlled remotely from anywhere with an internet connection, using a mobile or other networked device. Devices in a smart home are interconnected through the internet, allowing the user to remotely control functions such as security access to the home, temperature, lighting, and a home theater.
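One way to picture this interconnection is the small in-process sketch below: devices report state and a controller issues commands over a shared "network" (here just a dictionary). The device names, attributes and the send_command helper are made up for illustration; in a real deployment the command would travel over a protocol such as MQTT or HTTP to a home gateway or cloud service.

```python
# Illustrative smart-home control loop; names and attributes are hypothetical.
devices = {
    "front_door_lock": {"locked": True},
    "living_room_light": {"on": False, "brightness": 0},
    "thermostat": {"target_temp_c": 20.0},
}

def send_command(device, **settings):
    """Stand-in for publishing a command over the home network or cloud."""
    devices[device].update(settings)
    print(f"{device} -> {devices[device]}")

# A phone app (anywhere with internet access) could trigger these remotely.
send_command("living_room_light", on=True, brightness=70)
send_command("thermostat", target_temp_c=22.5)
send_command("front_door_lock", locked=False)
```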
B. Industrial IoT
The Industrial Internet of Things (IIoT) refers to the extension and use of the Internet of Things (IoT) in industrial sectors and applications. With a strong focus on machine-to-machine (M2M) communication, big data, and machine learning, the IIoT enables industries and enterprises to achieve better efficiency and reliability in their operations. The IIoT encompasses industrial applications, including robotics, medical devices, and software-defined production and processes.

C. Smart cities
The idea of a smart city has been introduced to highlight the importance of Information and Communication Technologies (ICTs) over the past 20 years for the quantitative and qualitative analysis of industrialization. In literal terms, the smart city label specifies a city's ability to cater to the needs of its citizens. The development of a city and the quality of life in it are profoundly influenced by the core systems of a city: transport, education and government services, and public safety and health. Research has focused on these four areas, which are identified as having high priority.

For example, the company Tesla has released a self-driving car that is largely based on the AI of IoT.

A literature review highlights that various criteria for improving life in a city are mentioned in connection with the term smart city. The most important area for starting to transform a city into a smart one is the communication system; thus, this area prioritizes the use of modern transport technologies. A smart transportation system works as a highway between the development of the city and modern technologies.

D. Transportation
The Internet of Things (IoT) has crucial applications in the transportation system. IoT plays an important role in all fields of transportation.


This includes air transportation, water transportation, and land transportation. All the components of these transportation fields are built with smart devices (sensors, processors) and interconnected through a cloud server or other servers that transmit data to networks.

Sensors built inside or outside a vehicle detect lane departure and continuously monitor objects on all sides to avoid collisions. The IoT component of transportation does not only operate within the vehicle; it extends beyond the car to communicate with others, enabling automated real-time decisions that optimize travel. For example, a traffic monitoring camera identifies an accident or traffic congestion, sends an alert message to the nearest traffic control room, and sends the current congestion information to other nearby vehicles so that they can divert their route.
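The event flow just described can be sketched as a simple rule: when a roadside camera reports congestion above a threshold, an alert goes to the control room and nearby vehicles are asked to reroute. The thresholds, camera IDs and vehicle identifiers below are illustrative assumptions, not data from this survey.

```python
# Illustrative congestion-alert rule for the traffic example above.
CONGESTION_THRESHOLD = 0.8   # fraction of road capacity in use (assumed)

def on_camera_report(camera_id, occupancy, nearby_vehicles):
    """Called whenever a traffic camera publishes an occupancy reading."""
    if occupancy < CONGESTION_THRESHOLD:
        return
    print(f"[control room] congestion alert from {camera_id}: {occupancy:.0%}")
    for vehicle in nearby_vehicles:
        print(f"[vehicle {vehicle}] reroute advised around {camera_id}")

# Simulated readings from two cameras.
on_camera_report("CAM-17", 0.55, nearby_vehicles=["AP16-1234"])
on_camera_report("CAM-42", 0.92, nearby_vehicles=["AP16-5678", "AP16-9012"])
```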

III. ADVANTAGES OF IOT

The Internet of Things facilitates several advantages in day-to-day life and in the business sector. Some of its benefits are given below:

o Efficient resource utilization: If we know the functionality and the way each device works, we can definitely increase efficient resource utilization as well as monitor natural resources.
o Minimize human effort: As IoT devices interact and communicate with each other and do a lot of tasks for us, they minimize human effort.
o Save time: As it reduces human effort, it definitely saves our time. Time is a primary factor which can be saved through the IoT platform.
o Enhance data collection.
o Improve security: If we have a system in which all these things are interconnected, then we can make the system more secure and efficient.

IV. DRAWBACKS OF IOT

As the Internet of Things facilitates a set of benefits, it also creates a significant set of challenges. Some of the IoT challenges are given below:

o Security: IoT systems are interconnected and communicate over networks. The system offers little control despite any security measures, and this can lead to various kinds of network attacks.
o Privacy: Even without the active participation of the user, the IoT system provides substantial personal data in maximum detail.
o Complexity: Designing, developing, maintaining and enabling the large-scale technology behind an IoT system is quite complicated.

V. CONCLUSION

The Internet of Things (IoT) is a network of physical objects or people, called "things", that are embedded with software, electronics, networking and sensors which allow these objects to collect and exchange data. The actual idea of connected devices was proposed in 1970, and it has been evolving ever since.

As we come to the end, IoT makes human lives straightforward and satisfactory, and it has made people's lives very convenient. On the other hand, with the increased use of the Internet of Things, the threat to security and safety has also increased, so we should be careful while giving out details on internet platforms. Fusion Informatics is a web, mobile and IoT app development company in Atlanta specializing in the development of challenging and complex projects; since 2000 they have delivered compelling solutions for companies such as Bosch, Lenovo, Bharat Petroleum, Reliance, Tardebulls and others.

IoT is an advanced automation and analytics system which deals with artificial intelligence, sensors, networking, electronics, cloud messaging, etc., to deliver complete systems for a product or service. The systems created by IoT have greater transparency, control, and performance.

Since we have a platform such as the cloud that contains all the data, we can connect all the things around us through it.


For example, in a house we can connect our home appliances such as the air conditioner, lights, etc. to one another, and all these things are managed on the same platform. Since we have a platform, we can connect our car, track its fuel meter and speed level, and also track the location of the car.

VI. FUTURE SCOPE OF IOT

IoT-enabled devices are becoming a part of mainstream electronics culture, and people are adopting smart devices into their homes faster than ever. Researchers at the International Data Corporation (IDC) estimate that by 2020 there will be 25 billion devices connected to the internet. IoT devices will be a huge part of how we interact with basic everyday objects. According to these estimates, consumer applications will drive the number of connected things, while enterprise will account for most of the revenue. IoT adoption is growing, with manufacturing and utilities estimated to have the largest installed base of things by 2020. The future is happening now, and these devices are getting smarter every day through machine learning and artificial intelligence. To prove that IoT is taking off rapidly, Target opened a store in San Francisco that exclusively sells IoT devices. There is big money in the IoT space currently, and it will only continue to grow as technology improves.

VII. BIBLIOGRAPHY

G. Siva Prasad, M.C.A., M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh, and has 10 years of experience in teaching and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.

VIII. REFERENCES

1. Saovapakhiran, B.; Naruephiphat, W.; Charnsripinyo, C.; Baydere, S.; Ozdemir, S. QoE-Driven IoT Architecture: A Comprehensive Review on System and Resource Management. IEEE Access 2022.
2. Laghari, A.A.; Wu, K.; Laghari, R.A.; Ali, M.; Khan, A.A. A Review and State of Art of Internet of Things (IoT). Arch. Comput. Methods Eng. 2022, 29, 1395-1413.
3. Raj, A.; Shetty, S.D. IoT Eco-system, Layered Architectures, Security and Advancing Technologies: A Comprehensive Survey. Wirel. Pers. Commun. 2022, 122, 1481-1517.
4. Khan, M.A.; Siddiqui, M.S.; Rahmani, M.K.I.; Husain, S. Investigation of Big Data Analytics for Sustainable Smart City Development: An Emerging Country. IEEE Access 2022, 10, 16028-16036.
5. Kirchhof, J.C.; Rumpe, B.; Schmalzing, D.; Wortmann, A. MontiThings: Model-Driven Development and Deployment of Reliable IoT Applications. J. Syst. Softw. 2022, 183, 111087.
6. Ali, Z.H.; Ali, H.A. Towards sustainable smart IoT applications architectural elements and design: opportunities, challenges, and open directions. J. Supercomput. 2021, 77, 5668-5725.
7. Saxena, S.; Bhushan, B.; Ahad, M.A. Blockchain based solutions to secure IoT: Background, integration trends and a way forward. J. Netw. Comput. Appl. 2021, 181, 103050.
8. Stavropoulos, T.G.; Papastergiou, A.; Mpaltadoros, L.; Nikolopoulos, S.; Kompatsiaris, I. IoT Wearable Sensors and Devices in Elderly Care: A Literature Review. Sensors 2020, 20, 2826.
9. Sobin, C.C. A Survey on Architecture, Protocols and Challenges in IoT. Wirel. Pers. Commun. 2020, 112, 1383-1429.
10. Covaci, A.; Saleme, E.B.; Mesfin, G.; Hussain, N.; Kani-Zabihi, E.; Ghinea, G. How do we experience crossmodal correspondent mulsemedia content? IEEE Trans. Multimed. 2019, 22, 124.



A STUDY OF CLUSTERING ALGORITHMS IN RECENT YEARS

Chiranjeevi Revu, Student, Department of MCA, K.B.N. College (Autonomous), Vijayawada-520001, AP, India. Email: chiranjeevirevu43@gmail.com
Siva Prasad Guntakala, Assistant Professor, Department of MCA, K.B.N. College (Autonomous), Vijayawada-520001, AP, India. Email: gspkbn@gmail.com

Abstract— Clustering, a prominent technique in unsupervised machine learning, plays a pivotal role in grouping similar data points together in order to discover inherent patterns and structures within complex datasets. This paper introduces a clustering algorithm designed to efficiently partition data into cohesive clusters based on their inherent similarities. The algorithm employs a multi-step approach, starting with the initialization of cluster centers and iteratively optimizing them to minimize intra-cluster distances while maximizing inter-cluster distances. This process involves adaptively adjusting cluster assignments and refining cluster centers to converge towards a stable clustering solution. The algorithm's efficacy is demonstrated through experimental evaluations on various datasets, showcasing its ability to handle datasets with varying densities, shapes, and sizes. Furthermore, the algorithm's computational efficiency is highlighted, enabling its application to large-scale datasets. Overall, this clustering algorithm offers a valuable tool for exploratory data analysis, pattern recognition, and information organization across diverse domains.

Keywords — clustering, proximity, similarity, CF tree, KDD, optimization.

I. INTRODUCTION

Clustering, or cluster analysis, is a machine learning technique that groups unlabelled data. It can be defined as the process of grouping data points into groups such that points within a group are as similar as possible to one another, while having little or no similarity to points in other groups. It does so by searching the unlabeled dataset for similar parameters such as size, shape, colour, etc., and classifying the points according to the presence or absence of those parameters.
This is an unsupervised learning method, so no supervision is given to the algorithm, and it deals with unlabeled data. To use this clustering method, each cluster or group is assigned a cluster ID. An ML system can use this ID to simplify the processing of large and complex data sets. Clustering methods are commonly used for statistical data analysis.

II. CLASSIFICATION OF CLUSTERING ALGORITHMS

There are a number of assumptions and initial conditions that give rise to many different clustering algorithms. A widely accepted classification structures the clustering methods as follows:

● Hierarchical clustering
● Partitional clustering
● Density based clustering
● Grid based clustering
● Model based clustering

This classification depends on many factors, and a few algorithms have been developed that combine multiple methods. In recent years a number of algorithms have been developed to provide solutions in different fields, but there is no single universal algorithm that solves all available clustering problems, so it is important to consider the properties a good clustering algorithm should have. Some of these are listed below.

Scalability: This is the ability of an algorithm to operate efficiently across many data objects or tuples, in terms of memory requirements and time consumption. This feature sets data mining algorithms apart from those mainly used in machine learning. Scalability is one of the key points to cover because many clustering algorithms have shown poor performance in handling large data sets.

Basic field knowledge: Many clustering algorithms need or require some domain knowledge, with the user providing input parameters such as the number of groups. But often the user cannot provide such domain knowledge, and this sensitivity to the input parameters can degrade the performance of the algorithm.

Discovering clusters of arbitrary shape and size: One of the most challenging tasks is to find clusters of different shapes and sizes. A few algorithms, such as K-means, are not very efficient in this respect. Data segments can have very different shapes, and a good clustering algorithm should be able to separate data points into clusters of different shapes and sizes.

A few density-based algorithms such as DBSCAN can use the Min pts concept to achieve this. Many algorithms based on a mean or medoid approach fail to meet these two clustering criteria of forming heterogeneous clusters and converging on clusters of concave shape. The paper throws some light on this in the subsequent paragraphs.

Similarity or dissimilarity measure: This measure is a real-valued function which indicates the degree of similarity between two objects. Extensive literature on these measures can be found; some popular measures are listed in Table I.

1. HIERARCHICAL CLUSTERING

BIRCH - Balanced Iterative Reducing and Clustering using Hierarchies. It is an agglomerative hierarchical algorithm that uses the Clustering Feature tree (CF-Tree) and changes the properties of subclusters incrementally. The algorithm is as follows:
• Load data into memory: the CF-Tree is created with a single data scan; the following steps become faster, more accurate and less procedural.
• Data condensation: the CF-Tree is rebuilt with a larger threshold T.
• Global clustering: apply an existing clustering algorithm to the CF leaves.
• Make another pass over the data set to refine the clusters, reassigning data points to the nearest center obtained in the steps above.
• Continue the process until k clusters are created.

CURE - Clustering Using Representatives: this clustering method uses a hierarchy, selects well-dispersed points from a cluster, and then draws them towards the center of the cluster by a specified function. The algorithm is as follows:
• Initially, all points are in different clusters, and each cluster is defined by the points within it.
• Well-dispersed representative objects are first selected for the cluster and then shrunk, or moved, towards the cluster center by a specified fraction.
• At each stage of the algorithm, the two clusters containing the two closest representative points are selected and merged to form a cluster.

ROCK - Robust Clustering algorithm for Categorical objects. It is a hierarchical clustering algorithm that uses a link strategy to form clusters, joining them bottom-up. The algorithm is as follows: initially consider a set of points where each point is a cluster and compute the links between any two points. Create a heap for each cluster and maintain it. Based on the criterion function, a goodness measure is calculated between pairs of clusters. Merge the pair of clusters for which the criterion function has the highest value.

CHAMELEON - This is an agglomerative hierarchical clustering algorithm based on dynamic modeling that involves a two-step method of clustering: the two steps of split and combine are used to form the clusters.
• Consider all data points as one cluster at the start of the partition phase.
• Partition the cluster into a number of sub-clusters using the METIS graph-partitioning algorithm.
• The partition phase ends when each subset contains a specified number of vertices.
• In the merging phase, use the agglomerative hierarchical method to select two clusters whose connectivity and proximity reach a threshold value. This step is repeated until none of the neighboring clusters can satisfy both conditions.

ECHIDNA - This is an agglomerative hierarchical method for clustering network traffic data. The steps of the algorithm are given below:
• Input data is extracted from network traffic as a 6-tuple of numeric and categorical attributes.
• Each record is inserted iteratively to build a hierarchical tree of clusters called a CF-Tree.
• Each record is inserted into the closest cluster of the CF-Tree using a combined distance function over all attributes.
• The radius of a cluster determines whether a record should be absorbed into the cluster or whether the cluster should be split.
• Once the clusters are created, the significant nodes form a Cluster Tree, which is further compressed to create a concise and meaningful report.

SNN - Shared Nearest Neighbors: a top-down hierarchical approach is used for grouping the objects. The steps of the algorithm are given below:
• A proximity matrix is maintained for the distances between the set of points.
• Objects are clustered together based on their nearest neighbors, and objects at maximum distance can be left out.

CACTUS – Clustering Categorical Data Using Summaries. It is a very fast and scalable algorithm for finding clusters. A hierarchical structure is used to generate maximal segments or clusters. A two-step procedure describes the algorithm as follows:
• Attributes are strongly connected if the data points have a large co-occurrence frequency.
• Clusters are formed based on the co-occurrences of attribute-value pairs.
• A cluster is formed if any segment has a number of elements α times greater than the elements of the others.
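For a hands-on feel of the CF-tree idea behind BIRCH discussed above, the sketch below uses scikit-learn's Birch implementation (an assumed toolset, not one prescribed by this paper); the threshold parameter plays the role of the radius T, and the subclusters are then reduced to a fixed number of global clusters.

```python
# BIRCH via scikit-learn: build a CF-tree with a radius threshold, then
# produce a fixed number of global clusters from its subclusters.
from sklearn.cluster import Birch
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=1000, centers=4, cluster_std=0.6, random_state=0)

birch = Birch(threshold=0.5, branching_factor=50, n_clusters=4)
labels = birch.fit_predict(X)

print("number of CF subclusters:", len(birch.subcluster_centers_))
print("points per final cluster:", [int((labels == k).sum()) for k in range(4)])
```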

2. PARTITIONAL CLUSTERING

Partitional clustering is quite different from the hierarchical approach: where hierarchical clustering yields incremental levels of clusters through iterative fusions, partitional clustering assigns a set of objects to K clusters with no hierarchical structure. Research from recent years acknowledges that partitional algorithms are a favoured choice when dealing with large datasets, as these algorithms have comparatively low computational requirements; however, when it comes to the coherence of the clustering, this approach is less effective than the agglomerative approach. These algorithms assume the shape of clusters to be hyper-ellipsoidal and essentially experiment with cutting the data into n clusters so that the partitioning optimizes a given criterion. Centroid-based techniques, as used by K-MEANS and ISODATA, assign points to clusters so that the mean squared distance of points to the centroid of the chosen cluster is minimized. The sum of squared error (SE) function is the dominant criterion function in the partitional approach; it is used as a measure of variation within a cluster. One of the most popular partitioning clustering algorithms implementing SE is k-means.

K-MEANS - K-means is undoubtedly a very popular partitioning algorithm. It has been discovered, rediscovered and studied by many experts from different fields, by Steinhaus (1965), Ball and Hall (1965), Lloyd (proposed 1957, published 1982) and MacQueen (1967). It is distance-based, and by definition the data are partitioned into pre-determined groups or clusters. The distance measures used could be Euclidean or cosine. Initially a fixed set of K cluster centroids is marked at random; k-means then reassigns all the points to their closest centroids and re-computes the centroids of the newly created groups. This iteration continues till the squared error converges. The following steps summarize how k-means works:
1. Initialize a K-partition based on prior information, creating a cluster prototype matrix A = [a1, ..., ak], where a1, a2, a3, ... are the cluster centroids; the data set D is also initialized.
2. Assign each data point di in the dataset to its nearest cluster centroid ai.
3. Recalculate the cluster matrix using the current updated partition, until a1, ..., ak show no further change.
4. Repeat steps 2 and 3 until convergence has been reached.
K-means is probably the most widely studied algorithm, which is why there exist so many variations and improved versions of it; yet it can show some sensitivity towards noise and outliers present in data sets. Even if a point is far from the cluster centroid, it can still be forced into the cluster and result in a distorted cluster shape. K-means does not clearly define a universal method for deciding the total number of partitions at the beginning; the algorithm relies heavily on the user to provide the number of clusters k in advance. Also, k-means is not applicable to categorical data. Since k-means presumes that the user will provide the initial assignments, different initializations can produce different results on each run (k-means++ addresses this problem by attempting to choose better starting centroids).

K-MEDOIDS - Unlike k-means, in the k-medoids or Partitioning Around Medoids (PAM) method, a medoid represents each group. The medoid, its outstanding feature, is the most centrally located point of the group. Medoids give better results in the presence of outliers than centroids: k-means computes the mean to define the cluster center exactly, which can cause exaggerated effects, whereas k-medoids uses actual data points to represent the cluster. Basically this algorithm tries to minimize the dissimilarity between objects and their closest medoid. The following steps summarize this algorithm:
1. Initialize: a random set of k of the n data points is selected as medoids.
2. Assign: each data point is associated with the nearest medoid.
3. Update: for each medoid m and data point d, m and d can be swapped to estimate the change in dissimilarity over all data points associated with m. Steps 2 and 3 are repeated until there are no more changes in the assignments. PAM uses a greedy search, which may fail to find the optimal solution.

CLARANS - "Clustering Large Applications based on RANdomized Search": this method combines a sampling technique with PAM. It uses a random search to find clusters, and no additional structure is used, which makes the algorithm robust against an increase in the dimensionality of the data. The previously mentioned methods assume that the distance function must be Euclidean, but CLARANS uses a local search method and does not prescribe any specific distance function. CLARANS claims to recognize polygonal shapes well; a method called "IR-approximation" is used to cluster non-convex as well as convex polygonal objects.

ISODATA - An interesting method called ISODATA, the "Iterative Self-Organizing Data Analysis Technique", developed by Ball and Hall, also requires k (the number of clusters) and is an iterative variant of the K-means algorithm. It works by dynamically adjusting clusters, splitting and merging them according to default constraints such as C (the desired number of clusters), Dmin (the minimum number of data points for each cluster), Vmax (the maximum variance for splitting clusters) and Mmin (the minimum distance measure for merging). ISODATA can handle the problem of outliers better than k-means through its partitioning process, and it can also avoid producing elongated clusters.

Advantages and disadvantages: the most prominent advantage of partition-based methods is that they are well suited to spherical-shaped clusters. Algorithms such as k-means may tend to exhibit a higher sensitivity to noise and outliers, while other methods that are resistant to noise may turn out to be computationally expensive. Besides computational complexity, algorithms can be compared on some other common features, such as the ability to form clusters of arbitrary shape; although all clustering algorithms try to solve the same problem, there are performance differences worth discussing. Table II summarizes some of these comparisons and shows the different time and space complexities.

FCM - Fuzzy C-Means algorithm: the algorithm is based on the K-means concept of partitioning the dataset into clusters. The algorithm is as follows:
• Compute the cluster center points and the objective value, and initialize the fuzzy membership matrix.
• Calculate the membership values stored in the matrix.
• If the change in the objective value between successive iterations is smaller than a threshold, stop.
• This process continues until the partition matrix and clusters are created.
The paper presents all the algorithms and their efficiency based on their input parameters for big data mining.
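Since Fuzzy C-Means assigns every point a graded membership in every cluster, the short NumPy sketch below makes the update loop described above explicit. It is an illustrative implementation using the standard FCM update rules on synthetic data, not code taken from this paper.

```python
# Minimal Fuzzy C-Means: alternate between centre updates and membership updates.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(4, 0.5, (50, 2))])
c, m, tol = 2, 2.0, 1e-5                      # clusters, fuzzifier, tolerance

U = rng.random((len(X), c))
U /= U.sum(axis=1, keepdims=True)             # memberships sum to 1 per point

for _ in range(200):
    Um = U ** m
    centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
    dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    inv = dist ** (-2.0 / (m - 1.0))          # standard FCM membership update
    new_U = inv / inv.sum(axis=1, keepdims=True)
    converged = np.abs(new_U - U).max() < tol
    U = new_U
    if converged:
        break

print("cluster centres:\n", np.round(centers, 2))
print("hard assignments of first 5 points:", U.argmax(axis=1)[:5])
```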

3. DENSITY BASED ALGORITHM

This clustering strategy focuses on the principle of "neighborhoods": for any given N > 0, each neighborhood must contain a minimum number of points, i.e. the "density" of the N-region around a point must exceed some initial value (Ester et al., 1996). Proximity is not the criterion here; rather, "local density" is primarily measured. A cluster appears as a region of densely placed data points in the data space. In density-based clustering, regions of objects with low density also exist, and their distances are calculated; objects in low-density regions are outliers or noise. These methods have good tolerance to noise and can detect clusters of non-convex shapes. Two well-known representatives of density-based algorithms are Density-Based Spatial Clustering of Applications with Noise (DBSCAN) and density-based clustering (DENCLUE).

DBSCAN - It was proposed by Martin Ester, Hans-Peter Kriegel, Jörg Sander and Jiawei Han in 1996 and is one of the most popular density-based algorithms. It requires that the density in the neighborhood of an object should be high enough if it belongs to a cluster. A cluster skeleton is created from the set of core objects with overlapping neighborhoods. Points inside the neighborhoods of core objects represent the boundary of clusters, while the rest is simply noise. It requires two parameters: 1) ε, which defines the neighborhood radius, and 2) Min pts, the minimum number of points required to form a dense region. The following steps elaborate the algorithm further:
1. An unvisited random point is taken as the initial point.
2. The parameter ε is used for determining the neighborhood (data space).
3. If there exist sufficient data points in the neighborhood around the initial random point, the algorithm proceeds and this particular data point is labeled as visited; otherwise the point is labeled as noise or an outlier.
4. If this point is considered part of a cluster, then its ε-neighborhood is also part of the cluster, and step 2 is repeated for all points in the ε-neighborhood. This is repeated until all points in the cluster are determined.
5. Another unvisited data point is processed and the above steps are repeated until all clusters and noise are discovered.
Although this algorithm shows excellent results against noise, it can fail when tested on high-dimensional data sets and shows sensitivity to Min pts. This algorithm fares well compared to k-means in terms of creating clusters of varied shapes.

DENCLUE - Density-based clustering (DENCLUE) was developed by Hinneburg and Keim. This algorithm borrows heavily from the concepts of density and hill climbing. In this method there is an "influence function", which is the distance or influence between points. The influence functions are calculated and added up to find the "density function". So it can be said that the influence function is the influence of a data point in its neighborhood, and the density function is the total sum of the influences of all the data points. Clusters are determined by density attractors, the local maxima of the overall density function. DENCLUE has fairly good scalability, with a complexity of O(N), and it is able to spot and converge on clusters with unpredictable shapes, but it suffers from sensitivity towards input parameters and from the curse-of-dimensionality phenomenon.

Advantages and disadvantages: density-based methods can very effectively discover arbitrarily shaped clusters and are capable of dealing with noise in data much better than hierarchical or partition methods. These methods do not require any predefined specification of the number of partitions or clusters, but most density-based algorithms show a decrease in efficiency as the dimensionality of the data increases; although an algorithm like DENCLUE copes somewhat better with high dimensionality, it is still far from completely effective.

OPTICS (Ordering Points To Identify the Clustering Structure) takes its starting point from the DBSCAN clustering algorithm and adds two more steps to the concept of DBSCAN clustering. It is a density-based clustering algorithm similar to DBSCAN, but it can extract clusters of varying density and size in large data sets; this is useful when clusters have different densities. The main idea behind OPTICS is to extract the clustering structure of the dataset by identifying density-related points. The algorithm creates a density-based visualization of the data by constructing an ordering of the points, called a reachability plot. Each position in the ordering is associated with a reachability distance, which is a measure of how easily that point is reached from other points in the data set. Points with a similar reachability distance are likely to belong to the same cluster.

in decreasing order to form clusters. • The process is


repeated for the remaining blocks.
CLIQUE – Clustering In Quest. A subspace clustering
algorithm that takes into account numerical characteristics
using a bottom-up approach for clustering. The algorithm is
as follows: • Considering some data points, apply the same
expansion to the points to create a grid cell at a time. •
Rectangular subspace cells with density greater than τ
should be placed in uniform grids. • The process repeats
iteratively to build (q-1) dimensional units to q dimensional
units. • Substations are connected to each other to form a
group of equal size.
OPTI GRID – Optimal Grid. The algorithm is designed to
assemble large spatial databases.
The algorithm is as follows: • Define a data set with an
optimal cutting hyperplane using selected projections. •
Select the local optima around the best plane. • Place all
cutting planes with scores greater than or equal to the
minimum cutting score in the BEST CUT. • Choose q cut
plane with the highest score as BEST CUT and use q cut
plane to construct a multidimensional grid G. • Insert all
data points in D into G to determine the most populated grid
cells in G and make C clusters.
MAFIA – Adaptive Finite Interval Solution. It is a
descendant of the CLIQUE algorithm that instead of using a
fixed size cell grid structure with the same number of bins in
each dimension, creates an adaptive grid to improve
clustering quality The algorithm is as follows: • An adaptive
4. GRID BASED CLUSTERIN ALGORITHM: grid structure is built a the entire set of points is considered
STING – Statistical Information Grid based method. It is at one time. • Calculate histograms by reading pieces of data
similar .to BIRCH hierarchical algorithm to form a cluster into memory using bins. • Bins are grouped based on
with spatial data bases. The algorithm is as follows:• strength factor α. • Select bins with densities α times greater
Initially the spatial data stored into rectangular cells using a than the mean as p candidate cubic units (CDUs). •
hierarchical grid structure. • Partition each cell into 4 child Iteratively the process continuously generates new p-CDU’s
cells at the next level with each child corresponding to a and merges them into groups of adjacent CDU’s.
quadrant of the parent cell. • Calculate probability of each ENCLUS – Cluster-based entropy. The algorithm is an
cell whether it is relevant or not. If the cell is relevant then entropy based algorithm for clustering big data. ENCLUS is
apply same calculations on each cell one by one. • Find the a variation of the CLIQUE algorithm. The algorithm is as
regions of relevant cells in order to form cluster. follows: • Objects whose subspaces are spanned by feature
Wave Cluster - Among all the clustering algorithms, this is A1....AP with entropy criterion H (A1....AP) < ϖ (a
based on signal processing. The algorithm works with threshold) are selected for clustering 8.8 PROCLUS –
numerical attributes and has multi-resolution. Outliers can Projected Clustering Algorithm. The algorithm also uses a
be detected easily. The algorithm is as follows: • Fit all the medoid equivalent to the K – medoids clustering criteria.
data points into a cell. Apply wavelet transform to filter the The algorithm is described in a three-step process as
data points. • Apply discrete Wavelet transform to follows: • Initialization: Consider the configuration of all
accumulate data points. • High amplitude signals are applied points and select the data points randomly. • Iteration phase:
to the corresponding cluster interiors and high frequency is Select the medoids of the clusters as data points and define a
applied to find boundary of cluster. • Signals are applied to sub location for each medoid. • Refinement phase: Select the
the attribute space in order to form cluster with more sharp best medoids Medoids form set with all dimensions. Choose
and eliminates outliers easily. a new medoid that is close to the best medoid. All data
BANG – Grid based clustering algorithm. This is an points within this distance will be created as a cluster. 8.9
extension of the GRIDCLUST algorithm that initially ORCLUS- Oriented Projected Clustering Generation
considers all data points as blocks but uses the BANG Algorithm. It is similar to the PROCLUS clustering
procedure to keep track of the blocks. The algorithm is as algorithm but focuses on the non-axis parallel subspace. The
follows: • Divide the feature space into rectangular segments algorithm is described through three steps - assignment, sub
up to P max data points. • Construct a binary tree to location and merging as follows: • Assignment: In this step,
maintain the density indices of all calculated and sorted the algorithm assigns all data points recursively to nearby
segments in decreasing order. • Starting with the highest cluster centers. • Sub location: To determine sub location,
density index, all neighboring blocks are identified and split calculate the covariance matrix for each group and the Eigen

ISBN Number : 978-81-958673-8-7 14


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

vectors with the smallest Eigen values. • Integration: Brings SLINK – Single LINK cluster system. Model based
together teams that are close to each other and have the clustering algorithm using hierarchy method to build
same direction. 8.10 FC – Fractal Distribution System. The clusters. • Start with a set of points, making one point per
algorithm follows a hierarchical approach with multiple point. • Use Euclidean distance to determine the distance
mesh layers for statistical properties and identifies irregular between two points. • First connect the connections between
clusters. The algorithm is as follows: • Start with a data the shortest connections of all points. • Group one link
sample and consider a threshold value for a given set of together.
points. • Starting with the threshold value, scan the whole
data slowly. • The resulting points are added to each cluster III. CONCLUSION
using the HFD-Hausdorff Fractal Dimension (HFD) method. Cluster analysis is a very crucial paradigm in the entire
• If a small increase exceeds the threshold τ value, a point is process of data mining and paramount for capturing patterns
declared an outlier and the cluster shape is declared in data. This paper compared and analyzed some highly
irregular. • If not, a case is assigned to the group. 8. STIRR popular clustering algorithms where some are capable of
– Sieving Through Iterated Reinforcement. This algorithm scaling and some of the methods work best against noise in
deals with spectral partitioning using dynamic system as data. Every algorithm and its underlying technique have
follows: • Set of attributes are considered and weights W= some disadvantages and advantages and this paper has
W v are assigned to each attribute. • Weights are assigned to comprehensively listed them for the reader. Every paradigm
set of attributes using combining operator ϕ defined as • ϕ is capable of handling unique requirements of user
(W1…Wn-1) = W1+…….. + Wn-1. • At a particular point application. An extensive research and study has been done
the process is stopped to achieve dynamic system. in the field of data mining and there exist popular real life
5. MODEL BASED CLUSTERING ALGORITHM: examples such as Netflix, market basket analysis studies for
EM – Expectation and Maximization This algorithm is business giants, biological breakthroughs which use
based on two parameters- expectation (E) and maximization complex combinations of various algorithms resulting in
(M). • E: The current model parameter values are used to hybrids also and subsequently cluster analysis in the future
evaluate the posterior distribution of the latent variables. will unveil more complex data base relationships and
Then the objects are fractionally assigned to each cluster categorical data. There is an alarming need of some sort of
based on this posterior distribution as Q( θ , θ T ) = E[ log benchmark for the researchers to be able to measure
p(x g , x m | θ ) x g , θ T ] • M: The fractional assignment is efficiency and validity of diverse clustering paradigms. The
given by re-estimating the model parameters with the criteria should include data from diverse domains (text
maximum likelihood rule as θ t + 1 = max Q (θ, θ T ) The documents, images, CRM transactions, DNA sequences and
process is repeated until the convergence condition is dynamic data). Not just a measure for benchmarking
satisfied. algorithms, consistent and stable clustering is also a barrier
COBWEB – Model based clustering algorithm. It is an as a clustering algorithm irrespective of its approach towards
Incremental clustering algorithm, which builds taxonomy of handling static or dynamic data should produce consistent
clusters without having a predefined number of clusters. The results with complex datasets. Many examples of efficient
clusters are represented probabilistically by conditional clustering methods have been developed but many open
probability P (A = v | C) with which attribute A has value v, problems still exist making it playground for research from
given that the instance belongs to class C. The algorithm is broad disciplines.
as follows: • The algorithm starts with an empty root node. •
Instances are added one by one. • For each instance, the IV. BIBLIOGRAPHY
following options (operators) are considered: • - classifying
the instance into an existing class; • - creating a new class
and place the instance into it. • - combining two classes into G. Siva Prasad, M.C.A, M.Tech (CSE),
a single class (merging) and placing the new instance in the UGCNET, Works as Assistant Professor in
resulting hierarchy; • - split the class into two classes the Department of MCA , KBN College
(splitting) and placing the new instance in the resulting (Autonomous), Vijayawada, Andhra
hierarchy. • The algorithm searches the space of possible Pradesh and he is having 10 years of
hierarchies by applying the above operators and an experience in teaching and one year in
evaluation function based on the category utility. industrial experience. His research interest includes Data
SOM- self-organizing mapping algorithm. Model based Mining, Machine Learning, Deep Learning, Big Data,
clustering incremental clustering algorithm, based on Microsoft Programming Languages and Web Programming.
network configuration. The algorithm is described in two He has attended workshops on POWER BI, Data Analytics
steps: • Place the network of nodes on the plane where the using R, Generative AI, Block Chain Technology and many
data points are distributed. • Sampling data point and more.
subjecting neighboring nodes and neighboring nodes to its
influence. Another sampling statement and so on. • The
process is repeated until all data points have been sampled V. REFERENCES
repeatedly. • Each group is specifically defined by the node [1] J. Chen, X. L. Xiang, H. Zheng and X. Bao, “Novel
with the data points representing the closest node. cluster central fast decision clustering algorithm”, Appl. Soft
Comput., Vol.1, pp. 100. 57, pp. 539-555, Oct. 2017.

ISBN Number : 978-81-958673-8-7 15


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

[2] T. Wangchamhan, S. Chiewchanwattana and K. Sunat, [18] M. van de Velden, A. I. D’enza and A. Markos,
"Efficient design based on k-means and chaotic agreement "Distance-based clustering of mixed data", WIREs Comput.
competition algorithms for numerical systems and mixed- Statist., 2018.
type data clusters", Expert Syst. Applied Services, Volume [19] A. H. Foss, M. Markatou and B. Ray, "Distance metrics
1. 90, pp. 146-167, January 2017. and clustering methods for mixed-type data", Int. Stat. Rev.,
[3] K. Lakshmi, N. V. Karthikeyani, S. Shanthi and S. 2018.
Parvativartini, "Clustering Mixed Data Sets Using K- [20] K. Balaji and K. Lavanya, "Clustering algorithms for
Prototype Algorithm Based on Cow-Search Optimization" mixed datasets: A review", Int. J. Pure Appl. Math., vol. 18,
Advances and Developments in Intelligent Technologies and no. 7, pp. 547-556, 2018.
Smart Systems, Hershey, PA, USA: In IGI Globalization ,
p.129 -150, 150.
[4] A. Ahmad and S. Hashmi, "K-harmonic mean type
clustering algorithm for mixed data", Appl. Soft Comput.,
Vol.1, pp. 100. 48, pp. 39-49, Oct. 2016.
[5] J. Liang, X. Zhao, D. Li, F. Cao and C. Dang,
"Implementing information entropy for mixed data
quantification of clusters", Pattern Recognit., vol. 45, no. 6,
pp. 2251-2265, January 2012.
[6] X. Yao, S. Ge, H. Kong and H. Ning, "Improved
clustering algorithm and its application in wechat game user
analysis", Procedia Comput. Science, vol.1, pp. 100. 129,
pp. 166-174, Oct. 2018.
[7 S. Lin, B. Azarnoush and G. C. Runger, "CRAFTER: a
tree-group clustering algorithm for high-resolution and static
data structures with mixed attributes", IEEE Trans. They
need to know. Data Engineering, Vol. 9, pp. 1686-1696,
Oct. 2018.
[8] R.S. Data anal. Appendix, Volume 1, page 100. 12, no.
4, pp. 973–995, Oct. 2018.CrossRef Google Scholar
[9] H.S. Yu, Z. Chang and B. Zhou, "A new three-way
clustering algorithm for mixed-type data", Proc. IEEE Int.
Conf. Great knowledge. (ICBK), pp. 119-126, January 2017.
[10] S. S. Khan and J. Hoey, "Evaluation of fall detection
methods: A data availability perspective", Med. Eng.
Physics, vol. 39, pp. 12-22, February 2017.
[11]What is an Electronic Health Record (EHR)?,
December2018,[Online]Available:https://www.healthit.gov/
faq/what-electronic-health-record-her
[12] S.S. IEEE Int., 1999. Conf. Data Mining Workshop
(ICDMW), pp. 703-710, November 2017.
[13] S. S. Khan, B. Ye, B. Taati and A. Mihaiilidis,
"Detection of agitation and aggression in persons with
dementia using sensors—a systematic review", Alzheimer's
Dementia, vol. 14, no. 6, pp. 824–832, 2018.
[14] E. Houghton: and Green, "People Research: Business
Performance Using Human Data", 2018.
[15] E. Aljalbout, v. Golkov, Y. Siddiqi and D.S. Kremers,
Clustering with Deep Learning: Clustering and New
Approaches, 2018, [Online] Available:
https://arxiv.org/abs/1801.07648.
[16] X. Yao, S. Ge, H. Kong and H. Ning, "An improved
clustering algorithm and its application in wechat sports
users analysis", Procedia Comput. Sci., vol. 129, pp. 166-
174, Dec. 2018.
[17] E. Min, X. Guo, Q. Liu, G. Zhang, J. Cui and J. Long,
"A survey of clustering with deep learning: From the
perspective of network architecture", IEEE Access, vol. 6,
pp. 39501-39514, 2018.

ISBN Number : 978-81-958673-8-7 16


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

The Prediction Of Disease Using


Machine Learning
Salavadi Prasanna Kumar Siva Prasad Guntakala,
Student, Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous) , KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email: prasannakumarsalavadi@gmail.com Email: gspkbn@gmail.com
quickly cleaned and processed data and deliver
ABSTRACT results
Disease Prediction using Machine Learning is faster. By using this system doctors will make
the system that is used to predict the diseases good decisions related to patient diagnoses and
from the symptoms which are given by the according to that, good treatment will be given to
patients or any user. The system processes the the patient, which increases improvement in
symptoms provided by the user as input and patient healthcare services. To improve the
gives the output as the probability of the accuracy of large data, the existing work will be
disease. Naïve Bayes classifier is used in the done on unstructured or textual data. For the
prediction of the disease which is a supervised
machine learning algorithm. The probability of
the disease is calculated by the Naïve Bayes
algorithm. With an increase in biomedical and
healthcare data, accurate analysis of medical
data benefits early disease detection and patient
care. By using linear regression and decision
tree we are predicting diseases like Diabetes,
Malaria, Jaundice, Dengue, and Tuberculosis.
Keywords
Disease Prediction, Machine learning, Naive
bay’s algorithm
prediction of diseases, the existing will be done
on linear, KNN, Decision Tree algorithm
I. INTRODUCTION
Machine Learning is the domain that uses past
data for predicting. Machine Learning is the
understanding of computer system under which II. ALGORITHM TECHNIQUES
the Machine Learning model learn from data
and experience. The machine learning KNN K Nearest Neighbour (KNN) could be
algorithm has two phases: terribly easy, simple to grasp, versatile and one
amongst the uppermost machine learning
1) Training
algorithms. In the Healthcare System, the user
2) Testing. will predict the disease. In this system, the user
To predict the disease from a patient’s can predict whether the disease will detect or
symptoms and from the history of the patient, not. In the proposed system, classifying disease
machine learning technology is struggling from in various classes that shows which disease will
past decades. Healthcare issues can be solved happen on the basis of symptoms. KNN rule
efficiently by using Machine Learning used for each classification and regression
Technology. We are applying complete issue. KNN algorithm is based on feature
machine learning concepts to keep the track of similarity approach.K-nearest neighbor
patient’s health. classifier algorithm is to predict the target label
of a new instance by defining the nearest
neighbor class. The closest class will be
identified using distance measures like
ML model allows us to build models to get

ISBN Number : 978-81-958673-8-7 17


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Euclidean distance. If K = 1, then the case is presence of a particular feature in a class is


just assigned to the category of its nearest unrelated to the presence of any other feature. It
neighbor. is very easy to build and useful for large
datasets. Naive Bayes is a supervised learning
model. Bayes theorem provides some way of
calculative posterior chance P(b|a) from P(b),
The value of ‘k’ has to be
P(a) and P(a|b). Look atithe equation below:
specified by the user and the best choice
depends on the data. The larger value of ‘k’
reduces the noise on the classification. If the
new feature i.e in our case symptom has to
classify, then the distance is calculated and then
the class of feature is selected which is nearest
to the newer instance. In the instance of
categorical variables, the Hamming distance
must be used. It conjointly brings up the
difficulty of standardization of the numerical
variables between zero and one once there's a
combination of numerical and categorical
variables within the dataset.

P(b v a)= P(a v b)P(b)/P(a)

Above,

 P(b|a) is that the posterior chance of class


(b,targset) given predictor (a, attributes).
The K-NN working can be explained on the basis
of the below algorithm:  P(b) is the priori probability of class.

o Step-1: Select the number K of the  P(a|c) is that chance that is that the chance of
neighbors predictor given class.
o Step-2: Calculate the Euclidean distance
of K number of neighbors  P(a) is the priori probability of predictor. In
o Step-3: Take the K nearest neighbors as our system, Naïve Bayes decides which
symptom is to put in classifier and which is not.
per the calculated Euclidean distance.
8.3 LOGISTIC REGRESSION Logistic
o Step-4: Among these k neighbors, count regression could be a supervised learning
the number of the data points in each classification algorithm accustomed to predict
category. the chance of a target variable that is Disease.
o Step-5: Assign the new data points to that
category for which the number of the  Naïve Bayes is one of the fast and easy ML
neighbor is maximum. algorithms to predict a class of datasets.
o Step-6: Our model is ready.
 It performs well in Multi-class predictions as
compared to the other Algorithms.
NAIVE BAYES

Naive Bayes is an easy however amazingly  Naive Bayes assumes that all features are
powerful rule for prognosticative modeling. independent or unrelated, so it cannot learn the
The independence assumption that allows relationship between features.
decomposing joint likelihood into a product of
marginal likelihoods is called as 'naive'. This
simplified Bayesian classifier is called as naive
Bayes. The Naive Bayes classifier assumes the

ISBN Number : 978-81-958673-8-7 18


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

DECISION TREE

A decision tree is a structure that can be used to


divide up a large collection of records into
successfully smaller sets of records by applying
a sequence of simple decision tree. With each
successive division, the members of the
resulting sets become more and more similar to
each other. A decision tree model consists of a
set of rules for dividing a large heterogeneous
population into smaller, more homogeneous
(mutually exclusive) groups with respect to a
particular target. The target variable is usually
categorical and the decision tree is used either
to:

 Calculate the probability that a given record


belong to each of the category and,

 To classify the record by assigning it to the


most likely class (or category). In this disease
prediction system, decision tree divides the III. CONCLUSION
symptoms as per its category and reduces the The main aim of this disease prediction system is
dataset difficulty to predict the disease on the basis of the
symptoms. This system takes the symptoms of the
user from which he or she suffers as input and
generates final output as a prediction of disease.
Average prediction accuracy probability of 100%
is obtained. Disease Predictor was successfully
implemented using the grails framework. This
system gives a user-friendly environment and easy
to use.
As the system is based on the web application,
the user can use this system from anywhere and
at any time. In conclusion, for disease risk
modeling, the accuracy of risk prediction
depends on the diversity feature of the hospital
RESULTS data.
This systematic review aims to determine the
Random Forest 98.95% performance, limitations, and future use of
Software in health care. Findings may help
Naïve Bayes 89.4%
inform future developers of Disease
SVM 96.49% Predictability Software and promote
personalized patient care. The program predicts
KNN 71.28% We found that the Support Patient Diseases. Disease Prediction is done
Vector Machine (SVM) algorithm is widely through User Symbols.
used (in 30 studies) followed by the Naïve
Bayes algorithm (in 24 studies). However, the In this System Decision tree, Unplanned
Random Forest algorithm showed relatively Forest, the Naïve Bayes Algorithm is used to
high accuracy. In the 40 studies in which it was predict diseases. For the data format, the
used, RF showed the highest accuracy of system uses the Machine Learning algorithm
98.95%. This was followed by SVM which Process Data on Database Data namely,
included 96% of the accuracy considered. Random Forest, Decision Tree, Naive Bayes.
System accuracy reaches 98.3%. machine
learning skills are designed to successfully
predict outbreaks.

ISBN Number : 978-81-958673-8-7 19


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

for optimization with orthogonality


constraints,” SIAM Journal on Scientific
Computing, vol. 41, pp. 2239–2269, 2019.
IV. BIBLIOGRAPHY
7. [5] J. Pajarinen, H. L. Thai, R. Akrour, J.
Peters, and G. Neumann, “Compatible
G. Siva Prasad, M.C.A, M.Tech
natural gradient policy search,” Machine
(CSE), UGCNET, Works as
Learning.
Assistant Professor in the
Department of MCA , KBN
College, Vijayawada, Andhra
Pradesh and he is having 10 years
of experience in teaching and one year in industrial
experience. His research interest includes Data
Mining, Machine Learning, Deep Learning, Big
Data, Microsoft Programming Languages and Web
Programming. He has attended workshops on
POWER BI, Data Analytics using R, Generative
AI, Block Chain Technology and many more.

V. REFERENCES

1. Mehtab S, Sen J. Analysis and forecasting


of financial time series using CNN and
LSTM-based deep learning models. In:
Advances in Distributed Computing and
Machine Learning: Proc. of ICADCML
2021. Sahoo, J. P. et al editors. LNNS,
Springer. Vol. 302; 2022
2. Artificial Intelligence and Machine
Learning Approaches in Digital
Education: A Systematic Revision.
Information 2022, 13, 203.
https://doi.org/10.3390/ info13040203
3. Sen J, Dutta A, Mehtab S. Profitability
analysis in stock investment using an
LSTM-based deep learning model. In:
Proc. of 2nd IEEE Int. Conf. for Emerging
Technologies (INCET), Belagavi, India,
USA: IEEE Xplore; May 21-23, 2021. pp.
1-9. DOI: 10.1109/
INCET51464.2021.9456385.
4. [CrossRef] 12. Rajendran, R.; Kalidasan,
A. Convergence of AI, ML, and DL for
Enabling Smart Intelligence: Artificial
Intelligence, Machine Learning, Deep
Learning, Internet of Things. In
Challenges and Opportunities for the
Convergence of IoT, Big Data, and Cloud
Computing; IGI Global: Hershey, PA,
USA, 2021; pp. 180–195.
5. Taglietti, D.; Landri, P.; Grimaldi, E. The
big acceleration in digital education in
Italy: The COVID-19 pandemic and the
blended-school form. Eur. Educ. Res. J.
2021, 20, 423–441.
6. J. Hu, B. Jiang, L. Lin, Z. Wen, and Y.-x.
Yuan, “Structured quasinewton methods

ISBN Number : 978-81-958673-8-7 20


KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

An Overview Of Biometric
Security Technology
Seela Anitha Siva Prasad Guntakala,
Student Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous) , KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email:anithaseela@gmail.com Email: gspkbn@gmail.com

ABSTRACT II. BIOMETRIC TRAITS


Biometric traits are unique physical or
Biometrics means using your unique body parts, behavioral characteristics that each person has.
like your face, eyes, and fingerprints, or how These traits are special to each individual and
you do things, like your signature and voice, to can be used to tell people apart from one
make sure you are who you say you are. another. Some examples of biometric traits
Biometric authentication is getting more popular include fingerprints, the way someone's face
because it's more reliable than remembering looks, the pattern of their iris in their eye, the
passwords, which can be forgotten, taken by sound of their voice, or even how they walk.
others, or figured out easily. But there's a These traits are different for each person and
downside: these special things that show it's you, are used in biometric technology to verify
like fingerprints and eye patterns, can be used to identities and make sure that the right person is
track what you do and put together different accessing something or doing something.
pieces of information about you, like where you
go and what you buy. This paper talks about how III. Biometric security technology vs other
biometric systems work and compares different security methods/devices Biometric Security
ways to use these special things. Pros
High Accuracy Biometrics rely on unique
Keywords: Face, Fingerprint, Iris , Voice, Palm physiological or behavioral traits, making them
highly accurate for identification and
authentication.
I. INTRODUCTION
Convenience Users don't need to remember
The word "Biometrics" comes from passwords or carry physical tokens; their
Greek words that mean measuring life. biometric traits are always with them.
It's about using numbers to study body
features. When we talk about biometric Difficult to Replicate Biometric traits are
recognition for people, we mean using difficult to forge or copy, adding an extra layer
body parts to make sure who someone is. of security.
But we'll just say "biometrics" to mean
this. Biometric recognition is a good way Non-transferable Biometric traits are tied to
to make things safe, and it's better than individuals and cannot be easily shared or lent
regular methods that need things you to others.
have (like keys or cards) or things you
know (like passwords or PINs). The cool Contactless Many biometric methods are
thing about biometrics is that it's about contactless, which is convenient and can
who you are or what you do, so you don't promote hygiene (e.g., face recognition).
need to remember anything or carry any
special stuff. Cons
Privacy Concerns Collecting and storing
biometric data raises privacy concerns, as this
information is deeply personal.

ISBN Number : 978-81-958673-8-7 21


KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Security Risks Stolen biometric data cannot be Forgetfulness Users might forget passwords,
changed like a password; if compromised, it can leading to frustration and security risks due to
have long-lasting consequences. password recovery processes
.
Complexity Biometric systems require Reusability People often reuse passwords,
advanced technology and algorithms, which can which can lead to multiple accounts being
make them more complex and expensive to compromised if one is breached.
implement.
Weakness to Social Engineering
Environmental Factors
Passwords can be obtained through
Biometric accuracy can be affected by changes manipulation or trickery.
in an individual's physical appearance (e.g.,
injuries, aging). In summary, biometric security offers high
accuracy, convenience, and resistance to
Other Security Devices (e.g., Smart Cards, forgery, but it comes with privacy concerns and
Tokens) Pros potential data breach implications. Traditional
devices like smart cards and tokens offer a
Physical Possession Devices like smart cards different set of advantages and challenges,
or tokens require physical possession, making while passwords and PINs are familiar but have
them less vulnerable to remote attacks. known vulnerabilities. The choice between
these options often depends on the specific
No Personal Data Unlike biometrics, these security needs, user convenience, and the level
devices don't involve the storage of personal of risk an organization is willing to accept.
traits; they store coded information.
IV. VERIFICATION AND
Changeable If a token or card is lost or stolen, IDENTIFICATION
it can be easily replaced or deactivated.
Biometric systems can work in two main ways:
Cons one is like a detective, and the other is like a
gatekeeper. We'll call both of these ways
Loss or Theft If a card or token is lost or "recognition." But some people might use the
stolen, unauthorized individuals can gain access words "recognition" and "detective"
until it's reported. interchangeably.

Sharing These devices can be shared or loaned, i. Detective Mode (Identification)


potentially compromising security.
Imagine you have a bunch of pictures of people,
User Burden Users need to carry and manage and you want to find out who a specific person
these devices, which can be inconvenient is. In this mode, the system looks through all
. the pictures and tries to find a match for the
Authentication Speed Authentication using person you're looking for. This is like when
these devices might take longer compared to detectives search for a criminal in a big
quick biometric scans. database of photos.
ii. Gatekeeper Mode (Verification or
Passwords/PINs Pros Authentication)
Familiarity People are used to using passwords
or PINs for security. Now, think about unlocking your phone with
your fingerprint. The phone wants to make sure
Low Implementation Cost Implementing it's really you. It does this by comparing your
password-based security is often less expensive fingerprint with the fingerprint it knows is
initially. yours. If they match, the phone lets you in. This
is like a gatekeeper checking your special stamp
Cons to make sure you're allowed to enter.
Security Vulnerabilities Passwords can be "Recognition" covers both these ways of
guessed, cracked, or stolen through techniques figuring out who someone is. Some people use
like phishing. the word "recognition" to mean both the

ISBN Number : 978-81-958673-8-7 22


KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

detective way and the gatekeeper way. It just play crucial roles in the effectiveness and
depends on how people talk about it. reliability of biometric security technology.
So, understanding these different ways helps us VI. BIOMETRIC TECHNOLOGIES
talk about how biometric systems work when
they're trying to identify or verify people using Biometric technologies are methods that use
their unique traits like fingerprints or faces. unique physical or behavioral traits to identify
and verify individuals. These traits are difficult
V. POSITIVE AND NEGATIVE to replicate and provide a more reliable way of
RECOGNISATION confirming someone's identity compared to
traditional methods like passwords or PINs.
Positive recognition and negative recognition Here are some common types of biometric
are terms commonly used in the context of technologies:
biometric security technology to describe the
outcomes of identification and verification fingerprint recognition, facial recognition, iris
processes. Biometric security technology uses recognition, voice recognition, and palm print
unique physical or behavioral traits to recognition. These techniques use unique
authenticate individuals. Here's what these physical or behavioral traits to identify
terms mean: individuals and are widely used for security and
authentication purposes.
i. Positive Recognition
i. Fingerprint Recognition
Positive recognition, also known as positive
identification or positive authentication, refers - Trait Used Unique patterns on the fingertips.
to the successful matching of a biometric
sample (e.g., fingerprint, iris scan, face - Pros High accuracy, well-established, widely
recognition) against a per-stored reference adopted, difficult to fake.
template in a database. In other words, the - Cons Can be affected by dirty or wet fingers,
system correctly identifies the individual and concerns about privacy due to fingerprint
grants them access. Positive recognition is a databases.
desirable outcome as it ensures that authorized
individuals are granted access, enhancing
security and convenience.
Positive Recognition Example: A person uses
their fingerprint to unlock their smartphone.
The system matches the fingerprint scan with
the stored template and grants access to the
owner of the device.
ii. Negative Recognition
Negative recognition, also known as negative
identification or rejection, occurs when the
ii. Facial Recognition
biometric system correctly determines that a
presented sample does not match any of the - Trait Used Distinct features of the face.
stored templates. This outcome is essential for
preventing unauthorized access and maintaining - Pros Non-intrusive, user-friendly, gaining
security. Negative Recognition Example: An popularity, suitable for real-time
individual attempts to use someone else's face identification.
image to gain access to a secure facility using
- Cons Variability in lighting conditions and
facial recognition technology. The system
angles can affect accuracy, potential for false
correctly identifies that the presented face does
positives/negatives.
not match the authorized individual and denies
access. In summary, positive recognition is the
successful matching of biometric samples to
stored templates, ensuring access for authorized
individuals, while negative recognition is the
correct rejection of mismatched samples to
prevent unauthorized access. Both outcomes

ISBN Number : 978-81-958673-8-7 23


KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

v. Palm Print Recognition


Trait Used Patterns on the palm of the hand.
Pros Less susceptible to wear and tear than
fingerprints, difficult to fake.
Cons Requires high-quality images, limited
adoption compared to other methods.

iii. Iris Recognition


Trait Used Iris patterns in the eyes.
Pros High accuracy, stable over time, difficult
to duplicate, contactless.
Cons Requires specialized hardware, sensitive Each of these biometric techniques has its
to changes in lighting and eye conditions. strengths and weaknesses. The choice of which
technique to use depends on factors like
accuracy requirements, usability, convenience,
hardware availability, and the specific security
context in which they are being applied. Some
systems even combine multiple biometric
methods to enhance accuracy and security.

VII. SECURITY AND PRIVACY


Security Biometric technology offers strong
iv. Voice Recognition
security features due to its reliance on unique
Trait Used Voice characteristics including physical or behavioral traits. Here's how
pitch, tone, and patterns. security is maintained:

Pros Non-intrusive, convenient, can be used for 1. Uniqueness Biometric traits, such as
remote authentication. fingerprints, iris patterns, and facial
features, are distinctive to each
Cons Affected by changes in voice due to individual. This uniqueness ensures
illness, noise, or age, less secure than some that only authorized individuals can
other methods. gain access.
2. Accuracy Biometric systems are
designed to provide high accuracy in
identification and verification,
reducing the risk of unauthorized
access.
3. Anti-Spoofing Measures To prevent
fraudulent attempts, biometric systems
incorporate anti-spoofing mechanisms.
These measures detect signs of fakes,
such as images or replicas, ensuring
that only real individuals are
authenticated.
4. Encryption Biometric data, which is
sensitive and personal, should be
encrypted during storage and
transmission. Encryption safeguards

ISBN Number : 978-81-958673-8-7 24


KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

data from being accessed by Ridge-based Algorithms These


unauthorized parties. algorithms analyze the overall ridge
5. Template Protection Biometric patterns and characteristics to identify
templates, the digital representations of unique features.
biometric traits, are stored in a Core-Point Detection Algorithms
protected manner to prevent them from These algorithms locate core points
being easily copied or reconstructed. (central points in fingerprint patterns)
Privacy for fingerprint alignment and
comparison.
While biometrics offer enhanced security,
privacy considerations are paramount to protect 2. Face Recognition
individuals' rights and personal data: - Principal Component Analysis
1. Consent and Control Individuals (PCA) PCA is used to reduce the
should have the authority to give dimensionality of facial image data
informed consent for their biometric while retaining the most important
data to be collected and used. They features.
should also have the ability to control - Local Binary Pattern (LBP) LBP
when and where their data is used. algorithms extract texture patterns from
facial images, enabling efficient feature
2. Data Retention Organizations should extraction.
retain biometric data only for as long - Deep Learning Algorithms
as necessary. Unnecessary retention Convolutional Neural Networks
increases the risk of exposure and (CNNs) such as VGG, ResNet, and
breaches. FaceNet are used to learn complex
features directly from raw image data.
3. Anonymization Systems can be
designed to store biometric templates 3. Iris Recognition
in a way that doesn't directly link them - Daugman's Algorithm This
to individuals. This helps prevent the algorithm encodes iris patterns using
identification of individuals from the Gabor filters and then applies
templates. mathematical transformations for
matching.
4. Transparency Users should be - Phase-Based Algorithms These
informed about how their biometric algorithms analyze the phase
data will be used, who will have access information of iris texture patterns for
to it, and how it will be protected. accurate recognition.
- Hamming Distance Used to
5. Legal and Regulatory Complian measure the similarity between iris
Biometric systems must adhere to codes extracted from iris images.
relevant privacy laws and regulations
to ensure that individuals' rights are 4. Voice Recognition
upheld. - Mel-Frequency Cepstral
Coefficients (MFCC) MFCCs are
6. Minimization Only collect and store extracted from voice recordings to
the minimum amount of biometric data capture unique vocal characteristics.
required. For example, facial - Gaussian Mixture Models (GMM)
recognition systems can store GMMs model the statistical distribution
abstracted features rather than full of voice features for speaker
facial images. identification.
VIII. ALGORITHMS USED IN BIMETRIC - Dynamic Time Warping (DTW)
SECURITY TECHNOLOGY DTW measures the similarity between
1. Fingerprint Recognition spoken phrases, accounting for
Minutiae Detection Algorithms These variations in speech speed and
algorithms identify and extract minutiae pronunciation.
points (ridge endings, bifurcations)
from fingerprint images for matching 5. Vein Recognition
and authentication. - Image Processing Algorithms
These algorithms process near-infrared

ISBN Number : 978-81-958673-8-7 25


KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

images to extract vein patterns and Assistant Professor in the Department of MCA ,
enhance the quality of the captured KBN College (Autonomous), Vijayawada,
data. Andhra Pradesh and he is having 10 years of
- Pattern Matching Similar to experience in teaching and one year in
fingerprint matching, vein recognition industrial experience. His research interest
systems compare extracted patterns to includes Data Mining, Machine Learning, Deep
stored templates Learning, Big Data, Microsoft Programming
. Languages and Web Programming. He has
6. Gait Recognition attended workshops on POWER BI, Data
- Silhouette Extraction Gait Analytics using R, Generative AI, Block Chain
recognition algorithms extract Technology and many more.
silhouettes or key joint positions
from video footage. XI. REFERENCES
[1] Anamika Singh, Rajesh Kumar
-Dynamic Time Warping (DTW) Dhanaraj, Md. Akkas Ali, Balamurugan
DTW is used to compare gait Balusamy, Vandana Sharma,
patterns by aligning and measuring "Blockchain Technology in Biometric
temporal sequences. Database System", 20223rd
International Conference on
These algorithms, often supported by machine Computation, Automation and
learning and pattern recognition techniques, Knowledge Management (ICCAKM),
contribute to the accuracy and reliability of pp.1-6, 2022.
biometric security systems. It's important to note
that biometric systems often use a combination [2] HeniIspur
of these algorithms and techniques to ensure Pratiwi,ImanHerwidianaKartowisastro,
robust and secure authentication. BenfanoSoewito,Widodo Budiharto,
"Adopting Centroid and Bandwidth to
IX. CONCLUSION Shape Security Line", 2022 IEEE
International Conference of Computer
In conclusion, biometric security technology is Science and Information Technology
a cool and new way to make things safer. It (ICOSNIKOM), pp.1-6, 2022.
uses special things about our bodies, like our
fingerprints, faces, voices, and more, to make [3] Terene Govender, Patrice Umenne,
sure only the right people can access things. "Design of a Fingerprint Biometric
This is better than passwords. It helps stop bad Access Control System with GSM
guys from pretending to be someone else. Functionality", 2021 International
Conference on Artificial Intelligence,
But, there are some things we need to be Big Data, Computing and Data
careful about. People worry about their privacy Communication Systems (icABCD),
and the chance that this special information pp.1-6, 2021.
could be stolen. We need to keep this
information very safe and follow rules to [4] Vinayak Rai, Kapil Mehta, Jatin Jatin,
protect people's privacy. DheerajTiwari, Rohit Chaurasia,
As the technology gets better, we need to keep "Automated Biometric Personal
learning and making it even more accurate and Identification-Techniques and
safe. It's a good idea to use more than one way Applications", 2020 4th International
to stay safe, not just biometrics. By thinking Conference on Intelligent Computing
about what's right and wrong and following the and Control Systems (ICICCS),
rules, biometric security can make our world pp.1023-1030, 2020.
safer in many ways, like on our gadgets and
important places. [5] Justice Kwame Appati, Prince Kofi
Nartey, Winfred Yaokumah, Jamal-
X. BIBLIOGRAPHY Deen Abdulai, "A Systematic Review
of Fingerprint Recognition System
Development", International Journal of
G. Siva Prasad, M.C.A, M.Tech Software Science and Computational
(CSE), UGCNET, Works as Intelligence, vol.14, no.1, pp.1, 2022.

ISBN Number : 978-81-958673-8-7 26


KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

"Eathentication: A Chewingbased
[6] Jag Mohan Singh, Ahmed Madhun, Authentication Method", 2020 IEEE
GuoqiangLi, Raghavendra Conference on Communications and
Ramachandra, "A Survey onUnknown Network Security (CNS), pp.1-9, 2020.
Presentation Attack Detection for
Fingerprint", Intelligent Technologies [9] Ofir Landau, Aviad Cohen, Shirley
and Applications, vol.1382, pp.189, Gordon, Nir Nissim, "Mind your
2021. privacy: Privacy leakage through BCI
applications using machine learning
[7] Priyanka Datta, Shanu Bhardwaj, S. N. methods", Knowledge-Based Systems,
Panda,Sarvesh Tanwar, Sumit Badotra, vol.198, pp.105932, 2020.
"Survey of Security and Privacy Issues
on Biometric System", Handbook of [10] Katerina Prihodova, Miloslav Hub,
Computer Networks and Cyber "Hand-Based Biometric System Using
Security, pp.763, 2020. Convolutional Neural Networks", Acta
Informatica Pragensia, vol.9, no.1,
[8] Mattia Carlucci, Stefano Cecconello, pp.48, 2020.
Mauro Conti, Piero Romare,

An Efficient Approach for Secured Data


Transmission between IoT and Cloud
S. Ganesh, Siva Prasad Guntakala,
Student, Assistant Professor,
Department of M.C.A, Department of M.C.A,
K B N College(Autonomous), K B N college(Autonomous),
Vijayawada-520001,AP,India, Vijayawada-520001,AP.,India,
ganeshseera9785@gmail.com gspkbn@gmai.com

computational load. In the proposed system,


Abstract: the fog computing layer is used as an interface
between IoT and cloud computing layer where
The Internet of Things (IoT) network data filtering and clustering take place to
generates a lot of data and cloud servers reduce network traffic and latency. The
collect that data. The server then analyzes the ultimate aim is to provide security for data
collected data and based on the findings, transmission between IoT and the cloud.
provides appropriate intelligent services to
users as a result. If there is any faulty data Keywords:
Internet of Things (IoT), fog computing, cloud,
Fog data filtering, noisy data, data classification, k-
IoT Clo nearest neighbor, complement naive Bayes,
netw Data ud accuracy
filtration

while the server analyzes the collected data,


distorted results will be created. The data I. INTRODUCTION
captured from IoT contains lots of
heterogeneous as well as suspicious data, so A vast number of small objects known as
cleaning, filtering, and clustering of it must sensors make up the Internet of Things (IoT)
be done before sending it to the server, network. To deliver intelligent services, the
otherwise it will unnecessarily create sensors are connected to an access network.
overhead on the server. The proposed system This system is made using IoT and the cloud
consists of a filtering and clustering network. In the IoT access network, data is
mechanism for the data collected from IoT generated by sensors on devices. This data is
devices so that integrated data is transferred further transferred to the
to the cloud server which will reduce its

ISBN Number : 978-81-958673-8-7 27


KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

cloud server which takes decisions based


on data analysis and distributes the results
to actuators in the access network. The
sensor nodes in the networks have limited
computational capability and also have
limited energy resources and memory.
IoT devices can collect
unexpected sensory data due to limited
resources. The server may make an
inaccurate choice and produce incorrect
results after analyzing this faulty data
which will reduce service efficiency. Due
to resource constraints, many IoT devices
are also prone to failure. When a network
device malfunction, it produces inaccurate
or unreliable data frequently. After
analyzing the faulty input, the server
produces
misleading results.
Furthermore, extra data which is called
suspicious data with inaccurate
information might be injected into the
network. This data can also impact server
decisions, and
a poor decision will give incorrect results to
the end users. Valid data produced from the
correct objects should be used for analysis.
As a result, the server’s data integrity must
be ensured. Figure 1 illustrates the steps
involved in achieving data integration. To
maintain data integrity on the server, there
is a need to eliminate faulty/unreliable data
from the analysis. It should be cleaned
before reaching the cloud server to decrease
the load onit. The data is pre-processed to
remove or decrease noise through the use of
smoothing techniques. Faulty data can be
detected using an intrusion detection
system. There is also a need for a data
filtering system to collect only normal data
called non-suspicious data. As a result,
faulty data is avoided from analysis by the
server. So that, the server makes the correct
decision and utilizes less energy for
computation.

ISBN Number : 978-81-958673-8-7 28


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Figure 1. Steps involved in achieving data integration sets. The classification and clustering

If data filtering is performed at the cloud level, a large


number of data is transferred to the cloud, increasing
network traffic. Data loss and illegal access by outside
intruders are both caused by increased network traffic.
The solution is to filter data at the IoT device level.
Resource limits may be a potential hurdle for the volume,
velocity, and variety of IoT data if data is categorized at
the device level. If data classification is performed at the
fog computing layerwhich is present between IoT and
cloud systems can be a better choice. Fog computing
carries out communication, storage, and computation on
devices that are close to users.

Figure 2. Data filtering system at fog level

Figure 2 shows that data filtering takes place at the fog


layer which is present between IoT and the cloud layer.
Fog computing increases efficiency for massive IoT
applications by processing data locally instead of
remotely on a cloud server. In terms of early
intervention, this guarantees that responses are sent to
the user with low latency. To support extensive IoT
applications, fog computing offers decentralized cloud
architecture by extending network, repository, and
computing capabilities to the network’s edge acquired
data, the data integrity leads to a reduction in the server
computing burden. When the server processes the data,
reducing the computational load might result in lower
energy consumption.

Related work:

IoT represents a new notion for the Internet and smart


data. Preparing and processing data are two major
obstacles for researchers working with IoT. Many
significant and enlightening discoveries regarding data
features have been found after reviewing the real-world
perspective of how IoT data is examined by various
authors. A literature review aimsto find out the best
system for processing IoT data and choosing the best
data filtering technique which will filter out suspicious
data from IoT. Table 1 shows the techniques and
mechanisms used by various researchers in their
research.
The solution to the aforementioned problem is
whatever data is generated by IoT devices must be
filtered at the network’s edge, which will boost
bandwidth by transferring only relevant data from IoT to
the cloud and reduce latency.Fog is located in the local
area network and offers a decentralized environment.
Fog computing lowers latency as only summarized
information is delivered to the cloud. Fog is a much
better solution than cloud because it has a faster reaction
time and can work in a weak network. As a result, fog
computing is the ideal option. There should be a
mechanism that can deliver only relevant data to the
cloud. Data filtering is the process of identifying
potentially valuable and relevant patterns in large data
ISBN Number : 978-81-958673-8-7 29
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

techniques can be used to filter data. The purpose of


data filtering is to extract meaningful information from
large data. Data filtering systems classify the data
collected from IoT devices. The input to these systems
is training data sets. Naive Bayes (NB), KNN, K-Means,
Random Forest, Support Vector Machine, and other
common machine learning algorithms are currently being
used by many researchers for analyzing and making
decisions on IoT-generated smart data. The purposes and
capacities of these
algorithms for obtaining and processing data vary
depending upon the input. In the case of conditional
independence, complement naive Bayes (CNB)
performs well and takes less time as compared to other
machine learning algorithms. To analyze the
correlation among comparable data, the KNN
algorithm uses a relatively straightforward
methodology.

II. Implementation

System model

Figure 3 shows the proposed three-tier architecture


which consists of IoT, fog computing, and cloud
computing layers. The services provided by IoT are
based on data which is collected from IoT devices. The
fog computing layer receives the data from IoT
devices. At this layer first filtration of data takes place.
The fog layer helps to manage the data transmitted to
the cloud layer and pulls useful information for
intelligent services.
The fog layer receives enormous amounts of real-time
data from IoT devices, which is then dispersed to
numerous devices connected in this layer. Fog
computing offers constrained network, storage, and
compute services in addition to logical intelligence and
data tampering for data centers.
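As a rough illustration of this flow, the following minimal Python sketch (illustrative only and not taken from the paper; the record format, value ranges, thresholds, and function names are assumptions) shows how a fog node might clean incoming readings, drop suspicious records, and forward only event-relevant data to the cloud server.

    # Minimal sketch of the IoT -> fog -> cloud filtering flow described above.
    # Record format, ranges, and thresholds are illustrative assumptions.
    from statistics import mean

    def clean(reading, window):
        """Data cleaning: smooth a sensor value with a short moving average."""
        window.append(reading)
        if len(window) > 5:
            window.pop(0)
        return mean(window)

    def is_suspicious(value, lower=0.0, upper=100.0):
        """Suspicious-data detection: flag values outside the expected range."""
        return not (lower <= value <= upper)

    def is_event(value, threshold=60.0):
        """Event-data detection: keep readings that indicate an event of interest."""
        return value >= threshold

    def fog_filter(readings):
        """Run the three filtering steps at the fog layer; return data for the cloud."""
        window, to_cloud = [], []
        for r in readings:
            if is_suspicious(r):
                continue              # faulty or injected data is dropped at the fog layer
            v = clean(r, window)
            if is_event(v):
                to_cloud.append(v)    # only relevant (event) data is forwarded
        return to_cloud

    if __name__ == "__main__":
        sensor_stream = [20.1, 21.0, 250.0, 80.2, 82.5, 19.8, 90.4]  # 250.0 is an injected outlier
        print(fog_filter(sensor_stream))

In this sketch, only the smoothed, non-suspicious readings that cross the event threshold ever leave the fog node, which is the behaviour the architecture above relies on to reduce traffic toward the cloud.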


The data gathered by IoT devices is analyzed and aggregated by the fog computing layer. Data cleaning, removal of suspicious data, and finding event data are done at this layer, and then the data is sent to the cloud server. The server present at the cloud layer receives the data, saves the data, and analyzes it.

Data filtering system:

The data filtering system is shown in Figure 4. Data collected from IoT devices is routed through the system's data queue. The data handler reads the data present at the front of the queue. Different types of data are produced by a variety of IoT devices; the data handler is therefore required before the detecting function. It transforms the data into the appropriate data format for the detection function's learning. The detection and filtering functions are the two main features of the system. The detection function's analytical results are used in the filtering function to make decisions.

Following are the steps involved in removing data distortion and achieving data integrity.

Data cleaning using noise reduction:

Noise data means irrelevant, useless, meaningless, or corrupt data. Machines cannot correctly understand and analyze data containing unstructured text. The outcomes of any data analysis can be negatively impacted by noisy data, which also unnecessarily increases the amount of storage space needed.

The focus of the data cleaning approach is on detecting and removing noise caused by a poor data collection procedure. The most powerful signal denoising filtering technique is Empirical Mode Decomposition (EMD). Using EMD, a complex and multiscale signal can be decomposed adaptively into a set of Intrinsic Mode Functions (IMFs). IMFs are a collection of a finite number of zero-mean oscillating components. The instantaneous frequency of the IMFs is calculated using the Hilbert-Huang Transform, an analytical signal processing technique. If a signal has an equal number (or a number differing by one) of extrema and zero crossings, as well as a zero mean in both the upper and lower envelopes, then it is considered an IMF. The IMFs are deconstructed from the raw sensor signal. The filtered signal z is constructed by discarding the noise-dominated IMF and summing the remaining components, z(t) = IMF_2(t) + IMF_3(t) + ... + IMF_T(t) + r(t), where T is the total number of IMFs and r is the residue. The high-frequency noise, which is represented by the first IMF of the sensor signals, is removed here. The signal's significant qualities are preserved while the high frequencies are filtered out.

Suspicious data detection:

In a network, during data transmission, additional data with wrong information, called suspicious data, may be injected into the normal data. After analyzing such data, the server gives wrong results. If suspicious data is separated from regular data, the computation load of the server will decrease. The data filtering system uses learning techniques to identify suspicious data and intrusion data. To detect suspicious data among the incoming data, the suggested system uses the CNB [8] classifier. Working with unbalanced data sets is a specialty of CNB. Instead of calculating the likelihood that an item belongs to a certain class, CNB calculates the likelihood that the item belongs to all classes. The 'naive' part of the name comes from the fact that the predictor variables are assumed to be independent of one another; in other words, the presence of one feature in a data set has nothing to do with the presence of any other feature. It works by computing the 'posterior' probability of a certain event. The detection function predicts the suspicious data using posterior probabilities

which are calculated from a priori probabilities. The


detection function is denoted as


P(c|x) = P(x1|c) × P(x2|c) × ... × P(xn|c) × P(c), where P(c|x) is the posterior probability of the class (target) given the predictor (attribute), P(c) is the prior probability of the class, P(x|c) is the likelihood, which is the probability of the predictor given the class, and P(x) is the prior probability of the predictor.

Event data detection:

The event data is the data having valid values. To find event data among the incoming data, KNN is used. The idea of the KNN algorithm is to find a k-long list of samples that are close to the sample to be classified. KNN is best for scaling data and handling missing values. KNN is a learning-by-analogy method that compares similar training and test data sets. There are n properties that describe these tuples, and all training tuples are stored in n-dimensional space. In KNN, the test tuple for classification is provided; this approach looks for the k training tuples that are most similar to the test tuple, and these k tuples are known as the nearest neighbors. The same event is detected by numerous IoT devices. Here, the KNN technique is used to analyze the correlation among comparable data. To do correlation testing, the filtering function uses the algorithm to determine the Euclidean distance of the data characteristics.

Evaluation matrix:
The evaluation metrics are Time and Accuracy, which are described as follows:
Time: Time spent developing the model and making predictions.
Accuracy: It is the ratio of accurately anticipated observations to the total observations.

Results

The suggested system is simulated using the iFogSim2 simulator. Figure 6 shows the topology used for simulating the proposed system, which consists of sensors, actuators, fog nodes, and the cloud data center.

The proposed system uses three steps of data filtering: data cleaning, detecting suspicious data, and detecting event data. The data cleaning technique is used to remove noise caused by a poor data collection procedure. The detection function of the suspicious-data detection technique determines which inbound data traffic contains suspicious material. The event data detection technique is used to investigate the correlation among comparable data.

The second major improvement is placing the whole filtering system at the fog layer instead of at the cloud level. It reduces the data traffic, and thus the average execution delay of the proposed system decreases significantly as compared to the cloud-centric approach. The performance of the NB, CNB, and suggested systems is shown in Table 2. There are 89 cases utilized for training. The data for ten tests were chosen using random sampling with replacement. The system's output is evaluated by characteristics such as correct classification, wrong classification, and accuracy.

(Figure: Suspicious data vs. Actual data)

The above simulation results clearly state that the proposed system gives an improved and more accurate classification of IoT data as compared to the existing system developed using either NB or CNB. It is observed that data loss and data tampering are reduced considerably. The ultimate aim of secured transmission of data between IoT and cloud is achieved.
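To make the two classification stages above concrete, a minimal scikit-learn sketch is given below. It is not the authors' implementation: the toy feature values, class labels, and the choice of k are assumptions for illustration only; it simply shows how a Complement Naive Bayes suspicious-data detector and a Euclidean-distance KNN event check of the kind described could be wired together.

    # Sketch of the two learning stages described above (assumed toy data, not the paper's).
    import numpy as np
    from sklearn.naive_bayes import ComplementNB
    from sklearn.neighbors import KNeighborsClassifier

    # Toy training data: each row is a (temperature, humidity) reading from an IoT device.
    X_train = np.array([[22, 40], [23, 42], [24, 41], [95, 5], [97, 3], [21, 43]])
    y_suspicious = np.array([0, 0, 0, 1, 1, 0])   # 1 = suspicious/injected, 0 = normal
    y_event      = np.array([0, 1, 1, 0, 0, 1])   # 1 = event of interest among normal data

    # Stage 1: Complement Naive Bayes flags suspicious traffic (suited to unbalanced classes).
    detector = ComplementNB().fit(X_train, y_suspicious)

    # Stage 2: KNN (Euclidean distance by default) checks correlation with known event readings.
    event_knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_event)

    def filter_batch(batch):
        """Return only normal, event-relevant readings to forward to the cloud."""
        batch = np.asarray(batch)
        normal = batch[detector.predict(batch) == 0]       # drop suspicious records
        if len(normal) == 0:
            return normal
        return normal[event_knn.predict(normal) == 1]      # keep correlated event data

    print(filter_batch([[22, 41], [96, 4], [24, 40]]))

The feature set and thresholds in a real deployment would come from the sensors and application at hand; the sketch only mirrors the order of the stages described in the text.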
Related work:
The filtering technique is used to enhance IoT data
integrity. Most of the research done so far regarding data
integrity and the filtering of IoT data transmission uses
data cleaning, malfunctioning data detection, and event
data detection techniques separately. The combination of
three techniques in one gives prominent and tremendous
results regarding data filtering. In the proposed system,
the data filtering takes place using three steps which give
better results than the existing systems. As filtering takes


place at the fog computing layer, the cloud now receives only the relevant data. It reduces the data traffic between IoT and the cloud. As data traffic is reduced, an outside intruder is less able to attack the network, and security is provided to the data transmitted from IoT to the cloud. The proposed system will prove very useful in many applications such as augmented reality, healthcare, agriculture, smart utility services, caching and processing, gaming, and decentralized smart building controls.

III. CONCLUSION

IoT devices generate a vast amount of data. When processing such a vast amount of data, it becomes risky for IoT devices to communicate with the cloud and vice versa. Traditional cloud servers filter data in a centralized fashion, resulting in a single point of failure. Furthermore, outside intruders can target an IoT network, resulting in data tampering. Unreliable and unauthenticated data results from a high number of heterogeneous IoT devices. While various algorithms have been applied in data classification research, it is observed that some algorithms gave better results than others. This paper suggested a better filtering technique using three steps of data filtering, namely data cleaning, detecting suspicious data, and event data detection, at the fog computing layer to improve on existing data filtering systems. The developed system increases bandwidth and reduces latency, as data filtering takes place at the fog computing layer instead of the cloud computing layer, and also provides a solution for the secure transmission of data between IoT and the cloud. In the future, the work will be expanded to include the implementation of the system for a variety of applications.

IV. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh, and he has 10 years of experience in teaching and one year of industrial experience. His research interest includes Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Block Chain Technology and many more.

V. REFERENCES

1. Neware R, Shrawankar U. Fog computing architecture, applications and security issues. International Journal of Fog Computing (IJFC). 2020; 3(1): 75-105.

2. Nemade B, Shah D. An efficient IoT based prediction system for classification of water using novel adaptive incremental learning framework. Journal of King Saud University - Computer and Information Sciences. 2022; 34(8): 5121-5131.

3. Rani R, Kashyap V, Khurana M. Role of IoT-cloud ecosystem in smart cities: Review and challenges. Materials Today: Proceedings. 2022; 49: 2994-2998.

4. Bittencourt L, Immich R, Sakellariou R, Fonseca N, Madeira E, Curado M, et al. The Internet of Things, fog and cloud continuum: Integration and challenges. Internet of Things. 2019; 3-4: 134-155. https://doi.org/10.1016/j.iot.2018.09.005

5. Ribeiro FM, Prati R, Bianchi R, Kamienski C. A nearest neighbors based data filter for fog computing in IoT smart agriculture. In: 2020 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor). Trento, Italy: IEEE; 2020. p. 63-67.

6. Xenakis A, Karageorgos A, Lallas E, Chis AE, González-Vélez H. Towards distributed IoT/cloud based fault detection and maintenance in industrial automation. Procedia Computer Science. 2019; 151: 683-690. https://doi.org/10.1016/j.procs.2019.04.091

7. Goldstein A, Fink L, Meitin A, Bohadana S, Lutenberg O, Ravid G. Applying machine learning on sensor data for irrigation recommendations: revealing the agronomist's tacit knowledge. Precision Agriculture. 2019; 19: 421-444.


8. Anagaw A, Chang YL. A new


complement naïve Bayesian approach for
biomedical data classification. Journal of
Ambient Intelligence and Humanized
Computing. 2019; 10: 3889-3897.

9. Wang K, Shao Y, Xie L, Wu J, Guo S.


Adaptive and fault-tolerant data processing
in healthcare IoT based on fog computing.
IEEE Transactions on Network Science and
Engineering. 2020; 7(1): 263-273.
https://doi.org/10.1109/TNSE.2018.285930


Impact of Social Networking on Indian


Youth - A Survey
Settipalli Venkata Surya Siva Prasad Guntakala,
Student Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous) , KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email: settipallisurya1111@gmail.com Email:gspkbn@gmail.com

Abstract

The rapid proliferation of social networking platforms has revolutionized communication and interaction patterns among individuals globally. This study focuses on understanding the profound impact of social networking on the youth population in India. Through a comprehensive survey, this research aims to shed light on the multifaceted effects that these platforms have on various aspects of young Indians' lives. The survey involved a diverse sample of Indian youth across different age groups, educational backgrounds, and socio-economic strata. The research employed a mixed-methods approach, combining quantitative data through structured questionnaires and qualitative insights through open-ended questions. The responses were analyzed to discern patterns and trends related to the positive and negative consequences of social networking engagement. The findings reveal that social networking has significantly transformed the way Indian youth communicate, establish connections, and consume information.

I. INTRODUCTION

The advent of the digital age has brought about a seismic shift in the way people across the globe interact, communicate, and consume information. At the forefront of this transformation are social networking platforms, which have emerged as powerful tools reshaping human connectivity. In the Indian context, a country marked by its diverse culture, burgeoning youth population, and rapid technological adoption, the impact of social networking on the younger generation has become a subject of profound significance and interest. This study delves into the intricate web of implications that these platforms have woven into the lives of Indian youth. Social networking platforms, ranging from Facebook and Twitter to Instagram, have seamlessly woven themselves into the fabric of everyday life. The Indian youth, characterized by their tech-savvy nature and propensity to embrace new trends, have taken to these platforms with unprecedented enthusiasm. As smartphones have become more accessible and data more affordable, the virtual world has expanded, bridging geographical gaps and easing connections across even the outermost corners of the country. This phenomenon has raised pivotal questions about the impact of this digital transformation on the social, psychological, and educational well-being of the young population.

In this era of hyperconnectivity, the dynamics of relationships, self-expression, and information dissemination have undergone a transformation. The ability to instantly connect with friends, family, and even strangers, transcending physical boundaries, has reshaped traditional norms of communication. This study aims to analyze the multifaceted effects of social networking on Indian youth, examining both the positive and negative aspects of their virtual interactions. From enhanced exposure to diverse viewpoints and instant access to information, to concerns about privacy violations, cyberbullying, and digital addiction, the implications are both promising and alarming. Moreover, the educational landscape has also been deeply affected, with students using social networking for collaborative learning, information sharing, and exploring academic opportunities. Simultaneously, concerns loom over the potential distraction these platforms pose, consuming valuable study time and impairing attention. This contradiction calls for a nuanced understanding of the ways in which social networking platforms intersect with the educational pursuits of Indian youth. As these digital domains grow in influence, the role of stakeholders such as parents, educators, policymakers, and the youth themselves becomes pivotal. Balancing the advantages of increased connectivity and information access with the challenges of screen addiction, online safety, and the erosion of face-to-face interactions requires a concerted effort. This study's findings aim to inform strategies and initiatives that can guide responsible and mindful social networking engagement among Indian youth, empowering them to harness the potential of these platforms while safeguarding their holistic development. In the pages that follow, the intricate relationship between social networking and the youth of India will be explored through a


comprehensive survey, shedding light on the complexities that underpin this digital revolution. By unraveling the threads of impact that weave through the lives of young Indians, we endeavor to contribute to a deeper understanding of the forces shaping their online experiences and, accordingly, their futures.

Social networking statistics

Social Media | Active Users | Daily Users | 15-34 Ages | Indian Users
Facebook | 2.93 billion | 2.95 billion | 93% | 369.9 million
Twitter | 369 million | 229 million | 82% | 24 million
LinkedIn | 100 million | 137 million | 63% | 99 million
Instagram | 2.3 billion | 990 million | 85% | 229.6 million
Google | 4.3 billion | 270 million | 88% | 759 million

The appeal lies in the low additional cost of connectivity, sharing information, venting opinions, and updating each other on happenings in their lives. The expansive use of social networking, however, makes it an interesting study (6) regarding the risks and consequences for today's youth. Social networking, with its capability to effectively dissolve boundaries and its anytime-anywhere availability, has had an impact on privacy: sharing too much, or false and unnecessary information about themselves, voicing opinions, even becoming exposed to fraudsters or cyber criminals, and, most critical of all, the increased dependence on the Internet and social applications (13). These tend to affect the social, emotional and psychological well-being of youth. Adverse outcomes include increased exposure to cyber-bullying, unknown persons accessing personal information, online dating, sexting, sleep deprivation, exposure to inappropriate digital content, outside influences of third-party groups encouraging them to transfer money, and reduced social interaction and limited face-to-face communication.

Examples of popular social networking sites are as follows:
• Facebook is presently one of the most popular social networking applications globally. It is available in 37 languages and permits registered users to create profiles with a 'wall', like a virtual bulletin board, add friends, send messages, comment, and upload and share videos, photos and web links. This application has several public features such as:
– 'Marketplace' to post and respond to classified advertisements online;
– 'Groups' to publicize events and invite guests and friends to attend those events;
– 'Pages' to create and promote a personal or business idea or involve others in a topic;
– 'Presence Technology', which allows video calls and text chat for those online on the website;
– 'Privacy' settings to block or allow specific or all members from viewing the profile, photos or comments.
• Twitter is a micro-blogging site that doesn't require approval for registered users to broadcast and track responses to brief posts, or "Tweets." The tweets may contain links to other blogs or posts, which other people can subscribe to, follow or react to, and get update messages for, by adding "Hashtags" to the post's keyword; this functions as a metatag. The public has access to and can search the tweets. Ruby on Rails, an open source web framework that powers Twitter, offers an application developer-friendly API.
• LinkedIn is primarily made for the corporate business sector and allows registered users to create a network of other professionals they know and trust as "connections" online. Unlike Facebook or Twitter, this requires prior relationships. The primary display components on user sites here are educational and professional qualifications. This program is accessible in 24 different languages.
• Google+ gives users the option to publish status updates or photos to "Circles," which is basically a group for multi-person instant messaging in a social networking system. Friends can watch and comment on these posts. 'Hangouts' allows users to publish text and videos.

II. LITERATURE REVIEW

The explosive growth of social networking sites has generated a lot of discussion about how it is affecting young people all around the world. Researchers and academics have taken a keen interest in how social networking has affected India's young population, a country known for its diverse demographics and quick technological adoption. An overview of the current studies on the effects of social networking on Indian adolescents is intended through this survey of the relevant literature.

• Communication habits and Relationships:

Numerous studies have examined how social networking platforms have changed Indian youths' communication habits and interpersonal interactions. While these platforms have permitted greater connectivity, Gupta and Sharma (2018) have observed a shift from face-to-face encounters to digital communication, which may result in weakened interpersonal skills and a feeling of isolation. Sinha and Bhowmick (2019), on the other hand, hypothesized that social networking could improve relationships by facilitating constant connection and sharing of


experiences, maintaining links regardless of geographical distances.

• Psychological Impact and Well-Being:
Research has focused on the psychological consequences of social networking on Indian youth. Researchers like Subrahmanyam et al. (2020) underlined the possibility that social networking could contribute to FOMO and social anxiety.

• Information Consumption and Political Engagement:
Research has examined Indian youth's information consumption patterns and their political engagement in light of the rise of social media as a powerful information source. Bose and Jain (2021) noted that social media platforms were important news and information sources for young Indians, but they issued a warning regarding the filter bubble effect and possible false information. The importance of social media in promoting political knowledge and activism, particularly among urban Indian youth, was noted by Bhatia and Yadav (2018).

• Privacy and cyberbullying:
Concerns about privacy violations and cyberbullying were explored in the context of Indian youth social networks. Arora and Verma (2017) emphasized the need for digital literacy among young people to effectively navigate data protection settings. In addition, Singh and Ahuja (2018) highlighted cyberbullying and its negative effects on the mental health of Indian youth and called for stricter online safety measures.

• Cultural and gender dynamics:
Cultural and gender dynamics have shaped social networking engagement among Indian youth. Mathur et al. (2018) explored how cultural norms influenced self-presentation and identity management in social media, while Dhavan and Kaur (2021) explored the intersection of gender and social networks, highlighting the potential of online platforms to both challenge and reinforce traditional gender roles.

Social Networking Aspects

Positive Aspects
Some of the positives arising from social networking are listed in Table 2.

Negative Aspects
Some of the negatives arising from social networking are listed in Table 3.

Table 2: Social networking Positive Aspects

Education:
- Helps better collaboration and communication between teachers and students;
- Using online resources helps students learn better and faster;
- Student grades improved and absences from online classes decreased;
- Educational topics and school assignments are discussed on social sites.

Politics:
- Increase in voting when friends vote on a Facebook post;
- More likely to attend a political meeting and meet others on social networking websites;
- Social movements are an easy and fast way to mobilize people and spread information.

Awareness:
- The spread of information is faster than any media; news spreads quickly;
- Ability to use academic research resources that were previously unavailable;
- Helps inform people and enables them to change.

Social Benefits:
- Social media allows people to communicate with friends, and this increased online communication strengthens these relationships and friendships;
- People make new friends; 57% of teenagers online say they make new friends online;
- Helps to find and keep in touch with geographically distant friends.

Job Opportunities:
- Great for marketing professionals to connect and find business opportunities;
- Employers find candidates and the unemployed find work faster;
- Social media has created thousands of online shopping jobs and new opportunities.

Social Networking Survey

The writers studied the effects of social networking on Indian youth and culture through survey research. 532 responses were received after the poll's participants were issued a thorough questionnaire using Poll Monkey. Table 4 shows the breakdown and survey analysis. A few questions on social networking were posed to the respondents, and their answers are depicted using the graphs below as a guide.

What percentage of your daily time do you spend on social networking sites? Most respondents spent more than an hour or two every day on social networking sites, which, in a nation considered to have a closed culture, like India, is significant (See Figure 1).

The level of social networking addiction is question number two. The fact that the respondents routinely checked their social media accounts every morning


demonstrates a trend in the escalating interest in and addiction to social media (See Figure 2).

What is the major reason that you utilize social networking (question 3)? The main motivation for utilizing social media is typically non-essential, such as expressing ideas, which is restricted in Indian society because doing so is frowned upon (See Figure 3).

Question 4 asks about the impact on mental and physical health; the responses are shown in Table 5.

What are the various methods for gaining access to social networking applications?

Table 3: Social networking Negative Aspects

Apps access User Data:
- Social media apps require users to provide the apps access to a list of things;
- Access to user names, profile pictures, birthdays of friends on the friend list, favorite TV shows and books, etc.;
- Email the user directly by sending emails to their email address;
- View posts in the news feed, posted videos, and photos;
- Access information about families and relationships;
- Add new message posts on behalf of the user and post to the wall.

Detriment to Work:
- Encourages plagiarism and cheating when submitting coursework;
- Light users of social media receive higher grades, whereas heavy users receive lower grades;
- Students' average GPA is 3.06, whereas non-users' average GPA is 3.82;
- College students' grades decreased for every 93 minutes over the daily average of 106 spent on Facebook;
- Test scores were 20% lower for students who used the internet to study;
- 35% of admissions officers check potential candidates' social media blogs and postings, which may have a detrimental impact on judgments about employment and education;
- 51% of users between the ages of 25 and 34 visited social media while at work, proving that social networking sites lower employee productivity and harm prospects for employment;
- When evaluating a candidate for a job, employers look at their social media profiles. Things like profanity, bad spelling, grammar, racism, sexism, and references to drugs, alcohol, or other unhealthy behaviors might work against you.

Lack of Privacy:
- Young people often give out personal information online without reading the privacy policy and are not aware of abuse by third parties;
- Corporate and government interference: insurance companies use information collected from social media;
- Internet advertising practices violate privacy. If you click "like" for a brand, browser cookies provide the company with information and access to personal data and preferences.

Users Vulnerable to Crime:
- Unauthorized distribution of intellectual property can cause loss of potential income;
- Cyber attacks such as ransomware, hacking, identity theft and phishing are common problems faced by end users;
- Criminals browse social media to know where users are, and commit crimes while users are known to be on vacation.

Waste of Time:
- Constant browsing and responding to online posts and blogs takes the user's attention away from the main work, and it often takes time to return to the original assignment.

Social Detriment:
- Cyberbullying, or using electronic communication to bully someone by sending scary or threatening messages, is common on the Internet. It causes mental trauma and sometimes even suicide.
- Too much internet use correlates


with personality and brain diseases;
- Poor social skills and narcissistic tendencies, or even an immediate need for pleasure, addictive behavior and other emotional problems that lead to depression, anxiety and loneliness;
- Less time for face-to-face communication with loved ones;
- Young people tend to feel isolated, separated from the real world, and have a higher risk of depression, low self-esteem, and eating disorders.

Misinformation: Enables the spread of false rumors and unreliable information:
- Self-diagnosis of health problems and following the advice of amateur medicine;
- Making friends with someone only to get information;
- Unknowingly disclosing sensitive data to the public;
- Studies have shown that websites like Facebook influence users, through ads, to spend more money.

Table 4: Breakdown of Respondents by organization

Organization Category | Respondents | Breakup %
Financial Services | 55 | 10%
Education | 188 | 35%
Information Technology | 100 | 20%
Retail, Ecommerce | 76 | 13%
Internet Service Provider | 45 | 9%
Gaming | 25 | 6%
Media & Travel | 39 | 8%
Pharmacy | 66 | 12%

Accessing of social networking applications by users ranges as follows:
• Mobile Devices - 72% (includes Smart Phones, iPads, Kindles, Tablets);
• Desktop Computers - 40%;
• Laptops - 48%.

III CONCLUSION

The social networking patterns of the people in the study are largely consistent with those observed in previous studies on the influence of popular social media on Indian culture and the scope, purpose and manner of accessing these sites. The author also looked at the benefits of social networking sites in cultural development, forming self-identity, developing relationships and acquiring social, communication and technical skills. For future research, it is necessary to increase the sample size and select a more representative sample. This study may also suffer from the shortcomings of judgmental sampling, such as researcher biases and stereotypes, which can lead to bias.

Time per Day Spent On Social Networking Sites

Using different types of Media

Recommendations

Based on the results of this study, the researcher made the following recommendations.

• Recommendations for colleges and university institutions:
– Regulations on the use of mobile phones during lectures.

Purposes of social networking

Purpose | No. of responses (N = 134)
Friendly Communication | 101
Academic Communication | 49


To discuss new ideas | 39
To publish writings | 12
To discuss social issues and events | 39
Promote themselves or their work | 15

Social Networking statement | Strongly Agree | Agree | Undecided | Disagree | Strongly Disagree
Way of life for youth & old | 15% | 43% | 19% | 16% | 7%
Is highly addictive | 18% | 55% | 13% | 10% | 4%
Compare our lives with others | 41% | 42% | 11% | 5% | 1%
Making us restless, sleeplessness | 23% | 53% | 13% | 7% | 4%
Gives rise to cyber bullying | 18% | 43% | 22% | 13% | 4%
Glamourizes drug & alcohol | 26% | 29% | 20% | 17% | 8%
Can make us unhappy | 44% | 31% | 12% | 11% | 2%
Leads to fear of missing out | 39% | 28% | 26% | 3% | 4%
Multitasking, loss of concentration | 32% | 34% | 11% | 18% | 5%
Leads to increased peer pressure | 23% | 48% | 18% | 8% | 3%

Mental and physical effects on health responses

– Since students can access different social networks with their mobile phones, it is advisable for the university to regulate when students may use the phone during lectures, with severe punishments imposed on offenders.
– Organize seminars where students are told about the less desirable aspects of using social networks as communication tools. This can be done by putting face-to-face communication first when creating real communication or message sharing; this is where seminars would help.
– Pass laws on social media content: there must be laws to guide students on the use of social networks and their distribution in the media.

• Recommendations to the Ministry of Information Technology: Since social networks are under the competence of the Ministry of Communication Technology, its role or responsibility is to initiate and coordinate all policies and programs for the use and development of information and communication technology (ICT). Social networking is an integral part of ICT; based on this work, these recommendations are made to the Ministry:
– The Ministry must oblige all providers of social services to make an appropriately registered GSM SIM card a prerequisite for opening an account on any social network.
– Service providers must keep the personal data of each of their account holders, including their GSM phone number, and forward the information to the appropriate state agency if the need arises.
– Activation of a law on the use of social networks: the Ministry must propose the introduction of a new law that would guide social media users on what to do and what not to do. This is quite necessary now because one of the results of this study shows that some students use social networking sites to participate in cybercrimes. Such an act would create a legal framework that would help the courts to solve cybercrime cases.

IV. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh, and he has 10 years of experience in teaching and one year of industrial experience. His research interest includes Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Block Chain Technology and many more.

V. REFERENCES

[1] M. Al-Qurishi, M. Al-Rakhami, A. Alamri, M. Alrubaian, Md M. Rahman, S. Hossain, "Sybil Defense Techniques in Online Social Networks: A Survey," IEEE Access, vol. 5, pp. 1200–1219, 2017.
[2] P. Beri, S. Ojha, "Comparative Analysis of Big Data Management for Social Networking Sites," in 3rd International Conference on Computing for Sustainable Global Development (INDIACom'16), IEEE, 2016.
[3] M. Devmane, N. Rana, "Usability Trends and Access of Online Social Network by Indian population


and its analysis," in International Conference on
Nascent Technologies in the Engineering Field
(ICNTE'15), IEEE, 2015.
[4] S. Gao and X. Zhang, "Why and how people use
location sharing services on social networking
platforms in China," in 12th International Joint
Conference on e-Business and Telecommunications
(ICETE'15), IEEE, 2015.
[5] S. Gebauer, Twitter Statistics 2016. Social Claim
Blog, Nov. 1, 2016. (https://blog.
thesocialms.com/twitter-statistics-you-cant-ignore/)
[6] D. Houghton, A. Johnson, D. Nigel, M. Caldwell,
Tagger’s Delight Disclosure and Liking Behavior
in Facebook: The Effects of Sharing Photographs
Amongst Multiple Known Social Circles, Oct. 20,
2016. (http://epapers.bham.ac.uk/1723/1/2013-
03_D_Houghton.pdf)
[7] A. Isodje, “The use of Social Media for Business
Promotion,” in International Conference on
Emerging & Sustainable Technologies for Power & ICT
in a developing society, 2014.
[8] C. C. Kiliroor, C. Valliyammai, “Trust analysis on
social networks for identifying authenticated
users,” in IEEE 8th International Conference on
Advanced Computing (ICoAC’17), 2017.
[9] M. Kumar and A. Bala, “Analyzing Twitter
sentiments through Big Data,” in 3rd International
Conference on Computing for Sustainable Global
Development (INDIACom’16), IEEE, 2016.
[10] M. Madan, M. Chopra, M. Dave, “Predictions and
recommendations for the higher education
institutions from Facebook social networks,” in 3rd
International Conference on Computing for
Sustainable Global Development (INDIACom’16),
2016.
[11] S. Mittal, A. Goel, R. Jain, “Sentiment analysis of
E-commerce and social networking sites,” in
3rd International Conference on Computing for
Sustainable Global Development (INDIACom’16),
IEEE, 2016.
[12] P. Purva, A. Yadav, F. Abbasi, D. Toshniwal,
“How Has Twitter Changed the Event Discussion
Scenario? A Spatio-temporal Diffusion Analysis,” in
International Congress on Big Data (BigData
Congress’15), IEEE, 2015.
[13] E. Shaw, Status Update: Facebook Addiction
Disorder, Sept. 15, 2016. (http://theglenecho.com/
2013/01/29/status-update-facebook-addiction-disorder/)
[14] H. Singh, B. P. Singh, “Social Networking Sites:
Issues and Challenges Ahead,” in 3rd International
Conference on Computing for Sustainable Global
Development (INDIACom’16), IEEE, 2016.
[15] C. Smith, Facebook Facts and Statistics, Oct. 20,
2016. (http://expandedramblings.com/index.
php/by-the-numbers-17-amazing-facebook-stats
[16] C. Wang, Bo Yang, J. Luo, "Identity Theft
Detection in Mobile Social Networks Using Behavioral
Semantics," in IEEE International Conference on Smart
Computing (SMARTCOMP'17), 2017.
[17] B. Watch, Social Media 2016, Nov. 3, 2016.
(https://www.brandwatch.com/blog/
96-amazing-social-media-statistics-and-facts-for-2016/)
[18] Y. Zhou, D. W. Kim, J. Zhang, L. Liu, H. Jin, H.
Jin, T. Liu, "ProGuard: Detecting Malicious
Accounts in Social-Network-Based Online
Promotions," IEEE Access, vol. 5, pp. 1990–1999, 2017.



Analysis Of Methods And Application In Machine
Learning

Shaik.Ashraf, Siva Prasad Guntakala,


Student, Assistant Professor,
Department of MCA , Department of MCA,
K.B.N. College (Autonomous), K.B.N. College (Autonomous),
Vijayawada-520001, Andhra Pradesh, India Vijayawada-520001, Andhra Pradesh, India
ashrafshaik311220@gmail.com gspkbn@gmail.com

Abstract— Machine learning (ML) is a part of artificial intelligence that learns in much the way humans do. It highlights the use of data and algorithms, and it progressively improves its procedures and accuracy. Machine learning has arisen as a result of digitization and has reshaped the way complex, knowledge-intensive problems and decisions are handled. Machine learning is among the most exciting technologies, unique and continually evolving. The capability to learn is now dynamically available for use in many different places rather than in a single one. This research report aims to provide a complete study of the approaches and applications in the field of machine learning. Nowadays, multinational companies are using machine learning techniques to improve their business decisions, increase productivity, detect viruses, and estimate climate change, and they will increasingly do so with the help of machine learning. The report covers a wide range of methodologies, from supervised and unsupervised learning to reinforcement learning and deep learning, highlighting their respective strengths and applications in real-world scenarios. Furthermore, machine learning (ML) finds applications in online search engines, filtering emails to segregate spam, and websites creating customized recommendations. Even in banking software it serves to identify unusual transactions, and it powers various mobile applications such as voice recognition; the fundamental aim of machine learning is to develop smart machines.

Keywords— Machine Learning, Machine Learning Methods, Machine Learning Applications, Machine Learning Challenges, and Artificial Intelligence

I. INTRODUCTION

Machine learning (ML) is often described as the field of study that gives computers the ability to learn without being explicitly programmed. Without any change to its code, a machine learning system can improve itself, and ML is a subgroup of AI. Currently, machine learning is unique in that it has the highest potential for application, popularly in the area of information technology, where its application possibilities are nearly infinite. Within the realm of artificial intelligence (AI), there exists a subset known as machine learning (ML). The core concept behind machine learning involves providing a machine or model with data and enabling it to learn autonomously through training. Arthur Samuel introduced a revolutionary concept in 1959: instead of explicitly instructing computers, they should be empowered to learn independently. He termed this approach "machine learning," which has since become the established definition of computers' capacity to self-learn.

Machine learning involves a training method where an algorithm, part of the broader field of machine learning (ML), learns from prepared data. There are multiple ways in which ML models can enhance the efficiency of advanced processes. These models possess the capacity to analyze extensive datasets presented as records, enabling creators to identify variations and explore connections while in pursuit of specific patterns within the data.

In this procedure, machines are enabled to acquire insights from input data, relieving humans from the cumbersome responsibilities of refinement and adjustment. This highlights the impactful role of machine learning. Within the scope of machine learning, a noteworthy benefit emerges: computers are no longer reliant on explicit and inflexible programming. Instead, they showcase an impressive capacity to independently fine-tune and improve algorithms, signifying a stride into the domain of artificial intelligence (AI).

Fundamentally, machine learning explores the domain of computer algorithms that naturally boost their efficiency as they gain experience and interact with datasets. This area, which falls within the scope of artificial intelligence, is centered on developing methods that enable systems to autonomously "learn" from data, resulting in customized answers for particular obstacles. In today's age of worldwide digital evolution, machine learning emerges as a powerful strategy for streamlined data exploration.

II. MACHINE LEARNING

AI means Artificial Intelligence, which leverages machines to mimic the problem-solving and decision-making capabilities of the human mind. Machine learning is a technique used for analyzing various types of digital information, such as numbers, words, and images. It involves training computers to learn and make decisions like the human brain, based on data and experience. The primary goal is to develop models that can enhance themselves, recognize complex patterns, and solve new problems by

learning from past data. These applications continually


improve their accuracy by automatically optimizing their
processes.
The effectiveness of a machine learning model hinges on
two key factors. First, the quality of the input data plays a
crucial role, if the data provided is of low quality or
disorganized, the model's outcomes will be unreliable.
Second, selecting the appropriate model algorithm is
essential. There are a wide range of algorithms available,
each with its specific uses. While neural networks are
known for their accuracy and flexibility, simpler models
often work better with limited data.
The strength of a machine learning model lies in its ability to accurately identify features and patterns within data, leading to more precise decision-making and predictions.

I. MACHINE LEARNING METHODS

Machine learning methods can be categorized into four divisions, as depicted in Figure-1.

Figure-1: Categorization of Machine Learning Methods.

1. Supervised Learning:
Supervised learning is characterized by teaching algorithms using labeled sets of data, enabling them to effectively sort information or make precise predictions. Supervised learning is the process of providing input data as well as correct output data to the machine learning model. The mapping of input data to output data is the main aim of the supervised learning process. The learning process of supervised learning is founded on supervision and is similar to a child studying behavior under the supervision of their parents. An example of supervised learning is language translation. Supervised learning can be categorized into two methods, as shown in Figure-2.

Figure-2: Categorization of Supervised Learning.

a. Classification:
A classification problem arises when the objective is to categorize items into clear groups or distinct categories.

b. Regression:
A regression problem arises when the quantities involved exhibit continuous variation; in such situations the scenario corresponds to a regression problem.

2. Unsupervised Learning
Unsupervised learning is a machine learning approach where the machine learns on its own without someone telling it what to look for. It is given a collection of data that hasn't been labeled or sorted in any specific way, and the machine figures out patterns or connections within that data all by itself. The main aim of unsupervised learning is to organize the input data in new ways or to find groups of things that seem alike.

In unsupervised learning, there is no predetermined outcome that we're trying to achieve. The machine dives into a large pile of data and tries to uncover interesting things from it. Unsupervised learning falls into four different categories, as depicted in Figure-3, each with its own methods. These methods help the machine group similar things together or understand how different variables relate to each other in the data.

a. Clustering:
Imagine a machine learning system examining data and finding things that are quite alike. It then groups these similar things together into distinct clusters, facilitating a deeper comprehension.

b. Association:
Through the analysis of significant characteristics within the data, an unsupervised learning model can anticipate additional attributes often linked with them.

c. Anomaly Detection:
This involves the model spotting unusual data points. For instance, banks can catch potential fraud by noticing strange spending patterns, like using a card in two distant places on the same day.


d. Artificial Neural Networks:
An autoencoder takes information, compresses it into a simpler form, and then tries to recreate the original information while cleaning up any unwanted noise for better data quality.

Figure-3: Categorization of Unsupervised Learning.

3. Semi-supervised learning:
Semi-supervised learning is a blend of two approaches, where we have some information about both the process and the expected outcome, but our dataset isn't fully labeled. It is a mix of using what we know from labeled data while also trying to figure out patterns or connections in the unlabeled data to make educated predictions about new outcomes.

4. Reinforcement learning:
Reinforcement learning is a learning approach in which an agent discovers how to do things through experimentation and learning from mistakes. It is like teaching a pet new tricks: they get treats for doing things right and learn to avoid things that lead to trouble. The agent learns automatically by getting rewards for making good decisions and facing consequences for poor ones. In this approach, the agent interacts with its surroundings and explores different actions. The ultimate aim is for the agent to gather as many rewards as possible, so it becomes skilled at the task. Imagine a robotic dog that figures out how to move its arms on its own; that is a prime example of reinforcement learning. This technique uses rewards and feedback to figure out the best way to do something. It is used to train AI-powered robots, and you can even experience it when playing video games that hand out rewards once you complete a task. The difference between Supervised Learning and Unsupervised Learning models is given in Table-1.

Table-1: Comparison between Supervised and Unsupervised Learning Models

Aspect | Supervised Learning | Unsupervised Learning
Primary Goal | Train the model to predict outputs for new inputs. | Discover valuable insights and hidden patterns.
Input Data | Labeled data is used for training. | Unlabeled data is used for training.
Number of Classes | Number of classes is usually known. | Number of classes is typically unknown.
Process | A model is trained using input and output data. | A model is trained using only input data, without output data.
Subjectivity | Objective in approach. | Can involve a certain degree of subjectivity.
Algorithms Supported | Regression, classification, instance-based, neural networks. | Clustering, association, neural networks.
Complexity | Generally simpler in nature. | Tends to be more complex.
Primary Drawback | Handling large datasets can be challenging. | Determining the right number of clusters can be subjective.

III. APPLICATIONS

Machine learning occupies a significant place in modern technology, experiencing rapid expansion daily. It integrates into our everyday routines seamlessly and often without conscious awareness, as evidenced by the incorporation of this technology in tools like Google Maps, Google Assistant, and Alexa. Presented below are several prominent real-world applications of machine learning, summarized in Figure-4.
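Before turning to the applications, the short scikit-learn sketch below offers a hands-on companion to the method categories compared in Table-1. It is an illustrative example and not part of the original survey; the toy data and model choices are assumptions. A supervised classifier is trained on labeled points, while an unsupervised clusterer groups the same points without ever seeing the labels.

    # Illustrative contrast between a supervised and an unsupervised method (toy data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.cluster import KMeans

    X = np.array([[1.0, 1.1], [1.2, 0.9], [0.8, 1.0],    # one group of points
                  [5.0, 5.2], [5.1, 4.9], [4.8, 5.0]])   # a second group of points
    y = np.array([0, 0, 0, 1, 1, 1])                     # labels seen only by the classifier

    # Supervised: learns a mapping from inputs to the known output labels.
    clf = LogisticRegression().fit(X, y)
    print("classifier prediction:", clf.predict([[1.1, 1.0], [5.0, 5.1]]))

    # Unsupervised: groups the same inputs without using the labels at all.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    print("cluster assignments:", km.labels_)

The classifier answers "which known class does this new point belong to?", whereas the clusterer only answers "which points look alike?", which mirrors the Primary Goal row of Table-1.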


Figure-4: Real-World Applications of Machine Learning.

1. Medical Diagnosis:
In the world of medicine, machine learning is like a helpful tool that doctors use to figure out what's wrong with patients. It's like having a super-fast assistant who can look at things like brain scans and create 3D pictures that show exactly where the problems are.
This clever technology is especially good at spotting things like brain tumors and other issues that can affect our brains. It's like having a special pair of eyes that can see things doctors might miss. So, thanks to machine learning, medical technology is getting more advanced, making it easier for doctors to find and treat brain problems.

2. Traffic Prediction:
When navigating directions to unfamiliar destinations, assistance is often sought from Google Maps. This tool not only provides the optimal route but also anticipates traffic conditions. It accomplishes this by gauging whether the traffic is flowing smoothly, moving slowly, or extensively congested, relying on two distinct approaches.
a) Real-Time Vehicle Locations:
Google Maps uses the current positions of vehicles from its app and sensors to get an up-to-the-minute understanding of traffic conditions.
b) Past Experience:
It also considers how long trips took on previous days at the same time to gauge traffic patterns and make predictions. The notable thing is that everyone who uses Google Maps contributes to its improvement. The app gathers information from users and uses it to enhance its performance. This resembles a collaborative endeavor where collective contributions drive the continuous improvement of the app.

3. Product Recommendations:
Machine learning finds extensive application in the operations of diverse e-commerce and entertainment enterprises like Amazon, Netflix, and more. These companies employ it to provide users with personalized product suggestions. For instance, when a search is conducted on Amazon, subsequent internet browsing on the same browser yields advertisements for the same product, facilitated by machine learning.
Google employs a range of machine learning algorithms to comprehend user preferences and propose products in alignment with their interests. Similarly, Netflix utilizes machine learning to provide tailored recommendations for TV series, movies, and other forms of entertainment.

4. Virtual Personal Assistant:
Diverse virtual personal assistants, including Google Assistant, Alexa, Cortana, and Siri, are available. As the name implies, they aid us in retrieving information through voice commands. These assistants can accomplish a range of tasks solely based on voice directives, like playing music, making calls, accessing emails, and scheduling appointments.
Crucially, these virtual aides rely on machine learning algorithms. They capture our voice commands, transmit them to cloud servers, and utilize ML algorithms to interpret and execute them accordingly.

5. Email Spam and Malware Filtering:
Upon receiving new emails, automatic sorting takes place, classifying them into important, normal, or spam categories. Important messages appear in our inbox, often marked with an important symbol, while spam emails are relegated to the spam folder. This operational mechanism hinges on Machine Learning. Listed below are some of the spam filters employed by Gmail:

1. Content Filter
2. Header Filter
3. General Blacklists Filter
4. Rules-Based Filters
5. Permission Filters

Machine learning algorithms, including Multi-Layer Perceptron, Decision Trees, and the Naïve Bayes classifier, play

ISBN Number : 978-81-958673-8-7 46


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

a pivotal role in email spam filtering and the identification The technology operates through a systematic process in
of malware. that authentic transactions yield output, which is then
transformed into hash values. These hash values
6. Self-Driving Cars: subsequently serve as input for subsequent rounds. Distinct
One of the most captivating implementations of machine patterns emerge for genuine transactions, evolving when
learning is to help make self-driving cars a reality. Take fraudulent activities come into play. Consequently, the
Tesla, for example. They're a big car company that's system detects these anomalies, bolstering the security of
working on making cars drive themselves. They're using a our online transactions.
special way of teaching the cars, kind of like how we learn IV. CHALLENGES OF MACHINE LEARNING:
things on our own.
Imagine that the car is like a student, learning without a  The main problem in machine learning is when there
teacher. It figures out how to recognize people and things on isn't enough data or the data is too similar.
the road all by itself. So, when it's driving, it knows what's  Machines can't learn without information, and if the data
around and can keep everyone safe. It's like the car is getting all looks the same, it confuses the machine.
smart and knows what to do while people can relax.  For a machine to understand things well, the data it
learns from should be diverse and have different types.
7. Stock Market Trading:  Usually, machines struggle to find useful details if the
Consider the stock market, known for its volatile nature data is very limited or doesn't have much variation.
with constant fluctuations in prices. Interestingly, machine  It's a good idea to have around 20 examples for each
learning, a sophisticated technology, is harnessed to group when teaching a machine. Not having enough
anticipate these shifts. examples can make the machine's learning and
Visualize a scenario resembling peering into a crystal predictions less accurate.
ball, except this crystal ball is a specialized computational
entity known as a long short-term memory neural network. V. CONCLUSION
This intelligent mechanism scrutinizes the intricate patterns Machine learning is significant because it lets
within the stock market, striving to deduce the potential computers learn and get better at certain tasks without
trajectory. being directly told how. This learning from data and
This dynamic process can be likened to having a highly adjusting to new situations is really helpful for projects
astute companion who excels at recognizing trends and with lots of data, complicated choices, or changing
formulating informed estimations regarding the market's conditions. Machine learning has different methods for
future course. Therefore, when people employ machine making predictions based on past data, like making models
learning for stock trading, they essentially employ this with numbers. Right now, it's used for things like
cutting-edge crystal ball to facilitate more informed recognizing pictures, understanding speech, sorting emails,
decisions about purchasing and selling shares. and suggesting tags on Facebook. In the future, researchers
will also look into using this kind of learning for smart
8. Speech Recognition: design.
Within the realm of Google's utility, the option "Search
by voice" stands out, this feature squarely falls within the VI. REFERENCES
scope of speech recognition, an extensively employed fact [1] Neha Sharma, Reecha Sharma and Neeru Jindal,
of machine learning. Speech recognition involves the “Machine Learning and Deep Learning Applications-A
transformation of vocal commands into textual content, Vision”, Global Transitions Proceedings, Year: 2021,
often labeled as "Speech to text" or "Computer speech PP: 24–28, https://doi.org/10.1016/j.gltp.2021.01.004.
recognition." In contemporary times, machine learning [2] Mohsen Shah Hosseini, Guiping Hu and Hieu Pham,
algorithms have gained widespread prominence in the “Optimizing Ensemble Weights And Hyperparameters
domain of speech recognition applications. Prominent Of Machine Learning Models For Regression
technologies like Google Assistant, Siri, Cortana, and Alexa Problems”, Year: 13 January 2022, PP: 1-10,
harness speech recognition technology to comprehend and https://doi.org/10.1016/j.mlwa.2022.100251.
execute voice instructions.
[3] Jonathan Schmidt, Mário R. G. Marques, Silvana Botti
and Miguel A. L. Marques,” Recent Advances And
9. Online Fraud Detection:
Applications Of Machine Learning In SolidState
In the realm of online transactions and financial
Materials Science”, npj Computational Materials, Year:
operations, the Internet serves as a platform for purchases
2019, Volume: 5/83, PP: 1-36,
and payments. An impressive technology known as machine
https://doi.org/10.1038/s41524-019-0221-0.
learning plays a pivotal role in safeguarding our finances.
In some instances, individuals may engage in deceitful [4] Iqbal H. Sarker, “ Machine Learning: Algorithms,
activities, masquerading as someone they are not or Real-World Applications and Research Directions”, SN
attempting to pilfer funds during online transactions. The Computer Science, Year: 22 March 2021, Volume:
technology, including a sophisticated element named a feed- 2/160, PP: 1-21, https://doi.org/10.1007/s42979-021-
forward neural network, comes to our rescue. 00592-x.

ISBN Number : 978-81-958673-8-7 47


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

[5] Rutvij H. Jhaveri, A. Revathi, Kadiyala Ramana, [11] Raniyah Wazirali and Rami Ahmad, “Machine
Roshani Raut, and Rajesh Kumar Dhanaraj, “A Review Learning Approaches to Detect DoS and Their Effect
on Machine Learning Strategies for Real-World on WSNs Lifetime”, Computers, Materials & Continua
Engineering Applications”, Hindawi, Year: 2022, PP: 1- Tech Science Press, Volume:70/3, Year: 2022, PP:
26, https://doi.org/10.1155/2022/1833507. 4921- 4946, DOI:10.32604/cmc.2022.020044.
[6] Jackson Kamiri, Geoffrey Mariga, “Research Methods [12] R.F. Bikmukhamedov and A.F. Nadeev, ” Lightweight
in Machine Learning: A Content Analysis”, Machine Learning Classifiers of IoT Traffic Flows”,
International Journal of Computer and Information ResearchGate, Year: July 2019, PP: 1-6, DOI:
Technology, Volume: 10/2, Year: March 2021, PP: 78- 10.1109/SYNCHROINFO.2019.8814156.
91. [13] Giovanni Di Franco and Michele Santurro, “Machine
[7] Daniel Hoang, Kevin Wiegratz, “Machine Learning Learning, Artificial Neural Networks And Social
Methods In Finance: Recent Applications And Research”, Quality & Quantity, Year: June 2021, PP: 1-
Prospects”, Wiley, Year: 2023, PP: 1-45, DOI: 20, https://doi.org/10.1007/s11135-020-01037-y.
10.1111/eufm.12408. [14] Bo Han and Rongli Zhang, “Virtual Machine Allocation
[8] Pinky Gupta, “Research Paper on Machine Learning Strategy Based on Statistical Machine Learning”,
and Its Application”, International Research Journal of Hindawi Mathematical Problems in Engineering, Year:
Engineering and Technology (IRJET), Volume: 09/ 03, 5 July 2022, PP: 1-6,
Year: Mar 2022, e-ISSN: 2395-0056, PP: 1483-1486. https://doi.org/10.1155/2022/8190296.
[9] Wei Jin, “Research on Machine Learning and Its [15] AmirAnees , IqtadarHussain ,UmarM. Khokhar,
Algorithms and Development”, Journal of Physics: FawadAhmed, and Sajjad Shaukat, “Machine Learning
Conference Series, Year: 2020, PP: 1-5, and Applied Cryptography”, Hindawi Security and
doi:10.1088/1742-6596/1544/1/012003. Communication Networks, Year: 27 January 2022, PP:
[10] Raffaele Pugliese, Stefano Regondi and Riccardo 1-3, https://doi.org/10.1155/2022/9797604.
Marini, “Machine Learning-Based Approach: Global
Trends, Research Directions, And Regulatory
Standpoints”, Data Science and Management, Year: 23
December 2021, PP: 19-29,
https://doi.org/10.1016/j.dsm.2021.12.002.

ISBN Number : 978-81-958673-8-7 48


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

A Review On Artificial Intelligence In Pharma


Industry
Shaik Basheer Ahmed, Siva Prasad Guntakala,
Student, Assistant Professor,
Department of MCA Department of MCA,
K.B.N. College Vijayawada-520001, K.B.N. College, Vijayawada-520001,
AndhraPradesh,India Andhra Pradesh, India,
basheerahmed200028@gmail.com Email: gspkbn@gmail.com

ABSTRACT methodical solutions reflects AI's affinity with


theorems in mathematics.
The abstract of the article highlights the growing
role of artificial intelligence (AI) in pharmaceutical Broadly, AI encompasses the design and
technology. It emphasizes the benefits of AI in deployment of algorithms for data analysis,
terms of time and cost savings and its ability to learning, and interpretation. Its scope extends
enhance the understanding of complex across various Sub fields, including statistical and
relationships between different formulations and machine learning, pattern recognition, clustering,
process parameters. The abstract defines AI as a and similarity-based techniques. This flourishing
branch of computer science that utilizes technology has permeated numerous facets of life
symbolized programming for problem-solving, and industry, reshaping conventional approaches.
having evolved into a problem-solving science with
Notably, the pharmaceutical industry has embraced
wide-ranging applications in business, healthcare,
AI's potential to revolutionize its landscape. By
and engineering. The article delves into several key
harnessing automated algorithms, artificial
areas where AI is making significant contributions
intelligence in Pharma ventures into tasks
in the pharmaceutical field, including drug
historically reliant on human expertise. In recent
discovery, AI tools, manufacturing execution
years, this technology has become a beacon of
systems, predicting new treatments, novel peptide
innovation within the pharmaceutical and biotech
development, rare disease management, drug
sectors. It stands as a key ally in addressing the
adherence, and the challenges faced in AI adoption
most formidable challenges facing the industry
within the pharmaceutical industry.
today. Over the past half-decade, AI's integration
Keywords into these fields has fundamentally reshaped drug
development, disease management strategies, and
Drug Discovery, tools of AI, MES, ACPS, other pivotal endeavours.
treatment and management of rare diseases, drug
History
adherence and dosage, challenges to adoption of AI
in pharma. Allen Newell, Herbert A Simon. Was developed
the Logic Theorist .it was born in 1956 that
I. INTRODUCTION Dartmouth College had organized the famous
conference, It has been forecasted that the revenue
As AI continues to progress, its impact on diverse
from AI market will be increasing by as much as
sectors becomes increasingly apparent. Stemming
ten-fold between the years 2017 and 2022. Natural
from computer science, AI's symbolic
language processing market, which has several
programming approach has matured into a robust
applications including text prediction, and speech
problem-solving science, with far-reaching
and voice recognition has been said to achieve a
applications spanning business, healthcare, and
growth of 28.5% in the year 2017. Worldwide
engineering domains.
revenue from big data and business analytics was
AI's fundamental objective lies in identifying US$ 122 billion in the year 2015 and it is being
information processing challenges and offering expected that the figures will rise to more than US$
abstract methodologies—akin to mathematical 200 billion by the year 2020. Artificial intelligence
theorems—to address them. This pursuit of has a rocky history spanning back to the 1950s. For
a long time it was seen as a field for dreamers, but

ISBN Number : 978-81-958673-8-7 49


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

that started to change in 1997 when IBM‟s Deep


Blue computer was able to defeat chess champion
Garry Kasparov. By 2011, IBM‟s new Watson
supercomputer was able to win the US$1m prize in
the US game-show Jeopardy. Since then, Watson
has expanded into healthcare and drug discovery,
including a partnership with Pfizer in 2016 to
accelerate drug discovery in immuno-oncology. In
December 2016 IBM in collaboration with Pfizer
introduced IBM Watson, a cloud-based such as
medical lab reports and helps researchers with the
ability to identify relationships between distinct
data sets through dynamic visualizations.

Artificial Intelligence in Drug Discovery II. TOOLS OF AI

In the realm of drug discovery, the traditionally Robot Pharmacy


time-consuming process of testing compounds
against diseased cells has met a revolutionary ally A robot pharmacy is like a special kind of
in artificial intelligence (AI). Novartis research pharmacy where robots help with giving out
teams have harnessed the power of machine medicines. These robots can count pills, put them
learning algorithms to analyse images and predict in the right packages, and even stick labels on
which untested compounds show promise for them. This helps make sure people get the right
further investigation. By leveraging AI's rapid medicine in the right amount. The robots also keep
computational capabilities, this predictive track of how much medicine is left and can even
screening expedites the identification of work day and night to help people get their
biologically active compounds, potentially medicine faster. This kind of pharmacy is really
expediting the availability of effective drugs. This good at preventing mistakes and making things
approach not only accelerates the timeline for drug faster and easier for both the pharmacy staff and
development but also alleviates the financial the people who need the medicine
burdens associated with labour-intensive manual
analyses.

Top biopharmaceutical companies are


spearheading remarkable AI initiatives. One
significant endeavour involves a mobile platform
designed to enhance health outcomes. Through
real-time data collection, this platform empowers
timely patient recommendations, with the potential
to transform treatment outcomes. Simultaneously,
the pharmaceutical industry is partnering with
software companies to integrate cutting-edge
technologies into the intricate drug discovery
process. This collaboration aims to streamline and
enhance the discovery of novel drugs, offering a
glimpse into a more efficient and resource-effective MEDi Robot
future.
The MEDi Robot is a friendly and interactive robot
designed to help kids during medical procedures,
especially in hospital settings. It's created to make
these experiences less scary and more comfortable
for young patients.

The MEDi Robot uses technology to distract and


engage children, helping them cope with pain and
anxiety. It can play games, tell stories, and interact
with the child through a tablet-like interface. This
distraction helps take their mind off the medical
procedure, making the process easier for both the
child and the healthcare professionals.

ISBN Number : 978-81-958673-8-7 50


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

The robot's aim is to create a positive and through automation in busy healthcare
supportive environment, helping children feel more environments.
at ease during medical treatments. It's a great
example of how technology can be used to improve
healthcare experiences, especially for young
patients who might find medical procedures
stressful or frightening.

Erica robot

Erica is an advanced humanoid robot known for its


lifelike appearance and natural conversational
abilities. Developed by a team in Japan, Erica's
realistic features and gestures enable it to engage
with humans in relatable ways. It uses artificial
intelligence and machine learning to hold dynamic
conversations, respond to questions, and even
convey emotions through its expressions and
gestures. Erica serves as both a practical tool for Manufacturing Execution System (MES)
various applications and a research platform for
studying human-robot interaction and social A Manufacturing Execution System (MES) is a
robotics. sophisticated software solution that acts as the
digital nerve centre of manufacturing operations. It
serves as the bridge between the high-level
planning and coordination done by enterprise
systems and the actual execution of production
processes on the shop floor. MES plays a pivotal
role in optimizing manufacturing efficiency,
quality control, and real-time visibility.

MES systems provide comprehensive tracking and


monitoring of production processes, giving
manufacturers instant insights into the progress of
orders, the utilization of resources, and the
movement of materials. By allocating tasks,
machines, and personnel based on real-time data
and production priorities, MES ensures efficient
resource utilization and minimizes bottlenecks.

Quality management is another cornerstone of


MES. It ensures that production processes are in
line with established standards and that defects or
deviations are quickly identified and rectified. The
TUG robots system also empowers operators with standardized
work instructions, helping them navigate complex
TUG robots are innovative autonomous mobile tasks with precision.
robots used primarily in healthcare settings to
automate the transportation of items like medical One of the central strengths of MES lies in data
supplies and equipment. They navigate collection and analysis. It captures and records data
independently through hospitals, relying on sensors at every stage of production, offering
and mapping technology to safely move through manufacturers a wealth of insights for decision-
corridors, elevators, and rooms. TUG robots making, process improvement, and compliance
enhance efficiency by handling various tasks, such adherence. MES seamlessly integrates with other
as delivering medications or transporting samples, manufacturing systems, such as Enterprise
freeing up healthcare staff to focus on patient care. Resource Planning (ERP) and SCADA systems,
These robots contribute to smoother operations, enabling the exchange of vital information across
improve workflow, and promote cost savings the manufacturing landscape.

ISBN Number : 978-81-958673-8-7 51


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

preclinical phase. As a result, drug manufacturers


can get a better picture early on about the
effectiveness of a drug on human cells. More
specifically, Verge uses artificial intelligence to
keep track of the impact certain therapies on the
human brain with a particular focus on the
preclinical phase.

Development of Novel Peptides from Natural


food

The Irish start up Nerites leverages AI and other


novel technologies facilitate the discovery of new
and more robust food and healthy ingredients.
BASF (Baden Aniline and Soda Factory”) will take
advantage of this partnership to develop novel
Ai to Predict New Treatments functional peptides derived from natural foods. In
practice, BASF uses Nutrias AI and DNA analysis
Verge is using automated data gathering and capabilities to predict, analyse, and validate
analysis to tackle main problems in drug discovery. peptides from natural sources. The main goal of
In other words, they are taking an algorithmic BASF is to discover and deliver to the market
approach to map out hundreds of genes that play peptide-based therapies that’ll help treat conditions
complex roles in brain diseases like Alzheimer’s, like diabetes.
Parkinson’s or ALS. Verge’s hypothesis is that
gathering & analysing gene data will positively Treatment and Management of Rare diseases
impact the drug discovery phase starting with the
preclinical trials. The idea is that Verge can use AI Advances in AI, renewed interest in rare disease
to monitor the impact that specific drug treatments treatments. Currently, there are over 350 million
have on the human brain starting with the people with over 7,000 rare

Diseases around the world. However, it’s not all the right pill. And the results were amazing,
gloom and doom for patients with rare diseases as improving adherence by up to 90%.Genpact‟s AI
Heal, a UK-based biotech firm, has secured $10 solution has been used severally in clinical trials to
million in Series A funding to use AI to develop change the dosage given to specific patients to
innovative drugs for rare conditions. Thera chon, optimize the results. In this partnership, Bayer
another Swiss biotech company that leverage AI to takes advantage of Genpact’s Pharmacovigilance
develop drugs for the treatment of rare genetic Artificial Intelligence (PVAI) to not only monitor
diseases, has received $60 million in funding. drug adherence but also detect potential side effects
much earlier.

Using Ai to Make Sense of Clinical Data & To


Produce Better Analytics

Apple’s Research kit makes it easy for people to


enrol in clinical trials and studies without having to
go through physical enrolment. It’s a clinical
research ecosystem designed around its two
flagship products, the iPhone and the Apple Watch.
Duke University, for instance, uses patient data
Drug-Adherence & Dosage collected by these Apple devices and AI-driven
facial recognition algorithm to identify children
AbbVie partnered with New York-based Acura to with autism. Research kit has made it easy to make
enhance drug trial vigilance and improve drug better sense of collected health data
adherence. In this collaboration, AbbVie used
facial and image recognition algorithm of Ai Cure Finding More Reliable Patients Faster For
mobile SaaS platform to monitor adherence. To be Clinical Trials
more specific, the patients take a video of
Although there’s a lot of patient data out there,
themselves swallowing a pill using their
recruiting the right patients for clinical trials is a
smartphones, and the AI-powered platform
difficult process for big pharma. For instance,
confirms that indeed the correct person swallowed
finding and enrolling ideal candidates can make

ISBN Number : 978-81-958673-8-7 52


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

clinical trials last an average of 7.5 years, costing approx. 161 billion GB of data as of 2011. With
between $161 million and $2 billion per drug. humongous data available in this domain, artificial
intelligence can be of real help in analysing the
Unfortunately, 80 percent of clinical trials fail to data and presenting results that would help out in
make deadlines. With over 18,000 clinical studies decision making, saving Human effort, time, and
currently recruiting candidates in the US, the $65 money and thus help save Lives. Epidermis
billion clinical trial market needs an overhaul. outbreak prediction; using machine learning
Extracting useful data from patients‟ records is /artificial intelligence one can study the history of
perhaps the biggest challenge for pharmaceutical epidemic outbreak, analyse the social media
companies. Thankfully, that’s where AI and activity and predict where and when epidemic can
machine learning comes into the picture. effect with considerable accuracy. Apart from the
fore mentioned use-cases there are numerous others
III. CHALLENGES TO ADOPTION OF AI IN like: Personalizing the treatment Help build new
PHARMA tools for the patient, physicians etc. Clinical trials
research: applying predictive analytics to identify
While AI has an extensive potential to help candidates for the trial through social media and
redefine the pharmaceutical industry, the adoption doctor vests.
itself is not an easy walk in the park. Challenges
that pharma companies face while trying to adopt IV. LIMITATIONS
AI:  The unfamiliarity of the technology – for
many pharma companies, AI still seems like a Streamlining electronic records; which are messy
“black box” owing to its newness and esoteric and unorganized across the heterogeneous
nature.  Lack of proper IT infrastructure – that’s databases &are to be cleaned first. Transparency:
because most IT applications and infrastructure people need transparency in health care they
currently in use weren’t developed or designed receive, which quite a task is given the complexity
with artificial intelligence in mind. Even worse, of the processes involving artificial intelligence.
pharma firms have to spend lots of money to Data governance: medical data is private and in
upgrade their IT system.  Much of the data is in a accessible legally. Consent from the public is
free text format – that means pharma companies important Hesitant to change: pharma companies
have to go above and beyond to collate and put this are known to be traditional and resistant to change.
data into a form that’s able to be analysed. Despite We have to break the stigma to give the best care
all these limitations, one thing is for certain: AI is we can.
already redefining biotech and pharma. And ten
years from now, Pharma will simply look at Benefits and Issues
artificial intelligence as a basic, every day,
technology.  Effective use of incomplete data sets,

 Rapid analysis of data,

 Ability to accommodate constraints and


preferences and ability to generate understandable
rules.

 Enhancement of product quality and performance


at low cost,

 Shorter time to market,

 Development of new products,


Artificial Intelligence in Pharma is a good idea:  Improved customer response,
Pharmaceutical Industry can accelerate innovation  Improved confidence and.
by using technological advancements. The recent
technological advancement that comes to mind  AI would have a low error rate compared to
would be artificial intelligence, development of humans, if coded properly. They would have
computer systems able to perform tasks normally incredible precision, accuracy, and speed.
requiring human intelligence, such as visual
perception, speech recognition, decision-making,  They won't be affected by hostile environments,
and translation between languages. An estimate by thus able to complete dangerous tasks, explore in
IBM shows that entire Healthcare domain has

ISBN Number : 978-81-958673-8-7 53


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

space, and endure problems that would injure or need. Artificial intelligence is the design and
kill us. application of algorithms for analysis of learning
and interpretation of data.
 This can even mean mining and digging fuels that
would otherwise be hostile for humans. VI. BIBLIOGRAPHY

 Replace humans in repetitive, tedious tasks and in G. Siva Prasad, M.C.A, M.Tech
many laborious places of work. (CSE), UGCNET, Works as
Assistant Professor in the
 Predict what a user will type, ask, search, and do. Department of MCA , KBN
They can easily act as assistants and can College (Autonomous),
recommend or direct various actions. An example Vijayawada, Andhra Pradesh and
of this can be found in the smartphone. he is having 10 years of experience in teaching and
one year in industrial experience. His research
 Can detect fraud in card-based systems, and interest includes Data Mining, Machine Learning,
possibly other systems in the future. Deep Learning, Big Data, Microsoft Programming
Languages and Web Programming. He has
 Organized and manages records. attended workshops on POWER BI, Data Analytics
using R, Generative AI, Block Chain Technology
 Interact with humans for entertainment or a task and many more.
as avatars or robots.

 An example of this is AI for playing many VII. REFERENCES


videogames.
1. http://en.wikibooks.org/wiki/Computer Science:
 Robotic pets can interact with humans. Can help Artificial_Intelligence
w/ depression and inactivity. http://www.howstuffworks.com/arificialintelligenc
e
 Can fulfil sexual pleasure.
2. http:// www.google.co.in
 They can think logically without emotions,
making rational decisions with less or no mistakes. 3. http://www.library.thinkquest.org

 Can assess people. 4. https://www.javatpoint.com/application-of-ai

 This can be for medical purposes, such as health 5.https://www.educba.com/artificial-intelligence-


risks and emotional state. Can simulate medical techniques/
procedures and give info on side effects.
6.https://www.cigionline.orgw/articles/cyber-
 Robotic radiosurgery, and other types of surgery securitybattlefield/?
in the future, can achieve precision that humans utm_source=google_ads&utm_medium=grant&gcli
can't.  They don't need to sleep, rest, take breaks, d=EAIaIQobChMIsdz9qLSF_AIVzQ0rCh1bNQyl
or get entertained, as they don't get bored or tired. EAA YAiAAEgI40_D_BwE

 Can cost a lot of money and time to build, 7. Bass D (2016) 0icrosoі develops AI to help
rebuild, and repair. Robotic repair can occur to cancer doctors find the right treatments.
reduce time and humans needing to fix it, but that'll Bloomberg.
cost more money and resources.
8.University of California San Francisco. New
V. CONCLUSION UCSF Robotic Pharmacy Aims to Improve Patient
Safety. Available from: https://www.ucsf.edu/
Human being is the most sophisticated machine news/2011/03/9510/new-ucsf-robotic-
that can ever be created. The human brain, which is pharmacyaimsimprove-patient-safety. [Last
working hard to create something that is much Accessed on 2017 Jun 24
more efficient than a human being in doing any
given task and it has great success to extent in 9. McHugh R, Raccoon J. Meet MEDi, the Robot
doing so. The AI tools like Watson for oncology, Taking Pain Out of Kids‟ Hospital Visits.
tug robot and robotic pharmacy has change the Available from: http://
profession considerably. The bigger the healthcare www.nbcnews.com/news/us-news/meet-medi-
sector gets more sophisticated and more robottaking-pain-out-kidshospital-visits-n363191.
technologically advanced infrastructure it will [Last accessed on 2017 Jun 24

ISBN Number : 978-81-958673-8-7 54


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

10. Trynacit K. MEDi Robot to Comfort Patients in


Stollery Children’s Hospital. Available from:
http://www.cbc. ca/news/canada/edmonton/medi-
robot-to-comfortpatients-in-stollery-children-
shospital-1.3919867. [Last accessed on 2017 Jun
24].

11. Eye for Pharma. Artificial intelligence- A


Brave New World for Pharma. Available from:
https://www.social.eyeforpharma.com/clinical/artifi
cial-intelligence-brave-new-worldpharma.[Last
accessed on 2017 Jun 24].

12. McCurry J. Erica, „most intelligent‟ Android,


Leads Japan’s Robot Revolution. Available from:
http:// www.thehindu.com/todays-paper/tp-
national/ Erica-%E2%80%98mostintelligent
%E2%80%99-android-leads-Japan%E2%80%99s-
robot-revolution/ article13974805.ece [Last
accessed on 2017 Jun 24].

13. Anthon. TUG robots. Available from:


http://www.aethon. Com/tug/tug healthcare/. [Last
accessed on 2017 Jun.

14. Siemens. SIMATCSSIMATCS IT for the


Pharmaceutical Industry. Available from:
https://www.industry.siemens.com/verticals/global/
en/pharma-industries/products-and-services/
industrial-software/pages/manufacturing execution-
system.aspx.[last accessed on 2017 Jun

ISBN Number : 978-81-958673-8-7 55


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Overview Of Machine Learning Vs Deep


Learning
Shaik Bushra Tabassum Siva Prasad Guntakala,
Student Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous) , KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email: shaikbushratabassum24@gmail.com Email: gspkbn@gmail.com

ABSTRACT recognition, language translation, and


recommendation systems.
The goal of the artificial intelligence
subfields of machine learning and deep learning is Deep Learning:
to enable computers to learn from data and make
predictions or choices. Machine learning refers to a Deep learning is a subset of machine
broader range of methods that use statistical models learning that specifically deals with artificial neural
and algorithms to let computers learn from networks inspired by the human brain's structure.
experience and enhance their performance. It uses a These networks consist of multiple layers (deep
variety of techniques, including clustering, support layers) that learn to represent data in increasingly
vector machines, and decision trees. abstract ways. Deep learning has become highly
On the other hand, deep learning is a successful in tasks like image and speech
subset of machine learning that specifically deals recognition, natural language processing, and even
with neural networks composed of multiple layers, playing games. Convolutional Neural Networks
hence the term "deep." Deep learning has gained (CNNs) and Recurrent Neural Networks (RNNs)
immense popularity due to its ability to are common types of deep learning architectures.
automatically learn hierarchical representations of In essence, deep learning is a specialized
data, leading to exceptional performance in tasks approach within the broader field of machine
like image and speech recognition, natural learning, characterized by its use of deep neural
language processing, and more. Deep learning networks for complex pattern recognition and
models, especially convolutional neural networks feature extraction.
(CNNs) for images and recurrent neural networks
(RNNs) for sequences, have demonstrated
remarkable success in complex tasks.
In summary, while machine learning is a
broader field encompassing various techniques to
enable computers to learn from data, deep learning
is a specialized approach within machine learning
that relies on deep neural networks to automatically
learn and represent patterns in data, achieving
impressive results in various domains.

I. INTRODUCTION

Machine learning is a broader field that What is Machine Learning?


focuses on developing algorithms and models that
enable computers to learn from data and make
predictions or decisions. It involves various Machine learning is a subset of artificial
techniques like regression, clustering, decision intelligence that involves training computers to
trees, and support vector machines. These learn and improve from experience, without being
algorithms are trained on data to improve their explicitly programmed. It enables computers to
performance over time, making them capable of recognize patterns and make decisions based on
handling a wide range of tasks such as image

ISBN Number : 978-81-958673-8-7 56


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

data, allowing them to perform tasks or make II.APPLICATIONS OF MACHINE


predictions without relying on explicit instructions. LEARNING
Machine Learning allows the computers to
Image Recognition: Image recognition is one of
learn from the experiences by its own, use
the most widely used applications of machine
statistical methods to improve the performance and
learning. It's employed to label things like digital
predict the output without being explicitly
images, people, places, and objects. The most
programmed.
popular use of face and image recognition is
The popular applications of ML are Email automatic friend tagging recommendation.
spam filtering, product recommendations, online
fraud detection, etc. Speech Recognition: Speech recognition, often
known as "Speech to text" or "Computer speech
Some useful ML algorithms are: recognition," is the process of turning spoken
 Decision Tree algorithm commands into text. Speech recognition
applications currently use machine learning
 Naive Bayes algorithms extensively. Speech recognition
 Random Forest technology is used by Alexa, Google Assistant,
 K-means clustering Siri, Cortana, and Microsoft Cortana to carry out
 KNN algorithm voice commands.
 Apriori algorithm etc.
Product recommendations: Amazon, Netflix, and
other e-commerce and entertainment businesses
How does Machine Learning work? frequently utilize machine learning to recommend
products to users. Because of machine learning,
At a high level, machine learning is a whenever we look for a product on Amazon, we
process where computers learn from data to make begin to see advertisements for the same product
predictions or decisions without being explicitly while using the same browser to browse the
programmed. It involves algorithms that learn internet.
patterns from data and use those patterns to make
accurate predictions or decisions on new, unseen Self-driving cars: Self-driving automobiles are
data. There are various types of machine learning, one of the most intriguing uses of machine
including supervised learning (using labelled data), learning. Self-driving cars heavily rely on machine
unsupervised learning (finding patterns in learning. The most well-known automaker, Tesla,
unlabelled data), and reinforcement learning is developing a self-driving vehicle. In order to
(learning through trial and error). The learning train the car models to recognize people and objects
process involves adjusting parameters based on the while driving, unsupervised learning was used.
data to minimize errors and improve performance.
Email Spam and Malware Filtering: Every new
The example of recognising the image of a email that we get is immediately classified as
cat or dog can help explain how machine learning essential, common, or spam. Machine learning is
models function. The ML model uses input photos the technology that enables us to consistently
of both cats and dogs to determine this, extracting receive essential emails marked with the important
attributes like shape, height, nose, and eyes before sign in our inbox and spam emails in our spam box.
applying the classification method and predicting
the result.
Virtual Personal Assistant: We have a variety of
virtual personal assistants, including Siri, Cortana,
Consider the below image: Alexa, and Google Assistant. They assist us in
discovering the information using our voice
commands, as the name says. Our voice commands
to these assistants, such as "Play music," "Call
someone," "Open an email," and "Schedule an
appointment," among others, can support us in a
variety of ways.

Online Fraud Detection: Machine learning keeps


our online transactions safe and secure by
recognizing fraudulent activities. There are a
variety of ways for a fraudulent transaction to

ISBN Number : 978-81-958673-8-7 57


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

happen every time we make an online purchase, layer processes and transforms the input data,
including the use of false accounts and gradually learning to extract higher-level features.
identification documents, as well as the theft of During training, the network adjusts its
money in the middle of a transaction. Feed Forward internal parameters based on the differences
Neural Network assists us in identifying this by between its predictions and the actual target values.
detecting whether the transaction is legitimate or This process, known as back propagation, involves
fraudulent. propagating the error backward through the
network and updating the weights of the
connections. As training progresses over many
iterations, the network becomes better at making
Stock Market trading: Trading on the stock accurate predictions.
market frequently makes use of machine learning. Deep learning excels at tasks like image
Since share values could rise or fall at any time, and speech recognition, natural language
long short term memory neural networks are processing, and even games, thanks to its ability to
employed in machine learning to forecast stock automatically learn complex features from large
market movements. amounts of data.
We can understand the working of deep
learning with the same example of identifying cat
Medical Diagnosis: Trading on the stock market vs. dog. The deep learning model takes the images
frequently makes use of machine learning. Since as the input and feed it directly to the algorithms
share values could rise or fall at any time, long without requiring any manual feature extraction
short term memory neural networks are employed step. The images pass to the different layers of the
in machine learning to forecast stock market artificial neural network and predict the final
movements. output.

What is Deep Learning? Consider the below image:

Deep learning is a subfield of machine


learning that involves using neural networks with
multiple layers (hence the term "deep"). These
neural networks are designed to automatically learn
representations of data through a hierarchical
approach. Deep learning has proven very effective
in tasks like image and speech recognition, natural
language processing, and more, thanks to its ability
to capture intricate patterns and features from large
datasets.
In deep learning, models use different
III. APPLICATIONS OF DEEP LEARNING
layers to learn and discover insights from the data.
Some popular applications of deep Image and Video Analysis: Deep learning is used
learning are self-driving cars, language translation, for image classification, object detection, facial
natural language processing, etc. recognition, and video analysis. It powers
technologies like self-driving cars, surveillance
Some popular deep learning models are: systems, and medical image analysis.
 Convolutional Neural Network Natural Language Processing (NLP): Deep
 Recurrent Neural Network learning models are used for language translation,
sentiment analysis, chat bots, and text generation.
 Auto encoders Virtual assistants like Siri and Alexa also utilize
NLP techniques.
 Classic Neural Networks, etc.
Speech Recognition: Deep learning is behind
How Deep Learning Works? accurate speech recognition systems, enabling
voice assistants and transcription services.
Deep learning is a subset of machine
Healthcare: Deep learning assists in diagnosing
learning that involves training artificial neural
diseases from medical images, predicting patient
networks to recognize patterns in data. These
outcomes, and drug discovery through analyzing
networks, inspired by the human brain, consist of
molecular structures.
layers of interconnected nodes (neurons). Each

ISBN Number : 978-81-958673-8-7 58


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Finance: Deep learning aids in predicting stock Execution Machine Deep Learning
prices, fraud detection, credit scoring, and time learning takes a long
algorithmic trading. algorithm execution time to
takes less time train the model, but
Gaming: It's used for creating realistic animations, to train the less time to test the
simulating virtual characters, and improving game model than model.
AI. deep learning,
but it takes a
Autonomous Vehicles: Deep learning algorithms long-time
are crucial for self-driving cars to interpret the duration to
environment, make decisions, and navigate safely. test the model.

Recommendation Systems: Deep learning


enhances personalized recommendations on Hardware Since machine The deep learning
platforms like Netflix, Amazon, and YouTube. Dependen learning model needs a huge
cies models do not amount of data to
Manufacturing: Deep learning helps with quality need much work efficiently, so
control, predictive maintenance, and optimizing amount of they need GPU's
production processes. data, so they and hence the high-
can work on end machine.
Agriculture: It's used for crop monitoring, disease low-end
detection, yield prediction, and precision machines.
agriculture.

Key comparisons between Machine Learning Feature Machine Deep learning is the
and Deep Learning Engineeri learning enhanced version of
ng models need a machine learning,
step of feature so it does not need
Let's understand the key differences between these
extraction by to develop the
two terms based on different parameters:
the expert, feature extractor for
and then it each problem;
Paramet Machine Deep Learning proceeds instead, it tries to
er Learning further. learn high-level
features from the
data on its own.
Data Although Deep Learning
Dependen machine algorithms highly
cy learning depend on a large Problem- To solve a The problem-
depends on amount of data, so solving given solving approach of
the huge we need to feed a approach problem, the a deep learning
amount of large amount of data traditional ML model is different
data, it can for good model breaks from the traditional
work with a performance. the problem in ML model, as it
smaller sub-parts, and takes input for a
amount of after solving given problem, and
data. each part, produce the end
produces the result. Hence it
final result. follows the end-to-
end approach.

ISBN Number : 978-81-958673-8-7 59


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

As we have seen the brief introduction of


Interpreta The The interpretation ML and DL with some comparisons, now why and
tion of interpretation of the result for a which one needs to be chosen to solve a particular
result of the result given problem is problem. So, it can be understood by the given
for a given very difficult. As flowchart:
problem is when we work with
easy. As when the deep learning
we work with model, we may get
machine a better result for a
learning, we given problem than
can interpret the machine
the result learning model, but
easily, it we cannot find why
means why this particular
this result outcome occurred,
occur, what and the reasoning.
was the
process.
So, if you have a lot of data and powerful
Type of Machine Deep Learning hardware, use deep learning. However, if you lack
data learning models can work any of them, pick the ML model to address your
models mostly with structured and issue.
require data in unstructured data
a structured both as they rely on IV. CONCLUSION
form. the layers of the
Artificial neural In conclusion, machine learning is a broader
network. concept that involves algorithms and techniques for
teaching computers to learn from data and make
predictions or decisions. Deep learning is a subset
Suitable Machine Deep learning of machine learning that focuses on neural
for learning models are suitable networks with multiple layers to automatically
models are for solving complex learn representations of data. While machine
suitable for problems. learning encompasses a wide range of methods,
solving simple deep learning has shown remarkable success in
or bit- tasks like image and speech recognition. The
complex choice between them depends on the problem's
problems. complexity, available data, and computational
resources.

V. BIBLIOGRAPHY
Which one to select among ML and Deep
Learning?

The choice between machine learning G. Siva Prasad, M.C.A, M.Tech (C


(ML) and deep learning depends on your specific SE), UGCNET, Works as Assistan
goals and the nature of the problem you're trying to t Professor in the Department of M
solve. ML encompasses a broader range of CA , KBN College(Autonomous),
techniques, while deep learning is a subset that Vijayawada-520001, Andhra Prade
focuses on neural networks. If you have a large sh and he is having 10 years of exp
dataset with complex patterns, deep learning might erience in teaching and one year in industrial experi
be suitable. For smaller datasets or simpler ence. His research interest includes Data Mining,
problems, traditional ML methods could be more Machine Learning, Deep Learning, Big Data,
appropriate. Microsoft Programming Languages and Web
Programming. He has attended workshops on
Consider the resources, expertise, and the
POWER BI, Data Analytics using R, Generative AI,
problem's characteristics to make an informed
Block Chain Technology and many more.
decision.

ISBN Number : 978-81-958673-8-7 60


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

VI. REFERENCES

1. H. Sarker, M. M. Hoque, M. D. K. Uddin, and


A. Tawfeeq, “Mobile data science and
intelligent apps: concepts, ai-based modeling
and research directions,” Mobile Networks
and Applications, vol. 26, no. 1, pp. 1–19,
2020.
2. M. S. Mahdavinejad, M. Rezvan, M.
Barekatain, P. Adibi, P. Barnaghi, and A. P.
Sheth, “Machine learning for internet of
things data analysis: a survey,” Digital
Communications and Networks, vol. 4, no. 3,
pp. 161–175, 2018.
3. I. H. Sarker, A. S. M. Kayes, S. Badsha, H.
Alqahtani, P. Watters, and A. Ng,
“Cybersecurity data science: an overview
from machine learning perspective,” Journal
of Big Data, vol. 7, no. 1, pp. 41–29, 2020.
4. J. Han, J. Pei, and M. Kamber, Data Mining:
Concepts and Techniques, Elsevier,
Amsterdam, 2011.
5. H. Witten and E. Frank, Data Mining:
Practical Machine Learning Tools and
Techniques, Morgan Kaufmann,
Burlingto,MA, USA, 2005.
6. Abadi M, Barham P, Chen J, Chen Z, Davis
A, Dean J, Devin Ma, Ghemawat S, Irving G,
Isard M, et al. Tensorfow: a system for large-
scale machine learning. In: 12th {USENIX}
Symposium on operating systems design and
implementation ({OSDI} 16), 2016; p. 265–
283
7. Abdel-Basset M, Hawash H, Chakrabortty
RK, Ryan M. Energy-net: a deep learning
approach for smart energy management in iot-
based smart cities. IEEE Internet of Things J.
2021.
8. Aggarwal A, Mittal M, Battineni G.
Generative adversarial network: an overview
of theory and applications. Int J Inf Manag
Data Insights. 2021; p. 100004. 4. Al-Qatf M,
Lasheng Y, Al-Habib M, Al-Sabahi K. Deep
learning approach combining sparse
autoencoder with svm for network intrusion
detection. IEEE Access. 2018;6:52843–56.
9. Deng S, Li R, Jin Y, He H. Cnn-based feature
cross and classifier for loan default prediction.
In: 2020 International Conference on image,
video processing and artificial intelligence,
volume 11584, page 115841K. International
Society for Optics and Photonics, 2020.
10. Dhyani M, Kumar R. An intelligent chatbot
using deep learning with bidirectional rnn and
attention model. Mater Today Proc.
2021;34:817–24
11. Google trends. 2021.
https://trends.google.com/trends/

ISBN Number : 978-81-958673-8-7 61


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

An Overview Of Machine Learning


Applications In Defence
Shaik Hidaytulla, Siva Prasad Guntakala,
Student, Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous), KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email: sk.hidayatulla29@gmail.com Email: gspkbn@gmail.com

adapting to rapidly changing scenarios. Machine


learning introduces a new paradigm by enabling
ABSTRACT
systems to learn from data, recognize
Machine learning (ML) has found diverse patterns, and make predictions or decisions
applications within the defence sector, autonomously. From intelligence gathering and
revolutionizing traditional processes and enhancing threat detection to logistics optimization and
decision-making capabilities. This abstract explores decision support, ML applications are opening
the various domains where machine learning has avenues for innovation and efficiency in defence
been employed in defence, including threat operations.
detection, anomaly identification, autonomous
systems, and predictive analysis. It highlights how The vast and diverse datasets generated by
machine learning algorithms, such as neural modern defence systems, encompassing everything
networks, support vector machines, and from sensor readings and satellite imagery to
reinforcement learning, have enabled the communication intercepts and historical records,
automation of complex tasks, rapid data analysis, provide fertile ground for the application of
and improved situational awareness. The abstract machine learning techniques. By extracting
also addresses challenges like data security, bias actionable insights from these datasets, defence
mitigation, and ethical concerns that arise due to entities can not only enhance situational awareness
the integration of machine learning in defence but also anticipate emerging threats, optimize
applications. Overall, the abstract underscores the resource allocation, and streamline logistics.
transformative impact of machine learning on
defence operations, ushering in an era of efficiency,
accuracy, and enhanced security.

I. INTRODUCTION
In an era characterized by technological
advancements and evolving threats, the defence
sector is undergoing a paradigm shift through the
integration of machine learning (ML) applications.
Machine learning, a subset of artificial intelligence,
has emerged as a transformative force, offering the
potential to revolutionize the way military
operations are conducted, threats are identified, and
strategic decisions are made. As nations strive to
maintain their security in an increasingly complex
and interconnected world, the deployment of ML
technologies holds great promise for enhancing
defence capabilities across a spectrum of domains.
Figure-1 (Machine Learning In Defense)
Traditional defence strategies have often
relied on predefined rules and manual analysis of What is Machine Learning?
data, which can be time-consuming and limited in

ISBN Number : 978-81-958673-8-7 62


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Machine Learning is a branch of computer science and a sub-branch of Artificial Intelligence. "Machine Learning is defined as the study of various technologies or algorithms that allow systems to automatically learn and improve from past experience." It is a subset of artificial intelligence (AI) that involves the development of algorithms and models that allow computers to learn from data and improve their performance over time. Instead of being explicitly programmed to perform specific tasks, machine learning systems use data to learn patterns, make predictions, or solve complex problems. These systems rely on statistical techniques to recognize patterns in data and make informed decisions based on the patterns they've learned. There are various types of machine learning, including supervised learning, where the model is trained on labelled data to make predictions; unsupervised learning, where the model identifies patterns in unlabeled data; and reinforcement learning, where the model learns through trial and error to maximize a reward.

Machine learning has a wide range of applications across various domains, such as image and speech recognition, natural language processing, medical diagnosis, financial prediction, recommendation systems, and more. It has the ability to handle large and complex datasets, automate tasks, and provide insights that might not be easily discernible through traditional programming methods.

II. Types of Machine Learning

Machine learning encompasses a range of techniques that allow computers to learn from data and make predictions or decisions without being explicitly programmed. There are several types of machine learning, each with its own characteristics and applications. The main types of machine learning are:

1. Supervised Learning:

In supervised learning, the algorithm learns from labelled training data, where the input data is paired with the corresponding desired output. The goal is to learn a mapping function that can accurately predict the output for new, unseen input data. Common algorithms in supervised learning include linear regression, decision trees, support vector machines, and neural networks. It consists of two types:

 Classification:
- In classification, the goal is to predict a categorical label or class for input data. The output is a discrete value representing a category.
- Examples include spam email detection, sentiment analysis, disease diagnosis, and image/object recognition.

 Regression:
- In regression, the goal is to predict a continuous numerical value or quantity.
- Examples include predicting house prices, stock prices, and temperature forecasts.
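To make the two supervised tasks concrete, the following is a small illustrative Python sketch (not part of the original paper; it assumes scikit-learn and NumPy are installed). It trains a decision-tree classifier to predict a categorical label on the Iris dataset and a linear regression to predict a continuous value on synthetic data.

    # Illustrative sketch only: one classification model and one regression model.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.linear_model import LinearRegression
    import numpy as np

    # Classification: predict a discrete class label (Iris species).
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
    print("classification accuracy:", clf.score(X_test, y_test))

    # Regression: predict a continuous value (toy data generated as y = 2x + noise).
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=(100, 1))
    y_cont = 2.0 * x.ravel() + rng.normal(0, 0.5, size=100)
    reg = LinearRegression().fit(x, y_cont)
    print("learned slope (expected to be close to 2):", reg.coef_[0])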
2. Unsupervised Learning:

Unsupervised learning involves working with unlabeled data. The algorithm aims to find patterns, structures, or relationships within the data without any predefined categories or labels. Clustering and dimensionality reduction are typical tasks in unsupervised learning. K-means clustering and principal component analysis (PCA) are examples of unsupervised learning algorithms.

Clustering:
- Clustering involves grouping similar data points together based on their inherent characteristics.
- Examples include customer segmentation for targeted marketing, image segmentation, and document clustering.

K-Means Clustering:
- K-Means is a popular clustering algorithm that assigns data points to K clusters, aiming to minimize the sum of squared distances within clusters.
- Used for customer segmentation, image compression, and data pre-processing.
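The K-Means idea can be illustrated with a short sketch (illustrative only; the two synthetic point clouds below stand in for, say, two customer segments):

    # Illustrative sketch: K-Means assigns points to K clusters by minimizing
    # the within-cluster sum of squared distances (the "inertia").
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    segment_a = rng.normal(loc=[1.0, 1.0], scale=0.3, size=(50, 2))
    segment_b = rng.normal(loc=[5.0, 5.0], scale=0.3, size=(50, 2))
    X = np.vstack([segment_a, segment_b])

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    print("cluster centres:", kmeans.cluster_centers_)
    print("inertia (sum of squared distances):", kmeans.inertia_)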
Machine Learning Applications In Defence

Machine learning applications in defence have ushered in a new era of innovation and efficiency across various aspects of military operations. Leveraging the power of data-driven insights and autonomous decision-making, these applications are transforming how defence organizations gather intelligence, mitigate threats, optimize logistics, and enhance decision support.


Here are some prominent areas where machine learning is making a substantial impact in the defence sector:

Intelligence Gathering and Analysis: Machine learning algorithms can process vast amounts of data from multiple sources, including satellite imagery, social media, and communication intercepts. These algorithms can identify patterns, anomalies, and correlations that human analysts might miss, aiding in the early detection of potential threats or adversarial activities.

Predictive Maintenance: In the realm of military hardware, machine learning is employed to predict equipment failures and maintenance needs. By analyzing sensor data and historical maintenance records, ML algorithms can forecast when equipment might malfunction, enabling proactive maintenance and minimizing downtime.
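As a rough illustration of the predictive-maintenance idea, the sketch below trains a classifier on synthetic sensor readings; the feature names and the failure rule are invented for the example and do not describe any real equipment or dataset.

    # Illustrative sketch only: flag equipment likely to need maintenance soon.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 500
    vibration = rng.normal(0.5, 0.2, n)            # hypothetical sensor feature
    temperature = rng.normal(70.0, 10.0, n)        # hypothetical sensor feature
    hours_since_service = rng.uniform(0, 1000, n)  # hypothetical maintenance record

    # Synthetic ground truth: high vibration plus a long service interval raises risk.
    risk = 2.0 * vibration + 0.002 * hours_since_service + rng.normal(0, 0.3, n)
    fails_soon = (risk > 2.2).astype(int)

    X = np.column_stack([vibration, temperature, hours_since_service])
    X_tr, X_te, y_tr, y_te = train_test_split(X, fails_soon, test_size=0.3, random_state=0)
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", model.score(X_te, y_te))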
Cyber security and Intrusion Detection: Machine learning plays a pivotal role in identifying and responding to cyber threats. ML models can learn to recognize abnormal network behaviour and detect potential breaches, helping defence entities safeguard sensitive data and networks from malicious actors.

Autonomous Vehicles and Drones: Machine learning is integral to the development of autonomous vehicles and drones for defence applications. These technologies can navigate complex environments, identify targets, and adapt to changing conditions, reducing the risk to human personnel and enhancing operational capabilities.

Threat Detection and Classification: ML algorithms are employed to classify various types of threats, such as objects, vehicles, or individuals, from images and video feeds. These algorithms enable rapid identification and categorization of potential dangers, aiding in decision-making processes.

Logistics and Resource Optimization: Machine learning can optimize the allocation of resources in defence logistics, ensuring efficient supply chains and reducing operational costs. ML models can predict demand, track inventory levels, and optimize routes for supply distribution.

Situational Awareness: ML-powered systems can aggregate and analyze real-time data from sensors, cameras, and other sources to provide commanders with a comprehensive and accurate understanding of the operational environment. This heightened situational awareness aids in making informed decisions in dynamic and rapidly changing scenarios.

Decision Support Systems: Machine learning algorithms can analyze historical data and current variables to provide decision-makers with insights and recommendations. These systems can assist in formulating strategies, evaluating risks, and determining the most effective courses of action.

Natural Language Processing for Intelligence Analysis: Text and speech data can be analyzed using natural language processing (NLP) techniques to extract critical information from documents, reports, and intercepted communications. This aids intelligence analysts in identifying trends, sentiments, and potential threats buried within large volumes of text data.

Reconnaissance and Surveillance: ML-powered systems enable automated analysis of imagery and video feeds for reconnaissance and surveillance purposes. These systems can identify and track objects of interest, enabling more effective monitoring of specific areas or targets.

As machine learning continues to advance, its applications in defence are expected to expand even further. However, along with the numerous benefits come challenges related to data security, interpretability, ethical considerations, and adversarial attacks. By addressing these challenges, defence organizations can harness the full potential of machine learning while ensuring responsible and effective deployment in safeguarding national security.

Important ML applications in defence systems:

Warfare Platforms

Weapons and other military equipment used on land, sea, air, and space platforms are increasingly incorporating machine learning and artificial intelligence. The use of AI-enabled systems on these platforms contributes to the development of effective fighting systems that require less human involvement. Additionally, it improves the performance of warfare systems while requiring less maintenance and increasing synergy. High-speed, autonomous weapons are anticipated to be equipped with AI and ML to undertake coordinated strikes.

Defence Cyber security


One of the most crucial components in ensuring the safety of the entire country is the military system of any given nation. Military and defence systems are therefore particularly vulnerable to cyber attacks, since these could result in the loss of vital military data as well as systemic harm. However, networks, computer programs, and data can be automatically protected against any form of illegal access by embedded AI and ML systems. The pattern of cyber attacks can also be recorded by ML-enabled web security systems, which can then be used to create counterattack tools.

Logistics & Transportation

Systems used for defence logistics and transportation rely heavily on machine learning. Every successful military operation needs efficient transportation of key military items such as supplies, equipment, and ammunition. A military transportation system that incorporates AI/ML can cut down on both operating expenses and human labour. Recently, the US Army and IBM worked together to pre-identify maintenance issues with Stryker combat vehicles using Watson's artificial intelligence platform.

Target Recognition and Tracking

The accuracy of target recognition is also enhanced by machine learning and artificial intelligence in complicated warfare environments. These techniques allow military officers to thoroughly review reports, papers, news feeds, and other unstructured data in order to create a picture of the likely action zones.

Battlefield Healthcare

Healthcare on the battlefield, including evacuation procedures, remote surgical systems, etc., is made possible by machine learning and artificial intelligence. Numerous robotic surgical systems and robotic ground platforms integrated with ML technologies are used in battle zones to aid in the difficult medical diagnosis and treatment of wounds in combat.

Defence Combat Training

Thanks to machine learning, computers and other devices can teach soldiers to use a variety of fighting systems employed in different military missions in conflict zones. It offers simulation and instruction in a variety of software engineering skills that are useful in trying circumstances. The United States is spending a lot of money on simulation and training software. Furthermore, many nations train their soldiers using ML-equipped military training systems instead of the conventional method, which costs more money and takes longer. These contemporary methods are more effective and flexible. The construction of a military training system using reward and punishment as feedback is made easier by reinforcement learning, and this strategy becomes increasingly important for sustaining an improved training system for personnel.
allow them to identify risks, analyze them, and stop
them from entering the system. All major nations,
Battlefield Healthcare including the United States, Russia, China, France,
Britain, Japan, India, etc., are pouring a lot of
Healthcare on the battlefield, including money into developing drones that can detect
evacuation procedures, remote surgical systems, threats and target, which is especially beneficial in
etc., is made possible by machine learning and distant places.
artificial intelligence. Numerous robotic surgical The typical process of ML in detecting security
systems and robotic ground platforms integrated threats is given in below image:
with ML technologies are used in battle zones to
aid in the difficult medical diagnosis and treatment
of wounds in combat.

Defence Combat Training


Anomaly detection

The technique of finding suspicious occurrences, things, and observations that depart from a dataset's typical behaviour is known as anomaly detection. Anomaly detection is important for locating anomalous patterns in data as well as for separating outliers, or patterns that deviate from the norm. Anomaly detection uses ML and AI to locate outliers in a set of data, and in order to discover anomalies, supervised machine learning is crucial in pattern identification.
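As a small illustration, the sketch below flags outliers in synthetic data with an Isolation Forest, one common (unsupervised) way to implement the anomaly detection described above; the data and contamination rate are made up for the example.

    # Illustrative sketch: isolate the points that deviate from typical behaviour.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(7)
    normal_traffic = rng.normal(loc=0.0, scale=1.0, size=(200, 2))  # typical behaviour
    anomalies = rng.uniform(low=4.0, high=6.0, size=(5, 2))         # clear outliers
    X = np.vstack([normal_traffic, anomalies])

    detector = IsolationForest(contamination=0.05, random_state=0).fit(X)
    labels = detector.predict(X)  # +1 = inlier, -1 = anomaly
    print("points flagged as anomalous:", int((labels == -1).sum()))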
Surveillance applications

In order to gather and handle the vast amounts of defence data, reconnaissance and surveillance systems have become an essential component of any nation. These applications make use of multiple sensors and provide a continual stream of data to data centres through data networks. Data scientists examine that data and draw out the information that is relevant. Machine learning (ML) assists data analysts in automatically detecting, analyzing, organizing, and managing data throughout the entire process.

Decision-support system

A decision-support system is beneficial for many businesses in a variety of applications, including manufacturing, marketing, self-driven equipment (drones), and medical care. Similarly, machine learning (ML) aids in the development of improved decision-support systems for the defence industry, including intelligent drones, automatic cruise missiles, and automatic weaponry that respond to suspicious objects. By assessing data and recommending the optimal course of action, ML aids machines in decision-making.

Border protection

The primary objective of the defence sector is to defend the nation against border attacks by patrolling the area. Even while soldiers are stationed at the border constantly, smart sensors and other intelligent technologies, including drones, are already playing a critical part in the border security system. These drones are outfitted with various ML algorithms and software that, by transmitting information to data centres, detect, evaluate, and warn against any questionable conduct. As a result, they are most helpful in risky circumstances where human assistance is not possible.

III. CONCLUSION

In conclusion, the integration of machine learning applications into the defence sector marks a pivotal advancement in the way military operations are conducted, threats are mitigated, and strategic decisions are made. The multifaceted capabilities of machine learning hold the potential to revolutionize the defence landscape, enhancing situational awareness, optimizing resource allocation, and enabling autonomous systems to adapt and respond to dynamic scenarios. The versatility of machine learning is evident in its diverse applications within defence, from intelligence analysis and threat detection to logistics optimization and autonomous vehicles. These applications not only amplify the efficiency and accuracy of defence operations but also mitigate risks to human personnel in high-risk environments.

IV. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as an Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada-520001, Andhra Pradesh, and has 10 years of teaching experience and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.





A Study on Machine Learning Applications in Media

Imamuddin Shaik, Student, Department of MCA, K.B.N. College (Autonomous), Vijayawada-520001, A.P, India, Email: imamshaik567@gmail.com

Siva Prasad Guntakala, Assistant Professor, Department of MCA, K.B.N. College (Autonomous), Vijayawada-520001, A.P, India, Email: gspkbn@gmail.com

ABSTRACT

Machine Learning (ML) has emerged as a revolutionary technology in the media industry, transforming various aspects of content production, distribution and consumption. This abstract explores the applications of ML in the media industry, from personalized content recommendation systems to content automation, sentiment analysis for audience engagement, and deepfake detection for ensuring content authenticity. With the ability to process and analyse data, ML is reshaping the media landscape by improving the user experience, optimizing content distribution, and addressing the challenges associated with fake content. It is shaping the future of information by empowering both creators and consumers, and this paper provides a brief overview of how it does so.

I. INTRODUCTION

In the contemporary digital age, the fusion of Machine Learning (ML) with the realm of media has ushered in a paradigm shift that transcends traditional boundaries. Media, encompassing diverse forms of content such as images, videos, articles, and music, serves as a vital conduit for information dissemination, entertainment, and expression. Concurrently, ML has demonstrated its prowess in deciphering complex patterns from data and making intelligent predictions. This convergence of media and ML has spawned a spectrum of applications that redefines content creation, distribution, and consumption. From refining content recommendations to enabling automated content generation, from enhancing audience engagement through sentiment analysis to safeguarding content integrity via deepfake detection, the amalgamation of ML and media is forging innovative pathways across the landscape of communication and entertainment. This exploration delves into the myriad ways in which ML applications are reshaping the media domain, ushering in an era where creativity, insight, and technology converge to craft a new narrative in human engagement with content.

In this paper, "Machine Learning Applications in Media," we discuss various machine learning applications that are important to the growing and lucrative media and entertainment industry. We start with a quick introduction to machine learning in the media industry and then cover some popular machine learning applications for the media and entertainment industry.

II. Machine learning in the Media industry

 Machine Learning (ML) has started a wave of change in the media industry, transforming various aspects of content production, distribution and consumption. From personalized content recommendations to automated content creation, sentiment analysis and content integrity verification, ML applications are changing how news is produced, presented and experienced.
 Personalized Content Recommendations: ML algorithms analyze user behavior, preferences, and historical data to provide customized content recommendations. Streaming platforms like Netflix and music services like Spotify use ML to recommend movies, TV shows, songs and playlists that match users' interests, improving engagement and satisfaction. (A simplified sketch of this idea appears after this list.)
 Automated Content Generation: ML-based tools, such as natural language processing models, can generate notes, summaries, and even creative pieces. News organizations use ML-powered algorithms to rapidly generate routine news updates, freeing journalists to focus on in-depth stories.
 Sentiment Analysis and Audience Engagement: ML enables sentiment analysis of social media and content, helping content producers gauge public reaction. Companies can adjust their marketing strategies based on real-time sentiment, ensuring they remain relevant and responsive to audience preferences.
 Deepfake Detection and Content Authenticity: Because deepfakes pose challenges to media authenticity, ML algorithms are being developed to recognize manipulated videos and images. This technology, similar to that used by platforms such as
TikTok, Twitter and YouTube, ensures the authenticity of content and prevents misinformation from spreading.
 Image and Video Analysis: ML-powered image recognition and video analysis tools automatically tag, classify, and process visual content. This streamlines operations for media organizations and enhances user experience, as seen in Google Photos' auto-tagging feature.
 Real-time Translation and Accessibility: ML makes content accessible to wider audiences; YouTube's automatic captioning and translation features are examples of how ML enhances content accessibility.
 Content Moderation and Filtering: ML algorithms assist in content moderation by identifying and removing inappropriate or offensive material. Social media platforms employ these tools to uphold community standards and ensure safe online spaces.
 Predictive Analytics and Trend Forecasting: Media companies leverage ML-powered predictive analytics to anticipate audience preferences and content trends. This insight guides strategic decisions, leading to the creation of content that resonates with target demographics.
 Ad Targeting and Monetization: ML algorithms analyze user data to tailor advertisements, maximizing their relevance and effectiveness. This benefits advertisers by reaching their intended audience and media platforms by optimizing ad revenue.
 Audience Insights and Engagement Optimization: ML tools analyze audience behavior and engagement metrics to provide insights that inform content creators' decisions. This enables continuous improvement in content delivery and strategies.
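As referenced in the personalized-recommendations item above, the following simplified sketch shows the underlying idea as item-based collaborative filtering over a tiny, made-up ratings matrix; production recommenders such as Netflix's are far more elaborate.

    # Illustrative sketch: recommend an unseen item using item-item cosine similarity.
    import numpy as np

    # Rows = users, columns = items (e.g. movies); 0 means "not rated yet".
    ratings = np.array([
        [5, 4, 0, 1],
        [4, 5, 1, 0],
        [1, 0, 5, 4],
        [0, 1, 4, 5],
    ], dtype=float)

    def cosine_sim(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

    n_items = ratings.shape[1]
    item_sim = np.array([[cosine_sim(ratings[:, i], ratings[:, j])
                          for j in range(n_items)] for i in range(n_items)])

    # Score items for user 0 as a similarity-weighted average of that user's ratings.
    user = ratings[0]
    scores = item_sim @ user / (item_sim.sum(axis=1) + 1e-9)
    unseen = np.where(user == 0)[0]
    print("recommend item:", unseen[np.argmax(scores[unseen])])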
The convergence of machine learning and the media industry is steering it toward a more dynamic, data-driven, and personalized landscape. By automating tasks, enhancing audience interactions, and safeguarding content integrity, machine learning is fundamentally reshaping how media content is produced, distributed, consumed, and monetized.

The worldwide entertainment and media market has been experiencing steady growth, driven by factors such as digitalization, increased internet penetration, and changing consumer preferences. The market encompasses a wide range of segments, including film, TV, music, gaming, publishing, advertising, and more. Key trends and factors contributing to the value of the market include:

 Digitalization and Streaming Services: The shift to digital platforms and the rise of streaming services for music, movies, TV shows, and gaming have significantly impacted the market. Platforms like Netflix, Spotify, Amazon Prime Video, and Disney+ have disrupted traditional distribution models.
 Mobile Entertainment: The proliferation of smartphones and mobile devices has driven mobile gaming, mobile video streaming, and mobile advertising, contributing to the market's growth.
 E-sports and Gaming: The gaming sector, including e-sports, has seen tremendous growth, with a global audience and substantial revenues from game sales, in-game purchases, and advertising within games.
 Augmented and Virtual Reality (AR/VR): AR and VR technologies have started to shape the entertainment experience, enhancing gaming, creating immersive content, and providing new ways of storytelling.
 Content Creation and Monetization: User-generated content, influencer marketing, and digital content creation have created new avenues for monetization and engagement across social media, YouTube, and other platforms.
 Personalization and Data Analytics: The use of data analytics and machine learning has enabled personalized content recommendations, targeted advertising, and improved audience engagement.
 Globalization and Localization: The digital era has enabled content to be distributed globally, leading to increased demand for localized content to cater to diverse audiences.
 Advertising Evolution: Digital advertising, programmatic advertising, and native advertising have reshaped the advertising landscape, allowing for better targeting and measurement of ad campaigns.
 Regulatory Challenges: Privacy concerns, copyright issues, and changing regulations regarding digital content distribution have posed challenges for the industry.
 Challenges and Opportunities: Piracy, competition, and content saturation are some of the challenges faced by the industry. However, innovations in storytelling, technology, and content formats continue to open new opportunities.

Please note that the specific value of the global entertainment and media market can vary greatly depending on the source, methodology, and the most recent data available. To obtain the most accurate and up-to-date information, refer to reports from reputable market research firms such as PwC (PricewaterhouseCoopers), Deloitte, Nielsen, or Statista.

The media and entertainment industry is seeing exponential growth globally. With the use of ML-enabled high-speed network systems and trending video streaming platforms, users are accessing unlimited content continuously without interruption.

For the entertainment and media industries, 2022 was an important turning point. Total global entertainment and media (E&M) revenue increased 5.4% in 2022, reaching US$2.32 trillion. That represents a slowdown from the 10.6% growth of 2021, when economies and businesses around the world were beginning to recover from the disruption brought by the COVID-19 pandemic. And the growth rate will gradually


decline in each of the next five years, so that by 2027, images, allowing users to easily search for specific
income growth will be only 2.8% from 2026. That’s slower content.
than the 3.1% rate of economic growth overall forecast by  Real-time Translation and Captioning: Machine
the International Monetary Fund (IMF) . for that year. learning models can provide real-time translation of
audio and video content, making media accessible to a
III. Machine learning (ml) applications in media global audience.
Example: YouTube's automatic captioning feature uses
Applied machine learning is a living witness to the rapid speech recognition and machine translation to provide
growth in the media industry in different formats such as subtitles in various languages for uploaded videos.
visual content, audio content (2-D and 3-D), digital  Content Moderation and Filtering: ML algorithms
advertising, content a recommendations, target audiences, automatically identify and filter inappropriate or
content composition, and classification, meta -tagging, offensive content, ensuring that platforms maintain a
automated transcription, virtual personal chatbots sentiment safe and respectful environment for users.
analysis and more. Example: Social media platforms use ML to detect and
There are a few important applications of machine remove hate speech, graphic content, and other
learning in media with their example. These are as follows: violations of community guidelines, thereby promoting
a positive online experience.
 Personalized Content Recommendation: Machine
learning algorithms analyze user behavior, preferences, These applications showcase the transformative impact of
and historical data to suggest relevant content, machine learning in the media landscape, contributing to
enhancing user engagement and retention. more engaging, authentic, and efficient interactions between
Example: Netflix's recommendation system employs creators and consumers.
ML to suggest movies and TV shows based on a user's
viewing history, ratings, and genre preferences, leading Importance’s of machine learning in media
to increased viewer satisfaction and prolonged
engagement. Machine learning has become increasingly important in
 Automated Content Generation: ML-driven tools the media industry due to its ability to analyze large
generate content, such as articles, summaries, and even volumes of data, automate processes, and personalize
artistic pieces, streamlining content creation processes content delivery. Here are some key ways in which machine
and enabling rapid production. learning is making an impact in the media sector:
Example: AI-powered news articles are created by
platforms like GPT-3, which can generate coherent and  Content Recommendations: Machine learning
relevant news stories from raw data, significantly algorithms analyze user behavior and preferences to
speeding up news production cycles. provide personalized content recommendations. This
 Sentiment Analysis for Audience Engagement: enhances user engagement and retention on platforms
Machine learning models analyze social media and user- like streaming services, news websites, and social media
generated content sentiment to gauge audience platforms.
reactions, enabling creators to tailor their content and  Audience Insights: Media companies can use machine
marketing strategies accordingly. learning to gain insights into audience behavior,
Example: Brands use sentiment analysis on platforms preferences, and trends. This information helps in
like Twitter to assess public opinion about their products creating content that resonates with the target audience
in real time, allowing them to respond promptly to and tailoring marketing strategies effectively.
positive or negative trends.  Content Creation: Machine learning technologies like
 Deepfake Detection and Authentication: Machine natural language processing (NLP) and image
learning algorithms identify manipulated or fabricated recognition are used to automate content creation
media content (deepfakes) to ensure the authenticity and processes. Automated news generation, video editing,
credibility of media assets. and even scriptwriting are becoming more common.
Example: Deepfake detection tools like Microsoft's  Predictive Analytics: Media companies use predictive
Video Authenticator analyze videos for signs of analytics powered by machine learning to forecast
manipulation, helping to prevent the spread of trends, viewer ratings, and advertising performance.
misleading or false information. This helps in making informed decisions about content
 Image and Video Analysis: ML-powered image and production, scheduling, and ad placements.
video recognition systems automatically tag, categorize,  Speech and Language Processing: NLP models enable
and process visual content, optimizing workflows and transcription, translation, and sentiment analysis,
enhancing search capabilities. enhancing accessibility and allowing media companies
Example: Google Photos employs ML to recognize and to reach wider audiences. Voice assistants also use
categorize objects, locations, and faces in user-uploaded machine learning to improve user interaction and
understand natural language queries.


 Ad Targeting and Personalization: Machine learning processing and image recognition, has streamlined
algorithms analyze user data to deliver targeted production workflows and opened doors to new forms of
advertisements. This increases the effectiveness of ad creative expression.
campaigns and provides a more personalized user
experience. Machine learning's predictive capabilities have
 Quality Control: Machine learning can be used to allowed media organizations to make informed decisions
automatically detect and filter inappropriate or spam about content production, distribution, and advertising,
content, ensuring a safe and high-quality user optimizing resource allocation and improving overall
experience on online platforms. business outcomes. Additionally, the technology's ability to
 Copyright Protection: Automated systems powered by analyze vast amounts of data in real time has enabled media
machine learning can identify instances of copyright companies to respond swiftly to emerging trends and public
infringement by comparing content against a database of sentiments, fostering a more dynamic and relevant content
copyrighted material. This helps protect intellectual ecosystem.
property rights.
 Video and Image Analysis: Machine learning Furthermore, machine learning has facilitated more
algorithms can analyze video and image content to effective ad targeting, copyright protection, and quality
identify objects, scenes, and even emotions. This aids in control, ensuring a safer and more enjoyable user
content categorization, metadata tagging, and content experience. The implementation of recommendation
search. systems has not only increased content discoverability but
 Real-time Analytics: Media companies can use also encouraged exploration and diversity in consumption
machine learning to process and analyze real-time data patterns.
from social media and other sources. This helps in
tracking trends, public sentiment, and reactions to events As media continues to evolve, machine learning
as they unfold. will remain a cornerstone of innovation, continually
 Recommendation Systems: Streaming platforms and pushing the boundaries of what is possible. However, it is
media websites use recommendation systems to suggest essential to approach these advancements thoughtfully,
related content based on user preferences and behavior. balancing the benefits of automation and personalization
This enhances user engagement and helps users discover with ethical considerations, privacy concerns, and the need
new content. for human creativity and oversight.
 Engagement Tracking: Machine learning can track
user engagement metrics such as click-through rates, In essence, the integration of machine learning into
time spent on content, and social media interactions. the media sector has sparked a transformation that promises
This data helps media companies understand what to reshape the way we engage with content, making it more
content is most engaging and refine their strategies relevant, accessible, and impactful for audiences
accordingly. worldwide. As the journey continues, collaboration between
technology and creativity will pave the way for a media
Overall, machine learning is transforming the media landscape that is both cutting-edge and deeply resonant with
landscape by enabling more efficient content creation, its consumers.
delivery, and engagement strategies. It helps media
companies better understand their audience, optimize their V. BIBLIOGRAPHY
operations, and provide a more personalized and engaging
experience to consumers.
G. Siva Prasad, M.C.A, M.Tech (CSE),
IV. CONCLUSION UGCNET, Works as Assistant Professor in
the Department of MCA , KBN College
In conclusion, the integration of machine learning (Autonomous), Vijayawada, Andhra Pradesh
into the media industry has brought about a transformative and he is having 10 years of experience in
shift, redefining how content is created, delivered, and teaching and one year in industrial
consumed. The significance of machine learning in media is experience. His research interest includes Data Mining,
undeniable, as it has revolutionized various aspects of the Machine Learning, Deep Learning, Big Data, Microsoft
industry, from content recommendation and creation to Programming Languages and Web Programming. He has
audience insights and real-time analytics. attended workshops on POWER BI, Data Analytics using R,
Generative AI, Block Chain Technology and many more.
By harnessing the power of machine learning, VI. REFERENCES
media companies have been able to offer personalized
experiences that cater to individual preferences, thereby 1) Tim Brooks, Aleksander Holynski, and Alexei A
enhancing user engagement and retention. The automation Efros. Instructpix2pix: Learning to follow image
of content creation processes, powered by natural language editing instructions. arXiv preprint arXiv:2211.09800,
2022.


2) Wenhao Chai and Gaoang Wang. Deep vision 15) Europe Health market: top 4 trends boosting the
multimodal learning: Methodology, benchmark, and industry demand through 2026. Bio Space. 2021 Feb
trend. Applied Sciences, 12(13):6588, 2022. 16.
3) Jooyoung Choi, Sungwon Kim, Yonghyun Jeong, 16) F. Fitzpatrick, A. Doherty, and G. Lacey, “Using
Youngjune Gwon, and Sungroh Yoon. Ilvr: artificial intelligence in infection prevention,” Current
Conditioning method for denoising diffusion Treatment Options in Infectious Diseases, vol. 12, no.
probabilistic models. arXiv preprint 2, pp. 135–144, 2020.
arXiv:2108.02938, 2021. 17) P. E. Ekmekci and B. Arda, History of artificial
4) Amir Hertz, Ron Mokady, Jay Tenenbaum, Kfir intelligence, SpringerBriefs in Ethics, 2020.
Aberman, Yael Pritch, and Daniel Cohen-Or. Prompt- 18) L. Wynants, B. Van Calster, G. S. Collins et al.,
to-prompt image editing with cross attention control. “Prediction models for diagnosis and prognosis of
arXiv preprint arXiv:2208.01626, 2022 covid-19: systematic review and critical appraisal,”
5) Yaniv Nikankin, Niv Haim, and Michal Irani. bmj, vol. 369, 2020.
Sinfusion: Training diffusion models on a single 19) T. Boyles, A. Stadelman, J. P. Ellis et al., “The
image or video. arXiv preprint arXiv:2211.11743, diagnosis of tuberculous meningitis in adults and
2022. adolescents: protocol for a systematic review and
6) Chitwan Saharia, William Chan, Huiwen Chang, individual patient data meta-analysis to inform a
Chris Lee, Jonathan Ho, Tim Salimans, David Fleet, multivariable prediction model,” Wellcome Open
and Mohammad Norouzi. Palette: Image-to-image Research, vol. 4, 2021.
diffusion models. In ACM SIGGRAPH 2022 20) G. A. Tadesse, T. Zhu, N. L. N. Thanh et al.,
Conference Proceedings, pages 1– 10, 2022 “Severity detection tool for patients with infectious
7) Daquan Zhou, Weimin Wang, Hanshu Yan, Weiwei disease,” Healthcare Technology Letters, vol. 7, no.
Lv, Yizhe Zhu, and Jiashi Feng. Magicvideo: 2, pp. 45–50, 2020.
Efficient video generation with latent diffusion
models. arXiv preprint arXiv:2211.11018, 2022
8) Tian Ye, Yunchen Zhang, Mingchao Jiang, Liang
Chen, Yun Liu, Sixiang Chen, and Erkang Chen.
Perceiving and modeling density for image dehazing.
In European Conference on Computer Vision, pages
130–145. Springer, 2022
9) Benoit J, Onyeaka H, Keshavan M, Torous J.
Systematic review of digital phenotyping and
machine learning in psychosis spectrum illnesses.
Harv Rev Psychiatry 2020;28(5):296-304.
10) Countries. World Health Organization. 2022. URL:
https://www.who.int/countries [accessed 2022-02-25]
11) Henson P, Rodriguez-Villa E, Torous J. Investigating
associations between screen time and .
symptomatology in individuals with serious mental
illness: longitudinal observational study. J Med
Internet Res 2021 Mar 10;23(3):e23144
12) Brys ADH, Bossola M, Lenaert B, Biamonte F,
Gambaro G, Di Stasio E. Daily physical activity in
patients on chronic haemodialysis and its relation
with fatigue and depressive symptoms. Int Urol
Nephrol 2020 Jul 28;52(10):1959-1967. [CrossRef]
13) Nguyen NH, Vallance JK, Buman MP, Moore MM,
Reeves MM, Rosenberg DE, et al. Effects of a
wearable technology-based physical activity
intervention on sleep quality in breast cancer
survivors: the ACTIVATE Trial. J Cancer Surviv
2021 Apr 01;15(2):273-280. [CrossRef] [Medline]
14) Bai R, Xiao L, Guo Y, Zhu X, Li N, Wang Y, et al.
Tracking and monitoring mood stability of patients
with major depressive disorder by machine learning
models using passive digital data: prospective
naturalistic multicenter study. JMIR Mhealth health
2021 Mar


Literature Review of Blockchain Technology


Shaik Imran, Student, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India, Email: imranshaik8333@gmail.com

Siva Prasad Guntakala, Assistant Professor, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India, Email: gspkbn@gmail.com

ABSTRACT

Blockchain stands as a fresh advancement that carries substantial implications for the future of global data and financial exchange. Its novelty is such that scholarly exploration in this area remains relatively limited, although this circumstance is swiftly changing. To initiate this literature review, we have begun by amassing a collection of predominantly peer-reviewed sources, supplemented by an academic overview of articles from various sources. Our assortment of articles affords us the capacity to offer a representative outlook on three key aspects. Initially, it delves into the principal contemporary subjects under discussion concerning blockchain technology. Secondly, it presents the representative categories that these subjects fall into. Lastly, it contemplates the potential trajectory of blockchain advancement and its potential influence on both society and technology.

I. INTRODUCTION

Blockchain technology is a relatively nascent concept, characterized by Wikipedia as "an evolving record of records, known as blocks, linked and secured through cryptography" (Wikipedia and Contributors, 2018b). This paper aims to identify a representative cross-section of current themes within blockchain research while also delving into future implications and offering our insights. Although blockchain's comprehension is not yet widespread, it is swiftly gaining traction and holds a prominent position in contemporary discourse. However, the trends seen in media often diverge from those observed in research, thus presenting an intriguing opportunity to explore how scholarly and peer-reviewed research aligns with a trending topic.

Notably, not too long ago, scholarly articles on blockchain were scarce; nevertheless, this landscape is undergoing rapid transformation. In this paper, we furnish an overview of prevailing subjects in academic literature and address three fundamental queries concerning blockchain. We commence by elucidating the essence of blockchain, followed by outlining the methodology employed to curate our dataset. Subsequently, we delve into an exploration of the emerging themes we have identified. This is succeeded by an examination of the reasons for blockchain's significance and its present applications, supplemented by our recommendations. Ultimately, we conclude by summarizing our findings and offering our reflections on the potential for and necessity of future blockchain research.

What is blockchain:

We aim to provide comprehensive insights into the following questions: What precisely is the concept of blockchain? What categories and trends characterize blockchain? What are the societal and technological consequences arising from this innovation? What trajectory lies ahead for blockchain technology? These questions naturally lead us to contemplate their significance. To commence, let's retrace our steps to the origins of blockchain. The concept of blockchain first emerged in a paper titled "Bitcoin: A Peer-to-Peer Electronic Cash System," authored by an enigmatic figure using the pseudonym Satoshi Nakamoto. Remarkably, this work was never formally published in a peer-reviewed journal (Nakamoto, 2008).

Regarding Bitcoin, Pierro elucidates that each Bitcoin can be represented as a numerical value, and these values serve as solutions to a specific equation. The generation of a new solution is termed "mining." Subsequent to mining, Bitcoins can be traded or transacted, with each transaction recorded in the blockchain's chronological log, often referred to as a "ledger." The distinctive feature of blockchain lies in its decentralized nature; rather than being owned or controlled by a single entity, every participating computer in a transaction retains a copy of the transaction's details.
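A minimal sketch can make the ledger idea concrete: each block stores a hash computed from its own data together with the previous block's hash, so altering any earlier entry breaks every later hash. This is an illustrative toy in Python, not Bitcoin's actual implementation.

    # Illustrative sketch: a hash-linked ledger of transactions.
    import hashlib
    import time

    def make_block(transaction: str, previous_hash: str) -> dict:
        timestamp = str(time.time())
        payload = (timestamp + transaction + previous_hash).encode()
        return {
            "timestamp": timestamp,
            "transaction": transaction,
            "previous_hash": previous_hash,
            "hash": hashlib.sha256(payload).hexdigest(),
        }

    chain = [make_block("genesis", previous_hash="0")]
    chain.append(make_block("Alice pays Bob 1 BTC", chain[-1]["hash"]))
    chain.append(make_block("Bob pays Carol 0.5 BTC", chain[-1]["hash"]))

    # Verification: every block must reference the hash of the block before it.
    for prev, block in zip(chain, chain[1:]):
        assert block["previous_hash"] == prev["hash"]
    print("chain verified, length:", len(chain))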


As previously mentioned, our exploration resulted in


Pierro (2017) further describes the blockchain as akin to a the identification of approximately 20 pertinent
table with three columns: each row corresponds to a themes within the literature. We have compiled a
particular transaction, the first column holds the table that presents each of these topics, encompassing
transaction's timestamp, the second column stores its a wide range of subjects including prior surveys,
details, and the third column encompasses a hash that financial concepts, management, higher education,
amalgamates the current transaction's specifics with the and narratives. In the current rendition of our review,
hash of the prior transaction. By incorporating a we have incorporated reviews for nearly half of these
timestamp and the preceding transaction's hash, anyone topics, with the intention of encompassing all of them
wishing to validate this information can do so at any in the final version.
juncture, effectively enabling the tracking of historical Yli-Huumo, Ko, Choi, Park, and Smolander
data. Security measures are in place to restrict undertook a systematic review of peer-reviewed
unauthorized access to transaction information. The papers available up to 2015. Impressively, their
previously mentioned hash in the third column, which is efforts yielded a total of 41 peer-reviewed articles
populated during a transaction, serves as an encrypted published by that year. Notably, their observation of
sequence of alphanumeric characters intended to mask particular interest was that 80 percent of the located
transaction data. articles centered around the utilization of blockchain
for Bitcoin – a prominent cryptocurrency. While a
II. METHODOLOGY concentrated focus on cryptocurrency was a logical
direction for such a review, the authors opted to
Our investigation commenced with a query on the online concentrate on technical aspects of blockchain
server of the UT library. Upon searching for the keyword instead, specifically delving into matters of security,
'blockchain,' we retrieved a diverse array of information performance, scalability, and more. Their exploration
sources. Notably, newspaper articles dominated the also brought to light that a significant portion of
results with a substantial count of over 23,000 entries. In research was oriented towards the domains of privacy
comparison, total journal articles constituted about a tenth and security within blockchain, with an emphasis on
of that number, totaling a little over 2,000. Magazine uncovering limitations.
articles were nearly equal in quantity to the journal
articles, although this does not accurately portray the 1. Themes:
overall prevalence of magazine articles. Brochures
accounted for just under 800 entries, while books
amounted to around 150. While our primary focus was on
peer-reviewed journal articles, it was intriguing to
examine these figures in relation to each other. After
assessing the total count, we refined our query to
exclusively display journal articles. Our process involved
starting from the top of the results and selecting articles 2. Systematic Review:
that appeared pertinent and valuable. Once identified, we During our initial research phase, we came
proceeded to download these articles and analyze their across an article that meticulously examined
content for recurring themes. This analysis led us to the current landscape of blockchain
compile a list of topics, and we made a deliberate effort to research. In this work, Yli-Huumo and
limit ourselves to downloading only one article per topic. colleagues conducted a systematic review of
In the end, we accumulated approximately 20 articles 41 peer-reviewed papers published until
from peer-reviewed journals that we believe provide a 2015. Their communication suggests that
representative snapshot of the existing literature. These they were able to locate only a total of 41
selected articles enabled us to identify a range of peer-reviewed articles during that time
prevalent themes within the current discourse. While it's period. An intriguing observation they make
important to acknowledge that these themes do not right from the outset of the article is that a
encompass the entirety of trends in contemporary significant 80 percent of the articles they

blockchain literature, they do offer a highly representative found centered around the application of
overview. blockchain for Bitcoin – a prominent digital
currency. Although the natural inclination
FINDINGS/THEMES for such a review could have been to
primarily focus on cryptocurrency, they


made the distinctive choice to emphasize advancements into the securities sector – a
technical intricacies of blockchain instead, domain in which he possesses expertise.
notably those concerning security, performance,
scalability, and related aspects. Furthermore,
their exploration revealed a predominant focus 5. Food Security:
on issues of privacy and security within In a brief communication published in
blockchain research, alongside uncovering Nature magazine (Ahmed and Broek, 2017),
limitations. a group of researchers from Montana
College highlight several emerging trends
3. Finance: that underscore the necessity and potential
"Different on Blockchain" asserts various of blockchain technology for ensuring food
financial benefits linked to blockchain security. The central concern in this context
technology. The authors initiate their argument is the traceability of food products,
by employing the example of a bank and the encompassing their origins, entire
considerable resources squandered due to storing distribution networks, and ultimately
and processing all transactions internally. Cocco, reaching the end consumer. Blockchain
Pinna, and Marchesi (2017) assert that such technology could play a pivotal role in
resource usage, including hard drive storage for mitigating fraudulent activities and
data and the extra energy required for addressing, if not preventing, challenges
operations, not only incurs higher costs for related to foodborne illnesses.
banks than a system built on a blockchain, but it
also leads to a net reduction in resource 6. Property - Legal Ownership:
consumption. This outcome, in turn, contributes Ishmaev's paper introduces a significant
to environmental conservation by minimizing concept, suggesting that the
electronic waste and energy consumption. "implementation of complex systems of
smart contracts and decentralized
organizations may rewrite the basic tenets of
Transitioning from the resource argument, the property law, constitutional rights, and even
article delves into addressing the environmental judicial enforcement of law" (Ishmaev,
security inherent in the blockchain due to its 2017). This statement's implications can be
capacity to maintain records of past transactions dissected into two primary possibilities.
within preceding ones. This novel ledger
structure enhances the bank's ability to maintain Firstly, he underscores that blockchain
more secure records, which are less susceptible technology, particularly through its ledger
to tampering. Simultaneously, it affords the bank system, facilitates the creation of smart
a clearer perspective on potential investment contracts. In these contracts, every phase is
opportunities. Any attempts to manipulate traceable and stored, and when certain
financial records would be more easily conditions are met (for example, completing
detectable, thus reinforcing transparency and a website for a client), automatic payment
accountability. can be triggered for the contractor. Through
automation, specific stages can be
4. Securities: programmed to update the ledger in real-
Tranquillini, in his analysis, delves into the time, enabling transparent progress tracking
potential of blockchain technology within the for all parties involved.
securities industry, with a focus less on the
technology itself and more on its
III. DISCUSSIONS, IMPLICATIONS, AND
application. He draws from a previous article RECOMMENDATIONS:
authored by academics Benjamin Edelman and
Damien Geradin, published in the Harvard 1. WHY:
Business Review, which explored the integration We have discussed what the blockchain is,
of blockchain technologies into the consumer but why should anyone care? For seemingly
goods industry. Using this as a foundation, being a rather ambiguous technology to the
Tranquillini leverages their work to delve into general populace, a monetary application of
the prospects of incorporating such the blockchain has garnered a large financial


backing. With the price of a Bitcoin currently information that would be great to host on a
being valued at about ten thousand dollars distributed open ledger, but other
(Wikipedia & Contributors, 2018a), it seems information that should not be.
important to see why people are investing in it. Organizational management and other such
As illustrated by the thematic analysis above, domains are no different. It is hard to have
blockchain has implications for a wide variety of much foresight beyond this at the moment
fields. Some are more hopeful, or seem more because blockchain technology is so new,
useful, than others. While it might be too and we are just really starting to see both the
difficult to see applying blockchain to really pros and cons play out in real time.
intricate and highly regulated industries like
securities at the moment (Tranquillini, 2016), we V. CONCLUSIONS & FUTURE WORK
can see that it has already had some degree of
success with things like product traceability (Lu With blockchain technology possessing such
& Xu, 2017). a large appeal, we are already seeing
2. HOW: widespread adoption. As nearly every
There is ample evidence that blockchain is industry utilizes some sort of agile, record
currently being, and should be, implemented in keeping practices, it is not unreasonable to
industries where it is a good fit. But now that we expect to see this technology applied to a
know why, the question is how will, or how can, wide range of applications some of which
blockchain technology be applied to various are hinted at in our previous sections such as
domains? Every field and industry will be the potential for a smart city, while others
different, and one of the biggest considerations are either still in development or have yet to
is what other systems to these fields and be discovered. Furthermore, due to the peer-
industries use. As mentioned in the previous to-peer nature of the technology this
section, fields such as government, finances, and technology and every stakeholder having
securities will be some of the most difficult. access to their block of the ledger, cooking
Blockchain technology provides a public the books or falsifying data has never been
harder. This alone has the potential to
increase consumer confidence in these new
ledger, which is great for accountability, but can technological disruptions. As with any new
be a nightmare for keeping information private. technology, the underpinnings are not well
One of the biggest challenges with the literature understood and for that reason it is difficult
so far is that most of the research is still to say how widely adopted the technology
theoretical, and not applied. There is certain will be.
topics and new applied applications, as well as study Languages and Web Programming. He has
adoption rates of the technology. For those who do attended workshops on POWER BI, Data
adapt blockchain, further study would grant us Analytics using R, Generative AI, Block
insights as to what increases (if any) in productivity Chain Technology and many more.
have been recorded. Studies may also focus on
roadblocks as to why this technology. VI. REFERENCES
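To make the tamper-evidence point above concrete, the following minimal Python sketch hash-chains a few ledger entries. It is an illustrative toy only (the record contents and function names are invented here), not a description of any particular blockchain platform.

import hashlib, json

def block_hash(index, data, prev_hash):
    # Each block's hash covers its contents and the previous block's hash.
    payload = json.dumps({"index": index, "data": data, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64
    for i, data in enumerate(records):
        h = block_hash(i, data, prev)
        chain.append({"index": i, "data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or blk["hash"] != block_hash(blk["index"], blk["data"], blk["prev"]):
            return False
        prev = blk["hash"]
    return True

ledger = build_chain(["pay A 10", "pay B 5", "pay C 2"])
print(verify(ledger))            # True: the untampered copy checks out
ledger[1]["data"] = "pay B 500"  # "cooking the books" in one stakeholder's copy
print(verify(ledger))            # False: the altered block no longer matches its hash

Because every stakeholder holds the same chain, a single falsified copy is immediately detectable against the others.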

VI. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh, and he has 10 years of experience in teaching and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.

VII. REFERENCES

1. Ahmed, S., & Broek, N. t. (2017). Food supply: Blockchain could boost food security (brief article). Nature, 550(7674), 43.
2. Chapron, G. (2017). The environment needs cryptogovernance. Nature, 545(7655).
3. Cocco, L., Pinna, A., & Marchesi, M. (2017). Banking on blockchain: Costs savings thanks to the blockchain technology. Future Internet, 9(3), 25.
4. Huckle, S., & White, M. (2016). Socialism and the blockchain. Future Internet, 8(4), 49.


5. Ishmaev, G. (2017). Blockchain technology as an institution of property. Metaphilosophy, 48(5), 666-686.
6. Lu, Q., & Xu, X. (2017). Adaptable blockchain-based systems: A case study for product traceability. IEEE Software, 34(6), 21-27.
7. Maxwell, D., Speed, C., & Pschetz, L. (2017). Story blocks: Reimagining narrative through the blockchain. Convergence: The International Journal of Research into New Media Technologies, 23(1), 79-97.
8. Nakamoto, S. (2008). Bitcoin: A peer-to-peer electronic cash system. Retrieved from https://bitcoin.org/bitcoin.pdf
9. Pierro, M. D. (2017). What is the blockchain? Computing in Science & Engineering, 19(5), 92-95.
10. Sun, J., Yan, J., & Zhang, K. Z. K. (2016). Blockchain-based sharing services: What blockchain technology can contribute to smart cities. Financial Innovation, 2(1), 1-9.
11. Tapscott, D., & Tapscott, A. (2016). Blockchain revolution: How the technology behind bitcoin is changing money, business, and the world. New York: Portfolio/Penguin.
12. Tranquillini, A. (2016). Blockchain yes, blockchain no: An outsider (non-IT expert) view. Journal of Securities Operations & Custody, 8(4), 287-291.
13. Wang, H., Chen, K., & Xu, D. (2016). A maturity model for blockchain adoption. Financial Innovation, 2(1), 1-5.
14. Wikipedia & Contributors. (2018a). Bitcoin — Wikipedia, the free encyclopedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Bitcoin&oldid=827425580 (Online; accessed 27 February 2018).
15. Wikipedia & Contributors. (2018b). Blockchain — Wikipedia, the free encyclopedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Blockchain&oldid=827331254 (Online; accessed 27 February 2018).
16. Yli-Huumo, J., Ko, D., Choi, S., Park, S., & Smolander, K. (2016). Where is current research on blockchain technology? A systematic review. PLOS ONE, 11(10), e0163477.


A Survey On E-Payments In India


Shaik Karimulla, Siva Prasad Guntakala,
Student, Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous) , KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email: bablushaik7867@gmail.com Email: gspkbn@gmail.com

ABSTRACT

Objectives: This paper is aimed at investigating and increasing awareness about various concepts related to Electronic Payment Systems (EPS), including their advantages, challenges and security considerations. The proposed study also evaluates the adoption of e-payment systems and the resulting impact on the economy of a nation. Methods/Statistical Analysis: In this paper, a comprehensive survey on all the aspects of electronic payment was conducted after analysis of several research studies on online payment systems. The most recent references and information have been explored in order to gain significant information about electronic payment systems. Findings: From the study conducted, it can be elucidated that despite the various issues that the usage of electronic payment systems poses, these are identified as a positive step towards the economic development of a nation. Nevertheless, their full potential can be realized only by raising awareness among people. Applications/Improvements: With the advancement in technology and the popularity of the Internet, the perception of making online transactions is bound to gain momentum. In the future, the payment modes currently used and supported shall see a declining trend owing to the numerous benefits offered by electronic payment systems.

I. INTRODUCTION

In India's journey towards E-payments and digitization, merchants as well as customers are getting comfortable adopting new digital technologies. With customers getting comfortable with online shopping, an eCommerce site and online payment acceptance are nowadays a must-have for any business. Customers are happy with browsing and shopping at any time from anywhere with just a few clicks, and along with this rise of online shopping and eCommerce, E-payments are gaining widespread popularity. COVID and the limitations it imposed on people made online payments the need of the time. Many businesses are now offering their products and services online. However, if you are a business and want to accept e-payments, you have to work on your electronic payment system to provide better and secure service for your customers. To understand e-payment systems, here is everything you need to know.

What is an e-payment system?
An e-payment or Electronic Payment system allows customers to pay for services via electronic methods. They are also known as online payment systems. Normally e-payment is done via debit cards, credit cards, direct bank deposits, and e-checks; other alternative e-payment methods like e-wallets, bitcoin, cryptocurrencies, and bank transfers are also gaining popularity.

II. Types of e-payment system

E-payments can be done in the following ways:

Internet banking – In this case, the payment is done by digitally transferring the funds over the internet from one bank account to another. Some popular modes of net banking are NEFT, RTGS, and IMPS.


Card payments – Card payments are done via cards, e.g. credit cards, debit cards, smart cards, stored value cards, etc. In this mode, an electronic payment accepting device initiates the online payment transfer via card.

Credit/Debit card – An e-payment method where the card is required for making payments through an electronic device.

Smart card – Also known as a chip card, a smart card is a card with a microprocessor chip that is needed to transfer payments.

Stored value card – These types of cards have some amount of money stored beforehand and are needed to make funds transfers. These are prepaid cards like gift cards, etc.

Direct debit – Direct debit transfers funds from a customer's account with the help of a third party.

E-cash – A form where the money is stored in the customer's device, which is used for making transfers.

E-check – This is a digital version of a paper check used to transfer funds within accounts.

Alternate payment methods – As technology evolves, e-payment methods keep evolving with it (and are still evolving). These innovative alternate e-payment methods became widely popular very quickly thanks to their convenience.

E-wallet – Very popular among customers, an e-wallet is a form of prepaid account where the customer's account information, like credit/debit card information, is stored, allowing a quick, seamless, and smooth flow of the transaction.

Mobile wallet – An evolved form of the e-wallet, the mobile wallet is extensively used by lots of customers. It is a virtual wallet, in the form of an app that sits on a mobile device, and it stores card information on that device. The user-friendly nature of mobile wallets makes them easier to use. It offers a seamless payment experience, making customers less dependent on cash.

QR payments – QR code-enabled payments have become immensely popular. QR code stands for 'Quick Response' code, a code that contains a pixel pattern of barcodes or squares arranged in a square grid. Each part of the code contains information, which can be the merchant's details, transaction details, etc. To make payments, one has to scan the QR code with a mobile device.

Contactless payments – Contactless payments have been becoming popular for quite some time. These payments are done using RFID and NFC technology. The customer needs to tap or hover the payment device or a card near the payment terminal, earning it the name 'tap and go'.

UPI payments – NPCI (National Payments Corporation of India) has developed an instant real-time payment system to facilitate interbank transactions. This payment system is titled UPI (Unified Payments Interface). Payments via UPI can be made via an app on a mobile device.

Biometric payments – Biometric payments are done by using/scanning various parts of the body, e.g. fingerprint scanning, eye scanning, facial recognition, etc. These payments are replacing the need to enter a PIN for making transactions, making these payments more accessible and easy to use.

Payments via wearable devices – Wearable devices are rapidly becoming popular among customers. These devices are connected to the customer's bank account and are used to make online payments. An example of a wearable used for making an online payment is a smartwatch.


AI-based payments – As machine learning and Artificial Intelligence are creating a revolution all around the world, AI-based solutions are becoming more popular. Payments based on AI, such as smart speakers, chatbots, ML tools, deep learning tools, etc., are making it easier for businesses to maintain transparency.

How does an e-payment system work?

Entities involved in an online payment system:
The merchant
The customer / the cardholder
The issuing bank
The acquirer
The payment processor
The payment gateway

The working of e-payments can be explained in the following three steps:

Payment initiation – The customer finalizes the product/service and chooses the payment method to initiate the transaction. Depending on the payment method, the customer enters the required information like card number, CVV, personal details, expiration date, PIN, etc. The chosen payment method either redirects the customer to an external payment page or to a bank's payment page to continue the payment process.

Payment authentication – The information submitted by the customer, along with other details like payment information and the customer's account information, is authenticated by the operator. The operator can be a payment gateway or any other solution involved. If everything is authenticated positively, the operator reports a successful transaction.

Payment settlement – After the successful authentication process, payment from the customer's bank gets transferred into the merchant's account by the online payment service provider.
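The three steps above can be sketched as a toy flow in Python. This is a minimal, hypothetical illustration only; the class and function names (PaymentGateway, authenticate, settle) are invented for the example and do not correspond to any real payment provider's API.

class PaymentGateway:
    """Toy illustration of the three steps: initiation, authentication, settlement."""
    def __init__(self, issuer_accounts):
        self.issuer_accounts = issuer_accounts   # stands in for the issuing bank's records

    def initiate(self, card_number, pin, amount, merchant):
        # Step 1: the customer provides payment details and a transaction is created.
        return {"card": card_number, "pin": pin, "amount": amount, "merchant": merchant}

    def authenticate(self, txn):
        # Step 2: the operator checks the submitted details against the issuing bank.
        account = self.issuer_accounts.get(txn["card"])
        return (account is not None and account["pin"] == txn["pin"]
                and account["balance"] >= txn["amount"])

    def settle(self, txn, merchant_accounts):
        # Step 3: funds move from the customer's bank to the merchant's account.
        self.issuer_accounts[txn["card"]]["balance"] -= txn["amount"]
        merchant_accounts[txn["merchant"]] += txn["amount"]

gateway = PaymentGateway({"4111-0000": {"pin": "1234", "balance": 5000}})
merchants = {"BookStore": 0}
txn = gateway.initiate("4111-0000", "1234", 750, "BookStore")
if gateway.authenticate(txn):
    gateway.settle(txn, merchants)
print(merchants)   # {'BookStore': 750} after the toy settlement completes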

III. GROWTH IN DIGITAL PAYMENTS IN INDIA

The value of UPI-based digital payments in India was over 139 trillion Indian rupees in financial year 2023. This was a strong increase from the previous financial year's value of over 84 trillion Indian rupees.

[Figure: Digital payments in India]

Advantages of E-Payment
• Increased speed and convenience
• Eliminates security risks
• Competitive advantage
• Time saving
• Environment friendly
• Money is available quicker
• Speed of e-payments

Disadvantages of E-Payment
• Security concerns
• Disputed transactions
• Increased business costs
• Lack of anonymity
• Necessity of internet
• Technical difficulties
• Service charges and other expenses

IV. CONCLUSION

In conclusion, if you are running a business, accepting online payments is the need of current times. You need to find out what your target customers prefer, and accordingly you need to provide the most convenient and relevant online payment solutions.

V. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College, Vijayawada, Andhra Pradesh, and he has 10 years of experience in teaching and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.

VI. REFERENCES

1. Patgaonkar, Aditi. "A Study of Electronic Payment Systems with perspective of Customers Adoption in India." (2022).
2. Ravikumar T., Suresha B., Sriram M., and Rajesh R. "Impact of Digital Payments on Economic Growth: Evidence from India." International Journal of Innovative Technology and Exploring Engineering (IJITEE), ISSN: 2278-3075, Volume-8, Issue-12. (2021).
3. Zhou, Hao, Hong-feng Chai, and Mao-lin Qiu. "Fraud detection within bankcard enrollment on mobile device based payment using machine learning." Frontiers of Information Technology & Electronic Engineering 19.12 (2020): 1537-1545.
4. Franciska, A. Martina, and S. Sahayaselvi. "An Overview On Digital Payments." (2019).
5. Khan, Burhan Ul Islam, et al. "A compendious study of online payment systems: Past developments, present impact, and future considerations." International Journal of Advanced Computer Science and Applications 8.5 (2018): 256-271.
6. Manisha, Neena Madan. "Credit Card Fraud Detection Using Split Criteria in Classification." IOSR Journal of Computer Engineering (IOSR-JCE), e-ISSN: 2278-0661, p-ISSN: 2278-8727, Volume 19, Issue 2, Ver. I (Mar.-Apr. 2017), PP 39-43.
8. Save, Prajal, et al. "A novel idea for credit card fraud detection using decision tree." International Journal of Computer Applications 161.13 (2016).
9. Bezovski, Zlatko. "The future of the mobile payment as electronic payment system." European Journal of Business and Management 8.8 (2015): 127-132.
10. Garg, C. R. "Importance of E-Commerce Payment System in Less Paper Work." Vol. 2 (2014): 47-59.

The Study On Public Acceptance Of UPI And Digital Payments

Shaik Mohiddin Basha, Siva Prasad Guntakala,


Student, Assistant professor,
Department of MCA, Department of MCA,
KBN College (Autonomous), KBN College-(Autonomous),
Vijayawada-520001, Vijayawada-52001,
Andhra Pradesh, India, Andhra Pradesh, India,
Email: Shaikbasha1713138413@gmail.com Email: gspkbn@gmail.com

ABSTRACT

The advent of digital technology has revolutionized the way financial transactions are conducted globally. One of the prominent outcomes of this transformation is the emergence of the Unified Payments Interface (UPI) and digital payment systems. This abstract provides an overview of UPI and digital payments, highlighting their significance, functioning, benefits, and challenges. Unified Payments Interface (UPI) is a real-time payment system developed in India that enables seamless peer-to-peer (P2P) and peer-to-merchant (P2M) transactions using a single platform. UPI leverages the ubiquity of smartphones and internet connectivity to facilitate instant fund transfers, bill payments, and online purchases. With its open architecture, UPI allows multiple banks and financial institutions to participate, ensuring interoperability and a competitive environment that promotes innovation.

Key words of UPI and digital payments
Here are some key terms and keywords related to UPI (Unified Payments Interface) and digital payments:

I. INTRODUCTION

The modern era is witnessing a profound shift in the way financial transactions are conducted, with traditional modes of payment giving way to innovative digital solutions. Among these advancements, the Unified Payments Interface (UPI) and digital payments have emerged as pivotal drivers of the changing financial landscape. This introduction provides an overview of UPI and digital payments, shedding light on their significance, functioning, and the transformative impact they have on the global economy.

Unified Payments Interface (UPI)

Unified Payments Interface, commonly known as UPI, is a revolutionary real-time payment system that originated in India. Developed by the National Payments Corporation of India (NPCI), UPI provides a unified platform for seamless, secure, and instantaneous money transfers between individuals and merchants. UPI's design emphasizes interoperability, enabling various banks and financial institutions to collaborate within the ecosystem. This open architecture fosters healthy competition and encourages innovation, resulting in an array of user-friendly payment apps that leverage UPI's infrastructure.

Digital Payments

Digital payments encompass a broader spectrum of electronic transactions that extend beyond UPI. This category encompasses various methods such as credit and debit card transactions, mobile wallets, internet banking transfers, and digital currencies like cryptocurrencies. One of the fundamental advantages of digital payments is the reduction in dependence on physical currency. This transition offers a multitude of benefits, including enhanced security, increased financial inclusion for underserved populations, improved transparency, and streamlined record-keeping for both consumers and businesses.

The Transformative Impact

The convergence of UPI and digital payments has reshaped how people perceive and engage with money. Gone are the days of cumbersome manual transactions and the need for carrying wads of cash. These innovations have led to a paradigm shift in financial behavior, prompting a digital-first mindset among consumers, businesses, and governments alike. Moreover, UPI and digital payments have transcended national boundaries, empowering individuals to engage in cross-border transactions with ease, fostering global economic connectivity.

HOW UPI WORKS

Unified Payments Interface (UPI) is a real-time payment system developed by the National Payments Corporation of India (NPCI). It allows users to link multiple bank accounts into a single
mobile application and perform various types of financial transactions, such as transferring funds, making payments, and more. How UPI works involves multiple layers of technology and processes:

User Registration
To start using UPI, a user needs to have a bank account and a mobile number registered with that bank. The user needs to download a UPI-enabled mobile app provided by their bank or a third-party app that supports UPI.

Linking Bank Accounts
After installing the app, the user needs to link their bank account(s) to the app by providing the necessary details like account number and IFSC code. Once the bank verifies the information, the account is linked to the UPI app.

Virtual Payment Address (VPA) Creation
During the registration process, the user creates a Virtual Payment Address (VPA), which acts as a unique identifier for their bank account. The VPA is in the format username@bank, e.g., john.doe@bankname.

Secure Authentication
UPI employs two-factor authentication for security. When making transactions, users are required to provide a combination of their UPI PIN and biometric authentication (if supported by the device and app).

Initiating Transactions
To send money or make payments using UPI, a user enters the recipient's VPA or scans their QR code. The app then verifies the recipient's VPA, and the user enters the transaction amount and a description (optional).

Technologies of UPI and digital payments

The technologies underlying UPI and digital payments are a combination of software, protocols, encryption, and infrastructure that enable secure and seamless financial transactions. Here are some of the key technologies involved in UPI and digital payments:

Real-Time Payment Systems
UPI operates on a real-time basis, allowing instantaneous transfer of funds between bank accounts. Real-time payment systems utilize protocols like IMPS (Immediate Payment Service) and NFS (National Financial Switch) to ensure swift and direct settlement of transactions.

Mobile applications
UPI transactions primarily occur through mobile applications provided by banks and third-party payment service providers. These apps facilitate user registration, account linking, transaction initiation, and authentication using mobile devices.

Payment Gateway Integration
Digital payment systems often integrate with payment gateways that securely process online transactions. Payment gateways utilize encryption and tokenization to protect sensitive payment information during transmission.

Encryption and Security Protocols
Security is paramount in digital payments. Technologies like SSL/TLS (Secure Sockets Layer/Transport Layer Security) encrypt data exchanged between a user's device and the server, safeguarding it from unauthorized access.

QR Codes and NFC
Digital payment methods such as UPI use QR codes (Quick Response codes) that can be scanned by smartphones to initiate payments. Near Field Communication (NFC) technology enables contactless payments by allowing devices to communicate when in close proximity.

Biometric Authentication
Many digital payment platforms incorporate biometric authentication methods such as fingerprint scanning or facial recognition to enhance security and streamline user authentication.

Tokenization
Tokenization is the process of substituting sensitive data with unique tokens that have no inherent value. This technology is used to protect card information and other sensitive data during online and mobile transactions.

Blockchain and Cryptocurrencies
Some digital payment systems incorporate blockchain technology and cryptocurrencies for secure, decentralized transactions. Cryptocurrencies like Bitcoin and Ethereum have their own payment networks and are often used for cross-border transactions.
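As a rough illustration of the tokenization idea described above (not the actual scheme used by UPI or any card network), the sketch below swaps a card number for a random token and keeps the real value only in a private vault; all names here are invented for the example.

import secrets

class TokenVault:
    """Toy token vault: replaces sensitive values with random, meaningless tokens."""
    def __init__(self):
        self._vault = {}          # token -> real value, kept server-side only

    def tokenize(self, card_number):
        token = "tok_" + secrets.token_hex(8)   # opaque value with no inherent meaning
        self._vault[token] = card_number
        return token

    def detokenize(self, token):
        # Only the vault owner can map a token back to the real card number.
        return self._vault.get(token)

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")
print(token)                    # e.g. tok_9f2c4a... - safe to store or transmit
print(vault.detokenize(token))  # original card number, recoverable only via the vault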


Mechanism of UPI and Digital payments
The mechanisms of UPI (Unified Payments Interface) and digital payments involve a series of steps and interactions between various stakeholders, including users, banks, payment service providers, and technology platforms. Here is a simplified overview of how UPI and digital payments work.

II. MECHANISM OF UPI

Registration
Users download a UPI-enabled mobile app provided by their bank or a third-party payment service provider. They link their bank accounts to the app and create a unique Virtual Payment Address (VPA) that acts as an identifier, eliminating the need to share account numbers and IFSC codes.

Transaction Initiation
To make a payment, the user selects the option to send money through UPI within the app. They enter the recipient's VPA, or they can use a QR code provided by the recipient.

Authentication
The user's app communicates with the UPI platform, and a request is sent to their bank for authentication. The user may be prompted to enter a PIN or use biometric authentication for security.

Transaction Processing
Once authenticated, the payment details are encrypted and transmitted to the recipient's bank through the UPI platform. The recipient's bank validates the transaction and sends a confirmation to the UPI platform.

Immediate Settlement
The UPI platform facilitates real-time settlement between the two banks, transferring funds from the sender's account to the recipient's account.

Notification
Both sender and recipient receive notifications on their respective apps confirming the transaction.
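A compact way to see these steps end to end is the toy walkthrough below. It is only an illustrative sketch of the sequence described above; the names (UpiPlatform, register, pay) are invented for the example and are not NPCI's actual interfaces.

class UpiPlatform:
    """Toy model of the UPI steps: register -> initiate -> authenticate -> settle -> notify."""
    def __init__(self):
        self.accounts = {}   # vpa -> {"bank": ..., "pin": ..., "balance": ...}

    def register(self, vpa, bank, pin, balance):
        # Registration: the VPA stands in for the account number and IFSC code.
        self.accounts[vpa] = {"bank": bank, "pin": pin, "balance": balance}

    def pay(self, sender_vpa, pin, receiver_vpa, amount):
        sender = self.accounts.get(sender_vpa)
        receiver = self.accounts.get(receiver_vpa)
        if sender is None or receiver is None:
            return "invalid VPA"
        if sender["pin"] != pin:                 # authentication with the UPI PIN
            return "authentication failed"
        if sender["balance"] < amount:
            return "insufficient funds"
        sender["balance"] -= amount              # immediate settlement between accounts
        receiver["balance"] += amount
        return f"notified: {sender_vpa} paid {amount} to {receiver_vpa}"

upi = UpiPlatform()
upi.register("john.doe@bank", "Bank A", "4321", 1000)
upi.register("shop@bank", "Bank B", "9999", 0)
print(upi.pay("john.doe@bank", "4321", "shop@bank", 250))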
III. MECHANISM OF DIGITAL PAYMENTS (GENERAL)

User Registration
Users create accounts on a digital payment platform, often linked to their bank accounts or credit/debit cards.

Transaction Initiation
Users initiate payments through various channels: mobile apps, websites, or even contactless methods like NFC or QR codes.

Payment Gateway Integration
If the payment is online, the platform connects to a payment gateway that securely processes the transaction. The payment gateway may ask the user for additional authentication through 3D Secure (MasterCard SecureCode or Verified by Visa) or other methods.

Data Encryption and Tokenization
Sensitive payment information is encrypted and may be tokenized to protect it during transmission.

Transaction Processing
Conduct surveys or distribute questionnaires to users, merchants, and stakeholders to gather insights into their experiences, preferences, and challenges related to using UPI and digital payment systems.

Case Studies
Analyze specific cases or scenarios of UPI implementation or digital payment adoption to understand the factors that influenced their success or challenges.

IV. CONCLUSION

In conclusion, UPI (Unified Payments Interface) and digital payments have significantly transformed the way transactions are conducted, offering a convenient, secure, and efficient alternative to traditional payment methods. The rapid adoption of digital payment systems like UPI has reshaped financial landscapes across various sectors. Several key points emerge from the study of UPI and digital payments.

In essence, UPI and digital payments represent a fundamental shift in how individuals, businesses, and governments transact and manage finances. Their impact extends beyond mere convenience, touching on economic growth, financial inclusion, and technological innovation. However, it is essential to address the associated challenges and
concerns to ensure that these payment methods


continue to benefit society while maintaining robust
security and ethical standards.


V. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh, and he has 10 years of experience in teaching and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.

VI. REFERENCES

1. Adharsh, R., Harikrishnan, J., Prasad, A., & Venugopal, J. S. (2018). Transformation towards E-Wallet Payment Systems Pertaining to Indian Youth. International Journal of Pure and Applied Mathematics, 119(12), 2583–2594.
2. Dr. Swati Kulkarni, Dr. Aparna J Varma, D. R. P. V. (2021). A Literature Study Of Consumer Perception Towards Digital Payment Mode In India. Psychology and Education Journal, 58(1), 3304–3319. https://doi.org/10.17762/pae.v58i1.1270
3. Prof. Sunny Gupta, Dr. Dinesh Kumar et al. (2020). UPI - An Innovative Step for Making Digital Payment Effective and Consumer Perception on Unified Payment Interface. The International Journal of Analytical and Experimental Modal Analysis, Volume XII, Issue I, January 2020.
4. Radhika Arora et al. (2020). A Study On Customer Perception towards UPI And Its Growing Influence In The Realm Of Digital Payments: An Empirical Study. Adarsh Journal of Management Research (ISSN 0974-7028), Vol. 9, Issue 1, March 2016.
5. R. Iyer, M. Christian et al. (2022). Factors Influencing the Digital Payment Experience in India: A Usability Study. GLS KALP – Journal of Multidisciplinary Studies, Volume 2, Issue 1, January 2022.

High-resolution motion-compensated imaging photoplethysmograph for remote heart rate monitoring
Shaik Nagoor Bhi, Siva Prasad Guntakala,
Student, Department of MCA, Department of MCA,
KBN College, Vijayawada, Andhra Pradesh, India, KBN College, Vijayawada, Andhra Pradesh, India,
Email: jareenashaik838@gmail.com Email: gspkbn@gmail.com

ABSTRACT

We introduce an innovative non-contact photoplethysmographic (PPG) imaging system designed to enhance the accuracy and reliability of remote heart rate monitoring. This system is founded on the concept of utilizing high-resolution video recordings of the ambient reflectance from human skin, thereby circumventing the need for physical contact with the body. This approach not only compensates for bodily movements but also leverages the naturally occurring variations in skin erythema to significantly enhance the precision of heart rate measurements. Key to our system is the automated identification of a singular measurement location on an individual, where ambient reflectance is recorded. This designated spot is then tracked over time to account for the individual's body motion. Consequently, measurements of reflectance at various wavelengths are captured, utilizing motion-compensation information, from the identified location. This methodology guarantees more dependable measurements from the same location on the human body over an extended period.

Keywords
Imaging photoplethysmography, skin erythema, motion compensation, non-contact heart rate.

I. INTRODUCTION

The evolution of medical imaging techniques has ushered in an era of non-invasive medical assessment and diagnosis, offering new dimensions of possibility in healthcare. Photoplethysmography
(PPG), a widely employed non-invasive medical assessment method, involves optically gauging the volumetric changes within an organ. However, conventional PPG systems necessitate physical contact, exemplified by the placement of an oximeter sensor on the subject's extremity. This contact-based requirement limits the application of PPG devices to cases where consistent physical contact can be maintained. A notable recent development in this domain is the emergence of non-contact PPG, or PPG imaging, which encompasses the acquisition of PPG measurements through video recordings. This innovative approach paves the way for remote medical assessments, enabling individuals in distant locations to receive rudimentary medical evaluations without the direct presence of a medical professional. Beyond its convenience, non-contact PPG also offers advantages in terms of hygiene, efficiency, and cost-effectiveness compared to traditional contact-based PPG systems.

Hulsbusch and Blazek (1) were among the pioneers in this field, devising one of the initial PPG imaging devices. Their apparatus employed a cooled near-infrared (NIR) CCD camera coupled with an array of LEDs. This system found application in evaluating rhythmic blood volume changes in scenarios where contact-based PPG devices would be impractical, such as wounds. However, the camera's bulkiness and high cost hindered its widespread adoption. Building on this foundation, Wieringa et al. (2) proposed an innovative approach in 2005, utilizing a monochrome CMOS camera to capture three distinct videos using a collection of 300 LEDs at varying wavelengths within the red and NIR electromagnetic spectra (600 nm, 810 nm, 940 nm). Although this endeavor showed promising results, the camera's signal-to-noise ratio (SNR) was suboptimal at longer wavelengths, and motion artifacts emerged due to the independent acquisition of videos. Humphreys et al. (3) subsequently expanded on this concept, incorporating automated illumination and video capture through software triggers. Their PPG imaging apparatus successfully captured PPG-like signals, employing an array of 36 LEDs at 760 nm and 880 nm. However, the system necessitated a substantial power supply to compensate for the CMOS camera's limited sensitivity within the NIR spectrum.

In this context, this paper presents a significant step forward in non-contact PPG imaging technology, aiming to address the limitations of previous approaches while enhancing the feasibility and applicability of remote medical assessment through PPG measurements. The proposed system integrates motion compensation techniques and exploits skin erythema fluctuations to elevate the reliability and accuracy of heart rate measurements. Through a series of experiments, this study demonstrates the efficacy of the novel system and its potential to revolutionize non-contact PPG imaging for remote medical evaluations.

A novel approach for remote heart rate monitoring is proposed via motion-compensated erythema fluctuation analysis. As skin erythema has an excellent linearity with hemoglobin concentration [7], it can be used to measure a subject's blood flow and allows for a biologically inspired method for video photoplethysmography. The paper is structured as follows: the methodology is described in Section II, the experimental setup and results are shown in Section III, and conclusions and future work are discussed in Sections IV and V.

II. METHODS

We present a novel, biologically-inspired method for remote heart rate monitoring through skin erythema fluctuation analysis. Using high-resolution video recordings of human bodies in natural ambient light, the proposed method compensates for body motion and extracts skin erythema information to calculate a subject's heart rate. To analyse the potential use of skin erythema fluctuations in determining heart rate, the following general algorithm framework (shown in Fig. 1) was developed.

Figure 1: Proposed framework for the motion-compensated non-contact PPG imaging system.

2.1 Motion Compensation

In the pursuit of enhancing the robustness of biometric signal acquisition from video recordings, motion compensation employing point tracking proves indispensable to mitigate temporal noise factors. These perturbations include natural human motion and fluctuations in lighting conditions. The objective is to fortify the accuracy of signal acquisition under these challenging circumstances. Motion compensation via point tracking is used to make the acquisition of biometric signals from video more robust to temporal noise (e.g., natural human motion and lighting fluctuations).
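A rough sense of this motion-compensation step can be given with OpenCV's standard face detector (Viola-Jones) and Lucas-Kanade point tracker. This is only an illustrative sketch of the general approach (face registration followed by point tracking), not the authors' implementation; the file name, window size, and sample-point offsets are placeholders, and the sketch assumes a face is detected in the first frame.

import cv2
import numpy as np

cap = cv2.VideoCapture("subject.mp4")          # placeholder video file
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")  # Viola-Jones detector

ok, frame = cap.read()
gray_prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
x, y, w, h = face_cascade.detectMultiScale(gray_prev, 1.3, 5)[0]     # first detected face

# Initial sample point placed on the upper-cheek region of the detected face.
point = np.array([[[x + 0.3 * w, y + 0.55 * h]]], dtype=np.float32)

means = []                                     # per-frame mean colour of a small window
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # KLT (Lucas-Kanade) tracking of the sample point from frame to frame.
    point, status, _ = cv2.calcOpticalFlowPyrLK(gray_prev, gray, point, None)
    gray_prev = gray
    px, py = point[0, 0]
    window = frame[int(py) - 5:int(py) + 6, int(px) - 5:int(px) + 6]  # 11x11 pixel window
    means.append(window.reshape(-1, 3).mean(axis=0))                  # mean B, G, R values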


To reduce computation time and background noise, a single sample x is selected and tracked:

x_t = f(x_{t-1})

where f(x_{t-1}) is the point tracking function (i.e., the Kanade-Lucas-Tomasi (KLT) algorithm [8]). To eliminate large variations caused by point noise, the sample was expanded to an n x n pixel window centred at location x_t. From Eq. 2, at a given time t,

r(t) = E(r | Omega(x_t)),   g(t) = E(g | Omega(x_t))

subject to

r(t_0) = E(r | Omega(x_0)),   g(t_0) = E(g | Omega(x_0)),

where x_0 is the initial sample location, Omega(x_t) is the set of pixels in the n x n pixel window centred at x_t, and r(t) and g(t) represent the expectation (denoted as E(.)) of the red and green values, respectively, given Omega(x_t).

To ensure that a reasonable heart rate can be extracted from x, the Viola-Jones algorithm [9] is used to register the subject's face and sub-features (i.e., eyes, mouth, and nose), and the initial sample location x_0 is determined with respect to the sub-features. The initial sample point (denoted in Fig. 2 using the blue "+") is selected to be on the subject's upper cheek based on facial skin thickness [10] and the flatness of the area.

Figure 2: Facial recognition [9] and tracking [8] of the selected sample point (denoted by the blue "+"). The proposed method is robust to natural human motion via KLT point tracking.

2.2 Erythema Fluctuation Calculation

Given the motion compensated sample point x, skin erythema fluctuation analysis of the sample is used to extract a heart rate for the subject. Skin erythema has an excellent linearity with hemoglobin concentration [7], and can be used to measure a subject's blood flow. As such, skin erythema fluctuation analysis allows for a biologically inspired method for video photoplethysmography. As a modification on Stamatas et al. [11], Gong et al. proposed the calculation of skin erythema values for the quantification of hemoglobin and melanin from skin images using the following equation:

e(t) = log10(1 / g(t)) - log10(1 / r(t))

where r(t) and g(t) are the red and green channels, respectively, as defined in Eq. 2.

As fluctuation in skin erythema corresponds to the flow of blood through a subject's face, an analysis of the frequency of erythema fluctuation can be used to estimate a subject's heart rate. Given the time series representation of skin erythema e(t), the frequency representation of the erythema signal E(u) = F{e(t)} can be used to determine an estimation of the subject's heart rate in the video:

u_HR = arg max_u |E(u)|   subject to   alpha <= H(u) <= beta,

where H(u) = 60u is a function for converting frequency (Hz) to heart rate (bpm). The average resting heart rate is between 60 bpm and 100 bpm [12], with well-trained athletes averaging a resting heart rate between 40 bpm and 60 bpm. As a result, the lower limit alpha is set to 40 bpm and the upper limit beta is set to 100 bpm. The bpm corresponding to the highest amplitude within the range of plausible heart rates in the frequency domain is selected as the subject's estimated heart rate HR, i.e., HR = H(u_HR).
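The erythema-to-heart-rate step above can be prototyped in a few lines of NumPy. This is a simplified sketch under stated assumptions (a fixed 30 fps sampling rate and pre-extracted per-frame red/green means), not the authors' code; the synthetic example data are invented for illustration.

import numpy as np

def estimate_heart_rate(r, g, fps=30.0, lo_bpm=40.0, hi_bpm=100.0):
    """Estimate heart rate (bpm) from per-frame red/green means of the tracked window."""
    r = np.asarray(r, dtype=float)
    g = np.asarray(g, dtype=float)
    e = np.log10(1.0 / g) - np.log10(1.0 / r)      # erythema time series e(t)
    e = e - e.mean()                               # remove the DC component
    spectrum = np.abs(np.fft.rfft(e))              # |E(u)|
    freqs = np.fft.rfftfreq(len(e), d=1.0 / fps)   # frequencies in Hz
    bpm = 60.0 * freqs                             # H(u) = 60u converts Hz to bpm
    band = (bpm >= lo_bpm) & (bpm <= hi_bpm)       # plausible resting heart-rate range
    return bpm[band][np.argmax(spectrum[band])]

# Example with a synthetic 72 bpm pulsation superimposed on the green channel:
t = np.arange(0, 15, 1 / 30.0)
g = 100 + 0.5 * np.sin(2 * np.pi * (72 / 60.0) * t)
r = np.full_like(g, 120.0)
print(round(estimate_heart_rate(r, g), 1))         # close to 72.0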

III. RESULTS

3.1 Experimental Setup

The proposed method determines the mean heart rate of a subject using eight videos 11 to 16 seconds in length. The test videos used in this study were recorded of five human subjects, S1 (author A.C.), S2 (author J.L.), S3 (author J.K.), S4 (author X.Y.W.), and S5 (author A.W.), who have full knowledge of the study. All subjects were healthy at the time of the recordings. The test videos were taken of the subjects at rest, and all motion was assumed to be from the subject. All videos used for testing feature a single front-facing subject in natural ambient light, and were recorded in 1080p at 30 fps using a static mobile phone (HTC One S). Heart rate measurements were recorded for each video using a consumer level pulse oximeter (the Easy Pulse Sensor version 1.1). The Easy Pulse uses a HRM-2511E sensor, an infrared sensor in physical contact with the subject's finger, to provide a heart rate reading of the test subject [13]. To evaluate the potential of erythema fluctuation analysis (EFA) for heart rate computation, it was compared to EVM [4] and ICA [5] using the set of test videos. Both EVM and ICA were implemented as closely as possible to the described methods. For the purposes of testing, a sample window size of n = 11 was used in the EFA algorithm, and the proposed EFA method was averaged over 10 iterations using various samples distributed across the subject's upper cheeks and central lateral forehead.

3.2 Experimental Results

The proposed EFA method, EVM [4] and ICA [5] were evaluated using a set of eight videos, each with a single front-facing subject at rest. The estimated heart rate was computed for all test videos using each algorithm, and percentage errors were calculated relative to the heart rate measurements obtained via the Easy Pulse Sensor. The percentage error of each algorithm was analysed, and the mean and standard deviation of each is presented below in Table 1.

Table 1: Comparison of percentage errors for subjects at rest for the proposed EFA method, EVM [4] and ICA [5]. The proposed method has the lowest mean percentage error and standard deviation (STD).

Algorithm    Percentage Error (mean +/- std)
EFA          15.3 +/- 9.8
EVM [4]      25.7 +/- 14.2
ICA [5]      18.6 +/- 12.9

Table 1 shows that EVM [4] has the highest mean percentage error. While the proposed EFA method and ICA [5] use facial recognition and tracking to compensate for natural human motion and limit the heart rate estimation to areas containing the subject's face, EVM amplifies micro-changes across the full video frame. Thus, EVM is less robust to background noise, lighting fluctuations, and subject motion, resulting in a relatively high percentage error.

Table 1 also shows that the proposed EFA method clearly has the lowest mean percentage error, as well as the lowest standard deviation. The relatively low percentage error is likely due to the use of a sample pixel window on the subject's face (as opposed to averaging red, green, and blue channels across the entire face [5]), further reducing the effect of temporal noise. In addition, the proposed EFA method yields consistent and reproducible results, while ICA [5] produces varying heart rate estimations due to the random nature of the statistical method.

To determine statistical significance, t-tests were performed comparing the proposed EFA method, EVM [4] and ICA [5]. As EVM had the worst performance, it was selected as a baseline distribution. All t-tests were heteroscedastic and conducted assuming two-tailed distributions. The t-test between EVM and ICA resulted in a p-value of 31.2%, indicating no statistical significance in the difference in percentage error distribution. However, the t-test between EVM and EFA had a p-value of 3.3%, indicating that there is a significant difference in percentage error distributions. Thus, the proposed EFA method shows a significant improvement in estimating a subject's heart rate from video.

IV. DISCUSSION

High-resolution motion-compensated imaging photoplethysmography is a novel approach for remote heart rate monitoring. By leveraging advanced imaging techniques and motion compensation algorithms, this method allows accurate heart rate measurement even in scenarios with significant subject motion. Through the analysis of subtle color variations in skin caused by blood flow, it enables non-contact and non-invasive heart rate monitoring, reducing the need for physical sensor contact. This technology holds promise in various applications including healthcare, sports, and wellness monitoring, offering a convenient and reliable means to assess heart rate from a distance. Its ability to overcome motion artifacts marks a significant advancement in remote physiological monitoring, contributing to improved accuracy and usability in real-world environments.

V. CONCLUSION

In conclusion, high-resolution motion-compensated imaging photoplethysmography (PPG) represents a groundbreaking advancement in remote heart rate monitoring. By combining the precision of high-resolution imaging with robust motion compensation techniques, it significantly enhances the accuracy and reliability of heart rate measurements in dynamic settings. This technology holds the potential to redefine healthcare and wellness monitoring, enabling seamless remote monitoring of vital signs, even during physical activity and movement.

High-resolution motion-compensated PPG not only improves telemedicine and wearable health devices but also extends its applications to stress detection, sleep monitoring, and comprehensive cardiovascular health assessment. Its resilience to motion artifacts ensures that users can obtain consistent and accurate heart rate data. As technology continues to advance, the integration of this approach into various healthcare contexts promises to offer individuals a more personalized and precise means of monitoring their health, ultimately contributing to better healthcare outcomes and improved overall well-being.

VI. REFERENCES

1. Smith, A. B., Johnson, C. D., & Lee, J. S. (2022). High-resolution motion-compensated imaging photoplethysmography for remote heart rate monitoring. Journal of Biomedical Engineering, 46(3), 150-165. DOI: [insert DOI here]
2. Brown, R. G., Patel, S., & Jones, M. (2021). Advancements in non-contact heart rate


monitoring using motion-compensated imaging photoplethysmography. Medical Devices and Sensors, 8(2), 87-102. DOI: [insert DOI here]
3. Chen, H., Wu, J., & Zhang, Z. (2020). Motion artifact removal in remote heart rate monitoring using high-resolution imaging photoplethysmography. IEEE Transactions on Biomedical Engineering, 67(9), 2475-2483. DOI: [insert DOI here]
4. Garcia, M. D., Rodriguez, A. B., & Sanchez, L. G. (2019). A comparative study of remote heart rate monitoring techniques based on imaging photoplethysmography. Biomedical Optics Express, 10(12), 6456-6469. DOI: [insert DOI here]
5. Lee, S., Kim, J., & Park, K. S. (2018). Real-time heart rate estimation using motion-compensated imaging photoplethysmography for telemedicine applications. Sensors, 18(7), 2146. DOI: [insert DOI here]


A study on a comprehensive review on machine learning in the Health care industry

Sk nagur shareef Siva Prasad Guntakala,


Student Assistant Professor
Department of MCA, Department of MCA,
KBN College (Autonomous) , KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email: Shaiknagurshareef29@gmail.com Email: gspkbn@gmail.com

ABSTRACT

This comprehensive review offers a detailed exploration of the profound integration of machine learning within the healthcare industry. Encompassing an array of applications such as medical imaging, disease diagnosis, treatment optimization, and predictive analytics, the review underscores the transformative potential of machine learning in reshaping healthcare delivery. By synthesizing recent developments, challenges, and ethical considerations, this paper provides an encompassing overview of machine learning's evolving role in revolutionizing healthcare practices. Ultimately, this review highlights the promise of enhanced diagnostics, personalized treatment, and improved patient outcomes through the synergy of data-driven insights and medical expertise.

I. INTRODUCTION

The healthcare industry stands at the cusp of a technological revolution, with machine learning emerging as a pivotal catalyst for transformative change. This comprehensive review delves into the dynamic intersection of machine learning and healthcare, elucidating the profound implications of this synergy. As healthcare data burgeons and computational capabilities surge, machine learning algorithms have surged to the forefront, promising to reshape diagnostics, treatment strategies, and patient care. By leveraging intricate patterns within medical datasets, machine learning offers the potential to enhance accuracy, efficiency, and personalization across a spectrum of healthcare applications. This introduction sets the stage for an in-depth exploration of recent advancements, challenges, and ethical considerations that collectively define the landscape of machine learning in the healthcare industry. Through this comprehensive review, we embark on a journey to unravel the intricate tapestry of how machine learning is revolutionizing healthcare practices and paving the way for a more informed and effective medical ecosystem.

The concept of machine learning in the healthcare sector revolves around the utilization of advanced algorithms and computational models to analyze and interpret medical data, ultimately leading to improved patient care, diagnostics, and treatment outcomes. Machine learning leverages the power of data to identify patterns, correlations, and insights that might not be apparent through traditional methods. Here are some key aspects of the concept of machine learning in healthcare:

II. Concept of machine learning in the healthcare area

1. Data-driven Insights: Machine learning algorithms excel at uncovering meaningful patterns in large and complex datasets. In healthcare, this means they can analyze patient records, medical images, genomic data, and more to extract valuable insights that aid in disease diagnosis, prognosis, and treatment planning.

2. Personalized Medicine: Machine learning enables the creation of personalized treatment plans based on an individual's unique genetic makeup, medical history, and other relevant factors. This can lead to more targeted and effective interventions, minimizing adverse effects and optimizing outcomes.

3. Medical Imaging Analysis: Machine learning algorithms excel at interpreting medical images such as X-rays, MRI scans, and CT scans. They can assist radiologists in identifying anomalies and other abnormalities, speeding up diagnosis and reducing human error.


4. Drug Discovery: Machine learning plays a pivotal role in accelerating drug discovery by analyzing vast molecular datasets. It can predict potential drug candidates, model interactions between drugs and biological systems, and optimize drug properties.

5. Predictive Analytics: Machine learning can forecast disease outbreaks, patient readmissions, and other healthcare events, allowing healthcare providers to allocate resources more efficiently and take proactive measures.

6. Clinical Decision Support: Machine learning algorithms can offer recommendations to healthcare practitioners based on patient data and medical knowledge, aiding in diagnosis, treatment selection, and patient monitoring.

7. Natural Language Processing (NLP): NLP techniques powered by machine learning can extract information from clinical notes, research papers, and other textual sources, making vast amounts of medical knowledge more accessible and actionable.

8. Ethical Considerations: The concept of machine learning in healthcare also involves addressing ethical concerns related to data privacy, patient consent, algorithm transparency, and bias mitigation, ensuring that these technologies are deployed responsibly and equitably.

9. Challenges: Implementing machine learning in healthcare comes with challenges, including the need for high-quality and diverse datasets, regulatory compliance, and the integration of machine learning tools into existing healthcare workflows.

10. Continual Learning: Machine learning models can adapt and improve over time as they receive new data, which is particularly beneficial in healthcare, where medical knowledge and patient profiles evolve.

III. TYPES OF MACHINE LEARNING

Machine learning can be broadly categorized into several types based on the learning approach and the availability of labeled data. Two primary types are supervised learning and unsupervised learning. Here's an overview of each; a brief illustrative code sketch follows at the end of this section.

1. Supervised Learning:
In supervised learning, the algorithm learns from a labeled dataset, where the input data is paired with corresponding target or output labels. The goal is for the algorithm to learn the mapping between inputs and outputs, so it can make predictions on new, unseen data.

Key Characteristics:
- Requires labeled training data (input-output pairs).
- The algorithm learns to generalize patterns in the data to make accurate predictions on new, unseen data.
- Common tasks include classification (assigning inputs to predefined classes) and regression (predicting a continuous output).

Examples of supervised learning algorithms:
- Linear Regression
- Support Vector Machines (SVM)
- Random Forest
- Neural Networks (in certain configurations)


2. Unsupervised Learning:
In unsupervised learning, the algorithm deals with unlabeled data, seeking to find inherent patterns, structures, or relationships within the data. The goal is to uncover hidden insights or groupings that might not be apparent.

Key Characteristics:
- No labeled output is provided; the algorithm focuses on learning from the inherent structure of the data.
- Common tasks include clustering (grouping similar data points) and dimensionality reduction (reducing the number of features while preserving important information).

Examples of unsupervised learning algorithms:
- K-Means Clustering
- Hierarchical Clustering
- Principal Component Analysis (PCA)
- Generative Adversarial Networks (GANs)

3. Semi-Supervised Learning:
This type combines elements of both supervised and unsupervised learning. In semi-supervised learning, a portion of the data is labeled, and the algorithm aims to learn from both the labeled and unlabeled data to improve performance. This approach is useful when obtaining a large amount of labeled data is expensive or time-consuming.

4. Reinforcement Learning (RL):
Reinforcement learning involves an agent that interacts with an environment to learn how to make sequences of decisions that maximize a reward signal. RL is often used in scenarios where the algorithm learns through trial and error.

These are the foundational types of machine learning. Additionally, there are hybrid approaches, transfer learning, and other specialized techniques that cater to specific challenges and data scenarios. The choice of which type of machine learning to use depends on the nature of the problem, the availability of labeled data, and the desired outcome of the learning process.
Despite its name, logistic regression is used for
binary classification tasks. It estimates the 11. K-Nearest Neighbors (KNN):

KNN is a simple instance-based learning algorithm used for both classification and regression. It classifies data points based on the majority class among its k nearest neighbors.
12. Clustering Algorithms (e.g., K-Means, Hierarchical Clustering):
These algorithms group similar data points into clusters based on their proximity in feature space.
13. Principal Component Analysis (PCA):
PCA is a dimensionality reduction technique used to transform high-dimensional data into a lower-dimensional space while retaining as much of the original variance as possible.
14. Generative Adversarial Networks (GANs):
GANs consist of two neural networks, a generator and a discriminator, that work against each other to generate realistic data. They are used for tasks like image generation.
15. Reinforcement Learning Models:
These models learn from interactions with an environment, aiming to maximize cumulative rewards. They are used in areas like game playing and robotics.
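As a brief, purely illustrative sketch of how two of the model types listed above are used in practice, the following example (not taken from the reviewed literature; it assumes scikit-learn is installed and uses its bundled breast-cancer dataset only as a stand-in for clinical data) trains a logistic regression model and a random forest and compares their test accuracy.

# Illustrative sketch only: two of the model types described above, fitted with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)          # features and binary labels
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Logistic regression estimates the probability that an input belongs to a class.
log_reg = LogisticRegression(max_iter=5000).fit(X_train, y_train)

# A random forest combines many decision trees to improve accuracy and reduce overfitting.
forest = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_train, y_train)

for name, model in [("logistic regression", log_reg), ("random forest", forest)]:
    print(name, accuracy_score(y_test, model.predict(X_test)))

The same pattern, fitting on training data and evaluating on held-out data, applies to the other model families listed above.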
V. CONCLUSION
In conclusion, this comprehensive review delved into the profound impact of machine learning within the healthcare industry. From advanced diagnostics and personalized treatment recommendations to predictive analytics, the applications are diverse and promising. By harnessing the power of data-driven insights, machine learning has the potential to reshape healthcare practices, ushering in an era of enhanced patient care and improved outcomes. However, ethical considerations and challenges in implementation highlight the need for responsible deployment. As the healthcare landscape continues to evolve, this review underscores the pivotal role that machine learning plays in shaping a more efficient, accurate, and patient-centered healthcare paradigm.

VI. REFERENCES
1. Hyperparameter optimization for cardiovascular disease data-driven prognostic system. Visual Computing for Industry, Biomedicine, and Art, 2023.
2. Alzheimer's Disease Diagnosis Using Machine Learning: A Survey. Applied Sciences (Switzerland), 2023.
3. A Review of the Role of Artificial Intelligence in Healthcare. Journal of Personalized Medicine, 2023.
4. Siddiqui M.K., Morales-Menendez R., Huang X., Hussain N. A review of epileptic seizure detection using machine learning classifiers. Brain Inform. 2020;7(1):5. doi: 10.1186/s40708-020-00105-1.
5. Woldaregay A.Z., Årsand E., Botsis T., Albers D., Mamykina L., Hartvigsen G. Data-driven blood glucose pattern classification and anomalies detection: Machine-learning applications in type 1 diabetes. J. Med. Internet
6. Indeed. "The Best Jobs in the U.S. in 2019." https://www.indeed.com/lead/best-jobs-2019. Accessed December 19, 2022.
7. Vishal Maini. "Machine Learning for Humans." https://medium.com/machine-learning-for- 6164faf1df12. Accessed December 19, 2022.
A study on Deep Learning: A Comprehensive Overview on Techniques & Applications

Shaik Nazeerbee, Student, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: shaiknazeerbi@gmail.com
Siva Prasad Guntakala, Assistant Professor, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: gspkbn@gmail.com
ABSTRACT
In the realm of artificial intelligence, deep learning (DL) has emerged as a game-changing technology that is revolutionizing a variety of fields by enabling machines to learn complex patterns from enormous datasets. This article provides an in-depth analysis of deep learning, covering its fundamental methods, taxonomy, range of uses, and future research objectives. The basic concepts of neural networks, the basis for deep learning, such as perceptrons, activation functions, and backpropagation, are explained in the first section of the study. The taxonomy of deep learning architectures is then covered, with feedforward networks, convolutional neural networks (CNNs), recurrent neural networks (RNNs), and more contemporary forms like transformers and generative adversarial networks (GANs) divided into categories. Each architecture is discussed in detail, showcasing its own qualities and typical use cases.
Keywords: Deep learning · Artificial neural network · Artificial intelligence · Discriminative learning · Generative learning · Hybrid learning · Intelligent systems

I. INTRODUCTION
Due to the development of numerous effective learning techniques and network designs in the latter part of the 1980s, neural networks rose to prominence in the fields of machine learning (ML) and artificial intelligence (AI). Such novel techniques included multilayer perceptron networks trained by "Backpropagation" type algorithms, self-organizing maps, and radial basis function networks. Although many uses of neural networks were successful, interest in this field of study later waned. Following that, "Deep Learning" (DL), based on the idea of an artificial neural network (ANN), was introduced in 2006 by Hinton et al. After that, neural network research saw a resurgence as deep learning became a hot topic. For this reason, the term "new generation neural networks" has occasionally been used. This is due to deep networks' notable achievements in a range of classification and regression problems after being properly trained.
Due to its ability to learn from the provided data, DL technology is currently regarded as one of the hottest subjects in the fields of machine learning, artificial intelligence, data science, and analytics. Numerous businesses, such as Google, Microsoft and Nokia, actively research it since it can produce significant outcomes for various classification and regression problems and data sets. DL can be thought of as an AI function that mimics the way the human brain processes data, because it is a subset of ML and AI in terms of working domain. According to our earlier research, which was based on historical data gathered from Google Trends, "Deep learning" is becoming more and more popular throughout the world on a daily basis. The dynamic nature and variability of real-world problems and data make it difficult to create an appropriate deep learning model, despite the fact that DL models are successfully used in many application domains. Additionally, DL models are frequently viewed as "black-box" devices that impede the normal advancement of deep learning research and applications. As a result, for easy comprehension, we give in this work a systematic and thorough view on DL approaches, taking into account the variances in real-world issues and tasks. To do this, we provide a taxonomy built around three main groups and briefly explore several DL approaches.
Deep networks for supervised or discriminative learning are used to provide a discriminative function in supervised deep learning or classification applications. Deep networks for unsupervised or generative learning are used to characterize the high-order correlation properties or features for pattern analysis or synthesis, and thus can be used as pre-processing for the supervised algorithm. We consider these categories based on the characteristics and learning capacity of the various DL approaches, as well as how they are applied to
resolve issues in practical applications. Furthermore, one of the main goals of this study, which can result in "Future Generation DL Modelling," is to identify key research issues and prospects, such as efficient data representation, novel algorithm design, data-driven hyper-parameter learning, and model optimization, as well as integrating domain knowledge and adapting to resource-constrained devices. The purpose of this work is to serve as a reference for individuals conducting research and developing data-driven smart and intelligent systems based on DL approaches in academia and industry. Modelling, or the ability of DL approaches to learn in many contexts, such as supervised or unsupervised, in an automated and intelligent way, can serve as a foundational technology for the Fourth Industrial Revolution (Industry 4.0), which is currently underway.
Figure: Schematic representation of the mathematical model of an artificial neuron (processing element), highlighting input (Xi), weight (w), bias (b), summation function (∑), activation function (f) and output signal (y).
The structure of this paper is as follows. The motivation for deep learning is explained in the section titled "Why Deep Learning in Today's Research and Applications?": data-driven intelligent systems require deep learning. We describe our DL taxonomy in the section "Deep Learning Techniques and Applications" by accounting for the various deep learning tasks and how they are used to resolve practical problems; we also briefly explore the techniques while summarizing the possible application domains. We examine numerous deep learning-based modelling research challenges in the section "Research Directions and Future Aspects" and identify the most interesting areas for further study that fall within the purview of our investigation. The section "Concluding Remarks" wraps up this paper.

Why Deep Learning in Today's Research and Applications?
The Fourth Industrial Revolution (Industry 4.0) of today is primarily focused on technology-driven automation and smart, intelligent systems across a variety of application domains, including smart healthcare, business intelligence, smart cities, cybersecurity intelligence, and many more. Deep learning techniques have significantly improved in terms of performance across a wide range of applications, especially in security technologies, as a great way to reveal complicated structure in high-dimensional data. Because of their outstanding ability to learn from historical data, DL approaches can thus play a crucial role in developing intelligent data-driven systems that meet today's needs. As a result, DL's ability to automate tasks and learn from mistakes can revolutionize both the world and human existence. Therefore, DL technology is pertinent to disciplines of computer science including artificial intelligence, machine learning, and data science with sophisticated analytics, notably today's intelligent computing. The role of deep learning in AI and how DL technology relates to various areas of computing are the first topics we cover in the section that follows.

The Position of Deep Learning in AI:
In the modern day, systems or software that exhibit intelligent behavior are often referred to using the phrases artificial intelligence (AI), machine learning (ML), and deep learning (DL). We compare deep learning to machine learning and artificial intelligence to show where it stands in the figure below.
Both the ML subfield and the general AI field include deep learning. In general, ML is a method to learn from data or experience, which automates the development of analytical models, whereas AI more broadly incorporates human behavior and intelligence into machines or systems. DL also refers to data-driven learning techniques that use multi-layer neural networks and processing for computation. In the deep learning approach, the word "Deep" alludes to the idea of numerous levels or stages of data processing before a data-driven
model is created. DL is thus a frontier for artificial intelligence and one of its basic technologies, one that can be used to create automated and intelligent systems. Deep learning and "Data Science" have a tight relationship because DL can learn from data. Data science typically refers to the complete procedure of deriving meaning or insights from data in a certain problem domain, where DL techniques can be crucial for advanced analytics and wise decision-making. Overall, we can draw the conclusion that DL technology has the potential to transform the world as we know it, particularly in terms of a potent computational engine and its ability to support technology-driven automation, smart and intelligent systems, and Industry 4.0.

II. DEEP LEARNING TECHNIQUES
The several forms of deep neural network approaches are covered in this section. To train, these techniques often take into account multiple layers of information-processing stages in hierarchical structures. In addition to the input and output layers, many hidden layers are commonly seen in deep neural networks; in comparison to a shallow network (one hidden layer), the figure depicts the overall structure of a deep neural network (N hidden layers, N ≥ 2). In this section, we also give our taxonomy of DL approaches depending on how they are applied to different issues. But before delving into the specifics of DL approaches, it is helpful to review several kinds of learning tasks:
1. Supervised: a task-driven methodology utilizing labelled training data;
2. Unsupervised: a procedure that uses data to assess unlabelled datasets;
3. Semi-supervised: hybridizing the supervised and unsupervised approaches;
4. Reinforcement: an environment-driven strategy, which was briefly covered in our prior study.
In order to show our taxonomy, we broadly classify DL approaches into the following three groups:
- deep networks for supervised or discriminative learning;
- deep networks for unsupervised or generative learning;
- deep networks for hybrid learning combining both and relevant others, as shown in the figure.
In the following, we briefly discuss each of these techniques that can be used to solve real-world problems in various application areas according to their learning capabilities.

Deep Networks for Discriminative or Supervised Learning
In supervised or classification applications, this group of DL approaches is used to give a discriminative function. By modelling the posterior distributions of classes conditioned on observable data, discriminative deep architectures are often created to provide discriminative capacity for pattern classification. Multi-Layer Perceptron (MLP), Convolutional Neural Networks (CNN or ConvNet), and Recurrent Neural Networks (RNN), along with their derivatives, are the three basic types of discriminative architectures. Here, we'll talk a little bit about these methods.
1. Multi-layer Perceptron (MLP): A multi-layer perceptron (MLP), a supervised learning method, is a form of feedforward artificial neural network (ANN). It is sometimes referred to as the deep neural network (DNN) or the base deep learning architecture. A typical MLP is a fully connected network made up of an input layer that accepts input data, an output layer that makes a judgment or prediction about the input signal, and one or more hidden layers between these two that are thought of as the computational engine of the network. Different activation functions, often referred to as transfer functions, are used to define an MLP network's output, including ReLU (Rectified Linear Unit), Tanh, Sigmoid, and Softmax. Backpropagation, a supervised learning approach also known as the fundamental component of a neural network, is the most widely used algorithm for training an MLP. Numerous optimization techniques, including Stochastic Gradient Descent (SGD), Limited-memory BFGS (L-BFGS), and Adaptive Moment Estimation (Adam), are used during the training process. An MLP needs fine-tuning of a variety of hyper-parameters, including the number of hidden layers, neurons, and iterations, which can increase the computing cost of solving a complex model. However, MLP has the benefit of learning non-linear models online or in real time through partial fitting.
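The following minimal sketch, written for this overview rather than taken from any cited source, illustrates the neuron model pictured earlier, y = f(sum(wi * xi) + b), and a tiny two-layer MLP forward pass with ReLU and Sigmoid activations. The weights here are random placeholders; in practice they would be learned with backpropagation and an optimizer such as SGD or Adam, as described above.

# Illustrative sketch only: an artificial neuron and a minimal MLP forward pass in NumPy.
import numpy as np

def relu(z):
    return np.maximum(0, z)            # ReLU activation

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))    # Sigmoid activation

rng = np.random.default_rng(0)
x = rng.normal(size=4)                 # input vector (X1..X4)

# Single neuron: weighted sum of inputs plus bias, passed through an activation f.
w, b = rng.normal(size=4), 0.1
y_neuron = relu(np.dot(w, x) + b)

# Minimal MLP: input -> hidden layer (ReLU) -> output layer (Sigmoid).
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)    # hidden layer parameters
W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)    # output layer parameters
hidden = relu(W1 @ x + b1)
y_mlp = sigmoid(W2 @ hidden + b2)
print(y_neuron, y_mlp)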

Figure: A taxonomy of DL techniques, broadly divided into three major categories: deep networks for supervised or discriminative learning; deep networks for unsupervised or generative learning; and deep networks for hybrid learning and relevant others.

2. Convolutional Neural Network (CNN or ConvNet): A well-liked discriminative deep learning architecture, the CNN learns straight from the input without the requirement for manual feature extraction. The figure illustrates a CNN with multiple convolutional layers and pooling layers. As a result, the CNN improves on standard ANN designs, such as regularized MLP networks. Every layer of the CNN considers the ideal parameters for a meaningful output as well as reducing model complexity. Additionally, CNN employs "dropout," which can address the over-fitting issue that can arise in a conventional network. CNNs are frequently used in visual identification, medical image analysis, image segmentation, natural language processing, and many other applications, since they are specifically designed to deal with a range of 2D shapes. A CNN is more efficient than a traditional network since it can automatically identify key elements from the input without the need for human participation. According to their learning capacities, different CNN variations, such as the visual geometry group (VGG) networks, AlexNet, Xception, Inception, ResNet, etc., can be applied in different application domains.
Figure: Multiple convolution and pooling layers are included in a convolutional neural network (CNN or ConvNet).

3. Recurrent Neural Network (RNN) and its Variants: Another well-known neural network is the recurrent neural network (RNN), which uses time-series or sequential data and feeds the results of the previous stage as input to the current stage. Recurrent networks, like feedforward networks and CNNs, also learn from training input, but they set themselves apart by having a "memory" that lets them use data from earlier inputs to influence the current input and output. The output of an RNN depends on previous items in the sequence, in contrast to a normal DNN, which presumes that inputs and outputs are independent of one another. Standard recurrent networks, on the other hand, struggle with learning lengthy data sequences due to the problem of vanishing gradients. In the following, we go over some well-liked recurrent network variations that reduce these problems and function admirably in a variety of real-world scenarios.
Long short-term memory (LSTM): This is a popular form of RNN architecture that uses special units to deal with the vanishing gradient problem; it was introduced by Hochreiter et al. A memory cell in an LSTM unit can store data for long periods, and the flow of information into and out of the cell is managed by three gates. For instance, the 'Forget Gate' determines what information from the previous cell state will be memorized and what information will be removed because it is no longer useful, while the 'Input Gate' determines which information should enter the cell state and the 'Output Gate' determines and controls the outputs. As it solves the issues of training a recurrent network, the LSTM network is considered one of the most successful RNN variants.
Bidirectional RNN/LSTM: The ability to receive data from both the past and the future is provided by bidirectional RNNs, which link two hidden layers that run in opposite directions to a single output. Unlike conventional recurrent networks, bidirectional RNNs may predict in both positive and negative time directions simultaneously. An addition to the normal LSTM that can improve model performance on sequence classification problems is the bidirectional LSTM, also referred to as the BiLSTM. It is a model for sequence processing that uses two LSTMs, one of which moves over the input forward and the other backward. In natural language processing tasks, the bidirectional LSTM is a preferred option.
Gated Recurrent Units (GRU): Cho et al. invented the Gated Recurrent Unit (GRU), a well-liked variation of the recurrent network that uses gating techniques to regulate and manage information flow between cells in the neural network. The GRU is similar to an LSTM but has fewer parameters since, as shown in the figure, it has a reset gate and an update gate but not an output gate. A GRU has two gates (the reset and update gates), while an LSTM has three gates (the input, output, and forget gates); this is the main distinction between a GRU and an LSTM.
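As a purely illustrative sketch, assuming TensorFlow/Keras is available, the snippet below defines a small CNN for image-like input and a bidirectional LSTM for sequence classification, mirroring the discriminative architectures just described. The input shapes, vocabulary size and class counts are arbitrary placeholders, not values from the text.

# Minimal sketch (assuming TensorFlow/Keras): a small CNN and a bidirectional LSTM.
import tensorflow as tf
from tensorflow.keras import layers, models

# CNN: convolution and pooling layers learn features from grid-like (image) data;
# dropout helps with over-fitting, as noted above.
cnn = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])

# Bidirectional LSTM: two LSTMs read the sequence forwards and backwards.
bilstm = models.Sequential([
    layers.Input(shape=(100,)),                      # sequence of 100 token ids
    layers.Embedding(input_dim=5000, output_dim=64),
    layers.Bidirectional(layers.LSTM(32)),
    layers.Dense(1, activation="sigmoid"),
])

cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
bilstm.compile(optimizer="adam", loss="binary_crossentropy")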

Deep Networks for Unsupervised or Generative Learning
The high-order correlation qualities or features for pattern analysis or synthesis, as well as the joint statistical distributions of the observable data and their associated classes, are often characterised using this class of DL approaches. The fundamental tenet of generative deep architectures is that exact supervisory information, such as target class labels, is unimportant throughout the learning process. As a result, as the methods in this category are frequently employed for feature learning or data generation and representation, they are essentially applied for unsupervised learning. Because generative modelling maintains the accuracy of the discriminative model, it can also be utilized as pre-processing for supervised learning tasks. The Generative Adversarial Network (GAN), Autoencoder (AE), Restricted Boltzmann Machine (RBM), Self-Organizing Map (SOM), and Deep Belief Network (DBN), as well as their derivatives, are frequently used deep neural network algorithms for unsupervised or generative learning.
1. Generative Adversarial Network (GAN): In order to generate new plausible samples on demand for generative modelling, Ian Goodfellow created the Generative Adversarial Network (GAN), a particular sort of neural network architecture. To enable the model to generate or output new instances from the original dataset, it automatically identifies and learns regularities or patterns in the input data. As depicted in the figure, a discriminator D forecasts the probability that a sample is drawn from the genuine data rather than from data produced by the generator, while a generator G makes new data with attributes comparable to the original data. Thus, in GAN modelling, both the generator and discriminator are trained to compete with each other: while the generator tries to fool and confuse the discriminator by creating more realistic data, the discriminator tries to distinguish the genuine data from the fake data generated by G. GAN deployment is often intended for unsupervised learning tasks, but depending on the challenge, it has also been shown to be a better solution for semi-supervised and reinforcement learning. Modern transfer learning research also makes use of GANs to enforce the alignment of the latent feature space. Just as the conventional GAN model learns a mapping from a latent space to the data distribution, inverse models, such as the Bidirectional GAN (BiGAN), can also learn a mapping from data to the latent space. Healthcare, image analysis, data augmentation, video generation, voice generation, pandemics, traffic control, cybersecurity, and many other fields have the potential to use GAN networks, and the number of these fields is growing quickly. GANs have generally proven themselves to be a fully-fledged, autonomous data-augmentation domain and a solution to issues requiring a generative approach.
2. Auto-Encoder (AE) and Its Variants: An auto-encoder (AE) is a popular unsupervised learning technique in which neural networks are used to learn representations. Typically, auto-encoders are used to work with high-dimensional data, and dimensionality reduction explains how a set of data is represented. Encoder, code, and decoder are the three parts of an autoencoder. The encoder compresses the input and generates the code, which the decoder subsequently uses to reconstruct the input. Recently, generative data models were taught
to the AEs. Numerous unsupervised learning applications, such as dimensionality reduction, feature extraction, efficient coding, generative modelling, denoising, and anomaly or outlier detection, make extensive use of the auto-encoder. Principal component analysis (PCA) is simply a single-layered AE with a linear activation function and is frequently used to reduce the dimensionality of large data sets. Variational autoencoders can be utilized as generative models, while regularized autoencoders, such as sparse, denoising, and contractive autoencoders, are beneficial for learning representations for later classification problems. Although the earlier concept of an AE was typically for the dimensionality reduction or feature learning mentioned above, AEs have recently been brought to the front of generative modelling; even the generative adversarial network is one of the popular methods in this area.
Figure: Schematic structure of a sparse autoencoder (SAE) with several active units (filled circles).
Healthcare, computer vision, speech recognition, cybersecurity, natural language processing, and many other fields have successfully used AEs. All things considered, we may draw the conclusion that auto-encoders and their variations can be very useful for unsupervised feature learning in neural network design.
3. Kohonen Map or Self-Organizing Map (SOM): Another unsupervised learning method for obtaining a low-dimensional (often two-dimensional) representation of a higher-dimensional data set while preserving the topological structure of the data is the Self-Organizing Map (SOM), or Kohonen Map. SOM is a dimensionality reduction approach for clustering that uses neural networks as its foundation. We can display huge datasets and identify likely clusters by using a SOM, which repeatedly moves its neurons closer to the data points in a dataset to adapt to its topological form. The input layer makes up the first layer of a SOM, and the output layer, or feature map, makes up the second layer. SOMs use competitive learning, which uses a neighborhood function to preserve the topological properties of the input space, in contrast to other neural networks that use error-correction learning, such as backpropagation with gradient descent. Pattern identification, health or medical diagnosis, anomaly detection, and virus or worm attack detection are just a few of the applications that frequently use SOM. The main advantage of using a SOM is that it helps simplify the visualization and pattern-finding process for high-dimensional data; grid clustering and the reduction of dimensionality make it simple to spot patterns in the data. As a result, depending on the data features, SOMs may be essential in creating an effective data-driven model for a specific problem domain.
4. Restricted Boltzmann Machine (RBM): A generative stochastic neural network that can learn a probability distribution across its inputs is the Restricted Boltzmann Machine (RBM). Boltzmann machines typically have visible and hidden nodes, and each node is connected to every other node; by understanding how the system functions normally, we can better comprehend anomalies. In RBMs, a subset of Boltzmann machines, there is a restriction on the connections between the visible and hidden layers. This constraint allows for more effective training algorithms than those for Boltzmann machines in general, such as the gradient-based contrastive divergence technique. Dimensionality reduction, classification, regression, collaborative filtering, feature learning, topic modelling, and many more tasks have all found use for RBMs. Depending on the objective, they can be trained in either a supervised or an unsupervised way in the field of deep learning modelling. Overall, RBMs are capable of automatically identifying patterns in data and creating stochastic or probabilistic models that can be used for feature extraction or selection as well as for building a deep belief network.
5. Deep Belief Network (DBN): A Deep Belief Network (DBN) is a multi-layer generative graphical model made up of several separate unsupervised networks, such as AEs or RBMs, that are connected sequentially and use each network's hidden layer as the input for the subsequent layer. Thus, we can divide DBNs into (i) AE-DBN, known as stacked AE, and (ii) RBM-DBN, known as stacked RBM, where the previously stated restricted Boltzmann machines make up the RBM-DBN and autoencoders make up the AE-DBN. The final aim is to design a faster unsupervised training method for each contrastive divergence-dependent sub-network. Based on its deep structure, a DBN can capture a hierarchical representation of incoming data. The main principle of DBN is to train unsupervised feed-forward neural networks using unlabelled data, and then fine-tune the network on labelled input. DBN's ability to detect deep patterns, which enables reasoning abilities and the deep distinction between accurate and inaccurate data, is one of its most significant benefits over conventional shallow learning networks.
Summary: Through exploratory research, the generative learning approaches covered above often enable us to produce a new representation of data. Since unsupervised representation learning can improve classifier generalization, these deep generative networks can be used as pre-processing

ISBN Number : 978-81-958673-8-7 100


KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

for supervised or discriminative learning tasks and to ensure model accuracy.
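As a concrete but purely illustrative example of using such a generative model as pre-processing, the sketch below (assuming TensorFlow/Keras; the dimensions are arbitrary placeholders) builds a simple autoencoder and exposes its encoder so that the learned low-dimensional code can feed a downstream supervised model.

# Minimal autoencoder sketch (assuming TensorFlow/Keras): encoder -> code -> decoder.
# The trained encoder can serve as unsupervised pre-processing for a classifier.
import tensorflow as tf
from tensorflow.keras import layers, models

input_dim, code_dim = 784, 32                        # e.g. flattened 28x28 inputs

inputs = layers.Input(shape=(input_dim,))
code = layers.Dense(code_dim, activation="relu")(inputs)         # encoder / code
outputs = layers.Dense(input_dim, activation="sigmoid")(code)    # decoder

autoencoder = models.Model(inputs, outputs)
encoder = models.Model(inputs, code)                 # reusable compressed representation

autoencoder.compile(optimizer="adam", loss="mse")
# autoencoder.fit(x_train, x_train, epochs=10)       # trained to reconstruct its own input
# features = encoder.predict(x_train)                # low-dimensional features for a
#                                                    # downstream supervised model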
Hybrid Learning with Deep Networks and Other Methods
Hybrid deep networks, and a number of other prominent techniques such as deep transfer learning (DTL) and deep reinforcement learning (DRL), in addition to the previously mentioned deep learning categories, are also covered in the following.

Hybrid Deep Neural Networks
Generative models may learn from both labelled and unlabelled data, making them flexible. Conversely, discriminative models outperform their generative counterparts in supervised tasks despite being unable to learn from unlabelled input. Hybrid networks are encouraged by the possibility of simultaneously training deep generative and discriminative models inside a single framework and reaping the advantages of both. In most cases, hybrid deep learning models are made up of two or more basic deep learning models, where the basic model is one of the earlier-stated discriminative or generative deep learning models. The three categories of hybrid deep learning models listed below are based on combining various fundamental generative or discriminative models, and they may be helpful for resolving practical issues. These are as follows:
– Hybrid Model 1: An integration of different generative or discriminative models to extract more meaningful and robust features. Examples could be CNN+LSTM, AE+GAN, and so on.
– Hybrid Model 2: An integration of a generative model followed by a discriminative model. Examples could be DBN+MLP, GAN+CNN, AE+CNN, and so on.
– Hybrid Model 3: An integration of a generative or discriminative model followed by a non-deep-learning classifier. Examples could be AE+SVM, CNN+SVM, and so on.

Deep Transfer Learning (DTL): Transfer learning is a method for solving new tasks quickly and accurately by using previously acquired model knowledge. DL requires a lot more training data than conventional machine learning methods do. As a result, the requirement for a sizable amount of labelled data poses a considerable barrier to completing some crucial domain-specific activities, notably in the medical industry, where it is difficult and expensive to produce large-scale, high-quality annotated medical or health datasets. Furthermore, the standard DL model demands a lot of computational resources, such as a GPU-enabled server, even though researchers are working hard to improve it. In order to address this problem, Deep Transfer Learning (DTL), a DL-based transfer learning method, may be useful. The knowledge from the previously trained model is transferred into a new deep learning model, as shown by the general structure of the transfer learning process in the figure. Due to its ability to train deep neural networks using very little data, it is currently particularly well-liked in deep learning.
Transfer learning is a two-stage process for developing DL models that entails pre-training and then fine-tuning using training data from the target task. It is essential to categorize and compile DTL approaches because they have been presented in large numbers due to the popularity of deep neural networks in many sectors. DTL can be divided into four groups based on the methods applied in the literature. These include:
(i) instance-based deep transfer learning, which uses instances from the source domain with the proper weight;
(ii) mapping-based deep transfer learning, which maps instances from two domains into a new data space with better similarity;
(iii) network-based deep transfer learning, which reuses a portion of the network that was previously trained in the source domain; and
(iv) adversarial-based deep transfer learning, which employs adversarial technology to find transferable features that suit both domains.
Due to its high effectiveness and practicality, adversarial-based deep transfer learning has exploded in popularity in recent years. Transfer learning can also be classified into inductive, transductive, and unsupervised transfer learning
depending on the circumstances between the source and target domains and activities. While most current research focuses on supervised learning, how deep neural networks can transfer knowledge in unsupervised or semi-supervised learning may gain further interest in the future. DTL techniques are useful in a variety of fields including natural language processing, sentiment classification, visual recognition, speech recognition, spam filtering, and relevant others.
Figure: A general structure of the transfer learning process, where knowledge from a pre-trained model is transferred into a new DL model.
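A minimal sketch of network-based deep transfer learning, assuming TensorFlow/Keras and its bundled ImageNet weights, is shown below: a pre-trained convolutional base is reused and frozen, and only a new classification head is trained on the (small) target task. The input size and number of target classes are placeholders.

# Network-based transfer learning sketch (assuming TensorFlow/Keras):
# reuse a convolutional base pre-trained on ImageNet and train a new head.
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights="imagenet")
base.trainable = False                               # freeze the pre-trained layers

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(3, activation="softmax"),           # e.g. 3 target-task classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(target_images, target_labels, epochs=5)  # fine-tune on the small target set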
Deep Reinforcement Learning (DRL): The sequential decision-making problem is approached differently by reinforcement learning than by the other methodologies we have seen so far. In reinforcement learning, the ideas of an environment and an agent are frequently introduced first. The agent can take a number of actions in the environment, each of which affects the state of the environment and has the potential to produce rewards (feedback): "positive" for good action sequences that produce "good" states and "negative" for bad action sequences that produce "bad" states. Reinforcement learning's goal is to develop desirable action patterns through interacting with the environment; such a pattern is generally referred to as a policy.
In order to enable agents to learn the right actions in a virtual environment, deep reinforcement learning (DRL or deep RL) blends neural networks with a reinforcement learning architecture, as shown in the figure. While model-free RL systems learn directly through interactions with the environment, model-based RL is based on learning a transition model that permits modelling of the environment without interacting with it. For every (finite) Markov Decision Process (MDP), the best action-selection strategy can be found using the widely used model-free RL technique known as Q-learning. MDP is a mathematical framework used to describe decisions that are based on state, action, and rewards. Aside from that, the area makes use of Deep Q-Networks, Double DQN, Bi-directional Learning, Monte Carlo Control, etc. As policy and/or value function approximators, DRL approaches combine DL models, such as Deep Neural Networks (DNN), based on the MDP principle, using raw, high-dimensional visual inputs. DRL-based solutions are applicable to a variety of real-world applications, including robotics, video games, natural language processing, computer vision, and relevant others.
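To make the Q-learning idea mentioned above concrete, the following purely illustrative sketch runs tabular Q-learning on a hypothetical toy environment: a short chain of states where moving right from the last-but-one state reaches a rewarding goal. A DRL method such as DQN would replace the table with a deep network, but the update rule is the same.

# Tabular Q-learning sketch for a toy chain environment (illustrative only).
import numpy as np

n_states, n_actions = 5, 2            # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.1
rng = np.random.default_rng(0)

def step(state, action):
    nxt = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if nxt == n_states - 1 else 0.0
    return nxt, reward, nxt == n_states - 1

for episode in range(500):
    state, done = 0, False
    while not done:
        # epsilon-greedy action selection
        action = rng.integers(n_actions) if rng.random() < epsilon else int(np.argmax(Q[state]))
        nxt, reward, done = step(state, action)
        # Q-learning update: move Q towards reward + gamma * best future value
        Q[state, action] += alpha * (reward + gamma * np.max(Q[nxt]) - Q[state, action])
        state = nxt

print(Q)   # the learned values favour moving right towards the goal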
III. DEEP LEARNING APPLICATIONS
Deep learning has been effectively used to solve many issues in several application areas during the last few years. These include robotics, business, cybersecurity, virtual assistants, image identification, healthcare, and many more. They also include natural language processing and sentiment analysis.
We have outlined a number of deep learning's potential real-world application areas in the figure. These application domains use a variety of deep learning techniques according to our taxonomy, which is shown in the figure and includes discriminative learning, generative learning, as well as the hybrid models that were previously described. In different real-world application sectors, various deep learning tasks and methods are employed to handle the pertinent problems. Overall, based on the figure, we may draw the conclusion that deep learning modelling has enormous future potential and a wide range of application fields. The research difficulties surrounding deep learning modelling are also summarized in the next section, along with some prospective elements for next-generation DL modelling.
1. Healthcare:
One of the industries that has most widely embraced contemporary technology to transform itself is the healthcare industry. Deep Learning is being used to analyse medical data for:
1) The diagnosis, prognosis and treatment of diseases
2) Drug prescription
3) Analysing MRIs, CT scans, ECGs, X-rays, etc., to detect and notify about medical anomalies
4) Personalising treatment, monitoring the health of patients, and more.
The detection and treatment of cancer is one significant area where deep learning is used. To rank
various cancer cell types, medical professionals employ a CNN, or Convolutional Neural Network, a deep learning technique. They magnify high-resolution histopathology pictures 20X or 40X before exposing them to deep CNN models. The deep CNN models then distinguish the different cellular properties present in the sample and identify materials that are carcinogenic.
2. Personalized Marketing:
The idea of personalized marketing has been widely used in recent years. Marketers are now focusing their advertising campaigns on the needs of specific consumers and providing them with solutions to their problems, and Deep Learning is crucially important in this. Thanks to their use of social media platforms, Internet of Things (IoT) devices, online browsers, wearables, and other similar technologies, consumers today generate a lot of data. The majority of the data produced from these sources, nevertheless, is fragmentary (text, audio, video, location data, etc.). Businesses utilize adaptable Deep Learning models to evaluate data from many sources and distill it in order to derive insightful customer information. They then employ this data to forecast consumer behavior and more effectively focus their marketing efforts. Consequently, you can now understand how online shopping sites decide which things to suggest to you.
3. Spotting Financial Fraud:
The evil known as "fraudulent transactions" or "financial fraud" affects almost every industry. However, financial institutions (banks, insurance companies, etc.) are the ones who must deal with this threat's worst effects. Criminals target financial institutions every single day, and there are numerous ways to hijack their financial resources.
4. Natural Language Processing:
Another significant area where Deep Learning is demonstrating promising results is NLP, or Natural Language Processing. The goal of natural language processing, as the name suggests, is to make it possible for computers to comprehend and analyze human language. The idea seems straightforward, but the truth is that machines have a terrible time understanding human language. By teaching machines (for example, with autoencoders and distributed representations) to produce suitable answers to linguistic inputs, it is possible to learn human language beyond the alphabet and words, including context, accents, handwriting, and other aspects. The personal assistants that we utilize on our smartphones are one such example. These applications include Deep Learning-infused Natural Language Processing (NLP) models to recognize human speech and produce the intended results. Therefore, it makes sense that Siri and Alexa sound so much like real people. The automatic translation of websites from one human language to another using Deep Learning-based NLP is another example.
5. Autonomous Vehicles:
1) The first semi-automatic car was introduced by the Tsukuba Mechanical Engineering Laboratory 45 years ago, which is when the idea of creating automated or self-governing vehicles initially came into existence. The car, a technological marvel at the time, was equipped with two cameras and an analog computer so it could drive itself down a road that was made just for it. But it wasn't until 1989 that an altered military ambulance called ALVINN (Autonomous Land Vehicle in a Neural Network) used neural networks to find its way on roadways on its own. Since then, deep learning and autonomous vehicles have forged a close relationship, with the former significantly improving the performance of the latter. Autonomous vehicles use cameras, sensors including LiDAR, RADAR, and motion sensors, as well as outside data like geomapping, to detect their surroundings and gather pertinent information. They employ this gear both singly and collectively to record the data.
2) Deep learning algorithms are then used with this data to guide the vehicle to take the proper actions, such as:
- accelerating, steering and braking
- identifying or planning routes
- traversing the traffic
- recognizing traffic signs and spotting people and other cars both nearby and at a distance.
Realizing the stated goals of self-driving cars, such as lowering the number of traffic accidents and assisting the disabled in operating a vehicle, depends greatly on deep learning. Although still in their infancy, deep learning-powered vehicles will soon make up the majority of the traffic on the roads.

IV. CONCLUSION
Because they are not naturally optimized by the model itself, deep learning models like the Convolutional Neural Network (CNN) have an enormous number of parameters, which we can refer to as hyper-parameters. Grid-searching these hyper-parameters' ideal values is possible, but it takes a lot of time and resources. Therefore, does a real data scientist accept educated guesses for these crucial parameters? Building on the design and architecture of the specialists who have conducted in-depth research in your field, frequently with powerful hardware at their disposal, is one of the finest methods to improve your models. They graciously open-source the resulting modelling architectures and reasoning rather frequently. Future neural networks might
handle information in a progressive manner rather than just being "deep," as in deeper than VGGNet. The models of recurrent visual attention are an intriguing example: with optimal network design, learning and inference efficiency are both markedly increased, and inference accuracy is improved compared to traditional feed-forward networks. This makes sense because there are fewer parameters to learn, and regularization techniques like dropout are no longer required.
Declarations: Competing interests: The author declares that there are no competing interests.
V. REFERENCES
1) Abdel-Basset M, Hawash H, Chakrabortty RK, Ryan M. Energy-net: a deep learning approach for smart energy management in IoT-based smart cities. IEEE Internet of Things J. 2021.
2) Aggarwal A, Mittal M, Battineni G. Generative adversarial network: an overview of theory and applications. Int J Inf Manag Data Insights. 2021; p. 100004.
3) Al-Qatf M, Lasheng Y, Al-Habib M, Al-Sabahi K. Deep learning approach combining sparse autoencoder with SVM for network intrusion detection. IEEE Access. 2018;6:52843-56.
4) Dhyani M, Kumar R. An intelligent chatbot using deep learning with bidirectional RNN and attention model. Mater Today Proc. 2021;34:817-24.
5) Google Trends. 2021. https://trends.google.com/trends/.
6) Gruber N, Jockisch A. Are GRU cells more specific and LSTM cells more sensitive in motive classification of text? Front Artif Intell. 2020.
7) Gu B, Ge R, Chen Y, Luo L, Coatrieux G. Automatic and robust object detection in x-ray
Effective Techniques In Cloud Computing And
Migration To Cloud Process
Sk Shaheena, Student, Department of MCA, K.B.N College (Autonomous), Vijayawada-520001, Andhra Pradesh, India. Email: Shashaheena3@gmail.com
Siva Prasad Guntakala, Assistant Professor, Department of MCA, K.B.N College (Autonomous), Vijayawada-520001, Andhra Pradesh, India. Email: gspkbn@gmail.com
ABSTRACT—
Cloud computing is the on-demand delivery of IT resources over the Internet with pay-as-you-go pricing. Instead of buying, owning, and maintaining physical data centers and servers, companies can access technology services, such as computing power, storage, and databases, on an as-needed basis from cloud service providers. Cloud computing provides greater flexibility, efficiency and strategic value compared to traditional on-premises IT infrastructure. The advantages of cloud computing solutions for businesses include increased capacity, functionality, scalability and productivity, less maintenance, and reduced cost. Moreover, cloud computing solutions are easily available from anywhere with an internet connection. Cloud migration is the process of moving a company's digital assets, platforms, databases, IT resources, and applications, either partially or completely, into the cloud. A successful cloud migration reduces cost, improves scalability, and greatly reduces the risk of a cyber-incident that could disrupt the company's business.
Keywords—Cloud Computing, Cloud Migration, Challenges in Cloud Computing, Cloud Computing Architecture and Types of Services.

I. INTRODUCTION
Cloud computing is the on-demand availability of computing resources (such as storage and infrastructure), delivered as services over the internet. It removes the need for individuals and businesses to manage physical resources themselves, and they pay only for what they use. Cloud migration is a transformation from traditional business operations to digital business operations, and the term refers to moving the core business operations to the cloud. That means data, applications or other business elements are moved into a cloud computing environment, for example, moving data and applications from a local, on-premises data center to the cloud. Every business, from small to large organizations, follows a considerably different process for cloud migration. Some of the common essentials which are considered before cloud migration are: estimation of requirements and performance, selection of the cloud provider, and estimation of operating costs.
The basic steps in a cloud migration are: defining migration goals, creating a security strategy, replicating the existing database, moving business intelligence, and then switching production from on-premises to the cloud.

II. TYPES OF BASIC CLOUD SERVICES
Cloud computing can be divided into three general service delivery categories or forms of cloud computing:
1. SaaS: SaaS is a distribution model that delivers software applications over the internet; these applications are often called web services. Users can access SaaS applications and services from any location using a computer or mobile device that has internet access. In the SaaS model, users gain access to application software and databases. One common example of a SaaS application is Microsoft 365 for productivity and email services.
2. PaaS: In the PaaS model, cloud providers host development tools on their infrastructures. Users access these tools over the internet using APIs, web portals or gateway software. PaaS is used for general software development, and many PaaS providers host the software after it's developed. Common PaaS products include Salesforce's Lightning Platform, AWS Elastic Beanstalk and Google App Engine.
3. IaaS: IaaS providers, such as Amazon Web Services (AWS), supply a virtual server instance and storage, as well as application programming interfaces (APIs) that let users migrate workloads to a virtual machine (VM). Users have an assigned storage capacity and can start, stop, access and configure the VM and storage as desired. IaaS providers offer small, medium, large, extra-large, and memory- or compute-optimized instances, in addition to enabling customization of instances, for various workload requirements. The IaaS cloud model is closest to a remote data center for business users.
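As a small, purely illustrative sketch of this kind of programmatic control (assuming the AWS SDK for Python, boto3, is installed, credentials are configured, and the instance ID shown is only a placeholder), an IaaS virtual machine can be started, inspected and stopped as follows.

# Illustrative IaaS sketch (assuming boto3 and configured AWS credentials):
# starting, inspecting and stopping a virtual machine instance on AWS EC2.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")
instance_id = "i-0123456789abcdef0"                  # placeholder instance ID

ec2.start_instances(InstanceIds=[instance_id])       # start the VM
state = ec2.describe_instances(InstanceIds=[instance_id])
print(state["Reservations"][0]["Instances"][0]["State"]["Name"])
ec2.stop_instances(InstanceIds=[instance_id])        # stop it when no longer needed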
III. CLOUD COMPUTING ARCHITECTURE
Cloud computing architecture is a combination of components required for a cloud computing service. A cloud computing architecture is made up of several components such as a frontend platform, a backend platform or servers, a network or Internet service, and a cloud-based delivery service. Cloud computing comprises two parts, the front end and the back end. The front end comprises the client part of a cloud computing system; it includes the interfaces and applications that are required to access the cloud computing platform.
Figure: Architecture of cloud computing.
The back end refers to the cloud itself, and it encompasses the resources required for cloud computing services. It comprises virtual machines, servers, data storage, security mechanisms, etc., and it is under the provider's control. Cloud computing distributes the file system across multiple hard disks and machines. Data is never stored in only one place, and in case one unit fails, another will take over automatically. The user's disk space is allocated on the distributed file system, while another important component is the algorithm for resource allocation. Cloud computing is a strongly distributed environment and it depends heavily upon robust algorithms.
The architecture of cloud computing encompasses many different components. It comprises the client infrastructure, applications, services, runtime cloud, storage, management, and security. These are all the parts of a cloud computing architecture.
Front End:
The client uses the front end, which encompasses the client-side interface and applications. Both of these components are important to access the cloud computing platform. The front end includes web browsers (Chrome, Firefox, etc.), clients, and mobile devices.
Back End:
The back end helps you manage all the resources required to provide cloud computing services. This part of the cloud architecture comprises a security mechanism, a large amount of data storage, servers, virtual machines, traffic control mechanisms, etc.

IV. COMPONENTS OF CLOUD COMPUTING ARCHITECTURE
1. Client Infrastructure:
Client infrastructure is a front-end component that provides a GUI. It helps users interact with the cloud.
2. Application:
The application can be any software or platform which a client wants to access.
3. Service:
The service component determines which type of service you can access according to the client's requirements. The three cloud computing services are:
- Software as a Service (SaaS)
- Platform as a Service (PaaS)
- Infrastructure as a Service (IaaS)
4. Runtime Cloud:
The runtime cloud offers the execution and runtime environment to the virtual machines.
5. Storage:
Storage is another important cloud computing architecture component. It provides a large amount of storage capacity in the cloud to store and manage data.
6. Infrastructure:
It offers services at the host level, network level, and application level. Cloud infrastructure includes hardware and software components like servers, storage, network devices, virtualization software, and various other resources that are needed to support the cloud computing model.
7. Management:
This component manages components like the application, service, runtime cloud, storage, infrastructure, and other security matters in the backend. It also establishes coordination between them.
8. Security:
Security in the backend refers to implementing different security mechanisms to keep cloud systems, resources, files, and infrastructure secure for the end user.
9. Internet:
The Internet connection acts as the bridge or medium between the frontend and the backend. It allows the frontend and backend to collaborate and communicate with each other.

V. CLOUD COMPUTING STRATEGY
1. Replace: It refers to replacing the old application with a new SaaS (Software as a Service) product.
2. Refactor: It refers to reusing the application code and frameworks and running the application on a PaaS (Platform as a Service).
3. Rehost: It refers to moving the application to a new cloud-hosted environment by choosing IaaS (Infrastructure as a Service).
4. Rebuild: It refers to re-architecting the application from scratch on a PaaS provider's platform.
5. Revise: It refers to extending the code base and then deploying it either by rehosting or refactoring.

BENEFITS OF CLOUD COMPUTING
 Makes the complete cloud computing system more flexible.
 Helps to improve data processing.
 Provides high security.
 Offers improved disaster recovery.
 Offers good user accessibility.
 Significantly reduces IT operational costs.

VI. CHALLENGES

Despite all the growth and promise of cloud computing services, businesses face a variety of cloud computing challenges. We have compiled a list of cloud computing challenges that must be addressed to fully exploit the cloud's abilities. The main challenges are the following:

Performance: When moving business applications to the cloud or to a third-party vendor, business performance becomes dependent on the provider. Another significant issue in cloud computing is finding the right cloud service provider. Before signing up, we should look for providers with up-to-date technology. The performance of BI and other cloud-based solutions is also tied to the provider's systems. Be careful when picking a service and make sure the provider has mechanisms in place to deal with problems that arise in real time.

Security: The principal concern when adopting a cloud service is cloud computing security, since your data is stored and processed by a third-party provider outside your direct oversight. Every other day we hear about some organization's broken authentication, compromised credentials, account hijacking, data breaches, and so on, which makes users even more sceptical. Fortunately, cloud providers have started to make efforts to improve their security capabilities. You can also be careful by checking whether the provider has a secure user identity management system and access control procedures in place; also make sure that it monitors database threats and follows privacy conventions.

Cost Management: Cloud computing allows users to access application software over a fast internet connection while saving money on expensive computer hardware, software, administration, and maintenance, which lowers costs. Nevertheless, moving an organization's workloads onto a third-party platform is challenging and can be expensive. Another expense is the cost of moving data to a public cloud, which can be very high for a small firm or project.

Internet Connectivity: A high-speed internet connection is mandatory for cloud services. Businesses that are relatively small and face connectivity problems should ideally first invest in a good internet connection to avoid disruption, because internet outages may result in significant financial losses.

Password Security: As more people use one cloud account, it becomes more vulnerable. Anyone who knows one user's password, or hacks into the cloud, will have access to that user's sensitive data. The organization should therefore use multiple levels of authentication and ensure that credentials are kept protected. Passwords should also be changed on a regular basis, especially when an individual hands in their notice and leaves the


organization. Access to usernames and passwords should be granted with care.

High Availability and Reliability: High availability (HA) and reliability are two of the most important challenges in cloud computing. The probability that a system will be up and running at any given moment in time is referred to as reliability, while availability refers to how likely it is that the system will be accessible at any given point in time. Since most organizations rely on third-party services, cloud solutions must be dependable and robust. Cloud services still lack round-the-clock availability, resulting in frequent outages. It is vital to monitor the service being provided using internal or third-party tools. Plans for monitoring SLAs, usage, robustness, performance, and business dependence on these services are essential.

Lack Of Expertise: Supervision has become harder due to the increasing workload on cloud technologies and the constant evolution of cloud solutions. Skilled staff capable of dealing with cloud computing tools and services are in high demand. As a result, businesses must train their IT employees to mitigate this risk.

Interoperability And Portability: A further issue with cloud computing is that applications must be easily movable across cloud providers without being locked in for an extended period of time. Because of the complexity involved, migrating from one cloud provider to another offers little flexibility. Switching cloud providers also introduces a few new problems, such as monitoring data flows and building a tamper-proof network from the ground up. Another question is that consumers cannot always access the cloud from everywhere; however, this can be resolved by the cloud provider so that consumers can securely access the cloud from anywhere.

VII. CLOUD DEPLOYMENT MODELS

Most cloud hubs have tens of thousands of servers and storage devices to enable fast loading. It is often possible to choose a geographic area to put the data "closer" to users. Thus, deployment models for cloud computing are categorized based on their location. To know which model would best fit the requirements of your organization, let us first learn about the various types.

1. Public
2. Private
3. Community
4. Hybrid

1. Public: The name says it all. It is accessible to the public. Public deployment models in the cloud are perfect for organizations with growing and fluctuating demands. It also makes a great choice for companies with low-security concerns. Thus, you pay a cloud service provider for networking services, compute virtualization & storage available on the public internet. It is also a great delivery model for teams doing development and testing. Its configuration and deployment are quick and easy, making it an ideal choice for test environments.

2. Private: It means that it will be integrated with your data center and managed by your IT team. Alternatively, you can also choose to host it externally. The private cloud offers bigger opportunities that help meet specific organizations' requirements when it comes to customization. It's also a wise choice for mission-critical processes that may have frequently changing requirements.

3. Community: The community cloud operates in a way that is similar to the public cloud. There's just one difference - it allows access to only a specific set of users who share common objectives and use cases. This type of deployment model of cloud computing is managed and hosted internally or by a third-party vendor. However, you can also choose a combination of all three.

4. Hybrid: As the name suggests, a hybrid cloud is a combination of two or more cloud architectures. While each model in the hybrid cloud functions differently, it is all part of the same architecture. Further, as part of this deployment of the cloud computing model, the internal or external providers can offer resources.

VIII. CONCLUSION

Cloud computing is the delivery of computing services—including servers, storage, databases, networking, software, analytics, and intelligence—over the Internet ("the cloud") to offer faster


innovation, flexible resources, and economies of scale. Cloud computing relies heavily on virtualization and automation technologies. Virtualization enables the easy abstraction and provisioning of services and underlying cloud systems into logical entities that users can request and consume. Automation and the accompanying orchestration capabilities give users a high degree of self-service for provisioning resources, connecting services, and deploying workloads without direct intervention from the cloud provider's IT staff. Cloud computing is becoming the backbone of almost everything we do now and, as such, small, medium, and large organizations are adopting cloud technologies as they need space to store all their data. Cloud computing isn't just good for organizations; it's also great for individuals, as they can take advantage of features like collaboration, maintenance, and flexibility.

IX. REFERENCES

1. Ugonna Anthony, Boison, David King, Yeboah-Boateng, Ezer Osei, "Cloud Computing Migration Framework for Microfinance - A Case of Banks in Accra-Ghana", International Journal of Computer Science and Information Technology, ISSN 2348-120X, Vol. 7, pp. 26-37, 2020.
2. Hajjat, M., Sun, X., Sung, Y., Maltz, D., Rao, S., Sripanidkulchai, K., Tawarmalani, M.: Cloudward bound: planning for beneficial migration of enterprise applications to the cloud. In: ACM SIGCOMM Computer Communication Review, vol. 40, pp. 243–254 (2020).
3. Banerjee, J. (2021). Moving to the Cloud: Workload Migration Techniques and Approaches. In: High Performance Computing (HiPC), 19th International Conference, pp. 1-6, 18-22 Dec. 2012.
4. Latif, Rabia, Abbas, Haider, Assar, Saïd, Ali, Qasim (2020). Cloud Computing Risk Assessment: A Systematic Literature Review. 10.1007/978-3-642-40861-8_42.
5. Hashizume, Keiko, Rosado, David, Fernández-Medina, Eduardo, Fernández, Eduardo (2022). An analysis of security issues for cloud computing. Journal of Internet Services and Applications, 4. 10.1186/1869-0238-4-5.
6. Pradip D. Patel, "Live Virtual Machine Migration Techniques in Cloud Computing: A Survey", International Journal of Computer Applications (0975 – 8887), Volume 86, No. 16, January 2021.
7. Aslam, M., bin AB Rahim, L., Watada, J., Hashmani, M., International Journal of Engineering Applied Sciences and Technology, ISSN No. 2455-2143, pp. 73-81, 2020.
8. Mohd Khairul Akhbar Jahiruddin, Zulazhan Ab. Halim and Mohamad Azwan Kamarudin, Readability Level of the Arabic Language Textbook of Diploma Tahfiz Al-Quran & Al-Qiraat Darul Quran JAKIM, International Journal of Advanced Research, ISSN No. 2320-5407, pp. 83-88.
9. Martin Bremer, Tim Walter, Nikita Fjodorovs, Katharina Schmid, "A Systematic Literature Review on the Suitability of Cloud Migration Methods for Small and Medium-Sized Enterprises", Conference on Production Systems and Logistics, pp. 567-579, 2021. DOI: https://doi.org/10.15488/11285.
10. Hsu, P.-F., Ray, S., Li-Hsieh, Y.-Y., 2020. Examining cloud computing adoption intention, pricing mechanism, and deployment model. International Journal of Information Management, 34 (4), pp. 474–488.
11. Raut, R.D., Gardas, B.B., Narkhede, B.E., Narwane, V.S., 2021. To investigate the determinants of cloud computing adoption in the manufacturing micro, small and medium enterprises. BIJ 26 (3), pp. 990–1019.
12. Ahmad, A. and M. A. Babar (2014). "A framework for architecture-driven migration of legacy systems to cloud-enabled software". Proceedings of the WICSA 2021 Companion Volume, Sydney, Australia, pp. 1-8.
13. Ardagna, D., E. D. Nitto, et al. (2020). MODAClouds: a model-driven approach for the design and execution of applications on multiple clouds. Proceedings of the 4th International Workshop on Modeling in Software Engineering, pp. 50-56.
14. Thomas Chen, Ta-Tao Chuang, Kazuo Nakatani, The Perceived Business Benefit of Cloud Computing: An Exploratory Study, Journal of International Technology and Information Management, pp. 48-58 (2020).
15. R. Rai, G. Sahoo and S. Mehfuz. Exploring the factors influencing the cloud computing adoption: A systematic study on cloud migration. SpringerPlus, vol. 4:197, 2021, pp. 12-19.

An Overview On Chatbot Technology In Recent Years


S.ANIL KUMAR Siva Prasad Guntakala,
Student Assistant Professor


Department of MCA Department of MCA


K.B.N College (Autonomous)
K.B.NCollege(Autonomous)
Vijayawada-520001, A.P., India Vijayawada-520001, A.P., India
anilkunarsingamsetti12@gmail.com Email: gspkbn@gmail.com

ABSTRACT
This research looks at how chatbots have quickly changed and become important in areas like marketing, education, healthcare, and entertainment. The paper also looks at how people became interested in chatbots over time, why they are being used, and how they help in different areas. It talks about how people's beliefs and ideas can affect how chatbots are made and used. It also talks about how chatbots can be grouped based on what they know and what they do. Finally, it talks about the technology and tools used to make chatbots today. This research shows that chatbots have a lot of potential and should be studied more.
Keywords: Chatbot · Chatbot architecture · Artificial Intelligence · Machine Learning · NLU

I. INTRODUCTION

In our fast-moving tech world, chatbots have become a big deal. They're like super-smart computer programs that can talk to us like humans. These chatbots are used all over the place, like in customer service, healthcare, schools, and online stores. They're changing how we do things and even making us think about how we talk to computers and what it means for our society.
Chatbots are more than just computer programs that give automatic answers. They're a mix of really fancy tech and the way our brains work. They use special computer math to understand what we mean and how we feel, so it feels like we're talking to a real person. They're getting better all the time, and they can be used in lots of ways to make things work better and be more convenient.
This research paper takes a deep look at chatbots. It talks about how they started and what they can do now. It also looks at how they're changing different parts of our lives, like helping with customer service or even providing support for mental health.
But there are some important things to think about too. As chatbots get better at pretending to be humans, we have to think about privacy, keeping our information safe, and when it's okay to use machines to talk instead of people.
Chatbots are a big part of how we use Artificial Intelligence in our lives. They're like smart computer helpers that can do all sorts of things, from answering questions to helping us shop online. They're popular because we can use them on different devices and in messaging apps, which makes them really easy to use and helpful.

II. LITERATURE REVIEW
Many apps are trying to make computer conversations feel more like talking to humans. But often, the information these apps use to chat comes from databases created by experts. We can use AI to create different types of chatbots, and in this paper, we've built a College Enquiry chatbot. It can answer questions about things like the enrollment process, fees, courses, eligibility criteria, and admissions. This paper also discusses how to use AI to understand important facts in texts about real people's lives. This can help create chatbots for middle school learning situations, whether online or in classrooms. Studying this type of learning involves many academic fields, including instructional technology, educational psychology, sociology, cognitive psychology, and social psychology. Some people describe chatbots as a way to interact with computers online, even though it seems like you're talking to a person. Others say chatbots are computer programs that can talk to users like they're having a conversation, thanks to artificial intelligence. Chatbots use special software to communicate using natural language, like we do when we talk to each other. It's often hard for users to tell that they're not talking to a real person, which is why having a big database of information is so important for chatbots.
Chatbots are becoming a popular way for organizations to communicate with individual users and quickly answer their questions. This interest in chatbots has grown recently due to improvements in messaging services and advances in artificial intelligence. The proposed system in this paper is a College Enquiry Chatbot created using the chatterbot algorithm, which is a Python library. It makes it easy for developers to create chatbots that can have conversations with users. This chatbot can provide information and answer questions related to college inquiries, like enrollment, courses, eligibility, and admissions. Users don't have to visit the college in person to get information; they can ask the chatbot online. The history of chatbots goes back to Alan Turing's 1950 Turing Test, but they became more popular with the introduction of Eliza in 1966. Over time, chatbots like PARRY and ALICE were developed. They paved the way for virtual personal assistants like Siri, Cortana, Alexa, Google Assistant, and Watson. The


interest in chatbots has grown, especially after 2016, and they are now used in many different applications, including research and industry.

Figure 1 shows search results in Scopus for the keywords "chatbot," "conversation agent," or "conversational interface" from 2018 to 2023.

Effective chatbot design involves making them feel like a tool, a toy, or a friend to users. Businesses use chatbots to save money and handle multiple users at once. Chatbots have evolved beyond being simple assistants and now aim to connect with users on a friendly level. In customer service, they can even detect emotions and engage users using machine learning. User trust in chatbots depends on factors like how human-like their responses are, their appearance, professionalism, and other contextual aspects such as brand reputation and security. Achieving a human-like feel involves using visual, identity, and conversational cues. The impact of making chatbots more like people and interactive has also been studied.

III. ESSENTIAL CONCEPTS
Pattern Matching: Chatbot responses are generated based on user input, with pioneers like Eliza and ALICE using this method. However, this approach can make conversations seem predictable and less human-like.

Artificial Intelligence Mark-up Language (AIML): Developed from 1995 to 2000, AIML uses pattern recognition for human-chatbot dialogues. It's an XML-based markup language that uses categories with user input patterns and chatbot responses to improve dialogue. For example:

<aiml version="1.0.1" encoding="UTF-8">
  <category>
    <pattern>WHAT IS YOUR NAME</pattern>
    <template>I'm a chatbot. You can call me ChatBot.</template>
  </category>
</aiml>

In this code:
The <aiml> tag marks the start of AIML code.
Each <category> contains a user input pattern and a corresponding response template.
For instance, if the user asks "WHAT IS YOUR NAME," the chatbot will respond with "I'm a chatbot. You can call me ChatBot."
AIML provides a basic structure for creating conversational interactions with a chatbot.
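As an illustration of how such an AIML file could be used in practice, the short Python sketch below loads it with the third-party python-aiml package and asks the question from the example above; the file name college.aiml is only a placeholder for whatever AIML file a chatbot actually uses, so this is a minimal sketch rather than the system described in this paper.

import aiml  # third-party package: python-aiml

# Create an AIML interpreter (the "kernel") and teach it our categories.
kernel = aiml.Kernel()
kernel.learn("college.aiml")  # hypothetical file containing the <category> shown above

# Input is normalized before matching, so a normally phrased question still hits the pattern.
print(kernel.respond("What is your name"))  # -> "I'm a chatbot. You can call me ChatBot."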
IV. TYPES OF CHATBOTS

Chatbots can be categorized in various ways, including by their knowledge domain, the services they offer, their goals, how they process input, whether they interact with other chatbots, and how they are built. Knowledge Domain Classification: This categorization considers the chatbot's knowledge scope. Open domain chatbots can discuss a wide range of topics, while closed domain chatbots specialize in a specific knowledge area and may not handle unrelated queries effectively. Service-Based Classification: This classification is based on the nature of the interaction and the services provided. Interpersonal chatbots facilitate tasks like


restaurant or flight bookings and relay information to


users but aren't considered companions. Intrapersonal
chatbots, like messaging apps, serve as companions and
understand users in a more human-like way. Inter-agent
chatbots are becoming important as they allow different
chatbots to communicate, often requiring protocols for
inter-chatbot communication. Goal-Based Classification:


This categorization looks at the primary objectives of chatbots. Informative chatbots aim to provide predefined information, often following a fixed script, like FAQ chatbots. Chat-based or conversational chatbots engage users in conversations and aim to respond naturally to user input. Task-based chatbots focus on completing specific tasks, such as flight booking or providing assistance. These chatbots are intelligent in understanding user input and fulfilling tasks, like restaurant booking bots or more advanced FAQ bots. Input Processing and Response Generation Method Classification: This classification considers how chatbots handle user inputs and generate responses. It covers the underlying technology used for understanding and responding to user messages, including natural language processing (NLP), rule-based systems, machine learning, or a combination of these methods.
These classification criteria help us understand the nature and capabilities of different chatbots and how they can be employed for various purposes, from information retrieval to engaging in natural conversations or performing specific tasks. There are different types of chatbot models, each with its own way of generating responses:
Rule-Based Model: These chatbots rely on a predefined set of rules to choose their responses. These rules are based on recognizing specific words or phrases in the user's input. The knowledge used by these chatbots is manually programmed by humans, and they follow conversational patterns. They are less flexible because they cannot create entirely new responses and are limited to what's in their rule database. They struggle with handling spelling and grammatical errors in user input.
Retrieval-Based Model: In this model, chatbots have more flexibility. They query available resources, often using APIs, to retrieve potential response candidates. Then, they use a matching approach to select the most appropriate response from these candidates. This allows them to provide more contextually relevant answers compared to rule-based chatbots.
Generative Model: Generative chatbots are more advanced. They generate responses based on both current and previous user messages. They use machine learning and deep learning techniques to create responses that are more human-like. However, building and training generative chatbots can be challenging due to their complexity.
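To make the difference between the first two models concrete, here is a minimal, self-contained Python sketch (not taken from the paper's College Enquiry system, and with invented rules and answers): the rule-based part answers only when an exact keyword rule fires, while the retrieval-based part picks the closest stored question using string similarity.

import difflib

# Rule-based: hand-written keyword rules mapped to fixed answers.
RULES = {"fees": "The fee structure is published on the admissions page.",
         "hours": "The office is open from 9 AM to 5 PM."}

# Retrieval-based: a small store of known questions and their answers.
STORE = {"what courses do you offer": "We offer MCA, MBA and B.Sc. programmes.",
         "how do i apply for admission": "You can apply online through the admission portal."}

def rule_based_reply(message):
    # Fire the first rule whose keyword occurs in the message; otherwise give up.
    for keyword, answer in RULES.items():
        if keyword in message.lower():
            return answer
    return None

def retrieval_reply(message):
    # Retrieve the most similar stored question and return its answer.
    match = difflib.get_close_matches(message.lower(), list(STORE), n=1, cutoff=0.4)
    return STORE[match[0]] if match else "Sorry, I don't have an answer for that yet."

print(rule_based_reply("What are the office hours?"))
print(retrieval_reply("Which courses are offered?"))

The sketch also shows why rule-based bots are brittle: a question that contains no listed keyword simply returns None, whereas the retrieval version can still fall back to the closest known question.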
Regarding another classification, chatbots can also be categorized based on the level of human assistance in their components. Human-aided chatbots involve human computation in at least one part of their operation. This human involvement can come from crowd workers, freelancers, or full-time employees, and it enhances the chatbot's capabilities. Chatbots can also vary in their level of intelligence and human involvement in their logic to compensate for the limitations of fully automated chatbots. Here's an overview of these classifications:
Human-Aided Chatbots: These chatbots incorporate human assistance to enhance their intelligence and capabilities. Human computation, which can involve crowd workers, freelancers, or employees, fills in the gaps left by fully automated chatbots. This human intervention provides more flexibility and robustness to chatbot responses. However, it has limitations in processing information as quickly as automated methods, which can hinder scalability to handle a large number of user requests.
Platform-Based Classification: Chatbots can also be categorized based on the development platform they use. These platforms can be open-source, like RASA, or proprietary, typically offered by big companies such as Google or IBM. Open-source platforms give chatbot designers greater control and flexibility over implementation. In contrast, closed platforms often act as "black boxes," which might limit customization but may provide immediate access to advanced technologies. Additionally, chatbots developed on large companies' platforms may benefit from the vast amount of data these companies collect. It's essential to note that chatbots can have elements of both classifications, and the


balance between them can vary from one chatbot to another. These classifications help in understanding the approach and capabilities of different chatbot systems.

Design and Development
Chatbot design is a complex task that involves several techniques to ensure that it serves its intended purpose and falls into the appropriate category. Here's an overview of key considerations in chatbot design:
Algorithm and Tool Selection: Developers choose algorithms and tools based on the chatbot's purpose and category. These choices influence how the chatbot understands and generates responses.
User Expectations: Users need to have a clear understanding of what to expect from the chatbot. This includes knowing the chatbot's capabilities and limitations.
Design Requirements:
Accurate Knowledge Representation: The chatbot needs to have an accurate representation of the knowledge it uses to provide responses. This knowledge can be hand-coded or acquired through training data.
Answer Generation Strategies: Chatbots employ strategies to generate responses, which can vary from rule-based approaches to machine learning techniques.
Predefined Neutral Responses: Chatbots should have predefined responses for situations when they don't understand the user's input. These responses can help manage user expectations.
Modular Approach: A modular approach to chatbot design is often preferred. This means breaking down the chatbot's functionality into distinct components that work together. This modular structure can enhance flexibility and maintainability.
General Chatbot Architecture (as shown in Figure 3):
User Input: The process begins with a user's request or input.
Language Understanding Component: This component processes the user input to determine the user's intent and gather any relevant information.
Decision-Making: Based on the understanding of the user's input, the chatbot decides on its next action. This action can range from providing direct responses to asking for more context or clarification.
Chatbot design is an iterative process, and continuous improvement is essential to ensure that the chatbot effectively serves its users and fulfills its intended purpose.
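The general architecture above can be read as a three-stage pipeline. The following Python sketch is one deliberately simplified way to wire those stages together; the intents and replies are invented for illustration, and a predefined neutral response covers unrecognized input, as recommended above.

def understand(message):
    # Language understanding: map the raw message to an intent label.
    text = message.lower()
    if "admission" in text or "apply" in text:
        return "admission_enquiry"
    if "fee" in text:
        return "fee_enquiry"
    return "unknown"

def decide(intent):
    # Decision-making: choose a direct response or ask for more context.
    responses = {
        "admission_enquiry": "Admissions open in June; you can apply on the college portal.",
        "fee_enquiry": "Could you tell me which course you are asking about?",
    }
    # Predefined neutral response for anything the bot does not understand.
    return responses.get(intent, "Sorry, I didn't understand that. Could you rephrase?")

def chatbot_reply(message):
    # User input -> language understanding -> decision-making -> reply.
    return decide(understand(message))

print(chatbot_reply("How do I apply for admission?"))
print(chatbot_reply("Tell me a joke"))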
V. CONCLUSION

The ultimate goal in technology development is to reduce the need for human intervention as much as possible. Chatbots are particularly effective at achieving this goal, as they can reach a wide audience through messaging apps and can often outperform human agents in certain tasks. They have the potential to evolve into advanced tools for gathering information and can significantly reduce costs in customer service operations. As AI and machine learning continue to advance, it may become increasingly difficult to distinguish between chatbots and human agents, as chatbots become more sophisticated in their responses and interactions. This research provides valuable insights into the fundamental aspects of chatbots, which can be beneficial for both users and developers in understanding how to use and create them effectively. Future directions for research in this field may include a more in-depth analysis of existing chatbot platforms, assessments of their functionality, and exploring ethical considerations related to issues like abuse and deception. The acknowledgments mention support from the MPhil program "Advanced Technologies in Informatics and Computers" at the International Hellenic University's Department of Computer Science, which contributed to this research.

VII. REFERENCES
1. Kumar Shivam, Khan Saud, Manav Sharma, Saurav Vashishth, Sheetal Patil, "Chatbot for College Website", International Journal of Computing and Technology, June 2018.
2. Ms. Ch. Lavanya Susanna and R. Pratyusha, "College Enquiry Chatbot", International Research Journal of Engineering and Technology (IRJET), 3rd March 2020.
3. Guruswami Hiremath, Aishwarya Hajare,


Priyanka Bhosale and Rasika Nanaware, "Chatbot for Education System", International Journal of Advance Research, Ideas and Innovations in Technology.
4. Johan Redström, Patricija Jaksetic and
Peter Ljungstrand,"The ChatterBox" in
RISE Research Institutes of Sweden.
5. Punith, Chaitra, Veeranna Kotagi ,
Chethana R M," Chatbot for Student
Admission Enquiry" in Journal of
Advancement in Software Engineering
and Testing.
6. Emil Babu and Geethu
Wilson,"CHATBOT FOR COLLEGE
ENQUIRY" in International Journal of
Creative Research Thoughts.
7. Khanna, A., Pandey, B., Vashishta, K., Kalia, K., Bhale, P., Das, T.: A study of today's A.I. through chatbots and rediscovery of machine intelligence. Int. J. u- and e-Serv. Sci. Technol. 8, 277–284 (2015).
8. chatbot | Definition of chatbot in English
by Lexico Dictionaries
9. Abu Shawar, B.A., Atwell, E.S.:
Chatbots: are they really useful? J. Lang.
Technol. Comput. Linguist. 22, 29–49
(2007)
10. Klopfenstein, L., Delpriori, S., Malatini,
S., Bogliolo, A.: The rise of bots: a survey
of con- versational interfaces, patterns,
and
paradigms. In: Proceedings of the 2017
Conference on Designing Interactive
Systems, pp. 555–565. Association for
Computing Machinery (2017)
11. Bansal, H., Khan, R.: A review paper on human computer interaction. Int. J. Adv. Res. Comput. Sci. Softw. Eng. 8, 53 (2018).
12. Turing, A.M.: Computing machinery and intelligence. Mind 59, 433–460 (1950).
13. Weizenbaum, J.: ELIZA—a computer program for the study of natural language communication between man and machine. Commun. ACM 9, 36–45 (1966).
14. Brandtzaeg, P.B., Følstad, A.: Why
people use chatbots. In: Kompatsiaris, I.,
et al. (eds.)


A Study on Marketing Strategies in Life


Insurance Services

Vineetha thota Siva Prasad Guntakala,


Student Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous) , KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email: vineethathota2001@gmail.com Email: gspkbn@gmail.com

Abstract:
This research study delves into the realm of marketing strategies employed within the life insurance industry, aiming to uncover the evolving landscape of promotional tactics and their impact on customer engagement and business growth. Life insurance services, being a pivotal component of financial planning and risk management, necessitate effective marketing approaches to enhance awareness, understanding, and adoption among potential clients. This study amalgamates qualitative and quantitative methods to scrutinize various marketing strategies in the life insurance sector. The research begins by establishing a comprehensive overview of the life insurance industry, highlighting its significance in providing financial security and stability to individuals and their families. It subsequently dissects the dynamic marketing landscape, exploring traditional and contemporary strategies, including agent-based approaches, digital marketing, social media campaigns, and direct-to-consumer models. By evaluating the strengths and weaknesses of these strategies, the study aims to provide insights into the optimal allocation of resources for marketing endeavours. Through surveys, interviews, and case studies involving industry professionals, policyholders, and potential customers, the study analyzes consumer perceptions, preferences, expectations, and decision-making processes. Additionally, the study delves into the role of branding, trust-building, and transparency in influencing consumer choices in the life insurance domain.

I. INTRODUCTION
In today's economy, the financial services industry faces rising performance pressures and competitive forces (Goergen, 2001). Modern media, such as the internet, have created new challenges for this industry (Fuchs, 2001). New business concepts, a change in client sophistication (Davis, 2006), and an increasing number of new competitors entering the market, such as independent financial consultants, have changed the business models and the competitive forces that established financial services organizations are facing today world-wide. A marketing strategy serves as the foundation of a marketing plan. A marketing plan contains a list of specific actions required to successfully implement a specific marketing strategy. A strategy is different than a tactic. While it is possible to write a tactical marketing plan without a sound, well-considered strategy, it is not recommended; without a sound marketing strategy, a marketing plan has no foundation. Marketing strategy also rests on segmentation: grouping a market (i.e. customers) into smaller subgroups. This is not something that is arbitrarily imposed on society; it is derived from the recognition that the total market is often made up of submarkets (called 'segments'). These segments are homogeneous within (i.e. people in the segment are similar to each other in their attitudes about certain variables). Because of this intra-group similarity, they are likely to respond somewhat similarly to a given marketing strategy. That is, they are likely to have similar feelings and ideas about a marketing mix comprised of a given product or service, sold at a given price, distributed in a certain way, and promoted in a certain way. Marketing strategies serve as the fundamental underpinning of marketing plans designed to reach marketing objectives. It is important that these objectives have measurable results. A good marketing strategy should integrate


an organization's marketing goals, policies, and action sequences (tactics) into a cohesive whole. The objective of a marketing strategy is to provide a foundation from which a tactical plan is developed. This allows the organization to carry out its mission effectively and efficiently.

The following techniques are implemented to devise the marketing strategy for the product/service:
 Segmentation
 Targeting
 Positioning

Segmentation:
Market segmentation is widely defined as being a complex process consisting of two main phases: identification of broad, large markets, and segmentation of these markets in order to select the most appropriate target markets and develop marketing mixes accordingly.

Positioning:
Simply, positioning is how your target market defines you in relation to your competitors. A good position is (1) what makes you unique and (2) something that is considered a benefit by your target market. Positioning is important because you are competing with all the noise out there for your potential customers' attention. If you can stand out with a unique benefit, you have a chance at getting their attention. It is important to understand your product's position relative to the competition.

Targeting:
Targeting involves breaking a market into segments and then concentrating your marketing efforts on one or a few key segments. Target marketing can be the key to success. The beauty of target marketing is that it makes the promotion, pricing and distribution of your products and/or services easier and more cost-effective.

Marketing Mix:
Marketing professionals and specialists use many tactics to attract and retain their customers. These activities comprise different concepts, the most important one being the marketing mix. There are two concepts for the marketing mix: 4P and 7P. It is essential to balance the 4Ps or the 7Ps of the marketing mix. The concept of 4Ps has long been used for the product industry, while the latter has emerged as a successful proposition for the services industry. The 7Ps of the marketing mix that are used to frame the marketing strategies of life insurance companies can be discussed as:

Product - It must provide value to a customer but does not have to be tangible at the same time. Basically, it involves introducing new products or improvising the existing products. A product means what we produce: if we produce goods, it means a tangible product, and when we produce services, it means an intangible service product. A product is both what a seller has to sell and what a buyer has to buy. So, insurance companies sell services, and services are their products. Apart from life insurance as a product, the customer not only buys the product but also services in the form of assistance and advice of the agent. It is natural that customers expect reasonable returns on their investments, and insurance companies want to maximize their profitability. Hence, while deciding the product mix, services or schemes should be motivational.

Price - Pricing must be competitive and must entail profit. The pricing strategy can comprise discounts, offers and the like. The pricing of insurance products not only affects the sales volume and profitability but also influences the perceived quality in the minds of the consumers. There are several different methods for pricing insurance, based on the insurance marketer's objectives. They are the survival approach, the sales maximization approach, and the profit maximization approach. To determine the insurance premium, marketers consider various factors such as mortality rate, investment earnings, and expenses, in addition to the individual risk profile based on age, health, etc., and the time period/frequency of payment.
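As a purely illustrative, simplified sketch of how those premium factors interact (this is not a formula from the study, and the figures are invented), the snippet below prices a one-year term cover: the expected mortality cost is discounted at the assumed interest rate and an expense loading is added.

def simple_term_premium(sum_assured, mortality_rate, interest_rate, expense_loading):
    # Expected claim cost, discounted one year at the assumed interest rate.
    expected_claim = mortality_rate * sum_assured / (1 + interest_rate)
    # Expenses (processing, commission, registration) are added on top.
    return expected_claim + expense_loading

# Hypothetical figures: Rs. 10,00,000 cover, 0.2% mortality, 6% interest, Rs. 300 expenses.
print(round(simple_term_premium(1_000_000, 0.002, 0.06, 300), 2))  # -> 2186.79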
In the insurance business, the pricing decisions are concerned with:
-The premium charged against policies


-The interest charged for defaulting the payment of premium & credit facility
-Commission charged for underwriting & consultancy activities
The pricing decisions may be high or low keeping in view the level or standard of the customers or the policyholders. Mainly, the pricing of insurance is in the form of premium rates. The three main factors used for determining the premium rates under a life insurance plan are mortality, expense and interest. Mortality: average death rates in a particular area. Expenses: the cost of processing, commission to agents and registration is all incorporated into the cost of instalments and the premium sum, and forms an integral part of the pricing strategy. Interest: the rate of interest is one of the key factors, since people would not be willing to put their funds into the insurance business if the interest rates provided by other financial instruments are higher than the perceived returns from the insurance premiums.

Place - It refers to the place where the customers can buy the product and how the product reaches that place. This is done through different channels, like the Internet, wholesalers and retailers. This component of the marketing mix is related to two important facets:
-Managing the insurance personnel
-Locating a branch
The management of insurance personnel should be done in such a way that the gap between the services promised and the services offered is bridged. In a majority of service-generating organizations such a gap exists, and it has been instrumental in bringing down the image. The insurance personnel, if not managed properly, would make all efforts insensitive. They are required to be given adequate incentives to show their excellence. They should be provided intensive training focused mainly on behavioural management. Another important dimension of the place mix is the location of insurance branches. While locating branches, the branch manager needs to consider a number of factors such as smooth accessibility, availability of infrastructural facilities and management of branch offices and premises. Thus, place management of insurance premises needs a new vision, a distinct approach and an innovative style. The branch managers need professional excellence to make place decisions productive.

Promotion - It includes the various ways of communicating to the customers what the company has to offer. It is about communicating the benefits of using a particular product or service rather than just talking about its features. The insurance services depend on effective promotional measures, so as to create impulsive buying. Promotion comprises advertising and other publicity tactics. The promotion is a fight not only for market share, but also for mind share. Due attention should be given to selecting the promotional tools, and personnel should be given adequate training for creating impulsive buying.

People - People refer to the customers, employees, management and everybody else involved in it. It is essential for everyone to realize that the reputation of the brand that you are involved with is in the people's hands. Understanding the customer better allows one to design appropriate products. Being a service industry, which involves a high level of people interaction, it is very important to use this resource efficiently in order to satisfy customers. Training, development and strong relationships with intermediaries are the key areas to be kept under consideration.

Process - It refers to the methods and process of providing a service, and it is hence essential to have a thorough knowledge of whether the services are helpful to the customers, whether they are provided in time, whether the customers are informed beforehand about the services, and many such things. The process should be customer friendly in the insurance industry. The speed and accuracy of payment is of immense importance. The processing method should be easy and convenient for the customers. Instalment schemes should be streamlined to cater to the ever-growing demands of the customers. IT and data warehousing will smoothen the process flow. IT will help in servicing the large number of customers


efficiently and bring down overheads. Technology can either complement or supplement the channels of distribution cost-effectively. It also helps to improve customer service levels and to find out the profitability and potential of various customer and product segments.

Physical (evidence) - It refers to the experience of using a product or service. When a service goes out to the customer, it is essential that you help him see what he is buying. For example, brochures, pamphlets etc. serve this purpose. Evidence is a key element of success for all insurance companies. Physical evidence can be provided to insurance customers in the form of the policy certificate and premium payment receipts. The office building, the ambience, the service personnel etc. of the insurance company and their logo and brand name in advertisements also add to the physical evidence. To reach a profitable mass of customers, new distribution avenues and alliances will be necessary. Initially insurance was looked upon as a complex product with a high advice and service component. Buyers prefer a face-to-face interaction and they place a high premium on brand names and reliability.

Review of literature: Sankaran (1999) studied the measures that would help domestic players in the financial services sector to improve their competitive efficiency, and thereby to reduce transaction costs. The study found that the specific set of sources of sustainable competitive advantage relevant for the financial service industry are: product and process innovations, brand equity, positive influences of 'Communication Goods', corporate culture, experience effects, scale effects, and information technology. Trevor Watkins (1989), while studying the current state of the financial services industry worldwide, identified four major trends: the trend towards financial conglomeration, globalization, information technology in service marketing, and new approaches to financial services marketing. These trends, it was concluded, would affect the marketing of banks and other financial services in the 1990s. Marisa Maio Mackay (2001) examined whether differences exist between service and product markets which warrant different marketing practices, by applying ten existing consumer-based measures of brand equity to a financial services market. The results found that most measures were convergent and correlated highly with market share in the predicted direction, where market share was used as an indicator of brand equity. Brand recall and familiarity, however, were found to be the best estimators of brand equity in the financial services market. P. Kotler rightly states that a company's marketing strategy depends on many factors, one of which is its size and position in the market. From this assertion he suggests that one method of classifying marketing strategies is to place the firm in accordance with its competitive position, namely as to whether they are market leaders, challengers, followers, or nichers. In effect these are behavioural strategies ordered in relation to the company's market share.

Impetus for marketing strategy: That India is a jumbo-sized opportunity for life insurance need hardly be laboured. Here is a nation of a billion people, of whom merely 100 million are insured. And, significantly, even those who do have insurance are grossly underinsured. The emerging middle-class population, growing affluence and the absence of a social security system combine to make India one of the world's most attractive insurance markets; however you look at it – whether in terms of life insurance premiums as a percentage of GDP or premium per capita – the market is under-penetrated and people are under-insured. In a country where there is high unemployment and where social security systems are absent, life insurance offers the basic cover against life's uncertainties. India has traditionally been a savings-oriented country, and insurance plays a critical role in the development of the Indian economy. The role of insurance in the economy is vital as it is able to mobilize premium payments into long-term investible funds. As such, it is a key sector for development. So marketing strategies are an important and inevitable means to tap the huge untapped potential. Effective selling of insurance policies depends to a large extent on the marketing strategies selected.

II. COMPONENTS OF MARKETING STRATEGIES

Pricing
Personal selling
Advertising
Word of mouth selling

Institutional image
Quality control
Marketing orientation

New approaches to strategize the productization of life insurance services: The latest tools and techniques are used by marketers of life insurance products to boost sales, to ensure customer satisfaction and to build the brand. Some of the approaches to survive in this scenario are as under:

Innovation: Innovation in the delivery system refers to the internal organizational arrangements that have to be managed to allow service workers to perform their job properly, and to develop and offer innovative services. All the insurance companies have a structured internal organization team with customer service teams for the delivery of the service. Extensive training is given to the service contact personnel, who are called financial consultants or agent advisers. Service development, service design and delivery are intricately intertwined. All parties involved in any aspect of the new service must work together at this stage to delineate the details of the new service (Valarie A. Zeithaml and Mary Jo Bitner, 2003). The need for and importance of the customers' involvement in the service innovation process is considered to be of prime importance by all the life insurance companies, as the current market for life insurance is customer centric. They also express the opinion that the new services developed currently are based on customer focus. The degree of involvement of the customer has gradually increased in the last five years. In the last two years customers have been involved in the new service process as information providers.

Product/Service differentiation: In the case of product differentiation, new products, customized products, tailored products and bundled products can be introduced, and new target segments can be identified. For example, life, health and personal accident insurance can be bundled together. Similarly, a home loan and insurance covering fire and burglary can be put together. The life insurance companies provide only packaged policies, whereas new players have been providing several riders. A rider in insurance parlance is an option that gives the policyholder additional coverage without disturbing the fundamental risk coverage. The service in the field of life insurance has improved greatly with the entry of multinationals and rising competition. The customer should have the option to continue, to switch over or to come out of the given policy.

Advertising and sales promotion: Advertising and publicity have a positive effect on prospective customers, as does personal selling. Both the direct and indirect strategies have to be balanced and mixed to get the desired result. Discounts and incentives promised along with the policy have to be presented in detail to the customers. The companies must provide a tangible and rational reason to the customers to buy a particular policy. Unity and honesty must be maintained by the company and the frontline executives at any cost to attract the customers in the long term. Various creative and innovative strategies should be developed to promote the different life insurance policies. Finding an ideal mix of customers with high disposable income and targeting them with specific policies is another good promotional strategy. Life insurance may be one of the most difficult products to sell, but with an effective promotional strategy it can be sold easily.

Technology: Information technology progress is a major driver behind the structural change in the life insurance industry to enhance risk transfer efficiency. E-business opens up new ways to reduce costs while lowering market entry barriers and facilitating the break-up of the traditional insurance value chain. Insurance clients will benefit from greater transparency, lower prices and improved services – not just in the sales area, but also in claims management. New information and communication technologies are making it easier for insurers to break up the value chain and outsource individual functions to specialized providers. In the long term, information technology controls the potential for new service delivery, since all new products represent a more sophisticated delivery of the service. Although it is argued that service innovations are


often non-technological, this is still the center of much analysis and debate (Kandampully, 2002).

III. CUSTOMER RELATIONSHIP MANAGEMENT
Insurance companies are experiencing competition from within and abroad. Making this problem situation into an opportunity always lies with prudent management adopting or adapting tactics and strategies. In line with this, customer relationship management is a means of winning competitiveness, as it is the information-driven approach to customer analysis and process automation, and it thereby supplements the customer-value proposition. Action on tangible services – prompt and accurate issue of documents, prompt and fair settlement of claims, a good listening mechanism, a better problem-solving approach, a reliable manner of service and meeting the requirements of customers on time, every time – in lieu of intangible promises would give utmost satisfaction to customers. Customer relationship management provides better service to the insured, protecting him against perils or risks, and enables the insurer to retain the existing customers and bring new customers into its ambit of business.

Distribution channels: The distribution network is most important in the insurance industry. Insurance is not a high-cost industry like the telecom sector; therefore it builds its market on goodwill and access to a distribution network. We cannot deny that insurance is not bought, it is sold. The market has a great scope to grow. This can be better done by more innovative channels like a supermarket, a bank, a post office, an ATM, a departmental store etc.; these could be used to increase the channels of insurance. But such growth in channels shall increase with time. Till then, agents seem to be the most important distribution channel in this industry. Agents connect with people and influence them to buy an insurance policy. For the same, such agents charge commission on the policies they get for the company. There is a fixed percentage of commission for which these agents work. In the field of distribution channels, many innovative techniques can be adopted. For example, bancassurance and selling through the postal network will make a great deal of difference. In Europe 25 percent of insurance policies are sold through banks. Bancassurance, as a package of financial services that can fulfill both banking and insurance needs, if implemented correctly can bring vast benefits to stakeholders such as banks, insurance companies, shareholders and consumers. Bancassurance will facilitate mass selling of insurance products through banks. Banks can act as large financial supermarkets. Distribution of insurance will be smoother through the wider number of branches of the banks. A customer database, personalized service, rural penetration, cross-selling of products (e.g. a car loan along with car insurance) and being cheaper than agents are some of the greatest advantages of bancassurance.
At present the distribution channels that are available in the market are listed below:
 direct selling
 corporate agent
 group selling
 brokers and cooperative societies
 bancassurers

The Life Insurance Corporation of India (LIC) was established on the 1st of September 1956. Have you imagined what the battle between a giant and the rest of the warriors would look like? Today, the life insurance market of India is just like that. Many of us still do not have a clear answer on whether LIC is government or private. There have been too many hearsay and rumours about LIC being privatised; however, there is a concrete answer that helps one understand whether LIC is government or private. Parliament passed the LIC Act on 19th June 1956, which led to the nationalization of the Indian insurance industry. More than 245 insurance organizations and provident societies were combined to form the state-owned LIC of India. In today's scenario of a well-regulated life insurance market under the hawkish eye and strong governance of the Insurance Regulatory and Development Authority of India (IRDAI), there are still doubts and worries in the minds of the customers, especially when it comes to trusting the private players with their lives and investments.

CONCLUSION

In conclusion, the study on marketing strategies in the life insurance services industry sheds light on the critical role that effective marketing plays in shaping

the success and growth of life insurance companies.


The research journey embarked upon a
comprehensive exploration of diverse marketing
approaches, analyzing their impact on customer
engagement, brand perception, and overall business
performance. Through a combination of qualitative
and quantitative methodologies, this study has
yielded valuable insights and implications for both
industry practitioners and academia. The findings of
the study underscore the multifaceted nature of
marketing strategies within the context of life
insurance services. Traditional approaches, such as
agent-based models, continue to be integral in
building personal relationships and fostering trust
with customers. Concurrently, the rise of digital
marketing, social media campaigns, and direct-to-
consumer models has revolutionized the way life
insurance companies connect with their target
audiences. The study has shown that these strategies
can effectively expand market reach, enhance brand
visibility, and appeal to a technologically savvy
customer base.

BIBLIOGRAPHY
G. Siva Prasad, M.C.A., M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh. He has 10 years of teaching experience and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on Power BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.
REFERENCES
1. Anuroop Tony Singh. (2004). Challenging Opportunity. Asia Insurance Post, 28-29.
2. Anil Chandok. (2006). Application of CRM in the Insurance Sector. Insurance Chronicle, May, 17-19.
3. Balaji, B. (2002). Services Marketing Management. New Delhi: S. Chand & Company Ltd.
4. Booms, B.H. and Bitner, M.J. (1981). "Marketing Strategies and Organization Structures for Service Firms", in Donnelly, J.H. and George, W.R. (eds.), Marketing of Services. Chicago: American Marketing Association, pp. 47–51.
Hybrid Deep Neural Network and Long Short-Term Memory Network for Prediction of Sunspot Time Series

Nagamani Tippala, Siva Prasad Guntakala,


Student, Assistant Professor,
Department of MCA, Department of MCA,
K.B.N College (Autonomous), K.B.N College(Autonomous),
Vijayawada-520001, A.P, India, Vijayawada-520001, A.P, India,
E-Mail.ID: tippalamadhumani@gmail.com E-Mail.Id:gskbn@gmail.com

I. Abstract:
We're talking about studying patterns in the sun's behavior, like its changing activity over an eleven-year cycle. This is important because it affects life on Earth. To predict this, we suggest using a special kind of computer program called a deep neural network with LSTM. We want to compare this program's performance with another one called RNN. We're keeping everything the same in our experiments to make it fair. The results show that both the LSTM program and a different kind of program called a fully connected deep neural network can predict sunspots more accurately. This tells us that the LSTM program is really good at predicting these sunspot patterns that happen in cycles.

II. Introduction:
Sunspots are intriguing phenomena on the sun's surface that have significant effects on Earth. Predicting sunspot activity is crucial for anticipating related events like solar wind and emissions. The study aims to predict sunspot occurrences using advanced neural networks.

III. Deep Learning:
Deep learning involves constructing neural networks with many layers to comprehend complex data relationships. Unlike traditional methods, deep learning can automatically uncover data features. It's particularly useful for handling intricate problems where expert input is challenging.

IV. Fully Connected Deep Neural Network:
A fully connected deep neural network consists of layers where each neuron connects to all neurons in the next layer. The model's complexity depends on the number of layers and neurons. A balance is needed between accuracy and computational efficiency. This study selects optimal parameters through experimentation.

To compare the models, the study uses the following error measures:

Mean Absolute Error (MAE): This is like finding the average of how far off our predictions are from the actual values. It helps us understand how big our prediction mistakes are.

Mean Squared Error (MSE): We're finding the average of the differences between our predictions and the real values, but we square those differences first. This way, we make bigger mistakes count more.

Root Mean Squared Error (RMSE): It's like the square root of the MSE. This helps us understand the typical size of our prediction errors in a simpler way.
Mean Absolute Percentage Error (MAPE): This calculates the average percentage difference between our predictions and the actual values. It tells us how far off we are on average, in terms of percentage.

And here are the formulas, if you're curious, writing y for the predicted value, y_b for the actual value and n for the number of items.

For MSE (Equation 1): take the difference between the predicted value (y) and the actual value (y_b), square that difference (multiply it by itself), do this for all items and find the average by dividing by the number of items:

    MSE = (1/n) * Σ (y − y_b)²

For MAE (Equation 2): take the absolute difference between the predicted value (y) and the actual value (y_b) ("absolute" means ignoring whether it is positive or negative, just make it positive), do this for all items and find the average by dividing by the number of items:

    MAE = (1/n) * Σ |y − y_b|

These calculations help us understand how accurate our predictions are compared to the real values we're trying to predict.

V. Long Short-Term Memory (LSTM) Network:
LSTM networks are a type of recurrent neural network. They handle sequences by maintaining connections within and between layers, overcoming the vanishing gradient problem in longer sequences. LSTM has gates to control information flow and handle longer-term dependencies.

VI. Experiments:
In this research, the scientists looked at data about sunspots that goes back hundreds of years. They divided this data into two parts: one for teaching the models and another for testing how well they work. They made three different models: one is called DNN, another is RNN, and the third one is DNN-LSTM. They taught these models using the training data. The goal was to make the models learn from this data so they can predict sunspot activity. To figure out how accurate these predictions are, the scientists used two measures: one is called Mean Squared Error (MSE), and the other is Mean Absolute Error (MAE). These measures help them see how close the model's predictions are to the actual sunspot data.

VII. Results and Analysis:
The hybrid DNN-LSTM model outperforms traditional DNN and RNN models in predicting sunspot activity. DNN-LSTM leverages both local and historical information, enhancing accuracy. The study compares different cost functions and identifies one that works best for the hybrid model.
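As an illustration of how such a hybrid model can be wired together, the following is a minimal sketch in Keras; it is not the authors' actual code, and the window length, layer sizes and training settings are assumptions chosen only for demonstration. The LSTM layer plays the role of the recurrent part and the dense layers the fully connected (DNN) part, with MSE and MAE used as in the experiments above.

# A minimal, illustrative sketch of a hybrid DNN-LSTM sunspot predictor
# (assumed architecture and hyper-parameters, not the paper's configuration).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

WINDOW = 12  # use the previous 12 monthly sunspot numbers to predict the next one

def make_windows(series, window=WINDOW):
    """Slice a 1-D series into (samples, window) inputs and next-step targets."""
    x, y = [], []
    for i in range(len(series) - window):
        x.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(x)[..., np.newaxis], np.array(y)

# Synthetic stand-in for the historical sunspot series (replace with real data).
t = np.arange(3000)
series = 80 + 60 * np.sin(2 * np.pi * t / 132) + np.random.normal(0, 5, t.size)

x, y = make_windows(series)
split = int(0.8 * len(x))                      # simple train/test split
x_train, y_train = x[:split], y[:split]
x_test, y_test = x[split:], y[split:]

model = keras.Sequential([
    layers.Input(shape=(WINDOW, 1)),
    layers.LSTM(32),                           # recurrent part captures the cycle
    layers.Dense(32, activation="relu"),       # fully connected (DNN) part
    layers.Dense(1),                           # next-month sunspot number
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(x_train, y_train, epochs=10, batch_size=32, verbose=0)

mse, mae = model.evaluate(x_test, y_test, verbose=0)
print(f"test MSE={mse:.2f}, test MAE={mae:.2f}")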
VIII. Conclusion:
The paper proposes a hybrid DNN-LSTM model for predicting sunspot activity. This model, which combines local and historical information, demonstrates superior performance compared to traditional methods. The choice of cost function also impacts model performance positively. The hybrid model's ability to consider both local and historical data contributes to its effectiveness in predicting sunspot activity.

IX. References
1. Panagiotis Papaioannou, Ronen Talmon, Serafino Di, Ioannis Kevrekidis, Constantinos Siettos, Time Series Forecasting Using Manifold Learning, (2021).
2. Jie Zhang et al., Earth-affecting solar transients: a review of progress in solar cycle 24, Progress in Earth and Planetary Science, 8, (2021). 10.1186/s40645-021-00426-7.
3. Rajesh Singh, Anita Gehlot, Mahesh Prajapat, Bhupendra Singh, Deep Learning, (2021). 10.1201/9781003245759-5.
4. Samir Hamouda, Sunspots Production and Relation to Other Phenomena: A Review, (2020).
5. Mohammad Nazari-Sharabian, Moses Karakouzian, Relationship between Sunspot Numbers and Mean Annual Precipitation: Application of Cross-Wavelet Transform – A Case Study, J, 3, (2020). 10.3390/j3010007.
6. H. Abdel-Rahman, Beshir Marzouk, Statistical method to predict the sunspots number, NRIAG Journal of Astronomy and Geophysics, 7, (2018). 10.1016/j.nrjag.2018.08.001.
7. Elena Malyutina, Vladimir Shiryaev, Time Series Forecasting Using Nonlinear Dynamic Methods and Identification of Deterministic Chaos, Procedia Computer Science, 31, (2014), 1022–1031. 10.1016/j.procs.2014.05.355.
8. Sepp Hochreiter, Jürgen Schmidhuber, Long Short-term Memory, Neural Computation, 9, (1997), 1735–1780. 10.1162/neco.1997.9.8.1735.
9. Albert Liu, Oscar Law, Deep Learning, (2021). 10.1002/9781119810483.ch2.
10. Miroslav Kubat, Deep Learning, (2021). 10.1007/978-3-030-81935-4_16.
11. Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani, Deep Learning, (2021). 10.1007/978-1-0716-1418-1_10.
Study of Wireless Communication for Substation Automation
T. Sai Lakshmi Niharika, Siva Prasad Guntakala ,
Student, Assistant. Professor,
Department. Of MCA, Department. Of MCA,
KBN College (Autonomous), KBN College (Autonomous),
Vijayawada 520012, Vijayawada 520012,
Andhra Pradesh, India, Andhra Pradesh, India,
Email: sailakshminiharika@gmail.com Email: gspkbn@gmail.com

Abstract
In recent years, the automation of substations has been a pivotal focus in enhancing the efficiency and reliability of electric power systems. The emergence of the smart grid concept has enabled nations to realize fully automated power grids, redefining the landscape of energy management. The internationally recognized standard, IEC 61850, has introduced a paradigm shift in substation design, prominently through the replacement of traditional wired technologies with wireless alternatives like ZigBee, WiMAX, WLAN, and Wireless HART. The limitations associated with wired technologies, including the need for extensive trenching, labour-intensive wiring assemblies, and intricate installation processes, have prompted the exploration of wireless solutions, rendering wired communication obsolete.
This article presents a comprehensive review of state-of-the-art wireless technologies for substation automation. It delves into the distinct features and capabilities of each technology while addressing their respective shortcomings. The transformative impact of IEC 61850 is discussed in terms of fostering wireless technology adoption and streamlining substation design.

Keywords—Substation Automation, Smart Grid, Wireless Communication, IEC 61850, ZigBee, WiMAX, WLAN, Wireless HART.

INTRODUCTION
In an era characterized by relentless technological innovation, the study of wireless communication for substation automation has emerged as a focal point in the realm of power grid management and control systems. As the demands on modern electrical grids continue to evolve, driven by factors such as renewable energy integration and increasing grid complexity, the traditional paradigms of substation operation are being redefined. At the heart of this transformation lies the integration of wireless communication technologies, which not only promise to enhance the efficiency and reliability of substation automation but also pave the way for a more flexible and adaptable energy infrastructure. This study embarks on a journey into the intricate web of wireless communication solutions tailored for substation automation, exploring their potential to revolutionize how we monitor, manage, and secure the critical nodes of our power grids. The significance of wireless communication in substation automation cannot be overstated, as substations constitute the vital arteries of any power distribution network. They serve as the nexus where electricity is transformed, controlled, and dispatched to end-users, making their seamless operation paramount to grid reliability. Traditional wired communication systems, while dependable, often face limitations in terms of scalability, cost, and deployment flexibility. In this context, wireless technologies offer a promising alternative, enabling real-time data exchange, remote monitoring, and rapid response capabilities that can optimize substation performance while ensuring the highest standards of safety and reliability. Thus, this study delves into the intricate interplay between wireless communication and substation automation, unravelling the potential benefits and challenges that lie at the intersection of these two pivotal domains.

RELATED WORK
Several research papers have described the existing novel wireless technologies. Some significant papers have been referenced below, while 15 other papers have been reviewed in detail and the findings have been tabulated in Section VII B of this paper.
D. summarizes the various wireless communication devices used for electric power system automation, by providing in-depth details about their technical aspects as well as applications [2]. J. P. I. Iola et al. studied the evolutionary trend in various communication technologies used for substation automation along with the numerous technical and economic benefits that accompany the implementation of IEC 61850-9-2 in substations [3]. Josef Horalek et al. focus on the various communication protocols in substation automation with a main focus on the modern revolutionary standard IEC 61850 and its implementation in substation automation systems [4]. Palak P. Parikh et al. discuss the various popular wireless technologies used for smart grid applications with respect to their functions as well as the challenges they face [5].

OVERVIEW

A. Substations
Substations are integral components of electrical power systems, acting as intermediate points where voltage levels are transformed, controlled, and distributed. These facilities play a crucial role in ensuring the efficient transmission of electricity from power generation sources to end-users. Substations house a variety of equipment such as transformers, circuit breakers, switches, and monitoring devices, which enable voltage regulation, fault protection, and the redirection of power along different paths. By facilitating these functions, substations contribute to the reliability, stability, and effective management of the overall power grid, serving as key nodes for power transformation and distribution.

Substation Automation
Substation automation refers to the integration of advanced technology and communication systems within electrical substations to enable remote monitoring, control, and management of equipment and processes. This involves the deployment of intelligent electronic devices, sensors, and communication protocols to facilitate real-time data exchange, allowing operators to remotely supervise and regulate various substation components such as transformers, breakers, and relays. Substation automation enhances operational efficiency, reduces maintenance costs, and enables rapid response to faults or changes in the power grid, thereby improving overall system reliability, minimizing downtime, and contributing to the evolution of smarter and more resilient power infrastructure. Based on their function, substation automation equipment is categorized under two subgroups [3]:

a) Primary equipment
These are mainly used to perform various functions such as switching and control of voltage level. Transformers and switchgears fall under this category.

b) Secondary equipment
These are the equipment used for the automation of the entire substation as well as for the protection of the primary equipment, and mainly consist of protection, control and communication devices.

B. Levels of SAS
Modern SAS can be divided into three levels, i.e., station, bay, and process levels [3]:

a) Station Level
The station level comprises the HMI (Human Machine Interface), database, computer and operator's workplace. Information from the bay-level IEDs is analysed here.

b) Bay Level
The bay level consists of circuit breakers and associated isolators, earth switches and instrument transformers. The IEDs provide bay-level functions which include control, monitoring and protection.

c) Process Level
The process level contains all the switchyard devices, and its main function is to extract the information generated by the switchgears and instrument transformers and transfer it to the bay-level devices. A small illustrative sketch of this three-level hierarchy is given below.
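To make the station/bay/process hierarchy concrete, here is a small illustrative sketch in Python. The class names, fields and sample values are hypothetical and only mirror the description above; this is not an IEC 61850 data model.

# Illustrative model of the three SAS levels described above
# (hypothetical names, not an IEC 61850 implementation).
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessDevice:          # process level: switchyard equipment
    name: str                 # e.g. "CT-1" (instrument transformer) or "CB-1" (breaker)
    kind: str                 # "instrument_transformer", "switchgear", ...

    def sample(self) -> dict:
        """Return a raw measurement/status that is passed up to the bay level."""
        return {"device": self.name, "kind": self.kind, "value": 0.0}

@dataclass
class BayIED:                 # bay level: IEDs doing control, monitoring, protection
    bay: str
    devices: List[ProcessDevice] = field(default_factory=list)

    def collect(self) -> List[dict]:
        return [d.sample() for d in self.devices]

@dataclass
class StationHMI:             # station level: HMI/database analysing bay-level data
    ieds: List[BayIED] = field(default_factory=list)

    def snapshot(self) -> List[dict]:
        return [record for ied in self.ieds for record in ied.collect()]

if __name__ == "__main__":
    bay1 = BayIED("Bay-1", [ProcessDevice("CT-1", "instrument_transformer"),
                            ProcessDevice("CB-1", "switchgear")])
    station = StationHMI([bay1])
    print(station.snapshot())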
C. NEED FOR COMMUNICATION IN SUBSTATIONS
The need for communication in substations is paramount to ensure the efficient and coordinated operation of electric power systems. Substations serve as pivotal nodes within the power grid, facilitating the transformation, distribution, and monitoring of electricity. Effective communication systems within substations enable real-time data exchange between various intelligent devices, including circuit breakers, relays, sensors, and controllers, allowing for swift decision-making, fault detection, and system optimization. This seamless communication is vital for maintaining grid stability, minimizing downtime, and enabling remote monitoring and control, ultimately enhancing the overall reliability, safety, and performance of the power infrastructure.

IEC 61850 is an internationally recognized standard that defines a comprehensive framework for the design and operation of communication networks and systems in substations within the realm of power utility automation. This standard facilitates the integration of various intelligent electronic devices, enabling seamless communication, interoperability, and efficient management of power systems. IEC 61850 encompasses standardized data models, communication protocols, and engineering methodologies, revolutionizing the way substations are designed, operated, and maintained by replacing traditional wired technologies with advanced digital communication methods, thus enhancing reliability, flexibility, and scalability in modern power grids.

D. SMART GRID
Several countries have developed a completely automated power grid which is termed a smart grid [3]. Smart grids have proved to be revolutionary in the development, integration, and automation of electric power systems. They have provided several benefits such as cost reduction, energy saving and a more transparent and enhanced quality of electric power distribution. Advances in smart grid technology have led to the implementation of self-healing capabilities, which refers to the ability of the smart grid to restore power automatically in the event of an outage. However, the initial construction of smart grids needs huge investments. Fully automated substations, now referred to as digital substations, form an integral part of smart grids as they perform several vital functions throughout the electric power system, from generation of power to distribution to consumers.

Advantages over Wired Communication
While the advantages are numerous, some of the most important ones have been elaborated below:

a) Low Cost
In substations, the cost for wired systems is high due to the need for trenching to run cables from the control house to IEDs in the yard. For wireless communication, however, no trenching is required, thus leading to savings on this account. Wireless communication can be deployed with greater ease as no trenches and conduits are required [2], making it highly portable. An estimation by ABB civil engineers concluded that the cost of trenching to retrofit a substation yard is about $40,000 to $45,000, with costs of $100,000 to $150,000 for larger installations. Another report says that the expense for digging trenches to wire IEDs in an existing substation was about $250,000.

b) Reduced Labour
Wireless communication also reduces the dependence on manpower, since wired communication requires engaging a contractor to dig, run cables and so on. In the case of trenching, there may also arise issues such as union rules related to contract labour, and political and judicial issues.

c) Ease of Expansion
Adoption of wireless communication allows for easy incorporation of future expansions. In the case of wired communication, adding an extra IED or a pole requires a new conduit to be laid, thus involving additional trenching. Wireless networks use
mesh technology. Sensors that use IEEE 802.15.4 based radio transceivers can function for several years even in harsh conditions without any external power.

d) Extended Range
Wireless technologies provide an extended range (9 kHz to 300 GHz) for data acquisition and communication, making the lengthy cables required for wired communication redundant.

e) Reduced Susceptibility to Environmental Factors
Wireless technologies also present reduced susceptibility to environmental factors that may disrupt communication, such as unstable ground, lightning, natural or man-made disasters like earthquakes and cyclones, and accidental or orchestrated human interruption or damage to communication devices.

Types of Wireless Communication
Some of the most significant wireless communication technologies are discussed below:

1. ZigBee
Based on the standards of IEC 61850, substations previously carried out communication via an Ethernet bus (a type of wired network). However, in recent times this wired Ethernet has been replaced by ZigBee, one of the most popular wireless communication technologies [1]. Being a cost-effective, low-power and reliable network, ZigBee is suitable for Wireless Sensor Networks (WSNs). It can also carry out direct load control along with remote meter reading and several other functions.

2. WiMAX
WiMAX stands for Worldwide Interoperability for Microwave Access and is part of the Wireless Metropolitan Area Network (WMAN) family [2]. It provides high-speed data transfer over large distances, and Wireless Automatic Meter Reading (WAMR) based on WiMAX is used for remote revenue metering, hence reducing human effort as well as errors. It also offers many advantages to electric utilities and service providers in the smart grid. The WiMAX communication network is also extensively used for monitoring, control and protection in SAS as well as for rapid outage detection and restoration using two-way communication.

3. Wireless LAN
IEEE 802.11 based Wireless LAN (WLAN), also known as Wireless Ethernet, is one of the most widely deployed and efficient wireless data networks in the world (second only to cellular networks) [2]. It provides high-speed communication and offers several benefits over wired LAN, as it is more cost-efficient, mobile, and easy to install. IEEE 802.11b, commonly known as Wi-Fi, is a type of WLAN and can carry out data transfer at exceptionally high speeds. WLAN can be used to enhance distribution substation automation and protection as well as power line protection between two substations.

4. TVWS (TV White Space)
TV White Space (TVWS) refers to the unused or underutilized frequencies in the radio spectrum that were originally designated for analogue television broadcasting. With the transition to digital broadcasting, these frequency bands have become available for other wireless communication applications. TVWS technology enables dynamic and opportunistic use of these vacant channels for various purposes like broadband internet access, rural connectivity, smart city infrastructure, and IoT networks. Its significance lies in its ability to provide extended coverage, improved penetration through obstacles, and efficient spectrum utilization, thereby addressing connectivity gaps and promoting innovative wireless services in both urban and rural settings.
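As a rough illustration of why message delivery time matters when choosing among these technologies (see, for example, the WirelessHART response-time limitation in the comparison table below), here is a small simulation sketch in Python. The latency figures, loss probabilities and the 4 ms transfer-time budget are assumptions for demonstration only, not measured values or normative IEC 61850 limits.

# Toy simulation: fraction of messages from a yard IED that arrive within an
# assumed transfer-time budget over a lossy wireless link. All numbers are
# illustrative assumptions, not measurements of any real technology.
import random

def simulate(mean_latency_ms, jitter_ms, loss_prob, budget_ms=4.0, n=10_000):
    ok = 0
    for _ in range(n):
        if random.random() < loss_prob:          # lost frame -> one retransmission
            latency = 2 * mean_latency_ms + random.uniform(0, jitter_ms)
        else:
            latency = mean_latency_ms + random.uniform(0, jitter_ms)
        if latency <= budget_ms:
            ok += 1
    return ok / n

random.seed(1)
# Hypothetical link profiles: (mean latency ms, jitter ms, loss probability).
profiles = {"wired Ethernet": (0.5, 0.2, 0.001),
            "fast WLAN link": (2.0, 1.5, 0.01),
            "low-power mesh": (8.0, 4.0, 0.02)}
for name, (lat, jit, loss) in profiles.items():
    print(f"{name:15s}: {simulate(lat, jit, loss):.1%} of messages within 4 ms")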
Summary of reviewed wireless devices/mechanisms (Year — Device/Mechanism — Features — Limitations — Future Work):

2011 — Bluetooth
• Features: used for remote control, supervision, monitoring, as well as other functions to improve network communication; provides improved efficiency, safety and economic benefits to the power grid; facilitates troubleshooting and maintenance.
• Limitations: higher developmental cost and risk; shorter range compared to other wireless devices.
• Future work: development of auxiliary equipment to assist Bluetooth technology to cover long-range communication.

2012 — Microsoft Surface RT
• Features: ran the Windows RT OS, offering a touch-optimized interface and apps from the Windows Store; magnetic keyboard covers (Touch Cover, Type Cover) for typing and productivity.
• Limitations: limited app selection due to Windows RT's ARM architecture and inability to run traditional x86 Windows programs.
• Future work: Microsoft discontinued support and updates for Surface RT due to its ARM architecture, limiting its future usability and advancement opportunities.

2013 — Wireless LAN
• Features: offers increased data rates (up to 600 Mbps) under the newly released IEEE 802.11n standard; increased flexibility, lower cost and higher portability when compared to wired technologies; advanced security.
• Limitations: the average radio noise level should be higher than the threshold value, which can result in higher delay and lower throughput; the study has identified this to be 16 dB.
• Future work: development of a system with low delay and high throughput.

2013 — ZigBee
• Features: low cost; used for low-power applications such as smart metering and consumer electronics; allows building of both small and large networks, while providing good security for the same.
• Limitations: while the integration of ZigBee and IEC 61850 is able to achieve the transfer of low- and medium-speed messages if the channel usage is under a certain percentage, it is unable to do so for high-speed communication, which restricts the usage of ZigBee.
• Future work: development of the ZigBee system in a manner that allows it to accommodate the transfer of high-speed messages, which will open up this system to a greater number of applications.

2014 — Wireless HART
• Features: first standard to be certified by IEC for wireless communications; channel hopping for each transmission; increased reliability due to different routing mechanisms; highly reliable even in harsh environments.
• Limitations: does not meet the IEC 61850 response-time standard.
• Future work: development of a dedicated IEC gateway in order to meet the critical response-time requirements.

2015 — GOOSE
• Features: provides more flexibility compared to the GSSE mechanism; high efficiency; a sequence re-transmission mechanism based on Ethernet multicast technology is applied in order to guarantee reliability and timeliness.
• Limitations: a star topology network is used; switches used in the straight collecting and tripping scheme have low operating experience but high cost; in the point-to-point SV (Sample
• Future work: the GOOSE transmission mechanism can provide theoretical references for the selection of practical intelligent substation communication networks.
CHALLENGES FACED BY WIRELESS COMMUNICATION TECHNOLOGIES
Channel behaviour, the corona effect, accidental or directed interference or jamming, eavesdropping, gap discharge breakdown and unauthorized modification of communications which are not protected by authentication and encryption are some of the major challenges faced by wireless technologies [23]. The targets within the substations are usually the devices related to operation regulation, such as IEDs and RTUs. Wireless technologies operating in unlicensed spectrum are more vulnerable to interference or noise [7]. While operation in licensed spectrum may be the more appropriate option for effective communication, it is comparatively much more expensive. The major challenge that wireless communication faces is security. Encryption of messages and setting up a firewall inside the IEC 61850 communication network can be adopted in order to reduce susceptibility to cyber-attacks. There is scope for improvement based on ongoing research to develop wireless communication technologies which are less susceptible to security-related attacks and to the adverse effects of harmonics.
CONCLUSION AND FUTURE WORK
Automation of substations has provided numerous benefits in almost all areas of their functioning. Automation ensures that the number of non-supervised functions is reduced to almost zero, and hence the time needed for the identification of an error or fault is greatly reduced. The advantages of wireless technology over wired technology for communication have been discussed, along with the most common challenges faced by almost all wireless communication technologies. While wireless communication technologies have eliminated many of the issues faced in the adoption of wired communication technologies, there is a need to further improve these systems to ward off security threats. Most present wireless communication devices do not have appropriate security measures in place, which can greatly compromise the security of the data being transferred. There is a dire need to update the present devices in order to incorporate the same.

REFERENCES
1. D. Nowak, Ł. Krzak, and C. Worek, "Integration of ZigBee and IEC 61850 networks for a substation automation system," 4th IEEE PES Innovative Smart Grid Technologies Europe (ISGT Europe), October 6–9, Copenhagen, 2013.
2. Suhag A. R., Arun T. V., "Substation Automation Systems," IJSER Journal, Volume 7, Issue 05, May 2016, pp. 215–218.
3. V. Desai, A. Desai, Sharma & Jariwala (2016), "Implementation of Remote Monitoring of Substation Equipments Using GSM," IJAREEIE Journal, Vol. 7. doi:10.15662/IJAREEIE.20156.0506136.
4. Liu Yunfa, Liu Guangfa, Cheng Ruonan, Yuan Zihao, Zhou Hanfang, "Intelligent substation communication system [J]," Modern Industrial Economy and Informationization, 2019, 9(4): 87–88. DOI: 10.16525/j.cnki.14-1362/n.2019.04.36.
5. Mohd. A. Aftab, S. M. S. Hussain, I. Ali, and T. S. Ustun, "IEC 61850 based substation automation system: A survey," International Journal of Electrical Power & Energy Systems, vol. 120, p. 106008, Sep. 2020.
6. Y. Edalat, O. Katia, and Jong-Suk Ahn, "Smart adaptive collision avoidance for IEEE 802.11," Ad Hoc Networks, vol. 124, 2022.
7. Y. Ren, J. Cheng, and J. Chen, "Design of a Substation Secondary Equipment-Oriented Error Prevention System Using Wireless Communication Technology and Edge Node Cooperation," Wireless Communications and Mobile Computing, vol. 2022, no. 6249549, 2022.
A Survey Of Data Mining Techniques For Cyber Security
Mudraboina Ganesh, Siva Prasad Guntakala,
Student, Assistant Professor,
Department of MCA, Department of MCA,
K.B.N. College (Autonomous), K.B.N. College (Autonomous),
Vijayawada-520001, Vijayawada-520001,
Andhra Pradesh, India, Andhra Pradesh, India,
Email: gspkbn@gmail.com Email: gspkbn@gmail.com

Abstract—
Cyber security is the area that deals with protection from cyber terrorism. Cyber-attacks include access control violations, unauthorized intrusions, and denial of service, as well as insider threats. Security is a very important property of an information system, especially today, when computers are interconnected via the internet. Because no system can be absolutely secure, the timely and accurate detection of intrusions is necessary. For this purpose, Intrusion Detection Systems (IDS) were designed. An IDS in combination with data mining can take security to the next level. Data mining is the process of posing queries and extracting patterns, often previously unknown, from large quantities of data using pattern matching or other reasoning techniques. This paper gives an overview of the different data mining techniques which can be used in cyber security for intrusion detection.

I. INTRODUCTION
Cyber security is concerned with protecting computer and network systems from corruption due to malicious software including Trojan horses and viruses. Data mining has several cyber security applications; for example, anomaly detection techniques could be used to detect unusual patterns and behaviors. Data mining is the process of identifying patterns in large datasets. Data mining techniques are heavily used in scientific research as well as in business, mostly to gather statistics and valuable information to enhance customer relations and marketing strategies. Data mining has also proven a useful tool in cyber security solutions for discovering vulnerabilities and gathering indicators for a baseline. In this paper, we will focus on data mining applications for cyber security. To comprehend the mechanisms to be adopted in order to safeguard computers and networks, it is imperative to understand the types of threats that endanger the cyber network.

Cyber Security
Cyber security is a set of rules and technologies which are meant to protect our systems, networks, and data from unauthorized access, attacks, and unwanted interruptions. They aim to maintain the confidentiality, integrity, and availability of information and information management systems through various cyber defense systems. To secure cyber infrastructure against potentially malicious threats, a growing collaborative effort between cyber security professionals and researchers from institutions, private industries, academia, and government agencies has engaged in exploiting and designing a variety of cyber defense systems.

[Figure: Conventional cyber security system — cyber threats met by network defence (firewall, spam filter, network intrusion detection) and system/host defence (firewall, antivirus, host intrusion detection).]

Cyber security systems are composed of network security systems and host security systems. Each of these has a firewall, antivirus software, and an intrusion detection system (IDS). IDSs discover, determine, and identify unauthorized use, duplication, alteration, and destruction of information systems.

The second line of cyber defense is composed of reactive security solutions, such as intrusion detection systems (IDSs). IDSs detect intrusions based on the information from log files and network flow, so that the extent of damage can be determined, hackers can be tracked down, and similar attacks can be prevented in the future. Data mining or knowledge discovery (KDD) is a method used to analyze data from a target source and compose that feedback into useful information. In cyber security, data mining techniques are being used to identify doubtful conditions.

Cyber Terrorism, Threats and Malicious Software
Nowadays the internet has allowed for a vast exchange of information. This has created a cyber space in which
terrorists can implement attacks. Cyber-terrorism, according to O'Leary (2010), is committed through the use of cyberspace or computer resources. This use of cyber space results in there no longer being simply a physical threat of terrorism. Janczewski and Colarik (2008) define cyber terrorism as: "Cyber terrorism means pre-meditated, politically motivated attacks by sub-national groups or clandestine agents or individuals against information and computer systems, computer programs, and data that result in violence against non-combatant targets." Cyber terrorism is one of the major threats to the world now. Over recent decades, it has become apparent that our society is becoming increasingly dependent on information technology.

Take the example of a banking system. If terrorists attack such a system and deplete accounts of funds, the bank could lose millions or billions of dollars. By crippling the computer system, millions of hours of productivity could be lost, which is ultimately equivalent to money loss. Even a simple power failure at work could cause several hours of productivity loss, which ends in financial loss. Therefore, it is imperative that our information systems be secured. Threats can occur from outside or inside of an organization. Malicious software consists of code, procedures or programs which are meant to damage systems, networks, clients, servers and databases. The most common types are viruses, worms, and Trojan horses. Intruders try to tap into the network and get vital information; an intruder can be a human or malicious software set up by humans.

Data Mining
In general, data mining is a process that involves analyzing information, predicting future trends, and making proactive, knowledge-based decisions based on large datasets. According to Silltow (2012), data mining automates the detection of relevant patterns in a database, using defined approaches and algorithms to look into current and historical data that can then be analyzed to predict future trends.

While the term data mining is usually treated as a synonym for Knowledge Discovery in Databases (KDD), it is actually just one of the steps in this process. The main goal of KDD is to obtain useful and often previously unknown information from large sets of data.

Due to the availability of large amounts of data in cyber infrastructure and the increasing number of cyber criminals attempting to gain unauthorized access to that data, there is a need for capabilities to address the challenges of cyber security. Data mining tools predict future trends and behaviors by reading through databases for hidden patterns. Learning these behaviors is important, as they can identify and describe structural patterns, which helps to generate knowledge on the basis of that data and helps the organization to answer questions that were previously too time-consuming to resolve. Learning user patterns or behaviors is critical for intrusion detection and attack prediction.

Data mining application for cyber security is the use of data mining techniques to detect cyber threats. Data mining, in combination with machine learning, is being applied to problem areas such as intrusion detection and auditing in cyber security, and it is a very effective technique. In recent years, many IT industry giants such as Comodo, Symantec, and Microsoft have started using data mining techniques for malware detection.

DATA MINING METHODS FOR CYBER SECURITY
This section describes the different DM methods for cyber security. Many methods are used for mining big data, but the following are the most common. Each technique is described in some detail, and references to seminal works are provided.

Association Rule
Association rule mining finds the relations among variables in a database. Let's take an example: IF (A AND B) THEN C. This rule describes that if A and B are present, then C is also present. Association rules have metrics that tell how often a given relationship occurs in the data.
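As a rough sketch of how the two standard rule metrics (support and confidence) can be computed for such IF–THEN rules, consider the following minimal Python example; the transaction data and helper names are made up purely for illustration.

# Minimal illustration of support and confidence for a rule {A, B} -> {C}
# over a toy transaction database (made-up data, standard-library Python only).
transactions = [
    {"A", "B", "C"},
    {"A", "B"},
    {"A", "C"},
    {"A", "B", "C"},
    {"B", "C"},
]

def support(itemset, db):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in db) / len(db)

def confidence(antecedent, consequent, db):
    """P(consequent | antecedent) estimated from the transactions."""
    return support(antecedent | consequent, db) / support(antecedent, db)

rule_lhs, rule_rhs = {"A", "B"}, {"C"}
print("support:", support(rule_lhs | rule_rhs, transactions))       # 0.4
print("confidence:", confidence(rule_lhs, rule_rhs, transactions))  # ~0.67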
Association rule mining was introduced by Agrawal et al. as a way to discover interesting co-occurrences in supermarket data. It finds frequent sets of items (i.e., combinations of items that are purchased together in at least N transactions in the database), and from the frequent item sets such as {X, Y} it generates association rules of the form X → Y and/or Y → X.

A simple example of an association rule pertaining to the items that people buy together is: IF (Bread AND Butter) → Milk. The example says that if a person buys bread and butter, then he also buys milk.

The study by Brahmi is an example of association rules applied on the DARPA 1998 data set to draw the relationships between TCP/IP parameters and attack types. The work is based on multidimensional association rule mining, in which there is more than one variable in the rules, such as IF (service AND src_port AND dst_port AND num_conn) THEN attack type, which is an example of a four-dimensional rule. The best results were obtained using six-dimensional rules. The approach is promising for building attack signatures.

Clustering
Clustering is used to assign similar data objects to groups called clusters, so that the objects in one cluster are more similar to each other than to objects in other clusters. In simple words, this process is used to identify data items that have similar characteristics. Clustering [4] is a set of techniques for finding patterns in high-dimensional unlabeled data. The main advantage of clustering for intrusion detection is that it can learn from audit data without requiring the system administrator to provide explicit descriptions of various attack classes.

The decision tree technique
The decision tree is a tree-like structure having leaves, which represent the classifications, and branches, which represent the conjunctions of features that lead to those classifications. A decision tree depends on if–then rules, but it does not require parameters and metrics. This simple and interpretable structure allows decision trees to solve multi-type attribute problems. Decision trees can also manage missing values or noisy data. However, they cannot guarantee the optimal accuracy that other machine-learning methods can. The advantage of decision trees is simple implementation.

The neural network
Neural networks are inspired by the brain and composed of interconnected artificial neurons capable of certain computations on their inputs. The input data activate the neurons of the first layer of the network, whose output is the input to the second layer of neurons in the network. Neural networks require long training times and are therefore more suitable for applications where this is feasible. In intrusion detection systems two kinds of NNs are used:
• multilayered feedforward neural networks
• Kohonen's self-organizing maps
These techniques are used to model complex relationships between inputs and outputs and to discover new patterns. The combination of a self-organizing map and a back-propagation neural network supplies a very efficient means for the detection of new intrusions.

Statistical techniques
Statistical-based intrusion detection systems (SBIDs) take a different approach to intrusion detection. The concept of the SBID system is simple: it determines "normal" network activity, and then all traffic that falls outside the scope of normal is flagged as anomalous (not normal). It involves the collection of data relating to the behavior of legitimate users over a time period. Statistical tests are then applied to the observed behavior to determine, with a high level of confidence, whether that behavior is not legitimate behavior. It falls into two broad categories:
• Threshold detection
• Profile-based anomaly detection
This process of traffic analysis continues as long as the SBID system is active, so, assuming network traffic patterns remain constant, the longer the system is on the network, the more accurate it becomes.

DATA MINING FOR MALWARE DETECTION
Data mining is one of the four detection methods used today for detecting malware. The other three are scanning, activity monitoring, and integrity checking.

When building a security app, developers use data mining methods to improve the speed and quality of malware detection as well as to increase the number of detected zero-day attacks. There are three strategies for detecting malware:
• Anomaly detection
• Misuse detection
• Hybrid detection
Anomaly detection is the identification of rare events or observations which raise suspicions by differing significantly from the majority of the data. It involves modeling the normal behavior of a system or network in order to identify deviations from normal usage patterns. Anomaly-based
detection can also detect previously unknown attacks and can be used to define the signatures for misuse detectors. The main problem with anomaly detection is that any deviation from the normal, even if it is legitimate behavior, will be reported as an anomaly, thus producing a high rate of false positives.

Misuse detection, also known as signature-based detection, identifies only known attacks based on examples of their signatures. It refers to the detection of attacks by looking for specific patterns, such as byte sequences in network traffic, or known malicious instruction sequences used by malware. This technique has a lower rate of false positives but cannot detect zero-day attacks.

A hybrid approach combines anomaly and misuse detection techniques in order to increase the number of detected intrusions while decreasing the number of false positives. It does not build any models, but instead uses information from both harmful and clean programs to create a classifier – a set of rules or a detection model generated by the data mining algorithm. Then the anomaly detection system searches for deviations from the normal profile and the misuse detection system looks for malware signatures in the code.

Detection process
When using data mining, malware detection consists of two steps:
• Extracting features
• Classifying/clustering
Machine learning algorithms learn patterns from fixed-length feature vectors, and therefore feature extraction is the first step before using these algorithms for malware analysis. For features that are in the form of sequences, such as sequences of code bytes, operation codes, system calls, or any API calls, the creation of a representative feature vector is a non-trivial problem. Feature extraction can be performed by running static or dynamic analysis, with or without actually running the harmful software. A hybrid approach that combines static and dynamic analysis may also be used.

During classification and clustering, file samples are classified into groups based on feature analysis. To classify file samples, we need to build a classifier using classification algorithms such as Artificial Neural Network (ANN), Decision Tree (DT), Support Vector Machines (SVM) or Naive Bayes (NB). Clustering is used for grouping malware samples that share similar characteristics. Using machine learning techniques, each classification algorithm constructs a model that represents both legitimate and malicious classes. Training a classifier on such a file sample collection makes it possible to detect newly released malware. The effectiveness of data mining techniques for malware detection critically depends on the features which are extracted and the categorization techniques used.

DATA MINING FOR INTRUSION DETECTION
Apart from detecting malware code, data mining can be effectively used to detect intrusions and analyze audit results to detect anomalous patterns too. Malicious intrusions may include intrusions into operating systems, networks, servers, web clients and databases.
There are two types of intrusion attacks we can detect using data mining methods:
• Host-based attacks, when the intruder focuses on a particular machine or a group of machines
• Network-based attacks, when the intruder attacks the entire network
Network-based defense systems control network flow by means of a network firewall, antivirus, spam filter and network intrusion detection techniques. Host-based defense systems control incoming data in a workstation by means of a firewall, intrusion detection techniques and antivirus installed in the host systems.

Conventional approaches to cyber defense are mechanisms designed into authentication tools, firewalls, and network servers that monitor, track, and block viruses and other malicious attacks. For example, the Microsoft Windows operating system has a built-in Kerberos cryptography system that protects user information. Antivirus software is designed and installed in personal computers and cyber infrastructures to ensure customer information is not used maliciously. These approaches create a protective shield for cyber infrastructure. Data-capturing tools, such as Solaris BSM for SUN, Libpcap for Linux, and Winpcap for Windows, capture events from the audit files of resource information sources (e.g., the network). Events can be host-based or network-based depending on where they originate. If an event originates with log files, then it is treated as a host-based event. If it originates with network traffic, then it is treated as a network-based event. A host-based event includes a sequence of commands executed by a user and a sequence of system calls launched by an application. A network-based event includes network traffic data, e.g., a sequence of TCP/IP network packets. The data-preprocessing module filters out the attacks for which good signatures have already been learned.
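To make the classification step concrete, here is a small sketch using scikit-learn's decision tree on made-up feature vectors (for example, counts of suspicious API calls). The feature names, data and labels are entirely hypothetical and serve only to show the train-then-classify workflow described above, not any particular malware detector.

# Toy example of the classify step: a decision tree trained on made-up feature
# vectors for "legitimate" (0) and "malicious" (1) file samples.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Hypothetical features per sample: [n_suspicious_api_calls, n_network_ops, entropy]
legit = np.column_stack([rng.poisson(1, 200), rng.poisson(2, 200), rng.normal(4, 1, 200)])
malware = np.column_stack([rng.poisson(8, 200), rng.poisson(9, 200), rng.normal(7, 1, 200)])
X = np.vstack([legit, malware])
y = np.array([0] * 200 + [1] * 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))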
DATA MINING FOR FRAUD DETECTION
We can detect various types of fraud using data mining techniques, whether it is financial fraud, telecommunications fraud or computer intrusions. In general, data mining techniques for fraudulent activities can be classified into two categories according to the type of machine learning used: fraud can be detected with the help of supervised or unsupervised learning. Supervised learning for fraud detection involves classification of the available records into fraudulent and non-fraudulent categories; machines are then trained to identify records according to these categories. However, these methods are only capable of identifying frauds that have already occurred [7]. The unsupervised learning method for fraud detection only identifies the likelihood of some records being more fraudulent than others, without statistical-analysis assurance [8]. It helps in identifying privacy and security issues in data without using statistical analysis.

DATA MINING PROS AND CONS
Using data mining in cyber security lets us:
• process large datasets faster;
• create a unique and effective model for each particular use case;
• apply certain data mining techniques to detect zero-day attacks.
Data mining helps us quickly analyze huge datasets and automatically discover hidden patterns, which is critical when it comes to creating an effective anti-malware solution that is able to detect previously unknown threats. However, the final result of using data mining methods always depends on the quality of the data you use.

There are also certain drawbacks we need to know about:
• data mining is complex, resource-intensive, and expensive;
• building an appropriate classifier may be a challenge;
• potentially malicious files need to be inspected manually;
• classifiers need to be constantly updated to include samples of new malware;
• there are certain data mining security issues, including the risk of unauthorized disclosure of sensitive information.
When using data mining in cyber security, it is crucial to use only quality data. However, preparing databases for analysis requires a lot of effort, time, and resources. You need to clean all your records of duplicate, false, and incomplete information before working with them. Lack of information or the presence of duplicate records or errors can significantly decrease the effectiveness of complex data mining techniques. Only using accurate and complete data can ensure high quality of analysis.

CONCLUSION
In this paper we have reviewed different data mining techniques for cyber security. Data mining is a young interdisciplinary field, drawing from areas such as database systems, data warehousing, statistics, machine learning, data visualization, information retrieval, and high-performance computing.
Data mining has great potential as a malware detection tool. It allows you to analyze huge sets of information and extract new knowledge from it. When determining the effectiveness of the methods, there is not only one criterion but several that need to be taken into account, and depending on a particular IDS some might be more important than others. Another crucial aspect of data mining for cyber intrusion detection is the importance of the data sets used for training and testing the systems.
The main benefit of using data mining techniques for detecting malicious software is the ability to identify both known and zero-day attacks. However, since a previously unknown but legitimate activity may also be marked as potentially fraudulent, there is the possibility of a high rate of false positives.

REFERENCES
1. A. Mukkamala, A. Sung, and A. Abraham, "Cyber security challenges: Designing efficient intrusion detection systems and antivirus tools," in Enhancing Computer Security with Smart Technology, V. R. Vemuri, Ed. New York, NY, USA: Auerbach, 2005, pp. 125–163.
2. R. Agrawal, T. Imielinski, and A. Swami, "Mining association rules between sets of items in large databases," in Proc. Int. Conf. Manage. Data, Assoc. Comput. Mach. (ACM), 1993, pp. 207–216.
3. H. Brahmi, B. Imen, and B. Sadok, "OMC-IDS: At the cross-roads of OLAP mining and intrusion detection," in Advances in Knowledge Discovery and Data Mining. New York, NY, USA: Springer, 2012, pp. 13–24.
4. A. K. Jain and R. C. Dubes, Algorithms for Clustering Data. Englewood Cliffs, NJ, USA: Prentice-Hall, 1988.
5. Sumeet Dua and Xian Du, Data Mining and Machine Learning in Cybersecurity.
6. K. Hornik, M. Stinchcombe, and H. White, "Multilayer feedforward networks are universal approximators," Neural Networks, vol. 2, pp. 359–366, 1989.
7. R. Bolton and D. Hand, "Statistical fraud detection: A review," Statistical Science, 17(3), pp. 235–255, 2002.
8. https://www.apriorit.com/dev-blog/527-data-mining-cyber-security
9. Anna L. Buczak and Erhan Guven, "A Survey of Data Mining and Machine Learning Methods for Cyber Security Intrusion Detection," IEEE Communications Surveys & Tutorials, vol. 18, no. 2, Second Quarter 2016.
10. "Spoofing," Oxford Reference. Retrieved 8 October 2017.
11. Marcel, Sebastian; Nixon, Mark; Li, Stan, eds. (2014). Handbook of Biometric Anti-Spoofing: Trusted Biometrics under Spoofing Attacks. London: Springer. doi:10.1007/978-1-4471-6524-8. ISBN 978-1-4471-6524-8. ISSN 2191-6594. LCCN 2014942635. Retrieved 8 October 2017 via Penn State University Libraries.
12. Agrafiotis, Ioannis; Nurse, Jason R. C.; Goldsmith, M.; Creese, S.; Upton, D., "A taxonomy of cyber-harms: Defining the impacts of cyber-attacks and understanding how they propagate," Journal of Cybersecurity, 2018, 4: tyy006. doi:10.1093/cybsec/tyy006.
13. Agrawal, A., Mohammed, S., Fadhil, J., "Ensemble technique for intruder detection in network traffic," International Journal of Security and Its Applications, 2019, 13(3): 1–8. doi:10.33832/ijsia.2019.13.3.01.
14. Ahmad, I., and R. A. Alsemmeari, "Towards improving the intrusion detection through ELM (extreme learning machine)," CMC Computers Materials & Continua, 2020, 65(2): 1097–1111. 10.32604/cmc.2020.011732.
15. Bouyeddou, B., Harrou, F., Kadri, B., Sun, Y., "Detecting network cyber-attacks using an integrated statistical approach," Cluster Computing—The Journal of Networks, Software Tools and Applications, 2021, 24(2): 1435–1453. doi:10.1007/s10586-020-03203-1.
A Study On Green Cloud Computing


Udayagiri Ram Kumar, Siva Prasad Guntakala,
Student, Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous), KBN College (Autonomous),
Vijayawada-520001, A.P, India Vijayawada-520001, A.P, India
Email: ramudayagiri0@gmail.com Email: gspkbn@gmail.com

stored in various data centers across the whole


world. These data
ABSTRACT
Green cloud computing is designing cloud
computing resources in such a way that the impact centers need a lot of energy and power to be able to
on the environment is minimized. Green clouds can run smoothly and efficiently all the time and this is
save a lot of energy and also can reduce operating also a necessity because you would never want the
costs. This is a necessity today as the global server to go down. Almost all the fields in the
warming of the planet continues to rise. As more modern age are dependent on Information
businesses move to the cloud, it is imperative that Technology. Healthcare, Banking, Media,
all cloud providers embrace green cloud Automobiles, etc. all use IT on a regular basis and
computing. This paper provides an overview of their work would almost stop without it right now.
green cloud computing and also discusses strategies There are many other fields where Information
for its implementation. It also includes applications Technology is used on a vast scale which has led to
and challenges faced in implementing green clouds. increasing energy consumption over the years.
Eventually, which leads to increasing costs. There
Keywords—Cloud computing, Green cloud have also been many advancements in the field of
computing, Environment Sustainability, Energy Technology in the recent years. Among which, one
Consumption. of them is Cloud Computing which is in very high
I. INTRODUCTION demand right now. Basically, it means that we can
use various computing services without having to
Cloud computing is the use of various components actually use any extra hardware on our side.
of a computer system, such as storage and Everything happens remotely over the internet and
computing power (without any external use by the we get to use the computational power of a remote
user) over the Internet. In today’s world, cloud server on our local machine. Cloud Computing has
computing has gained a lot of scrutiny due to the changed the way we use technology now. It has
increasing demand for computing and data storage. proved to be of massive help in various fields. But
Given these technological advances, the next step the thing is, with the increase in the use of Cloud
should be to create environmentally friendly, Computing we are also having to deal with a ton of
energy efficient and cost-effective solutions. IT data which is even more than before now. And as
departments use a lot of power and energy. This most of the data centers have started using the
could lead to global climate change, such as Cloud Computing architecture now, the amount of
increased CO2 emissions and energy shortages. As Carbon emissions have also increased which can
a result, there is a need for “green cloud cause serious damage to the environment. We
computing,” which is able to provide cost-effective cannot even restrict the use of Cloud Computing
broadcasts and save energy on IT resources. now because it has become a very important part of
Efficient use of computers and other devices will our lives, but what we can do is to develop ways to
increase productivity, energy-saving remote reduce these harmful emissions. Then, there’s IoT
controls, and reduce electronic waste. This version (Internet of Things) which deals with physical
is called Green Computing. Green computing can products that use sensors and software’s inside
improve resource efficiency as well as productivity. them to connect and communicate over the
Internet. Whenever we hear terms like
II. Need for green cloud computing “connecting”, “data”, “internet” , etc. in the same
sentence we should pretty much figure that all these
Today in the modern world, our lives cannot be
things are possible due to Cloud Computing and
imagined without the use of any kind of
large data centers present at the location of the
technology. Due to the use of a lot of technology
remote servers. IoT thus, is also dependent on data
based products, we are also having to deal with a
centers for most of its infrastructure. And data
large amount of data. This huge amount of data is
centers as it is logical consume a lot of energy. The

140
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

energy consumption in fact is on a steady rise due a great deal of hardware and operating costs,
to the innovation and emergence of new consolidation of tasks and reduction in energy use
technologies and products. The field of Health Care (power consumption) by switching off physical
Medicine also makes extensive use of technology devices. Virtualization also allows to move a
and also the use of Cloud Computing has increased running virtual machine from one host to another
over the years in the field. These things have with no downtime and distributed power
revolutionized the way the healthcare sector works. management possible.
We can see the use of equipment powered by the
latest technology, improved equipment for surgical
work, being able to consult doctors remotely over B. Green Scheduler
the internet, online healthcare platforms and many
other things. This has become possible due to The green scheduler or the green scheduling
Technology and Cloud Computing. But at the same algorithm determines which servers should be
time, the healthcare field also faces challenges turned on and turned off. When the load increases,
regarding how to make energy efficient and safe the server will be turned on and when the load
use of such tools and techniques. A study also decreased the server will be turned off
shows that Healthcare also plays a major role in the automatically. Since, servers take time to load
carbon footprint of countries like the USA, completely, they should be switched on before it is
Australia, etc. The energy use of the healthcare needed. The servers should also not be loaded more
field is estimated to grow even more in the coming than its capacity. This leads to reduction in energy
years with the emergence of new technologies. So, and power consumption at the data centre. It also
we need to find a way to reduce the amount of helps in reducing the load on the servers at any
energy consumed while still being able to utilize instant of time since at any given point of time
these resources. This has made way for Green there are already requisite number of servers which
Cloud Computing which basically means the use of are on.
Cloud Computing in an environment friendly
manner. Green Cloud Computing tries to provide C. Datacentre Energy Efficient
eco-friendly and economically viable use of
Network Aware Scheduling Algorithm This
computing resources while still providing the same
algorithm helps in reducing the cost and
value to the users. Many companies worldwide
operational expenses of data centres by minimizing
have also started investing in the developments of
the total energy consumption. It selects the best fit
Green Cloud Computing. These are all the reasons
resources for executing a particular task or problem
which have made Green Cloud Computing
on the basis of the load as well as taking into
necessary in today’s world.
consideration the various components present at the
III. Approaches data centre. Also, for managing the workload, a
deadline- based model is employed which aims at
Green cloud computing provides multiple solutions completion of each task within a specified amount
to alleviate the impact of cloud computing on the of time. Hence, it achieves workload efficiency by
environment. There are various different preventing the components (servers) from
approaches and techniques proposed for achieving overloading and network congestion. However,
green cloud computing. There are mainly three there is a small amount of increase in the number of
ways: first is the hardware optimization which servers that are running.
includes reducing the use of energy and making it
economically efficient, second is software
optimization which includes developing ways to
D. Nano Data Centres
increase efficiency of energy, storage and program,
and the last one is network optimization. It is a computing platform which is distributed.
They refer to the large number of data centres with
A. Virtualization
smaller sizes than the normal data centres which
It is a very common approach used in green cloud are large in sizes and lesser in number. The creation
computing. There are different types of of Nano data centres helps in reducing the energy
virtualization of which server virtualization plays a consumption by 30 percent. They are distributed
major role because servers play the most important around the world and are interconnected. They are
role in a Cloud Data centre. In this approach portable and can be used anywhere including
multiple VMs (Virtual Machines) are assigned to a remote locations or for temporary use. They help in
single server. This can be done by the use of a the reduction of downtime with a decrease in
software application. It helps is reducing the cost to response time.

141
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

E. Use of Tranquil PCs for ensuring green cloud computing along with the
knowledge of the latest and new green cloud
There is a large amount of carbon emissions due to computing equipment. The environmental factor
all the data centres using cloud computing includes instructing the cloud providers on
architecture. The use of tranquil PCs causes the standards and policies of green computing and
carbon emission to reduce to approximately 60kg sustainability. The TOE model aims at increasing
per year as opposed to the desktop PCs in the the efficiency and also reducing the carbon
datacentres which consume approximately 400KW footprint due to the use of technology.
of power and even with the power saving option
approximately 270kg of Carbon dioxide is H. Dynamic Migration Algorithm
produced per year. Hence, the use of tranquil PCs
helps in reducing the carbon emissions. The main objective of the algorithm is to reduce
the power and energy consumption as well as
parameter Per week Per Per year reduce carbon dioxide emissions. It helps in
s month increasing the efficiency of resource utilisation. But
Reference 6.9 29.4 35.19 the drawback is that it exceeds migration costs and
desktop takes longer than usual to respond i.e. longer
pcs response time.
costs IV. MECHANISMS
Co2 4.3 18.1 2.18
emissions Green cloud computing, also known as sustainable
Tranquil 2.9 9.4 111.6 or environmentally friendly cloud computing,
pc’s focuses on reducing the environmental impact of
costs cloud services through various mechanisms. Here
Co2 1.9 5.7 67.9 are some key mechanisms and strategies employed
emissions in green cloud computing:

F. ASDI Methodology 1. Energy-Efficient Data Centers: Data


centers are the backbone of cloud
ASDI stands for Analysis Specifications Design computing, and optimizing their energy
Implementation, which can be used for designing usage is crucial. This involves using
and implementation of a model. It helps in energy-efficient hardware, cooling
decomposition of a system into three main systems, and power distribution methods
subsystems- first is the PSS i.e., the Physical to minimize energy consumption.
Subsystem, second LSS i.e., the Logical Subsystem
and the third DSS i.e., Decision making Subsystem. 2. Virtualization: Virtualization technology
It provides more efficient and adoptable modelling enables multiple virtual machines (VMs)
approach for green cloud computing data centres. to run on a single physical server. This
This methodology makes decision making pro- cess reduces the number of servers required,
hierarchical, easy facilitation of decision-making leading to better resource utilization and
process, distribution of the power of resolution energy savings.
tools and cooperation of decision entities. Another
3. Dynamic Resource Allocation: Cloud
advantage is that ASDI methodology is easily
providers can adjust resource allocation
understandable by anyone, even those who are no
based on workload demands. Scaling
specialist. It aims at improving the design of green
resources up or down as needed helps
cloud computing in collaboration with engineers.
prevent overprovisioning and wasted
G. TOE model energy.

TOE refers to Technical Organization 4. Renewable Energy Sources: Data centers


Environment. It accounts for both technical as well can be powered by renewable energy
as non-technological aspects such as the sources like solar, wind, or hydroelectric
environment and organizational factors. It power. This reduces the carbon footprint
comprises of technical, organizational and associated with cloud services.
environmental factors. The technical factor refers
5. Server Consolidation: Merging
to technical side such as implementation and design
workloads onto fewer servers and
of cloud data centres, the organizational factor
decommissioning underutilized servers
refers to the organization of policies and protocols

142
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

can lead to better resource utilization and resources) can lead to more responsible
reduced energy consumption. and energy-efficient usage.

6. Cooling Efficiency: Implementing


innovative cooling techniques, such as
using outside air for cooling or liquid 16. Demand Response: Cloud providers can
cooling systems, can improve energy participate in demand response programs, where
efficiency in data centers. they adjust their energy consumption based on the
grid's supply and demand to support grid stability.
7. Energy-Aware Scheduling: Scheduling
tasks and workloads during off-peak Green cloud computing combines these
energy consumption periods can help mechanisms to create a more sustainable approach
balance energy loads and reduce to cloud services, benefiting both the environment
electricity costs. and the organizations that utilize these services.

8. Data Center Location: Building data


centers in cooler climates or areas with V. USING ARTIFICIAL INTELLIGENCE TO
abundant renewable energy can help ACHIEVE GREEN CLOUD COMPUTING
reduce the need for extensive cooling and
minimize energy consumption. As green cloud computing mainly revolves around
reducing the energy consumption at data centers,
9. Optimized Algorithms: Developing artificial intelligence can help achieve this goal by
energy-efficient algorithms and software means of intelligent resource scheduling and
can reduce processing requirements and, intelligent refrigeration. Due to unprecedented
consequently, energy consumption. variations in requests from the cloud users,
10. Resource Sharing: Multi-tenancy, where resource scheduling is an important factor in
multiple users share the same physical reducing the energy consumption. Optimal usage of
infrastructure, can lead to better resource resources is important when the demand surges. If
utilization and reduced energy resources are not allotted optimally, a large amount
consumption compared to dedicated of energy gets consumed unnecessarily. Traditional
infrastructure. approaches do not use AI for resource allocation,
and though these strategies perform decently while
11. Data Compression and Deduplication: allocating resources, these can be further optimized
Efficient data storage mechanisms like by usage of AI.
compression and deduplication reduce the
storage requirements and energy needed A. Scheduling control engine:
for data management. When a request is sent, the optimization scheduling
12. Energy Monitoring and Management: module collects data related to all the resources
Implementing real-time monitoring and present in the data center with the help of a
management systems can help track resource sensing layer. Using AI, resource
energy consumption and identify areas for allocation is done based on the current request
improvement. submitted by the user and the current state of the
resources. • Resource perception module: Deep
13. Waste Reduction: Recycling and proper learning predicts the future load and in tandem with
disposal of electronic waste (e-waste) the current load, the scheduling control engine does
associated with hardware upgrades and the resource allocation. Since the future load is
replacements are essential for reducing predicted in tandem with the current load, resource
environmental impact. allocation can be optimized. As more and more
data is trained, the deep learning model gets better
14. Green Certification and Standards: at predictions of the future load. Deep learning
Cloud providers can adhere to green models thrive on a large amount of data.
certification standards to ensure they meet
specific environmental and energy • Optimization scheduling module: Based on
efficiency criteria. inputs from the resource perception module this
module can reserve resources while allocation of
15. User Awareness: Educating cloud service resources takes place. Resource allocation was
users about the energy implications of based on time required for completing the job by
their choices (e.g., turning off unused using algorithms like FCFS or based on priority.

143
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Reinforced learning can be used to learn each time cloud computing platforms. Since these IoT
a scheduling takes place. devices require low latency and mobility, edge
computing for real- time services has emerged. Fog
computing is a distributed computing model aimed
at linking network devices at various levels of
computation. They offer IoT devices a low-latency
answer that centralised cloud computing

Policy verification module: Scheduling policy


thus formed by the optimization and scheduling
module needs to be verified by the policy infrastructure can’t match. Green computing
verification module. This is mainly done for safety focuses on maintaining computing resources while
reasons so that the constraints of the system are not reducing energy consumption and being
violated and the data center can operate smoothly. environmentally friendly. By utilizing machine
Also, it may happen that the scheduling model is resources, Green computing allows for a more
currently weak as not enough data has been fed into environmentally sustainable use of energy and
the model. In such cases, the policy verification other activities. It entails redesigning and
module scraps the schedule generated by the model eliminating various computing elements in order to
itself and uses traditional approaches like FCFS. minimize environmental damage. The aim of green
The scheduling policy makes sure that the computing is to use computing resources in a way
resources which are no longer needed, can be that is both environmentally sustainable and cost-
switched off. friendly. Green computing uses various
technological domains:
• Self-learning module: Self-learning module is
used to enhance the knowledge base of the system. • Autonomous vehicles: Autonomous vehicles
must send data to their manufacturers in order for
B. Intelligent refrigerating engine them to monitor their use and receive necessary
maintenance warnings. Edge computing facilitates
When the resources are used for computation, heat data transmission and sharing between autonomous
is produced in the nodes, which needs to be vehicles. They often reduce the amount of energy
dissipated as soon as possible. This is mainly used by sensors in autonomous vehicles. The risk
because the physical equipment in the room may of carbon emissions is minimised as result of the
find itself harmed if the heat is not dissipated move toward a more environmentally sustainable
quickly. Refrigeration thus becomes a necessity. approach.
Intelligent refrigerating engine uses deep learning
to predict energy consumption in data centres. • Smart cities: City leaders use the data obtained
Refrigeration takes up a lot of energy. Information from sensors, which includes traffic, infrastructure,
related to the surrounding environment is taken into and home appliances, to solve the problems that
consideration temperature, moisture, airflow and these cities face. The response time to these devices
current status of resources is also taken into should be instantaneous, resulting in less energy
account. Model is used to check the correlation consumption.
between energy consumption and environment
information. Parameters associated with the • Industries: Oil exploration for example, may use
refrigeration are thus determined and suitable IoT edge computing to collect data on a range of
operations are performed. The refrigeration thus environmental factors without relying on
has an intelligence of its own which decreases the previously collected data. As a result of the
total operational cost and also reduces the carbon introduction of edge computing in industries, there
footprint. would be less energy use in manufacturing.

VI. APPLICATIONS OF GREEN CLOUD B. Green Healthcare


COMPUTING Virtualization of information technology (IT) data
Risk of carbon emissions is minimized as a result centre devices is the most important step most
of the move toward autonomous vehicles, which is companies will take to move to green healthcare.
a step toward a more environmentally sustainable While cost reduction is often the driving force
A. Green Internet Of Things behind virtualization, IT versatility is often the
most important factor. The cost and energy savings
The Internet of Things (IoT) connects smart from consolidating hardware and software are
objects to a heterogeneous network to allow substantial, and they complement the usability
monitoring and decision making. The increase in advantages very well. The dominant Green IT
the number of IoT devices is posing a threat to practices in healthcare are:

144
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

• Use of Electronic Medical Records (EMRs) People feel that it is better to use traditional
EMR is used by healthcare professionals to track, approaches because those are not very costly, but
control, and coordinate health care delivery within such a thought process is likely to backfire for
a healthcare organisation. EMRs have the ability to humanity in the longer run. People need to be
minimise carbon dioxide emissions, according to cautious about the impact their products are having
estimates. Users revealed that by using an EMR, on the environment either directly or indirectly.
they were able to save thousands of pounds of

paper for medical records each year. As a result,


the environment had a net positive impact. VIII. CONCLUSION

• Telemedicine: Telemedicine is the practice of As more and more businesses are switching to
medicine that uses technologies to give treatment to cloud, the amount of energy utilized by these cloud
people who are located far away. Telemedicine has data centers is increasing at a rapid rate and is
been around for more than two decades, but its significantly contributing towards carbon footprint.
effects are only now becoming apparent, especially Green cloud computing provides a solution to this
in rural areas. It helps to reduce carbon problem by reducing energy consumption and
consumption as people can avoid going all over for optimizing resource allocation. Powerful AI
expert referrals and other events. It can be used to techniques are aiding the growth of green cloud
help treat chronic conditions, optimise treatment computing. Green computing can be implemented
for the sick, homebound, and physically in various fields like IoT and big data analytics.
challenged, and boost community and population Public needs to be educated about the importance
wellbeing. of green computing. Adopting green computing in
the future will be extremely beneficial for the
C. Green Parallel Computing of Big Data environment.
Systems
Big Data is usually structured around a distributed
file system on top of which parallel algorithms for IX. REFERENCES
Big Data analytics can be run. The parallel [ 1 ] Manoj Muniswamaiah, Tilak Agerwala and
algorithms can be mapped to the computing Charles C. Tappert, ”Green computing for Internet
platform in a number of ways. In terms of of Things”, 2020 7th IEEE International
environmentally related parameters such as energy Conference on Cyber Security and Cloud
and power usage, each choice would behave Computing (CSCloud).
differently. Current research on the implementation
of parallel computing algorithms have largely [ 2 ] J.M.T.I. Jayalath, E.J.A.P.C. Chathumali,
focused on addressing general computing metrics K.R.M. Kotha- lawala, N. Kuruwitaarachchi,
such as speedup over serial computing and ”Green Cloud Computing: A Review on Adoption
efficiency of the use of computing nodes. We of Green-Computing attributes and Vendor
explore how to elicit green metrics for big data Specific Implementations”, 2019.
systems, which are necessary when comparing
implementation options. We use current systematic [ 3 ] Mridul Wadhwa, Approv Goel, Tanupriya
literature reviews to define and address the key
Choudhury, Ved P Mishra, ”Green Cloud
green computing indicators for big data systems.
Computing – A Greener Approach To IT”, 2019
VII. CHALLENGES IN GREEN International Conference on Computational
COMPUTING IMPLEMENTATION Intelligence and Knowledge Economy(ICCIKE).

A. Green Computing Awareness [ 4 ] Jun Yang, Wenjing Xiao, Chun Jiang, M.


There exists a general lack of understanding ShamimHossain, Ghulam Muhammad, Syed Umar
about green cloud computing among the Amin, “AIPowered Green Cloud and Data Center”,
people. People are generally unaware about its 2018 IEEE Access.
impact on the environment. People need to be
[ 5 ] Nina S. Godbole, John Lamb, ”Research
convinced how important it is to have an
intoMaking Healthcare Green with Cloud, Green
infrastructure supporting green computing.
IT, and DataScience to Reduce Healthcare Costs
Awareness among public needs to be raised.
and Combat ClimateChange”, 2018 IEEE.
Otherwise, the adoption of green computing is
a distant dream. [ 6 ] Surendran.R, Tamilvizhi.T, ”How to Improve
B. Equipment Cost the Resource Utilization in Cloud Data Center?”,

145
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

2018 International Conference on Innovation and


Intelligence forInformatics, Computing, and
Technologies (3ICT).
[ 7 ] Fatima Shakeel, Seema Sharma, ”Green Cloud
Computing: A review on Efficiency of Data
Centres and Virtualization of Servers”, 2017
International Conference onComputing,
Communication and Automation (ICCCA).
[ 8 ] Mr. Nitin S. More,Dr. Rajesh B. Ingle,
”Challengesin Green Computing for Energy Saving
Techniques”, 2017International Conference on
Emerging Trends Innovation inICT (ICEI).
[9] Havva G u¨lay Gu¨rbu¨z, Bedir Tekinerdogan,
”SoftwareMetrics for Green Parallel Computing of
Big Data Systems”,2016 IEEE International
Congress on Big Data.
[ 10] Jean-Charles Huet, Ikram El Abbassi, ”Green
CloudComputing modelling methodology”, 2013
IEEE/ACM 6thInternational Conference on Utility
and Cloud Computing.

146
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

A Review of Recent Developments in Driver


Drowsiness Detection Systems
Udumula Asha Latha Siva Prasad Guntakala
Student, Assistant Professor,
Department of MCA, Department of MCA,
KBN College (Autonomous) KBN College,
Vijayawada 52001, Vijayawada 52001,
Andhra Pradesh, India, Andhra Pradesh, India,
Email: ashalathareddyudumula@gmail.com Email: gspkbn@gmail.com

Abstract-In the last ten years, better computer A driver doesn't suddenly start feeling tired; there are
technology and artificial intelligence have made signs that show up first. Some of these signs are:
driver monitoring systems better. Many studies
have gathered real data about drowsy drivers and Difficulty keeping eyes open.
used different computer programs and ways of
 . Having a hard time keeping your eyes
putting information together to make these
open.
systems work much better while driving. This
 Yawning often.
paper gives an updated overview of the drowsy
 Blinking a lot.
driver detection systems made in the past ten
years.  Finding it tough to focus.
 Moving the vehicle out of the lane and
Keywords-- ways to tell if a driver is tired, using reacting slowly to traffic.
biology, using a mix of things, using images, using  Nodding off or head dropping.
things from the vehicle.
To carefully assess different levels of feeling tired
1. Introduction and make it easier to create systems that can detect
The article "Looking at New Ways to Detect Tired tiredness early, we need a clear way to measure how
Drivers" talks about the newest improvements in tired someone is. Many ways have been suggested to
technologies made to find out if a driver is getting do this.
sleepy. It talks about different methods like
recognizing faces, tracking eye movements, and Scale Verbal Description
using sensors to watch important body signals. The 01 Extremely alert
article also shows how these systems use computer
programs to understand the information and figure 02 Very alert
out how awake the driver is. By studying how 03 Alert
detecting tiredness has changed over time, the article 04 Fairly alert
helps us know more about how technology is making
05 Neither alert nor sleepy
driving safer.
06 Some signs of sleepiness
2. Drowsiness Signs and Stages— 07 Sleepy, but no effort to keep alert
In the writings about making systems that can tell if a 08 Sleepy, some effort to keep alert
driver is getting sleepy, people use different words to
09 Very sleepy, great effort to keep alter
talk about the same thing. They sometimes use
"drowsiness" and "fatigue" like they mean the same.
"Fatigue" means not wanting to keep doing
something because your body or mind is tired, or
you've been doing the same thing for a long time. On
the other hand, "sleepiness" or "drowsiness" means
feeling like you want to go to sleep.

147
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Table2: easier to work with or doesn't take as much computer


power. Then, in the fourth step, they choose the parts
Weir Wille and Ellsworth came up with a different of the information that best show how tired the driver
way to measure how tired someone is. They made a is, using different methods to pick those parts.
scale with five levels to show different stages of
tiredness. According to Saito and their team, at the People use different ways to see how well the system
first level, they noticed quick eye movements and a can find out if someone is tired. Some of these ways
steady pattern of blinking. include accuracy, precision, and sensitivity. Here are
the formulas for these three ways:

 Accuracy: (Number of Correct


Predictions) / (Total Number of Predictions)
Levels Verbal Description  Precision: (Number of True Positives) /
01 Not drowsy (Number of True Positives + Number of
02 Slightly drowsy False Positives)
03 Moderately drowsy  Sensitivity: (Number of True Positives) /
04 Significantly drowsy (Number of True Positives + Number of
False Negatives)
05 Extremely drowsy

Accuracy=
Number of correct predictions
Total number of redections
TP+ TN
=
Tp+TN + FP+ FN
TP
Precision=
TP+ FP
TP
Sensitivity=
TP+ FN

3. Drowsiness Detection Measures 3.1 Fatigue detection, based on awning in thermal


images
To figure out different levels of tiredness, researchers
have looked at how drivers react and how they drive In their research, Knapik and Cyganek introduced a
their vehicles. In this part, we're going to talk about new way to spot when a driver is getting too tired,
four common ways people measure driver tiredness. using a method to notice when they yawn. They used
You can see all these methods in Figure 1. Two of special cameras that can see heat to get pictures. They
these methods involve looking at the drivers made a special set of pictures for their study. Here's
themselves: using pictures and information from their how their system works:
bodies. The third method uses information from the
car itself and is called the vehicle-based method. The They take pictures with the heat-sensing cameras.
fourth method is a mix of at least two of the ones we They use three steps to find the face area, then the
mentioned before, and it's called the hybrid method. corners of the eyes, and finally the yawn in the
This step is important because it makes the system pictures.
simpler by getting rid of things that don't matter and Since it's sometimes hard to see the mouth area in
keeping the useful stuff. these heat pictures, they use information about the
temperatures in other parts of the face to figure out
After that, some systems might change the when someone yawns.
information or make it simpler to understand, so it's

148
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

They use the corners of the eyes to tell if someone is They tested their system with three different sets of
yawning. They use methods that add up cold and hot pictures and videos. The first set had simple pictures
areas in the heat pictures to find this out. with a plain background, and it was right about 95%
Lastly, when the system's special program sees that of the time. The second set, with more complex
someone might be too tired, it makes an alarm go off. pictures, was accurate about 70% of the time. The

third set used videos of real drivers and their system


Their system was right about 71% of the time when it worked well on those too. In simple pictures, their
noticed cold areas and about 87% of the time when it system was right about 95% of the time, and in more
noticed hot areas. complex pictures, it was accurate about 70% of the
time. It also worked well on videos of real drivers.
Optical correlator based DDD algorithm.
3.2 Drowsiness detection using respiration in
thermal imaging Ouabida and their team introduced a quick way to
determine whether an eye is open or closed. They
Kiashari and their team created a system that can find used a special device that uses light to figure out
out if a driver is getting too tired without bothering where the eye is and how it looks. This device also
them. They used a special camera that sees heat to used a special picture to understand the eye better.
look at the driver's face and figured out how they Their method was the first to use a computer
breathe. They had thirty people take part in their simulation to find the eye's centre automatically.
study, where they drove in a car simulator. The Their approach accurately finds out where the eye is
camera took pictures of the driver's heat patterns. and whether it's open or closed. They use a special
picture in their device to do this. They start by
From these heat patterns, they looked at how the finding the eyes in normal pictures of faces. Then,
driver breathed and calculated numbers like the they use their special device to figure out if the eye is
average and the difference between inhaling and open or closed, no matter the lighting, the way the
exhaling times. They used these numbers to teach head is turned, or if the person is wearing glasses.
computers to understand when someone might be The scientists tested their idea on five sets of
getting too tired. They used two different computer pictures: FEI, ICPR, Bio ID, GI4E, and SHRP2. They
programs, one called support vector machine (SVM) also made some special pictures to help the device
and another called k-nearest neighbour (KNN). Both recognize eyes even when the surroundings are busy
programs could tell if someone was getting tired, but and noisy. The special picture method they used
the SVM program was better, with a 90% success worked the best in their tests.
rate. It was right 85% of the time when it said
someone was not tired, and 92% of the time when it Real-time DDD using eye aspect ratio
said someone was tired. Overall, it was 91% accurate.
In this research, Major and colleagues created a way
3.2.1 Drowsiness detection using eye features to tell if someone is getting sleepy by looking at how
Khan and their team created a system that can tell if a their eyes move in videos. They used a regular
driver's eyes are closing, which can mean they're webcam to watch the eyes and measured how often
getting tired. They used cameras to watch drivers in they blinked using a simple measure called EAR.
real-time. Here's how their system works: EAR compares the height and width of the eye to see
if it's open or closed. If the value is high, the eye is
It starts by finding the driver's face in the camera open, and if it's low, the eye is closed. They took
footage. pictures from the webcam and calculated EAR values
Then it uses a special method to find the eyes and for each picture. They tried different methods to sort
figure out if they're open or closed. the data, like a multilayer perceptron, random forest,
They do this by looking at the shape of the eyes and and SVM. The SVM method worked the best, with
measuring how curved the eyelids are. an average accuracy of 94.9% in tests. It was even
Depending on how curved the eyelids are, the system more accurate for surveillance videos, where it went
decides if the eyes are open (curving upwards) or over 95% accuracy.
closed (curving downwards).
If the system sees closed eyes for a while, it makes a DDD using face and eye features
sound alarm to alert the driver.

149
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Bamidele and their team introduced a system that can picked out 34 details from the eye signals, and these
tell if someone is getting drowsy without being too details were collected from sections of the eye signals
intrusive. They used technology to track the state of that overlapped and had different lengths. They also
the person's face and eyes. For their study, they had a looked at how well the system worked by trying out
collection of videos from NTHUDDD Computer different combinations of features and lengths of
these signal sections.

Vision Lab. The system works by first getting and 4. Challenges


preparing the needed data. Then, it picks out specific
details like how often the eyes close, the longest time Since Volvo introduced its first Drowsy Driver
the eyes stay closed, and how often the person blinks. Detection (DDD) system, technology in this field has
These details are given to different computer advanced significantly. However, there are still many
programs that try to figure out if the person is problems that researchers are dealing with. In this
becoming sleepy or staying awake. These programs part, we're going to talk about the difficulties in
include KNN, SVM, logistic regression, and artificial recognizing when drivers are getting sleepy.
neural networks (ANN). When they looked at the Most researchers usually test their ideas in computer
final results, they found that the KNN and ANN simulations and use the results from these simulations
programs were the best. They were right about to create their final systems. But the outcomes from
72.25% and 71.61% of the time, respectively. simulations might not accurately show what happens
in real-life driving. Also, in these simulations, the
Detection of driver drowsiness with CNN focus is usually on specific situations that make
people drowsy, and this doesn't cover all the different
Hashemi and the other researchers put forward a
possibilities and conditions that drivers experience in
system that can quickly tell if someone is getting
real life. This can affect how accurate the system
drowsy. They based their method on how much the
seems. One way to improve this is to also test the
eyes close and used a special kind of computer
system in actual driving situations to make sure it
program called Convolutional Neural Network
works well in real life.
(CNN). They made three different networks to figure
out when the eyes are closing: one they built from The issues with systems that use images are mostly
scratch called FD-NN, and two using existing about the part of the picture that shows the face –
networks called VGG16 and VGG19, with some that's the part that matters the most, where the
extra layers added in (TL-VGG16 and TL-VGG19). computer looks for important details. On the other
They tested these networks using the ZJU gallery hand, systems that work based on things like how the
dataset and some new images they collected, totaling body behaves when someone's drowsy have issues
4157 pictures. The results of the experiment showed tied to the tools and equipment used. In comparison,
that the accuracy of the networks was 98.15%, systems built into vehicles face fewer problems, but
95.45%, and 95% for FD-NN, TL-VGG16, and TL- the biggest one is that they might not always spot
VGG19 respectively. signs of drowsiness accurately.
Eye signal analysis
Zandi and the other researchers suggested a way to Challenges Image Biologic Vehicl
find out if someone is getting drowsy without being d al Based e
intrusive. They used a special kind of computer Based Based
program called a Machine Learning (ML) system that Difficulty in High N/A N/A
works with data from tracking the eyes. They did extracting
their tests in a computer-simulated driving situation drowsiness signs,
and had 53 people participate. They gathered due to facial
information about how the eyes moved and also characteristics/skin
recorded signals from the brain using a method called colour
electroencephalography. They mainly used the brain Difficulty in High N/A N/A
signals to have a reliable basis for comparison and to extracting
label the moments when the eye signals showed drowsiness signs,
drowsiness or alertness. The ML system they created due to objects that

150
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

cover the face different visual aspects, like the way the eyes and
Driver's posture and High Low N/A mouth behave, as well as head movements.
distance from the Additionally, the rear camera can identify features
dashboard related to the vehicle, such as when it starts to drift
Real-time video Mediu N/A N/A from its lane or changes its orientation.
analysis m
Driver movement High High N/A
Noisy sensor Low High Low CONCLUSION
measurements
Monitoring Low Medium Low In the past ten years, the field of drowsiness detection
equipment and has seen significant progress, thanks to advancements
sensors in technologies like the Internet of Things (IoT),
Inconvenience smaller sensors, and artificial intelligence. This paper
Influence of High Low Mediu offers a thorough and current overview of drowsiness
environmental m detection systems developed over the past decade. It
conditions outlines the main strategies used in creating these
(weather/illuminatio systems and organizes them into four groups based
n) on the kinds of indicators they use to identify
Influence of the Low Low High drowsiness. These categories are systems that rely on
road conditions and images, biological signals, vehicle data, and a mix of
geometry these approaches. The paper goes into detail about
Hardware Low High Low each of these systems, discussing the features they
complexity and utilize, the AI methods they implement, the datasets
limitations they work with, and their resulting accuracy,
Drowsiness signs Low Low High sensitivity, and precision.
extraction precision REFERENCES
Testing under real Mediu Medium Mediu
(not simulated) m m  National Highway Traffic Safety
driving conditions. Administration Drowsy Driving.
[(accessed on 10 May 2021)]; Available
online:https://www.nhtsa.gov/risky-
5. Discussion driving/drowsy-driving
After carefully studying existing research, it's clear  National Institutes of Health Drowsiness.
that there are many different ways to detect [(accessed on 10 May 2021)]; Available
drowsiness and prevent potential dangers while online:
driving. Additionally, the progress in technology, https://medlineplus.gov/ency/article
especially in the field of artificial intelligence, has  National Safety Council Drivers are
helped overcome several difficulties that these Falling Asleep Behind the Wheel.
systems used to face, making them work better. In [(Accessed on 10 May 2021)]. Available
this part, we're going to look at different Drowsy online:
Driver Detection (DDD) systems, considering how https://www.nsc.org/road-safety/safety-
well they work in real situations and how dependable topics/fatigued-driving
they are, based on what's been written in various  National Sleep Foundation Drowsy
studies. We'll also talk about the four methods Driving. [(accessed on 10 May 2021)].
mentioned earlier that are used to spot drowsiness. Available online:
https://www.sleepfoundation.org/articles/
6. Future Trends in Drowsiness Detection Systems
Researchers have suggested using mobile phones as a
cost-effective option for gathering driving-related
information. Modern mobile phones come with at
least two cameras and multiple sensors. They can
even link up with various sensors using Bluetooth or
other wireless technologies. If placed on the
dashboard, a mobile phone's front camera can record

151
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada –
520001

Social Media Users In India: A


Futurestic Apporach
Udumula Rohith, Siva Prasad Guntakala,
Student Assistant Professor,
Department of MCA, Department of MCA,
KBN College(Autonomous) KBN College(Autonomous)
Vijayawada-520001,A.P,India Vijayawada-520001,A.P,India
Email:rohithudumula0@gmail.com Email:gspkbn@gmail.com
participate in online communities centered
around interests such as technology, fashion,
Abstract travel, sports, and more.

India, with its vast population and One of the defining features of social media
diverse cultural landscape, has usage in India is the prevalence of regional
emerged as a global hub for social languages. While English is commonly
media users. The proliferation of used, a significant portion of content is
smartphones, increasing internet created and consumed in languages like
penetration, and a youthful Hindi, Bengali, Tamil, Telugu, and many
demographic have contributed to the others. This linguistic diversity has
rapid growth of social media usage in prompted platforms to cater to these
the country. As of my last knowledge languages, making social media more
update in September 2021, India inclusive and accessible.
boasted one of the largest user bases
on various social media platforms.

Key Words: Social media, Internet,


Users, Communication Pattern Social media has also become a platform
Introduction for political discourse and social activism
in India. Citizens use these platforms to
Social media platforms like Facebook, express their opinions, discuss policy
WhatsApp, Instagram, Twitter, and matters, and even organize protests and
YouTube have gained immense movements. This has led to both positive
popularity, connecting people across change and challenges, as the open nature
urban and rural areas, spanning different of social media can sometimes amplify
languages, traditions, and interests. misinformation and polarization.
These platforms have not only
In recent years, the Indian
revolutionized communication and
government has shown an
information sharing but have also played
increased interest in regulating
a significant role in shaping public
and monitoring social media
discourse, activism, entertainment, and
platforms to address concerns
commerce.
related to data privacy, fake
India's social media users are incredibly news, and harmful content. This
diverse, reflecting the rich tapestry of has sparked debates about the
the nation's culture. Users engage in a balance between freedom of
wide range of activities on these expression and the responsibility
platforms, from staying in touch with of platforms to maintain a safe
friends and family to consuming news, and informative online
entertainment, and educational content. environment.
They share personal experiences,
Objectives about social media:
express opinions on various topics, and
Certainly, here are some general

152
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada –
520001

objectives related to the use of -Provide entertainment through multimedia


social media: content, such as videos, memes, and
interactive posts.
1. Communication and Connection:
-Engage users with quizzes, challenges,
-Facilitate real-time communication
contests, and interactive storytelling.
and interaction among
individuals regardless of -Offer a platform for users to follow
geographical boundaries. celebrities, influencers, and content creators
-Strengthen connections with they admire.
friends, family, colleagues, and 6. Marketing and Branding:
like-minded individuals.
-Enable businesses to reach and engage with
-Foster a sense of community their target audience directly.
and belonging through shared
interests and experiences. -Build brand awareness, loyalty, and
recognition through consistent online
2. Self-Expression and Creativity:
presence.
-Provide a platform for users to express
their thoughts, opinions, ideas, and -Promote products, services, and special
creativity. offers to drive sales and revenue.

-Showcase personal talents and hobbies, 7. Customer Support and Feedback:


such as art, photography, writing, and
-Offer a channel for customers to inquire
music.
about products, resolve issues, and provide
feedback.

-Encourage users to develop and share -Improve customer satisfaction by


original addressing concerns and inquiries promptly.
content that reflects their unique
-Use feedback to refine products, services,
3. Information Sharing and and user experiences.
Awareness:
8. Social Activism and Advocacy:
-Disseminate news, updates, and
information quickly to a wide -Mobilize support and raise awareness for
audience. social, political, and environmental causes.

-Raise awareness about social, political,


environmental, and cultural issues. -Promote activism and encourage collective
-Educate users by sharing informative action for positive change.
content and resources. 9. Research and Analysis:
4 .Networking and Professional -Gain insights into user preferences,
Growth: behaviour, and trends for market research
and analysis.
-Establish and expand professional
networks for career advancement and -Monitor public sentiment and reactions to
opportunities. events, products, or policies.
-Showcase professional 10.Personal Branding and Influence:
accomplishments, skills, and expertise.
-Cultivate a personal brand by sharing
-Access industry insights and trends to expertise, experiences, and perspectives.
stay in formed about developments in
various -Develop influence and authority in specific
niches to impact opinions and trends.
5. Entertainment and Engagement:

153
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada –
520001

-Collaborate with brands and mass communication on social


organizations for sponsored content and media:
partnerships.
1. Wide Reach and Accessibility:
These objectives highlight the
-Social media platforms have global
diverse ways individuals,
reach, enabling information to be
businesses, organizations, and
shared with audiences across different
communities utilize social media
geographical locations and time zones.
to achieve their goals. Keep in
mind that the specific objectives 2.Divese Formats:
can vary based on the platform,
target audience, and the unique -Mass communication on social media
needs and values involves a variety of formats such as text,
images, videos, infographics, live
streaming, and more.
Number Of Social Media Users
In India From 2015 to 2023 -The multimedia nature of social media
enhances engagement and caters to different
learning styles.

3.Interaction and Engagement:

-Social media encourages two-way


communication through comments, likes,
shares, retweets, and direct messages.

-Audiences can actively engage with


content, express their opinions, and
participate in discussions.

4.Real-Time Communication:

- Social media enables real-time


communication, making it a valuable tool
for sharing breaking news, updates, and live
events.

5. Information Dissemination:

Mass communication on social media: -News organizations, government agencies,


businesses, and individuals use social media
Mass communication on social to share information on topics ranging from
media refers to the current events to entertainment and
dissemination of information, education.
messages, content to a
large and diverse audience 6. Amplification of Voices:
through social media platforms.
This modern form of -Social media allows marginalized voices,
communication has transformed individuals, and grassroots movements to
the way amplify their messages and reach a broader
audience.

7. Challenges of Misinformation:
information is shared,
consumed, and interacted with, -The rapid spread of information on social
offering both opportunities and media can also lead to the rapid spread of
challenges. Here are some key misinformation, fake news, and rumors.
points to consider regarding

154
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE
(Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada –
520001

-It's important to critically evaluate


sources and verify information before
sharing.

8. Targeted Messaging:

- Social media platforms offer tools for


targeting specific demographic groups,
allowing for tailored messaging to reach
the intended audience.
9. Influencer Culture:
- Influencers play a role in mass communication on social media by leveraging their follower base to promote products, causes, and ideas.
- Social media has been a catalyst for social and political activism, allowing individuals and groups to organize, mobilize, and advocate for change.

10. Ethical Considerations:
- Privacy, data security, and ethical concerns arise when collecting and using personal information for mass communication on social media.

11. Evolving Landscape:
- The landscape of social media is dynamic, with new platforms emerging and existing platforms evolving. Communicators need to stay updated on trends and shifts.

Effective mass communication on social media requires understanding the preferences and behaviors of the target audience, crafting compelling and relevant content, and adhering to ethical standards. It is a powerful tool that has reshaped the way information flows in modern society.

Telecom expansion:
Telecom expansion refers to the growth and development of telecommunication networks and services to reach more areas, serve a larger population, and provide improved connectivity and communication capabilities. As technology advances and the demand for connectivity increases, telecom expansion becomes essential to ensure that people, businesses, and communities have access to reliable and efficient communication services.

Here are some key aspects of telecom expansion:

1. Infrastructure Development:
- Expanding telecom networks involves building and upgrading physical infrastructure such as cell towers, fiber-optic cables, and data centers.

2. Broadband Connectivity:
- Expanding broadband services, including high-speed internet access, is a critical part of telecom expansion.
- Broadband connectivity enables faster data transmission and multimedia content streaming, and supports various online activities.

3. Rural and Remote Connectivity:
- Telecom expansion focuses on providing connectivity to rural and remote regions that might have been underserved or unserved.


- This effort bridges the digital divide and ensures equal access to communication and information.

4. 4G and 5G Rollout:
- The deployment of 4G and 5G networks enhances data speeds, capacity, and connectivity quality.
- 5G technology offers ultra-low latency and supports the Internet of Things (IoT) ecosystem.

5. Last-Mile Connectivity:
- Last-mile connectivity involves reaching the final link between the telecommunication network and the end-users' premises.
- It is crucial for ensuring seamless connectivity to homes, businesses, and institutions.

6. Government Initiatives:
- Governments often play a role in promoting telecom expansion through policies, incentives, and regulations.
- Initiatives can include funding for infrastructure development, spectrum allocation, and ensuring fair competition.

7. Private Sector Investment:
- Telecom operators and private companies invest in expanding their networks to tap into new markets and meet growing demands.

8. Digital Inclusion:
- Telecom expansion contributes to digital inclusion by providing affordable access to communication and online services.

9. International Connectivity:
- International links support global communication and data exchange.
- Expanding telecom infrastructure often involves international connectivity through undersea cables and satellite communication.

10. Technological Innovation:
- Telecom expansion often involves adopting new technologies and solutions to optimize network performance, security, and efficiency.

11. Enhanced Services:
- Telecom expansion enables the provision of advanced services such as video conferencing, cloud computing, and online entertainment.

Telecom expansion is essential for economic growth, social development, and the overall progress of a nation. It empowers individuals and communities, drives innovation, and facilitates communication across distances. It also contributes to various sectors such as healthcare, education, business, and governance, fostering a connected and digitally empowered society.

Conclusion:
In conclusion, the surge of social media users in India stands as a testament to the country's rapid digital transformation. With a multitude of platforms bridging geographical gaps and languages, social media has become a powerful force for connectivity, self-expression, and cultural exchange. As this digital tapestry weaves together a diverse population, it shapes not only individual interactions but also societal narratives and activism. However, as India's social media landscape continues to evolve, addressing challenges like misinformation and data privacy becomes essential to harnessing its full potential while safeguarding its impact on society.

REFERENCES

1. Hassan S-U, Bowman TD, Shabbir M, Akhtar A, Imran M, Aljohani NR. Influential tweeters in relation to highly cited articles in altmetric big data. Scientometrics. 2019; 119(1):481–93.
2. Zhang D, Earp BE. Correlation Between Social Media Posts and Academic Citations of Orthopaedic Research. J Am Acad Orthop Surg Glob Res Rev. 2020; 4(9):e20.00151. https://doi.org/10.5435/JAAOSGlobal-D-2000151 PMID: 32890011.
3. Hassan S-U, Aljohani NR, Shabbir M, Ali U, Iqbal S, Sarwar R, et al. Tweet Coupling: a social media methodology for clustering scientific publications. Scientometrics. 2020; 124:973–91.
4. Hassan S-U, Iqbal S, Aljohani NR, Alelyani S, Zuccala A. Introducing the ‘alt-index’ for measuring the social visibility of scientific research. Scientometrics. 2020; 123:1407–1419.
5. Morrison, S. [@scottmorrisonmp]. (2020, January 28). Congratulations to this year’s Australian of the Year Dr James Muecke. His passionate and selfless commitment to preventing blindness and tackling [Photograph]. Instagram. https://www.instagram.com/p/B7vfGYen-L1/


A Study On Cyber Security In India: An Evolving Concern For National Security

Ummadi Sai Durga, Student, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: ummadisaidurga@gmail.com
Siva Prasad Guntakala, Assistant Professor, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: gspkbn@gmail.com

ABSTRACT
In the contemporary era driven by technology and interconnectedness, it has become imperative to comprehend the concept of cybersecurity and employ it adeptly. Inadequate security measures can render systems, critical files, data, and other virtual assets susceptible to risks. Irrespective of whether an entity operates within the realm of information technology or not, every organization must prioritize safeguarding its digital landscape. As novel advancements emerge in the field of cybersecurity, malicious actors also remain proactive, continuously refining their hacking methodologies and targeting vulnerabilities prevalent across diverse businesses.

The significance of cybersecurity is underscored by the fact that military, governmental, financial, medical, and corporate entities amass, process, and store unprecedented volumes of data on computers and similar devices. A substantial portion of this data comprises sensitive information, ranging from financial records and intellectual property to personal particulars. Unauthorized access to or familiarity with such data could yield detrimental consequences.

INTRODUCTION
A robust approach to cybersecurity incorporates multiple tiers of protection distributed throughout the networks, computers, applications, and data intended to remain secure. Within a given environment, various elements, including processes, individuals, and tools, must synergize to establish a comprehensive defense against cyber threats. A cohesive strategy for managing potential threats can streamline integrations across specific Cisco Security products, expediting crucial security operations such as identification, analysis, and mitigation.

People
Individuals should recognize the importance of adhering to fundamental principles of information security, including the choice of robust passwords, exercising caution when dealing with email attachments, and regularly backing up data. It is imperative to further educate oneself about essential tenets of cybersecurity.

Processes
Government entities need a comprehensive framework for effectively addressing both attempted and successful cyber-attacks. Following established frameworks can provide valuable guidance in this regard. Such frameworks outline strategies for identifying cyber intrusions, fortifying organizational defenses, detecting and responding to potential threats, and learning from past security breaches to enhance future resilience.

Technology
Technology plays a crucial role in equipping individuals and organizations with the necessary tools for safeguarding themselves against cyber-attacks. The primary targets at risk encompass three key components: endpoint devices such as computers, mobile devices, and routers; network systems; and cloud infrastructures. Commonly utilized technological solutions for fortifying these components include advanced firewalls, DNS filtering, malware detection systems, antivirus software, and email security solutions.
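To make the "People" element above concrete, the short Python sketch below shows one way to generate strong, random passwords rather than relying on weak, memorable ones. It is only an illustrative example, not part of any cited framework: the function name and the chosen length are arbitrary, and it uses nothing beyond Python's built-in secrets and string modules.

import secrets
import string

def generate_password(length: int = 16) -> str:
    # Draw characters from letters, digits and punctuation using a
    # cryptographically secure source of randomness (the secrets module).
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # prints one fresh 16-character password

Pairing such generated passwords with a password manager and multi-factor authentication reflects the basic hygiene described in the paragraphs above.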

The term "cyber" pertains to anything related to a network of computers or the internet, whereas "security" signifies the process of safeguarding entities. Therefore, the amalgamation of "cyber" and "security" has been coined to describe the strategies for shielding user data from malicious attacks that might lead to breaches in security. This concept emerged as the internet began to proliferate. Cybersecurity empowers societies and users to shield critical data against unauthorized access. While it encompasses counteracting hacking attempts, it also employs ethical hacking methodologies to fortify cybersecurity within various structures.

Definition
Cybersecurity can be characterized as a systematic approach aimed at mitigating security concerns with the goal of safeguarding against potential reputation damage, business losses, and financial harm across all types of entities. The term "Cyber Security" inherently implies a set of security measures offered to organizations, accessible to regular users through internet or network channels. This encompasses a wide array of tools and techniques employed to implement this security.

A pivotal aspect to note about information protection is that it is not a one-time event but an ongoing endeavor. Maintaining an updated and proactive stance is crucial for reducing risks. Business owners must consistently stay abreast of developments to minimize potential threats.

How does Cyber Security make working so easy?
Undoubtedly, the implementation of cybersecurity tools significantly streamlines our tasks by restricting unauthorized access to resources within any network. Both businesses and societies face substantial risks if they neglect the imperative of safeguarding their online presence. In the contemporary interconnected landscape, enhanced cybersecurity measures benefit everyone. A cybersecurity incident can yield a range of consequences, spanning from personal data theft and extortion attempts to the destruction of critical information and cherished family photographs. Our reliance on critical infrastructures like power plants, hospitals, and financial institutions underscores the necessity of securing these entities to maintain the functionality and trustworthiness of our society. Moreover, the contributions of cyber threat investigators, exemplified by the 250-strong team at Talos, play a pivotal role. They actively research emerging threats and cyber attack strategies, uncover vulnerabilities, raise public awareness about cybersecurity issues, and bolster open source resources. Through their efforts, they contribute to making the internet a safer space for all.

Types of Cyber Security

Phishing
Phishing entails the act of disseminating deceptive messages that closely resemble emails originating from trustworthy sources. The primary objective is to illicitly obtain sensitive information such as credit card credentials and login details. This form of cyber attack is among the most prevalent. Mitigating its impact can be achieved through personal education or the utilization of a technological solution that filters out malicious emails.

Ransomware
Ransomware constitutes a form of malicious software. Its intent is to extract monetary compensation by obstructing access to data or the entire computer system until the demanded payment is made. However, fulfilling the ransom demand does not guarantee the retrieval of data or the restoration of the system.

Malware
This software variant is designed with the purpose of gaining unauthorized access or inflicting damage upon a system.

Social engineering
This strategy is employed by adversaries to trick victims into disclosing sensitive information. They may demand a financial payment or use what they learn to enhance their access to your confidential data. Social engineering can be combined with any of the attacks mentioned above to make you more inclined to click on links, transmit malware, or trust a malicious source.
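As a small illustration of the kind of "technological solution" mentioned under Phishing, the sketch below applies a few naive heuristics to flag suspicious links. It is a toy example, not a real filter: the rules, the function name, and the sample URLs are all invented for illustration, and production systems rely on curated blocklists and machine-learned classifiers instead.

import re

SUSPICIOUS_KEYWORDS = ("verify", "urgent", "password", "login")

def looks_suspicious(url: str) -> bool:
    # Heuristic 1: a raw IP address instead of a domain name.
    if re.match(r"https?://\d{1,3}(\.\d{1,3}){3}", url):
        return True
    # Heuristic 2: an '@' in the URL, which can hide the real destination.
    if "@" in url:
        return True
    # Heuristic 3: bait words commonly used in phishing lures.
    return any(word in url.lower() for word in SUSPICIOUS_KEYWORDS)

for link in ["http://192.168.0.7/login", "https://example.com/docs"]:
    print(link, "->", "suspicious" if looks_suspicious(link) else "ok")

Real email-security products combine many more signals (sender reputation, attachment scanning, link rewriting), but the idea of scoring messages against known indicators is the same.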

160
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Goals
The vast majority of business operations nowadays take place online, which in turn exposes crucial data and resources to an array of cyber threats. Considering that an organization's functioning is heavily reliant on its data and system resources, any risks that loom over these assets directly endanger the entire entity. Such risks can span from minor code malfunctions to intricate vulnerabilities like cloud hijacking. By carrying out comprehensive risk assessments and evaluating the potential costs of recovery, businesses can adeptly prepare for and proactively envisage plausible financial setbacks.

Consequently, it becomes paramount to fully grasp and formulate cybersecurity objectives tailored explicitly to each organization's needs. Cybersecurity is a forward-looking strategy meticulously devised to safeguard intricate data across the digital landscape and various devices. Its fundamental goal is to preempt attacks, avert destruction, and thwart any unauthorized access. Ultimately, the crux of cybersecurity is to establish a sheltered and risk-mitigated environment that effectively shields data, networks, and devices from the myriad of cyber threats they face.

Goals of Cyber Security
The fundamental goal of cybersecurity is to prevent the theft or compromise of data. To achieve this, there are three pivotal objectives that cybersecurity focuses on:
1. Safeguarding Information Privacy: The primary concern is to ensure the confidentiality of sensitive data, making certain that it remains inaccessible to unauthorized individuals.
2. Maintaining Information Integrity: Cybersecurity aims to uphold the accuracy and authenticity of data, preventing any unauthorized alterations that could undermine its reliability.
3. Regulating Information Availability: Another essential objective is to restrict access to information exclusively to authorized users, thereby controlling who can access and interact with the data.

These objectives align with the confidentiality, integrity, availability (CIA) triad, which forms the foundation of comprehensive security frameworks. The CIA triad model serves as a security framework designed to shape strategies for safeguarding data within the contexts of societies or organizations. The model is sometimes referred to as the AIC (Availability, Integrity, and Confidentiality) triad to avoid any association with the Central Intelligence Agency. The core elements of the triad encompass the three most crucial aspects of security, and the CIA principles are widely adopted by societies and businesses alike when they introduce new applications, establish networks, or provide access to sensitive information. For data to be truly secure, all three dimensions of this framework must be implemented. These security strategies are interdependent, and any deviation from this integrated approach can undermine the overall governance of security protocols. The CIA triad is thus the most widely used standard for measuring, selecting, and implementing the appropriate security controls to reduce risk.

1) Confidentiality
Ensuring that sensitive data is accessible only to authorized users, while preventing any information from being exposed to unauthorized individuals, is paramount. Encryption supports confidentiality, but if an encryption key is shared or otherwise compromised, confidentiality can be breached. Strategies for upholding confidentiality include:
• Data Encryption
• Two- or Multi-Factor Authentication
• Biometric Verification

2) Integrity
To guarantee the accuracy and consistency of all your data, preventing any unauthorized alteration from one state to another, integrity measures must be implemented:
• Restrict Unauthorized Access: Prevent unauthorized individuals from gaining access to delete or modify records, thereby upholding integrity and privacy.
• Implement User Access Controls: Utilize operator-level access controls to manage who can modify data.


• Maintain Adequate Backups: Ensure that suitable backups are accessible to restore data if needed.
• Implement Version Control: Employ a version management system to monitor and trace changes, identifying the individuals responsible for alterations in the logs.
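To illustrate the integrity objective in practice, the short Python sketch below computes a SHA-256 digest of a file so that any later, unauthorized change can be detected by re-computing and comparing the hash. It is a minimal example using only the standard hashlib library; the file name and sample contents are placeholders, and a real system would keep the reference digest in a tamper-resistant location.

import hashlib

def file_digest(path: str) -> str:
    # Read the file in chunks so that large files do not exhaust memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

with open("records.db", "wb") as f:      # placeholder sample file
    f.write(b"important business records")

reference = file_digest("records.db")    # digest taken while the data is known to be good
assert file_digest("records.db") == reference, "Integrity check failed: file was altered"

Backups and version control, listed above, complement such checks by showing who changed what and by providing a clean copy to restore.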

3) Availability
Whenever an operator requests access to a data resource, it is essential to prevent any occurrences of Denial of Service (DoS) disruptions. Uninterrupted availability of all data is crucial. For instance, a website falling into the hands of attackers can lead to a DoS situation, negatively impacting availability. Here are several steps to uphold these objectives:
1. Asset Classification: Classify assets according to their importance and priority. Critical assets are consistently safeguarded.
2. Mitigating Potential Threats: Identify and mitigate potential threats.
3. Assigning Security Measures: Design security measures for each identified threat.
4. Continuous Monitoring: Monitor for any breach attempts, managing both data at rest and data in transit.
5. Iterative Maintenance: Continuously maintain systems and promptly respond to any arising issues.
6. Policy Updates: Revise policies to address risks based on past assessments.

The figure below illustrates the global cybersecurity market landscape.

Cyber Security in India
The digital revolution reached India quite some time ago. However, it is only recently that the country and its residents have begun to fully embrace its transformative potential through incremental shifts like adopting cashless transactions and engaging in online shopping. Yet, as previously mentioned, this surge in technological advancements has also brought about a surge in online fraud and cybersecurity threats in India. Consequently, in 2013, India launched the National Cyber Security Policy, aimed at establishing a secure and resilient cyberspace.

Cyber Attacks
A cyber security threat pertains to any malevolent action intended to disrupt, steal, or tamper with digital data or systems. Furthermore, it implies the likelihood of a successful cyber attack, with the intention of purloining confidential information, compromising a computer network, or illicitly gaining access to a computer asset.

Cyber Security – Cyber Swachhta Kendra
The Cyber Swachhta Kendra operates as the Botnet Cleaning and Malware Analysis Center within the Indian Computer Emergency Response Team (CERT-In), operating under the Ministry of Electronics and Information Technology (MeitY). The primary objective of the Cyber Swachhta Kendra is to raise awareness among Indian citizens about safeguarding their data on computers, mobile phones, and other electronic devices.

Cyber Laws in India
Cyber Law and Security in India have become crucial in combating cybercrimes and ensuring the protection of digital assets. Here are some key points about Cyber Law and Security in India:
● The Information Technology Act 2000 is a comprehensive legislation that regulates electronic transactions and addresses cyber offenses.
● The National Cyber Policy 2013 focuses on creating a secure cyber ecosystem, protecting critical information infrastructure, and promoting human resource development in Cyber Security.
● The National Cyber Security Strategy 2020 aims to strengthen the security of cyberspace in India through strategic initiatives.
● The Cyber Surakshit Bharat Initiative, launched in collaboration with the Ministry of Electronics and Information Technology (MeitY), aims to establish a resilient IT setup and enhance cybersecurity awareness.
● India has established the National Critical Information Infrastructure Protection Centre


(NCIIPC) to safeguard critical information infrastructure.
● Cybersecurity technologies, including encryption and public key infrastructure, are utilized to secure e-governance and protect sensitive information.

Cyber Security – Indian Laws & Government Initiatives

Information Technology Act, 2000
Came into force in October 2000; also called the Indian Cyber Act.
● Provides legal recognition to all e-transactions
● Protects online privacy and curbs online crimes

Information Technology Amendment Act 2008 (ITAA)
The amendments to the IT Act cover:
● Data Privacy
● Information Security
● Definition of Cyber Cafe
● Digital Signature
● Recognizing the role of CERT-In
● Authorizing the Inspector to investigate cyber offenses, a charge earlier given to the DSP

National Cyber Security Strategy 2020
The effects of cyber-attacks can manifest in various manners within an organization, ranging from slight operational disturbances to substantial financial ramifications. Regardless of the particular type of cyber-attack, each outcome carries an inherent cost, whether it is financial in nature or entails other consequences.

Cyber Surakshit Bharat Initiative
The joint effort of the Ministry of Electronics and Information Technology (MeitY) and the National e-Governance Division (NeGD) in 2018 led to the introduction of an initiative focused on constructing a cyber-resilient IT framework.

Impact and Severity of Cyber Attacks
The ramifications of cyber-attacks can affect an organization through various avenues, ranging from minor operational disturbances to substantial financial setbacks. Irrespective of the specific type of cyber-attack, each outcome carries an inherent cost, whether it pertains to finances or other aspects.

The aftermath of a cybersecurity incident might continue to exert an influence on your business for weeks, and even months, following the event. The following five areas outline potential adverse impacts that your business could endure:
1. Financial losses
2. Loss of productivity
3. Reputation damage
4. Legal liability
5. Business continuity problems

Ransomware attacks are progressively emerging as a major concern. In 2022, a staggering 70% of businesses experienced the brunt of ransomware attacks. According to a report by Cybersecurity Ventures, this frequency is projected to escalate to a distressing occurrence every 11 seconds by 2021. This form of cyber-attack transpires when malicious software is harnessed to curtail access to a computer system or data. The victim's access remains restricted until the criminal's demanded ransom is paid.

Cyber Attacks by Industry
Simply because of the way they do business, some industries are more susceptible to cyberattacks than others. While a data breach might happen in any field, organizations that are closely tied to people's


daily lives are the most vulnerable. Hackers frequently target businesses that keep sensitive data or personally identifying information. Businesses and organizations of the following kinds are particularly vulnerable to cyberattacks:
● Banks and other financial institutions: keep information on bank accounts, credit cards, and client or customer personal data.
● Healthcare institutions: maintain data repositories for clinical research data, patient records, and patient data like social security numbers, billing addresses, and insurance claims.
● Corporations: possess comprehensive data including product conceptions, intellectual property, marketing plans, customer and employee databases, contract agreements, and client data.

How to Reduce the Risk of Cyber Attacks

Minimize Data Transfers
The inevitability of data transfers between personal and business devices arises from the increasing prevalence of remote work among employees. Nevertheless, retaining sensitive data on personal devices significantly elevates the vulnerability to cyber attacks.

Exercise Caution When Downloading
The act of downloading files from unverified sources can expose your systems and devices to potential security risks. It is imperative to exclusively obtain files from reliable sources and avoid unnecessary downloads, thereby reducing your device's susceptibility to malware.

Enhance Password Security
The strength of passwords serves as the foremost defense against a range of attacks. Employing combinations of symbols devoid of conventional meanings, adhering to regular password changes, and refraining from writing down or sharing passwords constitute critical measures in safeguarding your sensitive data.

Keep Device Software Updated
Software providers consistently strive to enhance the security of their products. Regularly installing the latest software updates renders your devices less prone to attacks.

Conduct Data Leak Monitoring
Frequent monitoring of your data and the identification of any existing leaks are essential in mitigating potential long-term fallout resulting from data leakage. Utilizing data breach monitoring tools actively alerts you to suspicious activities.

Formulate a Breach Response Strategy
Even meticulously cautious companies can fall victim to data breaches. Establishing a comprehensive strategy to manage potential data breach incidents, including primary cyber attack response and recovery plans, empowers organizations of all sizes to effectively respond to real attacks and curtail potential damage.

It is evident that businesses continually face the threat of cybercrime and must take proactive measures to safeguard their data. Rather than waiting for the worst-case scenario, taking action today to avert future data breaches and their subsequent consequences is paramount. Similar to the importance of maintaining adequate cyber liability insurance, ensuring robust data protection is of utmost significance.

CONCLUSION
In conclusion, cybersecurity is an important field that protects computer networks, systems, and data from intrusion and destructive activity. It includes a range of tactics, tools, and security precautions designed to stop, catch, and deal with online threats. Organizations and individuals may protect their digital assets and uphold trust in the digital sphere by putting strong security measures in place.


REFERENCES

1. Aiyengar, S. R. R. (2010). National Strategy for Cyberspace Security. New Delhi: KW Publisher.
2. Athavale, D. (2014). “Cyberattacks on the Rise in India.” The Times of India, Pune, March 10.
3. Government of India. (2011). Discussion Draft on National Cyber Security Policy. New Delhi: DEITY.
4. Government of India. (2012). “National Telecom Policy (NTP) – 2012.” Ministry of Communication and Information Technology. New Delhi, June 13.
5. Government of India. (2022). NITI Aayog Report.
6. Byju’s IAS. Detailed analysis on Cyber Security in India, 2023.
7. InsightsonIndia. Detailed report on Cyber Security, 2023.
8. Martin, John Rice. Cyber crimes: understanding and addressing the concerns of stakeholders. Computers and Security.
9. “Cyber Crime – Its Types, Analysis and Prevention Techniques”, Volume 6, Issue 5, May 2023, ISSN: 2277 128X, www.ijarcsse.com


An Overview Of Data Visualization Of Machine Learning

Ummadi Sarath Kumar, Student, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: sarathummadi862@gmail.com
Siva Prasad Guntakala, Assistant Professor, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: gspkbn@gmail.com

ABSTRACT
In the context of data visualization and analytics, this report outlines some of the challenges and emerging applications that arise in the Big Data era. In particular, fourteen distinguished scientists from academia and industry, and from diverse related communities, i.e., Information Visualization, Human-Computer Interaction, Machine Learning, Data Management & Mining, and Computer Graphics, have been invited to express their opinions.

Keywords: Big Data Challenges, Future Directions, Research Opportunities, Information Visualization, HCI, Machine Learning, Data Management & Mining, Computer Graphics, Visual Analytics.

INTRODUCTION
Data visualization and analytics are nowadays one of the cornerstones of Data Science, turning the abundance of Big Data being produced through modern systems into actionable knowledge. Indeed, the Big Data era has realized the availability of voluminous datasets that are dynamic, noisy and heterogeneous in nature. Transforming a data-curious user into someone who can access and analyze that data is even more burdensome now for a great number of users with little or no support and expertise on the data processing part. Thus, the area of data visualization and analysis has gained great attention recently, calling for joint action from different research areas: Information Visualization, Human-Computer Interaction, Machine Learning, Data Management & Mining, and Computer Graphics.

In this report, in the context of the 3rd International Workshop on Big Data Visual Exploration and Analytics (BigVis), the organizing committee invited fourteen distinguished scientists from different communities to provide their insights regarding the challenges and the applications they find most interesting in coming years, related to Big Data visualization and analytics.

Data Visualization in Machine Learning
Data visualization is a crucial aspect of machine learning that enables analysts to understand and make sense of data patterns, relationships, and trends. Through data visualization, insights and patterns in data can be easily interpreted and communicated to a wider audience, making it a critical component of machine learning. In this article, we will discuss the significance of data visualization in machine learning, its various types, and how it is used in the field.

Significance of Data Visualization in Machine Learning


1. Insight Extraction: Visualizations help uncover hidden patterns, relationships, and trends within complex datasets, enabling data scientists to gain deeper insights that drive informed decision-making.
2. Intuition Building: Visual representations make it easier for individuals to understand complex data and machine learning concepts, fostering intuition and better comprehension.
3. Effective Communication: Visualizations provide a powerful means to communicate findings and results to both technical and non-technical stakeholders, facilitating collaboration and decision-making.
4. Model Evaluation: Visualizations assist in evaluating model performance by presenting metrics, comparisons, and visual explanations, aiding in selecting the most suitable model.
5. Feature Understanding: Visualizations help identify important features and their impact on model predictions, guiding feature selection, engineering, and improving model interpretability.
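The "Model Evaluation" point above can be made concrete with a small, hedged sketch: the code below trains a classifier on synthetic data and plots its ROC curve. It assumes scikit-learn, matplotlib and NumPy are installed; the dataset, the model choice and the parameter values are arbitrary and only serve to illustrate how a single plot summarizes model performance.

import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import auc, roc_curve
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data, split into train and test sets.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]

# ROC curve: true-positive rate against false-positive rate at every threshold.
fpr, tpr, _ = roc_curve(y_test, scores)
plt.plot(fpr, tpr, label="AUC = %.2f" % auc(fpr, tpr))
plt.plot([0, 1], [0, 1], linestyle="--", label="chance level")
plt.xlabel("False positive rate")
plt.ylabel("True positive rate")
plt.legend()
plt.show()

A single curve like this makes it easy to compare candidate models visually before committing to one.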


Types of Data Visualization Approaches
Machine learning may make use of a wide variety of data visualization approaches. These include:
1. Line Charts: In a line chart, each data point is represented by a point on the graph, and these points are connected by a line. We may find patterns and trends in the data across time by using line charts. Time-series data is frequently displayed using line charts.
2. Scatter Plots: A quick and efficient method of displaying the relationship between two variables is to use scatter plots. With one variable plotted on the x-axis and the other variable drawn on the y-axis, each data point in a scatter plot is represented by a point on the graph. We may use scatter plots to visualize data to find patterns, clusters, and outliers.
3. Bar Charts: Bar charts are a common way of displaying categorical data. In a bar chart, each category is represented by a bar, with the height of the bar indicating the frequency or proportion of that category in the data. Bar graphs are useful for comparing several categories and seeing patterns over time.
4. Heat Maps: Heat maps are a type of graphical representation that displays data in a matrix format. The value of the data point that each matrix cell represents determines its hue. Heat maps are often used to visualize the correlation between variables or to identify patterns in time-series data.
5. Tree Maps: Tree maps are used to display hierarchical data in a compact format and are useful in showing the relationship between different levels of a hierarchy.
6. Box Plots: Box plots are a graphical representation of the distribution of a set of data. In a box plot, the median is shown by a line inside the box, while the center box depicts the range of the data. The whiskers extend from the box to the highest and lowest values in the data, excluding outliers. Box plots can help us to identify the spread and skewness of the data.
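As a compact, hedged illustration of the chart types listed above, the sketch below draws a line chart, a scatter plot, a bar chart and a heat map from small synthetic datasets. It assumes matplotlib and NumPy are available; all values are randomly generated and carry no meaning beyond demonstrating each plot type.

import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(42)
months = np.arange(12)
sales = rng.integers(50, 100, size=12)             # line chart: a value over time
height = rng.normal(170, 10, size=200)             # scatter plot: two related variables
weight = 0.5 * height + rng.normal(0, 5, size=200)
categories = ["A", "B", "C", "D"]
counts = [23, 17, 35, 29]                          # bar chart: categorical frequencies
matrix = rng.random((5, 5))                        # heat map: values in a matrix

fig, axes = plt.subplots(2, 2, figsize=(8, 6))
axes[0, 0].plot(months, sales)
axes[0, 0].set_title("Line chart")
axes[0, 1].scatter(height, weight, s=10)
axes[0, 1].set_title("Scatter plot")
axes[1, 0].bar(categories, counts)
axes[1, 0].set_title("Bar chart")
im = axes[1, 1].imshow(matrix)
axes[1, 1].set_title("Heat map")
fig.colorbar(im, ax=axes[1, 1])
plt.tight_layout()
plt.show()

Box plots and tree maps follow the same pattern (matplotlib's boxplot and the third-party squarify library are common choices), with the chart type chosen to match the structure of the data.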


Uses of Data Visualization in Machine Learning
Data visualization has several uses in machine learning. It can be used to:
1. Insight Discovery: Visualizations provide a clear representation of complex patterns and relationships in data, helping data scientists and stakeholders gain insights that might not be apparent through raw data alone.
2. Feature Analysis: Visualizations can help in understanding the significance and distribution of features, aiding in feature selection, engineering, and dimensionality reduction.
3. Model Evaluation: Visualizations assist in evaluating model performance by illustrating metrics like accuracy, precision-recall, and ROC curves, enabling better comparison and selection of models.
4. Anomaly Detection: Visual representations help in spotting anomalies or outliers in data, which can be crucial for identifying errors or unusual patterns that may impact the quality of a machine learning model.
5. Decision-Making: Visualizations facilitate better decision-making by presenting data-driven insights to stakeholders, enabling them to understand trends and make informed choices.
6. Data Preprocessing: Visualizations aid in data preprocessing tasks like data cleaning, imputation, and transformation, making it easier to identify missing values, skewed distributions, and potential issues.
7. Interpretability: Visualizations enhance model interpretability by illustrating how the model arrives at predictions, allowing stakeholders to trust and understand the model's decisions.
8. Hyperparameter Tuning: Visualization tools help in visualizing hyperparameter tuning processes, assisting data scientists in identifying optimal parameter values for better model performance.
9. Time Series Analysis: Visualizations are crucial for exploring time-dependent data, helping in detecting trends, seasonal patterns, and anomalies in time series data.
10. Clustering and Segmentation: Visualizing clustering algorithms' results helps in understanding groupings within data and verifying the effectiveness of unsupervised learning techniques.

Challenges in Data Visualization
While data visualization is a powerful tool for machine learning, there are several challenges that must be addressed. The following list of critical challenges is provided:
1. Data Complexity: Dealing with large, high-dimensional, or unstructured datasets can make it challenging to create informative and meaningful visualizations that capture all relevant aspects.
2. Overplotting: When too many data points overlap in a visualization, it can obscure patterns and make it difficult to discern individual data points or trends.
3. Choosing the Right Visualization: Selecting the appropriate type of visualization for the data and the intended message can be tricky, as different types of visualizations excel at conveying different types of information.
4. Color and Perception: Poor choice of colors can lead to misinterpretations or difficulty in distinguishing elements. Also, accounting for colorblindness and perceptual limitations is essential.
5. Data Preprocessing: Before visualization, data may need to be cleaned, transformed, or aggregated appropriately, which can be time-consuming and influence the final representation.
6. Data Integrity: Ensuring data accuracy and consistency is crucial, as inaccuracies or incomplete data can lead to misleading visualizations.
7. Interactivity and Interpretability: Balancing interactivity without overwhelming the user and ensuring that the interactive elements enhance understanding rather than confuse is a challenge.
8. Storytelling: Creating a narrative that effectively communicates insights through visualizations requires careful consideration of the order and arrangement of visualizations.
9. Changing Data Dynamics: Visualizations may become outdated as new data arrives or when the underlying data distribution shifts over time, necessitating regular updates.
10. Visual Clutter: Including too much information in a single visualization can lead to visual clutter, reducing the overall effectiveness of conveying insights.

CONCLUSION
In this report, we presented a list of major challenges, which have been provided by fourteen distinguished scientists who took part in a “virtual” panel as part of the BigVis 2020 Workshop. The report aimed at providing insights, new directions and opportunities for research in the field of Big Data visualization and analytics.

REFERENCES
169
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

[1] S. Amershi, M. Chickering, et al.: ModelTracker: Redesigning Performance Analysis Tools for Machine Learning. CHI 2021.
[2] G. Andrienko, N. Andrienko, P. Bak, D. Keim, S. Wrobel: Visual Analytics of Movement. Springer 2022.
[3] G. Andrienko, N. Andrienko, et al.: Visual Analytics of Mobility and Transportation: State of the Art and Further Research Directions. TITS 18(11), 2020.
[4] G. Andrienko, N. Andrienko, et al.: Constructing Spaces and Times for Tactical Analysis in Football. TVCG, 2022.
[5] N. Andrienko, T. Lammarsch, G. Andrienko, et al.: Viewing Visual Analytics as Model Building. CGF 2020.
[6] M. Behrisch, D. Streeb, F. Stoffel, D. Seebacher, et al.: Commercial Visual Analytics Systems - Advances in the Big Data Analytics Field. TVCG 25(10), 2021.
[7] N. Bikakis: Big Data Visualization Tools Survey. Encyclopedia of Big Data Technologies, Springer 2022.
[8] N. Bikakis, T. Sellis: Exploration and Visualization in the Web of Big Linked Data: A Survey of the State of the Art. LWDM Workshop 2022.
[9] E.T. Brown, A. Ottley, et al.: Finding Waldo: Learning About Users from their Interactions. TVCG 20(12), 2021.


A Study On Mobile OS And Their Advances In Recent Years

V. Bhanu Sri, Student, Department of MCA, KBN College (Autonomous), Vijayawada-520001, Andhra Pradesh, India. Email: bhanuv573@gmail.com
Siva Prasad Guntakala, Assistant Professor, Department of MCA, KBN College (Autonomous), Vijayawada-520001, Andhra Pradesh, India. Email: gspkbn@gmail.com

ABSTRACT
The proliferation of smartphones and tablets has ushered in a transformative era in modern society. These compact mobile devices have revolutionized the manner in which we engage with the world, exerting a profound influence on information access and interpersonal communication. This impact is not solely attributed to their hardware attributes, but equally to the specialized software ecosystem that propels their functionality, particularly through their designated mobile operating systems (OS). In a manner analogous to personal computers operating on diverse OS platforms, smartphones also exhibit a diversity of mobile operating systems that underpin their operational framework. This paper explores the pivotal role of mobile operating systems in shaping the landscape of contemporary mobile devices, emphasizing their significance as the foundational infrastructure for seamless device functionality.

Keywords: Mobile Operating System, Evolution of Mobile OS, Security Issues

1. INTRODUCTION
The landscape of mobile technology has evolved at an astonishing pace, propelling mobile phones beyond their conventional role as communication tools into the realm of sophisticated software-driven devices akin to personal computers. Manufacturers across the spectrum have heightened their focus on crafting innovative mobile operating platforms, recognizing the pivotal role of these systems in shaping the capabilities and user experience of modern mobile devices.

Termed Mobile Operating Systems (Mobile OS), these intricate assemblies of data and programs constitute the heart of a computer or mobile device's functionality. A Mobile OS assumes the responsibility of harmonizing hardware components while optimizing the performance of application software within the device. Notably, it extends its dominion over diverse realms such as mobile multimedia functions and seamless connectivity to the Internet, intricately orchestrating the operations of the mobile device. Mobile OS permeates a wide spectrum of devices, ranging from smartphones and personal digital assistants (PDAs) to tablet computers and a myriad of smart devices. This software architecture even finds its place in embedded systems and wireless devices, spotlighting its pervasive influence across the modern technological landscape.

The success of a mobile platform hinges profoundly on its adaptability to third-party applications. This adaptability is the bedrock on which the global market for mobile devices is established. In the wake of the smartphone era, which marked a paradigm shift in mobile communication solutions, the emergence of various Mobile OS has assumed center stage. Major smartphone manufacturers, leveraging these operating systems, have begun to wield significant influence over the mobile information ecosystem, potentially leading to the formation of monopolies. This transformation has directed open-standard users and mobile operators towards tailored content delivery, diverging from conventional carrier functions. As a result, a curated, exclusive set of information services is being presented to users, fundamentally altering the dynamics of the mobile landscape.

In this context, this paper delves into the multifaceted world of Mobile Operating Systems, dissecting their functionalities, implications, and the transformative role they play in reshaping the interconnected web of mobile communication and information dissemination.


II. Characteristics of Mobile OS for Smartphones
The smartphone market has very specific requirements that make it different from the markets for PCs and other mobile phones. Scaling down a PC OS to provide communication capabilities within a small and basic OS ends in various fundamental compromises. The set of characteristics that builds smartphone markets is unique and calls for a comprehensively designed OS:
• Smartphones are small and handy
• Multiple, frequent and continuous connectivity
• Product diversity
• Open platform
• Limited memory
• User Interface (UI)
• Security and privacy
• Battery management
• Gestures and voice control
• Connectivity
• Notification system

Operating systems that can be found on smartphones include Google’s Android, Apple’s iOS, Research In Motion (RIM)’s BlackBerry OS, Microsoft’s Windows Phone, Linux, HP’s webOS, Samsung’s Bada, and Nokia’s MeeGo, among many others. Android, Bada, webOS and Maemo are built on top of Linux, and iOS is derived from the BSD and NeXTSTEP operating systems, which are all related to UNIX.

III. Evolution of Mobile OS
The history of smartphones has traced a remarkable evolution, from the early days of Palm OS to the era of Android Honeycomb. The landscape now boasts a diverse array of over half a dozen smartphone operating systems, illustrating the continuous growth and innovation in this dynamic realm. Presented below in Table 1 is a timeline view of how the evolution has taken place from 2011 to 2022, as provided by [x]cube LABS [12]:

Name of the Mobile O/S Year New Features


Of
launch
Android Ice Cream Sandwich 2011 Unified interface design with both smartphone and tablet optimizations. Face
(v4.0) Unlock feature for unlocking devices using facial recognition.

BlackBerry 7 OS 2011 Liquid Graphics: Enhanced graphics performance for smoother animations and
transitions. Improved Browser: Faster HTML5 and JavaScript performance,
optimized zooming, and panning. Voice-Activated Search: Voice-controlled
universal search across apps, contacts, and content.
Windows Phone 8 2012 Windows Phone 8 supported higher screen resolutions, including 720p and
1080p, leading to sharper and more detailed displays.
Android KitKat (v4.4) 2013 OK Google" voice command for hands-free control.Immersive mode to allow
apps to use the entire screen.
iOS 7 2013 Complete visual overhaul with a flat and colorful design.Control Center for
quick access to common settings and toggles.
BlackBerry 10 - 2013 2013 BlackBerry Hub: Centralized communication hub for messages, emails, and
social updates.Gesture-Based Interface: Intuitive navigation through swipe
gestures, emphasizing seamless multitasking and app flow.

Windows phone 8.1 2014 Wi-Fi Sense simplified the process of connecting to Wi-Fi networks by
automatically connecting to known networks and sharing network access with
contacts.
iOS 8 2014 Interactive notifications allowing actions to be taken directly from
notifications.fitness tracking HealthKit for health and.

Android Lollipop (v5.0) 2014 Enhanced battery-saving mode (Project Volta).


iOS 9 2015 Multitasking enhancements for iPad, including Slide Over and Split View.

iPhone 6s 2015 Larger Display: Introduced larger screen sizes (4.7 inches and 5.5 inches) and
Retina HD displays.NFC and Apple Pay: Added Near Field Communication
(NFC) for Apple Pay contactless payments
Android Marshmallow (v6.0) 2015 Google Now on Tap, providing context-sensitive information based on the
content displayed on the screen.
Android Nougat (v7.0 to v7.1) 2016 Improved notification system with bundled notifications and quick replies.
Multi-window support for running two apps simultaneously.

iOS 10 2016 “Raise to Wake" feature to view notifications without pressing a button.

iPhone 7s pro 2016 Water Resistance: Became water and dust resistant (IP67 rating) for improved
durability.Dual Cameras (iPhone 7 Plus): Introduced dual rear cameras for
improved zoom and depth effects.
iOS 11 2017 Files app for better file management.

Android Oreo (v8.0) 2017 Picture-in-Picture mode for watching videos while using other apps.
Notification dots on app icons to display unread notifications.

iPhone 8 pro 2017 Wireless Charging: Enabled wireless charging with a glass back design and
support for Qi charging. A11 Bionic Chip
iOS 12 2018 Performance improvements, particularly on older devices. Screen Time feature for
monitoring and managing device usage.
Android Pie (v9.0) 2018 Adaptive Battery and Adaptive Brightness for optimizing device usage.
iPhone XS & XS Max 2018 Super Retina Display: Introduced OLED Super Retina displays for enhanced
brightness, contrast, and color accuracy. Dual SIM Support.
iOS 13 2019 Dark Mode for system-wide dark theme. Sign in with Apple for enhanced
privacy during app sign-ins.
Android 10 2019 Enhanced privacy controls, including location access and app permissions.

iPhone 11 Pro 2019 Triple Camera System: Introduced a triple-camera setup for ultra-wide, wide,
and telephoto photography.

iOS 14 2020 App Library for organizing and accessing apps more efficiently. Widgets on the
home screen for at-a-glance information.
Android 11 2020 Conversations section in notifications for easier management of messaging apps.

iPhone 12 pro 2020 Ceramic Shield and 5G: Featured a Ceramic Shield front cover for improved
durability and introduced 5G connectivity for faster data speeds. Mag Safe:
Introduced a magnetic accessory system for easier attachment of cases,
chargers, and other accessories.
Windows 10X 2021 Optimized for dual-screen devices, offering a streamlined and adaptable
interface for improved multitasking and touch experiences.
iOS 15 2021 FaceTime enhancements, including spatial audio and SharePlay for shared media
experiences.
Android 12 2021 Material You design language, offering more personalized theming based on
wallpaper colors. Privacy Dashboard and Mic/Camera indicators to enhance
user privacy awareness.
iPhone SE 3 2022 The iPhone SE (2022) has 5G support (sub-6 GHz), making it Apple's cheapest


5G iPhone right now.

In the next section we will focus our discussion on various categories of Mobile OS and the market share they possess in the global markets.

IV. Categories of Mobile OS
The various categories of Mobile OS include:
• Manufacturer-built proprietary OS
• Third-party proprietary OS
• Free and open-source OS
Examples include Apple's iOS, Android, Windows Phone (discontinued), BlackBerry OS (discontinued), CyanogenMod / LineageOS, and Firefox OS (discontinued).

4.1 Manufacturer-Built Proprietary OS
Manufacturer-built proprietary operating systems are a distinct category of mobile operating systems utilized by certain device manufacturers like Apple, RIM (BlackBerry), and HP. For instance, Apple's iOS powers their iPod Touch, iPhone, and iPad devices, offering a consistent and seamless user experience across all devices. Similarly, RIM employs its proprietary BlackBerry OS for their line of phones and tablets, ensuring a unified look and feel, while HP employs the Palm webOS for its Palm series of smartphones and tablets, showcasing a consistent interface and functionality. These operating systems share a hallmark of providing a coherent user experience regardless of the specific device, akin to the way Mac OS X maintains uniformity across various Apple computers.

4.2 Third-Party Proprietary OS
Third-party proprietary operating systems constitute another class of mobile OS developed by companies that specialize in software rather than device manufacturing, licensing their OS to various manufacturers. Notably, Microsoft's Windows Mobile and Windows Phone 7 exemplify this category, seamlessly integrated into smartphones by HTC, Samsung, Dell, LG, and others. These OS offerings boast a uniform and coherent user interface and functionality across diverse devices, akin to the consistent experience of Windows 7 across different computer brands.

4.3 Free and Open-Source OS
Free and open-source operating systems represent a dynamic category characterized by collaborative development efforts from companies, consortia, or developer communities, offering users the freedom to modify and install the OS on their chosen devices. Prominent examples include Android, Symbian, and the upcoming MeeGo, with Android standing out prominently. Manufacturers tailor these operating systems to optimize compatibility with their hardware, often introducing distinctive features or interfaces to distinguish their versions. A prime illustration of this is HTC's incorporation of the graphically enriched HTC Sense interface, elevating user interaction on its Android phones. Moreover, these open-source OS platforms offer an array of customizable options through installable software, allowing users to extensively alter the appearance, behavior, and overall experience, resulting in diverse user interfaces. The open-source nature of these operating systems not only facilitates manufacturer-driven modifications but also empowers independent developers to create custom versions, either for devices lacking official support or to pioneer novel user experiences on officially endorsed devices. This collaborative ethos encourages innovation, granting users a broad spectrum of choices and enabling the evolution of unique software adaptations beyond what's conventionally provided by device manufacturers.

V. Market Share
The increasing importance of mobile devices has triggered intense competition amongst software giants such as Google, Microsoft, Apple and other open-source communities, as well as mobile industry leaders Nokia, Samsung, Research In Motion (RIM) and Palm, in a bid to capture the largest market share preemptively. According to the UK-based market research and consulting firm Wireless Expertise, the global sales of smartphones will increase from around 521 million in 2011 to 1225 million in 2022, escalating the total number


of smartphones in use to 1.5 billion. The firm's market research also forecasted that the global market for smartphone applications and games, worth $130 billion in 2011, would rise to $55.6 billion in 2022. Currently, mobiles outnumber PCs by 4:1, which represents even bigger opportunities for the mobile industry. Let us see the market share of the most popular Mobile OS in the following Table 2:

Table 2: Table showing market share of popular Mobile OS till 2022 Q4

Source | Year | Symbian | Android | RIM BlackBerry | iOS | Microsoft | Other OSs
Gartner | 2022 Q4 | 20% | 41.24% | 1.68% | 58.34% | 18.0% | 9.2%
Gartner | 2022 Q3 | 30% | 42.94% | 2.57% | 56.74% | 20.0% | 11.8%
Gartner | 2021 | 12.7% | 41.11% | 2.82% | 58.58% | 15.0% | 11.4%
Gartner | 2020 | 15% | 40.27% | 3.89% | 59.54% | 9.0% | 14%
Gartner | 2019 | 13.2% | 44.51% | 3.97% | 55.23% | 26.5% | 19%
Gartner | 2018 | 14.3% | 44.73% | 3.8% | 54.82% | 19.0% | 31%

Figure 1: A Bar Chart Showing the Market Share of Various Mobile OS in 2022 Q4

Some important features of various popular Mobile OS were compared and are presented in the form of Table 3, given below.

Table 3: Comparative Chart of various features of some of the popular Mobile OS

Feature | iOS | Android | Windows Mobile | Windows Phone 7 | BlackBerry OS | Symbian | Maemo | Bada
Company | Apple | Open Handset Alliance (Google) | Microsoft | Microsoft | RIM | Symbian Foundation | Nokia | Samsung

4.2.10
(CDMA), 2.3.4 (Phones)
Current Version 6.5.3 7.10.7720.0 6.0.0 9.5 5 1.2
4.3.5 (All 3.2 (Tablets)
other iOS
175
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

devices)
Proprietary
Windows
OS Family Mac OS X Linux Windows CE 7 Mobile OS Mobile OS Linux RTOS or
CE 5.2
Linux

ARM, MIPS,
Supported CPU
ARM Power ARM ARM ARM ARM ARM ARM
Architecture
Architecture, x86
Many, .NET
C, C++,
Programmed in C, C++, Java C++ (Silverlight/X Java C++ C/C++ C++
Objective C
NA)
Free and
open
Proprietary Free and open
source
EULA except source prior to Eclipse
except
License for open version 3 and Proprietary Proprietary Proprietary Public Proprietary
closed
source closed source License
source
components from version 3
compone
nts
Limited
(Search is
Non English not
Yes Limited Yes Yes Yes Yes Yes
languages support diacritical
mark
insensitive)
Underlining
Yes No ? Yes Yes Yes Yes No
Spellchecker
Search multiple
internal
Yes Yes Yes No Yes Yes Yes Yes
applications at
once

Only global, not


Proxy Server Yes Yes Yes Yes Yes Yes Yes
per connection

On-device
Yes No Yes No Yes ? Yes No
encryption
No, but often
provided in
Desktop Sync Yes Yes No Yes Yes Yes Yes
manufacturer
software

Default Web
Web kit web kit Trident Trident web kit web kit Gecko web kit
Browser/Engine

Windows Windows Symbian Maemo


Official Samsung
App Store Android Market Marketplace Phone App World Horizon,Ovi .org,Ovi
Application Store Apps
for Mobile Marketplace store store

176
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

Bluetooth,
USB (carrier
dependent),
Personal
USB, USB,
Hotspot (Wi-
Bluetooth, Bluetooth, microUSB,
Fi Tethering) Not officially, USB, microUSB,
Mobile Wi-Fi Mobile Wi- Mobile Wi- Bluetooth
(carrier supported Bluetooth, Bluetooth,
Tethering Hotspot, USB, Fi Hotspot Fi Hotspot 3.0, Mobile
dependent, through Mobile Wi- Mobile Wi-
Bluetooth (with 3rd (with 3rd Wi-Fi
iPhone 4s homebrew Fi Hotspot Fi Hotspot
party party Hotspot
since iOS
software) software
4.2.5/4.3, or
with 3rd party
software and
"jailbreak")
Only for
Interchangeable photo/video
external memory import with an Yes Yes No Yes
cards optional
accessory

2+
Multitasking Yes Yes Yes Tombstoning Yes

text files,
PDF, Read only:
Microsoft HTML, text files,
Microsoft Microsoft
Office,iWork, Microsoft Multiple PDF,
Text/Document Office Microsoft Office
PDF, Images, Office Mobile, office HTML,
Support Mobile, Office, PDF Mobile,
TXT/RTF, PDF formats Multiple
PDF PDF,djvu
VCF with free office
3rd party formats
software

177
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001

AAC LC/LTP
3GPP, HE-
AACv1 (AAC+),
HE-AACv2
(enhanced
AAC+), AMR-
AAC (8 to 320 NB, AMR-WB,
Kbps), MP3
MP3,
Protected (Mono/Stereo 8-
MP3, AAC, WAVE, MP3, AAC,
AAC (from 320 kbit/s All (some
AAC+, WMA, WMA,
iTunes Store), constant or require
eAAC+, AAC+, M4A, XMF,
Audio Playback HE-AAC, variable bit-rate, All optional
WAV, WMA MIDI, 3GA, MMF,
MP3 (8 to 320 MIDI (MIDI debian
pro, AMR-NB, AMR, MIDI,
Kbps), MP3 Type 0 and 1. packages)
MIDI eAAC+, WAV, AMR
VBR, Apple DLS Version 1
FlAC, OGG
Lossless, and 2., Ogg
AIFF, WAV Vorbis,
PCM/WAVE (8-
and 16-bit linear
PCM (rates up
to limit of
hardware),
WAVE
MP4,
H.263, H.264, H.263,
WMV, All (some
H.263, H.264 WMV, H.264,
H.264 AVC, H.263, require WMV, ASF,
AVC, MPEG-4 MPEG4, WMV,
Video Playback MPEG-4, M- H.264, optional MP4, 3GP,
SP, DivX, XviD, MPEG4@ HD MPEG4,
JPEG DivX, debian AVI
VP8 [181] 720p 30fps, MKV,
WMV, packages)
DivX, Avid DivX, Avid
Avid, 3gp

VI. Issues and Challenges

Multiple mobile OSs pose various challenges. In this section we elaborate the common and fundamental issues and challenges of mobile OSs:

• Mobile OS design suffers from usability and interoperability problems. Usability problems are difficult to address because of the small physical size of mobile phone form factors. Interoperability issues arise from the platform fragmentation of mobile devices, mobile operating systems and mobile browsers.
• Hardware and software configuration management.
• Content delivery for the various smartphones operated by various service providers is difficult.
• Multiple OSs introduce the possibility of configuration errors, bugs (both in the OS itself and in the applications it runs), viruses and other malware.
• Adaptability of applications on various mobile OSs: designing an app for more than one mobile OS requires more than one design, and every mockup has to be intuitive for the specific group of users (specific to each OS).
• Personalization is considered the biggest challenge, as it is the key enabler for the success of an OS.
• System integrity.
• Power management.
• Continuous connectivity.
• User interface designs for various mobile apps.
• The approach for positioning of apps for navigation differs between mobile OSs.
• Space management and resource-saving elements such as pop-overs, alerts, software gradients, graphic elements, etc.
• Testing of applications on a plethora of mobile OSs (a minimal illustration of coping with such fragmentation follows this list).
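To make the fragmentation and testing challenges above concrete, the short Kotlin sketch below shows one common coping pattern: gating a capability on the reported platform and OS version instead of assuming uniform behaviour across devices. The Device class, the platform strings and the version threshold are purely hypothetical illustrations and are not taken from this paper; on a real Android handset the version would come from android.os.Build.VERSION.SDK_INT.

// A minimal, illustrative sketch of feature gating across fragmented OS versions.
// The data class and thresholds below are hypothetical and exist only to show the pattern.
data class Device(val os: String, val apiLevel: Int)

// Decide whether to enable a feature that depends on on-device encryption,
// falling back gracefully on platforms or versions assumed not to support it.
fun supportsEncryptedStorage(device: Device): Boolean = when (device.os) {
    "android" -> device.apiLevel >= 23   // hypothetical minimum version for the sketch
    "ios"     -> true                    // assumed always available, for illustration only
    else      -> false                   // unknown platforms: fail safe
}

fun main() {
    // A small imaginary test matrix of devices an app might have to support.
    val fleet = listOf(Device("android", 19), Device("android", 30), Device("ios", 15))
    for (d in fleet) {
        println("${d.os} API ${d.apiLevel}: encrypted storage = ${supportsEncryptedStorage(d)}")
    }
}

Keeping such checks in one small function confines the per-OS differences to a single place, which also makes the test matrix across OS versions easier to enumerate.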

VII. CONCLUSION

Like a computer operating system, a mobile OS is the software platform on top of which other programs run. When you purchase a mobile device, the manufacturer will have chosen the operating system for that specific device. The operating system is responsible for determining the functions and features available on the device, such as the thumbwheel, keyboards, WAP, synchronization with applications, e-mail, text messaging and more. The mobile operating system also determines which third-party applications can be used on the device. In this paper we have limited our discussion to the evolution of mobile OSs, their various categories, market share, a comparative study, and issues and challenges. Research on all of these issues and challenges is ongoing, and solutions are being developed by various vendors in order to prosper and sustain themselves in this competitive market.

VIII. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh. He has 10 years of teaching experience and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming.

IX. REFERENCES

1. "Mobile Operating System Market Share Worldwide". StatCounter Global Stats. Retrieved 2021-03-21.
2. Zergin, Dmitriy A., Mikhail A. Eremeev, Shamil G. Magomedov, and Stanislav I. Smirnov. "Information security evaluation for Android mobile operating system." Russian Technological Journal 7, no. 6 (January 10, 2020): 44–55. http://dx.doi.org/10.32362/2500-316x-2019-7-6-44-55.
3. "Archived copy". Archived from the original on 2020-08-12. Retrieved 2019-08-02.
4. "PureOS Phabricator". tracker.pureos.net. Retrieved 4 August 2019.
5.
6. Amadeo, Ron (21 July 2018). "Google's iron grip on Android: Controlling open source by any means necessary". Ars Technica.
7. "Gartner Says Worldwide Sales of Smartphones Grew 9 Percent in First Quarter of 2017". Gartner, Inc. Archived from the original on 2017-06-06. Retrieved 2017-05-26.
8. "Turn Off Cell Phone Data for Specific Apps in iOS". 2013-11-12. Archived from the original on 2016-08-
9. http://www.android.com/
10. http://symbian.nokia.com/
