INDEX
S. NO  TITLE OF THE PAPER
1      A Study and Survey of Big Data Using Data Mining Techniques
…and random forest employ diverse methodologies, while rotation forest employs feature extraction to create effective ensembles.

2. Descriptive Data Mining:
Descriptive models are designed to examine historical events within the data, offering insights that can guide future approaches. These models delve into past occurrences, mining historical data to decipher the factors contributing to past successes or failures. By quantifying relationships within the data, descriptive models can classify entities, such as grouping customers into segments based on their attributes. This distinction sets them apart from predictive models, which focus on predicting the behavior of individual entities, such as customers.
Several approaches have stemmed from the realm of descriptive models, including:

● Association Rules Mining: an approach for exploring relationships of interest between variables in huge databases. Given groups of transactions, it discovers rules that forecast the presence of an item based on the presence of other items in the transaction. It is applied to guide the positioning of products inside stores so as to increase sales, to investigate web server logs in order to deduce information about website visitors, or to study biological data to discover new correlations. Examples of association rules mining techniques are Frequent Pattern (FP) Growth and Apriori. Apriori explores rules whose support and confidence values exceed predefined minimum thresholds.

● Clustering: cluster analysis is an unsupervised learning technique that groups similar objects together, well separated from the objects in other groups. Examples include grouping related documents in emails, or proteins and genes with similar functionalities. Many types of clustering techniques have been introduced, such as non-exclusive clustering, where a data item may belong to multiple clusters, and fuzzy clustering, which considers a data item a member of all clusters with different weights ranging from 0 to 1. Hierarchical (agglomerative) clustering, on the other hand, creates a set of nested clusters arranged as a hierarchical tree. K-means is the most famous clustering algorithm; it uses a partitioning approach to separate the data items into a predetermined number of clusters, each with a centroid, so that the data items in a cluster are closest to its centroid. K-medoids is a clustering algorithm related to k-means that chooses actual data points as centers.

Optimization involves identifying the most efficient and effective alternatives under specified constraints, by maximizing desired factors and minimizing undesired ones. Genetic algorithms are well-known tools for optimization and search problems, employing a method akin to simulated evolution to breed computer-generated solutions. The evolutionary process begins with a population of randomly generated individuals. With each successive generation, the fitness of each individual is evaluated, guiding selection for the next iteration. The algorithm concludes either when a predetermined maximum number of generations is reached or when an acceptable fitness level is attained in the population.

Data mining techniques also play a crucial role in data preprocessing. This involves tasks such as cleansing data by eliminating outliers through clustering methods and refining data by applying regression techniques to reduce noise. Sampling, a statistical technique, is frequently employed in data preprocessing before deploying most data mining methods, because working with the complete dataset of interest can be prohibitively expensive and time-consuming. By sampling, a representative subset is selected for analysis, enabling more efficient and manageable processing.

IV. EVOLUTION TO BIG DATA ANALYTICS TECHNIQUES:
The term 'Big Data' made its debut in 1998 within a Silicon Graphics (SGI) slide deck by John Mashey titled "Big Data and the Next Wave of InfraStress". The connection between Big Data and data mining was evident from the outset: the first book mentioning 'Big Data' emerged in 1998, authored by Weiss and Indurkhya, and the first academic paper with 'Big Data' in its title was written by Diebold in 2000.

The term 'Big Data' stems from the vast amount of data generated daily. Usama Fayyad's presentation at the KDD BigMine'12 Workshop highlighted staggering statistics on internet usage: Google handles over 1 billion queries daily, Twitter produces more than 250 million tweets daily, Facebook sees more than 800 million updates per day, and YouTube registers over 4 billion daily views. The data generated annually is estimated to be on the order of zettabytes, growing around 40% per year. Mobile devices also contribute significantly to this data surge, and major companies such as Google, Apple, Facebook, Yahoo, and Twitter are exploring this data for useful patterns to enhance user experiences.
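The support/confidence filtering that Apriori performs, as described in the association rules discussion above, can be sketched in a few lines. The transactions and thresholds below are invented for illustration; a real Apriori implementation would also generate candidate itemsets level by level:

```python
from itertools import combinations

# Toy transaction database (items invented for illustration).
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]

def support(itemset):
    """Fraction of transactions that contain every item in `itemset`."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def rules(min_support=0.5, min_confidence=0.6):
    """Rules A -> B over item pairs whose support and confidence pass the thresholds."""
    items = sorted(set().union(*transactions))
    found = []
    for pair in combinations(items, 2):
        s = support(set(pair))
        if s < min_support:
            continue  # the Apriori pruning step: drop infrequent itemsets
        for a, b in (pair, pair[::-1]):
            confidence = s / support({a})
            if confidence >= min_confidence:
                found.append((a, b, s, confidence))
    return found

print(len(rules()))  # prints 6: each of the 3 frequent pairs yields a rule in both directions
```

Raising either threshold shrinks the rule set, which is exactly how the minimum support and confidence values control what Apriori reports.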
…analysts, researchers, and business users to make more informed and timely decisions using previously hidden, inaccessible, or unusable data. However, the substantial increase in data volume rendered traditional data mining algorithms inadequate. Consequently, research has focused on enhancing data mining techniques to accommodate Big Data, giving rise to the field of big data analytics.

Big data analytic techniques encompass various data mining functions, with association rules mining and classification tree analysis being of paramount importance. This section delves into the core data mining tasks that have transitioned to big data analytics. It elucidates the enhancements introduced to these techniques to facilitate their adaptation to big data, addressing the "V" dimensions of big data through such modifications. Table 1 serves as a comprehensive summary of this analysis, outlining the evolution of data mining tasks to big data analytics. Techniques are categorized by their data mining task, along with their current status of adaptation to big data analytics and the specific dimension of big data they address. The subsequent subsections delve into the enhancements made to different data mining techniques, catering to the dimensions of big data and fostering their evolution…

…speech recognition, and rules-based decision engines have propelled the growth of video, audio, and image analytics techniques. These advancements have been driven by the availability of real-time data containing rich image and video content. As a result, these techniques hold significant potential to address a range of economic, political, and social challenges.

The continuous evolution of big data technology, coupled with innovative analytics approaches, is shaping a landscape where meaningful insights can be extracted from diverse data sources, paving the way for informed decision-making across various domains.

VI. BIBLIOGRAPHY
G. Siva Prasad, M.C.A., M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh. He has 10 years of teaching experience and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on Power BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.
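The genetic-algorithm loop described in the optimization discussion of this paper (random initial population, fitness-guided selection each generation, stopping at a generation cap or an acceptable fitness level) can be sketched as below. The one-max objective, population size, and operator rates are illustrative assumptions, not taken from the paper:

```python
import random

random.seed(42)

def fitness(bits):
    # Illustrative objective ("one-max"): count of 1-bits in the individual.
    return sum(bits)

def evolve(pop_size=20, length=16, generations=50, mutation_rate=0.05):
    # Begin with a population of randomly generated individuals.
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluate fitness; the fitter half is selected as parents.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)          # one-point crossover
            child = a[:cut] + b[cut:]
            # Flip each bit with small probability (mutation).
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            children.append(child)
        pop = children
        if fitness(max(pop, key=fitness)) == length:   # acceptable fitness reached
            break
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

The two stopping conditions in the loop mirror the ones named in the text: a predetermined maximum number of generations, or an individual reaching an acceptable fitness level.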
III. ADVANTAGES OF IOT
The Internet of Things facilitates several advantages in day-to-day life and in the business sector. Some of its benefits are given below:
o Efficient resource utilization: if we know the functionality and the way each device works, we can definitely increase efficient resource utilization as well as monitor natural resources.
o Minimize human effort: as IoT devices interact and communicate with each other and do many tasks for us, they minimize human effort.
o Save time: because it reduces human effort, it definitely saves our time. Time is the primary factor that can be saved through the IoT platform.
o Enhance data collection.
o Improve security: if we have a system in which all these things are interconnected, then we can make the system more secure and efficient.

As we come to the end: IoT makes human lives straightforward and satisfactory, and has made people's lives much easier. On the other hand, with the increased use of the Internet of Things, the threat to security and safety has also increased, so we should be careful while giving out our details on Internet platforms. Fusion Informatics is a web, mobile and IoT app development company in Atlanta specializing in the development of challenging and complex projects. Since 2000, we've delivered compelling solutions for companies such as Bosch, Lenovo, Bharat Petroleum, Reliance, Tardebulls and others.

IoT is an advanced automation and analytics system which combines artificial intelligence, sensors, networking, electronics, cloud messaging, etc., to deliver complete systems for a product or service. The systems created by IoT have greater transparency, control, and performance.

We have a platform, such as the cloud, that contains all the data through which we connect all the things around us. For example, a house, where we can…
Abstract— Clustering, a prominent technique in unsupervised machine learning, plays a pivotal role in grouping similar data points together in order to discover inherent patterns and structures within complex datasets. This abstract introduces a novel clustering algorithm designed to efficiently partition data into cohesive clusters based on their inherent similarities. The algorithm employs a multi-step approach, starting with the initialization of cluster centers and iteratively optimizing them to minimize intra-cluster distances while maximizing inter-cluster distances. This process involves adaptively adjusting cluster assignments and refining cluster centers to converge towards a stable clustering solution. The algorithm's efficacy is demonstrated through experimental evaluations on various datasets, showcasing its ability to handle datasets with varying densities, shapes, and sizes. Furthermore, the algorithm's computational efficiency is highlighted, enabling its application to large-scale datasets. Overall, this clustering algorithm offers a valuable tool for exploratory data analysis, pattern recognition, and information organization across diverse domains.

Keywords — clustering, proximity, similarity, CF tree, KDD, optimization.

Clustering methods are commonly used for statistical data analysis.

II. CLASSIFICATION OF CLUSTERING ALGORITHMS:
There are a number of assumptions and initial conditions that give rise to many different clustering algorithms. A widely accepted classification structures the clustering methods as follows:
Hierarchical clustering
Partitional clustering
Density based clustering
Grid based clustering
Model based clustering

This classification depends on many factors, and a few algorithms have been developed that combine multiple methods. Recently a number of algorithms have been developed to provide solutions in different fields, but there is no single universal algorithm that solves all clustering problems, a point always worth emphasizing. Some of these observations are listed below.
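The initialize/assign/refine loop that the abstract describes is essentially Lloyd-style iteration. A minimal sketch on two made-up 2-D blobs (the data, k, and the convergence test are illustrative choices, not taken from the paper):

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Lloyd-style iteration: assign each point to its nearest center, then
    move each center to the mean of its cluster, until assignments stabilize."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assignment step: nearest center claims the point
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        new_centers = [  # update step (keep the old center if a cluster empties)
            tuple(sum(x) / len(c) for x in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
        if new_centers == centers:  # converged: centers no longer move
            break
        centers = new_centers
    return centers, clusters

# Two well-separated toy blobs; the loop should recover the 3/3 split.
pts = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
centers, clusters = kmeans(pts, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```

Minimizing intra-cluster distance and maximizing inter-cluster separation, as the abstract puts it, is exactly what the alternating assign/update steps drive toward.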
Density-based algorithms such as DBSCAN can use the MinPts concept to achieve this. Many algorithms based on a mean- or medoid-based approach fail to meet these two clustering criteria of forming heterogeneous clusters and converging on concave-shaped clusters; the paper throws some light on this in the subsequent paragraphs. Similarity or dissimilarity measure: this measure is a real-valued function which indicates the degree of similarity between two objects. Extensive literature on these measures can be found; some popular measures are listed in Table I.

1. HIERARCHICAL CLUSTERING
BIRCH – Balanced Iterative Reducing and Clustering using Hierarchies. It is an agglomerative hierarchical algorithm that uses the Clustering Feature tree (CF-Tree) and incrementally adjusts the properties of subclusters. The algorithm is as follows: • Load data into memory: the CF-Tree is created with a single data scan; the subsequent steps are then faster and more accurate. • Data condensation: the CF-Tree is rebuilt with a larger threshold T. • Global clustering: apply an existing clustering algorithm to the entries of the CF-Tree. • Make another pass over the data set, refining the clusters by reassigning data points to the nearest centers obtained in the steps above. • Continue the process until k clusters are created.

CURE – Clustering Using REpresentatives. This hierarchical clustering method selects well-scattered points from a cluster and then draws them toward the center of the cluster by a specified fraction. The algorithm is as follows: • Initially, every point is its own cluster, and each cluster is defined by the locations of its points. • Well-scattered objects are first selected as representatives of the cluster and then shrunk toward the cluster center by a specified fraction. • At each stage of the algorithm, the two clusters containing the two closest representative points are selected and merged to form a cluster.

ROCK – RObust Clustering using linKs. It is a hierarchical clustering algorithm for categorical attributes that uses a link strategy to form clusters, connecting clusters bottom-up. The algorithm is as follows: initially consider a set of points where each point is a cluster, and compute the links between any two points. Create a heap for each cluster and maintain it. Based on the criterion function, a goodness measure is calculated between two clusters. Merge the pair of clusters whose criterion function has the highest value.

CHAMELEON – This is an agglomerative hierarchical clustering algorithm using dynamic modeling that involves a two-phase method of clustering: partitioning followed by merging. • Initially, at the start of the partition phase, consider all data points as one cluster. • Partition the cluster into a number of sub-clusters using the METIS graph-partitioning algorithm. • The process ends when the largest subset contains a specified number of vertices. • In the merging phase, use the agglomerative hierarchical method to select two clusters whose inter-connectivity and closeness reach a threshold value. The algorithm is repeated until none of the neighboring clusters can satisfy both conditions.

ECHIDNA – This is an agglomerative hierarchical method for clustering network traffic data. The steps of the algorithm are given below: • Input data is extracted from network traffic as a 6-tuple of numeric and categorical attributes. • Each record is inserted iteratively to build a hierarchical tree of clusters called the CF-Tree. • Each record is inserted into the closest cluster of the CF-Tree using a combined distance function over all attributes. • The radius of a cluster determines whether a record should be absorbed into the cluster or the cluster should be split. • Once the clusters are created, the significant nodes are combined to form a cluster tree, which is further compressed to create a concise and meaningful report.

SNN – Shared Nearest Neighbors. A top-down hierarchical approach is used for grouping the objects. The steps of the algorithm are given below: • A proximity matrix is maintained for the distances between the set of points. • Objects are clustered together based on their nearest neighbors, and objects at maximum distance can be discarded.

CACTUS – Clustering Categorical Data Using Summaries. It is a very fast and scalable algorithm for finding clusters; a hierarchical structure is used to generate maximal segments or clusters. A two-phase procedure describes the algorithm as follows: • Attributes are strongly connected if their data values co-occur with large frequency. • Clusters are formed based on the co-occurrences of attribute-value pairs. • A cluster is formed if any segment has a number of elements α times greater than the elements of the others.

2. PARTITIONAL CLUSTERING
Partitional clustering differs markedly from the hierarchical approach: where the latter yields successive levels of clusters through iterative fusions, partitional clustering assigns a set of objects to K clusters with no hierarchical structure. Recent research acknowledges that partitional algorithms are a favoured choice when dealing with large datasets, as they have comparatively low computational requirements; however, when it comes to the coherence of the clustering, this approach is less effective than the agglomerative approach. These algorithms assume the shape of clusters to be hyper-ellipsoidal and essentially attempt to cut the data into n clusters so that the partitioning optimizes a given criterion. Centroid-based techniques, as used by K-MEANS and ISODATA, assign points to clusters so that the mean squared distance of points to the centroid of the chosen cluster is minimized. The sum-of-squared-error (SSE) function is the dominant criterion function in the partitional approach; it is used as a measure of variation within a cluster. One of the most popular partitioning clustering algorithms implementing SSE is k-means.
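The sum-of-squared-error criterion just described can be written down directly; a small sketch with invented points, using Euclidean distance and cluster means as centroids:

```python
import math

def sse(clusters):
    """Sum over all clusters of squared distances from each point to its centroid."""
    total = 0.0
    for pts in clusters:
        centroid = tuple(sum(x) / len(pts) for x in zip(*pts))
        total += sum(math.dist(p, centroid) ** 2 for p in pts)
    return total

a = [(0.0, 0.0), (0.0, 1.0)]        # tight cluster
b = [(10.0, 10.0), (10.0, 11.0)]    # tight cluster far away
print(sse([a, b]))     # 1.0: each point sits 0.5 from its own centroid
print(sse([a + b]))    # 201.0: lumping everything together inflates the error
```

A partitional algorithm searches over assignments for the partition that drives this quantity down, which is why SSE serves as the within-cluster variation measure mentioned above.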
K-MEANS – K-means is undoubtedly a very popular partitioning algorithm. It has been discovered, rediscovered and studied by many experts from different fields: Steinhaus (1956), Ball and Hall (1965), Lloyd (proposed 1957, published 1982) and MacQueen (1967). It is distance-based, and by definition data is partitioned into predetermined groups or clusters. The distance measure used could be Euclidean or cosine. Initially, a fixed set of K cluster centroids is marked at random; k-means then reassigns all the points to their closest centroids and recomputes the centroids of the newly created groups. This iteration continues until the squared error converges. The following steps summarize k-means: 1. Initialize a K-partition based on prior information; a cluster prototype matrix A = [a1, …, ak] is created, where a1, a2, a3, … are the cluster centroids, and the data set D is also initialized. 2. Assign each data point di in the dataset to its nearest cluster centroid. 3. Recalculate the cluster matrix based on the current updated partition, until a1, …, ak show no further change. 4. Repeat steps 2 and 3 until convergence is reached. K-means is probably the most widely studied clustering algorithm, which is why so many variations and improved versions of k-means exist; yet it can show sensitivity to noise and outliers present in the data: even a point that lies far from the cluster centroid can still be forced into the cluster, distorting the cluster shape. K-means does not clearly define a universal method for deciding the total number of partitions at the outset; the algorithm relies heavily on the user to provide the number of clusters k in advance. Also, k-means is not applicable to categorical data. Since k-means presumes that the user provides the initial assignments, it can produce different results on different runs (k-means++ addresses this problem by attempting to choose better starting centers).

K-MEDOIDS – Unlike k-means, in the k-medoids or Partitioning Around Medoids (PAM) method, a medoid represents each group. The medoid, the distinguishing feature of the method, is the most centrally located point of the group, and medoids handle outliers better than centroids. K-means searches for the mean to define the cluster center exactly, which can exaggerate the effect of extreme values, but k-medoids estimates the cluster center from actual points in the region. Basically, this algorithm tries to minimize the dissimilarity between objects and their nearest representative object. The following steps summarize the algorithm:
1. Initialize: a random set of k points is selected from the n data points as medoids.
2. Assign: each data point is associated with the nearest medoid.
3. Swap: for each medoid m and data point d, m and d can be swapped to estimate the change in dissimilarity over all data points associated with m. Steps 2 and 3 are repeated until there are no more changes in the assignments. PAM uses a greedy search, which may fail to find the optimal solution.

CLARANS – "Clustering Large Applications based on RANdomized Search" combines the sampling technique with PAM. The method uses a randomized search to find clusters, and no auxiliary structure is used, which makes the algorithm robust against increases in the dimensionality of the data. The previously mentioned methods assume that the distance function must be Euclidean, but CLARANS uses a local search method and does not prescribe any specific distance function. CLARANS claims to recognize polygonal shapes well; a method called "IR-approximation" is used to cluster non-convex as well as convex polygonal objects.

ISODATA – An interesting method called ISODATA, the "Iterative Self-Organizing Data Analysis Technique", developed by Ball and Hall, also requires k (the number of clusters) and is an iterative variant of the k-means algorithm which splits or merges clusters based on a set of ISODATA parameters with default values. It works by dynamically adjusting clusters according to thresholds such as C (desired number of clusters), Dmin (minimum number of data points for each cluster), Vmax (maximum variance for splitting clusters) and Mmin (minimum distance measure for merging). ISODATA can handle the problem of outliers better than k-means through its partitioning process, and the splitting step can also eliminate elongated clusters.

Advantages and disadvantages: the most prominent advantage of partition-based methods is that they are well suited to spherical clusters. Algorithms such as k-means may exhibit higher sensitivity to noise and outliers, while other methods that resist noise may be computationally expensive. Besides computational complexity, algorithms can be compared on other common features, such as the ability to handle clusters of different shapes; although all clustering algorithms try to solve the same problem, their performance trade-offs merit discussion. Table II summarizes some of these comparisons, showing the different time and space complexities.

FCM – Fuzzy C-Means algorithm: the algorithm is based on the k-means concept of partitioning the dataset into clusters. The algorithm is as follows: • Calculate the cluster center points and the objective value, and initialize the fuzzy membership matrix. • Calculate the membership values stored in the matrix. • If the change in the objective value between successive iterations is smaller than a threshold, stop. • This process continues until the partition matrix and the clusters are created.

3. DENSITY BASED ALGORITHMS
This clustering strategy focuses on the principle of "neighborhoods": for a given radius ε > 0, each neighborhood must contain a minimum number of points, i.e. the "density" of the ε-region around a point must exceed some initial value (Ester et al. 1996). Proximity is not the criterion here; "local density" is primarily measured. A cluster appears as a dense region of data points in the data space, while regions of objects and data with low density also exist and their distances are calculated; objects in low-density regions are outliers or noise. These methods have good tolerance to noise and can detect clusters
of non-convex shapes. Two well-known representatives of density-based algorithms are Density-Based Spatial Clustering of Applications with Noise (DBSCAN) and density-based clustering (DENCLUE).

DBSCAN – Proposed by Martin Ester, Hans-Peter Kriegel, Jörg Sander and Xiaowei Xu in 1996, it is one of the most popular density-based algorithms. It requires that the density in the neighborhood of an object be high enough if the object belongs to a cluster. A cluster skeleton is created from the set of core objects with overlapping neighborhoods; points inside the neighborhood of core objects represent the boundary of clusters, while the rest are simply noise. It requires two parameters: 1) ε, the neighborhood radius, and 2) MinPts, the minimum number of points required to form a dense region. The following steps elaborate the algorithm: 1. An unvisited random point is taken as the initial point. 2. The parameter ε is used to determine the neighborhood (data space). 3. If sufficient data points exist in the neighborhood around the initial random point, the algorithm proceeds and this point is labeled as visited; otherwise the point is labeled as noise or an outlier. 4. If the point is part of a cluster, then its ε-neighborhood is also part of the cluster, and step 2 is repeated for all of its points; this continues until all points in the cluster are determined. 5. Another unvisited data point is processed, and the above steps are repeated until all clusters and noise points are discovered. Although this algorithm shows excellent results against noise, it can fail on high-dimensional data sets and shows sensitivity to MinPts. It fares well compared to k-means in terms of creating clusters of varied shapes.

DENCLUE – Density-based clustering (DENCLUE) was developed by Hinneburg and Keim. This algorithm borrows heavily from the concepts of density estimation and hill climbing. In this method there is an "influence function" which describes the influence of a data point in its neighborhood; the influence functions are summed to obtain the "density function". So it can be said that the influence function is the influence of a data point in its neighborhood, and the density function is the total sum of the influences of all the data points. Clusters are determined by density attractors, the local maxima of the overall density function. DENCLUE has fairly good scalability, with a complexity of O(N), and is able to spot and converge on clusters of unpredictable shapes, but it suffers from sensitivity to input parameters and from the curse-of-dimensionality phenomenon. Advantages and disadvantages: density-based methods can very effectively discover arbitrarily shaped clusters and can deal with noise in the data much better than hierarchical or partitional methods, and they do not require any predefined specification of the number of partitions or clusters; but most density-based algorithms show decreased efficiency as the dimensionality of the data increases. Although an algorithm like DENCLUE copes somewhat better with high dimensionality, it is still far from completely effective.

OPTICS (Ordering Points To Identify the Clustering Structure) is a density-based clustering algorithm similar to DBSCAN (Density-Based Spatial Clustering of Applications with Noise); it adds two more steps to the concept of DBSCAN clustering and can extract clusters of varying density and size in large, high-dimensional data sets, which is useful when clusters have different densities.

The main idea behind OPTICS is to extract the clustering structure of the dataset by identifying density-related points. The algorithm creates a density-based visualization of the data by constructing an ordering of the points, called a reachability plot. Each position in the ordering is associated with a reachability distance, a measure of how easily that point is reached from other points in the data set; points with similar reachability distances may belong to the same cluster.

DBCLASD (Distribution-Based Clustering Algorithm for mining Large Spatial Databases) – The basis of the DBCLASD algorithm is the assumption that the points inside a cluster are uniformly distributed, and it does not require any input parameters. The performance of DBCLASD on large spatial databases is very attractive; its application to seismic data sets shows good performance even on real databases where the data are not uniformly distributed. It works very well for large spatial databases and satisfies the requirements for a good spatial clustering algorithm.

TIME AND SPACE COMPLEXITY OF CLUSTERING ALGORITHMS
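Returning to DBSCAN: the ε-neighborhood test and cluster growth described above can be sketched as follows. The points, eps, and min_pts values are invented for illustration, and a real implementation would index the region queries rather than scan all points:

```python
import math

def region_query(points, p, eps):
    """Indices of all points within eps of p (p's epsilon-neighborhood, self included)."""
    return [i for i, q in enumerate(points) if math.dist(p, q) <= eps]

def dbscan(points, eps, min_pts):
    """Label every point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)
    cluster = 0
    for i, p in enumerate(points):
        if labels[i] is not None:
            continue
        neighbors = region_query(points, p, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1          # neighborhood not dense enough: noise (for now)
            continue
        labels[i] = cluster         # i is a core point: grow a cluster from it
        queue = [j for j in neighbors if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reachable from a core point: border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = region_query(points, points[j], eps)
            if len(more) >= min_pts:  # j is itself a core point: keep expanding
                queue.extend(m for m in more if labels[m] is None)
        cluster += 1
    return labels

pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0), (9.0, 9.0)]
print(dbscan(pts, eps=0.5, min_pts=2))  # [0, 0, 0, 1, 1, -1]
```

The isolated point at (9, 9) has too few neighbors to be a core point and is never density-reachable, so it keeps the -1 noise label, exactly the behavior the DBSCAN discussion describes.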
…vectors with the smallest eigenvalues. • Integration: brings together clusters that are close to each other and have the same direction.

FC – Fractal Clustering. The algorithm follows a hierarchical approach with multiple grid layers for statistical properties and identifies irregular clusters. The algorithm is as follows: • Start with a data sample and consider a threshold value for a given set of points. • Starting with the threshold value, scan the whole data incrementally. • The resulting points are added to each cluster using the Hausdorff Fractal Dimension (HFD) method. • If a small increase exceeds the threshold value τ, the point is declared an outlier and the cluster shape is declared irregular. • Otherwise, the point is assigned to the cluster.

STIRR – Sieving Through Iterated Relational Reinforcement. This algorithm performs spectral partitioning using a dynamical system, as follows: • A set of attributes is considered and a weight W = Wv is assigned to each attribute value. • Weights are propagated over the set of attributes using a combining operator ϕ, defined as ϕ(w1, …, wn−1) = w1 + … + wn−1. • The process is stopped when the dynamical system reaches a fixed point.

5. MODEL BASED CLUSTERING ALGORITHMS:
EM – Expectation-Maximization. This algorithm alternates two steps: expectation (E) and maximization (M). • E-step: the current model parameter values are used to evaluate the posterior distribution of the latent variables; the objects are then fractionally assigned to each cluster based on this posterior distribution, via Q(θ, θ^T) = E[ log p(x_g, x_m | θ) | x_g, θ^T ]. • M-step: the fractional assignment is used to re-estimate the model parameters by the maximum-likelihood rule θ^(t+1) = arg max_θ Q(θ, θ^T). The process is repeated until the convergence condition is satisfied.

COBWEB – A model-based, incremental clustering algorithm which builds a taxonomy of clusters without a predefined number of clusters. The clusters are represented probabilistically by the conditional probability P(A = v | C) with which attribute A has value v, given that the instance belongs to class C. The algorithm is as follows: • The algorithm starts with an empty root node. • Instances are added one by one. • For each instance, the following options (operators) are considered: classifying the instance into an existing class; creating a new class and placing the instance into it; combining two classes into a single class (merging) and placing the new instance in the resulting hierarchy; or splitting a class into two classes (splitting) and placing the new instance in the resulting hierarchy. • The algorithm searches the space of possible hierarchies by applying the above operators and an evaluation function based on category utility.

SOM – Self-Organizing Map algorithm. A model-based, incremental clustering algorithm based on a neural-network configuration. The algorithm is described in two…

SLINK – Single-LINK clustering. A clustering algorithm that builds clusters using the hierarchical method: • Start with a set of points, making one cluster per point. • Use the Euclidean distance to determine the distance between two points. • First connect the pair with the shortest distance among all points. • Merge clusters one link at a time.

III. CONCLUSION
Cluster analysis is a very crucial paradigm in the entire process of data mining and paramount for capturing patterns in data. This paper compared and analyzed some highly popular clustering algorithms, some of which are capable of scaling while others work best against noise in the data. Every algorithm and its underlying technique have advantages and disadvantages, and this paper has comprehensively listed them for the reader. Every paradigm is capable of handling unique requirements of user applications. Extensive research and study have been done in the field of data mining, and there exist popular real-life examples, such as Netflix, market-basket analysis studies for business giants, and biological breakthroughs, which use complex combinations of various algorithms, resulting in hybrids; cluster analysis will subsequently unveil ever more complex database relationships and categorical data. There is a pressing need for some sort of benchmark to let researchers measure the efficiency and validity of diverse clustering paradigms; the criteria should include data from diverse domains (text documents, images, CRM transactions, DNA sequences and dynamic data). Beyond a measure for benchmarking algorithms, consistent and stable clustering is also a barrier: a clustering algorithm, irrespective of its approach to handling static or dynamic data, should produce consistent results on complex datasets. Many efficient clustering methods have been developed, but many open problems still exist, making this a playground for research from broad disciplines.

IV. BIBLIOGRAPHY
G. Siva Prasad, M.C.A., M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh. He has 10 years of teaching experience and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on Power BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.
steps: • Place the network of nodes on the plane where the using R, Generative AI, Block Chain Technology and many
data points are distributed. • Sampling data point and more.
subjecting neighboring nodes and neighboring nodes to its
influence. Another sampling statement and so on. • The
process is repeated until all data points have been sampled V. REFERENCES
repeatedly. • Each group is specifically defined by the node [1] J. Chen, X. L. Xiang, H. Zheng and X. Bao, “Novel
with the data points representing the closest node. cluster central fast decision clustering algorithm”, Appl. Soft
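The E/M alternation described in the model-based clustering section above can be made concrete with a minimal sketch for a one-dimensional, two-component Gaussian mixture. The function name, the min/max initialisation and the toy data are illustrative assumptions, not part of the surveyed algorithm:

```python
import math

def em_gmm_1d(data, iters=50):
    """Minimal EM for a 1-D, two-component Gaussian mixture.
    E-step: fractional (posterior) assignment of points to clusters.
    M-step: maximum-likelihood re-estimation of weights, means, variances."""
    k = 2
    means = [min(data), max(data)]   # illustrative deterministic initialisation
    variances = [1.0] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility resp[i][j] of cluster j for point i
        resp = []
        for x in data:
            dens = [w * math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
                    for w, m, v in zip(weights, means, variances)]
            total = sum(dens) or 1e-300
            resp.append([d / total for d in dens])
        # M-step: re-estimate parameters from the fractional assignments
        for j in range(k):
            nj = sum(r[j] for r in resp) or 1e-300
            means[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            variances[j] = max(
                sum(r[j] * (x - means[j]) ** 2 for r, x in zip(resp, data)) / nj,
                1e-6)
            weights[j] = nj / len(data)
    return weights, means, variances

# Two well-separated groups; the fitted means land near 0 and 10.
data = [0.1, -0.2, 0.3, 0.05, 9.8, 10.2, 10.0, 9.9]
w, m, v = em_gmm_1d(data)
print(sorted(round(x, 1) for x in m))
```

The loop stops after a fixed number of iterations for simplicity; a production version would instead test the convergence condition on the log-likelihood, as the survey notes.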
As shown above:

o Step-1: Select the number K of neighbors.
o Step-2: Calculate the Euclidean distance between the new data point and the training points.
o Step-3: Take the K nearest neighbors as per the calculated Euclidean distance.
o Step-4: Among these K neighbors, count the number of data points in each category.
o Step-5: Assign the new data point to the category for which the number of neighbors is maximum.
o Step-6: Our model is ready.

NAIVE BAYES

Naive Bayes is a simple yet surprisingly powerful algorithm for predictive modeling. The independence assumption that allows the joint likelihood to be decomposed into a product of marginal likelihoods is what makes it "naive"; this simplified Bayesian classifier is therefore called naive Bayes. The classifier applies Bayes' rule, where P(a|c) is the probability of the predictor given the class and P(a) is the prior probability of the predictor. In our system, Naïve Bayes decides which symptoms to put into the classifier and which to leave out.

Naïve Bayes is one of the fastest and easiest ML algorithms for predicting the class of a dataset, and it performs well in multi-class prediction compared to other algorithms. However, because it assumes that all features are independent and unrelated, it cannot learn relationships between features.

8.3 LOGISTIC REGRESSION

Logistic regression is a supervised learning classification algorithm used to predict the probability of a target variable, which here is the Disease.
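Steps 1-6 of the KNN procedure above can be sketched in a few lines; the helper name and the toy two-class data are illustrative, not from the paper:

```python
import math
from collections import Counter

def knn_predict(train, new_point, k=3):
    """Steps 1-5 of KNN: fix K, compute Euclidean distances, take the
    K nearest neighbors, count categories among them, assign majority."""
    # Step-2: Euclidean distance from the new point to every training point
    dists = [(math.dist(x, new_point), label) for x, label in train]
    # Step-3: the K nearest neighbors by distance
    nearest = sorted(dists)[:k]
    # Step-4/5: majority vote among the K neighbors
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((5.0, 5.0), "B"), ((5.2, 5.1), "B")]
print(knn_predict(train, (1.1, 0.9), k=3))
```

With k=3 the query point near the "A" cluster gets two "A" neighbors and one "B" neighbor, so the majority vote assigns it to "A".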
DECISION TREE
V. REFERENCES
An Overview Of Biometric Security Technology

Seela Anitha, Student, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: anithaseela@gmail.com

Siva Prasad Guntakala, Assistant Professor, Department of MCA, KBN College (Autonomous), Vijayawada-520001, A.P, India. Email: gspkbn@gmail.com
Security Risks: Stolen biometric data cannot be changed like a password; if compromised, it can have long-lasting consequences.

Complexity: Biometric systems require advanced technology and algorithms, which can make them more complex and expensive to implement.

Environmental Factors: Biometric accuracy can be affected by changes in an individual's physical appearance (e.g., injuries, aging).

Other Security Devices (e.g., Smart Cards, Tokens)

Pros

Physical Possession: Devices like smart cards or tokens require physical possession, making them less vulnerable to remote attacks.

No Personal Data: Unlike biometrics, these devices don't involve the storage of personal traits; they store coded information.

Changeable: If a token or card is lost or stolen, it can be easily replaced or deactivated.

Cons

Loss or Theft: If a card or token is lost or stolen, unauthorized individuals can gain access until it is reported.

Forgetfulness: Users might forget passwords, leading to frustration and security risks due to password recovery processes.

Reusability: People often reuse passwords, which can lead to multiple accounts being compromised if one is breached.

Weakness to Social Engineering: Passwords can be obtained through manipulation or trickery.

In summary, biometric security offers high accuracy, convenience, and resistance to forgery, but it comes with privacy concerns and potential data-breach implications. Traditional devices like smart cards and tokens offer a different set of advantages and challenges, while passwords and PINs are familiar but have known vulnerabilities. The choice between these options often depends on the specific security needs, user convenience, and the level of risk an organization is willing to accept.

IV. VERIFICATION AND IDENTIFICATION

Biometric systems can work in two main ways: one is like a detective (identification), and the other is like a gatekeeper (verification). We'll call both of these ways "recognition," but some people might use the words "recognition" and "detective" interchangeably.
Either way, whether we call it the detective way or the gatekeeper way, it just depends on how people talk about it. So, understanding these different ways helps us talk about how biometric systems work when they are trying to identify or verify people using their unique traits, like fingerprints or faces.

V. POSITIVE AND NEGATIVE RECOGNITION

Positive recognition and negative recognition are terms commonly used in the context of biometric security technology to describe the outcomes of identification and verification processes. Biometric security technology uses unique physical or behavioral traits to authenticate individuals. Here's what these terms mean:

i. Positive Recognition

Positive recognition, also known as positive identification or positive authentication, refers to the successful matching of a biometric sample (e.g., fingerprint, iris scan, face) against a pre-stored reference template in a database. In other words, the system correctly identifies the individual and grants them access. Positive recognition is a desirable outcome, as it ensures that authorized individuals are granted access, enhancing security and convenience.

Positive Recognition Example: A person uses their fingerprint to unlock their smartphone. The system matches the fingerprint scan with the stored template and grants access to the owner of the device.

ii. Negative Recognition

Negative recognition, also known as negative identification or rejection, occurs when the biometric system correctly determines that a presented sample does not match any of the stored templates. This outcome is essential for preventing unauthorized access and maintaining security.

Negative Recognition Example: An individual attempts to use someone else's face image to gain access to a secure facility using facial recognition technology. The system correctly identifies that the presented face does not match the authorized individual and denies access.

In summary, positive recognition is the successful matching of biometric samples to stored templates, ensuring access for authorized individuals, while negative recognition is the correct rejection of mismatched samples to prevent unauthorized access. Both outcomes play crucial roles in the effectiveness and reliability of biometric security technology.

VI. BIOMETRIC TECHNOLOGIES

Biometric technologies are methods that use unique physical or behavioral traits to identify and verify individuals. These traits are difficult to replicate and provide a more reliable way of confirming someone's identity compared to traditional methods like passwords or PINs. Common types include fingerprint recognition, facial recognition, iris recognition, voice recognition, and palm print recognition; these techniques are widely used for security and authentication purposes.

i. Fingerprint Recognition
- Trait Used: Unique patterns on the fingertips.
- Pros: High accuracy, well-established, widely adopted, difficult to fake.
- Cons: Can be affected by dirty or wet fingers; concerns about privacy due to fingerprint databases.

ii. Facial Recognition
- Trait Used: Distinct features of the face.
- Pros: Non-intrusive, user-friendly, gaining popularity, suitable for real-time identification.
- Cons: Variability in lighting conditions and angles can affect accuracy; potential for false positives/negatives.
- Pros: Non-intrusive, convenient, can be used for remote authentication.
- Cons: Affected by changes in voice due to illness, noise, or age; less secure than some other methods.

1. Uniqueness: Biometric traits, such as fingerprints, iris patterns, and facial features, are distinctive to each individual. This uniqueness ensures that only authorized individuals can gain access.
2. Accuracy: Biometric systems are designed to provide high accuracy in identification and verification, reducing the risk of unauthorized access.
3. Anti-Spoofing Measures: To prevent fraudulent attempts, biometric systems incorporate anti-spoofing mechanisms. These measures detect signs of fakes, such as images or replicas, ensuring that only real individuals are authenticated.
4. Encryption: Biometric data, which is sensitive and personal, should be encrypted during storage and transmission. Encryption safeguards
images are processed to extract vein patterns and enhance the quality of the captured data.
- Pattern Matching: Similar to fingerprint matching, vein-recognition systems compare extracted patterns to stored templates.

6. Gait Recognition
- Silhouette Extraction: Gait-recognition algorithms extract silhouettes or key joint positions from video footage.
- Dynamic Time Warping (DTW): DTW is used to compare gait patterns by aligning and measuring temporal sequences.

These algorithms, often supported by machine learning and pattern-recognition techniques, contribute to the accuracy and reliability of biometric security systems. It is important to note that biometric systems often use a combination of these algorithms and techniques to ensure robust and secure authentication.

IX. CONCLUSION

In conclusion, biometric security technology is a cool new way to make things safer. It uses special things about our bodies, like our fingerprints, faces, voices, and more, to make sure only the right people can access things. This is better than passwords, and it helps stop bad guys from pretending to be someone else.

But there are some things we need to be careful about. People worry about their privacy and the chance that this special information could be stolen. We need to keep this information very safe and follow rules to protect people's privacy.

As the technology gets better, we need to keep learning and making it even more accurate and safe. It is a good idea to use more than one way to stay safe, not just biometrics. By thinking about what's right and wrong and following the rules, biometric security can make our world safer in many ways, like on our gadgets and in important places.

X. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh. He has 10 years of teaching experience and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on Power BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.
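The DTW comparison mentioned under gait recognition above can be sketched with the classic dynamic-programming recurrence; the one-dimensional signals here are toy stand-ins for extracted gait features, not real sensor data:

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW: aligns two temporal sequences
    and accumulates the cheapest warped matching cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    # cost[i][j] = best alignment cost of prefixes a[:i] and b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# A time-shifted copy of a gait-like signal aligns almost perfectly,
# while an unrelated flat signal does not.
base    = [0.0, 1.0, 2.0, 1.0, 0.0]
shifted = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0]
print(dtw_distance(base, shifted) < dtw_distance(base, [3.0, 3.0, 3.0, 3.0]))
```

The alignment step is what lets DTW tolerate the speed variations that make raw point-by-point comparison of gait cycles unreliable.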
Figure 1. Steps involved in achieving data integration.

Related work:

II. Implementation

System model

Evaluation metrics:
The evaluation metrics are Time and Accuracy which are
described as follows:
Time: Time spent developing the model and making
predictions.
Accuracy: It is a ratio of accurately anticipated
observations to the total observations.
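These two metrics can be computed with a small harness; the `fit`/`predict` callables and the toy majority-class model below are placeholders, not the paper's actual models:

```python
import time
from collections import Counter

def evaluate(fit, predict, train_data, test_data):
    """Measures the two evaluation metrics: time spent building the model
    and making predictions, and accuracy as the ratio of correctly
    predicted observations to the total observations."""
    start = time.perf_counter()
    model = fit(train_data)
    preds = [predict(model, x) for x, _ in test_data]
    elapsed = time.perf_counter() - start
    correct = sum(p == y for p, (_, y) in zip(preds, test_data))
    return elapsed, correct / len(test_data)

# Toy example: a "model" that always predicts the majority class.
fit = lambda data: Counter(y for _, y in data).most_common(1)[0][0]
predict = lambda model, x: model
t, acc = evaluate(fit, predict,
                  [(1, "A"), (2, "A"), (3, "B")],
                  [(4, "A"), (5, "B")])
print(t, acc)
```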
Filtering takes place at the fog computing layer, so the cloud receives only the relevant data. This reduces the data traffic between IoT and the cloud. As data traffic is reduced, an outside intruder cannot attack the network, and security is provided to the data transmitted from IoT to the cloud. The proposed system will prove very useful in many applications such as augmented reality, healthcare, agriculture, smart utility services, caching and processing, gaming, and decentralized smart building controls.

III. CONCLUSION

IoT devices generate a vast amount of data. When processing such a vast amount of data, it becomes risky for IoT devices to communicate with the cloud and vice versa. Traditional cloud servers filter data in a centralized fashion, resulting in a single point of failure. Furthermore, outside intruders can target an IoT network, resulting in data tampering. Unreliable and unauthenticated data results from the high number of heterogeneous IoT devices. While various algorithms have been applied to data-classification research, it is observed that some algorithms gave better results than others. This paper suggested a better filtering technique using three steps of data filtering (data cleaning, suspicious-data detection, and event-data detection) at the fog computing layer to improve on existing data-filtering systems. The developed system increases bandwidth and reduces latency, as data filtering takes place at the fog computing layer instead of the cloud computing layer, and it also provides a solution for the secure transmission of data between IoT and the cloud. In the future, the work will be expanded to include the implementation of the system for a variety of applications.
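The three-step fog-layer filtering summarized in the conclusion can be sketched as a simple pipeline. The function name, the thresholds and the sensor readings are hypothetical illustrations of the idea, not the paper's implementation:

```python
def fog_filter(readings, low, high, max_jump, event_threshold):
    """Sketch of three-step filtering at the fog layer:
    (1) data cleaning drops readings outside the plausible sensor range,
    (2) suspicious-data detection drops abrupt jumps between successive
        readings, and
    (3) event-data detection forwards only event readings to the cloud."""
    # Step 1: data cleaning - discard out-of-range (outlier) values
    cleaned = [r for r in readings if low <= r <= high]
    # Step 2: suspicious-data detection - drop values that jump too far
    trusted, prev = [], None
    for r in cleaned:
        if prev is None or abs(r - prev) <= max_jump:
            trusted.append(r)
        prev = r
    # Step 3: event detection - only readings crossing the threshold
    # are transmitted to the cloud, reducing IoT-to-cloud traffic
    return [r for r in trusted if r >= event_threshold]

readings = [21.5, 22.0, 999.0, 23.0, 85.0, 86.0]
print(fog_filter(readings, low=0.0, high=100.0,
                 max_jump=70.0, event_threshold=80.0))
```

Only the two event readings survive all three stages, which is the traffic-reduction effect the conclusion describes.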
comprehensive check, shedding light on the complications that underpin this digital revolution. By unraveling the threads of impact that weave through the lives of young Indians, we endeavor to contribute to a deeper understanding of the forces shaping their online experiences and, accordingly, their futures.

Social networking statistics

Social Media | Active Users | Daily Users | Ages 15-34 | Indian Users
Facebook     | 2.93 billion | 2.95 billion | 93%       | 369.9 million
Twitter      | 369 million  | 229 million  | 82%       | 24 million
LinkedIn     | 100 million  | 137 million  | 63%       | 99 million
Instagram    | 2.3 billion  | 990 million  | 85%       | 229.6 million
Google       | 4.3 billion  | 270 million  | 88%       | 759 million

Users enjoy the low additional cost of connectivity, sharing information, venting opinions and updating one another on happenings in their lives. The expansive use of social networking, however, makes it an intriguing study (6) regarding the risks and consequences for today's youth. With its capability to effectively dissolve boundaries, the anytime-anywhere availability of social networking has had an impact on privacy: users share too much, or false and unnecessary information about themselves, voice opinions, and can even get exposed to fraudsters or cyber criminals; most critical of all is the increased dependence on the Internet and social applications (13). These tend to affect the social, emotional and psychological well-being of the youth. Adverse issues seen include added exposure to cyber-bullying, unknown persons accessing personal information, online dating, sexting, sleep deprivation, exposure to inappropriate digital content, outside influences of third-party groups encouraging money transfers, and low social interaction with limited face-to-face communication.

Examples of popular social networking sites are as follows:

• Facebook is presently one of the most famous social networking applications globally. It is available in 37 languages and permits registered users to create profiles with a 'wall', like a virtual bulletin board, add friends, send messages, comment, and upload and share videos, photos and web links. The application has several public features, such as:
– 'Marketplace', to post and respond to classified advertisements online;
– 'Groups', to publicize events and invite guests and friends to attend;
– 'Pages', to create and promote a personal or business idea or involve others in its content;
– 'Presence technology', which allows video calls and text chat for those online on the website;
– 'Privacy', to block or allow specific or all members from viewing the profile, photos or comments.

• Twitter is a micro-blogging site that doesn't require approval for registered users to broadcast and track responses to brief posts, or "Tweets." The tweets may contain links to other blogs or posts, which other people can subscribe to, follow, or react to, receiving update messages; adding "Hashtags" to a post's keywords functions as a metatag. The public has access to, and can search, the tweets. Ruby on Rails, an open-source web framework that powers Twitter, offers an application developer-friendly API.

• LinkedIn is primarily made for the corporate business sector and allows registered users to create a network of other professionals they know and trust as "connections" online. Unlike Facebook or Twitter, this requires prior relationships. The primary display components on user profiles here are educational and professional qualifications. The program is accessible in 24 different languages.

• Google+ gives users the option to publish status updates or photos to "Circles," which are basically groups in a multi-person instant-messaging social networking system. Friends can watch and comment on these posts. 'Hangouts' allows users to publish text and videos.

II. LITERATURE REVIEW

Social networking sites' explosive growth has generated a lot of discussion about how it is affecting young people all around the world. Researchers and academics have taken a keen interest in how social networking has affected India's young population, a country known for its diverse demographics and quick technological adoption. An overview of the current studies on the effects of social networking on Indian adolescents is intended through this survey of the relevant literature.

• Communication habits and Relationships:

Numerous studies have examined how social networking platforms have changed Indian youths' communication habits and interpersonal interactions. While these platforms have permitted greater connectivity, Gupta and Sharma (2018) have observed a shift from face-to-face encounters to digital communication, which may result in weakened interpersonal skills and a feeling of isolation. Sinha and Bhowmick (2019), on the other hand, hypothesized that social networking could improve relationships by facilitating constant connection and sharing of
with personality and brain diseases:
- poor social skills and narcissistic tendencies, or even an immediate need for pleasure, as in addictive behavior, and other emotional problems that lead to depression, anxiety and loneliness;
- less time for face-to-face communication with loved ones;
- young people tend to feel isolated, separated from the real world, and have a higher risk of depression, low self-esteem, and eating disorders.

Misinformation: enables the spread of false rumors and unreliable information:
- self-diagnosis of health problems and following the advice of amateur medicine;
- making friends with someone to get information;
- unknowingly disclosing intelligence data to the public;
- studies have shown that websites like Facebook influence you through ads, making you spend more money.

The social networking patterns of the people in the study are largely consistent with those observed in previous studies on the influence of popular social media on Indian culture, and with the scope, purpose and manner of accessing these sites. The author also looked at the benefits of social networking sites in culture development, forming self-identity, developing relationships, and acquiring social, communication and technical skills. For future research, it is necessary to increase the sample size and select a more representative sample. This study may also suffer from the shortcomings of judgmental sampling, such as researcher biases and stereotypes, which can lead to bias.

Figure: Time per Day Spent On Social Networking Sites

Figure: Using different types of Media
and its analysis,” in International Conference on Nascent Technologies in the Engineering Field (ICNTE’15), IEEE, 2015.
[4] S. Gao and X. Zhang, “Why and how people use location sharing services on social networking platforms in China,” in 12th International Joint Conference on e-Business and Telecommunications (ICETE’15), IEEE, 2015.
[5] S. Gebauer, Twitter Statistics 2016, Social Claim Blog, Nov. 1, 2016. (https://blog.thesocialms.com/twitter-statistics-you-cant-ignore/)
[6] D. Houghton, A. Johnson, D. Nigel, M. Caldwell, Tagger’s Delight: Disclosure and Liking Behavior in Facebook: The Effects of Sharing Photographs Amongst Multiple Known Social Circles, Oct. 20, 2016. (http://epapers.bham.ac.uk/1723/1/2013-03_D_Houghton.pdf)
[7] A. Isodje, “The use of Social Media for Business Promotion,” in International Conference on Emerging & Sustainable Technologies for Power & ICT in a Developing Society, 2014.
[8] C. C. Kiliroor, C. Valliyammai, “Trust analysis on social networks for identifying authenticated users,” in IEEE 8th International Conference on Advanced Computing (ICoAC’17), 2017.
[9] M. Kumar and A. Bala, “Analyzing Twitter sentiments through Big Data,” in 3rd International Conference on Computing for Sustainable Global Development (INDIACom’16), IEEE, 2016.
[10] M. Madan, M. Chopra, M. Dave, “Predictions and recommendations for the higher education institutions from Facebook social networks,” in 3rd International Conference on Computing for Sustainable Global Development (INDIACom’16), 2016.
[11] S. Mittal, A. Goel, R. Jain, “Sentiment analysis of E-commerce and social networking sites,” in 3rd International Conference on Computing for Sustainable Global Development (INDIACom’16), IEEE, 2016.
[12] P. Purva, A. Yadav, F. Abbasi, D. Toshniwal, “How Has Twitter Changed the Event Discussion Scenario? A Spatio-temporal Diffusion Analysis,” in International Congress on Big Data (BigData Congress’15), IEEE, 2015.
[13] E. Shaw, Status Update: Facebook Addiction Disorder, Sept. 15, 2016. (http://theglenecho.com/2013/01/29/status-update-facebook-addiction-disorder/)
[14] H. Singh, B. P. Singh, “Social Networking Sites: Issues and Challenges Ahead,” in 3rd International Conference on Computing for Sustainable Global Development (INDIACom’16), IEEE, 2016.
[15] C. Smith, Facebook Facts and Statistics, Oct. 20, 2016. (http://expandedramblings.com/index.php/by-the-numbers-17-amazing-facebook-stats
[16] C. Wang, Bo Yang, J. Luo, “Identity Theft Detection in Mobile Social Networks Using Behavioral Semantics,” in IEEE International Conference on Smart Computing (SMARTCOMP’17), 2017.
[17] B. Watch, Social Media 2016, Nov. 3, 2016. (https://www.brandwatch.com/blog/96-amazing-social-media-statistics-and-facts-for-2016/)
[18] Y. Zhou, D. W. Kim, J. Zhang, L. Liu, H. Jin, H. Jin, T. Liu, “ProGuard: Detecting Malicious Accounts in Social-Network-Based Online Promotions,” IEEE Access, vol. 5, pp. 1990–1999, 2017.

Published in: I.J. of Electronics and Information Engineering, Vol. 7, No. 1, pp. 41-51, Sept. 2017 (DOI: 10.6636/IJEIE.201709.7(1).05)
Abstract— Machine learning (ML) is a branch of artificial intelligence in which systems learn in a manner analogous to humans. It centres on the use of data and algorithms, progressively improving its procedures and accuracy. ML has risen to prominence with digitization, reshaping how complex, knowledge-intensive problems and decisions are handled. Machine learning is among the most exciting technologies in computing, unique in character and continually evolving. Its learning capability is now widely available and can be deployed in many settings rather than in a single place. This report aims to provide a comprehensive study of the approaches and applications in the field of machine learning. Multinational companies today use machine learning techniques to improve business decisions, increase productivity, detect viruses, and estimate climate change, and this reliance will only grow. The report covers a wide range of methodologies, from supervised and unsupervised learning to reinforcement learning and deep learning, highlighting their respective strengths and applications in real-world scenarios. Machine learning also finds applications in online search engines, spam filtering for email, and websites that create customized recommendations. Banking software uses it to identify unusual transactions, and mobile applications use it for tasks such as voice recognition; the fundamental aim of machine learning is to develop smart machines.

Keywords— Machine Learning, Machine Learning Methods, Machine Learning Applications, Machine Learning Challenges, Artificial Intelligence

I. INTRODUCTION
Machine learning (ML) is often defined as the field of study that gives computers the ability to learn without being explicitly programmed. Rather than having its behaviour rewritten by hand, a machine learning system adapts its own code; ML is a subfield of AI. Currently, machine learning is distinguished by its strong performance in many areas, most notably information technology, where its application possibilities are nearly infinite. Within the realm of artificial intelligence (AI), there exists a subset known as machine learning (ML). The core concept behind machine learning involves providing a machine or model with data and enabling it to learn autonomously through training. Arthur Samuel introduced a revolutionary concept in 1959: instead of explicitly instructing computers, they should be empowered to learn independently. He termed this approach "machine learning," which has since become the established definition of computers' capacity to self-learn.

Machine learning involves a training method in which an algorithm learns from prepared data. There are multiple ways in which ML models can enhance the efficiency of advanced processes. These models can analyze extensive datasets presented as records, enabling creators to identify variations and explore connections in pursuit of specific patterns within the data.

In this procedure, machines acquire insights from input data, relieving humans of the cumbersome responsibility of manual refinement and adjustment. This highlights the impactful role of machine learning. Within the scope of machine learning, a noteworthy benefit emerges: computers are no longer reliant on explicit and inflexible programming. Instead, they show an impressive capacity to independently fine-tune and improve their algorithms, signifying a stride into the domain of artificial intelligence (AI).

Fundamentally, machine learning explores computer algorithms that naturally improve their performance as they gain experience and interact with datasets. This area, which falls within the scope of artificial intelligence, is centered on developing methods that enable systems to autonomously "learn" from data, resulting in customized answers to particular problems. In today's age of worldwide digital evolution, machine learning emerges as a powerful strategy for streamlined data exploration.

II. MACHINE LEARNING
AI stands for artificial intelligence, which leverages machines to mimic the problem-solving and decision-making capabilities of the human mind. Machine learning is a technique for analyzing various types of digital information, such as numbers, words, and images. It involves training computers to learn and make decisions like the human brain, based on data and experience. The primary goal is to develop models that can enhance themselves, recognize complex patterns, and solve new problems by
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee, Vijayawada – 520001
3. Product Recommendations:
Machine learning finds extensive application in the
operations of diverse e-commerce and entertainment
enterprises like Amazon, Netflix, and more. These
companies employ it to provide users with personalized
product suggestions. For instance, when a search is
conducted on Amazon, subsequent internet browsing on the
same browser yields advertisements for the same product,
facilitated by machine learning.
Google employs a range of machine learning algorithms
to comprehend user preferences and propose products in
alignment with their interests. Similarly, Netflix utilizes
machine learning to provide tailored recommendations for
TV series, movies, and other forms of entertainment.
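The kind of product suggestion described above can be illustrated with a small item-based collaborative filter. The ratings matrix, user names, and item names below are invented purely for illustration; production systems at companies like Amazon or Netflix are vastly more elaborate, but the core idea of scoring unseen items by their similarity to items a user already rated is the same.

```python
from math import sqrt

# Toy user -> {item: rating} data (illustrative values, not real data)
ratings = {
    "alice": {"laptop": 5, "mouse": 4, "desk": 1},
    "bob":   {"laptop": 4, "mouse": 5, "lamp": 2},
    "carol": {"desk": 5, "lamp": 4, "mouse": 1},
}

def item_vector(item):
    """Ratings for one item across all users (0 when unrated)."""
    return [ratings[u].get(item, 0) for u in sorted(ratings)]

def cosine(a, b):
    """Cosine similarity between two rating vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(user, k=1):
    """Score each unseen item by its similarity to items the user rated."""
    items = {i for r in ratings.values() for i in r}
    seen = set(ratings[user])
    scores = {}
    for cand in items - seen:
        scores[cand] = sum(
            rating * cosine(item_vector(cand), item_vector(liked))
            for liked, rating in ratings[user].items()
        )
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # → ['lamp'] (the only item alice has not rated)
```

The same similarity scores are what let a site surface "customers who bought X also bought Y" style suggestions.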
a pivotal role in email spam filtering and the identification of malware.

6. Self-Driving Cars:
One of the most captivating implementations of machine learning is helping to make self-driving cars a reality. Take Tesla, for example: a major car company working on cars that drive themselves, using a training approach much like the way we learn things on our own. Imagine the car as a student learning without a teacher. It figures out how to recognize people and objects on the road by itself, so while driving it knows what is around it and can keep everyone safe. In effect, the car becomes smart enough to know what to do while its occupants relax.

7. Stock Market Trading:
Consider the stock market, known for its volatile nature and constant fluctuations in prices. Machine learning is harnessed to anticipate these shifts. Visualize peering into a crystal ball, except that this crystal ball is a specialized computational entity known as a long short-term memory (LSTM) neural network. This mechanism scrutinizes the intricate patterns within the stock market, striving to deduce its likely trajectory. The process resembles having a highly astute companion who excels at recognizing trends and making informed estimates about the market's future course. When people employ machine learning for stock trading, they are essentially using this cutting-edge crystal ball to make better-informed decisions about buying and selling shares.

8. Speech Recognition:
Google's "Search by voice" option falls squarely within the scope of speech recognition, a widely used application of machine learning. Speech recognition is the transformation of vocal commands into text, often labeled "speech to text" or "computer speech recognition." Machine learning algorithms have gained widespread prominence in speech recognition applications: technologies such as Google Assistant, Siri, Cortana, and Alexa harness speech recognition to comprehend and execute voice instructions.

9. Online Fraud Detection:
In online transactions and financial operations, the Internet serves as a platform for purchases and payments. Machine learning plays a pivotal role in safeguarding our finances. Individuals may engage in deceitful activities, masquerading as someone they are not or attempting to steal funds during online transactions. Here the technology, including a component called a feed-forward neural network, comes to the rescue. It operates through a systematic process in which authentic transactions yield output that is transformed into hash values, and these hash values then serve as input for subsequent rounds. Distinct patterns emerge for genuine transactions and change when fraudulent activity comes into play; consequently, the system detects these anomalies, bolstering the security of our online transactions.

IV. CHALLENGES OF MACHINE LEARNING

The main problem in machine learning arises when there is not enough data or the data is too uniform. Machines cannot learn without information, and if all the data looks the same, the model cannot distinguish anything useful. For a machine to generalize well, the data it learns from should be diverse and varied; machines usually struggle to find useful details when data is very limited or has little variation. A common rule of thumb is to have around 20 examples for each class when training a machine; too few examples make its learning and predictions less accurate.

V. CONCLUSION

Machine learning is significant because it lets computers learn and improve at certain tasks without being directly told how. Learning from data and adjusting to new situations is especially helpful for projects with large amounts of data, complicated choices, or changing conditions. Machine learning offers different methods for making predictions from past data, such as building numerical models. It is currently used for tasks such as recognizing pictures, understanding speech, sorting email, and suggesting tags on Facebook. In the future, researchers will also look into applying this kind of learning to intelligent design.

VI. REFERENCES
[1] Neha Sharma, Reecha Sharma and Neeru Jindal, "Machine Learning and Deep Learning Applications-A Vision", Global Transitions Proceedings, Year: 2021, PP: 24–28, https://doi.org/10.1016/j.gltp.2021.01.004.
[2] Mohsen Shah Hosseini, Guiping Hu and Hieu Pham, "Optimizing Ensemble Weights and Hyperparameters of Machine Learning Models for Regression Problems", Year: 13 January 2022, PP: 1-10, https://doi.org/10.1016/j.mlwa.2022.100251.
[3] Jonathan Schmidt, Mário R. G. Marques, Silvana Botti and Miguel A. L. Marques, "Recent Advances and Applications of Machine Learning in Solid-State Materials Science", npj Computational Materials, Year: 2019, Volume: 5/83, PP: 1-36, https://doi.org/10.1038/s41524-019-0221-0.
[4] Iqbal H. Sarker, "Machine Learning: Algorithms, Real-World Applications and Research Directions", SN Computer Science, Year: 22 March 2021, Volume: 2/160, PP: 1-21, https://doi.org/10.1007/s42979-021-00592-x.
[5] Rutvij H. Jhaveri, A. Revathi, Kadiyala Ramana, Roshani Raut, and Rajesh Kumar Dhanaraj, "A Review on Machine Learning Strategies for Real-World Engineering Applications", Hindawi, Year: 2022, PP: 1-26, https://doi.org/10.1155/2022/1833507.
[6] Jackson Kamiri, Geoffrey Mariga, "Research Methods in Machine Learning: A Content Analysis", International Journal of Computer and Information Technology, Volume: 10/2, Year: March 2021, PP: 78-91.
[7] Daniel Hoang, Kevin Wiegratz, "Machine Learning Methods in Finance: Recent Applications and Prospects", Wiley, Year: 2023, PP: 1-45, DOI: 10.1111/eufm.12408.
[8] Pinky Gupta, "Research Paper on Machine Learning and Its Application", International Research Journal of Engineering and Technology (IRJET), Volume: 09/03, Year: Mar 2022, e-ISSN: 2395-0056, PP: 1483-1486.
[9] Wei Jin, "Research on Machine Learning and Its Algorithms and Development", Journal of Physics: Conference Series, Year: 2020, PP: 1-5, doi:10.1088/1742-6596/1544/1/012003.
[10] Raffaele Pugliese, Stefano Regondi and Riccardo Marini, "Machine Learning-Based Approach: Global Trends, Research Directions, and Regulatory Standpoints", Data Science and Management, Year: 23 December 2021, PP: 19-29, https://doi.org/10.1016/j.dsm.2021.12.002.
[11] Raniyah Wazirali and Rami Ahmad, "Machine Learning Approaches to Detect DoS and Their Effect on WSNs Lifetime", Computers, Materials & Continua, Tech Science Press, Volume: 70/3, Year: 2022, PP: 4921-4946, DOI: 10.32604/cmc.2022.020044.
[12] R.F. Bikmukhamedov and A.F. Nadeev, "Lightweight Machine Learning Classifiers of IoT Traffic Flows", ResearchGate, Year: July 2019, PP: 1-6, DOI: 10.1109/SYNCHROINFO.2019.8814156.
[13] Giovanni Di Franco and Michele Santurro, "Machine Learning, Artificial Neural Networks and Social Research", Quality & Quantity, Year: June 2021, PP: 1-20, https://doi.org/10.1007/s11135-020-01037-y.
[14] Bo Han and Rongli Zhang, "Virtual Machine Allocation Strategy Based on Statistical Machine Learning", Hindawi Mathematical Problems in Engineering, Year: 5 July 2022, PP: 1-6, https://doi.org/10.1155/2022/8190296.
[15] Amir Anees, Iqtadar Hussain, Umar M. Khokhar, Fawad Ahmed, and Sajjad Shaukat, "Machine Learning and Applied Cryptography", Hindawi Security and Communication Networks, Year: 27 January 2022, PP: 1-3, https://doi.org/10.1155/2022/9797604.
The robot's aim is to create a positive and supportive environment, helping children feel more at ease during medical treatments. It's a great example of how technology can be used to improve healthcare experiences, especially for young patients who might find medical procedures stressful or frightening.

through automation in busy healthcare environments.

Erica robot
Diseases around the world. However, it's not all gloom and doom for patients with rare diseases, as Heal, a UK-based biotech firm, has secured $10 million in Series A funding to use AI to develop innovative drugs for rare conditions. Therachon, another Swiss biotech company that leverages AI to develop drugs for the treatment of rare genetic diseases, has received $60 million in funding.

the right pill. And the results were amazing, improving adherence by up to 90%. Genpact's AI solution has been used several times in clinical trials to change the dosage given to specific patients to optimize the results. In this partnership, Bayer takes advantage of Genpact's Pharmacovigilance Artificial Intelligence (PVAI) to not only monitor drug adherence but also detect potential side effects much earlier.
Clinical trials last an average of 7.5 years, costing between $161 million and $2 billion per drug. Unfortunately, 80 percent of clinical trials fail to make deadlines. With over 18,000 clinical studies currently recruiting candidates in the US, the $65 billion clinical trial market needs an overhaul. Extracting useful data from patients' records is perhaps the biggest challenge for pharmaceutical companies. Thankfully, that's where AI and machine learning come into the picture.

approx. 161 billion GB of data as of 2011. With humongous data available in this domain, artificial intelligence can be of real help in analysing the data and presenting results that aid decision making, saving human effort, time, and money, and thus helping save lives. Epidemic outbreak prediction: using machine learning/artificial intelligence, one can study the history of epidemic outbreaks, analyse social media activity, and predict where and when an epidemic can take effect with considerable accuracy. Apart from the aforementioned use-cases there are numerous others, such as personalizing treatment, helping build new tools for patients and physicians, and clinical trials research: applying predictive analytics to identify candidates for a trial through social media and doctor visits.
fore mentioned use-cases there are numerous others
III. CHALLENGES TO ADOPTION OF AI IN PHARMA

While AI has extensive potential to help redefine the pharmaceutical industry, adoption itself is not an easy walk in the park. Challenges that pharma companies face while trying to adopt AI: The unfamiliarity of the technology – for many pharma companies, AI still seems like a "black box" owing to its newness and esoteric nature. Lack of proper IT infrastructure – most IT applications and infrastructure currently in use were not developed or designed with artificial intelligence in mind; worse, pharma firms have to spend heavily to upgrade their IT systems. Much of the data is in a free-text format – pharma companies have to go above and beyond to collate this data into a form that can be analysed. Despite all these limitations, one thing is certain: AI is already redefining biotech and pharma, and ten years from now, pharma will simply regard artificial intelligence as a basic, everyday technology.

IV. LIMITATIONS

Streamlining electronic records, which are messy and unorganized across heterogeneous databases and must be cleaned first. Transparency: people need transparency in the health care they receive, which is quite a task given the complexity of the processes involving artificial intelligence. Data governance: medical data is private and legally inaccessible; consent from the public is important. Hesitance to change: pharma companies are known to be traditional and resistant to change. We have to break the stigma to give the best care we can.

Benefits and Issues

Effective use of incomplete data sets,
space, and endure problems that would injure or kill us. This can even mean mining and digging fuels that would otherwise be hostile for humans.

Replace humans in repetitive, tedious tasks and in many laborious places of work.

Predict what a user will type, ask, search, and do. They can easily act as assistants and can recommend or direct various actions. An example of this can be found in the smartphone.

Can detect fraud in card-based systems, and possibly other systems in the future.

Organizes and manages records.

Interact with humans for entertainment or a task, as avatars or robots.

Can cost a lot of money and time to build, rebuild, and repair. Robotic repair can occur to reduce the time and the humans needed to fix it, but that will cost more money and resources.

V. CONCLUSION

The human being is the most sophisticated machine that can ever be created. The human brain is working hard to create something that is much more efficient than a human being at any given task, and it has had great success, to an extent, in doing so. AI tools like Watson for Oncology, the TUG robot, and the robotic pharmacy have changed the profession considerably. The bigger the healthcare sector gets, the more sophisticated and more technologically advanced the infrastructure it will need. Artificial intelligence is the design and application of algorithms for the analysis, learning, and interpretation of data.

VI. BIBLIOGRAPHY

G. Siva Prasad, M.C.A, M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh. He has 10 years of teaching experience and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on POWER BI, Data Analytics using R, Generative AI, Block Chain Technology and many more.

7. Bass D (2016) Microsoft develops AI to help cancer doctors find the right treatments. Bloomberg.

8. University of California San Francisco. New UCSF Robotic Pharmacy Aims to Improve Patient Safety. Available from: https://www.ucsf.edu/news/2011/03/9510/new-ucsf-robotic-pharmacyaimsimprove-patient-safety. [Last accessed on 2017 Jun 24]

9. McHugh R, Raccoon J. Meet MEDi, the Robot Taking Pain Out of Kids' Hospital Visits. Available from: http://www.nbcnews.com/news/us-news/meet-medi-robottaking-pain-out-kidshospital-visits-n363191. [Last accessed on 2017 Jun 24]
Online fraud can happen every time we make an online purchase, including the use of false accounts and identification documents, as well as the theft of money in the middle of a transaction. A feed-forward neural network assists us in identifying this by detecting whether a transaction is legitimate or fraudulent.

Stock Market Trading: Trading on the stock market frequently makes use of machine learning. Since share values can rise or fall at any time, long short-term memory (LSTM) neural networks are employed to forecast stock market movements.

Medical Diagnosis: Machine learning also supports medical diagnosis, where models trained on patient records and medical images help identify diseases and suggest likely conditions.

Each layer processes and transforms the input data, gradually learning to extract higher-level features. During training, the network adjusts its internal parameters based on the differences between its predictions and the actual target values. This process, known as backpropagation, involves propagating the error backward through the network and updating the weights of the connections. As training progresses over many iterations, the network becomes better at making accurate predictions.

Deep learning excels at tasks like image and speech recognition, natural language processing, and even games, thanks to its ability to automatically learn complex features from large amounts of data.

We can understand the working of deep learning with the same example of identifying cat vs. dog. The deep learning model takes the images as input and feeds them directly to the algorithms without requiring any manual feature-extraction step. The images pass through the different layers of the artificial neural network, which predicts the final output.
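The forward pass, backpropagation, and weight-update cycle described above can be sketched with a deliberately tiny network: a single sigmoid unit trained on the logical-OR function. This is an illustrative toy under simplifying assumptions (one neuron, squared-error loss, plain gradient descent), not the architecture of any system mentioned in the text.

```python
import math
import random

random.seed(0)

# Toy dataset: logical OR (inputs -> target)
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One sigmoid unit: two weights and a bias, randomly initialised
w = [random.uniform(-1, 1), random.uniform(-1, 1)]
b = 0.0
lr = 1.0  # learning rate

for epoch in range(5000):
    for (x1, x2), target in data:
        # Forward pass: compute the prediction
        out = sigmoid(w[0] * x1 + w[1] * x2 + b)
        # Backward pass: gradient of squared error through the sigmoid
        err = out - target
        grad = err * out * (1 - out)   # chain rule
        # Update step: move each weight against its gradient
        w[0] -= lr * grad * x1
        w[1] -= lr * grad * x2
        b    -= lr * grad

preds = [round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for (x1, x2), _ in data]
print(preds)  # → [0, 1, 1, 1] once training has converged
```

Real deep networks stack many such units into layers and automate exactly this loop, which is why more data and more iterations steadily improve their predictions.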
Finance: Deep learning aids in predicting stock prices, fraud detection, credit scoring, and algorithmic trading.

Gaming: It is used for creating realistic animations, simulating virtual characters, and improving game AI.

Autonomous Vehicles: Deep learning algorithms are crucial for self-driving cars to interpret the environment, make decisions, and navigate safely.

Key comparisons between Machine Learning and Deep Learning

Let's understand the key differences between these two terms based on different parameters:

Parameter: Execution time
Machine Learning: The algorithm takes less time to train the model than deep learning, but a long time to test the model.
Deep Learning: Takes a long time to train the model, but less time to test it.

Parameter: Feature Engineering
Machine Learning: Models need a feature-extraction step performed by an expert before proceeding further.
Deep Learning: As an enhanced form of machine learning, it does not need a hand-built feature extractor for each problem; instead, it tries to learn high-level features from the data on its own.

Parameter: Data Dependency
Machine Learning: Although machine learning benefits from large amounts of data, it can work with a smaller amount of data.
Deep Learning: Algorithms depend heavily on a large amount of data, so a large amount of data must be fed in for good model performance.

Parameter: Problem-solving approach
Machine Learning: To solve a given problem, the traditional ML approach breaks the problem into sub-parts and, after solving each part, produces the final result.
Deep Learning: A deep learning model differs from the traditional ML model: it takes the input for a given problem and produces the end result directly, following an end-to-end approach.
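The feature-engineering row of the comparison can be made concrete. In a traditional ML pipeline a human decides which features to extract before any model sees the data; the sketch below hand-crafts two features for toy review strings and applies a trivial threshold classifier. The word lists and reviews are invented for illustration only.

```python
# Hand-crafted feature extraction: the "expert step" a traditional
# ML pipeline needs, and that deep learning tries to learn on its own.
POSITIVE = {"great", "good", "excellent", "love"}
NEGATIVE = {"bad", "poor", "terrible", "hate"}

def extract_features(text):
    """Turn raw text into numbers a simple model can use."""
    words = text.lower().split()
    return {
        "pos_count": sum(w in POSITIVE for w in words),
        "neg_count": sum(w in NEGATIVE for w in words),
    }

def classify(text):
    """Trivial threshold model on top of the hand-made features."""
    f = extract_features(text)
    return "positive" if f["pos_count"] > f["neg_count"] else "negative"

reviews = ["I love this great phone", "terrible battery and poor screen"]
print([classify(r) for r in reviews])  # → ['positive', 'negative']
```

A deep learning model would instead consume the raw text and learn such features itself, which is exactly the trade-off the table summarizes.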
Which one to select among ML and Deep Learning?

V. BIBLIOGRAPHY

VI. REFERENCES
I. INTRODUCTION
In an era characterized by technological
advancements and evolving threats, the defence
sector is undergoing a paradigm shift through the
integration of machine learning (ML) applications.
Machine learning, a subset of artificial intelligence,
has emerged as a transformative force, offering the
potential to revolutionize the way military
operations are conducted, threats are identified, and
strategic decisions are made. As nations strive to
maintain their security in an increasingly complex
and interconnected world, the deployment of ML
technologies holds great promise for enhancing
defence capabilities across a spectrum of domains.
Figure-1 (Machine Learning In Defense)
Traditional defence strategies have often relied on predefined rules and manual analysis of data, which can be time-consuming and can limit decisions in dynamic and rapidly changing scenarios.

What is Machine Learning?
Thanks to machine learning, computers or other devices can teach soldiers to use a variety of fighting systems that are used in a variety of military missions in conflict zones. It offers simulation and instruction in a variety of software engineering skills that are useful in trying

Here are some prominent areas where machine learning is making a substantial impact in the defence sector:

Border protection
One of the most crucial components in ensuring the safety of the entire country is the military system of every given nation. Therefore, military and defence systems are particularly

III. CONCLUSION
decline in each of the next five years, so that by 2027, income growth will be only 2.8% from 2026. That's slower than the 3.1% rate of economic growth overall forecast by the International Monetary Fund (IMF) for that year.

III. MACHINE LEARNING (ML) APPLICATIONS IN MEDIA

Applied machine learning is a living witness to the rapid growth of the media industry across formats such as visual content, audio content (2-D and 3-D), digital advertising, content recommendations, target audiences, content composition and classification, meta-tagging, automated transcription, virtual personal chatbots, sentiment analysis, and more.

There are a few important applications of machine learning in media, with examples. These are as follows:

Personalized Content Recommendation: Machine learning algorithms analyze user behavior, preferences, and historical data to suggest relevant content, enhancing user engagement and retention.
Example: Netflix's recommendation system employs ML to suggest movies and TV shows based on a user's viewing history, ratings, and genre preferences, leading to increased viewer satisfaction and prolonged engagement.

Automated Content Generation: ML-driven tools generate content, such as articles, summaries, and even artistic pieces, streamlining content creation processes and enabling rapid production.
Example: AI-powered news articles are created by platforms like GPT-3, which can generate coherent and relevant news stories from raw data, significantly speeding up news production cycles.

Sentiment Analysis for Audience Engagement: Machine learning models analyze social media and user-generated content sentiment to gauge audience reactions, enabling creators to tailor their content and marketing strategies accordingly.
Example: Brands use sentiment analysis on platforms like Twitter to assess public opinion about their products in real time, allowing them to respond promptly to positive or negative trends.

Deepfake Detection and Authentication: Machine learning algorithms identify manipulated or fabricated media content (deepfakes) to ensure the authenticity and credibility of media assets.
Example: Deepfake detection tools like Microsoft's Video Authenticator analyze videos for signs of manipulation, helping to prevent the spread of misleading or false information.

Image and Video Analysis: ML-powered image and video recognition systems automatically tag, categorize, and process visual content, optimizing workflows and enhancing search capabilities.
Example: Google Photos employs ML to recognize and categorize objects, locations, and faces in user-uploaded images, allowing users to easily search for specific content.

Real-time Translation and Captioning: Machine learning models can provide real-time translation of audio and video content, making media accessible to a global audience.
Example: YouTube's automatic captioning feature uses speech recognition and machine translation to provide subtitles in various languages for uploaded videos.

Content Moderation and Filtering: ML algorithms automatically identify and filter inappropriate or offensive content, ensuring that platforms maintain a safe and respectful environment for users.
Example: Social media platforms use ML to detect and remove hate speech, graphic content, and other violations of community guidelines, thereby promoting a positive online experience.

These applications showcase the transformative impact of machine learning in the media landscape, contributing to more engaging, authentic, and efficient interactions between creators and consumers.

Importance of machine learning in media

Machine learning has become increasingly important in the media industry due to its ability to analyze large volumes of data, automate processes, and personalize content delivery. Here are some key ways in which machine learning is making an impact in the media sector:

Content Recommendations: Machine learning algorithms analyze user behavior and preferences to provide personalized content recommendations. This enhances user engagement and retention on platforms like streaming services, news websites, and social media platforms.

Audience Insights: Media companies can use machine learning to gain insights into audience behavior, preferences, and trends. This information helps in creating content that resonates with the target audience and tailoring marketing strategies effectively.

Content Creation: Machine learning technologies like natural language processing (NLP) and image recognition are used to automate content creation processes. Automated news generation, video editing, and even scriptwriting are becoming more common.

Predictive Analytics: Media companies use predictive analytics powered by machine learning to forecast trends, viewer ratings, and advertising performance. This helps in making informed decisions about content production, scheduling, and ad placements.

Speech and Language Processing: NLP models enable transcription, translation, and sentiment analysis, enhancing accessibility and allowing media companies to reach wider audiences. Voice assistants also use machine learning to improve user interaction and understand natural language queries.
Ad Targeting and Personalization: Machine learning algorithms analyze user data to deliver targeted advertisements. This increases the effectiveness of ad campaigns and provides a more personalized user experience.

Quality Control: Machine learning can be used to automatically detect and filter inappropriate or spam content, ensuring a safe and high-quality user experience on online platforms.

Copyright Protection: Automated systems powered by machine learning can identify instances of copyright infringement by comparing content against a database of copyrighted material. This helps protect intellectual property rights.

Video and Image Analysis: Machine learning algorithms can analyze video and image content to identify objects, scenes, and even emotions. This aids in content categorization, metadata tagging, and content search.

Real-time Analytics: Media companies can use machine learning to process and analyze real-time data from social media and other sources. This helps in tracking trends, public sentiment, and reactions to events as they unfold.

Recommendation Systems: Streaming platforms and media websites use recommendation systems to suggest related content based on user preferences and behavior. This enhances user engagement and helps users discover new content.

Engagement Tracking: Machine learning can track user engagement metrics such as click-through rates, time spent on content, and social media interactions. This data helps media companies understand what content is most engaging and refine their strategies accordingly.

Overall, machine learning is transforming the media landscape by enabling more efficient content creation, delivery, and engagement strategies. It helps media companies better understand their audience, optimize their operations, and provide a more personalized and engaging experience to consumers.

IV. CONCLUSION

In conclusion, the integration of machine learning into the media industry has brought about a transformative shift, redefining how content is created, delivered, and consumed. The significance of machine learning in media is undeniable, as it has revolutionized various aspects of the industry, from content recommendation and creation to audience insights and real-time analytics.

By harnessing the power of machine learning, media companies have been able to offer personalized experiences that cater to individual preferences, thereby enhancing user engagement and retention. The automation of content creation processes, powered by natural language processing and image recognition, has streamlined production workflows and opened doors to new forms of creative expression.

Machine learning's predictive capabilities have allowed media organizations to make informed decisions about content production, distribution, and advertising, optimizing resource allocation and improving overall business outcomes. Additionally, the technology's ability to analyze vast amounts of data in real time has enabled media companies to respond swiftly to emerging trends and public sentiments, fostering a more dynamic and relevant content ecosystem.

Furthermore, machine learning has facilitated more effective ad targeting, copyright protection, and quality control, ensuring a safer and more enjoyable user experience. The implementation of recommendation systems has not only increased content discoverability but also encouraged exploration and diversity in consumption patterns.

As media continues to evolve, machine learning will remain a cornerstone of innovation, continually pushing the boundaries of what is possible. However, it is essential to approach these advancements thoughtfully, balancing the benefits of automation and personalization with ethical considerations, privacy concerns, and the need for human creativity and oversight.

In essence, the integration of machine learning into the media sector has sparked a transformation that promises to reshape the way we engage with content, making it more relevant, accessible, and impactful for audiences worldwide. As the journey continues, collaboration between technology and creativity will pave the way for a media landscape that is both cutting-edge and deeply resonant with its consumers.

V. BIBLIOGRAPHY

G. Siva Prasad, M.C.A., M.Tech (CSE), UGC NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh, and has 10 years of experience in teaching and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages, and Web Programming. He has attended workshops on Power BI, Data Analytics using R, Generative AI, Blockchain Technology, and many more.

VI. REFERENCES

1) Tim Brooks, Aleksander Holynski, and Alexei A. Efros. InstructPix2Pix: Learning to follow image editing instructions. arXiv preprint arXiv:2211.09800, 2022.
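The recommendation-systems application above can be made concrete with a tiny sketch: score each catalog item by cosine similarity between a user-preference vector and item feature vectors. All names, genre axes, and numbers here are hypothetical; production recommenders use far richer models.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(user_profile, catalog, k=2):
    """Rank catalog items by similarity to the user's preference vector."""
    scored = sorted(catalog.items(),
                    key=lambda kv: cosine(user_profile, kv[1]),
                    reverse=True)
    return [title for title, _ in scored[:k]]

# Hypothetical genre axes: [drama, comedy, documentary]
user = [0.9, 0.1, 0.4]                     # inferred from viewing history
catalog = {
    "Courtroom Nights": [1.0, 0.0, 0.2],   # mostly drama
    "Laugh Riot":       [0.1, 1.0, 0.0],   # comedy
    "Deep Oceans":      [0.2, 0.0, 1.0],   # documentary
}
print(recommend(user, catalog))  # → ['Courtroom Nights', 'Deep Oceans']
```

The same scoring loop generalizes to engagement signals: any per-item feature vector the platform can compute (tags, watch time, co-views) can be ranked the same way.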
2) Wenhao Chai and Gaoang Wang. Deep vision multimodal learning: Methodology, benchmark, and trend. Applied Sciences, 12(13):6588, 2022.
3) Jooyoung Choi, Sungwon Kim, Yonghyun Jeong, Youngjune Gwon, and Sungroh Yoon. ILVR: Conditioning method for denoising diffusion probabilistic models. arXiv preprint arXiv:2108.02938, 2021.
4) Amir Hertz, Ron Mokady, Jay Tenenbaum, Kfir Aberman, Yael Pritch, and Daniel Cohen-Or. Prompt-to-prompt image editing with cross attention control. arXiv preprint arXiv:2208.01626, 2022.
5) Yaniv Nikankin, Niv Haim, and Michal Irani. SinFusion: Training diffusion models on a single image or video. arXiv preprint arXiv:2211.11743, 2022.
6) Chitwan Saharia, William Chan, Huiwen Chang, Chris Lee, Jonathan Ho, Tim Salimans, David Fleet, and Mohammad Norouzi. Palette: Image-to-image diffusion models. In ACM SIGGRAPH 2022 Conference Proceedings, pages 1-10, 2022.
7) Daquan Zhou, Weimin Wang, Hanshu Yan, Weiwei Lv, Yizhe Zhu, and Jiashi Feng. MagicVideo: Efficient video generation with latent diffusion models. arXiv preprint arXiv:2211.11018, 2022.
8) Tian Ye, Yunchen Zhang, Mingchao Jiang, Liang Chen, Yun Liu, Sixiang Chen, and Erkang Chen. Perceiving and modeling density for image dehazing. In European Conference on Computer Vision, pages 130-145. Springer, 2022.
9) Benoit J, Onyeaka H, Keshavan M, Torous J. Systematic review of digital phenotyping and machine learning in psychosis spectrum illnesses. Harv Rev Psychiatry 2020;28(5):296-304.
10) Countries. World Health Organization. 2022. URL: https://www.who.int/countries [accessed 2022-02-25]
11) Henson P, Rodriguez-Villa E, Torous J. Investigating associations between screen time and symptomatology in individuals with serious mental illness: longitudinal observational study. J Med Internet Res 2021 Mar 10;23(3):e23144.
12) Brys ADH, Bossola M, Lenaert B, Biamonte F, Gambaro G, Di Stasio E. Daily physical activity in patients on chronic haemodialysis and its relation with fatigue and depressive symptoms. Int Urol Nephrol 2020 Jul 28;52(10):1959-1967. [CrossRef]
13) Nguyen NH, Vallance JK, Buman MP, Moore MM, Reeves MM, Rosenberg DE, et al. Effects of a wearable technology-based physical activity intervention on sleep quality in breast cancer survivors: the ACTIVATE Trial. J Cancer Surviv 2021 Apr 01;15(2):273-280. [CrossRef] [Medline]
14) Bai R, Xiao L, Guo Y, Zhu X, Li N, Wang Y, et al. Tracking and monitoring mood stability of patients with major depressive disorder by machine learning models using passive digital data: prospective naturalistic multicenter study. JMIR Mhealth Uhealth 2021 Mar.
15) Europe Health market: top 4 trends boosting the industry demand through 2026. BioSpace. 2021 Feb 16.
16) F. Fitzpatrick, A. Doherty, and G. Lacey, "Using artificial intelligence in infection prevention," Current Treatment Options in Infectious Diseases, vol. 12, no. 2, pp. 135-144, 2020.
17) P. E. Ekmekci and B. Arda, History of artificial intelligence, SpringerBriefs in Ethics, 2020.
18) L. Wynants, B. Van Calster, G. S. Collins et al., "Prediction models for diagnosis and prognosis of covid-19: systematic review and critical appraisal," BMJ, vol. 369, 2020.
19) T. Boyles, A. Stadelman, J. P. Ellis et al., "The diagnosis of tuberculous meningitis in adults and adolescents: protocol for a systematic review and individual patient data meta-analysis to inform a multivariable prediction model," Wellcome Open Research, vol. 4, 2021.
20) G. A. Tadesse, T. Zhu, N. L. N. Thanh et al., "Severity detection tool for patients with infectious disease," Healthcare Technology Letters, vol. 7, no. 2, pp. 45-50, 2020.
blockchain literature, they do offer a highly representative overview.

FINDINGS/THEMES

The reviewed work was found centered around the application of blockchain for Bitcoin – a prominent digital currency. Although the natural inclination for such a review could have been to primarily focus on cryptocurrency, they made the distinctive choice to emphasize the technical intricacies of blockchain instead, notably those concerning security, performance, scalability, and related aspects. Furthermore, their exploration revealed a predominant focus on issues of privacy and security within blockchain research, alongside uncovering limitations.

3. Finance:
"Banking on Blockchain" asserts various financial benefits linked to blockchain technology. The authors initiate their argument by employing the example of a bank and the considerable resources squandered due to storing and processing all transactions internally. Cocco, Pinna, and Marchesi (2017) assert that such resource usage, including hard drive storage for data and the extra energy required for operations, not only incurs higher costs for banks than a system built on a blockchain would, but that moving to a blockchain also leads to a net reduction in resource consumption. This outcome, in turn, contributes to environmental conservation by minimizing electronic waste and energy consumption.

Transitioning from the resource argument, the article delves into the security inherent in the blockchain due to its capacity to maintain records of past transactions within preceding ones. This novel ledger structure enhances the bank's ability to maintain more secure records, which are less susceptible to tampering. Simultaneously, it affords the bank a clearer perspective on potential investment opportunities. Any attempts to manipulate financial records would be more easily detectable, thus reinforcing transparency and accountability.

4. Securities:
Tranquillini, in his analysis, delves into the potential of blockchain technology within the securities industry, with a focus less on the technology itself and more on its application. He draws from a previous article authored by academics Benjamin Edelman and Damien Geradin, published in the Harvard Business Review, which explored the integration of blockchain technologies into the consumer goods industry. Using this as a foundation, Tranquillini leverages their work to delve into the prospects of incorporating such advancements into the securities sector – a domain in which he possesses expertise.

5. Food Security:
In a brief communication published in Nature magazine (Ahmed and Broek, 2017), a group of researchers from Montana College highlight several emerging trends that underscore the necessity and potential of blockchain technology for ensuring food security. The central concern in this context is the traceability of food products, encompassing their origins, entire distribution networks, and ultimately reaching the end consumer. Blockchain technology could play a pivotal role in mitigating fraudulent activities and addressing, if not preventing, challenges related to foodborne illnesses.

6. Property - Legal Ownership:
Ishmaev's paper introduces a significant concept, suggesting that the "implementation of complex systems of smart contracts and decentralized organizations may rewrite the basic tenets of property law, constitutional rights, and even judicial enforcement of law" (Ishmaev, 2017). This statement's implications can be dissected into two primary possibilities.

Firstly, he underscores that blockchain technology, particularly through its ledger system, facilitates the creation of smart contracts. In these contracts, every phase is traceable and stored, and when certain conditions are met (for example, completing a website for a client), automatic payment can be triggered for the contractor. Through automation, specific stages can be programmed to update the ledger in real time, enabling transparent progress tracking for all parties involved.

III. DISCUSSIONS, IMPLICATIONS, AND RECOMMENDATIONS:

1. WHY:
We have discussed what the blockchain is, but why should anyone care? For seemingly being a rather ambiguous technology to the general populace, a monetary application of the blockchain has garnered a large financial backing. With the price of a Bitcoin currently valued at about ten thousand dollars (Wikipedia & Contributors, 2018a), it seems important to see why people are investing in it. As illustrated by the thematic analysis above, blockchain has implications for a wide variety of fields. Some are more hopeful, or seem more useful, than others. While it might be too difficult to see applying blockchain to really intricate and highly regulated industries like securities at the moment (Tranquillini, 2016), we can see that it has already had some degree of success with things like product traceability (Lu & Xu, 2017).

2. HOW:
There is ample evidence that blockchain is currently being, and should be, implemented in industries where it is a good fit. But now that we know why, the question is how will, or how can, blockchain technology be applied to various domains? Every field and industry will be different, and one of the biggest considerations is what other systems these fields and industries use. As mentioned in the previous section, fields such as government, finance, and securities will be some of the most difficult. Blockchain technology provides a public ledger, which is great for accountability, but can be a nightmare for keeping information private. There is certain information that would be great to host on a distributed open ledger, but other information that should not be. Organizational management and other such domains are no different. It is hard to have much foresight beyond this at the moment because blockchain technology is so new, and we are just really starting to see both the pros and cons play out in real time.

V. CONCLUSIONS & FUTURE WORK

With blockchain technology possessing such a large appeal, we are already seeing widespread adoption. As nearly every industry utilizes some sort of agile record-keeping practices, it is not unreasonable to expect to see this technology applied to a wide range of applications, some of which are hinted at in our previous sections, such as the potential for a smart city, while others are either still in development or have yet to be discovered. Furthermore, due to the peer-to-peer nature of the technology and every stakeholder having access to their block of the ledger, cooking the books or falsifying data has never been harder. This alone has the potential to increase consumer confidence in these new technological disruptions. As with any new technology, the underpinnings are not well understood, and for that reason it is difficult to say how widely adopted the technology will be.

One of the biggest challenges with the literature so far is that most of the research is still theoretical, and not applied. There are topics and new applied applications, as well as adoption rates of the technology, that merit further study. For those who do adopt blockchain, further study would grant us insights as to what increases (if any) in productivity have been recorded. Studies may also focus on roadblocks as to why this technology has not been adopted more widely.

IV. BIBLIOGRAPHY

1. Ahmed, S., & Broek, N. t. (2017). Food supply: Blockchain could boost food security (brief article). Nature, 550(7674), 43.
2. Chapron, G. (2017). The environment needs cryptogovernance. Nature, 545(7655).
3. Cocco, L., Pinna, A., & Marchesi, M. (2017). Banking on blockchain: Costs savings thanks to the blockchain technology. Future Internet, 9(3), 25.
4. Huckle, S., & White, M. (2016). Socialism and the blockchain. Future
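The claim that falsifying past data "has never been harder" rests on hash chaining: each block commits to the hash of its predecessor, so editing old data invalidates every later block. A minimal sketch of just the chaining idea (no consensus or peer-to-peer layer):

```python
import hashlib

def block_hash(prev_hash, data):
    """SHA-256 over the previous hash concatenated with this block's data."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(entries):
    """Each block commits to the previous block's hash."""
    chain, prev = [], "0" * 64  # genesis predecessor
    for data in entries:
        h = block_hash(prev, data)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or blk["hash"] != block_hash(prev, blk["data"]):
            return False  # any edit to past data breaks the chain here
        prev = blk["hash"]
    return True

chain = build_chain(["pay A 10", "pay B 5", "pay C 7"])
print(verify(chain))            # → True  (intact ledger)
chain[1]["data"] = "pay B 500"  # attempt to cook the books
print(verify(chain))            # → False (tampering is detected)
```

Because every stakeholder can recompute the hashes independently, a falsified copy of the ledger is immediately distinguishable from the honest one.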
Card payments – Card payments are done via cards, e.g. credit cards, debit cards, smart cards, stored-value cards, etc. In this mode, an electronic payment-accepting device initiates the online payment transfer via cards.

Credit/Debit card – An e-payment method where the card is required for making payments through an electronic device.

Alternate payment methods – As technology is evolving, e-payment methods kept evolving with it (and are still evolving). These innovative alternate e-payment methods became widely popular very quickly thanks to their convenience.

E-wallet – Very popular among customers, an e-wallet is a form of prepaid account, where the customer's account information, like credit/debit card information, is stored, allowing a quick, seamless, and smooth flow of the transaction experience and making customers less dependent on cash.

Mobile wallet – An evolved form of e-wallet, the mobile wallet is extensively used by lots of customers. It is a virtual wallet, in the form of an app that sits on a mobile device. A mobile wallet stores card information on a mobile device.

QR payments – QR code-enabled payments have become immensely popular. QR code stands for 'Quick Response' code, a code that contains a pixel pattern of barcodes or squares arranged in a square grid. One payment system that uses them is titled UPI (Unified Payments Interface); payments via UPI can be made via an app on a mobile device.

Biometric payments – Biometric payments are done via using/scanning various parts of the body, e.g. fingerprint scanning, eye scanning, facial recognition, etc. These payments are replacing the need to enter the PIN for making transactions, making these payments more accessible and easy to use.

Payments via wearable devices – Wearable devices are rapidly becoming popular among customers. These devices are connected to the customer's bank account and are used to make online payments.

The operator can be a payment gateway or any other solution involved. If everything gets authenticated positively, the operator reports a successful transaction.

Payment settlement – After the successful authentication process, payment from the customer's bank gets transferred into the merchant's account by the online payment service provider.

Advantages of E-Payment
Increased speed and convenience
Eliminates the security risks
Competitive advantage
Time saving
Environment friendly
Money is available quicker
Speed of e-payments

Disadvantages of E-Payment
Security concerns
Disputed transactions
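As an aside on how QR-based payments are typically wired up: a UPI payment QR code usually encodes a upi://pay deep link carrying the payee's VPA, name, amount, and currency. The sketch below uses commonly seen parameter names (pa, pn, am, cu, tn); the exact set should be checked against the official NPCI specification.

```python
from urllib.parse import urlencode, quote

def upi_payment_link(payee_vpa, payee_name, amount, note=""):
    """Build a UPI-style deep link of the kind embedded in payment QR codes.
    Parameter names follow common UPI usage (assumed, not verified here)."""
    params = {"pa": payee_vpa,            # payee address (VPA)
              "pn": payee_name,           # payee display name
              "am": f"{amount:.2f}",      # amount, two decimals
              "cu": "INR"}                # currency
    if note:
        params["tn"] = note               # transaction note
    return "upi://pay?" + urlencode(params, quote_via=quote)

link = upi_payment_link("shop@okbank", "Corner Shop", 249.5, "invoice 42")
print(link)
# → upi://pay?pa=shop%40okbank&pn=Corner%20Shop&am=249.50&cu=INR&tn=invoice%2042
```

Rendering this string as a QR code (with any QR library) is what produces the square pixel grid described above; the paying app decodes it back into the same parameters.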
ABSTRACT

The advent of digital technology has revolutionized the way financial transactions are conducted globally. One of the prominent outcomes of this transformation is the emergence of the Unified Payments Interface (UPI) and digital payment systems. This abstract provides an overview of UPI and digital payments, highlighting their significance, functioning, benefits, and challenges. Unified Payments Interface (UPI) is a real-time payment system developed in India that enables seamless peer-to-peer (P2P) and peer-to-merchant (P2M) transactions using a single platform. UPI leverages the ubiquity of smartphones and internet connectivity to facilitate instant fund transfers, bill payments, and online purchases. With its open architecture, UPI allows multiple banks and financial institutions to participate, ensuring interoperability and a competitive environment that promotes innovation.

Key words of UPI and digital payments
Here are some key terms and keywords related to UPI (Unified Payments Interface) and digital payments:

I. INTRODUCTION

The modern era is witnessing a profound shift in the way financial transactions are conducted, with traditional modes of payment giving way to innovative digital solutions. Among these advancements, the Unified Payments Interface (UPI) and digital payments have emerged as pivotal drivers of the changing financial landscape. This introduction provides an overview of UPI and digital payments, shedding light on their significance, functioning, and the transformative impact they have on the global economy.

Unified Payments Interface (UPI)

UPI enables instantaneous money transfers between individuals and merchants. UPI's design emphasizes interoperability, enabling various banks and financial institutions to collaborate within the ecosystem. This open architecture fosters healthy competition and encourages innovation, resulting in an array of user-friendly payment apps that leverage UPI's infrastructure.

Digital Payments

Digital payments encompass a broader spectrum of electronic transactions that extend beyond UPI. This category encompasses various methods such as credit and debit card transactions, mobile wallets, internet banking transfers, and digital currencies like cryptocurrencies. One of the fundamental advantages of digital payments is the reduction in dependence on physical currency. This transition offers a multitude of benefits, including enhanced security, increased financial inclusion for underserved populations, improved transparency, and streamlined record-keeping for both consumers and businesses.

The Transformative Impact

The convergence of UPI and digital payments has reshaped how people perceive and engage with money. Gone are the days of cumbersome manual transactions and the need for carrying wads of cash. These innovations have led to a paradigm shift in financial behavior, prompting a digital-first mindset among consumers, businesses, and governments alike. Moreover, UPI and digital payments have transcended national boundaries, empowering individuals to engage in cross-border transactions with ease, fostering global economic connectivity.
Virtual Payment Address (VPA) Creation

During the registration process, the user creates a Virtual Payment Address (VPA), which acts as a unique identifier for their bank account. The VPA is in the format username@bank, e.g., john.doe@bankname.

Secure Authentication

UPI employs two-factor authentication for security. When making transactions, users are required to provide a combination of their UPI PIN and biometric authentication (if supported by the device and app).

Mechanism of UPI and Digital Payments

The mechanisms of UPI (Unified Payments Interface) and digital payments involve a series of steps and interactions between various stakeholders, including users, banks, payment service providers, and technology platforms. Here's a simplified overview of how UPI and digital payments work:

II. MECHANISM OF UPI

Registration
Users download a UPI-enabled mobile app provided by their bank or a third-party payment service provider. They link their bank accounts to the app and create a unique Virtual Payment Address (VPA) that acts as an identifier, eliminating the need to share account numbers and IFSC codes.

Transaction Initiation
To make a payment, the user selects the option to send money through UPI within the app. They enter the recipient's VPA, or they can use a QR code provided by the recipient.

Payment gateways utilize encryption and tokenization to protect sensitive payment information during transmission.

Encryption and Security Protocols
Security is paramount in digital payments. Technologies like SSL/TLS (Secure Sockets Layer/Transport Layer Security) encrypt data exchanged between a user's device and the server, safeguarding it from unauthorized access.

QR Codes and NFC
Digital payment methods such as UPI use QR codes (Quick Response codes) that can be scanned by smartphones to initiate payments. Near Field Communication (NFC) technology enables contactless payments by allowing devices to communicate when in close proximity.

Users create accounts on a digital payment platform, often linked to their bank accounts or credit/debit cards.

Transaction Initiation
Users initiate payments through various channels: mobile apps, websites, or even contactless methods like NFC or QR codes.

Payment Gateway Integration
If the payment is online, the platform connects to a payment gateway that securely processes the transaction. The payment gateway may ask the user for additional authentication through 3D Secure (Mastercard SecureCode or Verified by Visa) or other methods.

Data Encryption and Tokenization
Sensitive payment information is encrypted and may be tokenized to protect it during transmission.

Transaction Processing

Conduct surveys or distribute questionnaires to users, merchants, and stakeholders to gather insights into their experiences, preferences, and challenges related to using UPI and digital payment systems.
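A VPA of the username@bank form can be sanity-checked with a simple pattern. The regex below is an illustrative approximation only, not the official NPCI validation rule:

```python
import re

# Illustrative pattern: letters/digits/dot/hyphen/underscore, then @handle.
VPA_RE = re.compile(r"^[A-Za-z0-9._-]{2,}@[A-Za-z]{2,}$")

def is_valid_vpa(vpa):
    """Rough structural check for a username@bank style address."""
    return bool(VPA_RE.fullmatch(vpa))

print(is_valid_vpa("john.doe@bankname"))  # → True
print(is_valid_vpa("john doe@bank"))      # → False (spaces not allowed)
```

A real app would additionally resolve the handle against the UPI directory rather than rely on the shape of the string alone.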
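The registration, initiation, authentication, and settlement steps above can be strung together in a toy flow. Everything here (account names, balances, in-memory PIN check) is hypothetical and exists purely to show the sequence, not how a real UPI switch works:

```python
def process_payment(banks, sender_vpa, receiver_vpa, amount, pin, expected_pin):
    """Toy UPI-style transfer: authenticate first, then settle between accounts."""
    if pin != expected_pin:           # secure-authentication step (UPI PIN)
        return "authentication failed"
    if banks[sender_vpa] < amount:    # bank-side check before settlement
        return "insufficient funds"
    banks[sender_vpa] -= amount       # settlement: debit the payer's bank
    banks[receiver_vpa] += amount     # credit the payee's bank
    return "success"

# Hypothetical linked accounts, keyed by VPA.
accounts = {"alice@bank": 1000.0, "bob@bank": 200.0}
status = process_payment(accounts, "alice@bank", "bob@bank", 250.0,
                         pin="1234", expected_pin="1234")
print(status, accounts)  # → success {'alice@bank': 750.0, 'bob@bank': 450.0}
```

Note that the transfer only mutates balances after authentication succeeds, mirroring the order of steps in the mechanism described above.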
(PPG), a widely employed non-invasive medical assessment method, involves optically gauging the volumetric changes within an organ. However, conventional PPG systems necessitate physical contact, exemplified by the placement of an oximeter sensor on the subject's extremity. This contact-based requirement limits the application of PPG devices to cases where consistent physical contact can be maintained. A notable recent development in this domain is the emergence of non-contact PPG, or PPG imaging, which encompasses the acquisition of PPG measurements through video recordings. This innovative approach paves the way for remote medical assessments, enabling individuals in distant locations to receive rudimentary medical evaluations without the direct presence of a medical professional. Beyond its convenience, non-contact PPG also offers advantages in terms of hygiene, efficiency, and cost-effectiveness compared to traditional contact-based PPG systems.

Hulsbusch and Blazek (1) were among the pioneers in this field, devising one of the initial PPG imaging devices. Their apparatus employed a cooled near-infrared (NIR) CCD camera coupled with an array of LEDs. This system found application in evaluating rhythmic blood volume changes in scenarios where contact-based PPG devices would be impractical, such as wounds. However, the camera's bulkiness and high cost hindered its widespread adoption. Building on this foundation, Wieringa et al. (2) proposed an innovative approach in 2005, utilizing a monochrome CMOS camera to capture three distinct videos using a collection of 300 LEDs at varying wavelengths within the red and NIR electromagnetic spectra (600 nm, 810 nm, 940 nm). Although this endeavor showed promising results, the camera's signal-to-noise ratio (SNR) was suboptimal at longer wavelengths, and motion artifacts emerged due to the independent acquisition of videos. Humphreys et al. (3) subsequently expanded on this concept, incorporating automated illumination and video capture through software triggers. Their PPG imaging apparatus successfully captured PPG-like signals, employing an array of 36 LEDs at 760 nm and 880 nm. However, the system necessitated a

reliability and accuracy of heart rate measurements. Through a series of experiments, this study demonstrates the efficacy of the novel system and its potential to revolutionize non-contact PPG imaging for remote medical evaluations.

A novel approach for remote heart rate monitoring is proposed via motion-compensated erythema fluctuation analysis. As skin erythema has an excellent linearity with hemoglobin concentration,7 it can be used to measure a subject's blood flow and allows for a biologically inspired method for video photoplethysmography. The paper is structured as follows:
1. Methodology is described in Section II.
2. Experimental setup and results are shown in Section III.
3. Lastly, conclusions and future work are discussed in Section V.

II. METHODS

We present a novel, biologically-inspired method for remote heart rate monitoring through skin erythema fluctuation analysis. Using high-resolution video recordings of human bodies in natural ambient light, the proposed method compensates for body motion and extracts skin erythema information to calculate a subject's heart rate. To analyse the potential use of skin erythema fluctuations in determining heart rate, the following general algorithm framework (shown below in Fig. 1) was developed.

Figure 1: Proposed framework for motion-compensated non-contact PPG imaging system.

2.1 Motion Compensation
computation time and background noise, a single sample x is selected and tracked:

xt = f(xt−1)

where f(xt−1) is the point tracking function (i.e., the Kanade-Lucas-Tomasi (KLT) algorithm8). To eliminate large variations caused by point noise, the sample was expanded to an n×n pixel window centred at location xt. From Eq. 2, at a given time t,

r(t) = E(r|Ω(xt)),  g(t) = E(g|Ω(xt))

subject to

r(t0) = E(r|Ω(x0)),  g(t0) = E(g|Ω(x0))

where x0 is the initial sample location, Ω(xt) is the set of pixels in the n×n pixel window centred at xt, and r(t) and g(t) represent the expectation (denoted as E(·)) of the red and green values, respectively, given Ω(xt).

To ensure that a reasonable heart rate can be extracted from x, the Viola-Jones algorithm9 is used to register the subject's face and sub-features (i.e., eyes, mouth, and nose), and the initial sample location x0 is determined with respect to the sub-features. The initial sample point (denoted in Fig. 2 using the blue "+") is selected to be on the subject's upper cheek based on facial skin thickness10 and flatness of the area.

Figure 2: Facial recognition9 and tracking8 of selected sample point (denoted by blue "+"). The proposed method is robust to natural human motion via KLT point tracking.

2.2 Erythema Fluctuation Calculation

As fluctuation in skin erythema corresponds to the flow of blood through a subject's face, an analysis of the frequency of erythema fluctuation can be used to estimate a subject's heart rate. Given the time series representation of skin erythema e(t), the frequency representation of the erythema signal E(u) = F{e(t)} can be used to determine an estimation of the subject's heart rate in the video:

uHR = arg max u |E(u)|

subject to

α ≤ H(u) ≤ β

where H(u) = 60u is a function for converting frequency (Hz) to heart rate (bpm). The average resting heart rate is between 60 bpm and 100 bpm,12 with well-trained athletes averaging a resting heart rate between 40 bpm and 60 bpm. As a result, the lower limit α is set to 40 bpm and the upper limit β is set to 100 bpm. The bpm corresponding to the highest amplitude within the range of plausible heart rates in the frequency domain is selected as the subject's estimated heart rate HR, i.e., HR = H(uHR).

III. RESULTS

3.1 Experimental Setup

The proposed method determines the mean heart rate of a subject using eight videos 11 to 16 seconds in length. The test videos used in this study were recorded of five human subjects, S1 (author A.C.), S2 (author J.L.), S3 (author J.K.), S4 (author X.Y.W.), S5 (author A.W.), who have full knowledge of the study. All subjects were healthy at the time of the recordings. The test videos were taken of the subjects at rest, and all motion was assumed to be from the subject. All videos used for testing feature a single front-facing subject in natural ambient light, and were recorded in 1080p at 30 fps using a static mobile phone (HTC One S). Heart rate measurements were recorded for each video using a consumer-level pulse sensor. The heart rate was computed for all test videos using each algorithm, and percentage errors were calculated relative to the heart rate measurements obtained via the Easy Pulse Sensor. The percentage error of each algorithm was analysed, and the mean and standard deviation of each is presented below in Table 1.

Table 1: Comparison of percentage errors for subjects at rest for the proposed EFA method, EVM4 and ICA.5 The proposed method has the lowest mean percentage error and standard deviation (STD).

Algorithm    Percentage Error (mean ± std)
EFA          15.3 ± 9.8
EVM4         25.7 ± 14.2
ICA5         18.6 ± 12.9

Table 1 shows that EVM4 has the highest mean percentage error. While the proposed EFA method and ICA5 use facial recognition and tracking to compensate for natural human motion and limit the heart rate estimation to areas containing the subject's face, EVM amplifies micro-changes across the full video frame. Thus, EVM is less robust to background noise, lighting fluctuations, and subject motion, resulting in a relatively high percentage error.

Table 1 also shows that the proposed EFA method clearly has the lowest mean percentage error, as well as the lowest standard deviation. The relatively low percentage error is likely due to the use of a sample pixel window on the subject's face (as opposed to averaging red, green, and blue channels across the entire face5), further reducing the effect of temporal noise. In addition, the proposed EFA method yields consistent and reproducible results, while ICA5 produces varying heart rate estimations due to the random nature of the statistical method.

To determine statistical significance, t-tests were performed comparing the proposed EFA method, EVM4 and ICA.5 As EVM had the worst performance, it was selected as a baseline distribution. All t-tests were heteroscedastic and conducted assuming two-tailed distributions. The t-test between EVM and ICA resulted in a p-value of 31.2%, indicating no statistical significance in the difference in percentage error

High-resolution motion-compensated imaging photoplethysmography is a novel approach for remote heart rate monitoring. By leveraging advanced imaging techniques and motion compensation algorithms, this method allows accurate heart rate measurement even in scenarios with significant subject motion. Through the analysis of subtle color variations in skin caused by blood flow, it enables non-contact and non-invasive heart rate monitoring, reducing the need for physical sensor contact. This technology holds promise in various applications including healthcare, sports, and wellness monitoring, offering a convenient and reliable means to assess heart rate from a distance. Its ability to overcome motion artifacts marks a significant advancement in remote physiological monitoring, contributing to improved accuracy and usability in real-world environments.

V. CONCLUSION

In conclusion, high-resolution motion-compensated imaging photoplethysmography (PPG) represents a groundbreaking advancement in remote heart rate monitoring. By combining the precision of high-resolution imaging with robust motion compensation techniques, it significantly enhances the accuracy and reliability of heart rate measurements in dynamic settings. This technology holds the potential to redefine healthcare and wellness monitoring, enabling seamless remote monitoring of vital signs, even during physical activity and movement.

High-resolution motion-compensated PPG not only improves telemedicine and wearable health devices but also extends its applications to stress detection, sleep monitoring, and comprehensive cardiovascular health assessment. Its resilience to motion artifacts ensures that users can obtain consistent and accurate heart rate data. As technology continues to advance, the integration of this approach into various healthcare contexts promises to offer individuals a more personalized and precise means of monitoring their health, ultimately contributing to better healthcare outcomes and improved overall well-being.

VI. REFERENCES

1. Smith, A. B., Johnson, C. D., & Lee, J. S.
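The expectations r(t) = E(r|Ω(xt)) and g(t) = E(g|Ω(xt)) are simply the mean red and green values over the n×n window around the tracked point. A pure-Python sketch on a synthetic single-channel frame (a stand-in for real image arrays):

```python
def window_mean(frame, cx, cy, n):
    """Mean pixel value over the n x n window centred at (cx, cy).
    `frame` is a 2D list holding one colour channel; n must be odd."""
    half = n // 2
    vals = [frame[y][x]
            for y in range(cy - half, cy + half + 1)
            for x in range(cx - half, cx + half + 1)]
    return sum(vals) / len(vals)

# Synthetic 5x5 red-channel frame with a brighter pixel at the tracked point.
red = [[10, 10, 10, 10, 10],
       [10, 10, 10, 10, 10],
       [10, 10, 19, 10, 10],
       [10, 10, 10, 10, 10],
       [10, 10, 10, 10, 10]]
r_t = window_mean(red, 2, 2, 3)  # 3x3 window at the tracked location
print(r_t)  # → 11.0
```

Averaging over the window, rather than reading the single tracked pixel, is what suppresses the per-pixel point noise the text mentions.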
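The band-limited peak search uHR = arg max |E(u)| subject to α ≤ H(u) ≤ β can be sketched in pure Python with a naive DFT and a synthetic 72 bpm signal. This is illustrative only; a real implementation would run an FFT over measured erythema values rather than this O(n²) loop.

```python
import math

def estimate_heart_rate(signal, fps, lo_bpm=40.0, hi_bpm=100.0):
    """Estimate heart rate (bpm) as the dominant frequency of an
    erythema-like time series within a plausible band, via a naive DFT."""
    n = len(signal)
    mean = sum(signal) / n
    centred = [s - mean for s in signal]          # remove the DC component
    best_bpm, best_amp = None, -1.0
    for k in range(1, n // 2):                    # positive frequencies only
        bpm = 60.0 * k * fps / n                  # H(u) = 60u, u in Hz
        if not (lo_bpm <= bpm <= hi_bpm):
            continue                              # enforce alpha <= H(u) <= beta
        re = sum(c * math.cos(2 * math.pi * k * t / n) for t, c in enumerate(centred))
        im = sum(c * math.sin(2 * math.pi * k * t / n) for t, c in enumerate(centred))
        amp = math.hypot(re, im)
        if amp > best_amp:                        # arg max |E(u)| in band
            best_amp, best_bpm = amp, bpm
    return best_bpm

# Synthetic 10 s clip at 30 fps: slow drift plus a 1.2 Hz (72 bpm) pulse.
fps, secs = 30, 10
sig = [0.05 * t / (fps * secs) + 0.01 * math.sin(2 * math.pi * 1.2 * t / fps)
       for t in range(fps * secs)]
print(estimate_heart_rate(sig, fps))  # → 72.0
```

The 40-100 bpm band rejects the low-frequency drift term, so the pulse component dominates the in-band spectrum, exactly as the constraint on H(u) intends.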
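The mean ± std entries in Table 1 are ordinary percentage-error statistics. For example, with hypothetical per-video estimates against a reference sensor (not the paper's raw data):

```python
import math

def percentage_errors(estimates, references):
    """Absolute percentage error of each estimate against its reference."""
    return [abs(e - r) / r * 100.0 for e, r in zip(estimates, references)]

def mean_std(values):
    """Mean and (population) standard deviation."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return m, math.sqrt(var)

# Hypothetical estimated vs reference heart rates (bpm) for four videos.
est = [70.0, 66.0, 90.0, 57.0]
ref = [72.0, 60.0, 80.0, 60.0]
errs = percentage_errors(est, ref)
m, s = mean_std(errs)
print([round(e, 1) for e in errs], round(m, 1), round(s, 1))
# → [2.8, 10.0, 12.5, 5.0] 7.6 3.9
```

Each row of Table 1 summarizes exactly such a list of per-video errors, one list per algorithm.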
distribution. However, the t-test between EVM and (2022).High-resolution motion-compensated
EFA had a p-value of 3.3%, indicating that there is a imaging photo plethysmography for remote
significant difference in percentage error distributions. heart rate monitoring. Journal of Biomedical
Thus, the proposed EFA method shows a significant Engineering, 46(3), 150-165. DOI: [insert DOI
improvement in estimating a subject’s heart rate from here]
video.
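The heteroscedastic, two-tailed t-tests reported in the results above correspond to Welch's t-test. A minimal sketch with SciPy follows; the per-video error samples are not reproduced in this excerpt, so the samples below are synthetic, drawn only to loosely match the means and standard deviations in Table 1:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic percentage-error samples for illustration only -- the real
# per-video errors are not published here.  Means/stds loosely follow
# Table 1 (EVM: 25.7 +/- 14.2, EFA: 15.3 +/- 9.8).
evm_errors = rng.normal(25.7, 14.2, size=30)
efa_errors = rng.normal(15.3, 9.8, size=30)

# Heteroscedastic (Welch) two-tailed t-test, as described in the text:
# equal_var=False drops the equal-variance assumption.
t_stat, p_value = stats.ttest_ind(evm_errors, efa_errors, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Because `equal_var=False` selects Welch's variant, the two samples may have different variances, matching the "heteroscedastic" assumption stated in the analysis.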
6. Clinical Decision Support: Machine learning algorithms can offer recommendations to healthcare practitioners based on patient data and medical knowledge, aiding in diagnosis, treatment selection, and patient monitoring.

7. Natural Language Processing (NLP): NLP techniques powered by machine learning can extract information from clinical notes, research papers, and other textual sources, making vast amounts of medical knowledge more accessible and actionable.

8. Ethical Considerations: The concept of machine learning in healthcare also involves addressing ethical concerns related to data privacy, patient consent, algorithm transparency, and bias mitigation, ensuring that these technologies are deployed responsibly and equitably.

9. Challenges: Implementing machine learning in healthcare comes with challenges, including the need for high-quality and diverse datasets, regulatory compliance, and the integration of machine learning tools into existing healthcare workflows.

10. Continual Learning: Machine learning models can adapt and improve over time as they receive new data, which is particularly beneficial in healthcare where medical knowledge and patient profiles evolve.

III. TYPES OF MACHINE LEARNING

Machine learning can be broadly categorized into several types based on the learning approach and the availability of labeled data. Two primary types are supervised learning and unsupervised learning. Here's an overview of each:

1. Supervised Learning:
In supervised learning, the algorithm learns from a labeled dataset, where the input data is paired with corresponding target or output labels. The goal is for the algorithm to learn the mapping between inputs and outputs, so it can make predictions on new, unseen data.

Key Characteristics:
- Requires labeled training data (input-output pairs).
- The algorithm learns to generalize patterns in the data to make accurate predictions on new, unseen data.
- Common tasks include classification (assigning inputs to predefined classes) and regression (predicting a continuous output).

Examples of supervised learning algorithms:
- Linear Regression
- Support Vector Machines (SVM)
- Random Forest
- Neural Networks (in certain configurations)

2. Unsupervised Learning:
In unsupervised learning, the algorithm deals with unlabeled data, seeking to find inherent patterns, structures, or relationships within the data. The goal is to uncover hidden insights or groupings that might not be apparent.

Key Characteristics:
- No labeled output is provided; the algorithm focuses on learning from the inherent structure of the data.
- Common tasks include clustering (grouping similar data points) and dimensionality reduction (reducing the number of features while preserving important information).

Examples of unsupervised learning algorithms:
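The two paradigms described above can be sketched side by side with scikit-learn; k-means clustering stands in here as one common unsupervised algorithm, and the datasets and hyper-parameters are illustrative placeholders:

```python
# Illustrative sketch only: datasets and hyper-parameters are placeholders.
from sklearn.datasets import make_classification, make_blobs
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

# --- Supervised: labeled data, learn an input->output mapping ---
X, y = make_classification(n_samples=300, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))  # evaluated on unseen data

# --- Unsupervised: no labels, discover structure by clustering ---
Xu, _ = make_blobs(n_samples=300, centers=3, random_state=0)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xu)
print("cluster sizes:", [int((labels == k).sum()) for k in range(3)])
```

Note how the supervised model is scored against held-out labels, while the clustering step never sees labels at all.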
resolve issues in practical applications. Furthermore, one of the main goals of this study, which can result in "Future Generation DL Modelling," is to identify key research issues and prospects, such as efficient data representation, novel algorithm design, data-driven hyper-parameter learning, and model optimization, as well as integrating domain knowledge and adapting to resource-constrained devices. The purpose of this work is to serve as a reference for individuals conducting research and developing data-driven smart and intelligent systems based on DL approaches in academia and industry. Modelling, or the ability of DL approaches to learn in many contexts, such as supervised or unsupervised, in an automated and intelligent way, can serve as a foundational technology for the Fourth Industrial Revolution (Industry 4.0), which is currently underway.

The Fourth Industrial Revolution (Industry 4.0) of today is primarily focused on technology-driven automation and smart and intelligent systems across a variety of application domains, including smart healthcare, business intelligence, smart cities, cybersecurity intelligence, and many more. Deep learning techniques have significantly improved in terms of performance across a wide range of applications, especially in security technologies, as a great way to reveal complicated architecture in high-dimensional data. Because of their outstanding learning capabilities from historical data, DL approaches can thus play a crucial role in developing intelligent data-driven systems that meet today's needs. As a result, DL's ability to automate tasks and learn from mistakes can revolutionize both the world and human existence. Therefore, DL technology is pertinent to disciplines of computer science including artificial intelligence, machine learning, and data science with sophisticated analytics, notably today's intelligent computing. The role of deep learning in AI and how DL technology relates to various areas of computing are the first
model is created. DL is thus a frontier for artificial intelligence and one of its basic technologies that can be used to create automated systems and intelligent systems. Deep learning and "Data Science" have a tight relationship because DL can learn from data. Data science typically refers to the complete procedure of deriving meaning or insights from data in a certain problem domain, where DL techniques can be crucial for advanced analytics and wise decision-making. Overall, we can draw the conclusion that DL technology has the potential to transform the world as we know it, particularly in terms of a potent computational engine and its ability to support technology-driven automation, smart and intelligent systems, and Industry 4.0.

II. DEEP LEARNING TECHNIQUES

The several forms of deep neural network approaches are covered in this section. To train, these techniques often take into account multiple layers of information-processing stages in hierarchical structures. Input and output layers, along with many hidden layers, are commonly seen in deep neural networks. In comparison to a shallow network (hidden layers = 1), the figure depicts the overall structure of a deep neural network (hidden layers = N, N ≥ 2). In this section, we also give our taxonomy on DL approaches depending on how they are applied to different issues. But before delving into the specifics of DL approaches, it's helpful to review several kinds of learning tasks:

1. Supervised: a task-driven methodology utilizing labelled training data.
2. Unsupervised: a procedure that uses data to assess unlabelled datasets.
3. Semi-supervised: hybridizing the supervised and unsupervised approaches.
4. Reinforcement: an environment-driven strategy, which was briefly covered in our prior study.

In order to show our taxonomy, we broadly classify DL approaches into the following three groups:
- deep networks for supervised or discriminative learning;
- deep networks for unsupervised or generative learning;
- deep networks for hybrid learning combining both and relevant others, as shown in Fig.

In the following, we briefly discuss each of these techniques that can be used to solve real-world problems in various application areas according to their learning capabilities.

Deep Networks for Discriminative or Supervised Learning

In supervised or classification applications, this group of DL approaches is used to give a discriminative function. By modelling the posterior distributions of classes conditioned on observable data, discriminative deep architectures are often created to provide discriminative capacity for pattern classification. Multi-Layer Perceptron (MLP), Convolutional Neural Networks (CNN or ConvNet), Recurrent Neural Networks (RNN), and their derivatives are the three basic types of discriminative architectures. Here, we'll talk a little bit about these methods.

1. Multi-layer Perceptron (MLP): A multi-layer perceptron (MLP) is a form of feedforward artificial neural network (ANN) and a supervised learning method. It is sometimes referred to as the deep neural network (DNN) or deep learning base architecture. A typical MLP is a fully connected network made up of an input layer that accepts input data, an output layer that makes a judgment or prediction about the input signal, and one or more hidden layers between these two that are thought of as the computational engine of the network. Different activation functions, often referred to as transfer functions, are used to define an MLP network's output, including ReLU (Rectified Linear Unit), Tanh, Sigmoid, and Softmax. Backpropagation, a supervised learning approach also known as the fundamental component of a neural network, is the most widely used algorithm for training MLP. Numerous optimization techniques, including Stochastic Gradient Descent (SGD), Limited-Memory BFGS (L-BFGS), and Adaptive Moment Estimation (Adam), are used during the training process. MLP needs fine-tuning of a variety of hyper-parameters, including the number of hidden layers, neurons, and iterations, which could increase the computing cost of solving a complex model. However, MLP has the benefit of learning non-linear models online or in real time through partial fit.
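The MLP described in this section can be sketched with scikit-learn's `MLPClassifier`; the layer sizes, activation, solver, and dataset below are illustrative choices, not a prescription from the text:

```python
# Sketch of an MLP trained by backpropagation; all settings are illustrative.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, n_features=10, random_state=1)

mlp = MLPClassifier(
    hidden_layer_sizes=(64, 32),  # two hidden layers: the "computational engine"
    activation="relu",            # transfer function (Tanh, logistic, ... also exist)
    solver="adam",                # optimizer; "sgd" and "lbfgs" are also supported
    max_iter=500,
    random_state=1,
)
mlp.fit(X, y)                     # weights fitted via backpropagation
print("training accuracy:", mlp.score(X, y))
```

For the online/real-time learning mentioned above, the same estimator exposes a `partial_fit` method that updates the model on incremental mini-batches instead of the full dataset.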
A taxonomy of DL techniques, broadly divided into three major categories: deep networks for supervised or discriminative learning; deep networks for unsupervised or generative learning; and deep networks for hybrid learning and relevant others.

2. Convolutional Neural Network (CNN or ConvNet): A well-liked discriminative deep learning architecture, the CNN learns straight from the input without the requirement for manual feature extraction. The figure illustrates a CNN with many convolutional layers and pooling layers. As a result, the CNN improves on standard ANN designs, such as regularized MLP networks. Every layer of the CNN considers the ideal parameters for a meaningful output as well as minimizes the model complexity. Additionally, the CNN employs "dropout", which can address the over-fitting issue that could arise in a conventional network. CNNs are frequently used in visual identification, medical image analysis, image segmentation, natural language processing, and many other applications since they are specifically designed to deal with a range of 2D shapes. A CNN is more time-effective than a traditional network since it can automatically identify key elements from the input without the need for human participation. According to their learning capacities, different CNN variations, such as Visual Geometry Group (VGG), AlexNet, Xception, Inception, ResNet, etc., can be applied in different application domains. Multiple convolution and pooling layers are included in a convolutional neural network (CNN or ConvNet).

3. Recurrent Neural Network (RNN) and its Variants: Another well-known neural network is the recurrent neural network (RNN), which uses time-series or sequential data and feeds the results of the previous stage as input to the current stage. Recurrent networks, like feedforward networks and CNNs, also learn from training input, but they set themselves apart by having a "memory" that lets them use data from earlier inputs to influence the current input and output. The output of an RNN depends on previous items in the sequence, in contrast to a normal DNN, which presumes that inputs and outputs are independent of one another. Standard recurrent networks, on the other hand, struggle with learning lengthy data sequences due to the problem of diminishing gradients. Following, we go over some well-liked recurrent network variations that reduce these problems and function admirably in a variety of real-world scenarios.

Long short-term memory (LSTM): This is a popular form of RNN architecture that uses special units to deal with the vanishing gradient problem, and it was introduced by Hochreiter et al. A memory cell in an LSTM unit can store data for long periods, and the flow of information into and out of the cell is managed by three gates. For instance, the 'Forget Gate' determines what information from the previous cell state will be memorized and what information will be removed that is no longer useful, while the 'Input Gate' determines which information should enter the cell state and the 'Output Gate' determines and controls the outputs. As it solves the issues of training a recurrent network, the LSTM network is considered one of the most successful RNNs.

Bidirectional RNN/LSTM: The ability to receive data from both the past and future is provided by bidirectional RNNs, which link two hidden layers that run in opposite directions to a single output. Unlike conventional recurrent networks, bidirectional RNNs may process both positive and negative time directions simultaneously. An addition to the normal LSTM that can improve model performance on sequence classification problems is the bidirectional LSTM, also referred to as the BiLSTM. It is a model for sequence processing that uses two LSTMs, one of which moves the input forward and the other backward. In natural language processing tasks, the bidirectional LSTM is a preferred option.

Gated Recurrent Units (GRUs): Cho et al. invented the Gated Recurrent Unit (GRU), a well-liked variation of the recurrent network that uses gating techniques to regulate and manage information flow between cells in the neural network. The GRU is similar to an LSTM but has fewer parameters since, as shown in Fig., it has a reset gate and an update gate but not an output gate. A GRU has two gates (the reset and update gates), but an LSTM has three gates (the input, output, and forget gates). This is the main distinction between a GRU and an LSTM.
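The two-gate GRU update described above can be sketched in plain NumPy; the weight shapes, initialization, and sequence length below are illustrative, not taken from any particular implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: an update gate z and a reset gate r, with no output
    gate -- the main structural difference from an LSTM's three gates."""
    z = sigmoid(Wz @ x + Uz @ h)               # update gate: how much to refresh
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate: how much past to use
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate hidden state
    return (1.0 - z) * h + z * h_tilde         # blend old state and candidate

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 8
# Wz, Uz, Wr, Ur, Wh, Uh with matching shapes (illustrative random weights)
params = [rng.standard_normal((n_hidden, d)) for d in (n_in, n_hidden) * 3]

h = np.zeros(n_hidden)
for t in range(5):                             # run over a short input sequence
    h = gru_cell(rng.standard_normal(n_in), h, *params)
print("hidden state shape:", h.shape)
```

Because the candidate state passes through `tanh` and the update is a convex blend, the hidden state stays bounded, which is part of why gated cells tame the diminishing-gradient issue noted above.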
Belief Network (DBN), as well as its derivatives, are frequently used deep neural network algorithms for unsupervised or generative learning.

1) Generative Adversarial Network (GAN): In order to generate new plausible samples on demand for generative modelling, Ian Goodfellow created the Generative Adversarial Network (GAN), a particular sort of neural network architecture. To enable the model to generate or output new instances from the original dataset, it automatically identifies and learns regularities or patterns in the input data, as depicted in Fig. A discriminator D forecasts the probability that a given sample was taken from genuine data rather than data produced by the generator, while a generator G makes new data with attributes comparable to the original data. Thus, in GAN modelling, both the generator and discriminator are trained to compete with each other. While the generator tries to fool and confuse the discriminator by creating more realistic data, the discriminator tries to distinguish the genuine data from the fake data generated by G.

While a standard GAN learns a mapping from a latent space to the data distribution, inverse models, such as the Bidirectional GAN (BiGAN), can also learn a mapping from data to the latent space. Healthcare, image analysis, data augmentation, video generation, voice generation, pandemics, traffic control, cybersecurity, and many other fields have the potential to use GAN networks, and the number of these fields is growing quickly. GANs have generally proven themselves to be a full-fledged autonomous data-augmentation domain and a solution to issues requiring a generative solution.

2) Auto-Encoder (AE) and Its Variants: An auto-encoder (AE) is a popular unsupervised learning technique in which neural networks are used to learn representations. Typically, auto-encoders are used to work with high-dimensional data, and dimensionality reduction explains how a set of data is represented. Encoder, code, and decoder are the three parts of an autoencoder. The encoder compresses the input and generates the code, which the decoder subsequently uses to reconstruct the input. Recently, generative data models were taught
for supervised or discriminative learning tasks and to ensure model accuracy. Due to its ability to train deep neural networks using previously learned knowledge, DTL can take several forms depending on the circumstances between the source and target domains and activities. While most current research focuses on supervised learning, how deep neural networks can transfer knowledge in unsupervised or semi-supervised learning may gain further interest in the future. DTL techniques are useful in a variety of fields including natural language processing, sentiment classification, visual recognition, speech recognition, spam filtering, and relevant others. A general structure of the transfer learning process, where knowledge from a pre-trained model is transferred into a new DL model, is shown in Figure.

Deep Reinforcement Learning (DRL): The sequential decision-making problem is approached differently by reinforcement learning than by the other methodologies we have seen so far. In reinforcement learning, the ideas of an environment and an agent are frequently introduced first. The agent can take a number of actions in the environment, each of which affects the state of the environment and has the potential to produce rewards (feedback): "positive" for good action sequences that produce "good" states and "negative" for bad action sequences that produce "bad" states. Reinforcement learning's goal is to develop desirable action patterns through interacting with the environment; the learned behaviour is generally referred to as a policy.

In order to enable agents to learn the right actions in a virtual environment, deep reinforcement learning (DRL or deep RL) blends neural networks with a reinforcement learning architecture, as shown in Fig. While model-free RL systems learn directly through interactions with the environment, model-based RL is based on learning a transition model that permits modelling of the environment without interacting with it. For every (finite) Markov Decision Process (MDP), the best action-selection strategy can be found using the widely used model-free RL technique known as Q-learning. A mathematical framework called the MDP is used to describe decisions that are based on states, actions, and rewards. Aside from that, the area makes use of Deep Q-Networks, Double DQN, Bi-directional Learning, Monte Carlo Control, etc. As policy and/or value function approximators, DRL approaches combine DL models, such as Deep Neural Networks (DNN), based on the MDP principle, using raw, high-dimensional visual inputs. DRL-based solutions are applicable to a variety of real-world applications, including robotics, video games, natural language processing, computer vision, and pertinent others.

III. DEEP LEARNING APPLICATIONS

Deep learning has been effectively used to solve many issues in several application areas during the last few years. These include robots, business, cybersecurity, virtual assistants, image identification, healthcare, and many more. They also include natural language processing and sentiment analysis.

We have outlined a number of deep learning's potential real-world application areas in Fig. These application domains use a variety of deep learning techniques according to our taxonomy, which is shown in Fig. and includes discriminative learning, generative learning, as well as the hybrid models that were previously described. In different real-world application sectors, there are various deep learning tasks and methods that are employed to handle the pertinent problems. Overall, based on Fig., we may draw the conclusion that deep learning modelling has enormous future potential and a wide range of application fields. The research difficulties surrounding deep learning modelling are also summarized in the next section, along with some prospective elements for next-generation DL modelling.

1. Healthcare:
One of the industries that has embraced contemporary technology most widely to transform itself is the healthcare industry. Deep Learning is being used to analyse medical data for:
1) The diagnosis, prognosis and treatment of diseases
2) Drug prescription
3) Analysing MRIs, CT scans, ECGs, X-Rays, etc., to detect and notify about medical anomalies
4) Personalising treatment, monitoring the health of patients, and more.
The detection and treatment of cancer is one significant area where deep learning is used. To rank various cancer cell types, medical professionals employ a CNN, or Convolutional Neural Network, a deep learning technique. They 20X- or 40X-magnify high-resolution histopathology pictures before exposing them to deep CNN models. The deep CNN models then distinguish different cellular properties present in the sample and identify materials that are carcinogenic.

2. Personalized Marketing:
The idea of personalized marketing has been widely used in recent years. Marketers are now focusing their advertising campaigns on the needs of specific consumers and providing them with the solutions to their problems. And Deep Learning is crucially important in this. Thanks to their use of social media platforms, Internet of Things (IoT) devices, online browsers, wearables, and other similar technologies, consumers today generate a lot of data. The majority of the data produced from these sources, nevertheless, is fragmentary (text, audio, video, location data, etc.). Businesses utilize adaptable Deep Learning models to evaluate data from many sources and distill it in order to derive insightful customer information. They then employ this data to forecast consumer behavior and more effectively focus their marketing efforts. Consequently, you now comprehend how those online purchasing sites decide which things to suggest to you.

3. Spotting Financial Fraud:
The evil known as "fraudulent transactions" or "financial fraud" affects almost every industry. However, the financial institutions (banks, insurance companies, etc.) are the ones who must deal with this threat's worst effects. Criminals target financial institutions every single day, and there are numerous ways to hijack their financial resources.

4. Natural Language Processing:
Another significant area where Deep Learning is demonstrating promising results is in NLP, or Natural Language Processing. The goal of natural language processing, as the name suggests, is to make it possible for computers to comprehend and analyze human language. The idea seems straightforward, right? But the truth is, machines have a terrible time understanding human language. By teaching machines (Autoencoders and Distributed Representation) to produce suitable answers to linguistic inputs, it is possible to learn human language beyond the alphabet and words, including context, accents, handwriting, and other aspects. The personal assistants that we utilize on our smartphones are one such example. These applications include Deep Learning-infused Natural Language Processing (NLP) models to recognize human speech and produce the intended results. Therefore, it makes sense that Siri and Alexa sound so much like real people. The automatic translation of websites from one human language to another using Deep Learning-based NLP is another example.

5. Autonomous Vehicles:
1) The first semi-automatic car was introduced by the Tsukuba Mechanical Engineering Laboratory 45 years ago, which is when the idea of creating automated or self-governing vehicles initially came into existence. The car, a technological marvel at the time, was equipped with two cameras and an analog computer so it could drive itself down a road that was made just for it. But it wasn't until 1989 that an altered military ambulance called ALVINN (Autonomous Land Vehicle in a Neural Network) used neural networks to find its way on roadways on its own. Since then, deep learning and autonomous vehicles have forged a close relationship, with the former significantly improving the performance of the latter. Autonomous vehicles use cameras, sensors, including LiDAR, RADAR, and motion sensors, as well as outside data like geomapping, to detect their surroundings and gather pertinent information. They employ this gear both singly and collectively to record the data.
2) Deep learning algorithms are then used with this data to guide the vehicle to take the proper actions, such as:
- accelerating, steering and braking
- identifying or planning routes
- traversing the traffic
- recognizing traffic signs and spotting people and other cars both nearby and at a distance
3) Realizing the alleged goals of self-driving cars, such as lowering the number of traffic accidents, assisting the disabled in operating a vehicle, etc., depends greatly on deep learning. Although still in their infancy, deep learning-powered vehicles will soon make up the majority of the traffic on the roads.

IV. CONCLUSION

Because they are not naturally optimized by the model, deep learning models like the Convolutional Neural Network (CNN) have an enormous number of parameters, which we can actually refer to as hyper-parameters. Grid-searching these hyper-parameters' ideal values is possible, but it takes a lot of time and resources. Therefore, does a real data scientist accept educated guesses for these crucial parameters? Building on the design and architecture of the specialists who have conducted in-depth research in your field, frequently with powerful hardware at their disposal, is one of the finest methods to improve your models. They graciously open-source the resulting modelling architectures and reasoning rather frequently. Future neural networks might
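The hyper-parameter grid search mentioned in the conclusion can be sketched with scikit-learn's `GridSearchCV`; the estimator, parameter grid, and data below are illustrative stand-ins, not the search space of an actual CNN:

```python
# Sketch of exhaustive hyper-parameter grid search; all values illustrative.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

grid = {
    "hidden_layer_sizes": [(32,), (64, 32)],  # candidate architectures
    "alpha": [1e-4, 1e-2],                    # L2 regularization strengths
}
search = GridSearchCV(
    MLPClassifier(max_iter=300, random_state=0),
    param_grid=grid,
    cv=3,               # 3-fold cross-validation per parameter combination
)
search.fit(X, y)        # exhaustive: 2 x 2 combinations x 3 folds = 12 fits
print("best parameters:", search.best_params_)
```

Even this tiny grid multiplies training cost twelvefold, which illustrates why the conclusion recommends starting from published architectures rather than searching from scratch.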
VI. REFERENCES
ABSTRACT—
Cloud computing is the on-demand delivery of IT resources over the Internet with pay-as-you-go pricing. Instead of buying, owning, and maintaining physical data centers and servers, companies can access technology services, such as computing power, storage, and databases, on an as-needed basis from cloud service providers. Cloud computing provides greater flexibility, efficiency and strategic value compared to traditional on-premises IT infrastructure. The advantages of cloud computing solutions for businesses include increased capacity, functionality, scalability, productivity, less maintenance, and reduced cost. Moreover, cloud computing solutions are easily available from anywhere with an internet connection. Cloud migration is the process of moving a company's digital assets, platforms, databases, IT resources, and applications, either partially or completely, into the cloud. A successful cloud migration reduces cost, improves scalability, and significantly reduces the risk of a cyber-incident that could disrupt the company's business.

Keywords—Cloud Computing, Cloud Migration, Challenges in Cloud Computing, Cloud Computing Architecture and Types of Services.

I. INTRODUCTION
Cloud computing is the on-demand availability of computing resources (such as storage and infrastructure), delivered as services over the internet. It removes the need for individuals and businesses to self-manage physical resources themselves, and they pay only for what they use. Cloud migration is a transformation from traditional business operations to digital business operations, and the process refers to moving the core business operations to the cloud. That means data, applications or other business elements are moved into a cloud computing environment; for example, moving data and applications from a local, on-premises data center to the cloud. Every business, from small to large organizations, follows a considerably different process for cloud migration. Some of the common essentials which are considered before cloud migration are: calculation of requirements and performance, selection of a cloud provider, and calculation of operating costs.
The basic steps in a cloud migration are: defining migration goals, creating a security strategy, replicating the existing database, moving business intelligence, and then switching production from on-premises to the cloud.

II. TYPES OF BASIC CLOUD SERVICES
Cloud computing can be separated into three general service delivery categories or methods of cloud computing:

1. SaaS: SaaS is a distribution model that delivers software applications over the internet; these applications are regularly called web services. Users can access SaaS applications and services from any location using a computer or mobile device that has internet access. In the SaaS model, users gain access to application software and databases. One common example of a SaaS application is Microsoft 365 for productivity and email services.
2. PaaS: In the PaaS model, cloud providers host development tools on their infrastructures. Users access these tools over the internet using APIs, web portals or gateway software. PaaS is used for general software development, and many PaaS providers host the software after it's developed. Common PaaS products include Salesforce's Lightning Platform, AWS Elastic Beanstalk and Google App Engine.
3. IaaS: IaaS providers, such as Amazon Web Services (AWS), supply a virtual server instance and storage, as well as application programming interfaces (APIs) that let users migrate workloads to a virtual machine (VM). Users have an allocated storage capacity and can start, stop, access and configure the VM and storage as desired. IaaS providers offer small, medium, large, extra-large, and memory- or compute-optimized instances, in addition to enabling customization of instances for various workload needs. The IaaS cloud model is closest to a remote data center for business users.

III. CLOUD COMPUTING ARCHITECTURE

Cloud computing architecture is a combination of components required for a cloud computing service. A cloud computing architecture is made up of several components, such as a frontend platform, a backend platform or servers, a network or internet service, and a cloud-based delivery service. Cloud computing encompasses two components: the front end and the back end. The front end comprises the client part of a cloud computing system. It includes the interfaces and applications that are required to access the cloud computing or cloud programming platforms.
The architecture of cloud computing encompasses many different components. It comprises client infrastructure, applications, services, runtime clouds, storage, management, and security. These are all parts of a cloud computing architecture.

Front End:
The client uses the front end, which encompasses a client-side interface and application. Both of these components are essential to access the cloud computing platform. The front end includes web browsers (Chrome, Firefox, etc.), clients, and mobile devices.

[Figure: Architecture of cloud computing]

Back End:
The backend part helps you manage all the resources required to provide cloud computing services. This part of the cloud architecture comprises a security mechanism, a large amount of data storage, servers, virtual machines, traffic control mechanisms, etc.
The back end refers to the cloud itself; it encompasses the resources required for cloud computing services. It comprises virtualization technologies, servers, data storage, security mechanisms, etc. It is under the provider's control. Cloud computing distributes the file system across multiple hard disks and machines. Data is never stored in only one place, and if one unit fails, another takes over automatically. The user disk space is allocated on the distributed file system, while another important component is the algorithm for resource allocation.

IV. COMPONENTS OF CLOUD COMPUTING ARCHITECTURE

1. Client Infrastructure:
Client infrastructure is a front-end component that provides a GUI. It helps users interact with the cloud.
2. Application:
The application can be any software or platform which a client wants to access.
3. Service:
The service component determines which type of service you can access according to the client's requirements.
Three cloud computing services are:
Software as a Service (SaaS)
Platform as a Service (PaaS)
Infrastructure as a Service (IaaS)
4. Runtime Cloud:
The runtime cloud offers the execution and runtime environment to the virtual machines.
5. Storage:
Storage is another important cloud computing architecture component. It provides a large amount of storage capacity in the cloud to store and manage data.
6. Infrastructure:
It offers services on the host level, network level, and application level. Cloud infrastructure includes hardware and software components like servers, storage, network devices, virtualization software, and various other storage resources that are needed to support the cloud computing model.
7. Management:
This component manages components like application, service, runtime cloud, storage, infrastructure, and other security matters in the backend. It also establishes coordination between them.
8. Security:
Security in the backend refers to implementing different security mechanisms to secure cloud systems, resources, files, and infrastructure for the end user.
9. Internet:
The internet connection acts as the bridge or medium between the frontend and backend. It allows you to establish the interaction and communication between the frontend and backend.

V. CLOUD COMPUTING STRATEGY

1. Replace: It refers to replacing the old application with a new SaaS (Software as a Service) product.
2. Refactor: It refers to reusing the application code and frameworks and running the application on a PaaS (Platform as a Service).
3. Rehost: It refers to moving the application to a new cloud-hosted environment by choosing IaaS (Infrastructure as a Service).
4. Rebuild: It refers to re-architecting the application from scratch on a PaaS provider's platform.
5. Revise: It refers to extending the code base and then deploying it either by rehosting or refactoring.

Cloud computing is a resilient distributed environment, and it largely depends on strong algorithms to fully employ the cloud's capabilities. The possible challenges are the following:

Performance: When moving the business apps to the cloud or a third-party vendor, the business performance becomes dependent on the provider. Another significant matter in cloud computing is finding the right cloud service provider. Before investing, we should search for providers with innovative technologies. The performance of the BIs and other cloud-based systems is also tied to the provider's systems. Be careful when picking a service, and ensure that they have mechanisms in place to deal with issues that arise in real time.

Security: The principal concern in investing in a cloud facility is cloud computing security. This is because your data is stored and administered by a third-party provider without your knowledge. Every day or so, you hear about some organization's broken authentication, compromised credentials, account hijacking, data breaches, and so on, which makes the user even more skeptical. Fortunately, cloud vendors have started to make efforts to improve security capabilities. You can also be careful by checking to see if the provider has a secure user identity management system and access control procedures in place. Also, make sure that it monitors database threats and follows privacy conventions.
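The five migration strategies listed in Section V (replace, refactor, rehost, rebuild, revise) amount to a decision about the target service model and how much of the existing application is reused. A minimal sketch follows; the reuse flags and notes are simplified assumptions drawn from the section's wording, not a formal decision procedure from the paper.

```python
# Sketch: map the five cloud migration strategies from Section V to the
# target service model and whether existing code is reused.
# The classifications below are simplified illustrative assumptions.

MIGRATION_STRATEGIES = {
    "replace":  {"target": "SaaS", "reuses_code": False,
                 "note": "swap the old application for a SaaS product"},
    "refactor": {"target": "PaaS", "reuses_code": True,
                 "note": "reuse code and frameworks, run on a PaaS"},
    "rehost":   {"target": "IaaS", "reuses_code": True,
                 "note": "lift and shift onto IaaS virtual machines"},
    "rebuild":  {"target": "PaaS", "reuses_code": False,
                 "note": "re-architect from scratch on a PaaS platform"},
    "revise":   {"target": "PaaS or IaaS", "reuses_code": True,
                 "note": "extend the code base, then rehost or refactor"},
}

def describe(strategy: str) -> str:
    """One-line summary of a migration strategy."""
    s = MIGRATION_STRATEGIES[strategy.lower()]
    return f"{strategy}: {s['note']} (target: {s['target']})"

for name in MIGRATION_STRATEGIES:
    print(describe(name))
```

A table like this is often the first artifact of a migration assessment, since it forces an explicit choice of target model per application.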
Cloud deployment models are:
1. Public
2. Private
3. Community
4. Hybrid

Cloud computing is the delivery of computing services, including servers, storage, databases, networking, software, analytics, and intelligence, over the Internet ("the cloud") to offer faster innovation, flexible resources, and economies of scale. Cloud computing relies heavily on virtualization and automation technologies. Virtualization enables the easy abstraction and provisioning of services and underlying cloud systems into logical entities that users can request and consume. Automation and accompanying orchestration capabilities provide users with a high degree of self-service for provisioning resources, connecting services and deploying workloads without direct intervention from the cloud provider's IT staff.
Cloud computing is turning into the backbone of essentially everything we do now, and as such, small, medium, and large organizations are adopting cloud technologies as they need space to store all their data. Cloud computing isn't just good for organizations, however; it's also great for individuals, as they can take advantage of features like sharing, maintenance, and elasticity.

IX. REFERENCES

1. Ugonna Anthony, Boison, David King, Yeboah-Boateng, Ezer Osei, "Cloud Computing Migration Framework for Microfinance - A Case of Banks in Accra, Ghana" (survey paper), International Journal of Computer Science and Information Technology, ISSN 2348-120X, Vol. 7, 2020, pp. 26-37.
2. Hajjat, M., Sun, X., Sung, Y., Maltz, D., Rao, S., Sripanidkulchai, K., Tawarmalani, M., "Cloudward bound: planning for beneficial migration of enterprise applications to the cloud", ACM SIGCOMM Computer Communication Review, Vol. 40, 2020, pp. 243-254. ACM.
3. Banerjee, J. (2021), "Moving to the Cloud: Workload Migration Techniques and Approaches", High Performance Computing (HiPC), 19th International Conference, pp. 1-6, 18-22 Dec. 2012.
4. Latif, Rabia, Abbas, Haider, Assar, Saïd, Ali, Qasim (2020), "Cloud Computing Risk Assessment: A Systematic Literature Review", 10.1007/978-3-642-40861-8_42.
5. Hashizume, Keiko, Rosado, David, Fernández-Medina, Eduardo, Fernández, Eduardo (2022), "An analysis of security issues for cloud computing", Journal of Internet Services and Applications, Vol. 4, 10.1186/1869-0238-4-5.
6. Pradip D. Patel, "Live Virtual Machine Migration Techniques in Cloud Computing: A Survey", International Journal of Computer Applications (0975-8887), Vol. 86, No. 16, January 2021.
7. Aslam, M., bin AB Rahim, L., Watada, J., Hashmani, M., International Journal of Engineering Applied Sciences and Technology, 2020, ISSN No. 2455-2143, pp. 73-81.
8. Mohd Khairul Akhbar Jahiruddin, Zulazhan Ab. Halim and Mohamad Azwan Kamarudin, "Readability Level of the Arabic Language Textbook of Diploma Tahfiz al-Quran & al-Qiraat Darul Quran JAKIM", International Journal of Advanced Research, ISSN No. 2320-5407, pp. 83-88.
9. Martin Bremer, Tim Walter, Nikita Fjodorovs, Katharina Schmid, "A Systematic Literature Review on the Suitability of Cloud Migration Methods for Small and Medium-Sized Enterprises", Conference on Production Systems and Logistics, 2021, pp. 567-579, DOI: https://doi.org/10.15488/11285.
10. Hsu, P.-F., Ray, S., Li-Hsieh, Y.-Y., 2020, "Examining cloud computing adoption intention, pricing mechanism, and deployment model", International Journal of Information Management, 34 (4), pp. 474-488.
11. Raut, R.D., Gardas, B.B., Narkhede, B.E., Narwane, V.S., 2021, "To investigate the determinants of cloud computing adoption in the manufacturing micro, small and medium enterprises", BIJ 26 (3), pp. 990-1019.
12. Ahmad, A. and M. A. Babar (2014), "A framework for architecture-driven migration of legacy systems to cloud-enabled software", Proceedings of the WICSA 2021 Companion Volume, Sydney, Australia, pp. 1-8.
13. Ardagna, D., E. D. Nitto, et al. (2020), "MODAClouds: a model-driven approach for the design and execution of applications on multiple clouds", Proceedings of the 4th International Workshop on Modeling in Software Engineering, pp. 50-56.
14. Thomas Chen, Ta-Tao Chuang, Kazuo Nakatani, "The Perceived Business Benefit of Cloud Computing: An Exploratory Study", Journal of International Technology and Information Management, pp. 48-58 (2020).
15. R. Rai, G. Sahoo and S. Mehfuz, "Exploring the factors influencing the cloud computing adoption: A systematic study on cloud migration", SpringerPlus, vol. 4:197, 2021, pp. 12-19.
ABSTRACT
This research looks at how chatbots have quickly changed and become important in areas like marketing, education, healthcare, and entertainment. The paper also looks at how people became interested in chatbots over time, why they are being used, and how they help in different areas. It talks about how people's beliefs and ideas can affect how chatbots are made and used. It also talks about how chatbots can be grouped based on what they know and what they do. Finally, it talks about the technology and tools used to make chatbots today. This research shows that chatbots have a lot of potential and should be studied more.
Keywords: Chatbot · Chatbot architecture · Artificial Intelligence · Machine learning · NLU

I. INTRODUCTION

In our fast-moving tech world, chatbots have become a big deal. They're like super-smart computer programs that can talk to us like humans. These chatbots are used all over the place, like in customer service, healthcare, schools, and online stores. They're changing how we do things and even making us think about how we talk to computers and what it means for our society.
Chatbots are more than just computer programs that give automatic answers. They're a mix of really fancy tech and the way our brains work. They use special computer math to understand what we mean and how we feel, so it feels like we're talking to a real person. They're getting better all the time, and they can be used in lots of ways to make things work better and be more convenient.
This research paper takes a deep look at chatbots. It talks about how they started and what they can do now. It also looks at how they're changing different parts of our lives, like helping with customer service or even providing support for mental health.
But there are some important things to think about too. As chatbots get better at pretending to be humans, we have to think about privacy, keeping our information safe, and when it's okay to use machines to talk instead of people.
Chatbots are a big part of how we use Artificial Intelligence in our lives. They're like smart computer helpers that can do all sorts of things, from answering questions to helping us shop online. They're popular because we can use them on different devices and in messaging apps, which makes them really easy to use and helpful.

II. LITERATURE REVIEW

Many apps are trying to make computer conversations feel more like talking to humans. But often, the information these apps use to chat comes from databases created by experts. We can use AI to create different types of chatbots, and in this paper, we've built a College Enquiry chatbot. It can answer questions about things like the enrollment process, fees, courses, eligibility criteria, and admissions. This paper also discusses how to use AI to understand important facts in texts about real people's lives. This can help create chatbots for middle school learning situations, whether online or in classrooms. Studying this type of learning involves many academic fields, including instructional technology, educational psychology, sociology, cognitive psychology, and social psychology. Some people describe chatbots as a way to interact with computers online, even though it seems like you're talking to a person. Others say chatbots are computer programs that can talk to users like they're having a conversation, thanks to artificial intelligence. Chatbots use special software to communicate using natural language, like we do when we talk to each other. It's often hard for users to tell that they're not talking to a real person, which is why having a big database of information is so important for chatbots.
Chatbots are becoming a popular way for organizations to communicate with individual users and quickly answer their questions. This interest in chatbots has grown recently due to improvements in messaging services and advances in artificial intelligence. The proposed system in this paper is a College Enquiry Chatbot created using the chatterbot algorithm, which is a Python library. It makes it easy for developers to create chatbots that can have conversations with users. This chatbot can provide information and answer questions related to college inquiries, like enrollment, courses, eligibility, and admissions. Users don't have to visit the college in person to get information; they can ask the chatbot online. The history of chatbots goes back to Alan Turing's 1950 Turing Test, but they became more popular with the introduction of Eliza in 1966. Over time, chatbots like PARRY and ALICE were developed. They paved the way for virtual personal assistants like Siri, Cortana, Alexa, Google Assistant, and Watson. The following AIML category is an example:
<aiml>
  <category>
    <pattern>WHAT IS YOUR NAME</pattern>
    <template>I'm a chatbot. You can call me ChatBot.</template>
  </category>
</aiml>
In this code:
The <aiml> tag marks the start of AIML code.
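The way an AIML interpreter resolves such a category can be sketched in a few lines of Python. This is a deliberately simplified illustration: real AIML engines also support wildcards (* and _), <srai> recursion, and conversational context, all of which are omitted here.

```python
# Simplified sketch of AIML-style category matching: normalize the user
# input and look it up against stored <pattern>/<template> pairs.
# Real AIML engines add wildcards, <srai> recursion, and context.

categories = {
    "WHAT IS YOUR NAME": "I'm a chatbot. You can call me ChatBot.",
}

DEFAULT_RESPONSE = "Sorry, I don't understand."  # predefined neutral reply

def normalize(text: str) -> str:
    """Uppercase and strip punctuation, roughly as AIML normalization does."""
    return "".join(ch for ch in text.upper()
                   if ch.isalnum() or ch == " ").strip()

def respond(user_input: str) -> str:
    """Return the template whose pattern matches, or a neutral fallback."""
    return categories.get(normalize(user_input), DEFAULT_RESPONSE)

print(respond("What is your name?"))  # matches the category above
print(respond("Tell me a joke"))      # falls back to the neutral reply
```

The fallback reply also illustrates the "predefined neutral responses" design requirement discussed later in the paper: a bot should degrade gracefully when no pattern matches.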
III. ESSENTIAL CONCEPTS

Pattern Matching: Chatbot responses are generated based on user input, with pioneers like Eliza and ALICE using this method. However, this approach can make conversations seem predictable and less human-like.
Artificial Intelligence Mark-up Language (AIML): Developed from 1995 to 2000, AIML uses pattern recognition for human-chatbot dialogues. It's an XML-based markup language that uses categories with user input patterns and chatbot responses to improve

IV. TYPES OF CHATBOTS

Chatbots can be categorized in various ways, including by their knowledge domain, the services they offer, their goals, how they process input, whether they interact with other chatbots, and how they are built. Knowledge Domain Classification: This categorization considers the chatbot's knowledge scope. Open domain chatbots can discuss a wide range of topics, while closed domain chatbots specialize in a specific knowledge area and may not handle unrelated queries effectively. Service-Based Classification: This classification is based on the nature of the interaction and the services provided. Interpersonal chatbots facilitate tasks like
balance between them can vary from one chatbot to another. These classifications help in understanding the approach and capabilities of different chatbot systems.

Design and Development

Chatbot design is a complex task that involves several techniques to ensure that it serves its intended purpose and falls into the appropriate category. Here's an overview of key considerations in chatbot design:
Algorithm and Tool Selection: Developers choose algorithms and tools based on the chatbot's purpose and category. These choices influence how the chatbot understands and generates responses.
User Expectations: Users need to have a clear understanding of what to expect from the chatbot. This includes knowing the chatbot's capabilities and limitations.

Design Requirements:
Accurate Knowledge Representation: The chatbot needs to have an accurate representation of the knowledge it uses to provide responses. This knowledge can be hand-coded or acquired through training data.
Answer Generation Strategies: Chatbots employ strategies to generate responses, which can vary from rule-based approaches to machine learning techniques.
Predefined Neutral Responses: Chatbots should have predefined responses for situations when they don't understand the user's input. These responses can help manage user expectations.
Modular Approach: A modular approach to chatbot design is often preferred. This means breaking down the chatbot's functionality into distinct components that work together. This modular structure can enhance flexibility and maintainability.

General Chatbot Architecture (as shown in Figure 3):
User Input: The process begins with a user's request or input.
Language Understanding Component: This component processes the user input to determine the user's intent and gather any relevant information.
Decision-Making: Based on the understanding of the user's input, the chatbot decides on its next action. This action can range from providing direct responses to asking for more context or clarification.
Chatbot design is an iterative process, and continuous improvement is essential to ensure that the chatbot effectively serves its users and fulfills its intended purpose.

V. CONCLUSION

The ultimate goal in technology development is to reduce the need for human intervention as much as possible. Chatbots are particularly effective at achieving this goal, as they can reach a wide audience through messaging apps and can often outperform human agents in certain tasks. They have the potential to evolve into advanced tools for gathering information and can significantly reduce costs in customer service operations. As AI and machine learning continue to advance, it may become increasingly difficult to distinguish between chatbots and human agents, as chatbots become more sophisticated in their responses and interactions. This research provides valuable insights into the fundamental aspects of chatbots, which can be beneficial for both users and developers in understanding how to use and create them effectively. Future directions for research in this field may include a more in-depth analysis of existing chatbot platforms, assessments of their functionality, and exploring ethical considerations related to issues like abuse and deception. The acknowledgments mention support from the MPhil program "Advanced Technologies in Informatics and Computers" at the International Hellenic University's Department of Computer Science, which contributed to this research.

VII. REFERENCES

1. Kumar Shivam, Khan Saud, Manav Sharma, Saurav Vashishth, Sheetal Patil, "Chatbot for College Website", International Journal of Computing and Technology, June 2018.
2. Ms. Ch. Lavanya Susanna and R. Pratyusha, "College Enquiry Chatbot", International Research Journal of Engineering and Technology (IRJET), 3rd March 2020.
3. Guruswami Hiremath, Aishwarya Hajare, Priyanka Bhosale and Rasika Nanaware, "Chatbot for education
an organization's marketing goals, policies, and action sequences (tactics) into a cohesive whole. The objective of a marketing strategy is to provide a foundation from which a tactical plan is developed. This allows the organization to carry out its mission effectively and efficiently.
The following techniques are implemented to devise the marketing strategy for the product/service:
Segmentation
Targeting
Positioning
Segmentation: Market segmentation is widely defined as a complex process consisting of two main phases: identification of broad, large markets, and segmentation of these markets in order to select the most appropriate target markets and develop marketing mixes accordingly.
Positioning: Simply, positioning is how your target market defines you in relation to your competitors. A good position is: 1. What makes you unique? 2. Something considered a benefit by your target market. Positioning is important because you are competing with all the noise out there for your potential customers' attention. If you can stand out with a unique benefit, you have a chance at getting their attention. It is important to understand your product relative to the competition.
Targeting: Targeting involves breaking a market into segments and then concentrating your marketing efforts on one or a few key segments. Target marketing can be the key to success. The beauty of target marketing is that it makes the promotion, pricing and distribution of your products and/or services easier and more cost-effective.

Marketing Mix:
Marketing professionals and specialists use many tactics to attract and retain their customers. These activities comprise different concepts, the most important one being the marketing mix. There are two concepts for the marketing mix: 4P and 7P. It is essential to balance the 4Ps or the 7Ps of the marketing mix. The concept of 4Ps has long been used for the product industry, while the latter has emerged as a successful proposition for the services industry. The 7Ps of the marketing mix that are used to frame the marketing strategies of life insurance companies can be discussed as follows:

Product - It must provide value to a customer but does not have to be tangible at the same time. Basically, it involves introducing new products or improving the existing products. A product means what we produce: if we produce goods, it means a tangible product, and when we produce services, it means an intangible service product. A product is both what a seller has to sell and what a buyer has to buy. So, insurance companies sell services, and services are their products. Apart from life insurance as a product, the customer buys not only the product but also services in the form of the assistance and advice of an agent. It is natural that customers expect reasonable returns on their investments, and insurance companies want to maximize their profitability. Hence, while deciding the product mix, services or schemes should be motivational.

Price - Pricing must be competitive and must entail profit. The pricing strategy can comprise discounts, offers and the like. The pricing of insurance products not only affects the sales volume and profitability but also influences the perceived quality in the minds of the consumers. There are several different methods for pricing insurance, based on the insurance marketer's objectives. They are the survival approach, the sales maximization approach, and the profit maximization approach. To determine the insurance premium, marketers consider various factors such as mortality rate, investment earnings, and expenses, in addition to the individual risk profile based on age, health, etc., and the time period/frequency of payment.
In the insurance business the pricing decisions are concerned with:
- The premium charged against policies
- The interest charged for defaulting on the payment of premium and the credit facility.
- Commission charged for underwriting and consultancy activities.
The pricing decisions may be high or low keeping in view the level or standard of the customers or policyholders. Mainly, the pricing of insurance is in the form of premium rates. The three main factors used for determining the premium rates under a life insurance plan are mortality, expense and interest:
Mortality: Average death rates in a particular area.
Expenses: The cost of processing, commission to agents and registration is all incorporated into the cost of instalments and the premium sum, and forms an integral part of the pricing strategy.
Interest: The rate of interest is one of the key factors. People would not be willing to put their funds into the insurance business if the interest rates provided by other financial instruments are higher than the perceived returns from the insurance premiums.

Place - It refers to the place where the customers can buy the product and how the product reaches out to that place. This is done through different channels, like the Internet, wholesalers and retailers. This component of the marketing mix is related to two important facets:
- Managing the insurance personnel
- Locating a branch
The management of insurance personnel should be done in such a way that the gap between services promised and services offered is bridged. In a majority of service-generating organizations, such a gap exists and has been instrumental in bringing down the image. The insurance personnel, if not managed properly, would make all efforts ineffective. They are required to be given adequate incentives to show their excellence. They should be provided intensive training focused mainly on behavioural management. Another important dimension of the place mix is the location of insurance branches. While locating branches, the branch manager needs to consider a number of factors such as smooth accessibility and the availability of infrastructural facilities and management of branch offices and premises. Thus, the place management of insurance premises needs a new vision, a distinct approach and an innovative style. The branch managers need professional excellence to make place decisions productive.

Promotion - It includes the various ways of communicating to the customers what the company has to offer. It is about communicating the benefits of using a particular product or service rather than just talking about its features. The insurance services depend on effective promotional measures, so as to create impulsive buying. Promotion comprises advertising and other publicity tactics. The promotion is a fight not only for market share, but also for mind share. Due attention should be given to selecting the promotional tools, and personnel should be given adequate training for creating impulsive buying.

People - People refer to the customers, employees, management and everybody else involved in it. It is essential for everyone to realize that the reputation of the brand that you are involved with is in the people's hands. Understanding the customer better allows one to design appropriate products. Being a service industry, which involves a high level of people interaction, it is very important to use this resource efficiently in order to satisfy customers. Training, development and strong relationships with intermediaries are the key areas to be kept under consideration.

Process - It refers to the methods and process of providing a service, and it is hence essential to have a thorough knowledge of whether the services are helpful to the customers, whether they are provided in time, whether the customers are informed in advance about the services, and many such things. The process should be customer friendly in the insurance industry. The speed and accuracy of payment is of immense importance. The processing method should be easy and convenient for the customers. Instalment schemes should be streamlined to cater to the ever-growing demands of the customers. IT and data warehousing will smoothen the process flow. IT will help in servicing the large number of customers
efficiently and bring down overheads. Technology can either complement or supplement the channels of distribution cost-effectively. It also helps to improve customer service levels and helps to find out the profitability and potential of various customer and product segments.

Physical (evidence) - It refers to the experience of using a product or service. When a service goes out to the customer, it is essential that you help him see what he is buying or not. For example, brochures, pamphlets etc. serve this purpose. Evidence is a key element of success for all insurance companies. Physical evidence can be provided to insurance customers in the form of policy certificates and premium payment receipts. The office building, the ambience, the service personnel etc. of the insurance company and their logo and brand name in advertisements also add to the physical evidence.
To reach a profitable mass of customers, new distribution avenues and alliances will be necessary. Initially, insurance was looked upon as a complex product with a high advice and service component. Buyers prefer face-to-face interaction, and they place a high premium on brand names and reliability.

Review of literature: Sankaran (1999) studied the measures that would help domestic players in the financial services sector to improve their competitive efficiency, and thereby to reduce transaction costs. The study found that the specific sources of sustainable competitive advantage relevant for the financial services industry are: product and process innovations, brand equity, positive influences of 'Communication Goods', corporate culture, experience effects, scale effects, and information technology. Trevor Watkins (1989), while studying the current state of the financial services industry worldwide, identified four major trends: the trend towards financial conglomeration, globalization, information technology in service marketing, and new approaches to financial services marketing. These trends, it was concluded, would affect the marketing of banks and other financial services in the 1990s. Marisa Maio Mackay (2001) examined whether differences exist between service and product markets, which warrant different marketing practices, by applying ten existing consumer-based measures ... predicted direction, where market share was used as an indicator of brand equity. Brand recall and familiarity, however, were found to be the best estimators of brand equity in the financial services market. P. Kotler rightly states that a company's marketing strategy depends on many factors, one of which is its size and position in the market. From this assertion he suggests that one method of classifying marketing strategies is to place the firm in accordance with its competitive position, namely as to whether they are market leaders, challengers, followers, or nichers. In effect these are behavioural strategies ordered in relation to the company's market share.

Impetus for marketing strategy: That India is a jumbo-sized opportunity for life insurance need hardly be laboured. Here is a nation of a billion people, of whom merely 100 million are insured. And, significantly, even those who do have insurance are grossly underinsured. The emerging middle-class population, growing affluence and the absence of a social security system combine to make India one of the world's most promising insurance markets however you look at it, whether in terms of life insurance premiums as a percentage of GDP or premium per capita: the market is under-penetrated and people are under-insured. In a country where there is high unemployment and where social security systems are absent, life insurance offers basic cover against life's uncertainties. Traditionally, India has been a savings-oriented country, and insurance plays a critical role in the development of the Indian economy. The role of insurance in the economy is vital, as it is able to mobilize premium payments into long-term investible funds. As such, it is a key sector for development. So marketing strategies are an important and inevitable means to tap the huge untapped potential. Effective selling of insurance policies depends to a large extent on the marketing strategies selected.

II. COMPONENTS OF MARKETING STRATEGIES

Pricing
Personal selling
Advertising
often non-technological, this is still the center of much analysis and debate (Kandampully, 2002).

III. CUSTOMER RELATIONSHIP MANAGEMENT

Insurance companies are experiencing competition from within and abroad. Turning this problem situation into an opportunity always lies with prudent management adopting or adapting tactics and strategies. In line with this, customer relationship management is a means of winning competitiveness, as it is the information-driven approach to customer analysis and process automation, and thereby supplements the customer-value proposition. Action on tangible services – prompt and accurate issue of documents, prompt and fair settlement of claims, a good listening mechanism, a better problem-solving approach, a reliable manner of service, and meeting the requirements of customers on time every time – in lieu of intangible promises would give utmost satisfaction to customers. Customer relationship management thus provides better service to the insured, protecting him against perils or risks, and enables the insurer to retain existing customers and bring new customers into his ambit of business.

Distribution channels: The distribution network is most important in the insurance industry. Insurance is not a high-cost industry like the telecom sector; it is therefore building its market on goodwill and access to a distribution network. We cannot deny that insurance is not bought, it is sold. The market has a great scope to grow. This can be better done through more innovative channels such as a supermarket, a bank, a post office, an ATM, a departmental store etc.; these could be used to increase the channels of insurance. But such growth in channels will take time. Till then, agents seem to be the most important distribution channel in this industry. Agents connect with people and influence them to buy an insurance policy. For this, agents charge commission on the policies they get for the company; there is a fixed percentage of commission for which these agents work. In the field of distribution channels, many innovative techniques can be adopted. For example, Bancassurance and selling through the postal network will make a great deal of difference. In Europe, 25 percent of insurance policies are sold through banks. Bancassurance, as a package of financial services that can fulfill both banking and insurance needs, if implemented correctly can bring vast benefits to stakeholders such as banks, insurance companies, shareholders and consumers. Bancassurance will facilitate mass selling of insurance products through banks. Banks can act as large financial supermarkets, and distribution of insurance will be smoother through the wider number of branches of the banks. A customer database, personalized service, rural penetration, cross-selling of products (e.g. a car loan along with car insurance) and being cheaper than agents are some of the greatest advantages of Bancassurance.

At present the distribution channels that are available in the market are listed below:

direct selling

corporate agents

group selling

brokers and cooperative societies

bancassurers

The birth of the Life Insurance Corporation of India (LIC): LIC was established on the 1st of September 1956. This was the time when the Indian insurance industry was nationalized: Parliament passed the LIC Act on 19th June 1956, and more than 245 insurance organizations & provident societies were combined to form the state-owned LIC of India. Have you imagined what the battle between a giant and the rest of the warriors would look like? Today, the Life Insurance Market of India is just like that. Many of us still do not have a clear answer on whether LIC is government or private, and there have been too many hearsay & rumours about LIC being privatised. The above gives a concrete answer that will help you understand whether LIC is government or private.

In today's scenario of a well-regulated life insurance market under the hawkish eye and strong governance of the Insurance Regulatory and Development Authority of India (IRDAI), there are still doubts and worries in the minds of customers, especially when it comes to trusting the private players with their lives and investments. Let's take out some time and bust some myths.
ISBN Number : 978-81-958673-8-7 121
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001
BIOGRAPHY

G. Siva Prasad, M.C.A., M.Tech (CSE), UGC-NET, works as Assistant Professor in the Department of MCA, KBN College (Autonomous), Vijayawada, Andhra Pradesh, and has 10 years of teaching experience and one year of industrial experience. His research interests include Data Mining, Machine Learning, Deep Learning, Big Data, Microsoft Programming Languages and Web Programming. He has attended workshops on Power BI, Data Analytics using R, Generative AI, Blockchain Technology and many more.
REFERENCES

1. Anuroop Tony Singh. (2004). Challenging Opportunity. Asia Insurance Post, 28-29.

2. Anil Chandok. (2006). Application of CRM in the Insurance Sector. Insurance Chronicle, May, 17-19.

3. Balaji, B. (2002). Services Marketing Management. New Delhi: S. Chand & Company Ltd.

4. Booms, B.H. and Bitner, M.J. (1981). "Marketing Strategies and Organization Structures for Service Firms", in Donnelly, J.H. and George, W.R. (Eds.), Marketing of Services. Chicago: American Marketing Association, pp. 47-51.
I. Abstract:

We're talking about studying patterns in the sun's behavior, like its changing activity over an eleven-year cycle. This is important because it affects life on Earth. To predict this, we suggest using a special kind of computer program called a deep neural network with LSTM. We want to compare this program's performance with another one called an RNN. We're keeping everything the same in our experiments to make it fair. The results show that both the LSTM program and a different kind of program called a fully connected deep neural network can predict sunspots more accurately. This tells us that the LSTM program is really good at predicting these sunspot patterns that happen in cycles.

II. Introduction:

Sunspots are intriguing phenomena on the sun's surface that have significant effects on Earth. Predicting sunspot activity is crucial for anticipating related events like solar wind and emissions. The study aims to predict sunspot occurrences using advanced neural networks.

III. Deep Learning:

Deep learning involves constructing neural networks with many layers to comprehend complex data relationships. Unlike traditional methods, deep learning can automatically uncover data features. It's particularly useful for handling intricate problems where expert input is challenging.

IV. Fully Connected Deep Neural Network:

A fully connected deep neural network consists of layers where each neuron connects to all neurons in the next layer. The model's complexity depends on the number of layers and neurons. A balance is needed between accuracy and computational efficiency. This study selects optimal parameters through experimentation.

The goal of training is to make the models learn from historical sunspot data so that they can predict sunspot activity. To judge how accurate the predictions are, the study uses error measures that show how close a model's predictions are to the actual sunspot data:

Mean Absolute Error (MAE): This is like finding the average of how far off our predictions are from the actual values. It helps us understand how big our prediction mistakes are.

Mean Squared Error (MSE): We're finding the average of the differences between our predictions and the real values, but we square those differences first. This way, we make bigger mistakes count more.

Root Mean Squared Error (RMSE): It's the square root of the MSE. This helps us understand the typical size of our prediction errors in a simpler way.

Mean Absolute Percentage Error (MAPE): This calculates the average percentage difference between our predictions and the actual values. It tells us how far off we are on average, in terms of percentage.

And here are the formulas, if you're curious:

MSE (Equation 1): take the difference between the predicted value (y) and the actual value (y_b), square that difference, do this for all items, and find the average by dividing by the number of items:

MSE = (1/n) * Σ (y - y_b)^2

MAE (Equation 2): take the absolute difference between the predicted value (y) and the actual value (y_b) ("absolute" means ignoring whether it is positive or negative), do this for all items, and find the average by dividing by the number of items:

MAE = (1/n) * Σ |y - y_b|

These calculations help us understand how accurate our predictions are compared to the real values we're trying to predict.

VII. Results and Analysis:

The hybrid DNN-LSTM model outperforms traditional DNN and RNN models in predicting sunspot activity. DNN-LSTM leverages both local and historical information, enhancing accuracy. The study compares different cost functions and identifies one that works best for the hybrid model.

VIII. Conclusion:

The paper proposes a hybrid DNN-LSTM model for predicting sunspot activity. This model, which combines local and historical information, demonstrates superior performance compared to traditional methods. The choice of cost function also impacts model performance positively. The hybrid model's ability to consider both local and historical data contributes to its effectiveness in predicting sunspot activity.
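The error measures used in this study (MSE, MAE, RMSE, MAPE) can be sketched in a few lines of Python. This is only an illustration: the "actual" and "predicted" sunspot numbers below are invented, not taken from the paper's data.

```python
# Toy implementation of the error measures described above.
# The actual/predicted values are invented for illustration.

def mse(actual, predicted):
    # Mean Squared Error: average of squared differences
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def mae(actual, predicted):
    # Mean Absolute Error: average of absolute differences
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    # Root Mean Squared Error: square root of the MSE
    return mse(actual, predicted) ** 0.5

def mape(actual, predicted):
    # Mean Absolute Percentage Error: average percentage deviation
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

actual = [55.0, 60.0, 48.0]      # hypothetical observed sunspot numbers
predicted = [50.0, 63.0, 50.0]   # hypothetical model outputs

print(mse(actual, predicted))    # average of 25, 9 and 4
print(mae(actual, predicted))    # average of 5, 3 and 2
print(rmse(actual, predicted))
print(mape(actual, predicted))
```

Because MSE squares each difference, the one large error (5) dominates it, which is exactly the "bigger mistakes count more" behaviour described above.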
V. Long Short-Term Memory (LSTM) Network:

LSTM networks are a type of recurrent neural network. They handle sequences by maintaining connections within and between layers, overcoming the vanishing gradient problem in longer sequences. LSTM has gates to control information flow and handle longer-term dependencies.

VI. Experiments:

In this research, the scientists looked at data about sunspots that goes back hundreds of years. They divided this data into two parts: one for teaching the models and another for testing how well they work. They made three different models: one is called DNN, another is RNN, and the third one is DNN-LSTM. They taught these models using the training data.

IX. References

1. Panagiotis Papaioannou, Ronen Talmon, Serafino Di, Ioannis Keramidas, Constantinos Sitos, Time Series Forecasting Using Manifold Learning, (2021).

2. Jie Zhang et al., Earth-affecting solar transients: a review of progress in solar cycle 24, Progress in Earth and Planetary Science, 8, (2021). 10.1186/s40645-021-00426-7.

3. Rajesh Singh, Anita Gehlot, Mahesh Prajapat, Bhupendra Singh, Deep Learning, (2021), 10.1201/9781003245759-5.

4. Samir Hamouda, Sunspots Production and Relation to Other Phenomena: A Review, (2020).

5. Mohammad Nazari-Sahrawian, Moses Kerouacian, Relationship between Sunspot Numbers and Mean Annual Precipitation: Application of Cross-Wavelet Transform - A Case Study, (2020), J. 3. 10.3390/j3010007.

6. H. Abdel-Rahman, Beshir Marzouk, Statistical method to predict the sunspots number, NRIAG Journal of Astronomy and Geophysics, 7, (2018). 10.1016/j.nrjag.2018.08.001.

7. S. O. Hasson, M. M. AL-Hashimi.
mesh technology. Sensors that use IEEE 802.15.4 based radio transceivers can function for several years even in harsh conditions without any external power.

While the advantages are numerous, some of the most important ones have been elaborated below:

d) Extended Range

Wireless technologies provide an extended range (9 kHz to 300 GHz) for data acquisition and communication, making the lengthy cables required for wired communication redundant.

e) Reduced Susceptibility to Environmental Factors

Wireless technologies also present reduced susceptibility to environmental factors that may disrupt communication, such as: unstable ground, lightning, natural/manmade disasters like earthquakes and cyclones, and accidental or orchestrated human interruption/damage to communication devices.

many advantages to electric utilities and service providers in the smart grid. The WiMAX communication network is also extensively used for monitoring, control and protection in SAS as well as rapid outage detection and restoration using two-way communication.

3. Wireless LAN

IEEE 802.11 based Wireless LAN (WLAN), also known as Wireless Ethernet, is one of the most widely deployed and efficient wireless data networks in the world (second only to cellular networks) [2]. It provides high-speed communication and offers several benefits over wired LAN as it is more cost efficient, mobile, and easy to install. IEEE 802.11b, commonly known as Wi-Fi, is a type of WLAN and can carry out data transfer at exceptionally high speeds. WLAN can be used to enhance distribution substation automation and protection as well as power line protection between two substations.

4. TVWS (TV White Space)
Association Rule

Association rule mining finds relations among variables in a database. Take an example: IF (A AND B) THEN C. This rule states that if A and B are present, then C is also present. Association rules have metrics that tell how often a given relationship occurs in the data.
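Two such metrics are support and confidence. A minimal sketch of how they are computed for the rule IF (A AND B) THEN C follows; the transactions are invented for illustration.

```python
# Toy support/confidence computation for the rule IF (A AND B) THEN C,
# matching the example in the text. The transactions are invented.

transactions = [
    {"A", "B", "C"},
    {"A", "B"},
    {"A", "C"},
    {"A", "B", "C"},
    {"B", "C"},
]

def support(itemset):
    # Fraction of transactions that contain every item in the itemset.
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    # Of the transactions containing the antecedent, the fraction
    # that also contain the consequent.
    return support(antecedent | consequent) / support(antecedent)

print(support({"A", "B", "C"}))       # {A, B, C} appears in 2 of 5 transactions
print(confidence({"A", "B"}, {"C"}))  # 2 of the 3 {A, B} transactions contain C
```

Here the rule holds in 2 of the 3 transactions that contain both A and B, so its confidence is 2/3, while its support over all 5 transactions is 0.4.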
detection can also detect previously unknown attacks and be used for defining the signatures for misuse detectors. The main problem with anomaly detection is that any deviation from the normal, even if it is a legitimate behavior, will be reported as an anomaly, thus producing a high rate of false positives.

…both legitimate and malicious classes. Training a classifier using such a file sample collection makes it possible to detect newly released malware. The effectiveness of data mining techniques for malware detection critically depends on the features which are extracted and the categorization techniques used.
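The idea of training a classifier on feature vectors from legitimate and malicious samples can be sketched with a hand-rolled nearest-centroid rule. This is only an illustration: the feature vectors, their meaning, and the labels below are invented; a real system would extract features such as byte sequences or API-call counts.

```python
# Toy nearest-centroid classifier over fixed-length feature vectors,
# illustrating training on legitimate and malicious samples.
# All feature vectors below are invented for illustration.

def centroid(vectors):
    # Component-wise mean of a list of equal-length vectors.
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist2(u, v):
    # Squared Euclidean distance between two vectors.
    return sum((a - b) ** 2 for a, b in zip(u, v))

# Hypothetical training data, e.g. normalized counts of suspicious API calls.
benign =    [[0.1, 0.2, 0.0], [0.0, 0.1, 0.1], [0.2, 0.0, 0.1]]
malicious = [[0.9, 0.8, 0.7], [0.8, 0.9, 0.9], [0.7, 0.7, 0.8]]

centroids = {"benign": centroid(benign), "malicious": centroid(malicious)}

def classify(sample):
    # Assign the label of the nearest class centroid.
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

print(classify([0.85, 0.9, 0.6]))  # near the malicious centroid
print(classify([0.05, 0.1, 0.2]))  # near the benign centroid
```

A new file whose feature vector resembles the malicious training samples is flagged even if its exact signature was never seen, which is what makes such classifiers useful against newly released malware.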
Misuse detection, also known as signature-based detection, identifies only known attacks based on examples of their signatures. It refers to the detection of attacks by looking for specific patterns, such as byte sequences in network traffic, or known malicious instruction sequences used by malware. This technique has a lower rate of false positives but can't detect zero-day attacks.

A hybrid approach combines anomaly and misuse detection techniques in order to increase the number of detected intrusions while decreasing the number of false positives. It doesn't build any models, but instead uses information from both harmful and clean programs to create a classifier – a set of rules or a detection model generated by the data mining algorithm. Then the anomaly detection system searches for deviations from the normal profile and the misuse detection system looks for malware signatures in the code.

Detection process

When using data mining, malware detection consists of two steps:

• Extracting features

• Classifying/clustering

Machine learning algorithms learn patterns from fixed-length feature vectors, and therefore feature extraction is the first step before using these algorithms for malware analysis. For features that are in the form of sequences, such as sequences of code bytes, operation codes, system calls, or any API calls, the creation of a representative feature vector is a nontrivial problem. Feature extraction can be performed by running static or dynamic analysis, with or without actually running the harmful software. A hybrid approach that combines static and dynamic analysis may also be used.

During classification and clustering, file samples are classified into groups based on feature analysis. To classify file samples, we need to build a classifier using classification algorithms such as Artificial Neural Network (ANN), Decision Tree (DT), Support Vector Machines (SVM) or Naive Bayes (NB). Clustering is used for grouping malware samples that share similar characteristics. Using machine learning techniques, each classification algorithm constructs a model that represents…

DATA MINING FOR INTRUSION DETECTION

Apart from detecting malware code, data mining can be effectively used to detect intrusions and analyze audit results to detect anomalous patterns too. Malicious intrusions may include intrusions into operating systems, networks, servers, web clients and databases.

There are two types of intrusion attacks we can detect using data mining methods:

• Host-based attacks, when the intruder focuses on a particular machine or a group of machines

• Network-based attacks, when the intruder attacks the entire network

Network-based defense systems control network flow with a network firewall, antivirus, spam filter and network intrusion detection techniques. Host-based defense systems control incoming data on a workstation with a firewall, intrusion detection techniques and antivirus installed on the host systems.

Conventional approaches to cyber defense are mechanisms designed into authentication tools, firewalls, and network servers that monitor, track, and block viruses and other malicious attacks. For example, the Microsoft Windows operating system has a built-in Kerberos cryptography system that protects user information. Antivirus software is designed and installed in personal computers and cyber infrastructures to ensure customer information is not used maliciously. These approaches create a protective shield for cyber infrastructure. Data-capturing tools, such as Solaris BSM for SUN, LINPAC for Linux, and WinPcap for Windows, capture events from the audit files of resource information sources (e.g., the network). Events can be host-based or network-based depending on where they originate. If an event originates in log files, it is treated as a host-based event; if it originates in network traffic, it is treated as a network-based event. A host-based event includes a sequence of commands executed by a user and a sequence of system calls launched by an application. A network-based event includes network traffic data, e.g., a sequence of TCP/IP network packets. The data-preprocessing module filters out the attacks for which good signatures have been learned.
energy consumption in fact is on a steady rise due to the innovation and emergence of new technologies and products. The field of healthcare and medicine also makes extensive use of technology, and the use of cloud computing in this field has increased over the years. These things have revolutionized the way the healthcare sector works. We can see the use of equipment powered by the latest technology, improved equipment for surgical work, being able to consult doctors remotely over the internet, online healthcare platforms and many other things. This has become possible due to technology and cloud computing. But at the same time, the healthcare field also faces challenges regarding how to make energy-efficient and safe use of such tools and techniques. A study also shows that healthcare plays a major role in the carbon footprint of countries like the USA, Australia, etc. The energy use of the healthcare field is estimated to grow even more in the coming years with the emergence of new technologies. So, we need to find a way to reduce the amount of energy consumed while still being able to utilize these resources. This has made way for green cloud computing, which basically means the use of cloud computing in an environment-friendly manner. Green cloud computing tries to provide eco-friendly and economically viable use of computing resources while still providing the same value to the users. Many companies worldwide have also started investing in the development of green cloud computing. These are all the reasons which have made green cloud computing necessary in today's world.

III. Approaches

Green cloud computing provides multiple solutions to alleviate the impact of cloud computing on the environment. Various approaches and techniques have been proposed for achieving green cloud computing, in mainly three ways: first is hardware optimization, which includes reducing the use of energy and making it economically efficient; second is software optimization, which includes developing ways to increase the efficiency of energy, storage and programs; and the last one is network optimization.

A. Virtualization

This is a very common approach used in green cloud computing. There are different types of virtualization, of which server virtualization plays a major role because servers play the most important role in a cloud data centre. In this approach multiple VMs (Virtual Machines) are assigned to a single server, which can be done through a software application. It helps in reducing a great deal of hardware and operating costs, consolidating tasks, and reducing energy use (power consumption) by switching off physical devices. Virtualization also allows moving a running virtual machine from one host to another with no downtime, and makes distributed power management possible.

B. Green Scheduler

The green scheduler, or green scheduling algorithm, determines which servers should be turned on and off. When the load increases, a server is turned on, and when the load decreases a server is turned off automatically. Since servers take time to load completely, they should be switched on before they are needed. The servers should also not be loaded beyond their capacity. This leads to a reduction in energy and power consumption at the data centre. It also helps in reducing the load on the servers at any instant of time, since at any given point of time the requisite number of servers is already on.

C. Datacentre Energy-Efficient Network-Aware Scheduling Algorithm

This algorithm helps in reducing the cost and operational expenses of data centres by minimizing the total energy consumption. It selects the best-fit resources for executing a particular task or problem on the basis of the load, taking into consideration the various components present at the data centre. Also, for managing the workload, a deadline-based model is employed which aims at completion of each task within a specified amount of time. Hence, it achieves workload efficiency by preventing the components (servers) from overloading and avoiding network congestion. However, there is a small increase in the number of servers that are running.

D. Nano Data Centres

This is a distributed computing platform. Nano data centres are a large number of data centres of smaller size, in contrast to normal data centres, which are large in size and fewer in number. The creation of nano data centres helps in reducing energy consumption by 30 percent. They are distributed around the world and are interconnected. They are portable and can be used anywhere, including remote locations or for temporary use. They help in the reduction of downtime along with a decrease in response time.
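The green scheduling policy described earlier — power servers on ahead of rising load, power them off as load falls, and never load a server past its capacity — can be sketched as follows. The per-server capacity and the load figures are invented for illustration; the one-spare-server policy is an assumed way of "switching servers on before they are needed".

```python
import math

# Toy green scheduler: keep just enough servers powered on for the
# current load, plus one pre-booted spare (servers take time to start,
# so one is switched on ahead of need). Figures are invented.

SERVER_CAPACITY = 100  # requests per second one server can handle (assumed)

def servers_needed(load):
    # Minimum servers that carry the load without overloading any single
    # server, plus one spare already booted for a sudden load increase.
    return max(1, math.ceil(load / SERVER_CAPACITY) + 1)

for load in [0, 40, 180, 550]:
    print(load, servers_needed(load))  # servers powered on at each load level
```

As the load rises from 0 to 550 requests per second, the number of powered-on servers grows from 1 to 7 and shrinks again when load drops, which is the energy saving the green scheduler aims for.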
E. Use of Tranquil PCs

There is a large amount of carbon emission from all the data centres using cloud computing architecture. The use of tranquil PCs reduces the carbon emission to approximately 60 kg per year, as opposed to the desktop PCs in the data centres, which consume approximately 400 KW of power and, even with the power-saving option, produce approximately 270 kg of carbon dioxide per year. Hence, the use of tranquil PCs helps in reducing carbon emissions.

Parameter | Per week | Per month | Per year
Reference desktop PCs: costs | 6.9 | 29.4 | 351.9
Reference desktop PCs: CO2 emissions | 4.3 | 18.1 | 218
Tranquil PCs: costs | 2.9 | 9.4 | 111.6
Tranquil PCs: CO2 emissions | 1.9 | 5.7 | 67.9

…for ensuring green cloud computing, along with knowledge of the latest green cloud computing equipment. The environmental factor includes instructing the cloud providers on standards and policies of green computing and sustainability. The TOE model aims at increasing efficiency and also reducing the carbon footprint due to the use of technology.

H. Dynamic Migration Algorithm

The main objective of this algorithm is to reduce power and energy consumption as well as carbon dioxide emissions. It helps in increasing the efficiency of resource utilisation. But the drawback is that it incurs higher migration costs and takes longer than usual to respond, i.e. a longer response time.

IV. MECHANISMS

Green cloud computing, also known as sustainable or environmentally friendly cloud computing, focuses on reducing the environmental impact of cloud services through various mechanisms. Here are some key mechanisms and strategies employed in green cloud computing:
…can lead to better resource utilization and reduced energy consumption.

…resources) can lead to more responsible and energy-efficient usage.
Reinforcement learning can be used to learn each time a scheduling takes place.

…cloud computing platforms. Since these IoT devices require low latency and mobility, edge computing for real-time services has emerged. Fog computing is a distributed computing model aimed at linking network devices at various levels of computation. It offers IoT devices a low-latency answer that centralised cloud computing
• Use of Electronic Medical Records (EMRs): EMRs are used by healthcare professionals to track, control, and coordinate health care delivery within a healthcare organisation. EMRs have the ability to minimise carbon dioxide emissions, according to estimates. Users revealed that by using an EMR, they were able to save thousands of pounds of

• Telemedicine: Telemedicine is the practice of medicine that uses technology to give treatment to people who are located far away. Telemedicine has been around for more than two decades, but its effects are only now becoming apparent, especially in rural areas. It helps to reduce carbon consumption as people can avoid travelling for expert referrals and other events. It can be used to help treat chronic conditions, optimise treatment for the sick, homebound, and physically challenged, and boost community and population wellbeing.

C. Green Parallel Computing of Big Data Systems

Big Data is usually structured around a distributed file system on top of which parallel algorithms for Big Data analytics can be run. The parallel algorithms can be mapped to the computing platform in a number of ways. In terms of environmentally related parameters such as energy and power usage, each choice would behave differently. Current research on the implementation of parallel computing algorithms has largely focused on addressing general computing metrics such as speedup over serial computing and efficiency of the use of computing nodes. We explore how to elicit green metrics for big data systems, which are necessary when comparing implementation options. We use current systematic literature reviews to define and address the key green computing indicators for big data systems.

VII. CHALLENGES IN GREEN COMPUTING IMPLEMENTATION

People feel that it is better to use traditional approaches because those are not very costly, but such a thought process is likely to backfire for humanity in the longer run. People need to be cautious about the impact their products are having on the environment, either directly or indirectly.

As more and more businesses are switching to the cloud, the amount of energy utilized by these cloud data centers is increasing at a rapid rate and is significantly contributing towards the carbon footprint. Green cloud computing provides a solution to this problem by reducing energy consumption and optimizing resource allocation. Powerful AI techniques are aiding the growth of green cloud computing. Green computing can be implemented in various fields like IoT and big data analytics. The public needs to be educated about the importance of green computing. Adopting green computing in the future will be extremely beneficial for the environment.

IX. REFERENCES

[1] Manoj Muniswamaiah, Tilak Agerwala and Charles C. Tappert, "Green computing for Internet of Things", 2020 7th IEEE International Conference on Cyber Security and Cloud Computing (CSCloud).

[2] J.M.T.I. Jayalath, E.J.A.P.C. Chathumali, K.R.M. Kothalawala, N. Kuruwitaarachchi, "Green Cloud Computing: A Review on Adoption of Green-Computing attributes and Vendor Specific Implementations", 2019.

[3] Mridul Wadhwa, Approv Goel, Tanupriya Choudhury, Ved P Mishra, "Green Cloud Computing – A Greener Approach To IT", 2019 International Conference on Computational Intelligence and Knowledge Economy (ICCIKE).
Abstract: In the last ten years, advances in computer technology and artificial intelligence have made driver monitoring systems much better. Many studies have gathered real data about drowsy drivers and used different algorithms and ways of combining information to make these systems work much better while driving. This paper gives an updated overview of the drowsy driver detection systems developed in the past ten years.

Keywords: drowsiness detection, biological measures, hybrid measures, image-based measures, vehicle-based measures.

1. Introduction
The article "Looking at New Ways to Detect Tired Drivers" surveys the newest improvements in technologies designed to find out if a driver is getting sleepy. It covers different methods, such as recognizing faces, tracking eye movements, and using sensors to watch important body signals. The article also shows how these systems use software to interpret the information and estimate how awake the driver is. By studying how drowsiness detection has changed over time, the article helps us understand how technology is making driving safer.

2. Drowsiness Signs and Stages
In the literature on systems that can tell if a driver is getting sleepy, different words are used for the same thing: "drowsiness" and "fatigue" are sometimes used interchangeably. "Fatigue" means not wanting to keep doing something because your body or mind is tired, or because you have been doing the same thing for a long time. "Sleepiness" or "drowsiness", on the other hand, means feeling like you want to go to sleep.

A driver doesn't suddenly start feeling tired; there are signs that show up first. Some of these signs are:

Difficulty keeping the eyes open.
Yawning often.
Blinking a lot.
Finding it tough to focus.
Moving the vehicle out of the lane and reacting slowly to traffic.
Nodding off or head dropping.

To carefully assess different levels of tiredness and make it easier to create systems that can detect drowsiness early, we need a clear way to measure how tired someone is. Many scales have been suggested for this; one is shown below.

Scale  Verbal Description
01     Extremely alert
02     Very alert
03     Alert
04     Fairly alert
05     Neither alert nor sleepy
06     Some signs of sleepiness
07     Sleepy, but no effort to keep alert
08     Sleepy, some effort to keep alert
09     Very sleepy, great effort to keep alert
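A 9-point sleepiness scale of this kind (its wording matches the Karolinska Sleepiness Scale) can be collapsed into coarse labels that a detector could train on. The cutoff of 7 and above for "drowsy" is a common convention in the literature, used here as an assumption rather than a claim from this paper.

```python
# Mapping a 1-9 sleepiness scale to coarse labels for a detector.
# The >= 7 "drowsy" cutoff is an assumed convention, not from this paper.

KSS_DESCRIPTIONS = {
    1: "Extremely alert", 2: "Very alert", 3: "Alert",
    4: "Fairly alert", 5: "Neither alert nor sleepy",
    6: "Some signs of sleepiness",
    7: "Sleepy, but no effort to keep alert",
    8: "Sleepy, some effort to keep alert",
    9: "Very sleepy, great effort to keep alert",
}

def label(score: int) -> str:
    """Collapse the 9-point scale into coarse classes."""
    if not 1 <= score <= 9:
        raise ValueError("score must be in 1..9")
    if score <= 5:
        return "alert"
    return "drowsy" if score >= 7 else "borderline"

print(label(3), label(6), label(8))  # alert borderline drowsy
```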
Accuracy = (number of correct predictions) / (total number of predictions)
         = (TP + TN) / (TP + TN + FP + FN)

Precision = TP / (TP + FP)

Sensitivity = TP / (TP + FN)
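These metrics follow directly from the four confusion-matrix counts. The counts in the example below are hypothetical, chosen only to show the arithmetic.

```python
# The evaluation formulas above, computed from confusion-matrix counts.

def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    return (tp + tn) / (tp + tn + fp + fn)

def precision(tp: int, fp: int) -> float:
    return tp / (tp + fp)

def sensitivity(tp: int, fn: int) -> float:  # also called recall
    return tp / (tp + fn)

# Hypothetical detector results: 80 drowsy frames caught, 90 alert frames
# correctly passed, 10 false alarms, 20 missed drowsy frames.
print(accuracy(80, 90, 10, 20))   # 0.85
print(precision(80, 10))          # ~0.889
print(sensitivity(80, 20))        # 0.8
```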
They use the corners of the eyes to tell whether someone is yawning, applying methods that aggregate cold and hot regions in the thermal images. Finally, when the system's classifier decides that the driver may be too tired, it triggers an alarm.

They tested their system with three different sets of pictures and videos. The first set, simple pictures with a plain background, was classified correctly about 95% of the time; the second set, with more complex pictures, was accurate about 70% of the time.
Bamidele and their team introduced a system that can tell whether someone is getting drowsy without being too intrusive. They used technology to track the state of the person's face and eyes. For their study, they had a collection of videos from NTHUDDD Computer …

… picked out 34 features from the eye signals, collected from overlapping sections of the eye signals of different lengths. They also evaluated how well the system worked by trying out different combinations of features and lengths of these signal sections.
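Extracting features from overlapping signal windows of the kind described can be sketched as below. The specific features (mean, standard deviation, eye-closure ratio) and the window sizes are illustrative assumptions, since the 34 features themselves are not listed here.

```python
# A sketch of overlapping-window feature extraction over an eye-state signal.
# The mean/std/closure-ratio features and the window sizes are assumptions,
# not the 34 features used in the study described above.

from statistics import mean, stdev

def windows(signal, length, step):
    """Yield overlapping slices of `signal` (overlap = length - step)."""
    for start in range(0, len(signal) - length + 1, step):
        yield signal[start:start + length]

def features(window):
    """Simple per-window features of an eye-openness signal in [0, 1]."""
    return {
        "mean": mean(window),
        "std": stdev(window),
        "closed_ratio": sum(1 for v in window if v < 0.2) / len(window),
    }

signal = [1.0, 0.9, 0.1, 0.0, 0.1, 0.8, 1.0, 0.9, 0.0, 0.1]
feats = [features(w) for w in windows(signal, length=6, step=2)]
print(len(feats))  # windows start at samples 0, 2 and 4
```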
… cover the face

Challenge                                                      Image-based   Biological   Vehicle-based
Driver's posture and distance from the dashboard               High          Low          N/A
Real-time video analysis                                       Medium        N/A          N/A
Driver movement                                                High          High         N/A
Noisy sensor measurements                                      Low           High         Low
Monitoring equipment and sensors inconvenience                 Low           Medium       Low
Influence of environmental conditions (weather/illumination)   High          Low          Medium
Influence of the road conditions and geometry                  Low           Low          High
Hardware complexity and limitations                            Low           High         Low
Drowsiness signs extraction precision                          Low           Low          High
Testing under real (not simulated) driving conditions          Medium        Medium       Medium

5. Discussion
After carefully studying existing research, it is clear that there are many different ways to detect drowsiness and prevent potential dangers while driving. Moreover, progress in technology, especially in artificial intelligence, has helped overcome several difficulties that these systems used to face, making them work better. In this part, we look at different Drowsy Driver Detection (DDD) systems, considering how well they work in real situations and how dependable they are, based on what has been written in various studies. We also discuss the four methods mentioned earlier that are used to spot drowsiness.

6. Future Trends in Drowsiness Detection Systems
Researchers have suggested using mobile phones as a cost-effective option for gathering driving-related information. Modern mobile phones come with at least two cameras and multiple sensors, and they can link up with additional sensors using Bluetooth or other wireless technologies. Placed on the dashboard, a mobile phone's front camera can record different visual aspects, such as the way the eyes and mouth behave, as well as head movements. Additionally, the rear camera can identify features related to the vehicle, such as when it starts to drift from its lane or changes its orientation.

CONCLUSION
In the past ten years, the field of drowsiness detection has seen significant progress, thanks to advancements in technologies like the Internet of Things (IoT), smaller sensors, and artificial intelligence. This paper offers a thorough and current overview of drowsiness detection systems developed over the past decade. It outlines the main strategies used in creating these systems and organizes them into four groups based on the kinds of indicators they use to identify drowsiness: systems that rely on images, biological signals, vehicle data, and a mix of these approaches. The paper goes into detail about each of these systems, discussing the features they utilize, the AI methods they implement, the datasets they work with, and their resulting accuracy, sensitivity, and precision.

REFERENCES
National Highway Traffic Safety Administration. Drowsy Driving. Accessed 10 May 2021. Available online: https://www.nhtsa.gov/risky-driving/drowsy-driving
National Institutes of Health. Drowsiness. Accessed 10 May 2021. Available online: https://medlineplus.gov/ency/article
National Safety Council. Drivers are Falling Asleep Behind the Wheel. Accessed 10 May 2021. Available online: https://www.nsc.org/road-safety/safety-topics/fatigued-driving
National Sleep Foundation. Drowsy Driving. Accessed 10 May 2021. Available online: https://www.sleepfoundation.org/articles/
India, with its vast population and diverse cultural landscape, has emerged as a global hub for social media users. The proliferation of smartphones, increasing internet penetration, and a youthful demographic have contributed to the rapid growth of social media usage in the country. As of September 2021, India boasted one of the largest user bases on various social media platforms.

One of the defining features of social media usage in India is the prevalence of regional languages. While English is commonly used, a significant portion of content is created and consumed in languages like Hindi, Bengali, Tamil, Telugu, and many others. This linguistic diversity has prompted platforms to cater to these languages, making social media more inclusive and accessible.
… information is shared, consumed, and interacted with, offering both opportunities and challenges. Here are some key points to consider regarding …

4. Real-Time Communication:

5. Information Dissemination:

7. Challenges of Misinformation:
- The rapid spread of information on social media can also lead to the rapid spread of misinformation, fake news, and rumors.
8. Targeted Messaging:
- This effort bridges the digital divide and ensures equal access to communication and information.

… often involves international connectivity through undersea cables and satellite communication.

4. 4G and 5G Rollout:
ABSTRACT

In the contemporary era driven by technology and interconnectedness, it has become imperative to comprehend the concept of cybersecurity and employ it adeptly. Inadequate security measures can render systems, critical files, data, and other virtual assets susceptible to risks. Irrespective of whether an entity operates within the realm of information technology or not, every organization must prioritize safeguarding its digital landscape. As novel advancements emerge in the field of cybersecurity, malicious actors also remain proactive, continuously refining their hacking methodologies and targeting vulnerabilities prevalent across diverse businesses.

The significance of cybersecurity is underscored by the fact that military, governmental, financial, medical, and corporate entities amass, process, and store unprecedented volumes of data on computers and similar devices. A substantial portion of this data comprises sensitive information, ranging from financial records and intellectual property to personal particulars. Unauthorized access to or familiarity with such data could yield detrimental consequences.

INTRODUCTION

A robust approach to cybersecurity incorporates multiple tiers of protection distributed throughout the networks, computers, applications, and data intended to remain secure. Within a given environment, various elements including processes, individuals, and tools must synergize to establish a comprehensive defense against cyber threats. A cohesive strategy for managing potential threats can streamline integrations across specific Cisco Security products, expediting crucial security operations such as identification, analysis, and mitigation.

People

Individuals should recognize the importance of adhering to fundamental principles of information security, including the choice of robust passwords, exercising caution when dealing with email attachments, and regularly backing up data. It is imperative to further educate oneself about essential tenets of cybersecurity.

Processes

Government entities need a comprehensive framework for effectively addressing both attempted and successful cyber-attacks. Following established frameworks can provide valuable guidance in this regard. Such frameworks outline strategies for identifying cyber intrusions, fortifying organizational defenses, detecting and responding to potential threats, and learning from past security breaches to enhance future resilience.

Technology

Technology plays a crucial role in equipping individuals and organizations with the necessary tools for safeguarding themselves against cyber-attacks. The primary targets at risk encompass three key components: endpoint devices such as computers, mobile devices, and routers; network systems; and cloud infrastructures. Commonly utilized technological solutions for fortifying these components include advanced firewalls, DNS filtering, malware detection systems, antivirus software, and email security solutions.

The term "cyber" pertains to anything related to a network of computers or the internet, whereas "security" signifies the process of safeguarding entities. Therefore, the amalgamation of "cyber" and "security" has been coined to describe the strategies for shielding user data from malicious attacks that might lead to breaches in security. This concept
emerged as the internet began to proliferate. Cybersecurity empowers societies and users to shield critical data against unauthorized access. While it encompasses counteracting hacking attempts, it also employs ethical hacking methodologies to fortify cybersecurity within various structures.

Our reliance on critical infrastructures like power plants, hospitals, and financial institutions underscores the necessity of securing these entities to maintain the functionality and trustworthiness of our society. Moreover, the contributions of cyber threat investigators, exemplified by the 250-strong team at Talos, play a pivotal role. They actively research emerging threats and cyber attack strategies, uncover vulnerabilities, raise public awareness about cybersecurity issues, and bolster open-source resources. Through their efforts, they contribute to making the internet a safer space for all.
Cyber Attacks
Information Technology Act, 2000

Information Technology Amendment Act, 2008 (ITAA)

The amendments in the IT Act cover:
● Data Privacy
● Information Security
● Definition of Cyber Cafe

Impact and Severity of Cyber Attacks

The ramifications of cyber-attacks can affect an …

The aftermath of a cybersecurity incident might continue to exert an influence on your business for weeks, and even months, following the event. The following five areas outline potential adverse impacts that your business could endure:

1. Financial losses
2. Loss of productivity
3. Reputation damage
4. Legal liability
5. Business continuity problems
… daily lives are the most vulnerable. Hackers frequently target businesses that keep sensitive data or personally identifying information. Businesses and organizations of the following kinds are particularly vulnerable to cyberattacks: banks and other financial institutions, which keep information on bank accounts, credit cards, and client or customer personal data; and healthcare institutions, whose repositories hold clinical research data, patient records, and patient data such as social security numbers, billing addresses, and insurance claims.

How to Reduce the Risk of Cyber Attacks

Minimize Data Transfers
The inevitability of data transfers between personal and business devices arises from the increasing prevalence of remote work among employees. Nevertheless, retaining sensitive data on personal devices significantly elevates the vulnerability to cyber attacks.

Exercise Caution When Downloading
Downloading files from unverified sources can expose your systems and devices to security risks. It is imperative to obtain files exclusively from reliable sources and avoid unnecessary downloads, thereby reducing your device's susceptibility to malware.

Enhance Password Security
Password strength is the first line of defense against a range of attacks. Employing combinations of symbols devoid of conventional meanings, changing passwords regularly, and refraining from writing down or sharing passwords are critical measures for safeguarding your sensitive data.

Keep Device Software Updated
Software providers consistently strive to enhance the security of their products. Regularly installing the latest software updates renders your devices less prone to attacks.

Conduct Data Leak Monitoring
Frequent monitoring of your data and the identification of any existing leaks are essential in …

Formulate a Breach Response Strategy
Even meticulously cautious companies can fall victim to data breaches. Establishing a comprehensive strategy for managing potential data breach incidents, including primary cyber attack response and recovery plans, empowers organizations of all sizes to respond effectively to real attacks and curtail potential damage.

It is evident that businesses continually face the threat of cybercrime and must take proactive measures to safeguard their data. Rather than waiting for the worst-case scenario, taking action today to avert future data breaches and their consequences is paramount. As with maintaining adequate cyber liability insurance, ensuring robust data protection is of utmost significance.

CONCLUSION

In conclusion, cybersecurity is an important field that protects computer networks, systems, and data from intrusion and destructive activity. It includes a range of tactics, tools, and security precautions designed to stop, catch, and deal with online threats. Organizations and individuals may protect their digital assets and uphold trust in the digital sphere by putting strong security measures in place.
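The "robust passwords" guidance discussed in this paper can be illustrated with a minimal strength check. The specific criteria below (a minimum length of 12 and mixed character classes) are common conventions assumed for the sketch, not rules stated in the paper.

```python
# A minimal password-strength check illustrating the password guidance.
# The criteria (length >= 12, mixed character classes) are illustrative
# conventions, not requirements from this paper.

import string

def password_issues(pw: str) -> list[str]:
    issues = []
    if len(pw) < 12:
        issues.append("shorter than 12 characters")
    if not any(c.islower() for c in pw):
        issues.append("no lowercase letter")
    if not any(c.isupper() for c in pw):
        issues.append("no uppercase letter")
    if not any(c.isdigit() for c in pw):
        issues.append("no digit")
    if not any(c in string.punctuation for c in pw):
        issues.append("no symbol")
    return issues

print(password_issues("password"))       # several issues reported
print(password_issues("xX7!mQ2#vL9p"))   # []
```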
INTRODUCTION
Bar Charts: Bar charts are a common way of displaying categorical data. In a bar chart, each category is represented by a bar, with the height of the bar indicating the frequency or proportion of that category in the data. Bar charts are useful for comparing several categories and seeing patterns over time.

Heat Maps: Heat maps are a type of graphical representation that displays data in a matrix format. The value of the data point that each matrix cell represents determines its hue. Heat maps are often used to visualize the correlation between variables or to identify patterns in …

Tree Maps: Tree maps are used to display hierarchical data in a compact format and are useful for showing the relationship between different levels of a hierarchy.

Box Plots: Box plots are a graphical representation of the distribution of a set of data. In a box plot, the median is shown by a …

2. Feature Analysis: Visualizations can help in understanding the significance and distribution of features, aiding in feature selection, engineering, and dimensionality reduction.

3. Model Evaluation: Visualizations assist in evaluating model performance by illustrating metrics like accuracy, precision-recall, and ROC curves, enabling better comparison and selection of models.

4. Anomaly Detection: Visual representations help in spotting anomalies or outliers in data, which can be crucial for identifying errors or unusual patterns that may impact the quality of a machine learning model.

5. Decision-Making: Visualizations facilitate better decision-making by presenting data-driven insights to stakeholders, enabling them to understand trends and make informed choices.

6. Data Preprocessing: Visualizations aid in data preprocessing tasks like data cleaning, imputation,
and transformation, making it easier to identify missing values, skewed distributions, and potential issues.

7. Interpretability: Visualizations enhance model interpretability by illustrating how the model arrives at predictions, allowing stakeholders to trust and understand the model's decisions.

8. Hyperparameter Tuning: Visualization tools help in visualizing hyperparameter tuning processes, assisting data scientists in identifying optimal parameter values for better model performance.

9. Time Series Analysis: Visualizations are crucial for exploring time-dependent data, helping in detecting trends, seasonal patterns, and anomalies in time series data.

10. Clustering and Segmentation: Visualizing clustering algorithms' results helps in understanding groupings within data and verifying the effectiveness of unsupervised learning techniques.

Challenges in Data Visualization

1. Data Complexity: Dealing with large, high-dimensional, or unstructured datasets can make it challenging to create informative and meaningful visualizations that capture all relevant aspects.

2. Overplotting: When too many data points overlap in a visualization, it can obscure patterns and make it difficult to discern individual data points or trends.

3. Choosing the Right Visualization: Selecting the appropriate type of visualization for the data and the intended message can be tricky, as different types of visualizations excel at conveying different types of information.

4. Color and Perception: Poor choice of colors can lead to misinterpretations or difficulty in distinguishing elements. Also, accounting for colorblindness and perceptual limitations is essential.

5. Data Preprocessing: Before visualization, data may need to be cleaned, transformed, or aggregated appropriately, which can be time-consuming and influence the final representation.

6. Data Integrity: Ensuring data accuracy and consistency is crucial, as inaccuracies or incomplete data can lead to misleading visualizations.

7. Interactivity and Interpretability: Balancing interactivity without overwhelming the user and ensuring that the interactive elements enhance understanding rather than confuse is a challenge.

8. Storytelling: Creating a narrative that effectively communicates insights through visualizations requires careful consideration of the order and arrangement of visualizations.

In this report, we presented a list of major challenges, provided by fourteen distinguished scientists who took part in a "virtual" panel as part of the BigVis 2020 Workshop. The report aimed at providing insights, new directions and opportunities for research in the field of Big Data visualization and analytics.
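The statistics a box plot encodes (the median and quartiles) and the usual 1.5 * IQR rule for flagging outliers can be computed directly. This is a generic sketch, not code from the report.

```python
# Box-plot statistics and 1.5*IQR outlier fences: the median, the quartile
# box, and points outside the whiskers flagged as outliers.

from statistics import quantiles

def box_stats(data):
    q1, q2, q3 = quantiles(data, n=4)        # quartiles; q2 is the median
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr  # whisker fences
    outliers = [x for x in data if x < lo or x > hi]
    return {"median": q2, "q1": q1, "q3": q3, "outliers": outliers}

data = [2, 3, 3, 4, 4, 4, 5, 5, 6, 30]       # 30 is an obvious outlier
stats = box_stats(data)
print(stats["median"], stats["outliers"])
```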
… Connectivity
… Notification System

II. Characteristics of Mobile OS for Smartphones

The smartphone market has very specific requirements that make it different from the markets for PCs and other mobile phones. Scaling down a PC OS to fit communication capabilities within a small and basic OS results in various fundamental compromises. The characteristics that shape the smartphone market are unique and call for a comprehensively designed OS.

Operating systems that can be found on smartphones include Google's Android, Apple's iOS, Research In Motion (RIM)'s BlackBerry OS, Microsoft's Windows Phone, Linux, HP's webOS, Samsung's Bada, and Nokia's MeeGo, among many others. Android, Bada, webOS, and Maemo are built on top of Linux, and iOS is derived from BSD and NeXTSTEP, which are all related to UNIX.
BlackBerry 7 OS (2011): Liquid Graphics for smoother animations and transitions; improved browser with faster HTML5 and JavaScript performance and optimized zooming and panning; voice-activated universal search across apps, contacts, and content.

Windows Phone 8 (2012): supported higher screen resolutions, including 720p and 1080p, for sharper and more detailed displays.

Android KitKat (v4.4, 2013): "OK Google" voice command for hands-free control; immersive mode allowing apps to use the entire screen.

iOS 7 (2013): complete visual overhaul with a flat, colorful design; Control Center for quick access to common settings and toggles.

BlackBerry 10 (2013): BlackBerry Hub, a centralized communication hub for messages, emails, and social updates; gesture-based interface with intuitive swipe navigation, emphasizing seamless multitasking and app flow.

Windows Phone 8.1 (2014): Wi-Fi Sense simplified connecting to Wi-Fi networks by automatically joining known networks and sharing network access with contacts.

iOS 8 (2014): interactive notifications allowing actions to be taken directly from notifications; HealthKit for health and fitness tracking.
172
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001
iOS 9 (2015): multitasking enhancements for iPad, including Slide Over and Split View.

iPhone 6s (2015): larger screen sizes (4.7 and 5.5 inches) with Retina HD displays; Near Field Communication (NFC) for Apple Pay contactless payments.

Android Marshmallow (v6.0, 2015): Google Now on Tap, providing context-sensitive information based on the content displayed on the screen.

Android Nougat (v7.0 to v7.1, 2016): improved notification system with bundled notifications and quick replies; multi-window support for running two apps simultaneously.

iOS 10 (2016): "Raise to Wake" feature to view notifications without pressing a button.

iPhone 7 / 7 Plus (2016): water and dust resistance (IP67 rating) for improved durability; dual rear cameras (iPhone 7 Plus) for improved zoom and depth effects.

iOS 11 (2017): Files app for better file management.

Android Oreo (v8.0, 2017): picture-in-picture mode for watching videos while using other apps; notification dots on app icons to display unread notifications.

iPhone 8 (2017): wireless charging with a glass back design and Qi support; A11 Bionic chip.

iOS 12 (2018): performance improvements, particularly on older devices; Screen Time feature for monitoring and managing device usage.

Android Pie (v9.0, 2018): Adaptive Battery and Adaptive Brightness for optimizing device usage.

iPhone XS and XS Max (2018): OLED Super Retina displays for enhanced brightness, contrast, and color accuracy; dual SIM support.

iOS 13 (2019): system-wide Dark Mode; Sign in with Apple for enhanced privacy during app sign-ins.

Android 10 (2019): enhanced privacy controls, including location access and app permissions.

iPhone 11 Pro (2019): triple-camera setup for ultra-wide, wide, and telephoto photography.

iOS 14 (2020): App Library for organizing and accessing apps more efficiently; widgets on the home screen for at-a-glance information.

Android 11 (2020): Conversations section in notifications for easier management of messaging apps.

iPhone 12 Pro (2020): Ceramic Shield front cover for improved durability; 5G connectivity for faster data speeds; MagSafe magnetic accessory system for easier attachment of cases, chargers, and other accessories.

Windows 10X (2021): optimized for dual-screen devices, offering a streamlined and adaptable interface for improved multitasking and touch experiences.

iOS 15 (2021): FaceTime enhancements, including spatial audio and SharePlay for shared media experiences.

Android 12 (2021): Material You design language, offering more personalized theming based on wallpaper colors; Privacy Dashboard and mic/camera indicators to enhance user privacy awareness.

iPhone SE (3rd generation, 2022): 5G support (sub-6 GHz), making it Apple's cheapest …
173
ISBN Number : 978-81-958673-8-7
KAKARAPARTI BHAVANARAYANA COLLEGE (Autonomous)
Sponsored by: S.K.P.V.V. Hindu High Schools Committee) Vijayawada – 520001
In the next section we focus our discussion on the various categories of Mobile OS and the market share they hold in global markets.

IV. Categories of Mobile OS

The various categories of Mobile OS include:
- Manufacturer-built proprietary OS,
- Third-party proprietary OS, and
- Free and open-source OS.

Examples include:
- Apple's iOS
- Android
- Windows Phone (discontinued)
- BlackBerry OS (discontinued)
- CyanogenMod / LineageOS
- Firefox OS (discontinued)

4.1 Manufacturer-Built Proprietary OS

Manufacturer-built proprietary operating systems are a distinct category of mobile operating systems utilized by certain device manufacturers such as Apple, RIM (BlackBerry), and HP. For instance, Apple's iOS powers the iPod Touch, iPhone, and iPad, offering a consistent and seamless user experience across all these devices. Similarly, RIM employs its proprietary BlackBerry OS across its line of smartphones and tablets, ensuring a unified look and feel, and HP employs Palm webOS for its Palm series of smartphones and tablets. These operating systems share a hallmark of providing a coherent user experience regardless of the specific device, akin to the way Mac OS X maintains uniformity across various Apple computers.

… diverse devices, akin to the consistent experience of Windows 7 across different computer brands.

4.3 Free and Open-Source OS

Free and open-source operating systems represent a dynamic category characterized by collaborative development efforts from companies, consortia, or developer communities, offering users the freedom to modify and install the OS on their chosen devices. Prominent examples include Android, Symbian, and the upcoming MeeGo, with Android standing out prominently. Manufacturers tailor these operating systems to optimize compatibility with their hardware, often introducing distinctive features or interfaces to distinguish their versions. A prime illustration of this is HTC's incorporation of the graphically enriched HTC Sense interface, elevating user interaction on its Android phones. Moreover, these open-source OS platforms offer an array of customizable options through installable software, allowing users to extensively alter the appearance, behavior, and overall experience, resulting in diverse user interfaces. The open-source nature of these operating systems not only facilitates manufacturer-driven modifications but also empowers independent developers to create custom versions, either for devices lacking official support or to pioneer novel user experiences on officially endorsed devices. This collaborative ethos encourages innovation, granting users a broad spectrum of choices and enabling the evolution of unique software adaptations beyond what is conventionally provided by device manufacturers.
of the smartphone-using community to 1.5 billion. The firm's market research also forecasted that the global market for smartphone applications and games, worth $130 billion in 2011, will reach $55.6 billion in 2022. Currently, mobiles outnumber PCs by 4:1, which represents even bigger chances for the mobile industry. Let us see the market share of the most popular Mobile OS in the following Table 2:

Figure 1: A Bar Chart Showing the Market Share of Various Mobile OS in 2022 Q4

Some important features of various popular Mobile OS were compared and are presented in Table 3, given below.
Table 3: Comparison of important features of various popular Mobile OS

Current Version: 4.3.5 (all other iOS devices) / 4.2.10 (CDMA) | 2.3.4 (Phones) / 3.2 (Tablets) | 6.5.3 | 7.10.7720.0 | 6.0.0 | 9.5 | 5 | 1.2

OS Family: Mac OS X | Linux | Windows CE 5.2 | Windows CE 7 | Proprietary Mobile OS | Mobile OS | Linux | RTOS or Linux

Supported CPU Architecture: ARM | ARM, MIPS, Power Architecture, x86 | ARM | ARM | ARM | ARM | ARM | ARM

Programmed in: C, C++, Objective-C | C, C++, Java | C++ | Many, .NET (Silverlight/XNA) | Java | C++ | C/C++ | C++

License: Proprietary EULA except for open source components | Free and open source prior to version 3 and closed source from version 3 | Proprietary | Proprietary | Proprietary | Eclipse Public License | Proprietary | Free and open source except closed source components

Non-English languages support: Yes | Limited | Limited (search is not diacritical mark insensitive) | Yes | Yes | Yes | Yes | Yes

Underlining spellchecker: Yes | No | ? | Yes | Yes | Yes | Yes | No

Search multiple internal applications at once: Yes | Yes | Yes | No | Yes | Yes | Yes | Yes

On-device encryption: Yes | No | Yes | No | Yes | ? | Yes | No

Desktop Sync: Yes | No, but often provided in manufacturer software | Yes | No | Yes | Yes | Yes | Yes

Default Web Browser/Engine: WebKit | WebKit | Trident | Trident | WebKit | WebKit | Gecko | WebKit

Tethering: Personal Hotspot (Wi-Fi Tethering) (carrier dependent, iPhone 4s since iOS 4.2.5/4.3, or with 3rd party software and "jailbreak") | USB (carrier dependent), Bluetooth, Mobile Wi-Fi Hotspot | USB, Bluetooth, Mobile Wi-Fi Hotspot (with 3rd party software) | Not officially; supported through homebrew software | USB, Bluetooth, Mobile Wi-Fi Hotspot (with 3rd party software) | microUSB, Bluetooth, Mobile Wi-Fi Hotspot | Bluetooth | microUSB, Bluetooth 3.0, Mobile Wi-Fi Hotspot

Interchangeable external memory cards: Only for photo/video import with an optional accessory | Yes | Yes | No | Yes

Multitasking: Yes | Yes | Yes | Tombstoning | Yes | 2+

Text/Document Support: Microsoft Office, iWork, PDF, Images, TXT/RTF, VCF | text files, PDF, Microsoft Office, HTML (with free 3rd party software) | Microsoft Office Mobile, PDF | Microsoft Office Mobile, PDF | Microsoft Office, PDF | Multiple office formats | PDF, djvu | Read only: text files, PDF, HTML, multiple office formats

Audio Playback: AAC (8 to 320 Kbps), Protected AAC (from iTunes Store), HE-AAC, MP3 (8 to 320 Kbps), MP3 VBR, Apple Lossless, AIFF, WAV | AAC LC/LTP, 3GPP, HE-AACv1 (AAC+), HE-AACv2 (enhanced AAC+), AMR-NB, AMR-WB, MP3 (Mono/Stereo 8-320 kbit/s constant or variable bit-rate), MIDI (MIDI Type 0 and 1, DLS Version 1 and 2), Ogg Vorbis, PCM/WAVE (8- and 16-bit linear PCM, rates up to limit of hardware), WAVE | MP3, AAC, AAC+, eAAC+, WAV, WMA pro, AMR-NB, MIDI | MP3, WAVE, WMA, eAAC+, MIDI | MP3, AAC, WMA, M4A, XMF, 3GA, MMF, AMR, MIDI, WAV, FlAC, OGG | All | All (some require optional debian packages)

Video Playback: H.263, H.264 AVC, MPEG-4 SP, DivX, XviD, VP8 [181] | H.263, H.264, WMV, MPEG-4, M-JPEG | MP4, WMV, H.264, MPEG4 @ HD 720p 30fps, DivX, Avid | H.263, H.264, MPEG4, WMV, DivX, Avid | WMV, MKV, Avid, 3gp | All (some require optional debian packages) | WMV, ASF, MP4, 3GP, AVI
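A feature matrix like the one above lends itself to simple programmatic comparison. The following minimal Python sketch hand-enters a small subset of widely known values (the browser-engine assignments here are common knowledge rather than read from the table; the dictionary layout and function name are illustrative assumptions):

```python
# Minimal sketch: represent part of a mobile-OS feature matrix and query it.
# The subset below is hand-entered from widely known values (e.g. iOS and
# Android ship WebKit-based default browsers; Maemo's browser uses Gecko).
features = {
    "iOS":            {"browser_engine": "WebKit",  "cpu": ["ARM"]},
    "Android":        {"browser_engine": "WebKit",  "cpu": ["ARM", "MIPS", "x86"]},
    "Windows Mobile": {"browser_engine": "Trident", "cpu": ["ARM"]},
    "Maemo":          {"browser_engine": "Gecko",   "cpu": ["ARM"]},
}

def with_engine(engine):
    """Return the OS names whose default browser uses the given engine."""
    return sorted(name for name, f in features.items()
                  if f["browser_engine"] == engine)

print(with_engine("WebKit"))  # ['Android', 'iOS']
```

Extending the dictionary with the remaining rows of Table 3 (license, encryption, supported formats, and so on) would allow the same kind of filtering across any of the compared criteria.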