
Artificial Intelligence Based Browser Cookies

ABSTRACT:

Building a web app that plugs into your browser and tracks activity intelligently can be a great idea for a startup. Artificial Intelligence, with its growing popularity and usage, has started to live everywhere. Browser-based AI offers benefits such as sentiment analysis, hand-gesture detection, and style transfer when it runs in the browser, and it can drop the need for background application programming interface requests to cloud-based resources. As a result, it simplifies and accelerates the end-to-end flow of AI apps. By developing such a web app, you can use these properties, together with cookies, to enhance the online user experience: the browser caches the user's activities and suggests options based on that behavior. Hence, offering users personalization in the browser can be an excellent idea.
INTRODUCTION:
A cookie is a file that is downloaded to your computer (or mobile device) in order to store data that can be updated and retrieved by the entity responsible for installing it. Our site uses cookies to obtain information on how the website is used. We use cookies to improve your navigation of the site, distinguish you from other users, improve the usability of the site, and identify problems so that we can improve the site.

In-house cookies: These cookies are sent to your computer and managed solely by us to ensure the better functioning of the Website. The information gathered is used to improve the quality of our service and enhance your user experience. These cookies remain in your browser for longer, allowing us to recognize you as a repeat visitor to the Website and adapt its content to offer you content in line with your preferences.

Third-party analytics cookies: Our Website also uses the traffic-measuring system Google Analytics, a Google web analysis tool that allows us to measure how users interact with our Website. It sets cookies on the domain of the site the user is visiting and uses them to gather information anonymously and to report Website trends without identifying individual users.

By browsing and remaining on our Website, you consent to the use of the above cookies for the time periods established and under the conditions contained in the Cookies Policy. The Cookies Policy may be updated, so we recommend that you review it every time you access our Website in order to be duly informed of how and why we use cookies.
ARCHITECTURE:

EXISTING SYSTEM:
Artificial intelligence is starting to live everywhere, especially in your browser, and browser-based AI has several advantages. Running AI in the browser can speed up some AI operations, such as sentiment analysis, user-search analysis, and cookie analysis, by executing them directly on the client. It can eliminate the need for background application programming interface requests to cloud-based resources, thereby simplifying and accelerating AI apps' end-to-end flow. It can also give the AI app direct access to rich data from client-side sensors such as webcams, microphones, GPS, and gyroscopes. It addresses privacy concerns by retaining browser-based AI data on the client. Not least, it brings AI within reach of the vast pool of web developers who work in JavaScript and other client-side languages, frameworks, and tools. What browser-side AI frameworks have in common is support for AI programming in various browser-side languages and scripts. They all support interactive modeling, training, execution, and visualization of machine learning, deep learning, and other AI models in the browser. They can all tap into locally installed graphics processing units and other AI-optimized hardware to speed up model execution. Many of them also provide built-in, pretrained neural-net models to speed up development of regression, classification, image recognition, and other AI-powered tasks in the browser. Among the leading AI vendors, Google has the most comprehensive tooling for helping developers build ML and DL apps, not just for the browser but across a growing range of client apps and devices, and it has made several important announcements in this area recently.

PROPOSED SYSTEM:
HTTP, the primary protocol used in web browsing to communicate with a web server, is an inherently stateless, sessionless computing experience. That means each page load, each request, is an independent event, unrelated to the events that come before or after it. This is fine for viewing a few documents that someone put on their server, but anything more complicated, like logging in and getting user-specific content, requires some kind of persistence mechanism: something that tells the server that the current request is related to the previous one, that they both come from the same person on the same computer. Cookies accomplish this. The server generates one the first time you visit a site, sends it to your browser, and your browser stores it. On subsequent page loads, the browser informs the server of the relevant cookies it is currently storing; the server reads them and knows that this is the same browser as before.

In-house cookies: These cookies are sent to your computer and managed solely by us to ensure the better functioning of the Website. The information gathered is used to improve the quality of our service and enhance your user experience. These cookies remain in your browser for longer, allowing us to recognize you as a repeat visitor to the Website and adapt its content to offer you content in line with your preferences.

Third-party analytics cookies: Our Website also uses the traffic-measuring system Google Analytics, a Google web analysis tool that allows us to measure how users interact with our Website. It sets cookies on the domain of the site the user is visiting and uses them to gather information anonymously and to report Website trends without identifying individual users.
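For concreteness, the cookie round trip described above can be sketched in Django, which is listed among the software requirements for this system. The snippet below is only an illustrative sketch, not the project's actual code; the view name and the "visitor_id" cookie key are assumptions.

```python
# views.py -- minimal sketch of the cookie round trip described above.
# The view name and the "visitor_id" cookie key are illustrative assumptions.
import uuid

from django.http import HttpResponse


def home(request):
    visitor_id = request.COOKIES.get("visitor_id")  # browser sends stored cookies back

    if visitor_id is None:
        # First visit: the server generates an identifier and asks the browser
        # to store it, so later requests can be linked to this one.
        visitor_id = uuid.uuid4().hex
        response = HttpResponse("Welcome, new visitor")
        response.set_cookie("visitor_id", visitor_id, max_age=60 * 60 * 24 * 365)
    else:
        # Repeat visit: the cookie identifies the same browser, so content
        # can be adapted to the stored preferences for this visitor.
        response = HttpResponse(f"Welcome back, visitor {visitor_id}")

    return response
```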

MODULES:
1. Searching Query:
You can find a specific word or phrase on a web page on your computer: open the page in the AI-based browser and type your search term in the bar that appears at the top right. Matching entries are then looked up in our database and the details are displayed.
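As a rough illustration of how a search term might be logged and matched against the database, consider the sketch below; the SearchQuery and Page models are hypothetical, not the project's documented schema.

```python
# Illustrative sketch only: SearchQuery and Page are hypothetical models,
# not the documented schema of this project.
from django.db import models
from django.shortcuts import render


class SearchQuery(models.Model):
    term = models.CharField(max_length=255)
    searched_at = models.DateTimeField(auto_now_add=True)


class Page(models.Model):
    title = models.CharField(max_length=255)
    body = models.TextField()


def search(request):
    term = request.GET.get("q", "").strip()
    matches = Page.objects.none()
    if term:
        SearchQuery.objects.create(term=term)                # log the query for later analysis
        matches = Page.objects.filter(body__icontains=term)  # naive substring match
    return render(request, "search_results.html", {"term": term, "matches": matches})
```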
2. History Monitoring:
The Monitoring History feature allows you to capture and view critical performance data from your cluster. Once the performance data has been collected, you can view it in the Monitoring History pages. The top-level Monitoring History page provides an overview of the performance metrics for all of the key resources in your cluster. For each resource, you can drill down for more detail. You can also adjust the time span of the viewed data and apply filters to view the data for selected resources, to compare them and spot exceptions. By default, the performance data is stored in the Meters database. Monitoring-history capture is enabled at the group level; typically you have one group per cluster. You can also configure a consolidated Meters database that captures performance metrics from multiple groups. The group configuration defines which database is used to store performance metrics for that group (defaulting to a shared Meters database per cluster), as well as all configuration parameters for performance metrics, such as the frequency of data capture and how long to retain the performance data. The Meters database can participate in all normal database replication, security, and failover operations.
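The capture-and-retain behaviour described above can be pictured with a small sketch; the table layout, capture interval, and retention window below are illustrative assumptions rather than the system's actual Meters configuration.

```python
# Minimal sketch of periodic metric capture with a retention window.
# Table name, capture interval, and retention period are illustrative assumptions.
import sqlite3
from datetime import datetime, timedelta

CAPTURE_INTERVAL_S = 60          # how often performance data would be sampled
RETENTION = timedelta(days=30)   # how long captured samples are kept


def init_db(path="meters.db"):
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS metrics ("
        " captured_at TEXT, resource TEXT, metric TEXT, value REAL)"
    )
    return con


def capture(con, resource, metric, value):
    # Store one sample with a timestamp so history can be charted later.
    con.execute(
        "INSERT INTO metrics VALUES (?, ?, ?, ?)",
        (datetime.utcnow().isoformat(), resource, metric, value),
    )
    con.commit()


def purge_old(con):
    # Enforce the retention window by deleting samples older than the cutoff.
    cutoff = (datetime.utcnow() - RETENTION).isoformat()
    con.execute("DELETE FROM metrics WHERE captured_at < ?", (cutoff,))
    con.commit()
```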

3. Graphical Representations:
The analyses of the proposed system are calculated based on approvals and disapprovals. These can be presented with graphical notations such as pie charts, bar charts, and line charts, and the underlying data can be supplied dynamically.
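For example, approval and disapproval counts could be rendered as pie and bar charts with matplotlib; the counts below are placeholder values, not real data.

```python
# Sketch: charting approvals vs. disapprovals; the counts are placeholder data.
import matplotlib.pyplot as plt

labels = ["Approvals", "Disapprovals"]
counts = [120, 45]  # would be queried from the database in the real system

fig, (ax_pie, ax_bar) = plt.subplots(1, 2, figsize=(8, 4))
ax_pie.pie(counts, labels=labels, autopct="%1.1f%%")   # share of responses
ax_pie.set_title("Share of responses")
ax_bar.bar(labels, counts)                              # absolute counts
ax_bar.set_title("Response counts")
plt.tight_layout()
plt.show()
```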

 ANALYSIS
Sentiment analysis is the contextual mining of text that identifies and extracts subjective information from source material, helping a business understand the social sentiment around its brand, product, or service while monitoring online conversations. However, analysis of social media streams is usually restricted to basic sentiment analysis and count-based metrics. This is akin to scratching the surface and missing the high-value insights that are waiting to be discovered.
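As a basic illustration of the kind of sentiment scoring meant here (the report does not name a library; NLTK's VADER analyzer is used below purely as an assumed example):

```python
# Minimal sentiment-scoring sketch using NLTK's VADER analyzer.
# The library choice is an assumption; the report does not specify one.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

analyzer = SentimentIntensityAnalyzer()

for text in ["I love this browser extension!", "The tracking feels intrusive."]:
    score = analyzer.polarity_scores(text)["compound"]  # -1 (negative) .. +1 (positive)
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"{label:>8}  {score:+.2f}  {text}")
```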
 GRAPH

A disguised interaction graph G′ is generated via some social graph model, minimizing the distance between the real and disguised interaction graphs. An interaction graph G is extracted from the input (real) social media data. An interaction graph represents how social network actors interact with each other [25], [26]. Entities and their interactions in social media are identified, and an interaction graph is built with a vertex set V, including entities, an edge set E representing interactions, and an attribute set A, which includes both vertex (entity) attributes and edge (interaction) attributes.
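A toy version of such an interaction graph, with a vertex set, an edge set, and vertex and edge attributes, could be built with networkx; the entities and attributes below are invented for illustration.

```python
# Toy interaction graph G = (V, E, A); entities and attributes are invented.
import networkx as nx

G = nx.Graph()

# Vertex set V with entity attributes
G.add_node("alice", kind="user")
G.add_node("bob", kind="user")
G.add_node("post_1", kind="post")

# Edge set E with interaction attributes
G.add_edge("alice", "post_1", interaction="comment", weight=3)
G.add_edge("bob", "post_1", interaction="like", weight=1)
G.add_edge("alice", "bob", interaction="mention", weight=2)

print(G.number_of_nodes(), "vertices,", G.number_of_edges(), "interactions")
```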

ALGORITHM:
SUPPORT VECTOR MACHINE

A Support Vector Machine (SVM) is a supervised machine-learning algorithm that can be used for both classification and regression challenges, though it is mostly used for classification problems. In this algorithm, we plot each data item as a point in n-dimensional space (where n is the number of features), with the value of each feature being the value of a particular coordinate. We then perform classification by finding the hyperplane that best differentiates the two classes. Support vectors are simply the coordinates of individual observations, and the SVM finds the frontier that best segregates the two classes (the hyperplane, or a line in two dimensions). More formally, a support vector machine constructs a hyperplane or set of hyperplanes in a high- or infinite-dimensional space, which can be used for classification, regression, or other tasks such as outlier detection. Intuitively, a good separation is achieved by the hyperplane that has the largest distance to the nearest training data point of any class (the so-called functional margin), since in general the larger the margin, the lower the generalization error of the classifier. Whereas the original problem may be stated in a finite-dimensional space, it often happens that the sets to discriminate are not linearly separable in that space. For this reason, it was proposed that the original finite-dimensional space be mapped into a much higher-dimensional space, presumably making the separation easier there.
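A short scikit-learn sketch illustrates the classification use described above; the toy dataset and kernel choice are assumptions for demonstration only.

```python
# Illustrative SVM classification with scikit-learn on a toy dataset.
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = datasets.load_iris(return_X_y=True)   # each row is a point in n-dimensional feature space
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf", C=1.0)   # RBF kernel maps data into a higher-dimensional space
clf.fit(X_train, y_train)        # fits the maximum-margin separating hyperplane(s)

print("test accuracy:", clf.score(X_test, y_test))
```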

FUTURE WORK:
The basic definition of privacy is having the power to seclude oneself, or information about oneself, in order to limit the influence others can have on our behavior. Privacy has traditionally been recognized as a prerequisite for the exercise of human rights such as freedom of expression, freedom of association, and freedom of choice. In the information age, privacy hinges on our ability to control how our data is stored, modified, and exchanged between different parties. With the advent of advanced internet-based data-mining techniques in recent decades, privacy has become a pertinent social issue. Social actors that regularly utilize these techniques, such as government agencies and corporations, are now in a position to identify, profile, and directly affect the lives of people without their consent. And with the emergence of increasingly sophisticated artificial intelligence systems, these privacy concerns have only been exacerbated.

REQUIREMENT ANALYSIS

The project involved analyzing the design of a few applications so as to make the application more user-friendly. To do so, it was important to keep navigation from one screen to the other well ordered, while at the same time reducing the amount of typing the user needs to do. To make the application more accessible, the browser version had to be chosen so that it is compatible with most browsers.

REQUIREMENT SPECIFICATION
Functional Requirements
- Graphical user interface for the user.

Software Requirements
For developing the application, the following software is required:
1. Python
2. Django
3. MySQL
4. mysqlclient
5. WampServer 2.4

Operating Systems Supported
1. Windows 7
2. Windows XP
3. Windows 8

Technologies and Languages Used to Develop
1. Python

Debugger and Emulator
- Any browser (particularly Chrome)

Hardware Requirements
For developing the application, the following hardware is required:
- Processor: Pentium IV or higher
- RAM: 256 MB
- Space on hard disk: minimum 512 MB

CONCLUSION:
Digital technologies such as AI have made substantial contributions to many areas of our lives. The vast quantities of information that we are able to gather and analyze through these tools allow us to tackle social ills that previously had no solutions. Unfortunately, these technologies can also be used against us by various social actors, from individuals to corporations to government agencies. Our loss of privacy is just one example of how technologies such as AI can work to our detriment. However, if we manage to properly understand these technologies and their impact on our daily lives, we will acquire the means to defend ourselves from exploitation by those who wield them with malicious intent.
