
Report

(DOCUMENT)

for
KNIME

By
Name:
Roll no:

1|Page
Table of Contents

1. Executive Summary..........................................................................................................4
2. Introduction.....................................................................................................................5
3. Data Pre-processing and Feature Extraction....................................................................6
4. Experiment.......................................................................................................................7
5. Result Analysis.................................................................................................................8
6. Evaluation and Conclusion...............................................................................................9

1. Executive Summary

The selected information comes from the dataset: I chose explicit columns from a prepared table of data, and I then demonstrate these selections in this report. The database connector node takes the connection as input and allows a table to be selected. The node outputs a database data object that contains the connection as well as a database query defining the data in the database.

I used the String Manipulation node to clean the data, which was supplied in inconsistent groups; in particular, I used a compare-style expression so that values are cleaned and matched reliably across the dataset.

We derive conditions that simplify the fundamental model structure and discuss how the key model can be maintained and used by managers for further assessment, process analysis, and organizational planning. As a result, we conclude that the workflow filtered the given dataset and that the filtered data can be used for many purposes; the screenshot shows the final results.

Once you have cleaned and suitably prepared your data, you can proceed to set up the machine-learning model. Many algorithms are available in KNIME Analytics Platform: classic and modern algorithms, supervised and unsupervised methods, algorithms from statistics and from the machine-learning community, algorithms that predict numeric values or nominal classes, and algorithms that analyze patterns, require a past time series, or work from just a random sample of the data.
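The cleaning step above is done with KNIME's String Manipulation node; as a rough sketch of the same idea in plain Python, values can be trimmed, normalized, and compared like this (the column values and rules here are illustrative assumptions, not taken from the actual dataset):

```python
def clean_value(raw: str) -> str:
    """Trim surrounding whitespace and normalize casing for one cell."""
    return raw.strip().lower()

def values_match(a: str, b: str) -> bool:
    """Compare two cells after cleaning, mirroring a compare-style rule."""
    return clean_value(a) == clean_value(b)

if __name__ == "__main__":
    raw_column = ["  Red ", "red", "BLUE", " blue"]
    print([clean_value(v) for v in raw_column])  # ['red', 'red', 'blue', 'blue']
```

After this step, values that differ only in whitespace or casing compare as equal, which is what makes later grouping and filtering reliable.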

2. Introduction
I selected the logistic regression modeling technique for training because it suits this type of dataset well. In a KNIME workflow, communication is one-way: previous nodes send data and the receiving node consumes it.

The linear regression technique shows the output of the dataset; the result view of this dataset is shown here. The view confirms that the model has been applied, fits this dataset as desired, and keeps the whole workflow consistent. The data and the output model are used step by step throughout this work. In this document, we set out a small-scale guide that measures and displays the model, showing how it can be used to surface information and support business decisions.
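As a minimal sketch of the linear regression being reported (not KNIME's actual implementation), ordinary least squares for a single predictor can be computed in closed form; the data here is an invented example where the fit is exact:

```python
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing the sum of squared errors."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

if __name__ == "__main__":
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.0, 4.0, 6.0, 8.0]   # y = 2x exactly, so the fit is perfect
    print(fit_line(xs, ys))     # (2.0, 0.0)
```

On real, noisy data the slope and intercept would only approximate the relationship, which is what the result view in KNIME summarizes.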

3. Data Pre-processing and Feature Extraction


Here is a report screenshot for this dataset; the report is well defined and its predictions are correct. The figure shows the report, and the final results can be seen in this screenshot.

In practice, once I have finished one of these workflows, the real justification for all the trouble of saving its parts for future use is that I can return to it later and see what I did and how I did it. A few workflows are kept under version control, and some end up being used again. When one of these workflows enters production and becomes part of my regular work, I like to retain the ability to verify that it is really doing what it should. This is especially true when something in my environment changes, e.g. moving to another version of KNIME, switching to another computer, or updating some of the associated extensions or KNIME Labs features that end up woven into my workflow. I use genuine test cases for this, because the workflow itself usually provides the comparison I need to validate my work. This section is about doing exactly that.


4. Experiment
Here is the linear regression model that I built for this dataset; it predicts the data correctly, and the screenshot shows the final results. Building such a model involves several stages, among them training, testing, and further refinements, all completed before the model moves into production. This shows that the most convenient place to cover these stages is KNIME Analytics Platform, where a Learner node and a Predictor node produce the data.
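The Partitioning, Learner, and Predictor stages described above can be sketched as three plain functions. This is a hedged illustration of the stage structure only: the "model" here is deliberately trivial (it predicts the majority class seen in training), and the data is invented:

```python
import random

def partition(rows, train_fraction=0.7, seed=42):
    """Split rows into train and test sets, like KNIME's Partitioning node."""
    shuffled = rows[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

def learner(train_rows):
    """'Train' a majority-class model from (features, label) rows."""
    labels = [label for _, label in train_rows]
    return max(set(labels), key=labels.count)

def predictor(model, test_rows):
    """Apply the trained model to every test row, like the Predictor node."""
    return [model for _ in test_rows]

if __name__ == "__main__":
    data = [((i,), "yes" if i % 3 else "no") for i in range(30)]
    train, test = partition(data)
    model = learner(train)
    predictions = predictor(model, test)
    hits = sum(p == label for p, (_, label) in zip(predictions, test))
    print(model, hits, "of", len(test))
```

In the actual workflow, `learner` and `predictor` correspond to the regression Learner and Predictor nodes, and the accuracy check corresponds to a Scorer.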

This figure shows the report of the dataset; the final results can be seen in this screenshot.

5. Result Analysis
The output parameters are shown below in the pie chart, where the red, blue, green, and yellow slices show each category's percentage; the final results can be seen in this screenshot. After choosing the required algorithm, we initialize the model structure and its parameters. From there we can work through the whole process, from the initial survey of the data to building meaningful queries and checking samples. In the same way, the theory is reviewed before the estimation so that it can be performed well.
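The percentages behind such a pie chart come from counting each output category's share of the result column. As a sketch (the category names below are assumptions matching the chart colors, not the actual dataset values):

```python
from collections import Counter

def category_percentages(values):
    """Map each category to its share of the column, in percent."""
    counts = Counter(values)
    total = len(values)
    return {cat: 100.0 * n / total for cat, n in counts.items()}

if __name__ == "__main__":
    results = ["red"] * 5 + ["blue"] * 3 + ["green"] * 1 + ["yellow"] * 1
    print(category_percentages(results))
    # {'red': 50.0, 'blue': 30.0, 'green': 10.0, 'yellow': 10.0}
```

Each slice of the chart is then drawn proportionally to its percentage.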


6. Evaluation and Conclusion


In conclusion, the workflow filtered the given dataset, and the filtered data can be used for many purposes; the final results are shown in the screenshot. Once the data had been cleaned and suitably prepared, the machine-learning model was set up using the algorithms available in KNIME Analytics Platform, which range from classic to modern, supervised to unsupervised, and statistical to machine-learning methods. Training, testing, and further refinements were completed before the model was moved into production, with the Learner and Predictor nodes covering these stages in KNIME Analytics Platform.
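The evaluation step after the Predictor, which in KNIME is typically done with a Scorer node, amounts to comparing predicted classes against actual classes. A minimal sketch, with hypothetical labels:

```python
from collections import Counter

def confusion_counts(actual, predicted):
    """Count (actual, predicted) pairs, i.e. the cells of a confusion matrix."""
    return Counter(zip(actual, predicted))

def accuracy(actual, predicted):
    """Fraction of rows where the prediction matches the actual class."""
    matches = sum(a == p for a, p in zip(actual, predicted))
    return matches / len(actual)

if __name__ == "__main__":
    actual    = ["yes", "yes", "no", "no", "yes"]
    predicted = ["yes", "no",  "no", "no", "yes"]
    print(confusion_counts(actual, predicted))
    print(accuracy(actual, predicted))  # 0.8
```

The confusion counts show where the model errs (here one "yes" predicted as "no"), while the accuracy summarizes overall correctness in a single number.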


