Individual Assignment
NAME IDNO
Technology has grown so rapidly in recent years that there has been a steadily
increasing demand for bright graduates to come in and help transform areas ranging from data
infrastructure to cyber security.
Charles Babbage designed the Analytical Engine, and it is this design on which the basic
framework of today's computers is based.
1983: The CD-ROM hits the market, able to hold 550 megabytes of pre-recorded data. That same
year, many computer companies worked to set a standard for these disks so that they could be
used freely to access a wide variety of information.
1993: In an attempt to enter the handheld computer market, Apple releases the Newton. Billed as
a “Personal Digital Assistant”, it never performed the way Apple President John Sculley had
hoped, and it was discontinued in 1998.
2011: Google releases the Chromebook, a laptop that runs on Google Chrome OS.
Also in 2011, the Nest Learning Thermostat emerges as one of the first Internet of Things
devices, allowing remote access to a user’s home thermostat from a smartphone or tablet. It
also sends monthly power-consumption reports to help customers save on energy bills.
2012: On October 4, Facebook hits 1 billion users. The same year, Facebook acquires the
image-sharing social networking application Instagram.
Also in 2012, the Raspberry Pi, a credit-card-sized single-board computer, is released, weighing
only 45 grams.
2014: The University of Michigan Micro Mote (M3), the smallest computer in the world, is
created. Three types were made available, two of which measured either temperature or pressure,
and one that could take images. Additionally, the Apple Pay mobile payment system is
introduced.
2015: Apple releases the Apple Watch, which incorporates Apple’s iOS operating system and
sensors for environmental and health monitoring. Almost a million units were sold on the day of
its release.
2019: Apple announces iPadOS, the iPad's very own operating system, to better support the
device as it becomes more like a computer and less like a mobile device.
I don’t have the answer to what awaits us with regard to computers. One thing is for sure: in
order to keep up with the world of tech, the growing need for cyber security, and our constant
appetite for the next big thing, computers aren’t going anywhere.
General concept:
Description of a particular field of statistics in terms of its content and scope, actors and
units, concepts and definitions, classifications, relationships between elements and links
to other statistical frameworks.
It may also include indicators, sources, methods and/or model surveys.
3. Hypothesis testing
What Is Hypothesis Testing?
The purpose of hypothesis testing is to determine whether there is enough statistical evidence in favor of
a certain belief, or hypothesis, about a parameter.
Hypothesis testing is used to assess the plausibility of a hypothesis by using sample data.
Such data may come from a larger population, or from a data-generating process.
A hypothesis test leads to one of two conclusions: either the difference between the sample
and the population can be explained by sampling error (there does not appear to be a treatment
effect), or the difference between the sample and the population is too large to be explained
by sampling error (there does appear to be a treatment effect).
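The two possible conclusions described above can be illustrated with a small worked example. The sketch below runs a one-sample z-test under a normal approximation; the sample values and the hypothesized mean of 50 are hypothetical, chosen only for illustration.

```python
# A minimal sketch of a one-sample z-test (hypothetical data; the
# hypothesized population mean of 50 is an illustrative value).
from statistics import NormalDist, mean, stdev
from math import sqrt

sample = [52.1, 49.8, 53.4, 51.2, 50.9, 54.0, 52.7, 48.9, 53.1, 51.5]
mu0 = 50.0          # H0: the population mean is 50
n = len(sample)

# Standardize the observed difference by the estimated standard error.
se = stdev(sample) / sqrt(n)
z = (mean(sample) - mu0) / se

# Two-sided p-value under the normal approximation.
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

alpha = 0.05
decision = ("difference too large to attribute to sampling error"
            if p_value < alpha
            else "difference explainable by sampling error")
print(f"z = {z:.2f}, p = {p_value:.4f}: {decision}")
```

Here the small p-value leads to the second conclusion: the sample mean differs from the hypothesized mean by more than sampling error alone would plausibly produce.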
4. Measures of association
There are certain points that a researcher should know in order to better understand the measures of
statistical association.
First, the researcher should know that measures of association are not the same as measures
of statistical significance. It is possible for a weak association to be statistically significant; it is also
possible for a strong association to not be statistically significant.
For measures of association, a value of zero signifies that no relationship exists. In a
correlation analysis, if the coefficient (r) has a value of one, it signifies a perfect relationship
between the variables of interest. In regression analyses, if the standardized beta weight (β)
has a value of one, it also signifies a perfect relationship between the variables of interest.
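As a concrete illustration of these scale endpoints, the sketch below computes the correlation coefficient r directly from its definition for two hypothetical data sets: one with a perfect linear relationship (r = 1) and one with only a weak relationship.

```python
# Pearson's r computed from its definition; the x and y values are hypothetical.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

x = [1, 2, 3, 4, 5]
r_perfect = pearson_r(x, [2, 4, 6, 8, 10])  # perfect linear relationship
r_weak = pearson_r(x, [5, 1, 4, 2, 3])      # weak relationship

print(f"r (perfect) = {r_perfect:.2f}, r (weak) = {r_weak:.2f}")
```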
Measures of Association
A measure of association quantifies the relationship between exposure and disease among
the two groups. Exposure is used loosely to mean exposure to foods, mosquitoes, a partner
with a sexually transmissible disease, or a toxic waste dump, among other factors.
Measures of association provide a means of summarizing the size of the association between two
variables.
Most measures of association are scaled so that they reach a maximum numerical value
of 1 when the two variables have a perfect relationship with each other.
The researcher calculates the observed value of the measure of association, and if the
measure is different enough from 0, the test shows that there is a significant relationship
between the two variables.
Measures of association based on the chi-square statistic
That statistic is a function not
only of the size of the relationship between the two variables, but also of the sample size
and the number of rows and columns in the table. This statistic can be adjusted in various
ways, in order to produce a measure of association.
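One common adjustment of this kind is Cramér's V, which rescales the chi-square statistic by the sample size and the smaller table dimension so the result falls between 0 and 1. The sketch below computes it for a hypothetical 2x2 contingency table.

```python
# A sketch of adjusting the chi-square statistic into Cramér's V,
# a measure of association scaled to lie between 0 and 1.
# The 2x2 contingency table below is hypothetical.
from math import sqrt

table = [[30, 10],
         [10, 30]]   # rows: exposed / unexposed; cols: disease / no disease

def cramers_v(table):
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n   # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    k = min(len(table), len(table[0]))          # smaller table dimension
    return sqrt(chi2 / (n * (k - 1)))

v = cramers_v(table)
print(f"Cramér's V = {v:.2f}")
```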
Measure of association, in statistics, any of various factors or coefficients used to
quantify a relationship between two or more variables. Measures of association are used
in various fields of research but are especially common in the areas of epidemiology and
psychology, where they frequently are used to quantify relationships between exposures
and diseases or behaviors.
5. Exploring, displaying, and examining data
Data exploration refers to the initial step in data analysis in which data analysts use data
visualization and statistical techniques to describe dataset characterizations, such as size,
quantity, and accuracy, in order to better understand the nature of the data.
Data exploration techniques include both manual analysis and automated data exploration
software solutions that visually explore and identify relationships between different data
variables, the structure of the dataset, the presence of outliers, and the distribution of data
values in order to reveal patterns and points of interest, enabling data analysts to gain
greater insight into the raw data.
In any situation where you have a massive set of information, data exploration can help cut it
down to a manageable size and focus efforts to optimize your analysis.
Most data analytics software includes visualization tools and charting features that make
exploration at the outset significantly easier, helping reduce data by rooting out information that
isn’t required, or which can distort results in the long run.
Data Exploration Tools
Manual data exploration methods entail either writing scripts to analyze raw data or manually
filtering data into spreadsheets. Graphical displays of data, such as bar charts and scatter plots,
are valuable tools in visual data exploration.
A popular tool for manual data exploration is Microsoft Excel spreadsheets, which can be
used to create basic charts for data exploration, to view raw data, and to identify the
correlation between variables.
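A script-based version of this kind of manual exploration can be sketched in a few lines; the measurement values below are hypothetical, and outliers are flagged with a simple two-standard-deviation rule.

```python
# A minimal script-based exploration sketch over hypothetical raw data:
# summary statistics followed by a crude outlier check.
from statistics import mean, median, stdev

values = [4.1, 3.9, 4.3, 4.0, 4.2, 9.8, 4.1, 3.8, 4.0, 4.2]  # hypothetical

print(f"n={len(values)} mean={mean(values):.2f} "
      f"median={median(values):.2f} sd={stdev(values):.2f}")

# Flag points more than two standard deviations from the mean as outliers.
m, s = mean(values), stdev(values)
outliers = [v for v in values if abs(v - m) > 2 * s]
print("possible outliers:", outliers)
```

Even this small check surfaces the anomalous value, which would then prompt a closer look at how it was recorded.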
Humans process visual data better than numerical data, so it is extremely challenging for
data scientists and data analysts to assign meaning to thousands of rows and columns of data
points, and to communicate that meaning, without any visual components.
Data visualization in data exploration leverages familiar visual cues such as shapes, dimensions,
colors, lines, points, and angles so that data analysts can effectively visualize and define the
metadata, and then perform data cleansing.
Data examination and data exploration are effectively the same process. Data
examination assesses the internal consistency of the data as a whole for the purpose of
confirming the quality of the data for subsequent analysis.
Internal consistency reliability is an assessment based on the correlations between
different items on the same test.
This assessment gauges the reliability of a test or survey that is designed to measure the
same construct for different items.
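A standard estimate of this kind of internal consistency is Cronbach's alpha, which compares the variability of individual items to the variability of respondents' total scores. The sketch below computes it for a hypothetical 5-respondent, 3-item survey.

```python
# A sketch of Cronbach's alpha, a common internal-consistency estimate
# for items intended to measure the same construct.
# The 5-respondent, 3-item score matrix is hypothetical.
from statistics import pvariance

scores = [          # rows = respondents, columns = items
    [4, 5, 4],
    [3, 3, 2],
    [5, 4, 5],
    [2, 2, 3],
    [4, 4, 4],
]

def cronbach_alpha(scores):
    k = len(scores[0])                              # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.2f}")
```

Values closer to 1 indicate that the items vary together, which is what one expects when they measure the same construct.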
Displaying data
It is important to display data accurately because it helps in presenting the findings of the
research effectively to the reader.
The purpose of displaying data in research is to make the findings more visible and make
comparisons easy.
When the researcher presents the research in front of the research committee, the committee
will easily understand the findings of the research from the displayed data.
The readers of the research will also be able to understand it better. Without displayed
data, the data looks too scattered and the reader cannot make inferences.
There are basically two ways to display data: tables and graphs.
The tabulated data and the graphical representation should both be used to give a more
accurate picture of the research.
In quantitative research it is very necessary to display data; in qualitative research, on the
other hand, the researcher decides whether there is a need to display data or not.
The researcher can use an appropriate software to help tabulate and display the data in the
form of graphs.
Microsoft Excel is one such example: it is a user-friendly program that you can use to
help display the data.
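The two display forms described above, tables and graphs, can be sketched even without charting software; the survey counts below are hypothetical, with a text bar chart standing in for a graphical one.

```python
# The same hypothetical survey counts shown two ways:
# as a small frequency table and as a crude text bar chart.
data = {"Agree": 12, "Neutral": 5, "Disagree": 3}

# Tabulated form: label column plus right-aligned count column.
table_rows = [f"{label:<10}{count:>5}" for label, count in data.items()]
# Graphical form: one '#' per observation.
chart_rows = [f"{label:<10}{'#' * count}" for label, count in data.items()]

print(f"{'Response':<10}{'Count':>5}")
print("\n".join(table_rows))
print()
print("\n".join(chart_rows))
```

The table gives exact values while the chart makes the comparison visible at a glance, which is why using both gives a more accurate picture.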
1. The first step is to clarify the metadata associated with the information you are about to
examine.
This means that you need to know some basic things about the way the data were
collected.
Use this information to help write the “methodology” section of your report (of course
you will need to draw upon additional information to provide details about the
methodology when you are actually writing this section).
However, this form will provide a good place to start. Some of this information may
already be recorded in the codebook.
2. The next step is to identify which variables you are interested in examining.
This is a critical step because the temptation is to look at the relationship between
everything. Statistically this is considered “fishing” for good results.
Avoid fishing at all costs. It wastes time and dramatically inflates the chance of finding
spurious "significant" results (the experiment-wise error rate).
3. The third step is to select the appropriate statistics for the variables you want to examine.
The resulting summary tables, along with appropriate graphs or charts, will then be inserted
into your report along with your interpretations of the results.
To improve your data analysis skills and simplify your decisions, execute these five steps in your
data analysis process: