
APPLICATION DEVELOPMENT (REVIEWER)

LESSON 1: Intro. to the Emerging Technologies

Emerging Technology – used to describe "new technology" and the "continuing development of existing technology".

Note: commonly refers to technologies expected, in the next 5 to 10 years, to create significant social or economic impact.

Evolution of Technologies

Evolution – a process of developing by gradual changes.
• From the Latin "evolutio", meaning "an unrolling or opening",
• combined from the prefix "e-", meaning "out",
• plus "volvere", meaning "to roll".

Industrial Revolution – during the late 1700s and early 1800s. It occurred when society shifted from using tools to make products to using new sources of energy, such as coal, to power machines in factories.

Notes:
• It began in Britain. The 1st European countries to be industrialized after England: Belgium, France, and the German states.
• It started with coal.
• The term "industrial revolution" was popularized by the English economic historian Arnold Toynbee (1852-1883).
• 1st modern factory: a water-powered cotton spinning mill.
• Huge migration of people to towns and cities.

Causes and Effects of the Industrial Revolution

Causes:
1. Agricultural Revolution; rise in population; Great Britain's advantages.
2. Increase in food production.
3. Emergence of capitalism.
4. An economic and political system in which trade and industry are controlled by private owners for profit, rather than by the state.

Effects:
1. More jobs, and more goods are produced faster and more efficiently.

4 Types of Industries
1. Primary Industry – getting raw materials. Ex: mining, farming, and fishing.
2. Secondary Industry – manufacturing. Ex: making cars and steel.
3. Tertiary Industry – providing a service. Ex: teaching and nursing.
4. Quaternary Industry – research and development industries. Ex: I.T.

Note: IR contributions that changed and transformed the world into modern society:
• Steam engine
• The age of science and mass production
• Rise of digital technology
• Smart and autonomous systems fueled by data and machine learning

The Dawn of the 4th Industrial Revolution

IR 1.0 (1760-1840) – Steam Engine (Age of Mechanical Production / New Manufacturing Processes)
• Steam power and mechanization of production.
• Ex: spinning wheels, steam engines, steamships, steam-powered locomotives.
• Greatest breakthrough for human productivity.

Note:
• Coined in the 1760s.
• Shift from hand production methods to machines.
• Increasing use of steam power.
• Development of machine tools.
• Rise of the factory system.

IR 2.0 (1830-1915) – Assembly Line (Age of Science and Mass Production / Technological Revolution)
• Electricity and assembly line production.
• Henry Ford (1863-1947) took the idea of "mass production" from pig-slaughtering butchers and carried the principle into automobile production.
• Ex: the telephone by Alexander Graham Bell; the combustion engine that powers cars.

Note:
• Began somewhere in the 1870s.
• Development of telegraph and railroad networks, electrical power, and telephones.
• Enhancement of communication.

IR 3.0 (1969-2010) – Computing / Internet / Nuclear Energy (Digital Revolution)
• Through "partial automation" using programmable controls and computers.
• Ex: robots, semiconductors, mainframes, personal computing, and the Internet.

Note:
• Transition from mechanical and analog electrical technology to digital.
• Digital logic circuits.
• Began in the late 1950s.
• Transformed traditional production and business techniques, allowing people to communicate without being physically present.

IR 4.0 (current) – Information and Communication Technologies
• Digitization/integration of value chains.
• Digitization of product and service offerings.
• Digital business models and customer access.
• Ex: mobile devices, IoT platforms, location detection technologies, advanced human-machine interfaces, authentication & fraud detection, 3D printing, smart sensors, big data analytics, multilevel customer interaction and customer profiling, augmented reality, cloud computing.

Note:
• The term "4th Industrial Revolution" was coined by Klaus Schwab.
• Computer Numerical Control (CNC) machines – giving instructions to the machine using a computer.
• Cyber-physical systems – a mechanism that is controlled or monitored by computer-based algorithms, integrated with the Internet and its users.

Role of Data for Emerging Technology

Data – regarded as the new oil and a strategic asset that drives or determines the future of science, technology, the economy, and possibly everything in our world.

Computer data – information processed or stored by a computer in the form of text documents, images, audio clips, software programs, etc. It is processed by the CPU and stored in files and folders on the computer's hard disk.

Data technologies aim for the ff.:
1. Manage growing data streams.
2. Get valuable insights from data.
3. Find solutions to integrate the most important data sources for companies & organizations.
4. Find useful information in chaotic data through the use of machine learning algorithms.

Enabling Devices and Network (Programmable Devices)

Programmable Logic Device (PLD) – a circuit that a user can configure and reconfigure to perform a logic function.

Ex. of a programmable device: a computer.

Four Basic Kinds of Devices:
1. Memory – stores random information. Ex: a spreadsheet.
2. Microprocessors – execute software instructions to perform a wide variety of tasks. Ex: video games.
3. Logic devices – provide specific functions. Ex: device-to-device interfacing, data communication, data display, etc.
4. Network – a collection of computers, servers, mainframes, etc. that allows the sharing of data.

Human-Machine Interaction (HMI) – communication and interaction between a human and a machine via a user interface.

Ex: machines w/ a touch display, push buttons, mobile devices, computers w/ a keypad; traffic lights whose information can be monitored.

Alarming is an example of an HMI function that provides visual indicators of a machine's issue and its severity.

Disciplines Contributing to HCI
1. Cognitive Psychology: limitations, information processing, performance prediction, etc.
2. Computer Science: graphics, technology, prototyping tools, user-interface management systems
3. Linguistics
4. Engineering and Design
5. Artificial Intelligence
6. Human Factors

Note: HMIs & PLCs work together to monitor and control the machine, communicating via a protocol over an industrial network.

Human-Computer Interaction – the study of how people interact with computers. It consists of 3 parts: the user, the computer, and the ways they work together.

Note: Machines can be controlled by touch, voice, gestures, or VR glasses.

Ex: human voice controls; touchscreens in smartphones to enlarge photos; chatbots that conduct automatic dialogs w/ customers; VR that lets people walk through planned factory buildings.

LESSON 2: Data Science

Data Science – the study of data. A multi-disciplinary field that uses scientific methods, processes, algorithms, and systems to extract knowledge and insights from structured, semi-structured, and unstructured data.

Note: Data Science is the process of using data to find solutions or to predict outcomes for a problem.

My own understanding from this image…
• Data could come from cloud computing, IoT, big data, and datafication (Datafication – subjects, objects & practices that are transformed into digital data).
• This collection of data from different sources is processed using mass analytic tools, machine learning, recommender systems, and complex event processing.
• After being processed, it produces a data product + visualization, which serves as new analytic insights (information, knowledge, a data story) used by the data science team.
Need for Data Science

Reasons for using Data Science technology:
a. Convert massive amounts of raw and unstructured data into meaningful insights.
b. Opted for by companies, whether big brands or startups.
c. Automating transportation, such as the self-driving car, which is the future of transportation.
d. It can help in different predictions, such as surveys, elections, and flight ticket confirmation.

Revolution of Technology – these are challenges faced by companies, organizations, etc.

Causes that brought up Data Science are the ff.:
• Data Flow
• Unstructured Data
• Data Storage
• Lack of Predictive Analytics
• Lack of Scientific Insights

From there it revolved to…
• Decision-making
• Prediction
• Pattern Discovery

And then, it became DATA SCIENCE.
Data Science Process

1. ASK (or Discovery) – an interesting question. Acquiring data from all the identified internal & external sources, which helps to answer the business question.
Responsible: Data Engineers
• What is the scientific goal?
• What would you do if you had all the data?
• What do you want to predict or estimate?

2. GET – get the data.
• How were the data sampled?
• Which data are relevant?
• Are there privacy issues?

3. PREPARATION (or Explore) – data can have a lot of inconsistencies, like missing values, blank columns, and incorrect data formats, which need to be cleaned. "The cleaner the data, the better the predictions."
Responsible: Data Analysts
• Plot the data
• Are there anomalies?
• Are there patterns?

4. MODEL THE DATA (Model planning) – determine the methods & techniques to draw the relation between input variables. Planning for a model is performed using statistical formulas and visualization tools like SQL Analysis Services, R, and SAS/ACCESS.
Responsible: Machine Learning Engineers
• Build a model
• Fit the model
• Validate the model

Note: In model building, datasets are split for training and testing. Techniques include association, classification, and clustering.

5. OPERATIONALIZE (Communicate and visualize the results) – the final baselined model w/ reports, code, and technical documents. It is deployed in a real-time production environment after thorough testing.
• What did we learn?
• Do the results make sense?
• Can we tell a story?

Data Science Jobs – experts who can use various statistical tools and machine learning algorithms to understand and analyze data.

Note:
• Data Scientist is the most in-demand job of the 21st century; some call it the "hottest job title of the 21st century".
• Ave. salary: $95,000 - $165,000 per annum.
• About 11.5 million jobs will be created by the year 2026.

1. Data Scientist – analytical experts: finding insights and patterns in the data, handling raw data, analyzing it, implementing various statistical procedures, visualizing the data, and generating insights from it.
Requirements: Knowledge of Hadoop, R, Python, SAS, etc.; data processing, visualization, and prediction.
2. Data Architect – organizing and managing data: implementing the blueprints of a company's data platform.
• Ave. salary: $123,680 per annum.
Requirements: Knowledge of XML, Hive, SQL, Spark, and Pig.

3. Data Engineer – building data pipelines and models for data scientists to work on.
Requirements: Knowledge of data-related topics and software engineering principles; well-versed w/ both structured and unstructured data; knowledge of database models and ETL; knowledge of tools like SQL, Hive, Pig, Python, Java, SPSS, SAS.

4. Statistician – the oldest job title, because statisticians are employed by companies to use statistical modeling to understand various trends in the market. Responsible for implementing A/B testing, harvesting data, describing data, performing hypothesis testing, etc.
• Ave. salary: $82,477/yr.
Requirements: Tools used are R, SAS, SPSS, MATLAB, Python, Stata, SQL.

5. Data Science Manager – handling and managing data science projects. Handles the team, manages team performance to meet project deadlines, and plans the roadmap to be followed by the data science team. Executes the plan of action and delivers the results.

6. Machine Learning Engineer – tailoring machine learning models for performing classification and regression tasks.
• Ave. salary: $114,826
Requirements: Knowledge of techniques like clustering, random forests, and other deep learning algorithms; tools like TensorFlow, Keras, PyTorch, Scikit-learn, Caffe.

7. Decision Scientist – a new field: helping the company make business decisions w/ the help of AI and machine learning.
• Ave. salary: $69,192/yr.

Prerequisite for Data Science

Non-Technical Prerequisites
• Curiosity – ask various questions to understand the business problem easily.
• Critical Thinking – find multiple ways to solve a problem efficiently.
• Communication Skills – the most important skill for a data scientist.

Technical Prerequisites
• Machine Learning
• Mathematical Modeling
• Statistics
• Computer Programming
• Databases

Difference between Business Intelligence and Data Science

Parameters | Business Intelligence | Data Science
Perception | Looking backward | Looking forward
Data Sources | Structured data (mostly SQL, some Data Warehouse) | Structured and unstructured data, like logs, SQL, NoSQL, or text
Approach | Statistics & visualization | Statistics, machine learning, and graph analysis
Emphasis | Past & present | Analysis & neuro-linguistic programming
Tools | Pentaho, Microsoft BI, QlikView | R, TensorFlow

Note:
• BI focuses on the past and present.
• Data science focuses on past, present, and future predictions.
Data Science Components

1. Statistics – collect and analyze numerical data in large amounts and find meaningful insights from it.
2. Domain Experts – have specialized knowledge or skills in a particular area. Ex: a person skilled in data gathering.
3. Data Engineering – involves acquiring, storing, retrieving, and transforming the data, including the metadata.
4. Visualization – representing the data in a visual context. Ex: Tableau
5. Advanced Computing – involves designing, writing, debugging, and maintaining the source code of computer programs.
6. Mathematics – the study of quantity, structure, space, and change.
7. Machine Learning – training a machine so it can act like a human.

Tools for Data Science
• Data Analysis – R, Spark, Python, SAS
• Data Warehouse – Hadoop, SQL, Hive
• Data Visualization – R, Tableau, RAW
• Machine Learning – Spark, Azure ML Studio, Mahout

R – a statistical language (linear & nonlinear modeling, classical statistical tests, time-series analysis, classification, & clustering, …) with graphical techniques.

Python – a great general-purpose language w/ many libraries dedicated to data scientists and developers.
• Note: It can be used as a backend for web applications. It is flexible compared to R.

SAS – a command-driven software package used for statistical analysis and data visualization. One of the most widely used statistical software packages in both industry and academia.
• Note: Only available on Windows OS.

Java – an object-oriented programming language used for data analysis, including cleaning data, import and export of data, statistical analysis, deep learning, Natural Language Processing (NLP), and data visualization.

MATLAB – not very popular in data science, but it should be considered when learning data science. Users find it easy to move to deep learning because of the functionality of the Deep Learning Toolbox.

Machine Learning in Data Science (Algorithms)

1. Regression – model and analyze the relationships between variables and how they contribute and relate to produce a particular outcome together.
Ex: a dependent variable (the blood pressure result) and an independent variable (age).

A linear regression refers to a regression model that is made of linear variables. It is the most popular machine learning algorithm based on supervised learning. It is mostly used in forecasting and predictions, since it shows the linear relationship between the input and output variables.

Ex: Y = mx + c
• where Y = dependent variable
• x = independent variable
• m = slope
• c = intercept
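A minimal sketch of fitting Y = mx + c in Python (scikit-learn is an assumed tool choice here; the age and blood-pressure numbers are made-up illustration data):

    # Fit a simple linear regression Y = mx + c on toy data.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    X = np.array([[25], [35], [45], [55], [65]])  # independent variable (age)
    y = np.array([118, 121, 127, 134, 142])       # dependent variable (blood pressure)

    model = LinearRegression().fit(X, y)
    print(model.coef_[0], model.intercept_)       # m (slope) and c (intercept)
    print(model.predict([[50]]))                  # forecast for age 50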
2. Decision Tree – a probability tree about some kind of process. It can be used for both classification and regression problems. Ex: choosing between item A and item B.
Ex: Root node -> each node == a feature, each branch == a decision, and each leaf == an outcome.
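A minimal decision-tree sketch (scikit-learn assumed; the feature rows and the A/B labels are hypothetical):

    from sklearn.tree import DecisionTreeClassifier

    # Each column is a feature (a node); each label is a leaf outcome.
    X = [[0, 0], [0, 1], [1, 0], [1, 1]]
    y = ["A", "A", "B", "B"]

    tree = DecisionTreeClassifier().fit(X, y)
    print(tree.predict([[1, 1]]))  # branch decisions lead to leaf 'B'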
3. Clustering – gain valuable insights from data by seeing which groups the data points fall into when this algorithm is applied.
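A minimal clustering sketch with k-means (scikit-learn assumed; the 2-D points are invented):

    import numpy as np
    from sklearn.cluster import KMeans

    points = np.array([[1.0, 1.0], [1.2, 0.8], [5.0, 5.0], [5.1, 4.9]])
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
    print(km.labels_)  # the group each data point falls into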
4. Principal Component Analysis (PCA) – a statistical procedure that summarizes the information content in large data tables by means of "summary indices" (a smaller set) that can easily be visualized and analyzed.
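A minimal PCA sketch: compressing a wide table into two "summary indices" (scikit-learn assumed; the table is random stand-in data):

    import numpy as np
    from sklearn.decomposition import PCA

    table = np.random.rand(100, 5)        # stand-in for a large data table
    pca = PCA(n_components=2)
    summary = pca.fit_transform(table)    # 100 rows x 2 summary indices
    print(pca.explained_variance_ratio_)  # information kept by each index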
5. Support Vector Machines (SVM) – a linear model for classification and regression problems. It is used in facial recognition and genetic classification. SVMs have a pre-built regularization model that allows data scientists to minimize the classification error. The idea of SVM is simple: it creates a line or a hyperplane which separates the data into classes.
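A minimal SVM sketch: learning a separating line/hyperplane (scikit-learn assumed; toy points and labels):

    from sklearn.svm import SVC

    X = [[0, 0], [1, 1], [4, 5], [5, 4]]
    y = [0, 0, 1, 1]

    clf = SVC(kernel="linear", C=1.0).fit(X, y)  # C tunes the built-in regularization
    print(clf.predict([[4, 4]]))                 # falls on class 1's side of the line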
6. Naïve Bayes – predicts the probability of different classes based on various attributes. It is mostly used in text classification and w/ problems having multiple classes.
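A minimal Naïve Bayes text-classification sketch (scikit-learn assumed; the sentences and spam/ham labels are invented):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    texts = ["win a free prize", "meeting at noon", "free money now", "project meeting notes"]
    labels = ["spam", "ham", "spam", "ham"]

    vec = CountVectorizer()                                  # word counts as attributes
    clf = MultinomialNB().fit(vec.fit_transform(texts), labels)
    print(clf.predict(vec.transform(["free prize money"])))  # -> ['spam']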
7. Artificial Neural Network (ANN) – a piece of a computing system designed to simulate how the human brain analyzes and processes information. It is a foundation of AI and solves problems that would prove impossible or difficult by human or statistical standards.
Ex: deciding which class an image of a cat or a dog falls into with maximum probability. This illustrates binary classification, where the dog or cat is assigned its appropriate place.
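A minimal neural-network sketch using scikit-learn's MLPClassifier as a small stand-in for a full ANN framework (the two numbers per row are hypothetical image features, not real pixels):

    from sklearn.neural_network import MLPClassifier

    X = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]]  # made-up features
    y = ["cat", "cat", "dog", "dog"]

    net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
    print(net.predict([[0.85, 0.15]]))  # the class with maximum probability -> ['dog']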
8. Apriori – used for Association Rule Mining. It searches for series of frequent itemsets in the datasets and builds associations and correlations between the itemsets. It is behind the concept of "You may also like".
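A tiny illustration of the frequent-itemset idea behind Apriori, in plain Python (hypothetical shopping baskets; a real Apriori implementation would also prune candidate sets level by level):

    from itertools import combinations
    from collections import Counter

    baskets = [{"bread", "milk"}, {"bread", "butter"}, {"bread", "milk", "butter"}]
    pairs = Counter()
    for basket in baskets:
        for pair in combinations(sorted(basket), 2):
            pairs[pair] += 1

    # Pairs meeting a minimum support of 2 are "frequent itemsets".
    print([p for p, n in pairs.items() if n >= 2])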
How to solve problems in Data Science using Machine Learning Algorithms?

Data and Information
Data Science Lifecycle

The main phases of the data science lifecycle are given below:
1. Discovery: what are the basic requirements, priorities, and project budget?
2. Data preparation: data preparation is also known as data munging.
3. Model planning: determine the various methods and techniques to establish the relation between input variables. Common tools used for model planning are: SQL Analysis Services, R, SAS, Python.
4. Model building: the process of model building starts by creating datasets for training and testing purposes. Some common model-building tools: SAS Enterprise Miner, WEKA, SPSS Modeler, MATLAB.
5. Operationalize: deliver the final reports of the project, along with briefings, code, and technical documents. This provides a clear overview of complete project performance and other components on a small scale before the full deployment.
6. Communicate results: check whether we reached the goal set in the initial phase, then communicate the findings and final result to the business team.

Data Science Applications

Data Processing Cycle – the sequence of steps or processes used to process raw data and turn it into readable form, generating meaningful information.
Stages of Data Processing:
a. Input – raw data; the first step.
b. Processing – the raw data is processed by a selected processing method; the most important step.
c. Output – the outcome: the data is now useful, provides information, and is no longer called data.
volumes of data.
Data Types and Their Representation

In computer science and computer programming, a data type is simply an attribute of data that tells the compiler or interpreter how the programmer intends to use the data.

From a computer programming perspective:
• Integers (int) – whole numbers.
• Booleans (bool) – true or false.
• Characters (char) – a single character.
• Floating-point numbers (float) – real numbers; decimals.
• Alphanumeric strings (string) – combinations of characters and numbers.

Note: A data type defines the operations that can be done on the data, the meaning of the data, and the way values of that type can be stored.
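The same five types written as Python literals (a minimal sketch; Python has no separate char type, so a one-character string stands in for it):

    count = 42          # integer
    is_valid = True     # boolean
    grade = "A"         # character (a length-1 string in Python)
    price = 19.99       # floating-point number
    code = "AB12"       # alphanumeric string
    print(type(count), type(price))  # the interpreter tracks each value's type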
From a data analytics perspective:
• Structured data – tabular format w/ relationships between the different rows and columns. Ex: Excel files or SQL tables, which can be sorted.
• Unstructured data – does not have a predefined data model and is not organized in a predefined manner. Text-heavy, but may also contain dates, numbers, and facts. Ex: audio and video files, NoSQL databases.
• Semi-structured data – does not conform w/ the formal structure of data models, but contains tags or other markers to separate semantic elements and enforce hierarchies of records and fields within the data; a "self-describing structure". Ex: JSON and XML.
• Metadata – data about data; additional information about a specific set of data, frequently used by big data solutions for initial analysis.
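A small sketch of semi-structured data: a JSON record whose tags/keys describe its own fields (the record is hypothetical), parsed with Python's standard library:

    import json

    record = '{"name": "Ana", "tags": ["student"], "grades": {"math": 90}}'
    data = json.loads(record)      # keys act as markers: a self-describing structure
    print(data["grades"]["math"])  # -> 90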
Basic Concepts of Big Data

Big data – a collection of data sets so large and complex that it becomes difficult to process with traditional data processing algorithms.

Application of Big Data
• Helps companies make more informed business decisions by analyzing large volumes of data.
• Sources include web server logs, Internet clickstream data, social media content and activity reports, text from customer emails, mobile phone calls, and machine data captured by multiple sensors.

Data Science vs. Big Data

Note: Data never sleeps!

4 Vs of Big Data
1. Volume – scale of data.
2. Variety – different forms of data.
3. Veracity – uncertainty of data.
4. Velocity – analysis of streaming data.

Data Value Chain – describes the information flow within a big data system as a series of steps needed to generate value and useful insights from data.

Note: a decision-support tool to model the chain of activities that an organization performs in order to create a valuable product/service for the market.
Basic Concepts of Big Data
1. Data is generated at a steady rate and is structured in nature.
2. Heterogeneous data is being generated at an alarming rate by multiple sources.
3. Storage and processing become a bottleneck.
4. Multiple processing units are used for data processing.
5. Hadoop: HDFS – storage: a distributed file system. MapReduce – processing: allows parallel & distributed processing.

Hadoop – an open-source software framework used for storing and processing big data in a distributed manner on large clusters of commodity hardware. It was developed based on the paper written by Google on the MapReduce system, and it applies functional programming. It is written in the Java programming language. It was developed by Doug Cutting and Michael J. Cafarella.

How does Hadoop solve these problems?

1st problem: storing the colossal amount of data.
Solution:
• HDFS provides a distributed way to store big data.
• Data is stored in blocks in DataNodes, and you specify the size of each block. Data blocks are replicated on different DataNodes to provide fault tolerance.
• It follows horizontal scaling: you can add new nodes to the HDFS cluster on the run as per requirement, instead of increasing the hardware stack in each node.

2nd problem: storing heterogeneous data.
Solution:
• HDFS can store all kinds of data; there is no pre-dumping schema validation.
• It follows a write-once, read-many model: you can write any kind of data once and read it multiple times to find insights.

3rd problem: processing speed.
Solution:
• Move the processing unit to the data instead of moving the data to the processing unit.
• Instead of moving data from different nodes to a single master node for processing, the processing logic is sent to the nodes where the data is stored, so each node can process a part of the data in parallel.
• All the intermediary output produced by each node is merged together, and the final response is sent back to the client.
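A minimal sketch of the "send the logic to the data" idea as the classic map/reduce word count, in plain Python (the per-node chunks are hypothetical; real Hadoop would run the map step on each DataNode and merge the results):

    from collections import Counter

    # Each "node" holds its own chunk of the data.
    node_chunks = [["big", "data", "big"], ["data", "hadoop"]]

    # Map: every node counts its local chunk in parallel.
    partial_counts = [Counter(chunk) for chunk in node_chunks]

    # Reduce: intermediary outputs are merged into the final response.
    print(sum(partial_counts, Counter()))  # Counter({'big': 2, 'data': 2, 'hadoop': 1})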

Hadoop Ecosystem
LESSON 3: Artificial Intelligence

Artificial – "man-made"
Intelligence – "thinking power" or the "ability to learn and solve problems"
Combined: "a man-made thinking power".

Artificial Intelligence – a branch of computer science by which we can create intelligent machines that can behave like a human, think like a human, and make decisions (through experience or exposure that allows them to have a certain intelligence).

Goals of Artificial Intelligence
1. Replicate human intelligence.
2. Solve knowledge-intensive tasks.
3. An intelligent connection of perception and action.
4. Building a machine which can perform tasks that require human intelligence, such as:
• Proving a theorem
• Playing chess
• Planning a surgical operation
• Driving a car in traffic
5. Creating a system which can exhibit intelligent behavior, learn new things by itself, demonstrate, explain, and advise its user.

History of AI
• GPUs – big contributors to AI.

1950 – Turing Test: Alan Turing proposes the test for machine intelligence.
1955 – AI born: the term AI is coined by John McCarthy as the "science and engineering of making intelligent machines".
1961 – Unimate: the 1st industrial robot works at GM, replacing humans on the assembly line.
1964 – ELIZA: a chatbot developed by Joseph Weizenbaum at MIT holds conversations with humans.
1966 – Shakey: the 1st "electronic person", from Stanford; a general-purpose mobile robot that reasons about its own actions.
1997 – Deep Blue: a chess-playing computer from IBM defeats world chess champion Garry Kasparov.
1998 – Kismet: an emotionally intelligent robot that responds to people's feelings, introduced by Cynthia Breazeal at MIT.
1999 – AIBO: Sony launches the 1st consumer robot pet dog.
2002 – Roomba: the 1st mass-produced autonomous robotic vacuum cleaner, from iRobot.
2011 – Siri: Apple integrates Siri, an intelligent virtual assistant with a voice interface, into the iPhone 4S.
2011 – Watson: IBM's question-answering computer wins 1st place on the popular television quiz show Jeopardy!
2014 – Eugene: Eugene Goostman, a chatbot, passes the Turing Test, with judges believing it is human.
2014 – Alexa: Amazon launches Alexa, an intelligent virtual assistant with a voice interface that completes shopping tasks.
2016 – Tay: Microsoft's chatbot on social media makes inflammatory and offensive racist comments.
2017 – AlphaGo: Google's AI beats world champion Ke Jie in the complex board game of Go, which has a vast number (2^170) of possible positions.

AI vs Machine Learning vs Deep Learning
What is Intelligence composed of?
Intelligence is intangible. It is composed of:
• Reasoning
  - Inductive Reasoning – uses specific scenarios to make generalized conclusions. Ex: your sister and your mother are both tidy; therefore, all older sisters are tidy.
  - Deductive Reasoning – makes a generalized statement and backs it up with specific scenarios. Ex: all apples grow on trees; if A = B and B = C, then A must equal C.
• Learning
• Problem Solving
• Perception
• Linguistic Intelligence

Types of Artificial Intelligence
AI can be divided into 2 types: based on capabilities and based on functionality.

Levels of AI

Task Classification of AI
The domain of AI is classified into Mundane tasks, Formal tasks, and Expert tasks.

Mundane Tasks (Ordinary) | Formal Tasks | Expert Tasks
Perception: Computer Vision; Speech, Voice | Mathematics: Geometry, Logic, Integration and Differentiation | Engineering: Fault Finding, Manufacturing, Monitoring
Natural Language Processing: Understanding, Language Generation, Language Translation | Games: Chess (Deep Blue), Checkers, Go | Scientific Analysis
Common Sense | Verification | Financial Analysis
Reasoning | Theorem Proving | Medical Diagnosis
Planning | — | Creativity
Robotics: Locomotive | — | —

Agent Terminology
Performance Measure of Agent − the criteria which determine how successful an agent is.
Behavior of Agent − the action that the agent performs after any given sequence of percepts.
Percept − the agent's perceptual inputs at a given instance.
Percept Sequence − the history of all that the agent has perceived to date.
Agent Function − a map from the percept sequence to an action.

Influence of AI
1. Big Data: structured data vs. unstructured data
2. Advancements in computer processing speed and new chip architectures
3. Cloud computing and APIs
4. Emergence of data science
AI Agents and Environments

Agent – anything that can perceive its environment through sensors and act upon that environment through effectors.

Human agent – sensory organs such as the eyes, ears, nose, tongue, and skin parallel the sensors; other organs such as the hands, legs, and mouth are the effectors.

Robotic agent – cameras and infrared range finders for the sensors, and various motors and actuators for the effectors.

Software agent – encoded bit strings as its programs and actions.
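A minimal sketch of an agent function — a map from the percept sequence to an action (the thermostat rule and the 20° threshold are invented for illustration):

    # Agent function: percept sequence (sensor history) -> action (for an effector).
    def agent_function(percept_sequence):
        latest_temperature = percept_sequence[-1]
        return "heater_on" if latest_temperature < 20 else "heater_off"

    percepts = [22, 21, 19]          # temperatures perceived so far
    print(agent_function(percepts))  # -> heater_on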
AI Platforms – the use of machines to perform tasks that are performed by human beings, such as problem solving, learning, reasoning, and social intelligence as well as general intelligence.

1. Google AI Platform
• Cloud-based
• Customer sentiment analysis
• Spam detection
• Recommendation systems
• Purchase prediction

2. TensorFlow
• Open-source library for numerical computation & large-scale machine learning using data flow graphs.
• Nodes == mathematical operations; graph edges == multi-dimensional data arrays (tensors).
• Flexible architecture to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device using an API.
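A minimal TensorFlow sketch of the nodes-and-tensors idea (assuming TensorFlow 2 is installed):

    import tensorflow as tf

    # Graph edges: multi-dimensional arrays (tensors); nodes: math operations.
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[1.0], [1.0]])
    c = tf.matmul(a, b)  # a matrix-multiply node in the data flow graph
    print(c.numpy())     # -> [[3.], [7.]]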
3. Microsoft Azure
• Public cloud computing platform w/ solutions including IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service), used for analytics, virtual computing, storage, etc.
• Digital marketing
• Mobile, e-commerce, big data analytics, IoT, gaming, blockchain

4. Rainbird
• Award-winning AI platform which makes businesses smarter
• Natural Language Processing
• Analytics and insights
• Controlled learning algorithm
• Turns insights into action

5. Infosys Nia
• Knowledge-based AI platform that brings machine learning together w/ the deep knowledge of an organization to drive automation & innovation.

6. Premonition
• World's largest litigation database
• Analyzes courts, judges, and opposing counsel by their win rates and results
• Knows the track record of your attorney

7. Wit.ai
• Natural language interface for applications, capable of turning sentences into structured data
• Create bots that can interact w/ humans on messaging platforms
• Build applications you can talk to or text to

Applications of AI

1. Entertainment – such as Netflix or Amazon: ML/AI algorithms show recommendations for programs or shows.
2. Real Estate – transforms the way people buy and sell property.
3. Finance and E-commerce – finance implements automation, chatbots, adaptive intelligence, algorithmic trading, and machine learning in financial processes; e-commerce helps shoppers discover associated products w/ a recommended size, color, or brand.
4. Travel & Transport – making travel arrangements and suggesting hotels, flights, and the best routes to customers. Travel industries use AI-powered chatbots so customers get better and faster responses.
5. Banking and Finance – detects anomalies in streams of financial data and complies w/ anti-fraud regulations, to build customers' trust.
6. Manufacturing – supply chains; anticipating the company's market changes. This helps with staffing, inventory control, energy consumption, and the supply of raw materials.
7. Food Technology – robots in restaurants to prepare food.
8. Healthcare – makes better and faster diagnoses than humans.
9. Logistics and Transportation – predicts demand, modifies orders, and re-routes in-transit goods to warehouses where needed. It can collect traffic data to reduce congestion and improve the scheduling of public transport. Since traffic is affected by traffic flow, AI allows streamlined traffic patterns, smarter traffic-light algorithms, and real-time tracking.
10. Gaming – playing strategic games like chess to make possible moves.
11. Data Security – makes data safer and more secure. Ex: AEG Bot, the AI2 platform, which determine bugs and cyber-attacks.
12. Social Media – Facebook, Twitter, and Snapchat, where massive amounts of data are organized and managed; trends and hashtags from different users.
13. Automotive Industry – virtual assistants; Tesla, for example, introduced TeslaBot, an intelligent virtual assistant.
14. Robotics – general robots perform repetitive tasks, but w/ the help of AI, robots can perform tasks from their own experience w/o being pre-programmed. Ex: humanoids named Erica and Sophia, which can talk like humans.
15. Agriculture – soil and crop monitoring and predictive analysis, which can be very helpful for farmers.
LESSON 4: Internet of Things (IoT)

Internet of Things (IoT) – an ecosystem of physical devices, vehicles, appliances, and other things that have the ability to connect, collect, and exchange data over a wired or wireless network, w/o human-to-computer intervention.

According to the 2020 Conceptual Framework, IoT is expressed in the formula:
IoT = Services + Data + Networks + Sensors

History of IoT

IoT Architecture
• Things equipped with sensors to gather data, and actuators to perform commands received from the cloud.
• Gateways for data filtering, preprocessing, and moving it to the cloud and vice versa – receiving commands from the cloud.
• Cloud gateways to ensure data transition between field gateways and central IoT servers.
• Streaming data processors to distribute the data coming from sensors among the relevant IoT solution components.
• Data lake for storing all the data of defined and undefined value.
• Big data warehouse for collecting valuable data.
• Control applications to send commands to actuators.
• Machine learning to generate the models which are then used by control applications.
• User applications to enable users to monitor and control their connected things.
• Data analytics for manual data processing.

Basic Components:
Sensors – take data from the environment. Ex: daylight, sounds, etc. Lamps are equipped with actuators to switch the light on and off.
Data lake – stores raw data coming from sensors.
Big data warehouse – extracted info, e.g. a smart home dweller's behavior on various days of the week, energy costs, and more.

Categories of IoT Devices
1. Devices that can collect and send information
• Devices that can collect and send information are "sensors".
2. Devices that collect and respond to information
• They respond upon a request, e.g. a print request.
3. Devices that can do both of the above
• The sensor collects the information, and the device should respond intelligently without the intervention of humans.

Features of IoT Technology
• Intelligence – essential for a smart product.
• Connectivity – network accessibility and compatibility features.
• Sensing – collecting information based on retrieval capacity and providing it for intelligent decisions.
• Expressing – enables interactivity with humans and the world.
• Energy – without this, there will be no creation of our devices. Energy harvesting and proper infrastructure for charging are important features for our IoT devices.
• Safety – the prime feature on which the customer relies to use the product. Hence no compromise on this is allowed, and all the details need to be checked and validated.

Advantages and Disadvantages of IoT Applications
IoT Tools – a network or connection of devices, vehicles, equipment, home appliances, etc. It helps in collecting and exchanging different kinds of data, and it helps the user control the devices remotely over a network.

Tool (in computing) – a piece of software used to develop software or hardware, or to perform low-level operations.

1. Kinoma Create
This tool precludes the need for extensive knowledge of JavaScript to connect two devices. It is used to develop small IoT applications such as temperature/movement sensors with mobile notifications, connected lights, a synthesizer, a camera trap, and an automatic alarm bell.

2. Eclipse IoT
A top tool for building IoT devices, cloud platforms, and gateways. It is one of the most viable tools to develop, promote, and adopt open-source IoT technologies, as well as to learn great tech expertise. MobileCoderz has a vast assembly of IoT app development expertise to cover your IoT needs.

3. OpenSCADA
An open implementation of SCADA (Supervisory Control And Data Acquisition) and HMI (Human-Machine Interface) systems. OpenSCADA tools can be used to develop advanced IoT apps. It supports modern design and the debugging and editing of front-end applications, configuration tools, and interface applications, with security and flexibility.

4. Raspbian
One of the most essential IoT development tools due to having 35,000+ packages and pre-compiled software, which facilitates rapid installation. Created mainly for the Raspberry Pi board, this IoT tool is considered one of the best for having widened the extent of computing for users.

5. Device Hive
A free, open-source M2M communication framework touted as one of the most sought-after cloud-based API IoT app development platforms. Device Hive eliminates the need for network configuration, thus helping programmers control remote management protocols. Use of this tool involves smart home tech, security, automation, and sensors.

6. Node-RED
An interesting visual tool for writing IoT, allowing programmers to use an (integrated) browser-based flow editor for connecting devices, services, and APIs together. An open-source tool, Node-RED consists of more than 60,000 modules, has a user-friendly interface, and makes it easy to connect devices.

7. PlatformIO
It allows programmers to port the IDE over the Atom editor or use it as a substitute for a plug-in installer. With 200+ boards and a wonderful debugging-integration feature, programmers find this tool quite interactive in its cross-platform IoT development environment.

IoT Platforms – a multilevel technology used to manage and automate the connected devices known as the IoT. It is a service which helps bring physical objects online, connecting devices for machine-to-machine communication.
• IoT Platform as middleware.
Applications of IoT

LESSON 5: Augmented Reality

Augmented Reality (AR) – a real scene is viewed by the user, and graphics, audio, and other sense enhancements are superimposed over the real-world environment in real time.

AR vs. VR vs. Mixed Reality

How does AR work?
It overlays digital information on top of a camera-captured natural environment.
It needs the ff. components:
• Registration tools
• Computer vision
• Output device

Marker-based – a real-world tag/object/photo can be scanned to trigger an extended-reality experience.
Location-based – being in a specific location can trigger the AR experience.

What is needed?
• Head-mounted display
• Tracking system
• Mobile computing power

Combining the Real and Virtual Worlds
• Precise models
• Locations and optical properties of the viewer (or camera) and the display
• Calibration of all devices
• Combining all local coordinate systems centered on the devices and the objects in the scene into a global coordinate system
• Registering models of all 3D objects of interest with their counterparts in the scene
• Tracking the objects over time as the user moves and interacts with the scene

Realistic Merging Requires:
• Objects that behave in physically plausible manners when manipulated
• Occlusion
• Collision detection
• Shadows
• Depth-sensing camera

Failures in Registration are due to:
• Noise – in the position & pose of the camera w/ respect to the real scene
• Image distortions
• Time delays – in calculating the camera position

Notes:
AR systems are sensitive to visual errors – a virtual object may not be stationary in the real scene, or it may be in the wrong place.
Misregistration of a pixel can be detected under certain conditions.
Time delays lead to the augmented image lagging behind motions in the real scene.

Examples of Augmented Reality:
a. Virtual Catwalk on ASOS
b. Toyota AR Demo