Notes on Convolutional Neural Network


A convolutional neural network (CNN) learns directly from images. A CNN is made up of several layers that process and transform an input to produce an output. You can train a CNN to do image analysis tasks, including scene classification, object detection and segmentation, and image processing. Finally, we'll briefly discuss the three ways to train CNNs for image analysis.

In a typical neural network, each neuron in the input layer is connected to a neuron in the hidden layer. However, in a CNN, only a small region of input layer neurons connects to neurons in the hidden layer. These regions are referred to as local receptive fields. The local receptive field is translated across an image to create a feature map from the input layer to the hidden layer neurons. You can use convolution to implement this process efficiently. That's why it is called a convolutional neural network.
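Translating a local receptive field across an image can be sketched in plain Python. This is only an illustration: the image, the 3×3 kernel values, and the function name are made up here, and deep learning libraries compute the same thing far more efficiently.

```python
def convolve2d(image, kernel):
    """Slide a k x k local receptive field over an image ("valid" mode,
    no padding), producing one feature map value per position."""
    k = len(kernel)
    out_h = len(image) - k + 1
    out_w = len(image[0]) - k + 1
    feature_map = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Each hidden neuron only sees the k x k patch at (i, j).
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(k) for dj in range(k)))
        feature_map.append(row)
    return feature_map

# A vertical-edge kernel applied to a 4x4 image with an edge down the middle.
image = [[0, 0, 1, 1]] * 4
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]
print(convolve2d(image, kernel))  # [[3, 3], [3, 3]]
```

Note that, as in most deep learning frameworks, this is technically cross-correlation (the kernel is not flipped), but the convention is to call it convolution.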

The second concept we'll discuss is shared weights and biases. Like a typical neural network, a CNN has neurons with weights and biases. The model learns these values during the training process, and it continually updates them with each new training example. However, in the case of CNNs, the weight and bias values are the same for all the hidden neurons in a given layer. This means that all the hidden neurons are detecting the same feature, such as an edge or a blob, in different regions of the image. This makes the network tolerant to the translation of objects in an image. For example, a network trained to recognize cats will be able to do so wherever the cat is in the image.
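A quick parameter count shows why shared weights matter. The sizes below are hypothetical (a 28×28 image, 100 hidden neurons or filters), chosen only to make the contrast visible:

```python
def fully_connected_params(in_pixels, hidden_neurons):
    # Each hidden neuron has its own weight for every input pixel,
    # plus its own bias.
    return in_pixels * hidden_neurons + hidden_neurons

def conv_layer_params(kernel_size, num_filters):
    # All hidden neurons producing one feature map share a single
    # kernel_size x kernel_size set of weights and one bias,
    # no matter how large the image is.
    return (kernel_size * kernel_size + 1) * num_filters

print(fully_connected_params(28 * 28, 100))  # 78500 parameters
print(conv_layer_params(3, 100))             # 1000 parameters
```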

Our third and final concept is activation and pooling. The activation step applies a transformation to the output of each neuron by using activation functions. Rectified Linear Unit, or ReLU, is an example of a commonly used activation function. It takes the output of a neuron and, if it is positive, passes it through unchanged. Or, if the output is negative, the function maps it to zero.
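ReLU is short enough to write out directly; the sample values below are arbitrary:

```python
def relu(x):
    # Positive values pass through unchanged; negative values become zero.
    return max(0.0, x)

print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5, 3.0]])  # [0.0, 0.0, 0.0, 1.5, 3.0]
```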

You can further transform the output of the activation step by applying a pooling step. Pooling reduces the dimensionality of the feature map by condensing the output of small regions of neurons into a single output. This helps simplify the following layers and reduces the number of parameters that the model needs to learn.
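As an illustrative sketch, here is max pooling over non-overlapping 2×2 regions, one common choice of pooling; the feature map values are made up:

```python
def max_pool2x2(feature_map):
    # Condense each non-overlapping 2x2 region into its maximum value,
    # halving the height and width of the feature map.
    pooled = []
    for i in range(0, len(feature_map), 2):
        row = []
        for j in range(0, len(feature_map[0]), 2):
            row.append(max(feature_map[i][j],     feature_map[i][j + 1],
                           feature_map[i + 1][j], feature_map[i + 1][j + 1]))
        pooled.append(row)
    return pooled

fmap = [[1, 3, 2, 0],
        [5, 2, 1, 4],
        [0, 1, 3, 2],
        [2, 6, 0, 1]]
print(max_pool2x2(fmap))  # [[5, 4], [6, 3]]
```

Averaging each region instead of taking its maximum gives average pooling, the other common variant.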

Now, let's put it all together. Using these three concepts, we can configure the layers in a CNN. A CNN can have tens or hundreds of hidden layers that each learn to detect different features of an image.
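The three stages can be chained into a tiny end-to-end forward pass. Everything below is illustrative: the image, kernel, and final-layer weights are invented, and a real CNN would repeat the convolution/ReLU/pooling stages many times and learn all the weights from data.

```python
def conv(img, k):
    # Slide a k x k receptive field over the image ("valid" mode).
    n = len(k)
    return [[sum(img[i + a][j + b] * k[a][b] for a in range(n) for b in range(n))
             for j in range(len(img[0]) - n + 1)]
            for i in range(len(img) - n + 1)]

def relu_map(fm):
    # Apply ReLU element-wise to a feature map.
    return [[max(0, v) for v in row] for row in fm]

def pool2(fm):
    # Max-pool non-overlapping 2x2 regions.
    return [[max(fm[i][j], fm[i][j + 1], fm[i + 1][j], fm[i + 1][j + 1])
             for j in range(0, len(fm[0]), 2)]
            for i in range(0, len(fm), 2)]

def dense(features, weights, bias):
    # Final layer: every feature connects to the output neuron.
    return sum(f * w for f, w in zip(features, weights)) + bias

image = [[0, 0, 0, 1, 1, 1]] * 6           # 6x6 image with a vertical edge
kernel = [[-1, 0, 1]] * 3                  # vertical-edge detector
fm = pool2(relu_map(conv(image, kernel)))  # 4x4 conv output -> 2x2 pooled
flat = [v for row in fm for v in row]      # flatten for the final layer
print(dense(flat, [0.1, 0.2, 0.3, 0.4], 0.5))
```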

Across these layers, we can see that every hidden layer increases the complexity of the learned image features. For example, the first hidden layer learns how to detect edges, and the last learns how to detect more complex shapes.

Just like in a typical neural network, the final layer connects every neuron from the last hidden layer to the output neurons. This produces the final output.

There are three ways to use CNNs for image analysis. The first method is to train the CNN from scratch. This method is highly accurate, although it is also the most challenging, as you might need hundreds of thousands of labelled images and significant computational resources.

The second method relies on transfer learning, which is based on the idea that you can use knowledge of one type of problem to solve a similar problem. For example, you could use a CNN model that has been trained to recognize animals to initialize and train a new model that differentiates between cars and trucks. This method requires less data and fewer computational resources than the first.

With the third method, you can use a pre-trained CNN to extract features for training a machine learning model. For example, a hidden layer that has learned how to detect edges in an image is broadly relevant to images from many different domains. This method requires the least amount of data and computational resources.
