This is a complex engineering problem related to discrete-time signal processing. It can be solved using MATLAB or Python.



Bahria University, Islamabad

Subject: Digital Signal Processing (EEN-325) Instructors: Dr. Saleem / Taimur

CEP (CLO-4) Class: BEE-6-A, B, C, D: Spring 2019 Deadline: 17-05-19

CLO-4: Design and interpret a complete discrete/digital system in both the time and frequency domains.

Note: Marks will be awarded on the basis of the presentation and viva. The complex engineering problem may also be completed in groups, with a maximum of two students in each group.

Music signal processing is a sub-branch of digital signal processing that deals with the analysis, interpretation and processing of digital musical signals. Musical signals can be more complicated than human vocal sounds, occupying a wider spectral band. Your task is to develop an automated classification model that can autonomously classify musical signals, speech signals and songs (signals comprising both musical and human vocal patterns). The input signal fed into the proposed model may be noisy, so the first step is to preprocess the input signal to remove the noisy content. Afterwards, you need to extract the most relevant and distinctive features, of your own devising, for classifying the input signal. The classification models to be implemented are K-Nearest Neighbors and probabilistic Naïve Bayes. Use both classifiers separately on the same set of features and state which one performs better and why. The top-level block diagram of the proposed classification model is shown in Figure 1.
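The choice of features is left to your imagination; as an illustration only (not a prescribed feature set), here is a minimal Python sketch of two common audio features, zero-crossing rate and spectral centroid, assuming the signal is a 1-D NumPy array sampled at rate `fs`:

```python
import numpy as np

def zero_crossing_rate(x):
    """Fraction of consecutive sample pairs whose signs differ."""
    signs = np.signbit(x)
    return np.mean(signs[1:] != signs[:-1])

def spectral_centroid(x, fs):
    """Magnitude-weighted mean frequency of the signal's spectrum (Hz)."""
    mags = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return np.sum(freqs * mags) / np.sum(mags)

# One second of a pure 440 Hz tone as a sanity check
fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
print(spectral_centroid(tone, fs))   # close to 440 Hz
print(zero_crossing_rate(tone))      # roughly 2 * 440 / fs
```

Voiced speech typically has a lower zero-crossing rate and spectral centroid than broadband musical mixtures, which is why features of this kind can help separate the three classes.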

The training dataset will be gathered by each group separately and labeled by its members. Remember: the more samples (with more variability) you collect for each class, the more exposure you give your classifier during the training phase. A detailed description of both supervised classifiers is given below:

K-Nearest Neighbors:

In pattern recognition and machine learning, k-nearest neighbors (KNN) is a simple algorithm that stores all available cases and classifies new cases based on a similarity measure (e.g., distance). KNN is a non-parametric method whose input consists of the k closest training examples in the feature space, and whose output is a class membership. An object is classified by a majority vote of its neighbors, with the object being assigned to the class most common among its k nearest neighbors (k is a positive integer, typically small). If k = 1, the object is simply assigned to the class of its single nearest neighbor. The algorithm for the KNN classification model is given below:

Algorithm: KNearestNeighbors
Input:  Training data X
        Training data labels Y
        Sample x to classify
Output: Decision 𝑦𝑝 about sample x

for i ← 1 to m do
    Compute the distance between training sample 𝑋𝑖 and the unlabeled sample x, i.e. d(𝑋𝑖, x)
end for
Compute the set I containing the indices of the k smallest distances d(𝑋𝑖, x)
Set 𝑦𝑝 to the class label occurring most often among {𝑌𝑖 : i ∈ I}
return 𝑦𝑝
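The pseudocode above can be realized in a few lines of Python. This sketch assumes NumPy arrays, Euclidean distance as the similarity measure, and simple majority voting; the toy 2-D dataset is for illustration only:

```python
import numpy as np
from collections import Counter

def knn_classify(X, Y, x, k=3):
    """Classify sample x by majority vote among the labels of its
    k nearest training samples in X (Euclidean distance)."""
    dists = np.linalg.norm(X - x, axis=1)   # d(X_i, x) for all i
    nearest = np.argsort(dists)[:k]         # indices of the k smallest distances
    votes = Counter(Y[i] for i in nearest)
    return votes.most_common(1)[0][0]       # most frequent neighboring label

# Toy 2-D example with two well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [5.0, 5.0], [5.1, 4.9], [4.9, 5.2]])
Y = np.array([0, 0, 0, 1, 1, 1])
print(knn_classify(X, Y, np.array([0.15, 0.1]), k=3))  # -> 0
print(knn_classify(X, Y, np.array([5.05, 5.0]), k=3))  # -> 1
```

In the actual project, each row of X would be a feature vector extracted from one training recording.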

Naïve Bayes:

Naïve Bayes is a probabilistic non-linear classifier that classifies a candidate test vector based on Bayes' rule, where all the features within the feature vector are assumed to be statistically independent:

    𝑃(𝑤𝑖 | 𝑥) = 𝑃(𝑥 | 𝑤𝑖) · 𝑃(𝑤𝑖) / 𝑃(𝑥)


where 𝑃(𝑤𝑖 | 𝑥) is the posterior probability, i.e. the conditional probability of the class 𝑤𝑖 given the feature vector 𝑥; 𝑃(𝑥 | 𝑤𝑖) is the likelihood of the feature vector 𝑥 given the class 𝑤𝑖; 𝑃(𝑤𝑖) is the prior probability of class 𝑤𝑖; and 𝑃(𝑥) is the evidence of the feature vector 𝑥. Naïve Bayes can be used to classify both numerical and categorical test vectors. For classifying numerical test vectors, Naïve Bayes calculates the likelihood probabilities of all features for each class label through a Gaussian probability density function. The likelihood 𝑃(𝑥 | 𝑤𝑖) of a feature vector 𝑥 is then computed by taking the product of the likelihood probabilities of all its features. The Gaussian distribution can be computed through:

    𝑃(𝑥 | 𝜇, 𝜎) = (1 / √(2𝜋𝜎²)) · exp(−(𝑥 − 𝜇)² / (2𝜎²))
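The Gaussian likelihood above is straightforward to evaluate directly; as a sanity check, it can be compared against Python's standard-library `statistics.NormalDist` (the values of 𝑥, 𝜇 and 𝜎 below are arbitrary):

```python
import math
from statistics import NormalDist

def gaussian_pdf(x, mu, sigma):
    """Gaussian likelihood P(x | mu, sigma), as in the formula above."""
    return (math.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
            / math.sqrt(2 * math.pi * sigma ** 2))

# Cross-check against the standard library's normal distribution
p = gaussian_pdf(1.5, mu=1.0, sigma=0.5)
print(abs(p - NormalDist(1.0, 0.5).pdf(1.5)) < 1e-12)  # True
```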

where 𝜇 is the mean and 𝜎 is the standard deviation of the Gaussian distribution. The algorithm to implement the Bayesian classification model is given below:

Algorithm: NaïveBayes
Input:  Training data X
        Training data labels Y
        Sample x to classify
Output: Decision 𝑦𝑝 about sample x

Compute the prior probability ‘p1’ for the first class
Compute the prior probability ‘p2’ for the second class
Compute the likelihood of the test sample x against the first-class training samples through the Gaussian distribution
Compute the likelihood of the test sample x against the second-class training samples through the Gaussian distribution
Compute the posterior probability ‘P1’ by multiplying all first-class likelihood probabilities for x with each other and then with p1
Compute the posterior probability ‘P2’ by multiplying all second-class likelihood probabilities for x with each other and then with p2
Normalize P1 by dividing it by the evidence 𝑃(𝑥)
Normalize P2 by dividing it by the evidence 𝑃(𝑥)
if P1 ≥ P2
    𝑦𝑝 = 0
else
    𝑦𝑝 = 1
return 𝑦𝑝
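Putting the steps above together, here is a minimal two-class Gaussian Naïve Bayes sketch in Python. It assumes NumPy arrays with labels in {0, 1}; per-feature means and standard deviations are estimated from the training data, and the evidence 𝑃(𝑥) is taken as the sum of the two unnormalized posteriors:

```python
import numpy as np

def gaussian_nb_classify(X, Y, x):
    """Two-class Gaussian Naive Bayes following the pseudocode above.
    X: (m, n) training data; Y: labels in {0, 1}; x: (n,) test sample."""
    posts = []
    for c in (0, 1):
        Xc = X[Y == c]
        prior = len(Xc) / len(X)            # p1 or p2
        mu = Xc.mean(axis=0)
        sigma = Xc.std(axis=0, ddof=1)
        # Product of per-feature Gaussian likelihoods, times the prior
        like = np.prod(np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
                       / np.sqrt(2 * np.pi * sigma ** 2))
        posts.append(like * prior)
    p1 = posts[0] / sum(posts)              # normalize by the evidence
    p2 = posts[1] / sum(posts)
    return 0 if p1 >= p2 else 1

# Toy 2-D example with two well-separated clusters
X = np.array([[1.0, 2.0], [1.2, 1.8], [0.9, 2.1],
              [4.0, 6.0], [4.2, 5.8], [3.9, 6.3]])
Y = np.array([0, 0, 0, 1, 1, 1])
print(gaussian_nb_classify(X, Y, np.array([1.1, 2.0])))  # -> 0
print(gaussian_nb_classify(X, Y, np.array([4.1, 6.0])))  # -> 1
```

In practice the products of many small likelihoods can underflow, so a common refinement is to sum log-likelihoods instead of multiplying probabilities.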

Good Luck 😊

