1. Explain the concept of a perceptron and how it works for binary classification.
2. Describe the architecture of a multilayer perceptron (MLP) and how it differs from a perceptron.
3. Discuss the backpropagation algorithm and its role in training MLPs.
4. Explain the concept of support vector machines (SVM) and their use in linear classification.
5. Discuss kernel functions and their role in transforming data for non-linear SVM classification.
6. Explain the k-nearest neighbors (KNN) algorithm and how it works for classification.
7. Explain the hierarchical clustering algorithms AGNES and DIANA and their differences.
8. Discuss partitional clustering algorithms and explain the concept of K-Modes clustering.
9. Describe the self-organizing map (SOM) and its applications in unsupervised learning.
10. Explain Gaussian mixture models (GMM) and their use in modeling complex data distributions.
11. Discuss Principal Component Analysis (PCA) and Locally Linear Embedding (LLE) as dimensionality reduction techniques in unsupervised learning.
12. Explain combination schemes in ensemble learning and discuss the voting-based approach.
13. Describe the Error-Correcting Output Codes (ECOC) technique and its role in ensemble learning. C04
14. Discuss the concept of bagging and its implementation using Random Forest trees.
15. Explain the boosting technique, specifically AdaBoost, and its advantages in ensemble learning. C01
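As a study aid for question 1, here is a minimal pure-Python sketch of the perceptron learning rule for binary classification. The function names and toy data are illustrative, not from the source:

```python
def train_perceptron(X, y, lr=0.1, epochs=50):
    """Rosenblatt perceptron learning rule; labels must be in {-1, +1}."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            activation = sum(wj * xj for wj, xj in zip(w, xi)) + b
            if yi * activation <= 0:  # misclassified (or on the boundary): update
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b

def predict(x, w, b):
    # Sign of the learned linear decision function
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy linearly separable data: the class is the sign of the first feature
X = [(2.0, 1.0), (1.5, -0.5), (-1.0, 0.5), (-2.0, -1.0)]
y = [1, 1, -1, -1]
w, b = train_perceptron(X, y)
print([predict(xi, w, b) for xi in X])  # recovers the training labels
```

The key point for the exam answer: the weights change only on misclassified examples, and the rule converges in finitely many updates when the data are linearly separable.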
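For question 6, the KNN classification rule (distance computation, neighbor selection, majority vote) can be sketched in a few lines; the data and names below are made up for illustration:

```python
import math
from collections import Counter

def knn_classify(train, labels, query, k=3):
    # Euclidean distance from the query to every training point
    dists = [math.dist(query, p) for p in train]
    # Indices of the k nearest training points
    nearest = sorted(range(len(train)), key=dists.__getitem__)[:k]
    # Majority vote among the neighbours' labels
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

# Two well-separated groups of points
train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_classify(train, labels, (0.5, 0.5)))  # neighbours are all "a"
```

Note that KNN is a lazy learner: there is no training step, and all work happens at query time.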
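For question 8, a simplified K-Modes sketch shows how partitional clustering extends to categorical data: modes replace means and a simple matching dissimilarity replaces Euclidean distance. This is a teaching sketch with naive deterministic initialization (first k records), not the full algorithm with random or frequency-based init:

```python
from collections import Counter

def dissimilarity(a, b):
    # Simple matching distance: number of mismatched categorical attributes
    return sum(x != y for x, y in zip(a, b))

def k_modes(data, k, iters=10):
    # Naive init: take the first k records as the modes (assumed distinct here)
    modes = list(data[:k])
    clusters = []
    for _ in range(iters):
        # Assignment step: each record goes to its nearest mode
        clusters = [[] for _ in range(k)]
        for row in data:
            j = min(range(k), key=lambda m: dissimilarity(row, modes[m]))
            clusters[j].append(row)
        # Update step: new mode = attribute-wise most frequent category
        for j, cluster in enumerate(clusters):
            if cluster:
                modes[j] = tuple(Counter(col).most_common(1)[0][0]
                                 for col in zip(*cluster))
    return modes, clusters

data = [("red", "small", "round"), ("blue", "large", "square"),
        ("red", "small", "oval"), ("blue", "large", "round"),
        ("red", "medium", "round"), ("blue", "large", "square")]
modes, clusters = k_modes(data, k=2)
print(modes)  # one mode per colour group
```

The structure mirrors K-Means exactly (assign, then update, repeat), which is the comparison the question is asking for.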
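For question 15, a compact AdaBoost sketch with decision stumps as weak learners illustrates the two mechanisms the question targets: example re-weighting and weighted voting. All names and data are illustrative:

```python
import math

def stump_predict(x, feature, threshold, polarity):
    # Weak learner: predict +polarity if the feature is below the threshold
    return polarity if x[feature] < threshold else -polarity

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n          # uniform initial example weights
    learners = []              # (alpha, feature, threshold, polarity)
    for _ in range(rounds):
        best = None
        # Exhaustive search for the stump with lowest weighted error
        for f in range(len(X[0])):
            for t in sorted({x[f] for x in X}):
                for pol in (1, -1):
                    preds = [stump_predict(x, f, t, pol) for x in X]
                    err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                    if best is None or err < best[0]:
                        best = (err, f, t, pol, preds)
        err, f, t, pol, preds = best
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-10))
        learners.append((alpha, f, t, pol))
        # Re-weighting: misclassified examples gain weight for the next round
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        z = sum(w)
        w = [wi / z for wi in w]
    return learners

def predict(learners, x):
    # Weighted vote of all weak learners
    score = sum(a * stump_predict(x, f, t, pol) for a, f, t, pol in learners)
    return 1 if score >= 0 else -1

X = [(1,), (2,), (3,), (4,)]
y = [1, 1, -1, -1]
learners = adaboost(X, y, rounds=3)
print([predict(learners, x) for x in X])  # recovers the training labels
```

Contrasting this with bagging (question 14) is a natural exam point: bagging trains learners independently on bootstrap samples, while boosting trains them sequentially, each focusing on the previous rounds' mistakes.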