
DEPARTMENT of COMPUTER SCIENCE and INFORMATION TECHNOLOGIES

CEN 559 Machine Learning
2011-2012 Fall Term

Dr. Abdülhamit Subaşı
asubasi@ibu.edu.ba

Office Hour: Open Door Policy
Class Schedule: Monday 17:00-19:45

Course Objectives
• Present the key algorithms and theory that form the core of machine learning.
• Draw on concepts and results from many fields, including statistics, artificial intelligence, philosophy, information theory, biology, cognitive science, computational complexity, and control theory.

Textbooks
1. Du and Swamy, Neural Networks in a Softcomputing Framework, Springer-Verlag London Limited, 2006.
2. Sebe, Cohen, Garg and Huang, Machine Learning in Computer Vision, Springer, 2005.
3. Chow and Cho, Neural Networks and Computing, Imperial College Press, 2007.
4. Mitchell T., Machine Learning, McGraw Hill, 1997.
5. Hastie, Tibshirani, Friedman, The Elements of Statistical Learning, Second Edition, Springer, 2008.

Brief Contents
• Introduction
• Concept Learning
• Decision Tree Learning
• Artificial Neural Networks
• Evaluating Hypotheses
• Bayesian Learning
• Computational Learning Theory
• Reinforcement Learning

Grading
• Midterm Examination: 25%
• Research & Presentation: 25% (minimum 15-page Word document, related PPT, and a presentation)
• Final Examination: 50%

Research Topics:
• Linear Methods for Classification
• Linear Regression
• Logistic Regression
• Linear Discriminant Analysis
• Perceptron
• Kernel Smoothing Methods – Ref5
• Kernel Density Estimation and Classification (Naive Bayes)
• Mixture Models for Density Estimation and Classification
• Radial Basis Function Networks – Ref1
• Basis Function Networks for Classification – Ref3
• Advanced Radial Basis Function Networks – Ref3
• Fundamentals of Machine Learning and Softcomputing – Ref1
• Neural Networks – Ref5
• Multilayer Perceptrons – Ref1
• Hopfield Networks and Boltzmann Machines – Ref1
• SVM – Ref5
• KNN – Ref5
• Competitive Learning and Clustering – Ref1
• Unsupervised Learning (k-means) – Ref5
• Self-organizing Maps – Ref3

Research Topics:
• Principal Component Analysis Networks (PCA, ICA) – Ref1
• Fuzzy Logic and Neurofuzzy Systems – Ref1
• Evolutionary Algorithms and Evolving Neural Networks (PSO) – Ref1
• Discussion and Outlook (SVM, CNN, WNN) – Ref1
• Decision Tree Learning – Duda & Hart
• Random Forest – Ref5
• Probabilistic Classifiers – Ref2
• Semi-supervised Learning – Ref2
• Maximum Likelihood Minimum Entropy HMM – Ref2
• Margin Distribution Optimization – Ref2
• Learning the Structure of Bayesian Network Classifiers – Ref2
• Office Activity Recognition – Ref2
• Model Assessment and Selection – Ref5
• Cross-Validation
• Bootstrap Methods
• Performance (ROC, statistics)
• WEKA Machine Learning Tool
• TANGARA Machine Learning Tool
• ORANGE Machine Learning Tool
• NETICA Machine Learning Tool
• RAPID MINER Machine Learning Tool

What is Machine Learning?
Machine learning is the process in which a machine changes its structure, program, or data in response to external information in such a way that its expected future performance improves. Learning by machines can overlap with simpler processes, such as the addition of records to a database, but other cases are clear examples of what is called "learning," such as a speech recognition program improving after hearing samples of a person's speech.
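To make the definition concrete, here is a minimal sketch in Python: a toy program whose stored data changes in response to the words it "hears," so that its future guesses improve. The task, the function names, and the sample words are hypothetical choices for illustration only; they are not part of the course material.

# A minimal sketch of "performance improving with experience", assuming a toy
# word-prediction task; names and data are invented for this illustration.
from collections import Counter

def train(model: Counter, heard_words):
    """'Learning': the program changes its stored data in response to input."""
    model.update(heard_words)

def predict_next(model: Counter):
    """'Performance': guess the speaker's most frequent word so far."""
    return model.most_common(1)[0][0] if model else None

if __name__ == "__main__":
    model = Counter()
    train(model, ["hello", "there"])
    print(predict_next(model))            # little experience: a weak guess
    train(model, ["hello", "hello", "world", "hello"])
    print(predict_next(model))            # more samples: 'hello' now dominates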

Components of a Learning Agent
• Curiosity Element – problem generator; takes risks (makes problems) to learn from
• Learning Element – changes the future actions (the performance element) in accordance with the results from the performance analyzer
• Performance Element – chooses actions based on percepts
• Performance Analyzer – judges the effectiveness of the action, knows what the agent wants to achieve, and passes this information to the learning element
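A short sketch can show how these four components interact in a loop. The Python snippet below is a minimal illustration under assumed conditions: the two-action toy environment, the class names, and the learning and exploration rates are all hypothetical choices for this example, not a prescribed agent design.

# A minimal sketch of the four components, assuming a toy two-action
# environment; all class and variable names here are hypothetical.
import random

REWARDS = {"left": 0.2, "right": 0.8}   # hidden payoff of each action

class PerformanceElement:
    """Chooses actions based on what has been learned so far."""
    def __init__(self):
        self.values = {"left": 0.0, "right": 0.0}
    def choose(self):
        return max(self.values, key=self.values.get)

class PerformanceAnalyzer:
    """Judges how effective the chosen action was (here: the observed reward)."""
    def judge(self, action):
        return REWARDS[action] + random.gauss(0, 0.05)

class LearningElement:
    """Adjusts the performance element using the analyzer's feedback."""
    def update(self, performer, action, feedback, rate=0.3):
        old = performer.values[action]
        performer.values[action] = old + rate * (feedback - old)

class CuriosityElement:
    """Problem generator: occasionally proposes a risky, untried action."""
    def propose(self, chosen, epsilon=0.2):
        return random.choice(list(REWARDS)) if random.random() < epsilon else chosen

if __name__ == "__main__":
    performer, analyzer = PerformanceElement(), PerformanceAnalyzer()
    learner, curiosity = LearningElement(), CuriosityElement()
    for _ in range(100):
        action = curiosity.propose(performer.choose())   # act, with some exploration
        feedback = analyzer.judge(action)                # judge the action
        learner.update(performer, action, feedback)      # improve future behaviour
    print(performer.values)   # 'right' should end up valued higher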

Why is machine learning important? Or, why not just program a computer to know everything it needs to know already? Many programs or computer-controlled robots must be prepared to deal with things that their creator could not anticipate, such as game-playing programs, speech programs, electronic "learning" pets, and robotic explorers. Here, they would have access to a range of unpredictable knowledge and thus would benefit from being able to draw conclusions independently.

Relevance to AI
• Helps programs handle new situations based on the input and output from old ones
• Programs designed to adapt to humans will learn how to better interact
• Could potentially save bulky programming and attempts to make a program "foolproof"
• Makes nearly all programs more dynamic and more powerful while improving the efficiency of programming

Approaches to Machine Learning
• Boolean logic and resolution
• Evolutionary machine learning – many algorithms / neural networks are generated to solve a problem; the best ones survive
• Statistical learning
• Supervised learning – algorithm that models outputs from the input and expected output
• Unsupervised learning – algorithm that models outputs from the input; knows nothing about the expected results
• Reinforcement learning – algorithm that models outputs from observations
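The difference between the supervised and unsupervised settings can be illustrated with a small sketch in plain Python. The 1-nearest-neighbour rule, the one-dimensional k-means, and the toy data below are assumptions made for illustration, not the course's reference implementations.

# A minimal sketch contrasting two of the approaches above, using only the
# Python standard library; the toy data and function names are illustrative.
import random

def supervised_1nn(train, query):
    """Supervised learning: labelled (x, label) pairs are given, and the model
    predicts the label of the nearest training input."""
    x, label = min(train, key=lambda pair: abs(pair[0] - query))
    return label

def unsupervised_kmeans(points, k=2, steps=10):
    """Unsupervised learning: only inputs are given; k-means groups them
    without ever seeing an expected output."""
    centers = random.sample(points, k)
    for _ in range(steps):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

if __name__ == "__main__":
    labelled = [(1.0, "small"), (1.2, "small"), (8.9, "large"), (9.3, "large")]
    print(supervised_1nn(labelled, 1.5))            # -> 'small'
    unlabelled = [1.0, 1.2, 0.9, 8.9, 9.3, 9.0]
    print(sorted(unsupervised_kmeans(unlabelled)))  # two centers, near 1 and 9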

Current Machine Learning Research
Almost all types of AI are developing machine learning, since it makes programs dynamic. Examples:
• Facial recognition – machines learn through many trials what objects are and aren't faces
• Language processing – machines learn the rules of English through example; some AI chatterbots start with little linguistic knowledge but can be taught almost any language through extensive conversation with humans

Future of Machine Learning
• Gaming – opponents will be able to learn from the player's strategies and adapt to combat them
• Personalized gadgets – devices that adapt to their owner as he changes (gets older, gets different tastes, changes his modes)
• Exploration – machines will be able to explore environments unsuitable for humans and quickly adapt to strange properties

Problems in Machine Learning
• Learning by Example:
– Noise in example classification
– Correct knowledge representation
• Heuristic Learning:
– Incomplete knowledge base
– Continuous situations in which there is no absolute answer
• Case-based Reasoning:
– Human knowledge to computer representation

Problems in Machine Learning
• Grammar–meaning pairs → new rules must be relearned a number of times to gain "strength"
• Conceptual Clustering – definitions can be very complicated; not much predictive power

Successes in Research
• Aspects of daily life using machine learning:
– Optical character recognition
– Handwriting recognition
– Speech recognition
– Automated steering
– Assess credit card risk
– Filter news articles
– Refine information retrieval
– Data mining