
Statistical Modeling and Learning

in Vision and Image Science

Stat 232A / CS 266A, MW 2:00-3:15 pm, January-March 2014, Bunche Hall 3143 (to be changed)


Course Description
This graduate-level course introduces the principles, theories, and algorithms for modeling complex patterns in very high-dimensional spaces, learning these statistical models from large data sets, and verifying them through stochastic sampling (synthesis). More specifically, we study two classes of statistical models:
1. Descriptive models (Markov random fields, Gibbs distributions, flat graphical models); and
2. Generative models (sparse coding, stochastic grammars, hierarchical graphical models).
Their integration leads to a general unified theory for pursuing statistical models over a series of probabilistic families. The course also teaches a common framework for conceptualizing stochastic patterns and for statistical knowledge representation. Although the lectures will mostly focus on visual patterns in images and videos, the methodology should be generally applicable to a wide range of applications, such as biological patterns, network traffic modeling, materials science, artificial intelligence, cognitive modeling, and autonomous robots.
The visual patterns that we will study include:
1. Spatial patterns: primitives (textures and textons), parts, objects, and scenes;
2. Temporal patterns: motion primitives, actions, events, and group activities; and
3. Causal patterns: fluents and actions, recursion.
We will study an information projection principle for learning these patterns and models in an unsupervised or weakly supervised way, as well as some theories of learning in the space of And-Or graphs (AOG).
In this course, our goal is to represent and interpret the data by probabilistic models without a specific task in mind. By doing so, we emphasize the model structure and dictionary. This is different from discriminative models/methods in machine learning, which often optimize the parameters to minimize some discriminative loss function (taught in Stat 231).

Prerequisites
Basic statistics, linear algebra, and programming skills for a project.
Knowledge of and experience with images will be a plus.

Textbook
There is no textbook. Teaching materials include tutorials, papers, and online reading materials.
Reference books:
Pattern Theory: The Stochastic Analysis of Real-World Signals, by Mumford and Desolneux, 2010.
Pattern Theory: From Representation to Inference, by Ulf Grenander and Michael Miller, 2007.

Instructor
Prof. Song-Chun Zhu, email: sczhu at stat.ucla.edu, 68693, BH 9404. Office hours: Monday 3:30-5:30 pm.
Teaching Assistant: Dan Xie, email: xiedan at g.ucla.edu, BH 9432. Office hours: Tuesday 3:00-5:00 pm.

Grading Plan
Two homework assignments: 15% (HW1 7.5%, HW2 7.5%)
Small projects and exercises: 40%
1. Natural image statistics and scale invariance [10%]
[write your own Matlab code] [download a natural image here]
2. Image inpainting by an MRF (Potts) model [10%]
[write your own Matlab or C code] [Original, Mask, Distorted images]
3. Sampling the Julesz texture ensemble [10%]
[project description] [C/Matlab code and images: download here]
4. Learning the Active Basis model and Hybrid Image Templates [10%]
[description] [code] [data]
Attendance and discussion: 10%
Final exam: 35%
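As a starting point for project 1, the gradient statistics and their behavior over scales can be sketched in a few lines. This is a minimal sketch in Python/NumPy (the course asks for your own Matlab or C code), and it uses a synthetic piecewise-constant image as a stand-in for the natural image linked above:

```python
import numpy as np

def kurtosis(x):
    """Excess kurtosis of a flattened response array (0 for a Gaussian)."""
    x = x - x.mean()
    return (x ** 4).mean() / (x ** 2).mean() ** 2 - 3.0

def gradient_kurtosis_over_scales(img, n_scales=3):
    """Kurtosis of horizontal differences at successive 2x downsamplings."""
    ks = []
    for _ in range(n_scales):
        dx = img[:, 1:] - img[:, :-1]          # horizontal gradient filter
        ks.append(kurtosis(dx))
        h, w = img.shape
        # 2x2 block-average downsampling to the next coarser scale
        img = img[: h - h % 2, : w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return ks

# Synthetic stand-in for the downloadable natural image: rows that are
# piecewise constant with sparse jumps, so gradients are heavy-tailed.
rng = np.random.default_rng(0)
img = (rng.random((256, 256)) < 0.02).astype(float).cumsum(axis=1)
ks = gradient_kurtosis_over_scales(img)
```

Scale invariance would show up as the kurtosis staying far above the Gaussian value of 0 across the downsampled scales; on a real natural image the histogram of `dx` also exhibits the sharp peak and heavy tails studied in Chapter 2.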

List of Topics
Chapter 1: Introduction to Knowledge Representation, Modeling, and Learning [ch1.pdf]
1. Towards a unified representation of commonsense knowledge
2. Compositionality, reconfigurability, functionality, and causality
3. Representation of concepts
4. Examples and demos

Chapter 2: Empirical statistical observations from image data [ch2.pdf]
1. Empirical observation I: filtered responses
(high kurtosis, generalized Gaussian, and Cauchy distributions)
2. Empirical observation II: scaling properties
(the 1/f power law, scale invariance of gradients, and entropy rate over scales)
3. Empirical observation III: patch frequency (structural and textural patches)
4. Empirical observation IV: information scaling and regimes of statistical models

Chapter 3: Descriptive models I: classical Markov random fields [ch3_fig.pdf]
1. Markov random field theory [read the Winkler chapter]
(neighborhoods, cliques, and potentials)
2. Gibbs fields: Ising and Potts models
3. The equivalence of Gibbs fields and MRFs (Hammersley-Clifford theorem)
4. Early Markov random field models for images
(weak membrane, regularization for shape-from-X, Mumford-Shah, total variation)
5. Maximum entropy and maximum likelihood estimation
6. Variations of the likelihood: pseudo-, patch, and partial likelihood
7. From Gibbs distributions to PDEs in image processing
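The Ising model of item 2 (the two-label Potts model used in project 2) can be sampled with a few lines of single-site Gibbs sampling. A minimal sketch in Python/NumPy, assuming a 4-neighbor lattice with free boundary conditions (the course projects ask for Matlab or C):

```python
import numpy as np

def gibbs_ising(size=32, beta=0.8, sweeps=50, seed=0):
    """Single-site Gibbs sampler for the Ising model (2-label Potts):
    the Gibbs field p(s) proportional to exp(beta * sum over neighbor
    pairs of s_i * s_j), with spins in {-1, +1}."""
    rng = np.random.default_rng(seed)
    s = rng.choice(np.array([-1, 1]), size=(size, size))
    for _ in range(sweeps):
        for i in range(size):
            for j in range(size):
                nb = 0                      # sum of the 4-neighbor clique spins
                if i > 0:
                    nb += s[i - 1, j]
                if i < size - 1:
                    nb += s[i + 1, j]
                if j > 0:
                    nb += s[i, j - 1]
                if j < size - 1:
                    nb += s[i, j + 1]
                # conditional of one site given its neighbors (Markov property)
                p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
                s[i, j] = 1 if rng.random() < p_plus else -1
    return s

sample = gibbs_ising()
agreement = np.mean(sample[:, 1:] == sample[:, :-1])  # neighbor agreement rate
```

With beta well above the critical coupling, neighboring sites agree far more often than chance, and the samples show the characteristic clustered patterns of the ordered phase; inpainting with a Potts prior conditions this same sampler on the observed pixels outside the mask.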

Chapter 4: Descriptive models II: advanced [ch4_fig.pdf]
1. Pythagorean theorem and information projection
2. Minimax entropy learning and feature pursuit
3. The Julesz ensemble
4. Ensemble equivalence theorem
5. Ensembles in statistical mechanics
6. Examples on general priors, shapes, curves, Gestalt fields, etc.
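The Pythagorean identity of item 1 can be stated compactly. This is a sketch in standard notation (the symbols f, q, p* are not taken from the course notes): let f be the data distribution, q a reference model, and Omega the family of distributions matching the feature statistics of f.

```latex
% Linear family matching the feature statistics \phi_k of the data f:
%   \Omega = \{\, p : \mathbb{E}_p[\phi_k] = \mathbb{E}_f[\phi_k],\ k = 1, \dots, K \,\}.
% The information projection p^{*} = \arg\min_{p \in \Omega} KL(p \,\|\, q)
% satisfies, for every p \in \Omega, the Pythagorean identity
KL(p \,\|\, q) \;=\; KL(p \,\|\, p^{*}) \;+\; KL(p^{*} \,\|\, q).
% Minimax entropy feature pursuit greedily adds the feature whose inclusion
% yields the largest gain KL(p^{*}_{+} \,\|\, p^{*}), i.e. the largest drop
% in the divergence remaining between the model and the data distribution.
```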

Chapter 5: Descriptive models III: variants* (optional) [ch5.pdf]
1. Causal Markov models, non-parametric methods
(texture models, expansion and compression)
2. Pseudo-likelihood and patch/partial likelihood
3. Mixed Markov fields (dynamic neighborhoods)

Chapter 6: Generative models I: classical models [ch67.pdf]
1. Frame theory and wavelets
(frames, tight frames, pseudo-inverse, compressive sensing, noiselets, random projection)
2. Design of frames: image pyramids
(Gaussian, Laplacian, Gabor, steerable)
3. Over-complete bases and matching pursuit
4. Markov trees and stochastic context-free grammars
5. Hierarchical Dirichlet Process (HDP)*
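Matching pursuit over an over-complete basis (item 3) admits a short sketch. This is a minimal Python/NumPy illustration; the random dictionary and the 3-atom signal are invented for the example:

```python
import numpy as np

def matching_pursuit(x, D, n_iter=10):
    """Greedy matching pursuit: approximate x as a sum of a few columns
    (atoms) of the over-complete dictionary D, whose columns are unit-norm."""
    r = x.astype(float).copy()          # residual signal
    idxs, coeffs = [], []
    for _ in range(n_iter):
        scores = D.T @ r                # correlation of residual with every atom
        k = int(np.argmax(np.abs(scores)))
        c = scores[k]
        r = r - c * D[:, k]             # peel off the best-matching component
        idxs.append(k)
        coeffs.append(c)
    return idxs, coeffs, r

# Toy example: 2x over-complete random dictionary, signal built from 3 atoms.
rng = np.random.default_rng(0)
D = rng.normal(size=(64, 128))
D /= np.linalg.norm(D, axis=0)          # normalize atoms to unit norm
x = 2.0 * D[:, 5] - 1.5 * D[:, 40] + 0.7 * D[:, 99]
idxs, coeffs, r = matching_pursuit(x, D, n_iter=15)
```

Because the dictionary is over-complete, the decomposition is not unique; matching pursuit trades optimality for a greedy one-atom-at-a-time selection, which is the same pursuit idea that reappears in the active basis model of Chapter 7.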

Chapter 7: Generative models II: advanced
1. Learning sparse coding from natural images
2. Tangram models and hierarchical tiling
3. Textons and image dictionaries
4. The active basis model
5. Faces

Chapter 8: Integrated models: descriptive + generative [Ch8.pdf]
1. The primal sketch model as a low/middle-level representation for generic images and video
2. Hybrid image templates for objects and scenes
3. An integrated language model (SCFG + bigram)
4. And-Or graphs for context-sensitive graph grammars
5. Dependency grammars and attributed grammars

Chapter 9: Learning compositional models
1. Unsupervised learning of And-Or graphs:
block pursuit and bi-clustering
2. Learning compositional sparsity models
3. Learning temporal And-Or graphs (event grammars) for actions and events
4. Learning causal And-Or graphs for perceptual causality
5. PAC learning:
how many examples are necessary for learning a compositional concept?
6. Curriculum learning for compositional models

Chapter 10: Advanced topics
1. Defining complex concepts
2. Task-oriented knowledge representation and learning