
Roemer's blog


Introduction to One-class Support Vector Machines

by Roemer Vlasveld, Jul 12th, 2013, posted in change detection, classification, machine learning, matlab, novelty detection, support vector machine, svm

Traditionally, many classification problems try to solve the two- or multi-class situation. The goal of the machine learning application is to distinguish test data between a number of classes, using training data. But what if you only have data of one class and the goal is to test new data and find out whether it is alike or not alike the training data? A method for this task, which gained much popularity the last two decades, is the One-Class Support Vector Machine. This (quite lengthy) blog post will give an introduction to this technique and will show the two main approaches.

Just one class?
First, look at our problem situation: we would like to determine whether (new) test data is a member of a specific class, determined by our training data, or is not. Why would we want this? Imagine a factory type of setting: heavy machinery under constant surveillance of some advanced system. The task of the controlling system is to determine when something goes wrong: the products are below quality, the machine produces strange vibrations, or something like a temperature rises. It is relatively easy to gather training data of situations that are OK: it is just the normal production situation. But on the other side, collecting example data of a faulty system state can be rather expensive, or just impossible. If a faulty system state could be simulated, there is no way to guarantee that all the faulty states are simulated and thus recognized in a traditional two-class problem.

To cope with this problem, one-class classification problems (and solutions) are introduced. By just providing the normal training data, an algorithm creates a (representational) model of this data. If newly encountered data is too different, according to some measurement, from this model, it is labeled as out-of-class. We will look at the application of Support Vector Machines to this one-class problem.

Basic concepts of Support Vector Machines
Let us first take a look at the traditional two-class support vector machine. Consider a data set \(\{(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)\}\) of points \(x_i \in \mathbb{R}^d\) in a (for instance two-dimensional) space, where \(x_i\) is the \(i\)-th input data point and \(y_i \in \{-1, 1\}\) is the \(i\)-th output pattern, indicating the class membership.

A very nice property of SVMs is that they can create a non-linear decision boundary by projecting the data through a non-linear function \(\phi\) to a space with a higher dimension. This means that data points which can't be separated by a straight line in their original space \(I\) are lifted to a feature space \(F\) where there can be a straight hyperplane that separates the data points of one class from another. When that hyperplane is projected back to the input space \(I\), it has the form of a non-linear curve. The following video illustrates this process: the blue dots (in the white circle) cannot be linearly separated from the red dots. By using a polynomial kernel for projection (later more on that) all the dots are lifted into the third dimension, in which a hyperplane can be used for separation. When the intersection of the plane with the space is projected back to the two-dimensional space, a circular boundary arises.

The hyperplane is represented with the equation \(w^T x + b = 0\), with \(w \in F\) and \(b \in \mathbb{R}\). The hyperplane that is constructed determines the margin between the classes: all the data points for the class \(-1\) are on one side, and all the data points for class \(1\) on the other. The distance from the closest point of each class to the hyperplane is equal; thus the constructed hyperplane searches for the maximal margin (separating power) between the classes. To prevent the SVM classifier from overfitting with noisy data (or to create a soft margin), slack variables \(\xi_i\) are introduced to allow some data points to lie within the margin, and the constant \(C > 0\) determines the trade-off between maximizing the margin and the number of training data points within that margin (and thus training errors). The objective function of the SVM classifier is the following minimization formulation:
\[
\min_{w, b, \xi_i} \; \frac{\|w\|^2}{2} + C \sum_{i=1}^{n} \xi_i
\]

subject to:

\[
y_i (w^T \phi(x_i) + b) \geq 1 - \xi_i \quad \text{for all } i = 1, \ldots, n
\]
\[
\xi_i \geq 0 \quad \text{for all } i = 1, \ldots, n
\]
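To make the role of \(C\) concrete, here is a minimal sketch in Python with scikit-learn (the post itself works in Matlab/LibSVM; scikit-learn's `SVC` solves this same \(C\)-parameterized soft-margin problem, and the toy data below is purely illustrative):

```python
# Minimal two-class soft-margin SVM sketch (illustrative only; the post
# itself uses Matlab and LibSVM).
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
# Two well-separated Gaussian blobs, labeled -1 and +1.
X = np.vstack([rng.randn(40, 2) - 3, rng.randn(40, 2) + 3])
y = np.array([-1] * 40 + [1] * 40)

# C trades margin width against the penalty for slack variables.
clf = SVC(kernel="rbf", C=1.0, gamma=0.5).fit(X, y)

print(clf.predict([[-3, -3], [3, 3]]))  # one point deep inside each blob
```

Lowering `C` widens the margin and tolerates more training errors; raising it forces the boundary to fit the training data more tightly.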

When this minimization problem (with quadratic programming) is solved using Lagrange multipliers, it gets really interesting. The decision function (classification rule) for a data point \(x\) then becomes:

\[
f(x) = \operatorname{sgn}\left( \sum_{i=1}^{n} \alpha_i y_i K(x, x_i) + b \right)
\]

Here the \(\alpha_i\) are the Lagrange multipliers; every \(\alpha_i > 0\) is weighted in the decision function and thus "supports" the machine: hence the name Support Vector Machine. Since SVMs are considered to be sparse, there will be relatively few Lagrange multipliers with a non-zero value.
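That sparsity is easy to inspect: after training, only the support vectors carry a non-zero \(\alpha_i\), and the decision function can be recomputed from them by hand. A Python/scikit-learn sketch (the attribute names `dual_coef_`, which stores the products \(\alpha_i y_i\), `support_vectors_` and `intercept_` are scikit-learn's, not part of the formulas above):

```python
# Recompute f(x) = sgn(sum_i alpha_i y_i K(x, x_i) + b) from the
# support vectors only (illustrative sketch, not the post's Matlab code).
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(1)
X = np.vstack([rng.randn(50, 2) - 2, rng.randn(50, 2) + 2])
y = np.array([-1] * 50 + [1] * 50)

gamma = 0.5
clf = SVC(kernel="rbf", C=1.0, gamma=gamma).fit(X, y)

sv = clf.support_vectors_          # only the points with alpha_i > 0
coef = clf.dual_coef_.ravel()      # the products alpha_i * y_i

def decision(x):
    k = np.exp(-gamma * np.sum((sv - x) ** 2, axis=1))  # RBF kernel row
    return np.dot(coef, k) + clf.intercept_[0]

x_test = np.array([1.5, 1.5])
print(np.sign(decision(x_test)) == clf.predict([x_test])[0])  # True
print(len(sv) < len(X))  # sparse: far fewer support vectors than points
```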

Kernel Function
The function \(K(x, x_i) = \phi(x)^T \phi(x_i)\) is known as the kernel function. Since the outcome of the decision function only relies on the dot-product of the vectors in the feature space \(F\) (i.e. all the pairwise distances for the vectors), it is not necessary to perform an explicit projection to that space (as was done in the above video). As long as a function \(K\) has the same results, it can be used instead. This is known as the kernel trick and it is what gives SVMs such great power with non-linearly separable data points: the feature space \(F\) can be of unlimited dimension and thus the hyperplane separating the data can be very complex. In our calculations, though, we avoid that complexity.

Popular choices for the kernel function are linear, polynomial and sigmoidal, but mostly the Gaussian Radial Base Function is used:

\[
K(x, x') = \exp\left( - \frac{\|x - x'\|^2}{2\sigma^2} \right)
\]

where \(\sigma \in \mathbb{R}\) is a kernel parameter and \(\|x - x'\|\) is the dissimilarity measure.
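In code the Gaussian RBF kernel is a one-liner; a small Python/numpy sketch of the formula above:

```python
# Gaussian RBF kernel: K(x, x') = exp(-||x - x'||^2 / (2 * sigma^2)).
import numpy as np

def rbf_kernel(x, x2, sigma=1.0):
    diff = np.asarray(x, dtype=float) - np.asarray(x2, dtype=float)
    return np.exp(-np.sum(diff ** 2) / (2.0 * sigma ** 2))

x = np.array([0.0, 0.0])
print(rbf_kernel(x, x))           # identical points -> K = 1
print(rbf_kernel(x, [3.0, 4.0]))  # ||x - x'|| = 5 -> exp(-12.5), near 0
```

A larger \(\sigma\) makes the kernel decay more slowly with distance, which is exactly the "width" intuition discussed further below.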

With this set of formulas and concepts we are able to classify a set of data points into two classes with a non-linear decision function. But we are interested in the case of a single class of data. Roughly there are two different approaches, which we will discuss in the next two sections.

One-Class SVM according to Schölkopf
The Support Vector Method for Novelty Detection by Schölkopf et al. basically separates all the data points from the origin (in feature space \(F\)) and maximizes the distance from this hyperplane to the origin. This results in a binary function which captures regions in the input space where the probability density of the data lives. Thus the function returns \(+1\) in a "small" region (capturing the training data points) and \(-1\) elsewhere.

The quadratic programming minimization function is slightly different from the original stated above, but the similarity is still clear:

\[
\min_{w, \xi_i, \rho} \; \frac{1}{2}\|w\|^2 + \frac{1}{\nu n} \sum_{i=1}^{n} \xi_i - \rho
\]

subject to:

\[
(w \cdot \phi(x_i)) \geq \rho - \xi_i \quad \text{for all } i = 1, \ldots, n
\]
\[
\xi_i \geq 0 \quad \text{for all } i = 1, \ldots, n
\]

In the previous formulation the parameter \(C\) decided the smoothness. In this formula it is the parameter \(\nu\) that characterizes the solution:

1. it sets an upper bound on the fraction of outliers (training examples regarded out-of-class) and,
2. it is a lower bound on the fraction of training examples used as Support Vectors.

Due to the importance of this parameter, this approach is often referred to as \(\nu\)-SVM.
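scikit-learn's `OneClassSVM` implements this \(\nu\)-formulation of Schölkopf, so both properties of \(\nu\) are easy to observe in a Python sketch (illustrative only; the post itself uses Matlab/LibSVM, and all numbers below are arbitrary):

```python
# nu-SVM novelty detection sketch with scikit-learn's OneClassSVM,
# which implements Schoelkopf's one-class formulation.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
X_train = rng.randn(200, 2)            # "normal" training data only

# nu upper-bounds the fraction of training points treated as outliers.
clf = OneClassSVM(kernel="rbf", nu=0.1, gamma=0.5).fit(X_train)

train_outlier_frac = np.mean(clf.predict(X_train) == -1)
print(train_outlier_frac)              # roughly at most nu = 0.1

# A point near the data is in-class (+1); a far-away point is not (-1).
print(clf.predict([[0.0, 0.0], [6.0, 6.0]]))
```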

Again by using Lagrange techniques and using a kernel function for the dot-product calculations, the decision function becomes:

\[
f(x) = \operatorname{sgn}\left( (w \cdot \phi(x)) - \rho \right) = \operatorname{sgn}\left( \sum_{i=1}^{n} \alpha_i K(x, x_i) - \rho \right)
\]

This method thus creates a hyperplane, characterized by \(w\) and \(\rho\), which has maximal distance from the origin in feature space \(F\) and separates all the data points from the origin. Another method is to create a circumscribing hypersphere around the data in feature space. The following section will show that approach.

One-Class SVM according to Tax and Duin
The method of Support Vector Data Description by Tax and Duin (SVDD) takes a spherical, instead of planar, approach. The algorithm obtains a spherical boundary, in feature space, around the data. The volume of this hypersphere is minimized, to minimize the effect of incorporating outliers in the solution.

The resulting hypersphere is characterized by a center \(a\) and a radius \(R > 0\) as the distance from the center to (any support vector on) the boundary, of which the volume \(R^2\) will be minimized. The center \(a\) is a linear combination of the support vectors (that are the training data points for which the Lagrange multiplier is non-zero). Just as in the traditional formulation, it could be required that all the distances from data points \(x_i\) to the center are strictly less than \(R\), but to create a soft margin again slack variables \(\xi_i\) with penalty parameter \(C\) are used. The minimization problem then becomes:
\[
\min_{R, a} \; R^2 + C \sum_{i=1}^{n} \xi_i
\]

subject to:

\[
\|x_i - a\|^2 \leq R^2 + \xi_i \quad \text{for all } i = 1, \ldots, n
\]
\[
\xi_i \geq 0 \quad \text{for all } i = 1, \ldots, n
\]

After solving this by introducing Lagrange multipliers \(\alpha_i\), a new data point \(z\) can be tested to be in or out of class. It is considered in-class when the distance to the center is smaller than or equal to the radius. Using the Gaussian kernel as a distance function over two data points, the test becomes:

\[
\sum_{i=1}^{n} \alpha_i \exp\left( - \frac{\|z - x_i\|^2}{\sigma^2} \right) \geq -R^2/2 + C_R
\]

where \(C_R\) collects the terms that do not depend on \(z\).
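Once the \(\alpha_i\) are known, this in/out-of-class test needs only kernel evaluations. A Python/numpy sketch with hypothetical support vectors and multipliers (in practice these come from solving the quadratic program above; here they are made up just to show the distance computation):

```python
# SVDD membership test sketch: squared feature-space distance to the
# center a = sum_i alpha_i phi(x_i), using a Gaussian kernel (K(z,z)=1).
# The support vectors and alphas below are HYPOTHETICAL, not a trained model.
import numpy as np

def rbf(a, b, sigma=1.0):
    return np.exp(-np.sum((a - b) ** 2, axis=-1) / sigma ** 2)

sv = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # support vectors
alpha = np.array([0.4, 0.3, 0.3])                     # alpha_i >= 0, sum = 1
sigma = 1.0

def dist2_to_center(z):
    K_zx = rbf(sv, z, sigma)                            # K(z, x_i)
    K_xx = rbf(sv[:, None, :], sv[None, :, :], sigma)   # K(x_i, x_j)
    return 1.0 - 2.0 * alpha @ K_zx + alpha @ K_xx @ alpha

# Radius^2 is the distance from the center to a support vector on the boundary.
R2 = dist2_to_center(np.array([1.0, 0.0]))
print(dist2_to_center(np.array([0.3, 0.3])) <= R2)   # True: in class
print(dist2_to_center(np.array([5.0, 5.0])) <= R2)   # False: out of class
```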

You can see the similarity between the traditional two-class method and the algorithms by Schölkopf and by Tax and Duin. So far the theoretical fundamentals of Support Vector Machines. Let's take a very quick look at some applications of this method.

Applications (in Matlab)
A very good and much used library for SVM classification is LibSVM, which can be used with Matlab. Out of the box it supports one-class SVM following the method of Schölkopf. Also available in the LibSVM tools is a method for SVDD, following the algorithm of Tax and Duin.

To give a nice visual clarification of how the kernel mapping (to feature space \(F\)) works, I created a small Matlab script that lets you create two data sets, red and blue dots (note: this simulates a two-class example). After clicking, you are able to inspect the data after being projected to the three-dimensional space. The data will then result in a shape like the following image.
% Demo to visualize the mapping with a Gaussian Radial Basis Function,
% especially in the context of Support Vector Machines.
%
% When this script is executed, first a collection of red points can be
% clicked on the graph.
% After that, the blue points can be generated.
% Then the user must provide a sigma value.
% The final graph can be rotated to inspect the 3D space (use the "turn"
% icon next to the hand in the toolbar).
%
% Created by Roemer Vlasveld (roemer.vlasveld@gmail.com)
%
% The blog post where this is used:
% http://rvlasveld.github.io/blog/2013/07/12/introduction-to-one-class-support-vector-machine
%
% Please feel free to use this script to your need.

figure
axis([-10 10 -10 10])
hold on
grid on
% Initially, the lists of points are empty.
red = [];
blue = [];

% Loop, picking up the points for the red class.
disp(' ')
disp('Click in the graph for the red points, e.g. in a wide circular form')
disp('Left mouse button picks points.')
disp('Right mouse button picks last point.')
but = 1;
n = 0;
while but == 1
    [xi, yi, but] = ginput(1);
    plot(xi, yi, 'ro')
    n = n + 1;
    red(:, n) = [xi; yi];
end

disp('Finished collecting red points')
disp(' ')

% Loop again, picking up the points for the blue class.
disp('Now click in the graph for the blue points, e.g. in a smaller circular form')
disp('Left mouse button picks points.')
disp('Right mouse button picks last point.')
but = 1;
n = 0;
while but == 1
    [xi, yi, but] = ginput(1);
    plot(xi, yi, 'bo')
    n = n + 1;
    blue(:, n) = [xi; yi];
end

disp('Finished collecting blue points')
disp(' ')

sigma = input('sigma = ? (default value: 1): ');
if isempty(sigma)
    sigma = 1;
end

% Map every point to the sum of its RBF-kernel values with all points in
% the same set: the z-value expresses nearness to the other points.
project = @(data, sigma) sum(exp(-squareform(pdist(data, 'euclidean')).^2 ./ (2 * sigma^2)), 2);

blue_z = project(blue', sigma);
red_z = project(red', sigma);

clf
hold on
grid on
scatter3(red(1,:), red(2,:), red_z, 'r')
scatter3(blue(1,:), blue(2,:), blue_z, 'b')

(visualize_projection.m)

Application to change detection
As a conclusion to this post I will give a look at the perspective from which I am using one-class SVMs in my current research for my master thesis (which is performed at the Dutch research company DoBots). My goal is to detect change points in time series data, also known as novelty detection. One-class SVMs have already been applied to novelty detection for time series data. I will apply it specifically to accelerometer data, collected by smartphone sensors. My theory is that when the change points in the time series are explicitly discovered, representing changes in the activity performed by the user, the classification algorithms should perform better. Probably in a next post I will take a further look at an algorithm for novelty detection using one-class Support Vector Machines.
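As a toy illustration of that idea (emphatically not the thesis implementation): train a one-class SVM on windowed features of a first activity and flag later windows as novel. The window length, the mean/std features, and all parameters below are arbitrary illustrative choices, in Python rather than Matlab:

```python
# Toy change detection sketch: one-class SVM on windowed signal features.
# Everything here (features, window width, parameters) is illustrative.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
# Fake "accelerometer" stream: a calm activity followed by a vigorous one.
signal = np.concatenate([rng.normal(0, 0.3, 400),    # activity A
                         rng.normal(0, 2.0, 400)])   # activity B

def window_features(x, width=40):
    # Mean and standard deviation per non-overlapping window.
    w = x[: len(x) // width * width].reshape(-1, width)
    return np.column_stack([w.mean(axis=1), w.std(axis=1)])

feats = window_features(signal)               # 20 windows, 2 features each
clf = OneClassSVM(kernel="rbf", nu=0.1, gamma=1.0).fit(feats[:10])

labels = clf.predict(feats)                   # +1 = like activity A
print(labels[:10])    # mostly +1: the training activity
print(labels[10:])    # mostly -1: novelty, i.e. a change point occurred
```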

Update: GitHub repository
Currently I am using the SVDD method by Tax and Duin to implement change detection and temporal segmentation for accelerometer data. I am using the Matlab dd_tools package, created by Tax, for the incremental version of SVDD. You can use my implementation and fork it from the oc_svm GitHub repository. Most of the functions are documented, but it is under heavy development and thus the precise workings differ from time to time. I am planning to write a good readme, but if you are interested I advise you to look at the apply_inc_svdd.m file, which creates the SVM classifier and extracts properties from the constructed model.


Comments
RazaRahi, 4 months ago
Please tell me the code in Matlab for how margin sampling is used in the SVM classifier.

DhiVya, 6 months ago
I need results from k-means, LOF and SVDD from dd_tools.

RoyaAliakbari, 10 months ago
Dear Mr. Vlasveld,

I'm working on an article that is related to SVDD. In this article the authors used libsvm for the implementation. They changed the parameter Nu and drew some diagrams, but in the SVDD (Tax and Duin) optimization problem there isn't any Nu parameter. I'm rather confused! What is Nu? Is Nu the same as C?

sanj22, a month ago
I have 4 different categories of images and want to classify an image into one of these 4. For each of the categories I have around 50 images that can be used as templates to compare. I want to use the SVM method for this but have no idea how to generate an SVM module for this. I would be extremely thankful if someone can help me out with this ASAP. I am using Matlab.

DhiVya, 6 months ago
How do I insert my data set in dd_tools?

RangoHu, a year ago
It's very helpful. I am wondering, does a one-class SVM still work when the two classes are very biased? I mean, the data I provided to the one-class SVM comes from one class, but this class is the minority. Can a one-class SVM still detect the data from the other, majority class? Thanks.

RoemerVlasveld (Mod), in reply to RangoHu, a year ago
Hi Rango, nice to hear you find it helpful.

Currently I have no mathematical or literature backing, but my experience (and memory of the papers I have read) says that a strong bias does not matter. The amount by which the two classes differ in their properties, and thus the values of the "minority" class relative to the "major" class, is more important.

You can easily create a test case with the dd_tools mentioned in the update of my post (if you are familiar with Matlab). Otherwise, the LibSVM SVDD extension should also work for any platform you use.

RangoHu, in reply to RoemerVlasveld, a year ago
Interesting, I will try it out. Thanks. BTW, if you only have data from one class, when the one-class SVM tells you a newly incoming record doesn't belong to the same class, then how can you tell whether this result is right or wrong?

gprakash66, a year ago
Can you please advise how I can perform one-class classification using libsvm? What commands should I use?

gprakash66, a year ago
Your post is very interesting. I am also doing one-class SVM using libsvm for accelerometer data.

see, in reply to gprakash66, 6 months ago
Are you done with this SVM implementation? If yes, do contact me, I have a query.

Sepi, a year ago
Very useful article, enjoyed it very much, as well as the comments.

I'm also using OC-SVM for anomaly detection in data received from IMU sensors in mobile devices. Looking forward to seeing how your research turns out. What area is your research focusing on? Are you using raw data or are you going to use STD and so on?

huchenlong, a year ago
Can you give me some advice about how to set the labels of the training set in libsvm? Because all the predicted labels are "1" when I set the training label to 1.

huchenlong, a year ago
Dear Roemer Vlasveld, can I ask you how to perform one-class SVM? Did you perform it with libSVM?

RoemerVlasveld (Mod), in reply to huchenlong, a year ago
I tried the libSVM implementation of the SVDD algorithm, which you can find here: http://www.csie.ntu.edu.tw/~cj...

If you use Matlab, I would advise you to use dd_tools, which is created by Tax himself (the creator of the SVDD algorithm), which you can find here: http://prlab.tudelft.nl/david...

libSVM is available for a lot of languages and platforms, dd_tools only for Matlab.

clh, in reply to RoemerVlasveld, a year ago
Did you ever perform the one-class SVM of Schölkopf et al. with libSVM? All the predicted labels are "1", which is very strange.
RoemerVlasveld (Mod), in reply to clh, a year ago
Can you post your code somewhere? A GitHub gist maybe?

clh, in reply to RoemerVlasveld, a year ago
My code is so simple (using Matlab with libSVM):

tr_data = oc_set(gendatb([50, 0]), 1);
tr_label = ones(50, 1);
te_data = oc_set(gendatb([50, 50]), 1);
te_label = [ones(50, 1); -ones(50, 1)];

model = svmtrain(tr_label, tr_data, '-s 2 -n 0.05 -g 1');

[pr_label, accuracy] = svmpredict(te_label, te_data, model);

RoemerVlasveld (Mod), in reply to clh, a year ago
I'm sorry, I haven't tried the Schölkopf method in Matlab/libSVM. Maybe you can search http://stats.stackexchange.com... for an example, or otherwise you can post your question there.

CHENLONG, a year ago
I am confused by the primal formulation of the one-class SVM in the paper "Estimating the Support of a High-Dimensional Distribution": why did you leave minus rho (ρ) in the primal minimization formula? Does this have some geometric meaning?

RoemerVlasveld (Mod), in reply to CHENLONG, a year ago
First of all, to get things straight, I am not the author of "Estimating the Support of a High-Dimensional Distribution"; that is Schölkopf. It is better if you direct your question directly to him.

I do have an idea why the rho is in the minimization function: the w and rho (ρ) together form the solution to the separation problem. In that sense, w determines the normal (or: orientation) of the plane, relative to the origin. The parameter rho (ρ) determines the distance from the origin to that plane. So there is a geometric relation. Also, the values of the slack variables are related to rho in the sense that it determines the amount of outlier distance acceptable for the solution.

Edmund, 2 years ago
How do I quantify the parameters C and sigma to make a 'good' boundary? It seems like the SVM is classifying at random!
RoemerVlasveld (Mod), in reply to Edmund, 2 years ago
Hi Edmund,

The setting of the parameters is indeed sometimes tricky. In the case of SVDD, the following (seems to) hold:

The C parameter determines the influence of data points on the calculation of the center and radius of the circle. It holds that every \alpha_i is bounded by C, and when \alpha_i equals C that corresponding data point is an outlier.

This means that a high value of C (e.g. 0.95) means that 95% of the data will be regarded as "in class" and only 5% as outliers. The radius (and boundary) that is constructed around the data will thus be quite large, to incorporate more data points.

The value of sigma sets the "width" of the Gaussian kernel. This means that a small value creates only "narrow" peaks in the landscape, and thus a tight boundary around the data, whilst large values for sigma create a wide boundary.

These parameters are discussed in the PhD thesis of David Tax; you can view it online here: http://repository.tudelft.nl/v...

Especially chapter 2 is a good read on SVDD. On page 34 (fig 2.6) a few graphs are plotted that differ in the value of sigma.

I hope this was a helpful answer. Feel free to ask more.

Edmund, in reply to RoemerVlasveld, 2 years ago
Hi Roemer, thanks for answering. But what about SVM in general? So C applies to the data points after the dimensional transformation while sigma applies only during the transformation?

RoemerVlasveld (Mod), in reply to Edmund, 2 years ago
I think you could state that. The C parameter applies only to "soft margin" SVMs: it defines how flexibly the system handles data points on the "wrong" side of the border and sets the "amount of penalty" for the values of the slack variables.

The sigma is a parameter of the kernel method, and indeed determines the transformation. In this case an RBF kernel is used, so the sigma applies to that. With a linear or polynomial kernel you would have another (set of) parameter(s).

Update: If you read chapter 2 of the PhD thesis, pages 37 and 38 make the relation of C and sigma to the model clear. In the case of OC-SVM a high value of C (near 1.0) restricts the number of outliers, so all data points will be included. It sets the fraction of outliers. Sigma sets the shape of the model: a low value will result in many objects acting as support vectors. A high value will use a few data points as support vectors, resulting in an (almost) rigid sphere in the hyperspace.

The values of sigma are considered "low" and "high" in relation to the minimum and maximum inter-object distance of the data points.

DeepakRoyChittajallu, 2 years ago
Nice post. Not to be picky, but there is a misplaced parenthesis in the first inequality constraint (set) in the two-class SVM formulation. It must instead be y_i (w^T \phi(x_i) + b) \geq 1 - \xi_i.

RoemerVlasveld (Mod), in reply to DeepakRoyChittajallu, 2 years ago
Thank you for noticing! Fixed it in the post.

DeepakRoyChittajallu, in reply to RoemerVlasveld, 2 years ago
You're welcome. One thing always puzzled me: what is the intuition behind separating the data from the origin in Schölkopf's one-class SVM? The optimization is quite straightforward, but I didn't quite digest the intuition behind that statement. Why the origin? Sounds a bit absurd to me. On the contrary, the Tax and Duin one-class SVM is quite intuitive: they are trying to find a tight hyperball enclosing the data "after mapping the data to a high-dimensional space via the kernel trick", which will provide tightly enclosing hypersurfaces of arbitrary shapes in the original space of the data. In two-class SVM, even though the original decision boundary is non-linear, we expect to find a linear decision boundary after projecting to a very high-dimensional space (via the kernel), and likewise in the one-class (Tax and Duin) case, even though the tightly enclosing hypersurface is non-spherical, when we project to a very high-dimensional space it's intuitive to look for a sphere enclosing the data. All these statements are quite intuitive, right? Now, what's the intuition behind Schölkopf's one-class SVM? Separating from the origin seems an absurd thought to me. I wonder what is your take on this?

RoemerVlasveld (Mod), in reply to DeepakRoyChittajallu, 2 years ago
That is a good question, Deepak Roy Chittajallu. My intuition about the separating hyperplane of Schölkopf is the following.

The (Gaussian RBF) kernel function can be seen as a measurement of nearness. In both visual examples (animated gif) we see that points that are closer to others are mapped "higher" in the z-dimension (in the gif the blue points are closer to each other and are thus mapped higher).

The mapping consists of a sum over the kernel function, which measures the nearness of the to-be-mapped point to all the other data points. The closer, the higher the sum is.

When all the data points are mapped to the feature space, one can construct a "lower bound" on all the data points in the form of a hyperplane. For the most precise lower bound, the distance from the origin to the hyperplane is maximized. All the data points "above" the plane are close enough to each other and are thus in-class. For new data points which are mapped below the hyperplane, the distance to other points in the original space is too large: they are out-of-class.

In the gif you can visualize that below all the blue points a plane is drawn: it must be as close as possible to the lowest blue point (thus maximal distance from the origin), and all the points above it belong to the blue class.

I hope this intuition is correct and makes the concept clear for you.

me, 2 years ago
Have a look at this article (sorry for the formatting, I have no idea how to remove formatting):

Local outlier detection reconsidered: a generalized view on locality with applications to spatial, video, and network outlier detection
Erich Schubert, Arthur Zimek, Hans-Peter Kriegel
Data Mining and Knowledge Discovery, December 2012
DOI 10.1007/s10618-012-0300-z
http://link.springer.com/artic...

The authors show how to generalize existing anomaly detection methods to different domains, including data streams. They demonstrate the process on a video stream, and show how an existing vector space method can be used to detect changes in a video feed. This may be exactly what you need to detect changes in movement patterns.


Roemer's blog. Copyright © 2014 Roemer Vlasveld. Powered by Octopress. Design inspired by Adrian Artiles.