1. Give the definitions or your comprehension of the following terms. (12')
1.1 The inductive learning hypothesis (P17)
1.2 Overfitting (P49)
1.4 Consistent learner (P148)

2. Give brief answers to the following questions. (15')
2.2 If the size of a version space is |VS|, in general, what is the smallest number of queries that may be required by a concept learner using an optimal query strategy to perfectly learn the target concept? (P27)
2.3 In general, decision trees represent a disjunction of conjunctions of constraints on the attribute values of instances. What expression does the following decision tree correspond to?
[Figure: a decision tree with root Outlook (branches Sunny, Overcast, Rain); the Sunny branch tests Humidity (High, Normal) and the Rain branch tests Wind (Strong, Weak); leaves are Yes/No.]

3. Explain inductive bias, and list the inductive bias of the CANDIDATE-ELIMINATION algorithm, of decision tree learning (ID3), and of the BACKPROPAGATION algorithm. (10')

4. How can overfitting be handled in decision trees and in neural networks? (10')
Solution:
Decision tree: stop growing the tree early; post-pruning.
Neural network: weight decay; use of a validation set.

5. Prove that the LMS weight update rule performs a gradient descent to minimize the squared error. In particular, define the squared error E as in the text. Then calculate the derivative of E with respect to the weight wi, assuming that V̂ is a linear function as defined in the text. Gradient descent is achieved by updating each weight in proportion to −∂E/∂wi. Therefore, you must show that the LMS training rule alters weights in this proportion for each training example it encounters. (8')
Solution:
Since Vtrain(b) ← V̂(Successor(b)), the squared error over the training examples is

  E = Σ_⟨b, Vtrain(b)⟩ (Vtrain(b) − V̂(b))²

Because V̂(b) is a linear function of the weights, ∂V̂(b)/∂wi = xi, so

  ∂E/∂wi = Σ 2 (Vtrain(b) − V̂(b)) · (−xi) = −2 Σ (Vtrain(b) − V̂(b)) xi

As mentioned, the LMS rule updates each weight on each training example by

  wi ← wi + η (Vtrain(b) − V̂(b)) xi,  i.e.  Δwi = η (Vtrain(b) − V̂(b)) xi ∝ −∂E/∂wi

Therefore, gradient descent is achieved by updating each weight in proportion to −∂E/∂wi; the LMS rule alters weights in exactly this proportion for each training example it encounters.

6. True or false: if decision tree D2 is an elaboration of tree D1, then D1 is more-general-than D2. Assume D1 and D2 are decision trees representing arbitrary boolean functions, and that D2 is an elaboration of D1 if ID3 could extend D1 to D2. If true, give a proof; if false, a counterexample.
(Definition: Let hj and hk be boolean-valued functions defined over X. Then hj is more_general_than_or_equal_to hk (written hj ≥g hk) if and only if (∀x ∈ X)[(hk(x) = 1) → (hj(x) = 1)].) (10')
Solution:
The claim is false. One counterexample is A XOR B: when A ≠ B the training examples are all positive, and when A == B the training examples are all negative. Then, using ID3 to extend D1, the new tree D2 will be equivalent to D1, i.e., D2 is equal to D1.

7. Design a two-input perceptron that implements the boolean function A ∧ ¬B. Design a two-layer network of perceptrons that implements A XOR B. (10')

8. Suppose a hypothesis space contains three hypotheses h1, h2, and h3, and the posterior probabilities of these hypotheses given the training data are 0.4, 0.3, and 0.3 respectively. A new instance x is encountered, which is classified positive by h1 but negative by h2 and h3. Give the result and the detailed classification process of the Bayes optimal classifier. (10') (P125)

9. Suppose S is a collection of training-example days described by attributes including Humidity, which can have the values High or Normal. Assume S contains 10 examples, [7+, 3−]. Of these 10 examples, suppose 3 of the positive and 2 of the negative examples have Humidity = High, and the remainder have Humidity = Normal. Calculate the information gain due to sorting the original 10 examples by the attribute Humidity.
(log2 1 = 0, log2 2 = 1, log2 3 = 1.58, log2 4 = 2, log2 5 = 2.32, log2 6 = 2.58, log2 7 = 2.8, log2 8 = 3, log2 9 = 3.16, log2 10 = 3.32) (5')
Solution:
(a) Here we denote S = [7+, 3−]; then
  Entropy([7+, 3−]) = −(7/10) log2(7/10) − (3/10) log2(3/10) = 0.886
(b) Gain(S, Humidity): Values(Humidity) = {High, Normal}; S_High = [3+, 2−] (5 examples) and S_Normal = [4+, 1−] (5 examples). Thus
  Gain(S, Humidity) = 0.886 − (5/10) Entropy([3+, 2−]) − (5/10) Entropy([4+, 1−]) = 0.886 − 0.5 × 0.97 − 0.5 × 0.72 = 0.04

10. Finish the following algorithm. (10')
(1) GRADIENT-DESCENT(training_examples, η)
Each training example is a pair of the form ⟨x⃗, t⟩, where x⃗ is the vector of input values and t is the target output value. η is the learning rate (e.g., 0.05).
- Initialize each wi to some small random value
- Until the termination condition is met, Do
  - Initialize each Δwi to zero.
  - For each ⟨x⃗, t⟩ in training_examples, Do
    - Input the instance x⃗ to the unit and compute the output o
    - For each linear unit weight wi, Do
      ____________
  - For each linear unit weight wi, Do
    ____________
(2) FIND-S Algorithm
- Initialize h to the most specific hypothesis in H
- For each positive training instance x
  - For each attribute constraint ai in h
    - If ____________
      Then do nothing
    - Else replace ai in h by the next more general constraint that is satisfied by x
- Output hypothesis h
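The LMS/delta-rule derivation in question 5 (and the batch GRADIENT-DESCENT skeleton in question 10) can be sketched in plain Python. The linear unit, learning rate, and toy target function below are illustrative assumptions, not part of the exam:

```python
# Sketch of batch gradient descent for a linear unit (delta rule), per questions 5 and 10.
def train_linear_unit(examples, eta=0.05, epochs=1000):
    """examples: list of (x_vector, t) pairs. Returns learned weights [w0, w1, ..., wn]."""
    n = len(examples[0][0])
    w = [0.0] * (n + 1)                 # w[0] is the bias weight w0
    for _ in range(epochs):
        delta = [0.0] * (n + 1)         # initialize each Δwi to zero
        for x, t in examples:
            o = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))  # unit output
            delta[0] += eta * (t - o)                # Δw0 ← Δw0 + η(t − o)·1
            for i, xi in enumerate(x, 1):
                delta[i] += eta * (t - o) * xi       # Δwi ← Δwi + η(t − o)xi
        for i in range(n + 1):
            w[i] += delta[i]                         # wi ← wi + Δwi
    return w

# Toy, exactly linear target (assumed for illustration): t = 1 + 2*x1 - x2
data = [((1, 0), 3), ((0, 1), 0), ((1, 1), 2), ((0, 0), 1)]
w = train_linear_unit(data, eta=0.1, epochs=1000)
```

Because the toy data is exactly linear, the weights converge to (w0, w1, w2) = (1, 2, −1), illustrating that accumulating η(t − o)xi moves each weight in proportion to −∂E/∂wi.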
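For question 7, assuming the target functions are A ∧ ¬B and A XOR B (the standard textbook versions of this exercise), one valid weight assignment among many can be checked by enumerating all four inputs:

```python
# Threshold perceptrons for A AND (NOT B), and a two-layer network for XOR.
# The particular weights and thresholds are one illustrative choice, not unique.
def perceptron(weights, threshold, inputs):
    """Output 1 if w · x > threshold, else 0."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

def and_not_b(a, b):
    # Single perceptron: w = (1, -1), threshold 0.5 fires only for (a, b) = (1, 0)
    return perceptron((1, -1), 0.5, (a, b))

def xor(a, b):
    # Two-layer network: XOR = (A OR B) AND NOT (A AND B)
    h_or = perceptron((1, 1), 0.5, (a, b))       # hidden unit computing A OR B
    h_and = perceptron((1, 1), 1.5, (a, b))      # hidden unit computing A AND B
    return perceptron((1, -1), 0.5, (h_or, h_and))  # output: h_or AND NOT h_and
```

Enumerating (a, b) over {0, 1}² confirms and_not_b matches A ∧ ¬B and xor matches A XOR B; XOR needs the hidden layer because it is not linearly separable.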
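Question 8's Bayes optimal classification is a direct computation: the probability that x is positive is P(+|h1)P(h1|D) = 0.4, while the probability that it is negative is 0.3 + 0.3 = 0.6, so the classifier outputs negative even though the single most probable hypothesis h1 says positive. A minimal sketch:

```python
# Bayes optimal classifier for question 8: sum posterior mass behind each label.
posteriors = {"h1": 0.4, "h2": 0.3, "h3": 0.3}    # P(hi | D), from the question
predictions = {"h1": "+", "h2": "-", "h3": "-"}   # each hypothesis's vote on x

def bayes_optimal(posteriors, predictions):
    score = {}
    for h, p in posteriors.items():
        label = predictions[h]
        score[label] = score.get(label, 0.0) + p  # P(label | D) = Σ_h P(label | h) P(h | D)
    return max(score, key=score.get), score

label, score = bayes_optimal(posteriors, predictions)
# label == "-" with score {"+": 0.4, "-": 0.6}
```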
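The entropy and information-gain arithmetic in question 9 can be verified numerically. With exact logarithms the values are about 0.881 and 0.035; the exam's 0.886 and 0.04 come from the rounded log table it supplies:

```python
import math

def entropy(pos, neg):
    """Entropy in bits of a collection with pos positive and neg negative examples."""
    total = pos + neg
    e = 0.0
    for k in (pos, neg):
        if k:
            p = k / total
            e -= p * math.log2(p)
    return e

# S = [7+, 3-]; Humidity=High covers [3+, 2-], Humidity=Normal covers [4+, 1-]
e_s = entropy(7, 3)
gain = e_s - (5 / 10) * entropy(3, 2) - (5 / 10) * entropy(4, 1)
# e_s ≈ 0.881 and gain ≈ 0.035 with exact logs (0.886 and 0.04 with the table's rounding)
```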
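The FIND-S skeleton in question 10(2) can be sketched over attribute-vector hypotheses. The representation ("0" for the empty, most specific constraint and "?" for "any value") and the EnjoySport-style toy data are illustrative assumptions:

```python
# Sketch of FIND-S: start with the most specific hypothesis, generalize minimally
# on each positive example; negative examples are ignored.
def find_s(examples, n_attrs):
    h = ["0"] * n_attrs                    # most specific hypothesis: all-empty constraints
    for x, label in examples:
        if not label:
            continue                       # FIND-S uses only positive training instances
        for i in range(n_attrs):
            if h[i] == x[i] or h[i] == "?":
                pass                       # constraint already satisfied by x: do nothing
            elif h[i] == "0":
                h[i] = x[i]                # first positive example: adopt its value
            else:
                h[i] = "?"                 # conflicting values: next more general constraint
    return h

# Toy data (illustrative): attributes could be Sky, AirTemp, Humidity
examples = [
    (("Sunny", "Warm", "Normal"), True),
    (("Sunny", "Warm", "High"), True),
    (("Rainy", "Cold", "High"), False),
]
h = find_s(examples, 3)
# h == ["Sunny", "Warm", "?"]
```

The two positive examples agree on the first two attributes and differ on the third, so FIND-S keeps "Sunny" and "Warm" and generalizes the third constraint to "?".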