000 04160cam a2200361 a 4500
005 20150712005511.0
008 020125s2002 flua b 001 0 eng
010 _a 2002020214
020 _a1584881712 (acid-free paper)
040 _aDLC
_cDLC
_dDLC
050 0 0 _aQA278.2
_b.M56 2002
082 0 0 _a519.536
_221
084 _a519.536
_bM S
100 1 _aMiller, Alan J.
245 1 0 _aSubset selection in regression
_h[Book] /
_cAlan Miller.
250 _a2nd ed.
260 _aBoca Raton :
_bChapman & Hall/CRC,
_cc2002.
300 _axvii, 238 p. :
_bill. ;
_c24 cm.
440 0 _aMonographs on statistics and applied probability ;
_v95
504 _aIncludes bibliographical references (p. 223-234) and index.
505 8 _aMachine generated contents note: 1 Objectives -- 1.1 Prediction, explanation, elimination or what? -- 1.2 How many variables in the prediction formula? -- 1.3 Alternatives to using subsets -- 1.4 'Black box' use of best-subsets techniques -- 2 Least-squares computations -- 2.1 Using sums of squares and products matrices -- 2.2 Orthogonal reduction methods -- 2.3 Gauss-Jordan v. orthogonal reduction methods -- 2.4 Interpretation of projections -- Appendix A. Operation counts for all-subsets regression -- A.1 Garside's Gauss-Jordan algorithm -- A.2 Planar rotations and a Hamiltonian cycle -- A.3 Planar rotations and a binary sequence -- A.4 Fast planar rotations -- 3 Finding subsets which fit well -- 3.1 Objectives and limitations of this chapter -- 3.2 Forward selection -- 3.3 Efroymson's algorithm -- 3.4 Backward elimination -- 3.5 Sequential replacement algorithms -- 3.6 Replacing two variables at a time -- 3.7 Generating all subsets -- 3.8 Using branch-and-bound techniques -- 3.9 Grouping variables -- 3.10 Ridge regression and other alternatives -- 3.11 The nonnegative garrote and the lasso -- 3.12 Some examples -- 3.13 Conclusions and recommendations -- Appendix A. An algorithm for the lasso -- 4 Hypothesis testing -- 4.1 Is there any information in the remaining variables? -- 4.2 Is one subset better than another? -- 4.2.1 Applications of Spjotvoll's method -- 4.2.2 Using other confidence ellipsoids -- Appendix A. Spjotvoll's method - detailed description -- 5 When to stop? -- 5.1 What criterion should we use? -- 5.2 Prediction criteria -- 5.2.1 Mean squared errors of prediction (MSEP) -- 5.2.2 MSEP for the fixed model -- 5.2.3 MSEP for the random model -- 5.2.4 A simulation with random predictors -- 5.3 Cross-validation and the PRESS statistic -- 5.4 Bootstrapping -- 5.5 Likelihood and information-based stopping rules -- 5.5.1 Minimum description length (MDL) -- Appendix A. Approximate equivalence of stopping rules -- A.1 F-to-enter -- A.2 Adjusted R2 or Fisher's A-statistic -- A.3 Akaike's information criterion (AIC) -- 6 Estimation of regression coefficients -- 6.1 Selection bias -- 6.2 Choice between two variables -- 6.3 Selection bias and its reduction -- 6.3.1 Monte Carlo estimation of bias in forward selection -- 6.3.2 Shrinkage methods -- 6.3.3 Using the jack-knife -- 6.3.4 Independent data sets -- 6.4 Conditional likelihood estimation -- 6.5 Estimation of population means -- 6.6 Estimating least-squares projections -- Appendix A. Changing projections to equate sums of squares -- 7 Bayesian methods -- 7.1 Bayesian introduction -- 7.2 'Spike and slab' prior -- 7.3 Normal prior for regression coefficients -- 7.4 Model averaging -- 7.5 Picking the best model -- 8 Conclusions and some recommendations -- References -- Index.
650 0 _aRegression analysis.
650 0 _aLeast squares.
856 4 1 _3Table of contents only
_uhttp://www.loc.gov/catdir/toc/fy022/2002020214.html
856 4 2 _3Publisher description
_uhttp://www.loc.gov/catdir/enhancements/fy0646/2002020214-d.html
906 _a7
_bcbc
_corignew
_d1
_eocip
_f20
_gy-gencatlg
925 0 _aacquire
_b2 shelf copies
_xpolicy default
955 _akay 2002-01-25 to SSCD
_cjp20 2002-02-01 to subj
_djp99 2002-02-01
_ejp85 2002-02-04 to Dewey
_aaa07 2002-02-05
_aps16 2002-06-20 bk rec'd, to CIP ver.
_fpv12 2002-06-24 CIP ver. to BCCD
_aja15 2003-01-14 copy 2 to BCCD
001 0000046416
003 0000
942 _cBK
999 _c53400
_d53400