Chinese Journal of Applied Probability and Statistics, Apr. 2024, Vol. 40, No. 2, pp. 201-228
doi: 10.3969/j.issn.1001-4268.2024.02.001

Iterative Adaptive Robust Variable Selection in Nonparametric Additive Models

ZHU Nenghui (School of Mathematics and Statistics, Xiamen University of Technology, Xiamen, 361024, China)
YOU Jinhong (School of Statistics and Management, Shanghai University of Finance and Economics, Shanghai, 200433, China)
XU Qunfang (Business School, Ningbo University, Ningbo, 315211, China)

Abstract: By utilizing a robust loss function, B-spline approximation and the adaptive group Lasso, a nonparametric additive model is investigated to identify insignificant covariates in the "large p, small n" setting. Compared with the ordinary least-squares adaptive group Lasso, the proposed method is resistant to heavy-tailed errors or outliers in the responses. To facilitate presentation, a more general weighted robust group Lasso estimator is considered. Moreover, the weight vector plays a pivotal role in ensuring that the suggested estimators enjoy the model selection oracle property and asymptotic normality. The robust group Lasso and the adaptive robust group Lasso can be seen as special cases corresponding to different weight vectors. In practice, we use the robust group Lasso to obtain an initial estimator that reduces the dimension of the problem, and then apply the iterative adaptive robust group Lasso to select the nonzero components. Simulation studies show that the proposed methods work well with samples of moderate size. A high-dimensional gene TRIM32 data set is used to illustrate the application of the proposed method.

Keywords: adaptive group Lasso; high-dimensional data; nonparametric regression; oracle property; robust estimation

2020 Mathematics Subject Classification: primary 62G35; secondary 62A01

Citation: ZHU N H, YOU J H, XU Q F. Iterative adaptive robust variable selection in nonparametric additive models [J]. Chinese J Appl Probab Statist, 2024, 40(2): 201-228.

The project was supported by the National Natural Science Foundation of China (Grant No. 11971291), the National Social Science Foundation of China (Grant No. 19BTJ032), the Fujian Alliance of Mathematics (Grant No. 2023SXLMMS10) and the Scientific Research Climbing Program of Xiamen University of Technology (Grant No. XPDKT20037). Corresponding author, E-mail: . Received October 30, 2023. Revised January 9, 2024.

1 Introduction

Consider the following nonparametric additive model:

$$Y_i = \sum_{j=1}^{p} f_j(X_{ij}) + \varepsilon_i, \qquad (1)$$

where the $(Y_i, X_i)$ are independent and identically distributed (i.i.d.) with the same distribution as $(Y, X)$, $X = (X_1, X_2, \ldots, X_p)^{\mathrm{T}}$ is the $p$-dimensional predictor, $\varepsilon_i$ is the noise with mean zero, and the $f_j$'s are unknown functions. The statistical problem is to determine which additive components are insignificant (i.e., $f_j \equiv 0$) and to estimate the nonzero components under heavy-tailed errors or outliers in the responses. The challenge becomes more serious when the number of covariates diverges with the sample size.
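To make the selection problem concrete, data from model (1) with only a handful of nonzero components and heavy-tailed noise can be generated as follows. The particular component functions, dimensions, and t-distributed noise are illustrative choices of ours, not the paper's simulation settings:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 200            # "large p, small n": more covariates than observations
q = 4                      # number of truly nonzero components, held fixed

# Only the first q component functions are nonzero; all others are identically zero.
f = [lambda x: np.sin(2 * np.pi * x),          # each roughly centered over [0, 1]
     lambda x: x ** 2 - 1.0 / 3.0,
     lambda x: x - 0.5,
     lambda x: np.exp(x) - (np.e - 1.0)]

X = rng.uniform(0, 1, size=(n, p))
signal = sum(f[j](X[:, j]) for j in range(q))

# Heavy-tailed noise (Student t with 2 degrees of freedom): the regime in which
# least-squares selection degrades and a robust loss is called for.
eps = rng.standard_t(df=2, size=n)
Y = signal + eps
```

The goal is then to recover, from (Y, X) alone, the fact that only the first q components are nonzero.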
Therefore, we present a robust group Lasso penalization-based procedure to identify insignificant covariates in the "large p, small n" setting under heavy-tailed errors or outliers in the responses, and show that the proposed method correctly selects the nonzero components with high probability.

There is a large body of literature on penalized methods for variable selection and estimation with high-dimensional data. Penalized methods that have been proposed include the Lasso [1], the SCAD penalty [2, 3], MCP [4], etc. Particularly in high-dimensional settings, much progress has been made on the variable selection, estimation and prediction properties of the Lasso; see [4-19], among others. All these works assume a linear or other parametric model. In many applications, however, nonparametric models are more flexible than the linear model and can be fitted to high-dimensional data in cases where fully nonparametric models are infeasible due to the "curse of dimensionality".

As is well known, the additive model is a widely used nonparametric model. In practice, however, only some of the components are really nonparametric, and the other components contain predictors unrelated to the responses. For this reason, several independent works have shown that sparse additive models can be fitted successfully to various data sets of large dimension. Combining ideas from sparse linear modeling and additive nonparametric regression, Ravikumar et al. [20] showed that SpAM can be effective in fitting sparse nonparametric models to high-dimensional data. Meier et al. [11] proposed a new sparsity-smoothness penalty for high-dimensional generalized additive models. Using B-spline bases, Huang et al. [21] applied the group Lasso iteratively to the additive model and showed that the adaptive group Lasso can be used for high-dimensional nonparametric problems even though the initial group Lasso estimator cannot achieve √n-consistency, thus extending the results of [7]. Similarly to the ideas of [22] and [12], Lian et al. [23] proposed a double-penalization-based procedure to distinguish the covariates that enter the nonparametric and parametric parts and to identify insignificant covariates simultaneously in the "large p, small n" setting.

However, the estimators mentioned above are mostly built on least-squares (LS) type methods. LS-type methods may suffer from substantial bias when the error follows a heavy-tailed distribution or in the presence of outliers. To overcome this drawback, in this paper an iterative robust group Lasso is proposed for the nonparametric additive model, in which the number of additive components may be larger than the sample size. The idea behind our method is similar to those of [21] and [15]. We consider a nonparametric additive model in the "large p, small n" setting, where the number of nonzero f_j's is assumed fixed. As in [21], by utilizing B-splines to approximate the unknown functions f_j(x), the problem of component selection becomes that of selecting groups of coefficients in the expansion. In contrast to [21], however, we apply a robust loss function to the adaptive group Lasso penalty. Compared with the ordinary least-squares adaptive group Lasso [21], the adaptive robust group Lasso is resistant to heavy-tailed errors or outliers in the responses. Furthermore, to facilitate presentation, we consider more generally the weighted robust group Lasso (WR-Group Lasso) estimator with nonnegative weights d_1, d_2, ..., d_p. Moreover, we show that the choice of the weight vector d plays a pivotal role in ensuring that the WR-Group Lasso estimator enjoys the model selection oracle property and asymptotic normality. The robust group Lasso and the adaptive robust group Lasso can be seen as special cases of the weighted robust group Lasso obtained with different weights. In practice, we use the robust group Lasso to obtain an initial estimator that reduces the dimension of the problem, and then apply the iterative adaptive robust group Lasso to select the nonzero components. Under appropriate conditions, the iterative adaptive robust group Lasso possesses the oracle property, in the sense that it is as efficient as the estimator obtained when the true model is known prior to the statistical analysis.

The paper is organized as follows. In Section 2, we first propose a class of variable selection procedures for identifying the insignificant components via the iterative adaptive robust group Lasso penalized approach, and then study the statistical properties of the proposed procedures. In Section 3, simulation studies are carried out to assess the performance