
Rangqin Mathematics (勷勤数学) Expert Lecture


Title: Graph-based Square-Root Estimation for Sparse Linear Regression


Speaker: Prof. Yunhai Xiao (Host: Yannan Chen)

             Henan University


Time: May 4, 10:00-11:00


Venue: Conference Room, 2nd Floor, West Building, School of Mathematical Sciences


Speaker Biography:

          Yunhai Xiao is a professor and doctoral supervisor at Henan University and a Distinguished Professor of Henan Province; his research focuses on statistical optimization. He received his Ph.D. from Hunan University in 2007 and completed postdoctoral research at Nanjing University and at National Cheng Kung University in Taiwan. He has made academic visits to institutions including Simon Fraser University in Canada, the National University of Singapore, The Hong Kong Polytechnic University, and National Cheng Kung University. He has served as principal investigator on four projects funded by the National Natural Science Foundation of China and one Henan Province Distinguished Young Scholars project, and has published more than 60 papers in journals such as MPC, COAP, JSC, OMS, and CSDA. He is a council member of the Operations Research Society of China and of the China Society for Industrial and Applied Mathematics, and a member of the Academic Committee of Henan University.


Abstract:

           Sparse linear regression is one of the classic problems in statistics, with deep connections to optimization, computation, and machine learning. To handle high-dimensional data effectively, accommodate the diversity of real-world noise, and sidestep the difficulty of estimating the noise standard deviation, we propose a novel and general graph-based square-root estimation (GSRE) model for sparse linear regression. Specifically, we use a square-root loss function so that the estimator does not depend on the unknown standard deviation of the error terms, and we design a sparse regularization term that exploits the graphical structure among predictors in a node-by-node form. For predictor graphs with special structures, we show that the proposed model reduces to several classic regression models, which highlights its generality. Theoretically, we establish finite-sample bounds, asymptotic normality, and model selection consistency of the GSRE method without relying on the standard deviation of the error terms. Computationally, we employ the fast and efficient alternating direction method of multipliers. Finally, through extensive experiments on simulated and real data with various types of noise, we demonstrate the advantages of the proposed method in estimation, prediction, and model selection.
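To make the square-root idea concrete, the following is an illustrative sketch only, not the GSRE formulation from the talk: it combines the standard square-root lasso objective with a generic node-by-node group penalty, where the weights \(w_j\) and the neighborhoods \(\mathcal{N}(j)\) of the predictor graph are assumed notation introduced here for illustration.

```latex
% Illustrative objective (assumed form, not the speaker's exact model):
% square-root loss plus a node-by-node graph penalty over predictors.
\min_{\beta \in \mathbb{R}^p}\;
  \frac{1}{\sqrt{n}} \,\bigl\| y - X\beta \bigr\|_2
  \;+\; \lambda \sum_{j=1}^{p} w_j \,
  \bigl\| \beta_{\{j\} \cup \mathcal{N}(j)} \bigr\|_2
```

Here \(\mathcal{N}(j)\) denotes the neighbors of predictor \(j\) in the given graph. When the graph has no edges and \(w_j \equiv 1\), the penalty collapses to \(\lambda \|\beta\|_1\) and the objective recovers the classic square-root lasso, consistent with the abstract's claim that special graph structures yield classic regression models as special cases.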


          Faculty members and students are welcome to attend and exchange ideas!