Printed from https://ideas.repec.org/r/oup/biomet/v99y2012i4p879-898.html

Scaled sparse linear regression

Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


Cited by:

  1. Pierre Bellec & Alexandre Tsybakov, 2015. "Sharp oracle bounds for monotone and convex regression through aggregation," Working Papers 2015-04, Center for Research in Economics and Statistics.
  2. Alexandre Belloni & Victor Chernozhukov & Lie Wang, 2013. "Pivotal estimation via square-root lasso in nonparametric regression," CeMMAP working papers CWP62/13, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
  3. Anindya Bhadra & Jyotishka Datta & Nicholas G. Polson & Brandon T. Willard, 2020. "Global-Local Mixtures: A Unifying Framework," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 82(2), pages 426-447, August.
  4. Lasanthi C. R. Pelawa Watagoda & David J. Olive, 2021. "Comparing six shrinkage estimators with large sample theory and asymptotically optimal prediction intervals," Statistical Papers, Springer, vol. 62(5), pages 2407-2431, October.
  5. Laura Freijeiro‐González & Manuel Febrero‐Bande & Wenceslao González‐Manteiga, 2022. "A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates," International Statistical Review, International Statistical Institute, vol. 90(1), pages 118-145, April.
  6. Yimin Huang & Xiangshun Kong & Mingyao Ai, 2020. "Optimal designs in sparse linear models," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(2), pages 255-273, February.
  7. Beyhum, Jad, 2019. "Inference robust to outliers with L1‐norm penalization," TSE Working Papers 19-1032, Toulouse School of Economics (TSE).
  8. Seunghwan Lee & Sang Cheol Kim & Donghyeon Yu, 2023. "An efficient GPU-parallel coordinate descent algorithm for sparse precision matrix estimation via scaled lasso," Computational Statistics, Springer, vol. 38(1), pages 217-242, March.
  9. Kou Fujimori, 2019. "The Dantzig selector for a linear model of diffusion processes," Statistical Inference for Stochastic Processes, Springer, vol. 22(3), pages 475-498, October.
  10. Jad Beyhum, 2020. "Inference robust to outliers with L1‐norm penalization," Post-Print hal-03235868, HAL.
  11. Zemin Zheng & Jie Zhang & Yang Li, 2022. "L 0 -Regularized Learning for High-Dimensional Additive Hazards Regression," INFORMS Journal on Computing, INFORMS, vol. 34(5), pages 2762-2775, September.
  12. Pierre Bellec, 2015. "Optimal bounds for aggregation of affine estimators," Working Papers 2015-06, Center for Research in Economics and Statistics.
  13. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
  14. Victor Chernozhukov & Christian Hansen & Yuan Liao, 2015. "A lava attack on the recovery of sums of dense and sparse signals," CeMMAP working papers 56/15, Institute for Fiscal Studies.
  15. Xie, Jichun & Kang, Jian, 2017. "High-dimensional tests for functional networks of brain anatomic regions," Journal of Multivariate Analysis, Elsevier, vol. 156(C), pages 70-88.
  16. Peter Bühlmann & Jacopo Mandozzi, 2014. "High-dimensional variable screening and bias in subsequent inference, with an empirical comparison," Computational Statistics, Springer, vol. 29(3), pages 407-430, June.
  17. Breunig, Christoph & Mammen, Enno & Simoni, Anna, 2020. "Ill-posed estimation in high-dimensional models with instrumental variables," Journal of Econometrics, Elsevier, vol. 219(1), pages 171-200.
  18. Jacob Bien & Irina Gaynanova & Johannes Lederer & Christian L. Müller, 2019. "Prediction error bounds for linear regression with the TREX," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 28(2), pages 451-474, June.
  19. Zemin Zheng & Jinchi Lv & Wei Lin, 2021. "Nonsparse Learning with Latent Variables," Operations Research, INFORMS, vol. 69(1), pages 346-359, January.
  20. Chang, Jinyuan & Qiu, Yumou & Yao, Qiwei & Zou, Tao, 2018. "Confidence regions for entries of a large precision matrix," Journal of Econometrics, Elsevier, vol. 206(1), pages 57-82.
  21. Zheng, Zemin & Li, Yang & Yu, Chongxiu & Li, Gaorong, 2018. "Balanced estimation for high-dimensional measurement error models," Computational Statistics & Data Analysis, Elsevier, vol. 126(C), pages 78-91.
  22. Xin Wang & Lingchen Kong & Liqun Wang, 2022. "Estimation of Error Variance in Regularized Regression Models via Adaptive Lasso," Mathematics, MDPI, vol. 10(11), pages 1-19, June.
  23. Chang, Jinyuan & Qiu, Yumou & Yao, Qiwei & Zou, Tao, 2018. "Confidence regions for entries of a large precision matrix," LSE Research Online Documents on Economics 87513, London School of Economics and Political Science, LSE Library.
  24. Adel Javanmard & Jason D. Lee, 2020. "A flexible framework for hypothesis testing in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 82(3), pages 685-718, July.
  25. Yuyang Liu & Pengfei Pi & Shan Luo, 2023. "A semi-parametric approach to feature selection in high-dimensional linear regression models," Computational Statistics, Springer, vol. 38(2), pages 979-1000, June.
  26. Zehua Chen & Yiwei Jiang, 2020. "A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(1), pages 65-90, February.
  27. He, Yong & Zhang, Liang & Ji, Jiadong & Zhang, Xinsheng, 2019. "Robust feature screening for elliptical copula regression model," Journal of Multivariate Analysis, Elsevier, vol. 173(C), pages 568-582.
  28. Tom Boot & Didier Nibbering, 2017. "Inference in high-dimensional linear regression models," Tinbergen Institute Discussion Papers 17-032/III, Tinbergen Institute, revised 05 Jul 2017.
  29. Wang, Yihe & Zhao, Sihai Dave, 2021. "A nonparametric empirical Bayes approach to large-scale multivariate regression," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
  30. Guo, Zijian & Kang, Hyunseung & Cai, T. Tony & Small, Dylan S., 2018. "Testing endogeneity with high dimensional covariates," Journal of Econometrics, Elsevier, vol. 207(1), pages 175-187.
  31. Jana Janková & Rajen D. Shah & Peter Bühlmann & Richard J. Samworth, 2020. "Goodness‐of‐fit testing in high dimensional generalized linear models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 82(3), pages 773-795, July.
  32. Sai Li & T. Tony Cai & Hongzhe Li, 2022. "Transfer learning for high‐dimensional linear regression: Prediction, estimation and minimax optimality," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(1), pages 149-173, February.
  33. Qing Zhou & Seunghyun Min, 2017. "Uncertainty quantification under group sparsity," Biometrika, Biometrika Trust, vol. 104(3), pages 613-632.
  34. Lucas Janson & Rina Foygel Barber & Emmanuel Candès, 2017. "EigenPrism: inference for high dimensional signal-to-noise ratios," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(4), pages 1037-1065, September.
  35. Gueuning, Thomas & Claeskens, Gerda, 2016. "Confidence intervals for high-dimensional partially linear single-index models," Journal of Multivariate Analysis, Elsevier, vol. 149(C), pages 13-29.
  36. Zhou, Jia & Zheng, Zemin & Zhou, Huiting & Dong, Ruipeng, 2021. "Innovated scalable efficient inference for ultra-large graphical models," Statistics & Probability Letters, Elsevier, vol. 173(C).
  37. Bai, Ray & Ghosh, Malay, 2018. "High-dimensional multivariate posterior consistency under global–local shrinkage priors," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 157-170.
  38. Lan, Wei & Zhong, Ping-Shou & Li, Runze & Wang, Hansheng & Tsai, Chih-Ling, 2016. "Testing a single regression coefficient in high dimensional linear models," Journal of Econometrics, Elsevier, vol. 195(1), pages 154-168.
  39. Patrick L. Combettes & Christian L. Müller, 2021. "Regression Models for Compositional Data: General Log-Contrast Formulations, Proximal Optimization, and Microbiome Data Applications," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 13(2), pages 217-242, July.
  40. Zeyu Wu & Cheng Wang & Weidong Liu, 2023. "A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 75(4), pages 619-648, August.
  41. Antoine Bichat & Christophe Ambroise & Mahendra Mariadassou, 2022. "Hierarchical correction of p-values via an ultrametric tree running Ornstein-Uhlenbeck process," Computational Statistics, Springer, vol. 37(3), pages 995-1013, July.
  42. Sermpinis, Georgios & Tsoukas, Serafeim & Zhang, Ping, 2018. "Modelling market implied ratings using LASSO variable selection techniques," Journal of Empirical Finance, Elsevier, vol. 48(C), pages 19-35.
  43. Luo, Shan & Chen, Zehua, 2014. "Edge detection in sparse Gaussian graphical models," Computational Statistics & Data Analysis, Elsevier, vol. 70(C), pages 138-152.
  44. Qi Zhang, 2022. "High-Dimensional Mediation Analysis with Applications to Causal Gene Identification," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 14(3), pages 432-451, December.
  45. Wu, Jie & Zheng, Zemin & Li, Yang & Zhang, Yi, 2020. "Scalable interpretable learning for multi-response error-in-variables regression," Journal of Multivariate Analysis, Elsevier, vol. 179(C).
  46. Pun, Chi Seng & Hadimaja, Matthew Zakharia, 2021. "A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions," Computational Statistics & Data Analysis, Elsevier, vol. 155(C).
  47. Tung Duy Luu & Jalal Fadili & Christophe Chesneau, 2020. "Sharp oracle inequalities for low-complexity priors," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(2), pages 353-397, April.
  48. Georgios Sermpinis & Serafeim Tsoukas & Ping Zhang, 2019. "What influences a bank's decision to go public?," International Journal of Finance & Economics, John Wiley & Sons, Ltd., vol. 24(4), pages 1464-1485, October.
  49. Alexis Derumigny, 2017. "Improved bounds for Square-Root Lasso and Square-Root Slope," Working Papers 2017-53, Center for Research in Economics and Statistics.
  50. Tianxi Cai & T. Tony Cai & Zijian Guo, 2021. "Optimal statistical inference for individualized treatment effects in high‐dimensional models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(4), pages 669-719, September.
IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.