
Random Forests for Genetic Association Studies

Authors

Listed:
  • Goldstein, Benjamin A. (Stanford University)
  • Polley, Eric C. (National Institutes of Health)
  • Briggs, Farren B. S. (University of California, Berkeley)

Abstract

The Random Forests (RF) algorithm has become a commonly used machine learning method in genetic association studies. It is well suited to genetic applications because it is computationally efficient and models genetic causal mechanisms well. As its use has grown, however, its application in the literature has been inconsistent and often suboptimal. The purpose of this review is to break down the theoretical and statistical basis of RF so that practitioners can apply it appropriately in their own work. An emphasis is placed on showing how the various components of the algorithm contribute to bias and variance, and on discussing variable importance measures. Applications specific to genetic studies are highlighted. To provide context, RF is compared to other commonly used machine learning algorithms.
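
To make the review's discussion concrete, below is a minimal sketch, not taken from the article, of how RF might be applied to a case/control genetic association study using scikit-learn. The genotype matrix is simulated, and the parameter choices and the use of permutation importance are illustrative assumptions, standing in for the tuning parameters (ntree, mtry) and the variable importance measures the review discusses.

    # Illustrative sketch only (not the authors' code): a random forest fit to a
    # simulated SNP genotype matrix, with out-of-bag error and permutation-based
    # variable importance. All names and parameter values here are assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(0)

    # Simulated stand-in for real data: n subjects x p SNPs coded 0/1/2,
    # with a binary case/control phenotype.
    n_subjects, n_snps = 500, 1000
    X = rng.integers(0, 3, size=(n_subjects, n_snps))
    y = rng.integers(0, 2, size=n_subjects)

    # oob_score=True requests the out-of-bag error estimate; n_estimators and
    # max_features correspond to the ntree and mtry tuning parameters.
    rf = RandomForestClassifier(
        n_estimators=500,
        max_features="sqrt",
        oob_score=True,
        random_state=0,
        n_jobs=-1,
    )
    rf.fit(X, y)
    print("OOB accuracy:", rf.oob_score_)

    # Permutation-based variable importance; report the ten top-ranked SNPs.
    imp = permutation_importance(rf, X, y, n_repeats=5, random_state=0, n_jobs=-1)
    top_snps = np.argsort(imp.importances_mean)[::-1][:10]
    print("Top SNP indices:", top_snps)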

Suggested Citation

  • Goldstein Benjamin A & Polley Eric C & Briggs Farren B. S., 2011. "Random Forests for Genetic Association Studies," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 10(1), pages 1-34, July.
  • Handle: RePEc:bpj:sagmbi:v:10:y:2011:i:1:n:32
    DOI: 10.2202/1544-6115.1691

    Download full text from publisher

    File URL: https://doi.org/10.2202/1544-6115.1691
    Download Restriction: For access to full text, subscription to the journal or payment for the individual article is required.

    File URL: https://libkey.io/10.2202/1544-6115.1691?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can access this item with your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lin, Yi & Jeon, Yongho, 2006. "Random Forests and Adaptive Nearest Neighbors," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 578-590, June.
    2. Tuglus Catherine & van der Laan Mark J., 2009. "Modified FDR Controlling Procedure for Multi-Stage Analyses," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 8(1), pages 1-15, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Michel Fuino & Andrey Ugarte Montero & Joël Wagner, 2022. "On the drivers of potential customers' interest in long‐term care insurance: Evidence from Switzerland," Risk Management and Insurance Review, American Risk and Insurance Association, vol. 25(3), pages 271-302, September.
    2. Sim Aaron & Tsagkrasoulis Dimosthenis & Montana Giovanni, 2013. "Random forests on distance matrices for imaging genetics studies," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 12(6), pages 757-786, December.
    3. Florian Marcel Nuţă & Alina Cristina Nuţă & Cristina Gabriela Zamfir & Stefan-Mihai Petrea & Dan Munteanu & Dragos Sebastian Cristea, 2021. "National Carbon Accounting—Analyzing the Impact of Urbanization and Energy-Related Factors upon CO2 Emissions in Central–Eastern European Countries by Using Machine Learning Algorithms and Panel Data," Energies, MDPI, vol. 14(10), pages 1-23, May.
    4. Wei, Pengfei & Lu, Zhenzhou & Song, Jingwen, 2015. "Variable importance analysis: A comprehensive review," Reliability Engineering and System Safety, Elsevier, vol. 142(C), pages 399-432.
    5. Cihan Şahin, 2023. "Predicting base station return on investment in the telecommunications industry: Machine‐learning approaches," Intelligent Systems in Accounting, Finance and Management, John Wiley & Sons, Ltd., vol. 30(1), pages 29-40, January.
    6. Maria Angela Echeverry-Galvis & Jennifer K Peterson & Rajmonda Sulo-Caceres, 2014. "The Social Nestwork: Tree Structure Determines Nest Placement in Kenyan Weaverbird Colonies," PLOS ONE, Public Library of Science, vol. 9(2), pages 1-7, February.
    7. Xianguo Ren & Haiqing Tian & Kai Zhao & Dapeng Li & Ziqing Xiao & Yang Yu & Fei Liu, 2022. "Research on pH Value Detection Method during Maize Silage Secondary Fermentation Based on Computer Vision," Agriculture, MDPI, vol. 12(10), pages 1-17, October.
    8. Lauric A Ferrat & Marc Goodfellow & John R Terry, 2018. "Classifying dynamic transitions in high dimensional neural mass models: A random forest approach," PLOS Computational Biology, Public Library of Science, vol. 14(3), pages 1-27, March.
    9. Dinesh Reddy Vangumalli & Konstantinos Nikolopoulos & Konstantia Litsiou, 2019. "Clustering, Forecasting and Cluster Forecasting: using k-medoids, k-NNs and random forests for cluster selection," Working Papers 19016, Bangor Business School, Prifysgol Bangor University (Cymru / Wales).
    10. Silke Janitza & Roman Hornung, 2018. "On the overestimation of random forest’s out-of-bag error," PLOS ONE, Public Library of Science, vol. 13(8), pages 1-31, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Borup, Daniel & Christensen, Bent Jesper & Mühlbach, Nicolaj Søndergaard & Nielsen, Mikkel Slot, 2023. "Targeting predictors in random forest regression," International Journal of Forecasting, Elsevier, vol. 39(2), pages 841-868.
    2. Jerinsh Jeyapaulraj & Dhruv Desai & Peter Chu & Dhagash Mehta & Stefano Pasquali & Philip Sommer, 2022. "Supervised similarity learning for corporate bonds using Random Forest proximities," Papers 2207.04368, arXiv.org, revised Oct 2022.
    3. Sexton, Joseph & Laake, Petter, 2009. "Standard errors for bagged and random forest estimators," Computational Statistics & Data Analysis, Elsevier, vol. 53(3), pages 801-811, January.
    4. Joshua Rosaler & Dhruv Desai & Bhaskarjit Sarmah & Dimitrios Vamvourellis & Deran Onay & Dhagash Mehta & Stefano Pasquali, 2023. "Towards Enhanced Local Explainability of Random Forests: a Proximity-Based Approach," Papers 2310.12428, arXiv.org.
    5. Mendez, Guillermo & Lohr, Sharon, 2011. "Estimating residual variance in random forest regression," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 2937-2950, November.
    6. Li, Yiliang & Bai, Xiwen & Wang, Qi & Ma, Zhongjun, 2022. "A big data approach to cargo type prediction and its implications for oil trade estimation," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 165(C).
    7. Yi Fu & Shuai Cao & Tao Pang, 2020. "A Sustainable Quantitative Stock Selection Strategy Based on Dynamic Factor Adjustment," Sustainability, MDPI, vol. 12(10), pages 1-12, May.
    8. Ishwaran, Hemant & Kogalur, Udaya B., 2010. "Consistency of random survival forests," Statistics & Probability Letters, Elsevier, vol. 80(13-14), pages 1056-1064, July.
    9. José María Sarabia & Faustino Prieto & Vanesa Jordá & Stefan Sperlich, 2020. "A Note on Combining Machine Learning with Statistical Modeling for Financial Data Analysis," Risks, MDPI, vol. 8(2), pages 1-14, April.
    10. Biau, Gérard & Devroye, Luc, 2010. "On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification," Journal of Multivariate Analysis, Elsevier, vol. 101(10), pages 2499-2518, November.
    11. Olivier BIAU & Angela D'ELIA, 2010. "Euro Area GDP Forecast Using Large Survey Dataset - A Random Forest Approach," EcoMod2010 259600029, EcoMod.
    12. Cleridy E. Lennert‐Cody & Richard A. Berk, 2007. "Statistical learning procedures for monitoring regulatory compliance: an application to fisheries data," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 170(3), pages 671-689, July.
    13. Ruoqing Zhu & Donglin Zeng & Michael R. Kosorok, 2015. "Reinforcement Learning Trees," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1770-1784, December.
    14. Susan Athey & Julie Tibshirani & Stefan Wager, 2016. "Generalized Random Forests," Papers 1610.01271, arXiv.org, revised Apr 2018.
    15. Jincheng Shen & Lu Wang & Jeremy M. G. Taylor, 2017. "Estimation of the optimal regime in treatment of prostate cancer recurrence from observational data using flexible weighting models," Biometrics, The International Biometric Society, vol. 73(2), pages 635-645, June.
    16. Gérard Biau & Erwan Scornet, 2016. "A random forest guided tour," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 25(2), pages 197-227, June.
    17. Dhruv Desai & Ashmita Dhiman & Tushar Sharma & Deepika Sharma & Dhagash Mehta & Stefano Pasquali, 2023. "Quantifying Outlierness of Funds from their Categories using Supervised Similarity," Papers 2308.06882, arXiv.org.
    18. Hoora Moradian & Denis Larocque & François Bellavance, 2017. "L1 splitting rules in survival forests," Lifetime Data Analysis: An International Journal Devoted to Statistical Methods and Applications for Time-to-Event Data, Springer, vol. 23(4), pages 671-691, October.
    19. Arlen Dean & Amirhossein Meisami & Henry Lam & Mark P. Van Oyen & Christopher Stromblad & Nick Kastango, 2022. "Quantile regression forests for individualized surgery scheduling," Health Care Management Science, Springer, vol. 25(4), pages 682-709, December.
    20. Prasad, Ramendra & Ali, Mumtaz & Kwan, Paul & Khan, Huma, 2019. "Designing a multi-stage multivariate empirical mode decomposition coupled with ant colony optimization and random forest model to forecast monthly solar radiation," Applied Energy, Elsevier, vol. 236(C), pages 778-792.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bpj:sagmbi:v:10:y:2011:i:1:n:32. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Peter Golla (email available below). General contact details of provider: https://www.degruyter.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.