Sparse and Stable Markowitz Portfolios
The Markowitz mean-variance optimizing framework has served as the basis for modern portfolio theory for more than 50 years. However, efforts to translate this theoretical foundation into a viable portfolio construction algorithm have been plagued by technical difficulties stemming from the instability of the original optimization problem with respect to the available data. In this paper we address these issues of estimation error by regularizing the Markowitz objective function through the addition of a penalty proportional to the sum of the absolute values of the portfolio weights (l1 penalty). This penalty stabilizes the optimization problem, encourages sparse portfolios, and facilitates treatment of transaction costs in a transparent way. We implement this methodology using the Fama and French 48 industry portfolios as our securities. Using only a modest amount of training data, we construct portfolios whose out-of-sample performance, as measured by Sharpe ratio, is consistently and significantly better than that of the naïve portfolio comprising equal investments in each available asset. In addition to their excellent performance, these portfolios have only a small number of active positions, a highly desirable attribute for real-life applications. We conclude by discussing a collection of portfolio construction problems which can be naturally translated into optimizations involving l1 penalties and which can thus be tackled by algorithms similar to those discussed here.
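The regularization the abstract describes amounts to a lasso-type problem: minimize the squared tracking error of the portfolio's return against a target, plus an l1 penalty on the weights. The following is a minimal numpy sketch of that idea, not the authors' code: it solves min_w ||y - R w||² + tau ||w||₁ by iterative soft-thresholding (ISTA), omits the budget constraint sum(w) = 1 that the paper handles explicitly, and uses synthetic returns rather than the Fama and French data. All names (T, n, tau, etc.) are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding: shrink toward zero, setting small entries exactly to 0."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_weights(R, y, tau, n_iter=2000):
    """Approximately solve min_w ||y - R w||^2 + tau * ||w||_1 by ISTA."""
    n = R.shape[1]
    w = np.zeros(n)
    L = np.linalg.norm(R, 2) ** 2  # squared spectral norm; gradient is 2L-Lipschitz
    for _ in range(n_iter):
        grad = R.T @ (R @ w - y)                 # half the true gradient
        w = soft_threshold(w - grad / L, tau / (2 * L))
    return w

rng = np.random.default_rng(0)
T, n = 120, 48                        # 120 periods, 48 assets (mimicking 48 industries)
R = rng.normal(0.005, 0.05, (T, n))   # synthetic return matrix, NOT real data
y = np.full(T, 0.01)                  # target return of 1% per period
w = lasso_weights(R, y, tau=0.005)
print("active positions:", np.count_nonzero(w))
```

Because the soft-thresholding step zeroes out small coordinates exactly, the resulting weight vector is sparse: larger values of `tau` yield fewer active positions, which is the stabilizing and sparsifying effect the abstract attributes to the l1 penalty.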
Date of creation: Sep 2007
Provider: Centre for Economic Policy Research, 77 Bastwick Street, London EC1V 3PZ.
Handle: RePEc:cpr:ceprdp:6474