Sparse and stable Markowitz portfolios
We consider the problem of portfolio selection within the classical Markowitz mean-variance optimizing framework, which has served as the basis for modern portfolio theory for more than 50 years. Efforts to translate this theoretical foundation into a viable portfolio construction algorithm have been plagued by technical difficulties stemming from the instability of the original optimization problem with respect to the available data. Often, instabilities of this type disappear when a regularizing constraint or penalty term is incorporated in the optimization procedure. This approach seems not to have been used in portfolio design until very recently. To provide such a stabilization, we propose to add to the Markowitz objective function a penalty which is proportional to the sum of the absolute values of the portfolio weights. This penalty stabilizes the optimization problem, automatically encourages sparse portfolios, and facilitates an effective treatment of transaction costs. We implement our methodology using as our securities two sets of portfolios constructed by Fama and French: the 48 industry portfolios and 100 portfolios formed on size and book-to-market. Using only a modest amount of training data, we construct portfolios whose out-of-sample performance, as measured by Sharpe ratio, is consistently and significantly better than that of the naïve portfolio comprising equal investments in each available asset. In addition to their excellent performance, these portfolios have only a small number of active positions, a desirable feature for small investors, for whom the fixed overhead portion of the transaction cost is not negligible.
JEL Classification: G11, C00
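The penalized objective described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function name and the exact empirical form (mean squared deviation of the portfolio return from a target return, plus an l1 penalty on the weights) are assumptions for the sake of a concrete example.

```python
import numpy as np

def markowitz_l1_objective(w, returns, rho, tau):
    """Empirical mean-variance objective with an l1 penalty (illustrative sketch).

    w       : candidate portfolio weights, one entry per asset
    returns : T x N matrix of historical asset returns
    rho     : target mean portfolio return
    tau     : strength of the l1 (sparsity-inducing) penalty
    """
    # squared deviation of the realized portfolio return from the target,
    # averaged over the T historical periods
    fit = np.mean((returns @ w - rho) ** 2)
    # penalty proportional to the sum of absolute portfolio weights
    penalty = tau * np.sum(np.abs(w))
    return fit + penalty
```

Note one property the abstract alludes to: under the budget constraint that the weights sum to one, a portfolio with no short positions has a fixed l1 norm of one, so the penalty leaves such portfolios' relative ranking unchanged; the penalty bites, and induces sparsity, precisely when short positions are allowed.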
Date of creation: Sep 2008
Contact details of provider: Postal: 60640 Frankfurt am Main, Germany
Phone: +49 69 1344 0
Fax: +49 69 1344 6000
Web page: http://www.ecb.europa.eu/