Do Agents Learn by Least Squares? The Evidence Provided by Changes in Monetary Policy
Understanding how agents formulate their expectations about Fed behavior is critical for the design of monetary policy. In response to a lack of empirical support for a strict rationality assumption, monetary theorists have recently introduced learning by agents into their models. Although a learning assumption is now common, there is practically no empirical research on whether agents actually learn. In this paper we test whether the forecast of the three-month T-bill rate in the Survey of Professional Forecasters (SPF) is consistent with least squares learning when there are discrete shifts in monetary policy. Discrete shifts in policy introduce temporary biases into forecasts while agents process data and learn about the policy shift. We first derive the mean, variance and autocovariances of the forecast errors from a recursive least squares learning algorithm when there are breaks in the structure of the model. We then apply the Bai and Perron (1998) test for structural change to a Taylor rule and a forecasting model for the three-month T-bill rate in order to identify changes in monetary policy. Having identified the policy regimes, we then estimate the implied biases in the interest rate forecasts within each regime. We find that when the forecast errors from the SPF are corrected for the biases due to shifts in policy, the forecasts are consistent with least squares learning.
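The recursive least squares learning rule referred to in the abstract can be sketched as follows. This is a minimal illustration of standard decreasing-gain recursive least squares (as in the adaptive learning literature), not the paper's own implementation; the function name, initialization, and simulated data are illustrative assumptions.

```python
import numpy as np

def recursive_ls(X, y):
    """Decreasing-gain recursive least squares.

    Agents update their coefficient estimate each period as new data
    arrive, rather than re-estimating by OLS on the full sample.
    Returns the path of estimates, one row per period.
    """
    n, k = X.shape
    beta = np.zeros(k)   # agents' initial belief (illustrative choice)
    R = np.eye(k)        # initial second-moment matrix (illustrative choice)
    path = np.empty((n, k))
    for t in range(1, n + 1):
        x = X[t - 1]
        gain = 1.0 / t                              # decreasing gain
        R = R + gain * (np.outer(x, x) - R)         # update moment matrix
        forecast_error = y[t - 1] - x @ beta
        beta = beta + gain * np.linalg.solve(R, x) * forecast_error
        path[t - 1] = beta
    return path
```

On stable data the estimate path converges toward the OLS coefficients; after a structural break, the decreasing gain means beliefs adjust only gradually, which is the source of the temporary forecast biases the paper characterizes.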
Contact details of provider:
Postal: 221 Burwood Highway, Burwood 3125
Phone: +61 3 9244 3815
Fax: +61 3 5227 2655
Web page: http://www.deakin.edu.au/buslaw/aef/index.php
When requesting a correction, please mention this item's handle: RePEc:dkn:ecomet:fe_2012_09.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Dr Susan S Sharma.