When I ask my children what happened during the day, I get one of two styles of response. Sometimes they say nothing, and you have to depose them like an attorney to get sufficient detail. At other times, they launch into a narrative of minutiae so lengthy that you finally have to cut them off. I really don't need to know whether the chicken nuggets in the school cafeteria were crispy or not.
My wife and I have been seeking the middle ground between these two extremes for more than twenty years now. Over roughly that same interval, the Federal Reserve has been trying to find the right balance between saying nothing and saying too much in its efforts to signal its intentions to the markets.
A generation ago, the curtain of secrecy surrounding FOMC decisions was almost complete. There was no immediate announcement of changes in monetary policy; analysts had to infer them from market trading, which sometimes took weeks. Minutes of the meetings were released with a six-week lag, and background materials considered by the committee were entirely confidential.
Since then, the curtain has been pulled back a great deal. Decisions, and some color around them, are captured in a release that goes out immediately after each FOMC meeting. Forecasts of FOMC participants are released quarterly. The Chairman now conducts a press conference after the FOMC meeting four times per year. How times have changed.
To some, all of the added color has been helpful. To others, though, the extra information has made Fed policy muddier. Many in this latter camp would prefer that monetary policy follow simple rules that would be transparent to the market, like the Taylor rule shown below.
The Taylor rule, developed by Stanford professor John Taylor, produces a target funds rate by considering the distance of actual inflation from some target level, and the distance of economic growth from its potential. The greater these distances, the greater the need for adjustments. Formulaically:
Target funds rate = Current Inflation + Long-Run Equilibrium Real Funds Rate + α (Actual Inflation − Target Inflation) + β (Actual GDP − Potential GDP)
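As a concrete sketch, the rule can be put into a few lines of code. The parameter values below (α = β = 0.5, a 2% equilibrium real rate) follow Taylor's original 1993 specification; all inputs are illustrative and expressed in percent.

```python
def taylor_rule(inflation, target_inflation, output_gap,
                equilibrium_real_rate=2.0, alpha=0.5, beta=0.5):
    """Target fed funds rate per the Taylor rule (all inputs in percent).

    output_gap is actual GDP minus potential GDP, as a percent of potential
    (negative when the economy is running below potential).
    """
    return (inflation + equilibrium_real_rate
            + alpha * (inflation - target_inflation)
            + beta * output_gap)

# Example: inflation at 2% (on target), output 1% below potential:
# 2 + 2 + 0.5*(0) + 0.5*(-1) = 3.5
print(taylor_rule(inflation=2.0, target_inflation=2.0, output_gap=-1.0))
```

With inflation on target and output 1% below potential, the rule prescribes a funds rate of 3.5%, half a point below the 4% neutral level implied by these parameters.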
The Taylor rule has won adherents because of its suggestion that monetary policy was too easy for too long ten years ago, and thereby contributed to excesses which formed kindling for the 2008 crash. The use of discretion to override the formulaic approach, from this perspective, is to be discouraged: not only can discretion be misused, but it creates mystery where markets want clarity.
Central to the application of a rule like Taylor's is the establishment of key parameters. For example, at the beginning of this year the FOMC adopted a 2% target for medium-term inflation.

But other elements required to apply Taylor's rule, or any other, involve some subjective judgment:
The rate of unemployment that will not stress long-term inflation is a central input to determining potential GDP growth. (Economists use the ungainly acronym NAIRU to describe this quantity; I much prefer the old representation of the natural rate of unemployment.) In Chart 1, this has been set at 6%, but this is a quantity that can vary over time and has been the subject of some philosophical disagreement.
There are a variety of target variables which can be considered for policy rules; how to choose among them? There are at least a half-dozen major inflation rates, and conditions in the labor markets can be described in a number of ways.
Should the variables used in these rules be global in scope, or just based in the U.S.? Economic slack (or the lack thereof) outside of our borders can have an important influence on inflation.
There are important levels of imprecision in the measurement of key economic variables. Revisions and rebenchmarking are routine, and they sometimes change our perception of performance sharply. For example, the recent revisions to the history of real GDP revealed that the early stages of the recent recovery were far more tepid than had previously been understood.
If performance varies from target, should reaction be immediate, or does the variance have to persist for some time before prompting correction?
These are but several of many uncertainties which must be addressed before a quantitative regime can be put into place.
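To see how much these subjective inputs matter, the sketch below varies the assumed natural rate of unemployment and translates it into an output gap via Okun's law, a rough rule of thumb not discussed above. The Okun coefficient of 2 and every other figure here are illustrative assumptions, not official parameters.

```python
def implied_target_rate(inflation, target_inflation, unemployment, nairu,
                        equilibrium_real_rate=2.0, alpha=0.5, beta=0.5,
                        okun=2.0):
    """Taylor-style target rate when the output gap must itself be estimated.

    Okun's law (rough rule of thumb): each point of unemployment above the
    natural rate implies roughly `okun` points of output below potential.
    """
    output_gap = -okun * (unemployment - nairu)
    return (inflation + equilibrium_real_rate
            + alpha * (inflation - target_inflation)
            + beta * output_gap)

# Same hypothetical economy, three assumptions about the natural rate:
# each half-point of assumed NAIRU moves the prescribed rate half a point.
for nairu in (5.5, 6.0, 6.5):
    rate = implied_target_rate(inflation=2.0, target_inflation=2.0,
                               unemployment=8.3, nairu=nairu)
    print(f"NAIRU {nairu}% -> target funds rate {rate:.2f}%")
```

A half-point disagreement about the natural rate, well within the range of professional opinion, shifts the prescribed funds rate by 50 basis points under these assumptions.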
The reality is this: even the application of policy rules involves a lot of discretion. So from this perspective, the Fed's efforts to better inform markets about how it thinks about conditions have been very important. Rules can be a useful place to start the conversation, but the conversation cannot end there.

At the most recent FOMC meeting, the participants discussed the appropriate application of rules for decision making. Some have recently suggested establishing a target level for employment and factoring this into policy rules. This would certainly be consistent with the Federal Reserve's dual mandate, but would also complicate the specification of models.

On another front, the Fed's current promise to hold interest rates near zero for a set period of time carries the risk of having to break this promise if the economy outperforms expectations. To address this risk, the FOMC is considering the construction of a consensus forecast, which outsiders could use to gauge how actual results vary from expectation. These deviations could serve as a guide to when policy might ultimately change, relieving the need for the FOMC to cite a specific date when communicating policy intentions.

I suspect that some of you are terribly confused by all of this, and perhaps wish for the more opaque operating style that the Fed had a generation ago. Yet this observer appreciates the willingness of our central bank to explain its thinking and provide a window into what's going on behind the scenes. I just wish my children would do the same, succinctly.
Highlights of the Week
The calendar was not inundated with economic reports this week, but the minutes of the June FOMC meeting were in the limelight. Several members appeared willing to take action fairly soon if incoming economic data continued the disappointing trend seen prior to their deliberations. The economic landscape has shifted since that meeting, and the call remains close for the September 12-13 FOMC meeting.

Bernanke has pointed out in recent public statements that labor market developments are the driver of near-term monetary policy. It will not be surprising to hear more about the labor market at the upcoming Jackson Hole event. The comments below put this issue in context, highlighting the nuances of the labor market indicators that are front and center in monetary policy discussions.
Starting with the best-known labor market statistic: the unemployment rate has held between 8.1% and 8.3% in the first seven months of 2012, compared with an 8.5% rate in December 2011. A broader measure of the jobless rate, inclusive of those marginally attached to the labor force, has hovered between 9.6% and 9.9% in the seven months ended July, down from 10.0% as 2011 wound down.
Payroll employment has averaged a paltry gain of 151,000 in the January-July months. Initial jobless claims have averaged around 367,000 in the July-August period, an improvement from the 385,000 reading seen in June. Continuing claims stand at 3.315 million in the first two weeks of August and are close to numbers posted for June and July, but down from 3.513 million in January 2012. Recipients of unemployment insurance under the emergency programs have dwindled significantly, which represents a big plus. The looming question is how many are no longer eligible for unemployment insurance. Unfortunately, there is no explicit record of folks who have exhausted their benefits.
The number of recipients of unemployment insurance could be declining either because of employment or disqualification from exhaustion. As data pertaining to the latter are not available, we can judge the status of jobless claims from the trend of first payments to establish if the universe of recipients of unemployment insurance is growing or shrinking. The tally of first payments has declined to 8.919 million as of July 31, 2012 from 9.183 million in March 2012 and 9.474 million in December 2011. The peak of first payments was 14.486 million in November 2009. It is not amiss to infer that the downward trend of unemployment claims reflects an improvement in employment.
An understanding of the labor market is not complete without digging into what types of unemployment exist in the economy. Data on the types of unemployment bring important questions to the table about the scope of monetary policy. The unemployment rate at the onset of the recession stood at 5.0%, a large difference from the July 2012 mark of 8.3%. Monetary policy actions can address cyclical unemployment (which results from inadequate demand), while structural unemployment (arising from a mismatch of skills) is not within the purview of monetary policy. In reality, it is a challenge to disentangle cyclical and structural unemployment because both tend to move up in a recession. Economists call upon the Beveridge Curve to investigate whether structural or cyclical unemployment prevails in a given economy.
The Beveridge Curve is a graphical representation of job openings and the unemployment rate. Of late, it has gained visibility due to the persistently elevated jobless rate. The chart below is from the Bureau of Labor Statistics' monthly Job Openings and Labor Turnover Survey, referred to as the JOLTS report. In a stylized form, the Beveridge Curve has a downward slope; data points on the Beveridge Curve denote where the economy stands in the business cycle. During an economic contraction, the unemployment rate is high and job openings are few. As the economy recovers, the unemployment rate declines as job openings rise. Essentially, there is movement along the Beveridge Curve. But the latest readings raise questions, because job openings are increasing while the unemployment rate has not fallen rapidly (data points are located above the Beveridge Curve, see Chart 4). In other words, there is a mismatch of skills, or structural unemployment. This is where differing positions emerge about whether a significant part of current unemployment is cyclical or structural.
The Beveridge Curve (seasonally adjusted)
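The reasoning above can be sketched numerically. The inverse functional form and all data points below are hypothetical, chosen only to illustrate how an observation sitting above a stylized downward-sloping Beveridge relationship would be read as a sign of skill mismatch:

```python
# A stylized Beveridge relationship: the job-openings rate falls as the
# unemployment rate rises. The curve and data points are hypothetical,
# for illustration only.
def stylized_curve(unemployment_rate, k=9.0):
    """Job-openings rate implied by a simple inverse relationship."""
    return k / unemployment_rate

# Hypothetical observations: (unemployment rate %, job-openings rate %)
observations = [(5.0, 1.8), (9.0, 1.0), (8.2, 2.4)]

for u, v in observations:
    if v > stylized_curve(u):
        verdict = "above the curve: possible structural mismatch"
    else:
        verdict = "on or below the curve: consistent with a cyclical story"
    print(f"u = {u}%, openings = {v}%: {verdict}")
```

The third observation, with more openings than the curve predicts at that level of unemployment, corresponds to the situation described in the text: vacancies rising while unemployment stays stubbornly high.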
Bernanke's March 26, 2012 speech pointed out that the apparent shift in the relationship between vacancies and unemployment is neither unusual for a recession nor likely to be persistent. Research has found that during and immediately after the serious recessions of 1973-75 and 1981-82, the Beveridge Curve also shifted outward, but in both cases it shifted back inward during the recovery. Vice Chair Janet Yellen has voiced a similar opinion and, in two speeches in April and June 2012, cited evidence to invalidate claims that structural factors are the major cause of unemployment.

There is another view held within the FOMC that structural unemployment accounts for a significant part of unemployment at the present time. Therefore, in this view, monetary policy support is not the solution to the labor market's woes. These differences were aired in the minutes of the June FOMC meeting.

Recent research (see here and here) has identified the availability of extended unemployment benefits as a factor preventing acceptance of jobs that would otherwise have been found suitable. Selective and delayed hiring while business conditions are soft is yet another explanation.

A plausible conclusion from these competing theories and evidence is that cyclical and structural unemployment co-exist, with the former making up the lion's share. The construction and auto sectors come to mind as sectors where structural unemployment is a force to reckon with, while retail unemployment is a suitable example of inadequate demand as the root cause of a tepid hiring pace. Aggregate demand policies to combat unemployment are suitable as long as cyclical unemployment is the major culprit. The existence of different opinions about the nature of unemployment in the economy is the stuff of a spirited debate at the September FOMC meeting.
Alongside the labor market, the housing sector is another top priority of policy makers. The July data on home sales point to improving conditions, but the overall trend is not robust enough to declare that normality has been reestablished in the housing sector. Nevertheless, the forward momentum gleaned from July's sales of existing and new homes is a positive development.
Sales of new homes advanced 3.6% to 372,000 in July, a 26% increase from a year ago. Inventories of unsold new homes, at a 4.6-month supply, are a far cry from the historical mean of 6.2 months. The median price of a new single-family home ($224,200) was down from both a month ago and a year ago. The median number of months to sell a new single-family home has increased to 8.7 months from 6.7 months in December 2011. These two aspects are the downsides of the July report.

Sales of all existing homes increased 2.2% to an annual rate of 4.47 million homes in July after a decline in the prior month. Purchases of single-family homes advanced 2.1% to an annual rate of 3.98 million homes. The median price of an existing single-family home increased 9.6% from a year ago to $188,100, the fifth consecutive monthly increase. Distressed properties (foreclosures and short sales) made up 24% of sales of existing homes in July, down from 25% in the May-June period and 29% in July 2011, and the lowest share since October 2008. The shrinking share of distressed properties in home sales is a plus for future price gains. Also, inventories of unsold existing homes moved down in July (6.1-month supply vs. 6.3 months in June), another source of support for home prices.
The FOMC has depicted the housing sector as depressed in policy statements since December 2010. Will this assessment change this time around? Stay tuned for an update of our take on Chairman Bernanke's speech at Jackson Hole.