- PERS, revenue chatter continues; rally against illegal immigration: Oregon ... - OregonLive.com
- Betsy Johnson makes first appearance in Oregon Senate since car accident - OregonLive.com
- Cormorant control, public meetings, Jason Lee vs. Mark Hatfield: Oregon ... - OregonLive.com
- Getting more doctors, nurses into rural Oregon should win Legislature's support - OregonLive.com
- New Oregon Medical Pot Bill Would Pump Nearly $1 Million Into State Coffers - Boise Weekly
John Kitzhaber calls Oregon lawmakers together to work out budget deal
Oregon House Speaker Tina Kotek, left, D-Portland, shook hands with Oregon Gov. John Kitzhaber, right, while Oregon Senate President Peter Courtney, D-Salem, stood by after the 77th Oregon Legislature took the oath of office in January.
Hey, Oregon legislators: Leave Gain Share alone: Editorial
Back in 2007, lawmakers hoping to goose Oregon's economy passed a bill that requires local governments to waive some property taxes for businesses that expand locally and create new jobs. In return, the state would give back 50 percent of the income ...
Jeff Sachs is interviewed by Paul Vigna of the WSJ's MoneyBeat: Jeffrey Sachs: Banking Abuses ‘Can’t Get More in Your Face’, by Paul Vigna: ....When I really started to ... keep track of the number of lawsuits, and the number of settlements, and it’s amazing actually how many there are, of course. Libor, Abacus, other financial fraud scandals, money laundering, insider trading. The list is actually extraordinary. The frequency of new cases, new settlements, new SEC charges, is stunning. ... Why the lack of prosecution? The legal defenses are very powerful, the lobbying is very powerful, the government in general is completely squeezed even if it would like to regulate. But we also have a revolving door of senior regulatory officials, congressional staff, congressmen and senators. Everyone’s in on this. ... What will it take to change the system? I think that the public is utterly disgusted, of course, and that is a major start. There’s going to be a massive backlash..., what one does feel is that the extent of abuse, the stench of it, is reaching such a high level that we’re not in an equilibrium, political or social, right now. This is explosive stuff (scandals like Abacus and insider trading). It’s unbelievable. So far it hasn’t stopped the practice, but it can’t get more in your face than this actually. I think in the end the question will be ... whether a political movement not based on mega-donations can win political control. I believe that it can actually. Some movement like the populist movement or the progressive era of the past is going to rise and say ‘we don’t need contributions, we’re not taking them, and if you the American people want a way out of this that doesn’t involve politicians bought for big money, we’re the ones.’ But short of that I don’t see a way out. ...
Via Eric Weiner at INET, and continuing a recent discussion, this is Jan Höffler and Thomas Kneib on replication in economics: Economics Needs Replication, by INET Grantees Jan Höffler and Thomas Kneib: The recent debate about the reproducibility of the results published by Carmen Reinhart and Kenneth Rogoff offers a showcase for the importance of replication in empirical economics. Replication is basically the independent repetition of scientific analyses by other scientists... The principle is well accepted in the natural sciences. However, it is far less common in empirical economics, even though non-reproducible research can barely be considered a contribution to the consolidated body of scientific knowledge. ... In the narrow sense, replicability means that the raw data for an analysis can be accessed, that the transformation from the raw data to the final data set is well documented, and that software code is available for producing the final data set and the empirical results. Basically, this comes down to a question of data and code availability, but nonetheless it is a necessary prerequisite for replication. A successful replication would then indicate that all of the material has been provided and that the same results were obtained when redoing the analysis. In the wider sense, a replication could go much further by challenging previous analyses via changing the data sources (such as changing countries, switching time periods, or using different surveys), altering the statistical or econometric model, or questioning the interpretational conclusions drawn from the analysis. Here, the scientific debate really starts, since this type of replication isn’t concerned with simply redoing exactly the same analysis as the original study. Rather, the goal is to rethink the entire analysis, from data collection and operationalization to the interpretation of results and robustness checks.
Unfortunately, very few journals in economics have mandatory online archives for data and code and/or strict rules for ensuring replicability. Moreover, the incentive for making your own research reproducible, and for reproducing research done by others, is low. In this respect, there are several interesting lessons to be learned by the Reinhart/Rogoff case. One is that the impact of replication can actually be quite high, especially when replicating papers that have been influential... Still, it is important to remember that replications that question earlier results are not the only ones that are of value. It is also helpful to know if a specific study could be replicated... Another important lesson is that involving students in replications can significantly change attitudes towards replication. For students, a replication is a perfect opportunity to perform their own analyses based on an already available paper. They get to learn how experienced scientists tackle applied-research questions and they also learn that the consolidated body of scientific knowledge is constantly changing as it is questioned and transferred to new contexts. Finally, it is very important for the raw data to be made available so that every step up to the final results of a study can be replicated. ... In recent years, we have been teaching replication to students at all levels (ranging from undergraduates to Ph.D. candidates) and have set up a large global network to support the idea of replication by students. ... As a part of our INET project on empirical replication, we therefore are collecting and sharing a large dataset of empirical studies. These studies are all potential candidates for replication that meet the minimal requirements for replicability. Information on these studies, as well as additional information about already published replications, is available in a wiki shared with collaborators who join our teaching initiative. 
Moreover, we will soon provide via the same wiki website additional resources to support teaching replication seminars. We also started a working paper series on replication so that replication papers can be published as reports and provide a forum for discussing replicability as another part of the wiki website. We welcome you to join our efforts... You can find more information and contact us here and here.
One place where replication occurs regularly is in assignments in graduate classes. I routinely ask students to replicate papers as part of their coursework. Even if they don't find explicit errors (and most of the time they don't), it almost always raises good questions about the research (why this choice, why this model, what if you relax this assumption, there's a better way to do this, here's the next question to ask, etc.). So replication does occur routinely in economics, and it is very valuable, but it is not a formal part of the profession the way it should be, and much of the replication is done by people (students) who generally assume that if they can't replicate something, it is probably their own error. We have a lot of work to do on the replication front, and I want to encourage efforts like this.
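In the narrow sense described above, a replication check can be mechanical: re-estimate the published specification from the final data set and compare the coefficients to the published values. The sketch below is purely illustrative; the data are simulated and the "published" coefficients are invented for the example.

```python
import random

def replicate_simple_ols(x, y, published, tol):
    """Re-estimate a simple OLS regression y = a + b*x and check whether
    (a, b) match the published values within tolerance tol."""
    n = len(y)
    xbar = sum(x) / n
    ybar = sum(y) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
        / sum((xi - xbar) ** 2 for xi in x)
    a = ybar - b * xbar
    ok = abs(a - published[0]) < tol and abs(b - published[1]) < tol
    return (a, b), ok

# Simulated stand-in for a study's final data set.
random.seed(0)
x = [random.gauss(0, 1) for _ in range(200)]
y = [1.5 + 2.0 * xi + random.gauss(0, 0.1) for xi in x]

# Hypothetical "published" values: intercept 1.5, slope 2.0.
(a, b), ok = replicate_simple_ols(x, y, published=(1.5, 2.0), tol=0.05)
print(a, b, ok)
```

A failed check at this stage points to missing data, undocumented transformations, or a computational error, which is exactly the information a narrow-sense replication is meant to surface.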
In case this is of interest:
Global Financial Regulation
- James Barth, Senior Finance Fellow, Milken Institute; Lowder Eminent Scholar in Finance, Auburn University
- Bob Corker, U.S. Senator
- Carey Lathrop, Managing Director and Head of Global Credit Markets, Citi
- Kevin Lynch, Vice Chairman, BMO Financial Group
- Thomas Perrelli, Partner, Jenner & Block; Former Associate U.S. Attorney General
Following up on Brad DeLong's theme today, more on the Oregon Medicaid experiment (and whether expansion of Medicaid is a good idea -- DeLong has a more cautionary but ultimately positive take on the results -- Krugman comments here): How Medicaid affects adult health, MIT News: Enrollment in Medicaid helps lower-income Americans overcome depression, get proper treatment for diabetes, and avoid catastrophic medical bills, but does not appear to reduce the prevalence of diabetes, high blood pressure and high cholesterol, according to a new study with a unique approach to analyzing one of America’s major health-insurance programs. The study, a randomized evaluation comparing health outcomes among more than 12,000 people in Oregon, employs the same research approach as a clinical trial, but applies it in a way that provides a window into the health outcomes of poor Americans who have been given the opportunity to get health insurance. “What we found was that Medicaid significantly increased the probability of being diagnosed with diabetes, and being on diabetes medication,” says Amy Finkelstein, the Ford Professor of Economics at MIT and, along with Katherine Baicker of Harvard University’s School of Public Health, the principal investigator for the study. “We find decreases in rates of depression, and we continue to find reduced financial hardship. However, we were unable to detect a decline in the incidence of diabetes, high blood pressure, or high cholesterol.” A paper based on the study, “The Oregon Experiment — Medicaid’s Effects on Clinical Outcomes,” is being published today in the New England Journal of Medicine. The findings bear on the expansion of the federal government’s Affordable Care Act (ACA), currently being phased in across the nation.
Winning the lottery

The researchers analyzed the impact that Medicaid had on people over a two-year span. Among other things, they found about a 30 percent decline in the rate of depression among people on Medicaid; an increase in people being diagnosed with, and treated for, diabetes; and increases in doctor visits, use of preventative care, and prescription drugs. They also found that Medicaid reduced, by about 80 percent, the chance of a person having catastrophic out-of-pocket medical expenses, defined as spending 30 percent of one’s annual income on health care. “That’s important, because from an economics point of view, the purpose of health insurance is to … protect you financially,” Finkelstein says. The researchers did not find any change in three other health measures: blood pressure, cholesterol, or a blood test for diabetes. But the data does provide important indicators about the ways newly-insured people are using medical services. “There was a big increase in the use of preventative medicine,” says Baicker, noting that Medicaid increased the use of services such as mammograms and cholesterol screening, as well as increasing doctor's office visits and prescription drugs. Other health researchers say these findings correspond with a developing picture of how increased medical care addresses different kinds of problems over different spans of time. “I would expect a more immediate impact when it comes to measures of mental health and emotional well-being, including depression,” says Thomas McDade, an anthropologist at Northwestern University and director of its Laboratory of Human Biology Research, who studies public-health issues.
“Things like risk for cardiovascular disease, your lipid concentrations, your blood pressure, these are things that are really established over a lifetime of exposure to diet, physical activity, and psychosocial environment, so we don’t expect them to move as quickly.” The study uses data from a unique program the state of Oregon founded in 2008, after officials realized they had Medicaid funds for about 10,000 additional uninsured residents. The state created a lottery system to fill those 10,000 slots; about 90,000 residents applied. That lottery thus generated a group of residents gaining Medicaid coverage who were otherwise similar to the applicants still lacking coverage. Using this divide, the researchers compared 6,387 people who signed up for the lottery and were selected to a control group of 5,842 people who applied for Medicaid but were not selected to enroll. “We recognized the lottery as a literally once-in-a-lifetime opportunity to bring the rigors of a randomized controlled trial, which is the gold standard in medical and scientific research, to one of the most pressing social policy questions of our day, namely, the consequences of covering the uninsured,” Finkelstein says. Or as Baicker puts it, “We would never accept a medical trial that didn’t have a control group.” In particular, this kind of study, by matching two like groups of people, eliminates one longstanding problem in studying health insurance: that people in worse health may seek out health insurance more often than those in good health do, thus making it appear, at a glance, that having health insurance does not help improve medical outcomes. “The whole tension with studying the effects of insurance is, you have to wonder why some people have insurance and other people don’t, and whether those reasons could be related to the outcomes you’re studying,” Finkelstein explains, “like the possibility that people who are sicker seek out insurance more.
So you can get perverse results [on the surface], indicating that health insurance makes you sicker, not because it actually does, but because of the kinds of people who are seeking it out.” As McDade also notes, “It’s a true experiment, and these kinds of opportunities do not come along very often.” ...
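The lottery design is what makes a simple comparison credible: because selection was randomized, a difference in mean outcomes between winners and non-winners estimates the causal effect of Medicaid eligibility. A minimal sketch with simulated data (not the Oregon data; the group sizes echo the article, but the outcome variable and the assumed effect size of -0.3 are invented):

```python
import math
import random

def diff_in_means(treated, control):
    """Difference in mean outcomes between lottery winners and the control
    group, with a conventional unequal-variance standard error."""
    mt = sum(treated) / len(treated)
    mc = sum(control) / len(control)
    vt = sum((x - mt) ** 2 for x in treated) / (len(treated) - 1)
    vc = sum((x - mc) ** 2 for x in control) / (len(control) - 1)
    se = math.sqrt(vt / len(treated) + vc / len(control))
    return mt - mc, se

# Simulated outcome (think of a depression score), NOT the Oregon data:
# we assume enrollment lowers the score by 0.3 on average.
random.seed(1)
control = [random.gauss(2.0, 1.0) for _ in range(5842)]  # not selected
treated = [random.gauss(1.7, 1.0) for _ in range(6387)]  # lottery winners

effect, se = diff_in_means(treated, control)
print(f"effect = {effect:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```

With randomization, no selection-on-health correction is needed; without it, this same calculation would conflate the treatment effect with whatever drove people to seek insurance.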
Tethering -- students and dogs: Oregon Legislature today
Electronic tethering of students and actual tethering of dogs get the attention of Oregon lawmakers today. A Senate education committee takes up House Bill 2386, which would prohibit school districts from requiring a student to wear or carry an ...
2013 shaping up to be an active legislative session for Oregon labor - Nwlaborpress
Putting a focus on wine - Polkio.com
Oregon Senate approves guidelines for service animals - KATU
Driver's license bill unfair to legal Oregonians - Portland Tribune
In the Oregon Legislature, few bills have been moved with greater alacrity than Senate Bill 833, which grants four-year driver's licenses to undocumented residents. Only three weeks ago, the Senate Business and Transportation Committee held the ...
Oregon driving bill becomes law at May Day rally - McMinnville News-Register
Oregon governor signs immigrant driving bill - The Register-Guard
Reward for a crime? - Albany Democrat Herald (blog)
- Rethinking Macroeconomic Policy - iMFdirect
- The core CPI-PCE inflation gap - FT Alphaville
- The Trouble with Low Inflation - Jared Bernstein
- Finite Sample Properties of GMM - Dave Giles
- Estimating an Euler Equation Using GMM - Dave Giles
- Treasury and MBS Markets as QE Continues - Carola Binder
- When does +2.2% = -0.1%? er, never? - Environmental Economics
- How Long Will You Be Willing to Tweet for Free? - Justin Fox
- Speaking of Getting It Wrong - Paul Krugman
- The lost generation - Free exchange
- Austerity is not the only answer to debt - Reinhart and Rogoff
- Any Examples of Thoughtful Public-Spirited Republicans? - Brad DeLong
- Six Ways to Separate Lies From Statistics - Bloomberg
- Skilled work - UnderstandingSociety
- Quandaries for Macroeconomic Policy - Tim Taylor
- Not Everything Is Political - Paul Krugman
- Is a Higher Borrowing Trajectory Warranted or Not? - Brad DeLong
- GOP bill prevents government from collecting economic data - Dylan Mathews
- I Debated Alan Reynolds on Fiscal Policy... - Brad DeLong
- Bubbles - Crooked Timber
FOMC Leaves Policy Unchanged, by Tim Duy: The FOMC concluded their two-day meeting by holding policy constant, as expected. The assessment of the economy was largely unchanged. From March:
Information received since the Federal Open Market Committee met in January suggests a return to moderate economic growth following a pause late last year. Labor market conditions have shown signs of improvement in recent months but the unemployment rate remains elevated. Household spending and business fixed investment advanced, and the housing sector has strengthened further, but fiscal policy has become somewhat more restrictive. Inflation has been running somewhat below the Committee's longer-run objective, apart from temporary variations that largely reflect fluctuations in energy prices. Longer-term inflation expectations have remained stable.
And from this week's statement:
Information received since the Federal Open Market Committee met in March suggests that economic activity has been expanding at a moderate pace. Labor market conditions have shown some improvement in recent months, on balance, but the unemployment rate remains elevated. Household spending and business fixed investment advanced, and the housing sector has strengthened further, but fiscal policy is restraining economic growth. Inflation has been running somewhat below the Committee's longer-run objective, apart from temporary variations that largely reflect fluctuations in energy prices. Longer-term inflation expectations have remained stable.
Notably, recent data has had little impact on the Fed's economic outlook. This includes the last employment report as well. The inclusion of the term "on balance" was clearly intended to downplay the March numbers.
My interpretation is that the Fed is attempting to move away from being pulled this way and that by the monthly fluctuations of the data and instead focus on the underlying trend; presumably, it is that trend that should be guiding policy decisions. Of course, one could argue that that underlying trend should induce them to additional action, but that is neither here nor there at this point. From their perspective, policy is appropriate given that trend. The Fed also strengthened its language on fiscal policy, but again the damage so far is not sufficient to change the course of policy. Or, probably more accurately, the damage is not so great that the Fed is willing to let Congress hit the ball into their court.
The other significant change came later in the statement. From March:
The Committee will closely monitor incoming information on economic and financial developments in coming months. The Committee will continue its purchases of Treasury and agency mortgage-backed securities, and employ its other policy tools as appropriate, until the outlook for the labor market has improved substantially in a context of price stability. In determining the size, pace, and composition of its asset purchases, the Committee will continue to take appropriate account of the likely efficacy and costs of such purchases as well as the extent of progress toward its economic objectives.
And from this week's statement:
The Committee will closely monitor incoming information on economic and financial developments in coming months. The Committee will continue its purchases of Treasury and agency mortgage-backed securities, and employ its other policy tools as appropriate, until the outlook for the labor market has improved substantially in a context of price stability. The Committee is prepared to increase or reduce the pace of its purchases to maintain appropriate policy accommodation as the outlook for the labor market or inflation changes. In determining the size, pace, and composition of its asset purchases, the Committee will continue to take appropriate account of the likely efficacy and costs of such purchases as well as the extent of progress toward its economic objectives.
In the wake of the last meeting, comments from some Fed presidents as well as the minutes themselves seemed to imply that further expansion of the large scale asset purchase program was out of the question. Instead, it seemed the focus had firmly shifted to ending QE as soon as possible, with the end of this year as a goal. With this language shift, the FOMC pulls back on this direction, and instead makes it clear that an expansion of the program is still possible. And possible not only due to a changing employment outlook, but also due to a deteriorating inflation picture.
But isn't the inflation picture already deteriorating? Yes, the latest numbers suggest a worsening disinflation trend. The Fed, however, probably has not adjusted their forecast; they probably do not expect that substantially lower inflation is likely given that inflation expectations remain anchored and economic activity is not deteriorating. In such an environment, they likely are not all that concerned that inflation is running somewhat below target.
Moreover, the Fed has that cost-benefit analysis thing working in the background, and likely believes that any more than $85 billion a month is not likely to have large, positive marginal benefits. Not enough to justify expanding policy further for any small changes to the forecast.
Bottom Line: The urge to taper off quantitative easing has lessened since the last meeting. That pushes the beginning of the end back to the later part of the year. The door is open to additional stimulus as well, but I suspect that it would have to be driven by the employment side of the mandate. Clear evidence of a deflationary threat is likely necessary to drive action on the other side of the mandate; such a threat seems unlikely in an expanding economy.
I have a few comments at MoneyWatch on the Fed's decision to keep policy unchanged: What comes next for the Fed?
Publication in our best journals is based on factors other than merit, "author prestige also comes into play":
We spoke with Virginia economics professor William R. Johnson, who edited the edition of the Review in which the [Reinhart and Rogoff] paper first appeared.
This annual edition, "Papers and Proceedings," differs from all others in that the papers come out of presentations made at the yearly meeting of the American Economic Association, he said.
The papers are personally selected by the AEA's president-elect, in consultation with a committee.
As a result of these unusual circumstances, Johnson said, the editing of "Proceedings" papers is less rigorous.
"Normal peer review doesn't happen for these papers in the way of other issues of the AER." ...
But author prestige also comes into play, Johnson said, adding that that was true for all AER papers, not just the ones that appear in "Proceedings."

We all knew this, but it is a little surprising to see it admitted. [via www.businessinsider.com]
It was a while ago, but I once discovered an error in a paper in the AER (the author used the price level in levels rather than in logs). When I pointed out the error, the author -- who is very well known (and now the Fed chair) -- wrote a letter to the editor of the AER arguing that it didn't materially affect the results, and subsequently our note pointing out the error was rejected (much as Reinhart and Rogoff argue that their results are not materially affected by their error; I think the error mattered, as it weakens the case in the Bernanke and Blinder paper, but even if it doesn't, wrong results should be corrected -- and the results cannot be replicated based on the information in the paper). The prestigious author won out over lowly me, and to this day there is a wrong result in a table in an influential paper in the debate over how to conduct monetary policy. The only place I know of where the error is noted is in a footnote to a (relatively obscure) paper of mine (written with Jo Anna Gray). The footnote says:

Table 1 of Bernanke and Blinder [1992] reports marginal significance levels for the federal funds rate that are dramatically higher than those for M2. There are numerous differences between our studies, most of which are inconsequential. However, this discrepancy in results is due to a computational error in the Bernanke and Blinder study. While the error significantly affects some of the F-statistics reported by Bernanke and Blinder, it has little effect on the corresponding variance decompositions reported in the paper. We thank Ben Bernanke for providing assistance that allowed us to confirm and correct the error.
Again, whether or not it materially affects the results is debatable. The F-statistics changed quite a bit, the IRFs less so, but in any case results containing computational errors should be corrected, especially in papers as influential as this one turned out to be. (Update: This also shows that the profession cares very little about making it easy to replicate results.)
We will never be a science so long as this crap persists.
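For readers wondering why the level-versus-log choice matters: a price index in levels grows roughly exponentially, so its first differences trend upward over the sample, while first differences of its log give a constant inflation rate. A toy illustration (generic, not the Bernanke and Blinder data):

```python
import math

# A price index growing at a steady 3% per period.
p = [100.0]
for _ in range(99):
    p.append(p[-1] * 1.03)

# In levels, successive changes themselves grow over the sample;
# in logs, each change is the same constant inflation rate.
level_changes = [p[t + 1] - p[t] for t in range(len(p) - 1)]
log_changes = [math.log(p[t + 1]) - math.log(p[t]) for t in range(len(p) - 1)]

print(level_changes[0], level_changes[-1])  # drifts upward over time
print(log_changes[0], log_changes[-1])      # both equal ln(1.03)
```

A regression that uses the level where the log belongs therefore loads a spurious trend into the estimates, which is how test statistics like those F-statistics can shift.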
Editorial: Legislature makes the right call on driver's cards
Corvallis Gazette Times
Earlier in the session, the Oregon Legislature made the right call when it approved a “tuition equity” measure that allows the children of immigrants to qualify for in-state tuition at Oregon universities, assuming they meet certain guidelines. The ...
May Day, driver's cards and biodiesel: Oregon Legislature today
SALEM -- The state capitol is expected to buzz with activity today during a May Day protest, which is expected to draw Latino and labor groups. The theme for the event is “immigrant spring,” a fitting backdrop for Gov. John Kitzhaber to sign Senate ...
What About Inflation?, by Tim Duy: I find Binyamin Appelbaum's Fed preview to be rather depressing and distressing. Appelbaum begins with a solid insight - reducing the unemployment rate is not the same as maximizing employment:
The Federal Reserve is making modest progress in its push to reduce the unemployment rate. But that is not the jobs goal Congress actually established for the Fed. The central bank is supposed to be maximizing employment. And on that front, it is not making progress.
Appelbaum points to the employment-to-population ratio as evidence that the Fed is falling short of the mandate. But are Fed officials ready to do more? No:
There is little sign, however, that Fed officials are considering an expansion of their four-year-old stimulus campaign as the Fed’s policy-making committee prepares to convene Tuesday and Wednesday in Washington.
Appelbaum notes that the recent flow of data has forced monetary policymakers to back away from talk of ending large scale asset purchases. But among the reasons to avoid expansion of the program we find this:
Another reason the Fed is not embracing new measures is that it already has tied the duration of low interest rates to the unemployment rate. The Fed said in December that it intended to hold interest rates near zero at least as long as the unemployment rate remained above 6.5 percent, provided that inflation remained under control. The theory is that the economy will get as much stimulus as it needs.
But what if the inflation rate is persistently below the target? Or, worse, trending lower? Clearly then the economy is not getting the stimulus it needs. If we are missing on both targets, then the economy needs more stimulus. And while we can debate the efficacy of monetary policy in influencing the pace of employment growth, surely monetary policy can influence the inflation rate. Correct?
The distressing part of this article is that it reads as if the Fed has given up not only on its ability to influence the pace of employment growth, but also on its ability to influence the inflation rate. Or, possibly worse, that the Fed is simply no longer concerned with the inflation rate now that the obvious threat of deflation has passed. This again feeds suspicion that the Fed's 2 percent target is really an upper bound.
Bottom Line: The Fed is supposed to have a dual mandate. Dual, as in two: maximum employment and price stability. One would think that failing at the latter would be at least as important as failing at the former. Perhaps we are learning that the Evans rule is flawed: it should specify not only the conditions under which the Fed considers removing stimulus, but also the conditions under which the Fed deliberately considers adding stimulus. A two-sided Evans rule is needed.
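The proposal can be stated as a symmetric threshold rule. A sketch, where the 6.5 percent unemployment threshold is the Fed's stated December guidance and the 1.5 percent inflation floor is a hypothetical number chosen purely for illustration:

```python
def two_sided_evans_rule(unemployment, inflation,
                         u_threshold=6.5, pi_floor=1.5):
    """Illustrative two-sided threshold rule in the spirit of the post.

    The 6.5% unemployment threshold echoes the Fed's December guidance;
    the 1.5% inflation floor is a hypothetical trigger for adding stimulus.
    """
    if inflation < pi_floor:
        return "add stimulus"          # missing price stability from below
    if unemployment > u_threshold:
        return "hold rates near zero"  # employment-side guidance still binds
    return "consider removing stimulus"

# Roughly the situation described in the post: elevated unemployment
# and inflation running well below target.
print(two_sided_evans_rule(unemployment=7.6, inflation=1.1))
```

The point of the second branch is exactly the asymmetry complaint: the existing rule only describes when stimulus may be removed, never when persistent undershooting of the inflation target should force more.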
- Leading Macroeconomist Leaves Central Bank - mainly macro
- China Manufacturing Weakens - WSJ
- What the ECB can and cannot do to heal the eurozone - Gavyn Davies
- Fiscal forecasts: Governments vs independent agencies - Vox EU
- The Republican War on Social Science - Slate Magazine
- Some Official Data Come With Standard Errors! - Dave Giles
- There is No Debt Crisis In Europe - Macro and Other Market Musings
- Fed weighs tighter cap on bank leverage - FT.com
- The study behind 'uncertainty' is worse... - Ezra Klein
- Why the Fed probably won’t expand QE on Wednesday - Neil Irwin
- Confidence Intervals for Impulse Response Functions - Dave Giles
- The war on science takes a turn - Steve Benen
- Looking for Inflation? It Vanished - WSJ
- The Myth of the Manufacturing ‘Renaissance’ - WSJ
- Can You Say “Bubble”? - The Baseline Scenario
- Is antimatter anti-gravity? - EurekAlert
- Does antimatter fall up or down? - EurekAlert
- Which way for the ECB? - Bruegel Blog
- The Beatings Must Continue - Paul Krugman
- The missing workers - Economic Policy Institute
- Wealth Inequality and Political Inequality - Economix
- The mother of invention - Interfluidity
- Taylor on monetary policy - John Cochrane
- The Costs and Benefits of QE - Jared Bernstein
- The Economics Knowledge of High Schoolers - Tim Taylor
- Is it Crazy to Think We Can Eradicate Poverty? - NYT
I thought this session on electronic payments was interesting: The Future of Money: What's in Your Digital Wallet?
Tuesday, April 30, 2013, 9:30 AM - 10:30 AM
Speakers:
- Eric Dunn, Senior Vice President, Commerce Network Solutions, Intuit
- Paul Galant, CEO, Citi Enterprise Payments
- Mark Lavelle, Senior Vice President, Strategy and Business Development, PayPal
- Ed McLaughlin, Chief Emerging Payments Officer, MasterCard
Paul Krugman: Blogging is a bit like teaching the same class year after year; inevitably there are moments when you feel exasperated at the class’s failure to grasp some point you know you explained at length — then you realize that this was last year or the year before, and it was to a different group of people. So, I gather that the old core inflation bugaboo is rearing its head again — the complaint that it’s somehow stupid, dishonest, or worse to measure inflation without food and energy prices, often coupled with the claim that the statistics are being manipulated anyway. So, time for a refresher. ...
His refresher is here. Let me offer one of my own (from 2008):
Why Do We Use Core Inflation?: There is a lot of confusion over the Fed's use of core inflation as part of its policy making process. One reason for the confusion is that a single measure is used for three different purposes, so the meaning of the term "core inflation" depends upon how it is being used.
First, core inflation is used to forecast future inflation. For example, this recent paper uses a "bivariate integrated moving average ... model ... that fits the data on inflation very well," and finds that the long-run trend rate of inflation "is best gauged by focusing solely on prices excluding food and energy prices." That is, this paper finds that predictions of future inflation based upon core measures are more accurate than predictions based upon total inflation.
Second, we also use the core inflation rate to measure the current trend inflation rate. Because the inflation rate we observe contains both permanent and transitory components, the long-run inflation rate that consumers face going forward is not observed directly; it must be estimated. When food and energy are removed to obtain a core measure, the idea is to strip away the short-run movements, thereby giving a better picture of the core or long-run inflation rate faced by households. I should note, however, that this is neither the only nor the best way to extract the trend, and the Fed also looks at other measures of the trend inflation rate that have better statistical properties. Thus, while the first use of core inflation was for forecasting future inflation rates, this use attempts to find today's trend inflation rate. [There is a way to combine the first and second uses into a single conceptual framework that encompasses both, but it seemed more intuitive to keep them separate. In both cases, the idea is to find the inflation rate that consumers are likely to face in the future.]
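One example of an alternative trend measure with better statistical properties is a trimmed mean: rather than always excluding food and energy, it drops whichever component price changes are most extreme in a given month and averages the rest. A minimal sketch (the component numbers below are hypothetical, chosen for illustration):

```python
# Trimmed-mean inflation: drop the most extreme price changes each month,
# whatever components they come from, then average what remains.
def trimmed_mean(price_changes, trim=0.2):
    xs = sorted(price_changes)
    k = int(len(xs) * trim)      # number to drop from each tail
    kept = xs[k:len(xs) - k]
    return sum(kept) / len(kept)

# Hypothetical component inflation rates for one month (annualized percent);
# the -6.0 and 9.5 entries play the role of volatile energy/food prices.
components = [2.1, 1.9, 2.0, 2.2, 9.5, -6.0, 2.3, 1.8]
print(round(trimmed_mean(components), 2))  # 2.05, close to the stable cluster
```

Unlike the ex-food-and-energy measure, the trimmed mean lets the data decide each month which prices are the noisy ones.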
Let me emphasize one thing. If the question is "what is today's inflation rate," the total inflation rate is the best measure. It's intended to measure the cost of living and there's no reason at all to strip anything out. It's only when we ask different questions that different measures are used.
Third, and this is the function most often ignored in discussions of core inflation, though to me it is the most important of the three: the inflation target that best stabilizes the economy (i.e., best reduces the variation in output and employment) is a version of core inflation.
In theoretical models used to study monetary policy, the procedure for setting the policy rule is to find the monetary policy rule that maximizes household welfare (by minimizing variation in variables such as output, consumption, and employment). The rule will vary by model, but it usually involves a measure of output and a measure of prices, and those measures can be in levels, rates of change, or both depending upon the particular model being examined.
In general, a Taylor rule type framework comes out of this process (i.e. a rule that links the federal funds rate to measures of output and prices). However, in the policy rule, the best measure of prices is usually something that looks like a core measure of inflation. Essentially, when prices are sticky, which is the most common assumption driving the interaction between policy and movements in real variables in these models, it's best to target an index that gives most of the weight to the stickiest prices (here's an explanation as to why from a post that echoes the themes here). That is, volatile prices such as food and energy are essentially tossed out of the index used in the policy rule.
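For concreteness, here is a minimal sketch of a Taylor-type rule using the coefficients from Taylor's original 1993 paper (the specific numbers are illustrative, not from the post); the point above is that the inflation term fed into such a rule is typically a core measure rather than headline:

```python
# Taylor (1993)-style rule: the policy rate responds to the gap between
# (core) inflation and its target and to the output gap.
def taylor_rate(core_inflation, output_gap=0.0,
                target=2.0, r_star=2.0, phi_pi=0.5, phi_y=0.5):
    return (r_star + core_inflation
            + phi_pi * (core_inflation - target)
            + phi_y * output_gap)

# At 3% core inflation and a 1% positive output gap:
print(taylor_rate(3.0, output_gap=1.0))  # 2 + 3 + 0.5 + 0.5 = 6.0
```

Feeding a volatile headline number into `core_inflation` would make the prescribed rate whipsaw with every oil-price move, which is why the theoretical exercises favor sticky-price indexes.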
The indexes that come out of this type of theoretical exercise often include both output and input prices, and occasionally asset prices as well. That is, a core measure of inflation composed of just output prices isn't the best thing for policymakers to target; a more general core inflation rate combining both input and output prices works better. ...
Finally, there is also a question of what we mean by inflation conceptually. Does a change in relative prices, e.g. from a large increase in energy costs, that raises the cost of living substantially count as inflation, or do we require the changes to be common across all prices as would occur when the money supply is increased? Which is better for measuring the cost of living? Which is a better target for stabilizing the economy? The answers may not be the same. For a nice discussion of this topic, see this speech given yesterday by Dennis Lockhart, President of the Atlanta Fed:
Inflation Beyond the Headlines, by Dennis P. Lockhart, President, Federal Reserve Bank of Atlanta: ...Let me begin by posing the simple question: What do we mean by "inflation"? The answer to that simple question isn't as simple as it may seem.
The popular treatment of inflation in our sound bite society risks confusing inflation with relative price movements and the cost of living. By cost of living, I'm referring to the costs you and I incur to maintain our level of consumption of various goods and services including essential items such as food, gasoline, and lodging.
Relative price movements occur continuously in an economy as individual prices react to market forces affecting that good or service. Neither relative price movements nor sustained high living costs constitute inflation as economists commonly use the term....
And I think I'll end with this part of his remarks:
Attempts to measure the aggregate rate of price change—no matter how sophisticated—remain imperfect. As a result, when it comes to measuring inflation, judgment is needed to distinguish persistent price movements that underlie overall inflation from the relative price adjustments. Separating the inflation signal from noise involves much uncertainty—especially when making decisions in real time. Discerning accurately the underlying trend is difficult. It is essential for those of us who have responsibility for responding to these trends to use a wide variety of core measures and inflation projections to make the most informed judgment we can.
This shouldn't be overly surprising, but Nouriel Roubini is pessimistic about the global economy (bubble/boom for two years, then a big crash): Lunch Panel: Global Overview
Monday, April 29, 2013 12:00 PM - 1:45 PM Introduction By: Michael Klowden, CEO, Milken Institute Speakers:
- Pierre Beaudoin, President and CEO, Bombardier Inc.
- Scott Minerd, Managing Partner and Global Chief Investment Officer, Guggenheim Partners
- Nouriel Roubini, Chairman and Co-Founder, Roubini Global Economics; Professor of Economics and International Business, Stern School of Business, New York University
- Geraldine Sundstrom, Partner and Portfolio Manager, Emerging Markets Strategies Master Fund Limited, Brevan Howard
“Should public resources go to the group most likely to take full advantage of them, or to the group that is most desperately in need of assistance?”: Are we purging the poorest?, by Peter Dizikes, MIT News Office: In cities across America over the last two decades, high-rise public-housing projects, riddled with crime and poverty, have been torn down. In their places, developers have constructed lower-rise, mixed-use buildings. Crime has dropped, neighborhoods have gentrified, and many observers have lauded the overall approach.
But urban historian Lawrence J. Vale of MIT does not agree that the downsizing of public housing has been an obvious success.
“We’re faced with a situation of crisis in housing for those of the very lowest incomes,” says Vale, the Ford Professor of Urban Design and Planning at MIT. “Public housing has continued to fall far short of meeting the demand from low-income people.”
Take Chicago, where the last of the Cabrini-Green high-rises was torn down in 2011, ending a dismantling that commenced in 1993. Those buildings — just a short walk from the neighborhood where Vale grew up — have been replaced by lower-density residences. But where 3,600 apartments were once located, there are now just 400 units constructed for ex-Cabrini residents. Other Cabrini-Green occupants were given vouchers to help subsidize their housing costs, but their whereabouts have not been extensively tracked.
“There is a contradiction in saying to people, ‘You’re living in a terrible place, and we’re going to put massive investment into it to make it as safe and attractive as possible, but by the way, the vast majority of you are not going to be able to live here again once we do so,’” Vale says. “And there is relatively little effort to truly follow through on what the life trajectory is for those who go elsewhere and don’t have an opportunity to return to the redeveloped housing.”
Now Vale is expanding on that argument in a new book, “Purging the Poorest: Public Housing and the Design Politics of Twice-Cleared Communities”...
“Chicago and Atlanta are probably the nation’s most conspicuous experiments in getting rid of, or at least transforming, family public housing,” Vale explains. However, he notes, “It’s hard to find an older American city that doesn’t have at least one example of this double clearance.”
Essentially, Vale says, these cities exemplify one basic question: “Should public resources go to the group most likely to take full advantage of them, or to the group that is most desperately in need of assistance?”
Vale sees U.S. policy as vacillating between these views over time. At first, public housing was meant “to reward an upwardly mobile working-class population” — making public housing a place for strivers. Slums were cleared and larger apartment buildings developed, including Atlanta’s Techwood Homes, the first such major project in the country.
But after 1960, public housing tended to be the domain of urban families mired in poverty. “The conventional wisdom was that public housing dangerously concentrated poor people in a poorly designed and poorly managed system of projects, and we are now thankfully tearing it all down,” Vale says. “But that was mostly a middle phase of concentrated poverty from 1960 to 1990.”
Over the last two decades, he says, the pendulum has swung back, leaving a smaller number of housing units available for the less-troubled, which Vale calls “another round of trying to find the deserving poor who are able to live in close proximity with now-desirable downtown areas.”
Vale’s critique of this downsizing involves several elements. Projects such as Cabrini-Green might have been bad, but displacing people from them means “the loss of the community networks they had, their church, the people doing day care for their children, the opportunities that neighborhood did provide, even in the context of violence.”
Demolishing public housing can hurt former residents financially, too. “Techwood and Cabrini-Green were very central to downtown and people have lost job opportunities,” Vale says. Indeed, the elimination of those developments, even with all their attendant problems, does not seem to have measurably helped many former residents gain work... “We don’t have very fine-tuned instruments to understand the difference between the person who genuinely needs assistance and the person who is gaming the system,” Vale says. “Far larger numbers of people get demonized, marginalized or ignored, instead of assisted.” ... Ultimately, Vale thinks, the reality of the ongoing demand for public housing makes it an issue we have not solved.
“The irony of public housing is that people stigmatize it in every possible way, except the waiting lists continue to grow and it continues to be very much in demand,” Vale says. “If this is such a terrible [thing], why are so many hundreds of thousands of people trying to get into it? And why are we reducing the number of public-housing units?”