
Economically Optimal Food Waste

It is hard to turn around without seeing another story on food waste.  The latest was this Freakonomics blog post covering an article in Foreign Policy by John Norris.  Just prior to that was a widely discussed study by Harvard Law School, which focused on the effect of expiration dates on food waste.  A widely cited statistic comes from this UN FAO publication, which suggests a third of all food produced is wasted. 

Much could be said about the methodological shortcomings of many of the studies on this topic, to say nothing of the ideological motivations behind many (but certainly not all) such claims (waste is taken as some sort of condemnation of capitalism; the problem of production is “solved,” and we just need to distribute more equitably – as if one could confiscate and redistribute without destroying the incentive to produce). 

Nevertheless, when thinking about the problems of global hunger and feeding a growing population, all solutions need to be on the table, and reducing waste is one of them.  As some of these publications make clear, there are legal and industry practices that could be changed to reduce waste (crazy policies like Bloomberg’s ban on food donations to homeless shelters because of salt content are one obvious example), and we should never forget technological advancements that help prevent waste (preservatives, anyone?). 

But, we will never have zero waste. 

Why?  As my friend Bailey Norwood pointed out to me the other day: there is an economically optimal amount of waste. 

Do you ever buy milk with the expectation that some of it will get thrown out?  I do.  The cost to me of running out of milk and having to go out to buy more if one of my kids has a midnight craving is much higher than the cost of buying an extra half gallon that goes sour before it can be completely consumed.  Convenience, hassle avoidance, and fewer trips to the store are all valuable to me; valuable enough that it occasionally makes sense to throw away a little milk.  Otherwise, I’d be throwing away my valuable time, sleep, and gas driving to the store.  One thing “wasted” is another thing gained (or at least not foregone).
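To make that trade-off concrete, here is a back-of-the-envelope version of the calculation.  Every number below is made up purely for illustration:

    # Back-of-the-envelope milk example (all numbers are hypothetical).
    MILK_COST = 2.50     # price of an extra half gallon
    SPOIL_PROB = 0.40    # chance the extra milk sours before it's used
    TRIP_COST = 10.00    # value of time, sleep, and gas for a midnight store run
    RUNOUT_PROB = 0.25   # chance we run out if we skip the extra half gallon

    expected_waste_cost = MILK_COST * SPOIL_PROB   # $1.00 on average
    expected_trip_cost = TRIP_COST * RUNOUT_PROB   # $2.50 on average

    print(f"Expected cost of buying extra (waste): ${expected_waste_cost:.2f}")
    print(f"Expected cost of not buying (hassle):  ${expected_trip_cost:.2f}")

With these made-up numbers, buying the extra half gallon is the cheaper gamble, even though it means pouring milk down the drain 40% of the time.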

At each and every phase of the food production, distribution, and consumption chain, similar calculations will reveal situations in which the benefit of preventing waste simply isn’t high enough to merit the effort.

I’m sure there must be some papers on this in the economics literature, but a quick search didn’t reveal much.  Some sort of modeling would be useful to identify the determinants of waste, and reveal when it is actually economically efficient to do something about it.
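One off-the-shelf starting point (my suggestion, not something that search turned up) is the classic newsvendor model from inventory theory, which makes the optimal amount of waste explicit.  A minimal sketch, with hypothetical parameter values:

    # Toy newsvendor model of optimal household food stocks.
    # All parameter values are hypothetical.
    from scipy.stats import norm

    c_under = 10.0   # cost per unit of running short (hassle, extra trips)
    c_over = 2.5     # cost per unit of surplus that spoils (wasted food)

    # Uncertain demand, e.g., gallons of milk consumed per week.
    mean_demand, sd_demand = 1.0, 0.3

    # The expected-cost-minimizing stock q* satisfies the critical
    # fractile condition F(q*) = c_under / (c_under + c_over).
    fractile = c_under / (c_under + c_over)
    q_star = norm.ppf(fractile, loc=mean_demand, scale=sd_demand)

    print(f"Optimal stock: {q_star:.2f} gallons (fractile = {fractile:.2f})")

Whenever running short is costly (c_under > 0), the optimal stock exceeds median demand, so some positive expected spoilage is efficient rather than a mistake.  Comparative statics on c_under, c_over, and demand uncertainty map directly onto the determinants listed below.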

The Foreign Policy article has some useful discussion of factors that could fit well into an economic model of waste.  My intuition is that it is more likely to be economically optimal to waste when:

  • food prices are lower relative to fuel, storage, etc.
  • incomes are higher
  • food preserving technologies (e.g., infrastructure, refrigeration, sodium benzoate, etc.) are more expensive or less available
  • there is greater demand for freshness, appearance, etc. (likely correlated with income)  
  • laws encouraging waste are more prevalent

Thoughts?  

Reduced Meat Consumption and Environmental Impacts

It is often said by environmental groups and by many in the media that eating meat is one of the worst things one can do for the environment. 

Just to give a few examples, NPR ran a series of shows about this time last year on meat.  In one of these shows, it was said that meat consumption has: 

more of an impact on the environment than any other food we eat.

and Dan Charles, the NPR correspondent, wrote of meat production:

It's one cause of deforestation, global warming, water pollution, a lot of environmental problems

To give another example, Bryan Walsh, writing for TIME magazine in 2008, said: 

It's true that giving up that average 176 lb. of meat a year is one of the greenest lifestyle changes you can make as an individual.

And, of course, one can find even more polemical arguments that make a similar case, such as Mark Bittman's TED talk.

One of the bases for these claims is the greenhouse gas emissions caused by livestock production.  Estimates vary widely, but one common stat cited from the UN FAO is that livestock are responsible for 18% of all global greenhouse gas emissions (note, however, some mistakes in their calculations have come to light suggesting this figure is inflated).  Some environmental groups put the statistic much higher, saying livestock production is "tied to" 51% of global greenhouse emissions (a figure I don't find many credible scientists supporting).  But our own EPA estimates that, within the US, ALL of agriculture contributed only 8% of total greenhouse gas emissions from 1990-2011, and only 6.9% in 2011.  Livestock's share, then, must be something less than this (it was estimated at around 3% by the EPA a few years ago).   

I mention all this because of several news reports I've heard in the past couple of days, such as this one from the Washington Post, indicating:

Greenhouse gas emissions from power plants and other industrial facilities declined by 4.5 percent from 2011 to 2012 as utilities continued to switch from coal to natural gas to generate electricity and produced slightly less power overall, the Environmental Protection Agency reported Wednesday.

Greenhouse gas emissions from these sources have declined by 10 percent in the two years since the EPA began compiling the data in 2010.

A 4.5% reduction in one year and a 10% reduction in two years is a sizable change.  According to the EPA data, power plants account for 31% of total U.S. greenhouse gas emissions.  Thus, a 10% decrease in power plant emissions results in a 3.1% decrease in total US emissions.  

How much would one have to cut livestock production to achieve the same 3.1% decrease in total US emissions produced by the switch to natural gas (brought about, in part, by fracking technology)?  Well, simple math shows that if you hold the share of greenhouse gas emissions by livestock constant, you'd have to reduce livestock production by more than 100% if you believe the EPA's figure (that 3% of all GHG emissions are from livestock), or by 17.2% if you believe the UN FAO's number (that 18% of all GHG emissions are from livestock), to achieve the same outcome we've actually witnessed in the last two years in part through fracking.  Yes, reducing livestock production might reduce greenhouse gas emissions, but it seems much more has been cut by the switch from coal to gas than we can probably ever expect from reducing meat consumption.  
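The arithmetic behind those figures is simple enough to check directly:

    # Checking the back-of-the-envelope emissions arithmetic above.
    power_plant_share = 0.31   # power plants' share of total US GHG emissions
    power_plant_cut = 0.10     # two-year decline in power plant emissions

    total_cut = power_plant_share * power_plant_cut
    print(f"Cut in total US emissions: {total_cut:.1%}")   # 3.1%

    # Livestock cut needed to match that 3.1%, under two different
    # estimates of livestock's share of total emissions:
    for label, share in [("EPA (~3%)", 0.03), ("UN FAO (18%)", 0.18)]:
        print(f"{label}: reduce livestock by {total_cut / share:.1%}")
    # EPA (~3%): reduce livestock by 103.3% (i.e., impossible)
    # UN FAO (18%): reduce livestock by 17.2%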

It is also useful to add that technological change has led to reductions in greenhouse gas emissions in livestock production.  One study in the Journal of Animal Science by Jude Capper calculates that technological change from 1997 to 2007 reduced methane emissions by about 23% and carbon emissions by about 20%.  Indeed, the executive summary from the EPA's report on changes in emissions indicates a major reduction in methane emissions has come from changes in livestock production (emphasis added):

CH4 emissions, which have decreased by 8.2 percent since 1990, resulted primarily from natural gas systems, enteric fermentation associated with domestic livestock, and decomposition of wastes in landfills.

Effects of Climate Change

Matt Ridley has an interesting piece in the Spectator on the effects of climate change.  He makes the rather unremarkable observation that we should count the benefits, not just the costs of climate change.  Unremarkable except that people almost exclusively focus on the costs.  

What are these benefits?  He writes: 

The chief benefits of global warming include: fewer winter deaths; lower energy costs; better agricultural yields; probably fewer droughts; maybe richer biodiversity.

and  

The greatest benefit from climate change comes not from temperature change but from carbon dioxide itself. It is not pollution, but the raw material from which plants make carbohydrates and thence proteins and fats.

Ridley reads the scientific research to suggest that the benefits will exceed the costs until temperatures rise too high - around 2080, according to some projections.  He writes: 

You can choose not to believe the studies Prof Tol has collated. Or you can say the net benefit is small (which it is), you can argue that the benefits have accrued more to rich countries than poor countries (which is true) or you can emphasise that after 2080 climate change would probably do net harm to the world (which may also be true). You can even say you do not trust the models involved (though they have proved more reliable than the temperature models). But what you cannot do is deny that this is the current consensus. If you wish to accept the consensus on temperature models, then you should accept the consensus on economic benefit.
Overall, Prof Tol finds that climate change in the past century improved human welfare. By how much? He calculates by 1.4 per cent of global economic output, rising to 1.5 per cent by 2025. For some people, this means the difference between survival and starvation.
It will still be 1.2 per cent around 2050 and will not turn negative until around 2080. In short, my children will be very old before global warming stops benefiting the world. Note that if the world continues to grow at 3 per cent a year, then the average person will be about nine times as rich in 2080 as she is today. So low-lying Bangladesh will be able to afford the same kind of flood defences that the Dutch have today.

 

How much do you value the present over the future?

Economists have long been interested in people's "discount rates" - the rate at which people discount the value of a dollar in the future as compared to a dollar today.  If given the choice between being given a dollar today or a dollar ten years from now, I suspect almost everyone would take the dollar today.  The key question researchers try to answer is this: exactly how many dollars would you have to be given in, say, 10 years to make you indifferent to 1 dollar today?  
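Under a constant annual discount rate, the indifference point is easy to compute: $1 today is equivalent to (1 + r)^t dollars t years from now.  A quick illustration, with a rate chosen purely for the example:

    # Indifference amount under a constant (exponential) discount rate.
    r, t = 0.05, 10   # 5% annual rate, 10-year horizon (hypothetical values)
    future_equivalent = (1 + r) ** t
    print(f"${future_equivalent:.2f}")   # ~$1.63 in 10 years equals $1 today

So someone with a 5% discount rate would need about $1.63 in ten years to be indifferent to $1 today.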

This number has important implications for how we should "discount" the value of projects that provide benefits in the future by incurring costs today.  Examples where a discount rate is needed include road building, going to college, and building a factory.  But the issue has become particularly heated in relation to debates over climate change.  Whether and to what extent one is willing to incur costs today to mitigate carbon emissions depends critically on the extent to which the future benefits (and potential costs) are discounted.   

One can get a feel for this time trade-off by looking at market interest rates, but that doesn't tell the whole story.  As a result, many economists have turned to laboratory experiments where people make choices like the one I described above.  The trouble has been that such experiments have typically been limited to future payoffs only 1 to 6 months away.  My co-authors, Therese Grijalva and Douglass Shaw, and I found a way around this, and the results are discussed in a paper forthcoming in the journal Environmental and Resource Economics:

We use a laboratory experiment to elicit discount rates over a 20-year time horizon using government savings bonds as a payment vehicle. When using a constant (exponential) discount rate function, we find an implied average discount rate of 4.9 %, which is much lower than has been found in previous experimental studies that used time horizons of days or months. However, we also find strong support for non-constant, declining discount rates for longer time horizons, with an extrapolated implied annual discount rate approaching 0.5 % in 100 years. There is heterogeneity in discount rates and risk preferences in that people with more optimistic beliefs about technological progress have higher discount rates. These findings contribute to the debate over the appropriate discount rate to use in comparing the long-term benefits of climate change mitigation to the more immediate costs.
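To see why the distinction between constant and declining rates matters so much at climate-policy horizons, here is a quick comparison of present values.  The declining-rate (hyperbolic) functional form below is just one common illustration, not necessarily the specification estimated in the paper:

    # Present value of $1,000 received t years from now under two schemes.
    def pv_exponential(amount, t, r=0.049):
        """Constant (exponential) discounting at the paper's 4.9% average rate."""
        return amount / (1 + r) ** t

    def pv_hyperbolic(amount, t, alpha=0.05):
        """A generic declining-rate scheme (illustrative, not the paper's estimate)."""
        return amount / (1 + alpha * t)

    for t in (10, 20, 50, 100):
        print(f"t={t:>3}: exponential ${pv_exponential(1000, t):8.2f}   "
              f"declining ${pv_hyperbolic(1000, t):8.2f}")

At a 100-year horizon, constant discounting at 4.9% values $1,000 at roughly $8, while the declining-rate scheme values it at roughly $167, a difference large enough to flip many climate benefit-cost calculations.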

 

Is making beef like making beer?

That is the claim in this piece at CNN by the Director of New Harvest, which aims to make lab-raised meat. 

The production of beer requires living organisms -- yeast -- and nourishment for those organisms -- grain. How these elements come together with others to make beer is straightforward in theory, and nuanced in practice. The products are varied and distinct.
Cultured meat production is extremely similar. Explained simply, all that is required is a cell line and nourishment for those cells. How the cells are grown, and under what conditions, are adjustable. The potential variety of materials and processes will allow cultured meat to take on many distinctly unique forms, flavors and textures.

I really liked the following two paragraphs about technology in food:

It's a new way of thinking because this is food science by the public, for the public. It prompts a widespread conversation about a food technology years in advance of its market release rather than years afterward. It's a new way of thinking because it's a technology largely driven by societal demand and people have pushed this forward, as donors to a cause.
The biggest reasons why cultured meat hasn't progressed further is a lack of funding and a lack of creative understanding. We're not used to food technology being a positive solution. We're not used to food development being nonprofit. And we're not used to a nonprofit group generally categorized as an animal rights/environmentalist group requiring a cancer research-scale budget. But we're learning.