Blog

What happens when we ban the slaughter of horses?

One of my former Ph.D. students, Mallory Vestal, sought to answer that question in a paper that we just published in the Journal of Agricultural and Applied Economics.  Mallory is a horse-lover, a former graduate assistant coach of the Oklahoma State University Equestrian team, and is now an assistant professor at West Texas A&M University. Here's the abstract of the paper:

As a result of several judicial rulings, processing of horses for human consumption came to a halt in 2007. This article determines the change in horse prices resulting from elimination of horse-processing facilities. As expected, lower-valued horses were more affected by the ban than higher-valued horses. The analysis suggests the slaughter ban reduced horse prices, on average, by about 13% and resulted in a loss in producer surplus to sellers of approximately 14% at the sale we analyzed. We also show horse prices are affected by a myriad of factors including breed, gender, age, coat color, and sale catalog description.

Because "lower value" horses were those most likely to (eventually) head to the slaughterhouse, we anticipated that their prices would be most affected by the slaughter ban, and that's indeed what we found.  Here's the impact of the ban on horses priced in the upper 20%, 40%, . . . and 80% of the price distribution.

There were a number of interesting side-results, like these . . .

The indicator variables related to the horse catalog descriptions were significantly
associated with horse prices. Consistent with Levitt and Dubner (2005), an ambiguous description such as “nice” was shown to negatively impact prices by −5% to −10% across all models. A more objective descriptive variable such as “finished” was significant in several of the quantiles examined and in the OLS model. Including the word “finished” in the horse’s description was associated with increased prices from 26% to 68%. This result is intuitive as it indicates the horse has specialized training and will be ready to show in the specified discipline. Another descriptive and informative variable, “100% sound,” positively impacted prices from 8% to 11%, whereas “athletic” and “quiet/gentle” negatively impacted higher-quantile prices by −10% and −8% respectively.

Want to know my own view on eating horse meat?  I hinted at it in this editorial.

New Dietary Guidelines

The federal committee that makes dietary guidelines and recommendations has just released its newest report.  As expected, they've incorporated "sustainability" objectives and have recommended a move away from meat eating.  I've previously commented on the problem with a single committee making both nutritional and sustainability recommendations, and I had a piece in the Wall Street Journal on environmental impacts of meat production.   Now we can take a look at what's actually been proposed.

Here's one tidbit from a Washington Post summary on the issue.

“We’re not saying that people need to become vegans,” said Miriam Nelson, a professor at Tufts University and one of the committee’s members. “But we are saying that people need to eat less meat.”

The panel’s findings, which were released to the public in the form of a 572-page report this afternoon, specifically recommend that Americans be kinder to the environment by eating more plant-based foods and fewer animal-based foods. The panel is confident that the country can align both health goals and environmental aims, but warns that the U.S. diet, as currently constructed, could improve.

Other conservative news sources point to some pretty heavy-handed portions of the report.  The Dietary Guidelines Advisory Committee (DGAC):

called for diet and weight management interventions by “trained interventionists” in healthcare settings, community locations, and worksites.

"Interventionists" is the right word here, but rarely are interventionists so forthcoming in their intentions. They also want to tax foods, limit speech, and monitor TV use.

DGAC also called for policy interventions to “reduce unhealthy options,” limit access to high calorie foods in public buildings, “limit the exposure” of advertisements for junk food, a soda tax, and taxing high sugar and salt items and dessert.

“Align nutritional and agricultural policies with Dietary Guidelines recommendations and make broad policy changes to transform the food system so as to promote population health, including the use of economic and taxing policies to encourage the production and consumption of healthy foods and to reduce unhealthy foods,” its report read.

“For example, earmark tax revenues from sugar-sweetened beverages, snack foods and desserts high in calories, added sugars, or sodium, and other less healthy foods for nutrition education initiatives and obesity prevention programs.”

The amount of sedentary time Americans spend in front of computers and TV sets is also a concern to the federal panel.

If you think this is a one-off isolated example, you haven't been paying attention.

Why people lie on surveys and how to make them stop

Companies spend millions (perhaps billions?) of dollars every year surveying consumers to figure out what they want.  Environmental, health, and food economists do the same to try to figure out the costs and benefits of various policies.  What are people willing to pay for organic or non-GMO foods or for country of origin labels on meat?  These are the sorts of questions I'm routinely asked.

Here's the problem: there is ample evidence (from economics and marketing among other disciplines) that people don't always do what they say they will do on a survey.  A fairly typical result from the economics literature is that the amount people say they are willing to pay for a new good or service is about twice what they'll actually pay when money is on the line.  It's what we economists call hypothetical bias.

We don't yet have a solid theory that explains this phenomenon in every situation, and it likely results from a variety of factors, including:

  • Social desirability bias: we give answers we think the surveyor wants to hear.
  • Warm glow, yea-saying, and self-presentation bias: it feels good to support "good" causes and say "yes", particularly when there is no cost to doing so and it can make us look and feel good about ourselves.
  • Idealized responses: we imagine whether we'd ever buy the good when we eventually have the money and the time is right, rather than answering whether we'd buy it here and now.
  • Strategy: if we think our answers to a survey question can influence the eventual price that is charged or whether the good is actually offered, we might over- or under-state our willingness to buy.
  • Uncertainty: research suggests a lot of the hypothetical bias comes from those who say they aren't sure whether they'd buy the good.

What to do?

Various fixes have been proposed over the years.

  • Calibration.  Take responses from a survey and reduce them by some factor so that they more closely approximate what consumers will actually do.  The problem: calibration factors are unknown and vary across people and goods.
  • Cheap talk.  On the survey, explain the problem of hypothetical bias and explicitly ask people to avoid it.  The problem: it doesn't always "work" for all people (particularly experienced people familiar with the good), and there is always some uncertainty over whether you've simply introduced a new bias.
  • Certainty scales.  Ask people how sure they are about their answers, and for people who indicate a high level of uncertainty, re-code their "yes" answers to "no".  The problem: the approach is ad-hoc, and it is hard to know a priori what the cut-off on the certainty scale should be.  Moreover, it only works for simple yes/no questions.
  • Use particular question formats.  Early practitioners of contingent valuation (an approach for eliciting willingness-to-pay popular in environmental economics) swear by a "double-bounded dichotomous choice, referendum question," which they believe has good incentives for truth telling if respondents believe their answers might actually influence whether the good is provided (i.e., if the answer is consequential).  I'm skeptical.  I'm more open to the use of so-called "choice experiments", where people make multiple choices between goods that have different attributes, and where we're only interested in "marginal" trade-offs (i.e., whether you want good A vs. good B).  There is likely more bias in the "total" (i.e., whether you want good A or nothing).
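To make the first and third fixes above concrete, here is a minimal sketch of how calibration and certainty-scale recoding might be applied to survey responses.  The calibration factor of 0.5 (consistent with the roughly two-to-one gap mentioned earlier) and the cutoff of 8 on a 10-point certainty scale are my own illustrative assumptions; in practice both vary across people and goods, which is exactly the problem noted above.

```python
# Illustrative sketch of two hypothetical-bias "fixes".
# The 0.5 calibration factor and the certainty cutoff of 8 (on a
# 1-10 scale) are made-up examples, not established values.

def calibrate(stated_wtp, factor=0.5):
    """Scale stated willingness-to-pay down by a calibration factor."""
    return [wtp * factor for wtp in stated_wtp]

def recode_by_certainty(answers, certainty, cutoff=8):
    """Recode 'yes' answers to 'no' when self-reported certainty
    (1-10) falls below the cutoff."""
    return ["no" if a == "yes" and c < cutoff else a
            for a, c in zip(answers, certainty)]

stated = [10.0, 4.0, 6.0]
print(calibrate(stated))  # [5.0, 2.0, 3.0]

answers = ["yes", "yes", "no"]
certainty = [9, 4, 10]
print(recode_by_certainty(answers, certainty))  # ['yes', 'no', 'no']
```

Note how arbitrary both knobs are: change the factor or the cutoff and the "corrected" answers change with them, which is why neither fix is fully satisfying.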

There is another important alternative.  If the problem is that surveys don't prompt people to act as they would in a market, well, why don't we just create a real market?  A market where people have to give up real money for real goods - where we make people put their money where their mouth is?  It is an approach I wrote about in the book Experimental Auctions with Jason Shogren, and it is the approach I teach with Rudy Nayga, Andreas Drichoutis, and Maurizio Canavari in the summer school we have planned for this summer in Crete (sign up now!).  It is an approach with a long history, stemming mainly from the work of experimental economists.
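The truth-telling incentive in such auctions can be seen in a toy second-price (Vickrey) auction, a workhorse mechanism in experimental auctions: the highest bidder wins but pays the second-highest bid, so your bid determines whether you win, not what you pay.  The sketch below is my own illustration, with made-up values, not code from the book.

```python
# Toy second-price (Vickrey) auction: the highest bidder wins and
# pays the second-highest bid. Bidding your true value is (weakly)
# optimal because your bid affects *whether* you win, not the price.

def second_price_auction(bids):
    """Return (winner_index, price_paid) for sealed bids."""
    ranked = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    return ranked[0], bids[ranked[1]]

def payoff(true_value, my_bid, other_bids):
    """Bidder 0's surplus given their bid and everyone else's bids."""
    winner, price = second_price_auction([my_bid] + other_bids)
    return true_value - price if winner == 0 else 0.0

others = [6.0, 4.0]
print(payoff(8.0, 8.0, others))  # truthful bid: wins, pays 6 -> 2.0
print(payoff(8.0, 5.0, others))  # underbidding: loses the item -> 0.0
print(payoff(8.0, 9.0, others))  # overbidding: same price here -> 2.0
```

Underbidding can only cost you a profitable win, and overbidding can only saddle you with a price above your value, so stating your true maximum willingness-to-pay is the safe strategy.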

One of the drawbacks with the experimental market approach is that it is often limited to a particular geographic region.  You've got to recruit people and get them in a room (or as people like John List and others have done, go to a real-world market already in existence and bend it to your research purposes).   

Well, there's now a new option with much wider reach.  Several months ago I was contacted by Anouar El Haji, who is at the Business School at the University of Amsterdam.  He's created a simple online platform he calls Veylinx where researchers can conduct real auctions designed to give participants an incentive to truthfully reveal their maximum willingness-to-pay.  The advantage is that one can reach a large number of people across the US (potentially across the world).  It's a bit like eBay, but with a much simpler environment (which researchers can control) and a clearer incentive to get people to bid their maximum willingness-to-pay.

One of the coolest parts is that you can even sign up to participate in the auctions.  I've done so, and encourage you to do the same.  Hopefully, we'll eventually get some auctions up and running that relate specifically to food and agriculture.  

Food Demand Survey (FooDS) - February 2015

The newest release of the Food Demand Survey (FooDS) is now out.

Compared to last month, we found 8% to 15% jumps in willingness-to-pay (WTP) for both beef products (steak and hamburger) and for deli ham.  There was also a sizable increase (9%) in spending on food away from home relative to last month.  

Following up on all the controversy surrounding last month's question on DNA labeling, we delved into the issue again, but this time in a slightly different way.  First, we asked the question in isolation (on a single page by itself), rather than in a list with other food policy issues (Ben Lillie had argued in a blog post following our last release that our result was at least partially due to the fact that the DNA label issue appeared in a list with other issues).  Second, the question was reworded so that it was clear that the label was meant to indicate the presence or absence of DNA.  The precise wording was, "Do you support or oppose mandatory labels on foods that would indicate the presence or absence of DNA?"  The choice options were support or oppose (the order of which was randomized across respondents).  We found essentially the same result as before: 83.5% of respondents supported DNA labeling (note: sample size is 1,001, sampling error is +/-3%, sample weighted to match the population demographics).
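For readers curious where the +/-3% figure comes from, it matches the standard 95% margin-of-error formula for a sample proportion at n = 1,001.  The sketch below assumes p = 0.5 (the most conservative case), which is the usual convention when reporting a single sampling error for a whole survey.

```python
import math

# 95% margin of error for a sample proportion: z * sqrt(p*(1-p)/n),
# with z = 1.96 and p = 0.5 (the worst case, maximizing p*(1-p)).
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

print(round(100 * margin_of_error(1001), 1))  # ~3.1 percentage points
```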

I also looked at the demographic breakdown of those who answered support vs. oppose.  For those who supported, 43% had a college degree, 49% were female, 46% were Democrats, and 20% were Republicans; for those who opposed, 58% had a college degree, 45% were female, 38% were Democrats, and 28% were Republicans. Education and political party affiliation appear to be partial drivers of support for DNA labeling.

Then, on a following page, we asked a number of true/false questions to gauge people's knowledge about DNA, genetics, etc.

Most respondents, 64.6%, correctly knew it was false that "ordinary tomatoes do not contain genes while genetically modified tomatoes do."  However, a remarkably high number of respondents, 52%, incorrectly said it was false that "all vegetables contain DNA", and only 58.6% correctly said it was true that "yeast for brewing beer contains living organisms."