
Look at Me, I'm Buying Organic

That’s the title of a new paper I co-authored with Seon-Woong Kim and Wade Brorsen, which was just published in the Journal of Agricultural and Resource Economics.

We know consumers have a number of motivations for buying organic food - from perceptions about health, taste, safety, and the environment to perceptions about impacts on smaller farmers. Whether these perceptions are accurate is debatable. In this paper, we were interested in an altogether different motivation: the extent to which consumers feel social pressure to buy organic.

In our study, people made simulated choices among apples and among milk cartons, where one of the product characteristics was the presence or absence of the organic label. Participants were assigned to one of four groups:

1) The control (CTRL): no manipulation.

2) The eye (EYE) treatment. This is going to sound crazy, but following some previous research, we showed an image of a person’s eyes on the screen as people were making their apple and milk choices. Prior research suggests that exposure to an image of eyes creates the aura of being watched, which increases reputational concerns and cooperative behavior.

[Image: the pair of eyes displayed on screen in the EYE treatment]

3) The name (NAME) treatment. Just prior to the apple/milk choices, people in this group were asked to type in their first name, and we asked them to confirm whether they lived in the location associated with their IP address. The idea was to remove the perception of anonymity and increase social pressure.

4) The friend (FRND) treatment. Here we used a vignette approach. Just prior to the apple/milk choices, people were told, “Now, imagine you are in the specific situation described below. Your good friends have family visiting. They’ve asked you to help out by taking their sibling, whom you’ve never met, to the grocery store. While you’re there with your friend’s sibling, you also need to do some shopping for yourself.”

If social pressure is a driver of organic food purchases, willingness-to-pay for the organic label would be expected to be higher in the EYE, NAME, and FRND treatments relative to the control.

Here are some summary statistics showing, for each treatment group, the percent of choices in which a product with the organic label was chosen.

[Table: percent of choices in which the organic-labeled product was chosen, by treatment group]

We see some support for the idea that organic is more likely to be chosen in the social pressure treatments (EYE, NAME, FRND) than in the control. The effects are statistically significant, but they aren't huge for every treatment considered. However, the above table doesn't account for price differences. When we convert the choices into a measure of willingness-to-pay, we find the biggest effect is for the vignette (FRND). For this treatment, we find willingness-to-pay for organic is about 88% higher for apples and 82% higher for milk than in the control.
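The post doesn't show the underlying model, but in a standard choice-experiment analysis, willingness-to-pay for a label is recovered as a ratio of estimated utility coefficients. Here is a minimal sketch, assuming the usual conditional logit with linear-in-price utility (the paper's actual specification may differ):

```latex
% Sketch only: a standard conditional logit specification,
% not necessarily the paper's exact model.
U_{ij} = \beta_{p}\,\mathrm{price}_{ij} + \beta_{o}\,\mathrm{organic}_{ij} + \varepsilon_{ij},
\qquad
\mathrm{WTP}_{\mathrm{organic}} = -\frac{\beta_{o}}{\beta_{p}}
```

Estimating the coefficients separately by treatment group and comparing the resulting ratios is what yields statements like the 88% and 82% premiums above.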

For all groups, we found that education levels moderate the relationship between the social pressure treatment variables and willingness-to-pay for organic. In particular, social pressure is higher for more highly educated consumers. The effect was particularly large in the EYE treatment, where more highly educated consumers valued the organic label between 150% and 200% more than less educated consumers when exposed to eyes.

[Figure: willingness-to-pay for the organic label by education level across treatments]

These results provide evidence that at least a portion of organic consumption is likely driven by a form of conspicuous consumption. Some might call it a form of conspicuous conservation, but that’s a whole other can of worms.

Hierarchy, Disagreement, and Food Politics

Discussions about food are frequently divisive.  Low-carb or low-fat?  Organic or conventional?  Local or exotic?  Is our food system fantastic or broken?

Now, look out into the future to the year 2050.  Do you think our future food conversations will be more or less divisive than they are today?  As much as I hope the opposite, I suspect that we're likely to have more disagreement, not less, as we go forward.  

Here's my theory.  You've no doubt heard of Maslow's Hierarchy of Needs, which characterizes stages of human growth.  The basic idea is that one has to satisfy more basic needs (e.g., food and shelter for survival) before moving on to worry about other "higher" needs, like social belonging.  Others have posited a similar phenomenon in the domain of food.  For example, see Ellyn Satter in this 2007 academic article where she lays out a hierarchy of food needs.

Below, I've constructed my own version of Satter's food need hierarchy.  At the bottom, when people are highly income and resource constrained, they ask questions like, "How do I get enough calories to eat?"  Once that question is answered, they can then worry about other things like, "Is this food safe?"  As a person (or country) develops and gains more income, they move from food being consumed primarily to survive to food consumption eventually serving as a form of self-expression and actualization.

[Figure: a hierarchy of food needs, adapted from Satter (2007)]

So, here's my twist on this.  When a community or country is largely at the bottom of the pyramid, there is likely to be broad agreement about "society's" objective in the food and farm realm: produce enough food to eat.  However, at the top of this pyramid, there is no reason to expect "society" to agree on the primary objective.  Satter called the top of this pyramid "instrumental food" and said such foods were consumed to "achieve a desired physical, cognitive, or spiritual outcome."  If we're talking about food satisfying a particular view of what I think of myself (I eat what I am) or food satisfying a "spiritual outcome," why would we expect you and me to agree on what is "best"?  In this sense, we might expect food consumption to become more politicized.  

Satter also says of such food consumption, "These instrumental reasons may or may not be rational or supported by scientific inquiry."  No kidding!  That's precisely the world in which we now live.  A couple of years ago, for example, the Pew Research Center found that the widest gap between the general public and scientists was on the topic of the safety of GMOs.  Clearly, something other than peer-reviewed science is driving many people's food beliefs and consumption patterns.  

Another challenge is that psychology research shows we have a tendency to think others are more like us than they actually are.  For those of us who have had the opportunity to "move up" the pyramid, we might forget the more foundational challenges many food consumers face.  This might be one of the causes of the food paternalism I've written about on a number of occasions - the view that others should be eating more like me.  This quote from a psychology paper on "egocentric empathy gaps" is particularly apt:

A traditional Irish proverb, for example, states that ‘the full person does not understand the needs of the hungry.’ Most people in affluent societies may have little appreciation of the desperation of true starvation, and may consequently work less to alleviate it than if they understood how hunger really felt.

It's not just that we might "work less" but that we might work to solve the problem in ways that suit our own particular desires rather than those we aim to help.  

So, is my little theory correct?  That greater affluence will lead to greater disagreement about which food and food systems are ideal?  As we often say in academic papers when we don't know the answer: "that question is left to future research."  

When Consumers Don't Want to Know

Since I first started working on the topic of animal welfare, I've had the sense that some (perhaps many?) consumers don't want to know how farm animals are raised.  While that observation probably rings intuitively true for many readers, for an economist it sounds strange.  Whether we're talking about GMO labeling, nutritional labels, country of origin labels on meat, or labels on cage free eggs, economists typically assume more information can't make a person worse off.  Either the consumer uses the information to make a better choice or they ignore it altogether.    

There is a stream of literature in economics and psychology that is beginning to challenge the idea that "more information is better."  One simple explanation for the phenomenon could be that consumers who know for sure they will continue to consume the same amount of a good could be better off ignoring information, because the information could only lower their satisfaction (perhaps because they'll feel guilty) about doing something they've already committed to doing.  In this paper by Linda Thunstrom and co-authors, 58% of consumers making a meal choice chose to ignore free information on caloric content, a finding that Thunstrom calls "strategic self-ignorance" arising from guilt avoidance. 

Another possible explanation that I've previously published on is that, when people have limited attention, more information on topic A might distract people from topic B, even though topic B ultimately has a larger impact on the consumer's well-being.  

It may also be the case that people want to believe certain things.  They derive satisfaction from holding onto certain beliefs and will avoid information that challenges them.  These ideas and more are discussed by Russell Golman, David Hagmann and George Loewenstein in a nice review paper on what they call "information avoidance" for the Journal of Economic Literature.

A graduate student in our department, Eryn Bell, has been working with Bailey Norwood to apply some of these concepts to the topic of animal welfare.  They conducted a survey of 1,000 Oklahomans and asked them one of the two simple questions shown below.  Depending on how the question was asked, from 24% to 44% of respondents self-declared that they would rather NOT know how hogs are raised.  The primary reasons given for this response were that farmers are trusted (a belief consumers may prefer to hold), that there are more important issues to worry about (limited attention), and guilt aversion. 

In the same survey, Bell and Norwood also included a set of questions based on some ideas I suggested.  The questions gave respondents the option to see a picture of how sows are raised or to simply see a blank screen for a certain period of time.  People were divided into three groups that varied in how long they had to view the blank screen.  The idea was that we could use the waiting time as a "cost," which would allow us to ask: how long are people willing to wait to NOT receive free information?  As it turns out, people weren't very sensitive to the waiting time.  Nonetheless, regardless of the waiting time, about a third of respondents preferred to see an uninformative blank screen as opposed to a more informative screenshot of sow housing.  These findings suggest at least some people, at least some of the time, would prefer not to know.  

When behavioral biases meet the market

Have you ever gone shopping, only to be overwhelmed by the number of options available to choose from?  You're not alone.  In fact, psychologists have created a name for the phenomenon: the "excessive choice effect."  In one of the more famous studies on the topic, aptly titled "When choice is demotivating," the authors found that when consumers were offered the opportunity to buy an exotic jam, 30% bought when only 6 varieties were presented.  However, only 3% of consumers bought when 24 varieties were presented.  On the face of it, this seems to violate basic economic logic: when there are more varieties available, there is a greater likelihood of finding one you like, and thus there should be a higher likelihood of purchase.  

These sorts of findings have led to popular books (like this one titled The Paradox of Choice) and some bold claims that we'd all be happier and our society would have less depression if we (or rather, the government) restricted our choice and freedom.  

Well, as it turns out, subsequent studies found that the "excessive choice effect" doesn't always exist, and the phenomenon is much more nuanced than first suggested.  

Now, enter one of my Ph.D. students, Trey Malone (who is on his way to an assistant professor position at Michigan State University).  Our co-authored paper on this topic was just released by the Journal of Behavioral and Experimental Economics.  Trey's insight was this: if the "excessive choice effect" (or ECE) exists, surely companies will want to do something about it.  It's bad business to present consumers with so many options that they don't make a purchase.  Yet, in many markets (and in particular in the market for craft beer, which was the focus of our study), there is an apparent explosion in the variety of choices.  What's going on?  

From the paper:

In a competitive market, the choice architecture is endogenous, and sellers compete to provide environments that consumers find appealing, thereby increasing profits. In such cases, the market, at least partially, provides incentives to ameliorate the ECE by, for example, reducing search costs for consumers (e.g., see Kamenica, 2008; Kuksov and Villas-Boas, 2010; Norwood, 2006). This raises the possibility that ECE may arise in laboratory contexts or one-shot field experiments while at the same time having limited relevance in day-to-day business decisions. Whereas prior research mainly focuses on the identification of an ECE, we show that sellers have access to market-specific mechanisms (or informational nudges) that narrow its influence. We demonstrate that if the ECE exists, sellers can mitigate or exacerbate its negative effects through targeted interventions.

The interventions (or private nudges) that we considered involved beer sellers providing consumers more information about the varieties, either through a "special" or through the provision of Beer Advocate scores.  

Trey worked with a local wine bar in town to run field experiments. Unbeknownst to the patrons, we strategically varied the number of options on the beer menu over time.  The menu presented either 6 or 12 options (note that the menu of 12 included all 6 of the varieties on the smaller menu).  And, as previously indicated, we also varied the information about the beers: sometimes there was no extra information (the control), and other times we tried to reduce search costs by labeling one of the options a "special" or by providing Beer Advocate scores for each option (these are akin to a quality rating by a reliable third party).  

The results are summed up in the following graph:

Thus, we found that the excessive choice effect was alive and well in a real-life purchase setting (people were more likely NOT to buy a beer when there were 12 options as compared to 6), but only when no extra information was provided.  The effect reversed itself when the menu included Beer Advocate scores. These results show how the excessive choice effect might be turned on and off by companies manipulating search costs.  
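To make the empirical logic concrete, here is a hypothetical sketch in Python of how one might test for the ECE in a 2 x 3 design like this one. The simulated data and variable names are mine, not the paper's; the key idea is that the ECE appears as a negative menu-size effect that the information treatments can offset.

```python
# Hypothetical sketch: a logit of the purchase decision on menu size
# interacted with the information condition. Simulated data only; not
# the paper's actual data, variable names, or code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600
menu12 = rng.integers(0, 2, n)                        # 1 = 12-option menu, 0 = 6-option
condition = rng.choice(["none", "special", "scores"], n)

# Simulate the pattern described in the post: the larger menu depresses
# purchases only when no extra information is provided.
logit_p = 0.8 - 1.0 * menu12 * (condition == "none") + 0.6 * menu12 * (condition == "scores")
bought = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"bought": bought, "menu12": menu12, "condition": condition})
model = smf.logit("bought ~ menu12 * C(condition, Treatment('none'))", data=df).fit()
print(model.summary())  # ECE: negative menu12 effect in the control; interactions offset it
```

With the real field data in place of the simulation, the interaction coefficients tell you whether the "special" and score treatments switch the effect off, as the graph suggests.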

One of the main lessons here is that it would be a mistake to take a finding of a supposed "behavioral bias" (like the excessive choice effect) in a laboratory experiment and use it to make grand claims for large government interventions without also considering how consumers and businesses themselves might react to those very same biases in the course of everyday life.  

Do you plan to spend more or less eating out in the next two weeks?

The title of this post is based on a question I ask of food consumers every month in my Food Demand Survey (FooDS).  If I had to guess your response, I'd go with "spend less."  Why?  Because every month, for almost four years, that has been the average response to the question (the exact question is: "Do you expect to spend more or less on food away from home in the next two weeks as compared to the previous two weeks?" and the response categories are: "I plan to spend about . . . 10% less, 5% less, the same, 5% more, or 10% more").   

[Figure: average anticipated change in spending on food away from home, by month]
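For reference, the plotted monthly average can be computed by scoring the five response categories symmetrically around zero. A minimal sketch; the numeric scores and the example shares are my assumptions, not the survey's published coding:

```python
# Hypothetical scoring of the five response categories; the survey's
# actual coding may differ.
scores = {"10% less": -10, "5% less": -5, "the same": 0, "5% more": 5, "10% more": 10}

# Example only: shares of respondents choosing each category in one month.
shares = {"10% less": 0.10, "5% less": 0.25, "the same": 0.45, "5% more": 0.15, "10% more": 0.05}

avg_planned_change = sum(scores[k] * shares[k] for k in scores)
print(f"Average planned change: {avg_planned_change:+.2f}%")  # -> -1.00%
```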

Here is the problem with the above results.  They are almost certainly false.  If people continually, month after month, said they planned to spend less on food away from home, and actually followed through, the cumulative effect would drive spending on food away from home down toward zero.
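A quick back-of-the-envelope calculation makes the point (the 1% monthly cut is purely illustrative): even a small planned reduction, if carried out every month, compounds into a dramatic decline.

```python
# Illustrative only: compound a planned 1% monthly cut over four years.
spending = 50.0           # assumed starting weekly spending, $
for month in range(48):
    spending *= 1 - 0.01  # follow through on the 1% planned reduction
print(f"Weekly spending after 4 years: ${spending:.2f}")  # -> $30.86
```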

Moreover, another question I ask on the survey relates to how much the respondent says they spend (in dollars) on food away from home (exact question wording: "What has been your (or your household's) usual WEEKLY expense for meals or snacks from restaurants, fast food places, cafeterias, carryout or other such places?"  The response categories are: less than $20, $20-$39, . . . $140-$159, $160 or more).  

In the most recent issue of FooDS, we estimate the average level of spending on food away from home in January 2017 was $53.26/week.  The average answer from the previous month (December 2016) was $50.89/week.  So, in terms of stated expenditure, there was a $53.26-$50.89=$2.37 increase (or a (2.37/50.89)*100=4.66% increase). Yet (and here is the problem), in December 2016, people said they planned to reduce spending on food away from home by, on average, 0.59%, and in January 2017, they said they planned to reduce spending on food away from home by, on average, 1.47%.
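To verify the comparison, here is the arithmetic spelled out (a trivial check using the numbers quoted above):

```python
# The figures quoted above: stated weekly spending in Dec 2016 and Jan 2017.
dec_spending, jan_spending = 50.89, 53.26
actual_change = (jan_spending - dec_spending) / dec_spending * 100

planned_change_dec = -0.59  # average planned change stated in Dec 2016, %
print(f"Actual: {actual_change:+.2f}% vs planned: {planned_change_dec:+.2f}%")
# -> Actual: +4.66% vs planned: -0.59%
```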

Here is what I get if I calculate "actual" changes in reported levels of spending on food away from home against people's stated plans to increase or decrease spending (the blue bars are the same as in the above graph; they just look different because the vertical axis has been re-scaled).       

So, what is going on here?  One possible answer is that consumers suffer from a type of self-control problem.  We tell ourselves we want to reduce the amount we're spending on food away from home in the future.  But, when the future arrives, we forget our plans and have fun eating out with our friends and keep spending as usual.  If this is correct, eating out is a sort of "guilty pleasure" - something we enjoy but wish we could force our future selves to cut back on.     

The propensity of an individual to say they plan to reduce spending on food away from home relates to a variety of demographic variables (even after controlling for the month-to-month effects that may be driving changing spending patterns).  Income is a major determinant.  Lower income people are much more likely to say they plan to reduce spending on food away from home than higher income respondents.  Indeed, for the highest income households, there is no consistent upward or downward bias in planned spending patterns for food away from home.  Other (smaller) determinants include gender, age, and participation in food assistance programs, with women, older respondents, and SNAP participants being more likely to say they plan to reduce spending on food away from home.

A less nefarious explanation for the above phenomenon might be that our survey is conducted in the middle of the month.  If people are paid at the beginning of the month (or at the end of the previous month), then there might be less remaining in the food budget for "splurges" like food away from home by the time the middle of the month arrives, and they rationally plan to spend less in the following two weeks.  

I doubt this is true for two reasons.  The first is that results from other surveys back up the "self control" explanation.  For example, this article in the Wall Street Journal a couple years ago pointed to a survey of higher income consumers that asked what kept them from saving more money each month.  The most common answer, given by 68% of respondents, was "dining out."  The second reason is that we observe no such phenomenon in our survey for stated changes in spending on food AT home.  Here is the average response each month for how consumers expect to change spending on food at home.  As can be seen, the value goes up and down and is neither consistently negative nor positive.     

If you have other explanations for why people consistently say they plan to spend less eating out next month, I'd love to hear them.