Blog

What is "Natural"?

I recently completed a survey of over 1,200 U.S. consumers to find out exactly what they think “natural” means when evaluating different foods. The full report is available here and topline results for all questions asked are here (the survey also covered consumers’ perceptions of “healthy” claims, which I’ll blog on later).

Here is the motivation for the study:

While food companies are allowed to use a “natural” label or claim, the Food and Drug Administration (FDA) has refrained from defining the term. One consequence has been a large number of lawsuits in recent years in which plaintiffs claim to have been harmed by misleading information about the contents or ingredients of food products carrying a natural label (Creswell, 2018). In 2015, the FDA requested public comment on the use of the term natural in food labeling, signaling a potential move to define the term. Such events suggest the need for more information about how food consumers perceive and define the term natural.

One of the initial queries was an open-ended question which asked, “What does it mean to you for a food to be called ‘natural’?” Here is a word cloud constructed from the responses.

wordcloud_natural.jpg

Words like artificial, additive, chemical, and organic were most commonly mentioned. More than 10% of respondents specifically mentioned the word artificial. A non-trivial share of respondents suggested the word was meaningless, marketing hype, or that they did not know what the word meant.

Respondents were also provided a list of possible words/definitions and asked which best fit their definition of natural. No preservatives and no antibiotics/hormones topped the list.

natural1.jpg

Although respondents associated preservatives with a lack of naturalness, responses to questions about specific preservatives were more nuanced. Preservation by canning and preservation with sugar/salt/vinegar were perceived by more people as natural than not natural, whereas preservation with benzoates/nitrites/sulfites was not.

To home in on which processes/foods people consider natural vs. not natural, respondents were shown the following figure. They were asked, “Which of the following foods or processes do you consider to be natural? (click up to 5 items on the image that you believe are natural).” The question was then repeated with “natural” replaced by “NOT natural.”

natural2.jpg

You can find some colorful heat-maps of the resulting clicks in the full report. Here, I’ll just note that about half of respondents (47.1%) clicked on the image of the raw commodities as being natural. The next most commonly clicked areas, chosen by between 20% and 30% of respondents, were grits/oatmeal, wash/clean, and wash/grind/slice. Even after being shown the processes involved, 19.8% clicked vegetable oil as natural and 13.3% clicked flour as natural. By contrast, “Bleach” was most frequently clicked (by 33.8% of respondents) as not natural, followed by “Crystalize” and then alcohol, syrup, and sugar.

A curious result is that, in many cases, final foods are considered more natural than the processes that make them. For example, more people clicked alcohol as natural than clicked fermentation as natural. Vegetable oil was perceived as more natural than pressing or bleaching, both processes used to create this final product. Similarly, sugar is perceived as more natural than crystallization, though the latter is necessary to produce the former. These findings suggest that a final product can be considered natural even if a process used to make it is not.

I also asked questions about crop production processes and perceptions of naturalness.

natural3.jpg

About 80% more respondents said organically grown crops were natural than said such crops were not natural. Crops grown indoors and crops grown hydroponically were, on net, seen as more natural than not. All other crop production practices were rated as not natural by more respondents than rated them natural. Thus, the results suggest consumers are skeptical of the naturalness of most modern crop production practices. Curiously, this is true even for the use of hybrid seeds. Crops produced with biotechnology were much more likely to be considered not natural than natural. Consumers perceived organic as natural, but not the pesticides used in organic agriculture or the methods (e.g., mutagenesis) used to create many organic seeds. Again, these findings suggest that it is possible for a final product to be considered natural even if a process used to make the product is not; in this case, the finding likely results from a lack of knowledge about organic production practices.

On the topic of misperceptions, just because a federal definition of natural exists does not mean consumers know or understand the definition. The USDA currently defines “natural” for meat products, and it is primarily defined as “minimally processed.” However, only about a quarter of respondents in this survey (26.6%) correctly picked this definition when asked how the USDA defines the term. More than 30% of respondents incorrectly believed the USDA definition of natural implies “no hormones,” and 23.8% thought a natural label implies “no antibiotics.” These data suggest more than half of respondents are misled by the USDA definition of natural, a result supported by other recent academic research.

There is a lot more in the detailed report, including more information on question wording and methods of analysis. For example, analysis of correlations between responses (via factor analysis) suggests “natural” is not a single monolithic construct in consumers’ minds but rather is multidimensional. A food or process can be considered natural on one dimension but not another, as shown in the following figure.

natural4.jpg
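The multidimensionality claim above rests on factor analysis of the correlations among responses. As a rough illustration of the method only (the data, item names, and latent dimensions below are all made up, not from the survey), here is a sketch using scikit-learn:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 500  # hypothetical respondents

# Simulate two latent "naturalness" dimensions (e.g., ingredients vs. process)
ingredients = rng.normal(size=n)
process = rng.normal(size=n)

# Six hypothetical survey items, each loading mainly on one latent factor
items = np.column_stack([
    ingredients + 0.3 * rng.normal(size=n),  # "no additives"
    ingredients + 0.3 * rng.normal(size=n),  # "no preservatives"
    ingredients + 0.3 * rng.normal(size=n),  # "no artificial colors"
    process + 0.3 * rng.normal(size=n),      # "minimally processed"
    process + 0.3 * rng.normal(size=n),      # "not bleached"
    process + 0.3 * rng.normal(size=n),      # "no crystallization"
])

fa = FactorAnalysis(n_components=2).fit(items)
loadings = fa.components_  # shape (2, 6): how each item loads on each factor
print(loadings.round(2))
```

If responses really cluster into distinct dimensions, items that tap the same dimension load heavily on the same factor, which is what "multidimensional" means operationally here.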

Thanks to the Corn Refiners Association, which funded this survey. They gave me free rein to ask the questions and analyze the data as I wanted. You can see their interpretation of the results and their policy recommendations here.


Crop Yields and Taste

That modern agriculture is incredibly productive - much more so than in the past - is undeniable. These USDA data, for example, suggest we produce about 170% more agricultural output now than in the late 1940s. I have argued that these increases in agricultural productivity are signals of improved sustainability. Some people believe the productivity improvements have been accompanied by offsetting externalities or degradations in animal welfare. A different kind of critique is that modern crops - despite being more productive - aren’t as high “quality.” For example, this piece in Politico by Helena Bottemiller Evich, titled “The great nutrient collapse,” discusses evidence that the vitamin content of crops has fallen as yields have increased, and there is the often-heard complaint that tomatoes don’t taste as good as they once did.

There is some biological basis for these latter concerns. If a crop breeder selects plants for higher yields, they are selecting plants that direct their energy and nutrients into producing bigger seeds and fruits - energy that, in lower yielding plants, could have gone to growing leaves, roots, or other compounds that affect taste and vitamin content.

I had these thoughts in the back of my mind when I came across the Midwest Vegetable Trial Report put out by researchers at Purdue and other Midwestern universities. The report compares different vegetable varieties in terms of yield and other output characteristics. I noticed that for a couple of vegetables - green beans and sweet corn - there were also measures of taste for each variety. Granted, these were not full-on scientific sensory evaluations and they involved small numbers of tasters, but I still thought it would be useful to test the conjecture that higher yielding varieties taste worse.

Researchers from the University of Kentucky put together the green bean report. They compared the performance of 19 different varieties of green beans. The most productive variety (named “Furano”) yielded 785 bu/acre over six harvests, whereas the lowest yielding variety, “Slenderette,” produced only 233 bu/acre over six harvests. As the image below reveals, however, there was only a weak correlation between taste and yield. The correlation was negative (-0.26), but not particularly large; about 6.6% of the variation in yield is explained by taste. The best tasting variety, “Opportune,” had a taste score of 4.1 (on a 1=poor to 5=excellent scale) and a yield of 557; the worst tasting variety, “Bronco,” had an average taste score of 2.3 and a yield of 543. So, the best tasting bean actually had a higher yield than the worst tasting bean. Overall, the results below provide only weak support for a yield-taste trade-off.

greenbean.JPG

The report also provided production and taste data on supersweet corn (this part was authored by Purdue researchers Elizabeth Maynard and Erin Bluhm). They compared 16 different types of bicolored supersweet corn (they also evaluated two varieties of white and two varieties of yellow corn, which I’m ignoring here). They had tasters rate “flavor” on a 1 to 5 scale. As the figure below shows, there is actually a positive correlation between flavor and yield, as measured in tons/acre. The correlation is only 0.15, however, so the relationship is weak. The authors also report yield in a slightly different way, ears/acre, and by this measure the correlation is slightly negative (-0.09).

cornyieldflavor.JPG

These results don’t necessarily negate the idea that the taste of vegetables has declined over time as higher yielding varieties have been adopted, but they do suggest that in 2017, among the particular varieties tested and among the few tasters asked, there is only a very weak correlation between taste and yield for green beans and supersweet corn.
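For readers who want to replicate the correlation and R² figures from the trial report tables, the calculation is straightforward. A minimal sketch (the taste/yield pairs below are made up for illustration; the real values come from the report):

```python
import numpy as np

# Hypothetical (taste score, yield) pairs for illustration only
taste = np.array([4.1, 2.3, 3.0, 3.5, 2.8, 4.0])    # 1=poor to 5=excellent
yield_bu = np.array([557, 543, 700, 480, 620, 510])  # bu/acre

# Pearson correlation between taste and yield
r = np.corrcoef(taste, yield_bu)[0, 1]

# Squaring the correlation gives the share of yield variation
# "explained" by taste (e.g., r = -0.26 implies roughly 7%)
r_squared = r ** 2
print(f"correlation = {r:.2f}, R^2 = {r_squared:.1%}")
```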

Arbitraging the Market for Food Fears

A couple weeks ago, the best selling author Michael Lewis was on campus, and I went to listen to him talk. I’ve read several of Lewis’ books, and it was interesting to hear him talk about some of the underlying themes that united them.

In his 2016 book, The Undoing Project, Lewis recounts the history of Kahneman and Tversky and the development of behavioral economics, a field that posits people do not always make rational decisions. In an earlier book, Moneyball (published in 2003), Lewis described how a few stat/econ types realized baseball teams were leaving money on the table by ignoring data on what really drives team wins. One general manager, Billy Beane, attempted to arbitrage the market for players by buying “undervalued” players and putting them to higher-valued use. In another earlier book, The Big Short (published in 2010), Lewis profiles the people who made big bucks on the financial crisis by recognizing that markets were “mispricing” the risks of systemic mortgage failures. In some ways the books are out of order, because Lewis’s earlier books described how various people made serious money from the sorts of behavioral biases that Kahneman, Tversky, and others uncovered.

What’s this got to do with food?

Many of the systematic biases that lead people to mis-price baseball players and mortgage-backed securities are likely leading people to mis-price foods made with new technologies. Take GMOs. A Pew study found 88% of scientists but only 37% of the public thought GMOs are safe to eat. Is it possible scientists are wrong and the public is right? Sure, but if you had to place a bet, where would you put your money?

Or, let’s take a widely studied behavioral bias - the tendency for people to exaggerate the importance of low-probability risks. The propensity to overweight low-probability events was one of the cornerstones of prospect theory, which was introduced by Kahneman and Tversky. The theory is sometimes credited as heralding the birth of modern-day behavioral economics, and the paper was a key contributor to Kahneman later winning a Nobel Prize. If there is a 1% chance of an outcome occurring, people will often “irrationally” treat it as a 5% or 10% chance when making decisions. There are many, many studies demonstrating this phenomenon.
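This overweighting is usually formalized with a probability weighting function. As a sketch, here is the one-parameter form from Tversky and Kahneman's 1992 cumulative prospect theory paper, using their estimated curvature parameter of about 0.61 for gains:

```python
def weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function.

    Maps an objective probability p to the decision weight people
    tend to act on; gamma < 1 overweights small probabilities.
    """
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# A 1% objective chance gets a decision weight of roughly 5-6%,
# consistent with the 1%-feels-like-5% pattern described above.
print(round(weight(0.01), 3))  # ~0.055
```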

Oddly, I have never seen a behavioral economist use these insights to argue that fears over growth hormones, GMOs, pesticides, preservatives, etc. are overblown. However, there are many food and agricultural scientists who argue that many of our food fears are, in fact, irrational in the sense that public perceptions of risk exceed the scientific consensus.

Now, getting back to Michael Lewis’s books on the people who figured out how to profit from behavioral biases in fields as divergent as baseball and mortgage-backed securities: if we really think people are irrationally afraid of new food technologies, is it possible to put our money where our mouths are - to buy fears low and sell them high?

Here are a few half-baked thoughts:

  • If people are worried about the safety of food ingredients and technologies, shouldn’t they be willing to buy insurance to protect against the perceived harms? And if consumers are overly worried, they should be willing to pay more for insurance than it actually costs to protect against such harms. If we believe this is the case, then creating insurance markets for highly unlikely outcomes should be a money-making opportunity. On the plus side, such markets might also take some of the fear out of buying foods with such technologies since people can hedge their perceived risks.

  • Let’s say you’re Monsanto (now Bayer), Syngenta, BASF, or another seed/chemical company. What can you do to assuage consumers’ fears of your technologies, particularly if you believe the perceived risks are exaggerated? Why not offer up a widely publicized bond that will be held in trust in case some adverse event happens within a certain period of time? (This is like when contractors or other service suppliers attempt to gain trust by being bonded.) If it is really true that consumers’ fears are exaggerated, the bond won’t be paid out (at least not in full) and will revert back to the company.

  • Did you know that it is possible to invest in lawsuits? Investors, whose money is used to front the legal bills, earn a portion of the payout if a plaintiff wins a settlement against a corporation or other entity responsible for some harm. The “price” of such investments is likely to rise with the public’s perceived odds of winning the case, which presumably relate to perceptions of the underlying risks. I can imagine institutions or markets arising that would enable investors to short such investments - to make money if the plaintiff loses the case. The recent Monsanto-glyphosate verdict notwithstanding, shouldn’t it be the case that one could profitably short lawsuits surrounding the safety of food and farm technologies if the fears around them are indeed overblown?
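The arbitrage logic behind the insurance idea in the first bullet is just the gap between perceived and actuarial risk. A toy expected-value calculation (all numbers hypothetical):

```python
# Hypothetical numbers for illustration only
true_prob = 0.001       # actuarial probability of the feared harm
perceived_prob = 0.05   # consumer's (exaggerated) perceived probability
harm = 10_000           # dollar loss if the harm actually occurs

# What coverage actually costs the insurer, in expectation
fair_premium = true_prob * harm            # $10

# What a fearful, risk-neutral buyer would be willing to pay
willing_to_pay = perceived_prob * harm     # $500

# The seller's expected margin per policy (before loads and admin costs)
expected_profit = willing_to_pay - fair_premium
print(expected_profit)  # 490.0
```

The wider the wedge between perceived and actual risk, the larger the expected margin - which is exactly the sense in which exaggerated fears create a money-making opportunity.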

Other ideas?

Look at Me, I'm Buying Organic

That’s the title of a new paper I co-authored with Seon-Woong Kim and Wade Brorsen, which was just published in the Journal of Agricultural and Resource Economics.

We know consumers have a number of motivations for buying organic food - from perceptions about health, taste, safety, and environment to perceptions about impacts on smaller farmers. Whether these perceptions are accurate is debatable. In this paper, we were interested in an altogether different motivation: the extent to which consumers feel social pressure to buy organic.

In our study, people made simulated choices between apples or milk cartons, where one of the characteristics was the presence or absence of the organic label. People were divided into one of four groups:

1) The control (CTRL): no manipulation.

2) The eye (EYE) treatment. This is going to sound crazy, but following some previous research, we showed an image of a person’s eyes on the screen as people were making their apple and milk choices. Prior research suggests that exposure to an image of eyes creates the aura of being watched, which increases reputational concerns and cooperative behavior.

eyes2.JPG

3) The name (NAME) treatment. Just prior to the apple/milk choices, people in this group were asked to type in their first name, and we asked them to confirm whether they lived in the location associated with their IP address. The idea was to remove the perception of anonymity and increase social pressure.

4) The friend (FRND) treatment. Here we used a vignette approach. Just prior to the apple/milk choices, people were told, “Now, imagine you are in the specific situation described below. Your good friends have family visiting. They’ve asked you to help out by taking their sibling, whom you’ve never met, to the grocery store. While you’re there with your friend’s sibling, you also need to do some shopping for yourself.”

If social pressure is a driver of organic food purchases, willingness-to-pay for the organic label would be expected to be higher in the EYE, NAME, and FRND treatments relative to the control.

Here are some summary statistics showing the percent of choices made by consumers in each treatment group in which a product with the organic label was chosen.

organic_seon.JPG

We see some support for the idea that organic is more likely to be chosen in the social pressure treatments (EYE, NAME, FRND) than in the control. The effects are statistically different, but they aren’t huge for every treatment considered. However, the above table doesn’t consider price differences. When we convert the choices into a measure of willingness-to-pay, we find the biggest effect is for the vignette (FRND). For this treatment, we find willingness-to-pay for organic is about 88% higher for apples and 82% higher for milk than in the control.

For all groups, we found that education levels moderate the relationship between the social pressure treatment variables and willingness-to-pay for organic. In particular, social pressure is higher for more highly educated consumers. The effect was particularly large in the EYE treatment, where more highly educated consumers valued the organic label 150% to 200% more than less educated consumers when exposed to eyes.

organic_seon2.JPG

These results provide evidence that at least a portion of organic consumption is likely driven by a form of conspicuous consumption. Some might call it a form of conspicuous conservation, but that’s a whole other can of worms.

Are Consumers Eating Out Less Frequently?

According to this Grub Street article, the answer is yes.

The average American’s restaurant visits reached a 28-year low this year, falling from an average of 215 a year in 2000 to 186 a year in 2018. Data gathered by the NPD Group shows a particular and precipitous decline since 2008. Today, 82 percent of meals in America are made at home.

And, they speculate on causes in the slow-down on restaurant spending:

Going out to restaurants doesn’t seem like such a good idea when you’re saddled with student debt and contending with wage stagnation. (In fact, many Americans saw their wages decline over the last year.)

Meanwhile, restaurants are becoming increasingly more expensive compared to eating in.

I’m a little skeptical of these data. Why? Well, here are data from the Bureau of Economic Analysis (BEA) on personal consumption expenditures (these are the data that feed into the calculation of GDP) for food at home and away from home.

spending.JPG

In inflation-adjusted terms, all consumer spending is up about 36% since 2001. Spending on food at home rose only about 24% over this time period, but spending on food away from home increased 54%. (I’ve also shown spending on clothing for reference.) Spending on food away from home fell during the Great Recession, but it has significantly rebounded since.

Now, it’s not impossible for both of these statistics to be simultaneously true - one can eat out less frequently but spend more money on each trip, and total expenditures could still rise.
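The arithmetic behind that reconciliation is simple: total spending equals visits times spending per visit, so fewer visits plus higher total spending implies much higher spending per visit. A rough sketch using the figures cited above (the two sources cover slightly different periods, so treat this as illustrative):

```python
# NPD figures: restaurant visits per person per year
visits_2000, visits_2018 = 215, 186
visit_ratio = visits_2018 / visits_2000   # ~0.87, i.e., visits fell ~13%

# BEA figure: real away-from-home food spending up ~54%
total_spending_ratio = 1.54

# total = visits * per_visit  =>  per-visit ratio = total ratio / visit ratio
per_visit_ratio = total_spending_ratio / visit_ratio
print(round(per_visit_ratio, 2))  # ~1.78: real spending per visit up ~78%
```

So both statistics could hold simultaneously only if real spending per restaurant visit rose by something like three-quarters, which is why the quantity indices below are the more direct check.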

But the BEA also reports quantity indices, which provide an estimate of the volume of food sold away from home. Here are those data, which provide little evidence of a slowdown in the amount of food consumed away from home.

quantity_index.JPG

The headline on the Grub Street article asks “Should Restaurants be Worried?” The BEA data suggest the answer is “no.”