Blog

Reducing Food Waste - Where's the Incentive?

In recent years, a great deal of attention has been focused on reducing food waste. While I have argued for more nuance than one often sees in popular exhortations to reduce waste, the issue is important: it would be nice to find ways to save all the resources that go into producing food that ultimately winds up in the garbage.

A number of discussions over the past couple of months have led me to an aspect of this problem that hasn’t received much attention. Namely, what incentive do food producers and manufacturers have to reduce waste? Or, what are the most effective mechanisms to reduce waste?

One way of reducing waste is what we might call the “demand side” strategy: try to convince consumers to consume all of what they buy and throw out less. Our stomachs and pantries are only so large, and as a result, this presumably means consumers would ultimately buy less food. In economic terms, this leads to a downward shift in demand, which results in lower prices and less food sold. For producers, this is certainly a bad outcome: selling less food at lower prices means lower revenues and profits. From the perspective of a food producer, all they care about is whether the product sells; what you do with it after you buy it is of little consequence to the seller. As such, one might wonder how much incentive food producers and sellers have to reduce waste, at least via this demand side strategy. To boot, we don’t know for sure whether consumers are better or worse off: they pay lower prices but also buy less food, and as a result the impact on consumers is ambiguous.

A different way to try to reduce food waste might be called a “supply side” strategy. One challenge with popular conceptions of food waste is that they seem to imply there are large inefficiencies in food supply chains - that farms, food manufacturers, and grocers are losing or throwing out food they could profitably sell. To be sure, there are likely some inefficiencies in the food supply chain, but food and ag tend to be competitive, low-margin businesses, which makes it hard to believe they’re leaving dollar bills lying around that they could easily pick up. To incentivize these firms to reduce waste, loss, and spoilage, something has to change to reduce the cost of preservation. That “something” is likely investment in research and the creation of technologies that enable farms and food manufacturers to affordably make use of food that might otherwise have been unsalable. An old example might be the advent of canning or refrigerated rail cars. More modern examples might include better grain storage bins or storage management practices, vacuum packaging, high pressure pasteurization, etc.

In economic terms, these technologies can be conceptualized as shifting the supply curve downward - i.e., lowering the marginal cost of delivering a given quantity of food to the market. Such a shift would lower the price of food while enabling more food to be sold. Consumers are definitely better off: they get to have more food at lower prices. Whether producers as a group are better off from the supply shift depends on how sensitive producers and consumers are to price changes, but producers who are early adopters of the new technology are almost certainly better off.

Whether the demand-side or supply-side strategy leaves “society” better off (at least as defined by producer profits and consumers’ economic well-being) is not completely predictable because it depends on relative elasticities of supply and demand for the foods in question, among other factors. Ignoring any externalities from food that is thrown out, I would generally expect the “supply side” strategy to be better: we know it makes consumers better off and likely makes producers better off too (though not always). But, it ultimately results in more food being sold and potentially (and perhaps ironically) more consumer waste. So, the big unanswered question is the nature and size of the “externality” of food thrown away.
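The mechanics of the two strategies can be sketched with a toy linear supply-and-demand model. The intercepts and slopes below are purely illustrative (not estimates for any actual food market), but they show the comparative statics described above: a downward demand shift lowers both price and quantity, while a downward supply shift lowers price and raises quantity.

```python
def equilibrium(a, b, c, d):
    """Linear demand P = a - b*Q and linear supply P = c + d*Q.
    Returns (quantity, price, consumer surplus, producer surplus)."""
    q = (a - c) / (b + d)
    p = a - b * q
    cs = 0.5 * b * q ** 2   # triangle under demand curve, above price
    ps = 0.5 * d * q ** 2   # triangle above supply curve, below price
    return q, p, cs, ps

base = equilibrium(a=10, b=1, c=2, d=1)          # baseline market
demand_shift = equilibrium(a=8, b=1, c=2, d=1)   # "demand side": demand shifts down
supply_shift = equilibrium(a=10, b=1, c=1, d=1)  # "supply side": marginal cost falls

print(base)          # (4.0, 6.0, 8.0, 8.0)
print(demand_shift)  # (3.0, 5.0, 4.5, 4.5): lower price, less sold, producers worse off
print(supply_shift)  # (4.5, 5.5, 10.125, 10.125): lower price, more sold, consumers better off
```

In this particular parameterization producers also gain from the supply shift, but as noted above, whether producers as a group gain depends on relative elasticities. Note, too, that the consumer surplus numbers for the demand shift don’t capture any value consumers get from wasting less, which is part of why the welfare effect on consumers is ambiguous.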

Why don't we vote like we shop?

Readers of this blog will know I’ve been interested in the divergence of voting and food shopping behavior for some time (e.g., see here, here, or here). Much of this interest was prompted by the stark example provided by Proposition 2 in California back in 2008, when about 64% of Californians went to the polls and outlawed the production of a good they routinely bought (caged eggs), resulting in a so-called vote-buy gap. I think it was Glynn Tonsor whom I first heard refer to the outcome as an unfunded mandate: voters required producers to adopt costlier production practices for which shoppers had already revealed they were unwilling to provide sufficient compensation in the marketplace.

While we know this phenomenon exists, it isn’t clear why. Along with Glynn Tonsor, Bailey Norwood, and Andrew Paul, we set out to tackle this question by conducting some real-food, real-money experiments. The resulting paper is now forthcoming in the Journal of Behavioral and Experimental Economics.

What did we do? We recruited people to participate in a series of decision-making exercises, where they first made a shopping choice. The shopping choice options were: A) a more expensive cookie made with cage free eggs, B) a less expensive cookie made with conventional eggs, C) a snack without animal products, or D) refrain from buying anything. Then, in a second step, people were placed into groups (of small or large size), where they voted on whether to ban option B (the cookie made from conventional eggs) for the people in their group. Finally, if the vote passed, people re-chose from the constrained set of options. This basic set-up was repeated in several conditions that varied information, group size, and more to test different explanations for the vote-buy gap.

The first result is that we can replicate the vote-buy gap in our experimental setting. We note:

approximately 80% of the individuals who chose snacks with caged eggs when shopping subsequently voted to ban snacks with caged eggs. The finding rules out the suggestion that the vote-buy gap is an illusion or statistical artifact, as it can be re-created in an experimental lab setting at an individual level.

As the above result suggests, we are immediately able to rule out one explanation for the vote-buy gap - something we call the non-buyer hypothesis. The non-buyer hypothesis suggests the vote-buy gap is something of an illusion because, proverbially speaking, apples (voters) are being compared to oranges (shoppers). In the “real world”, many consumers of eggs are voters; however, not all voters are egg consumers. For example, individuals who are vegan do not buy eggs, but they may vote in favor of initiatives similar to Proposition 2. But, in our study we can compare each individual’s shopping choice to their own vote, and as the foregoing quote indicates, 80% of people switched.
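The individual-level comparison that rules out the non-buyer hypothesis amounts to a simple calculation: link each person’s shopping choice to that same person’s vote, then compute the share of conventional-egg buyers who voted for the ban. A minimal sketch (the records below are fabricated for illustration; they are not data from the study):

```python
# Hypothetical individual records: (bought_conventional_egg_snack, voted_for_ban)
records = [
    (True, True), (True, True), (True, True), (True, True), (True, False),
    (False, True), (False, True), (False, False), (False, True), (False, True),
]

# Restrict to people who actually bought the conventional-egg snack,
# so vegans and other non-buyers cannot drive the result.
buyers = [r for r in records if r[0]]

# Share of conventional-egg buyers who nonetheless voted to ban them.
switch_rate = sum(1 for _, voted in buyers if voted) / len(buyers)
print(f"{switch_rate:.0%} of buyers voted to ban what they bought")  # 80% of buyers ...
```

Because the denominator is restricted to buyers, any gap that remains cannot be explained by non-buyers (e.g., vegans) padding the “yes” vote, which is precisely why the within-person design is informative.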

We find a bit of support for the following, which is tested by giving one group more salient information about which options used cage and which used cage free eggs:

Knowledge Hypothesis: The vote-buy gap is caused by the fact that consumers themselves believe, or perceive that other consumers believe, that they are buying cage-free eggs when in fact they are buying cage eggs; better, more salient information about housing practices when shopping will reduce the vote-buy gap.

But, even in the condition where clear, transparent information was given about the types of eggs used to make each cookie, the share of people who voted to ban cage eggs was 15 to 20 percentage points higher than was the market share of purchased snacks made with cage free eggs.

I was personally most excited to test two hypotheses that related to group size (people voted in groups of roughly five or fifty). Here were those hypotheses, both of which suggest greater likelihood of voting “yes” to ban snacks with cage eggs in larger groups as compared to smaller groups.

Public Good Hypothesis: The vote-buy gap is caused by the fact that more animals and people are impacted by a ban than the impact of a single shopping choice for cage-free eggs; as the size of the group affected by a vote increases, individuals are more likely to vote in favor of the initiative if it has a desirable public good component, thereby increasing the vote-buy gap.

Expressive Voter Hypothesis: The vote-buy gap is caused by the fact that in large groups, an individual’s vote is unlikely to be a deciding factor, privileging expressive preferences over instrumental preferences; as a group size increases, and the likelihood that an individual’s vote is consequential and decisive falls, individuals are more likely to vote in favor of the initiative, thereby increasing the vote-buy gap.

Both hypotheses conjecture that the vote-buy gap will be larger in large groups than in small groups. However, our data show the vote-buy gap is actually larger in the small groups than in the large groups, exactly the opposite of what is predicted.

In the end, we can be fairly confident the vote-buy gap is real and replicable, but alas we still don’t have a good answer as to why. Here’s how we leave it in the paper:

A residual hypothesis that remains if all others fail to explain the gap is inconsistent preferences. That is, people may have different preferences when shopping as compared to voting. This sort of preference inconsistency is at the heart of the so-called consumer vs. citizen phenomenon. The thought is that people adopt more public-minded preferences when in the voting booth but rely on more selfish motives when shopping privately. ... While this explanation is perhaps not intellectually satisfying, it is perhaps consistent with one long strain of economic thought, De gustibus non est disputandum, while contradicting another, fixed and stable preferences.

But, make no mistake: just because we don’t know the answer to the “why” doesn’t imply the vote-buy gap is inconsequential. Recall the unfunded mandate? Here’s what happened in our experiment.

many individuals who originally chose to purchase a cookie decided, after the ban, to select “none” in their second selection when only higher-priced cookies were available. Depending on the treatment, anywhere from 21.74% to 43.38% of consumers who purchased cookies prior to the ban no longer did so after the ban. These lost purchases are precisely the worry of egg producers.

What is "Natural"?

I recently completed a survey of over 1,200 U.S. consumers to find out exactly what they think “natural” means when evaluating different foods. The full report is available here and topline results for all questions asked are here (the survey also covered consumers’ perceptions of “healthy” claims, which I’ll blog on later).

Here is the motivation for the study:

While food companies are allowed to use a “natural” label or claim, the Food and Drug Administration (FDA) has refrained from defining the term. One consequence has been a large number of lawsuits in recent years in which plaintiffs claim to suffer harm from being misled about food product contents or ingredients when accompanied with a natural label (Creswell, 2018). In 2015, the FDA requested public comment on the use of the term natural in food labeling, signaling a potential move to define the term. Such events suggest the need for more information about how food consumers perceive and define the term natural.

One of the initial queries was an open-ended question which asked, “What does it mean to you for a food to be called ‘natural’?” Here is a word cloud constructed from the responses.

wordcloud_natural.jpg

Words like artificial, additive, chemical, and organic were most commonly mentioned. More than 10% of respondents specifically mentioned the word artificial. A non-trivial share of respondents suggested the word was meaningless, marketing hype, or that they did not know what the word meant.

Respondents were also provided a list of possible words/definitions and asked which best fit their definition of natural. No preservatives and no antibiotics/hormones topped the list.

natural1.jpg

Despite associating preservatives with lack of naturalness, when asked about specific preservatives, responses are more nuanced. Preservation by canning and with sugar/salt/vinegar were perceived by more people as natural than not-natural, whereas preservation with benzoates/nitrites/sulphites was not.

To home in on which processes/foods people consider natural vs. not natural, they were shown the following figure. Respondents were asked “Which of the following foods or processes do you consider to be natural? (click up to 5 items on the image that you believe are natural).” The question was repeated except “natural” was replaced with “NOT natural.”

natural2.jpg

You can find some colorful heat-maps of the resulting clicks in the full report. Here, I’ll just note that about half of respondents (47.1%) clicked on the image of the raw commodities as being natural. The next most commonly clicked areas, chosen by between 20% and 30% of respondents, were grits/oatmeal, wash/clean, and wash/grind/slice. Even after showing the processes involved, 19.8% clicked vegetable oil as natural and 13.3% clicked flour as natural. By contrast, “Bleach” was most frequently clicked (by 33.8% of respondents) as not natural, followed by “Crystalize”, and then alcohol, syrup, and sugar.

A curious result is that, in many cases, final foods are considered more natural than the processes which make them. For example, more people clicked alcohol as natural than clicked fermentation as natural. Vegetable oil was perceived as more natural than pressing or bleaching, both processes which are used to create this final product. Similarly, sugar is perceived as more natural than crystallization, but of course, the latter is necessary to produce the former. These findings suggest that it is possible for a final product to be considered natural even if a process used to make the product is not.

I also asked questions about crop production processes and perceptions of naturalness.

natural3.jpg

About 80% more respondents said organically grown crops were natural than said such crops were not natural. Crops grown indoors and that are hydroponically grown were, on net, seen as more natural than not. All other crop production practices were rated as not natural by more respondents than were rated as natural. Thus, the results suggest consumers are skeptical of the naturalness of most modern crop production practices. Curiously, this is true for use of hybrid seeds. Crops produced with biotechnology were much more likely to be considered not natural than natural. Consumers perceived organic as natural, but not the pesticides used in organic agriculture or the methods (i.e., mutagenesis) used to create many organic seeds. Again, these findings suggest that it is possible for a final product to be considered natural even if a process used to make the product is not; in this case, the finding is likely to result from a lack of knowledge about organic production practices.

On the topic of misperceptions, just because a federal definition of natural exists does not mean consumers know or understand the definition. The USDA currently defines “natural” for meat products, and it is primarily defined as “minimally processed.” However, only about a quarter of respondents in this survey (26.6%) correctly picked this definition when asked how the USDA defines the term. More than 30% of respondents incorrectly believed the USDA definition of natural implies “no hormones” and 23.8% thought a natural label implies “no antibiotics.” These data suggest more than half of respondents misunderstand the USDA definition of natural, a result supported by other recent academic research.

There is a lot more in the detailed report, including more information on question wording and methods of analysis. For example, analysis of correlations between responses (via factor analysis) suggests “natural” is not a single monolithic construct in consumers’ minds, but rather is multidimensional. A food or process can be considered natural on one dimension but not another, as shown in the following figure.

natural4.jpg

Thanks to the Corn Refiners Association, who funded this survey. They gave me free rein to ask the questions and analyze the data as I wanted. You can see their interpretation of the results and their policy recommendations here.


The Coming Meat Wars

By now, I suspect many of you have seen the report by the EAT-Lancet Commission on Healthy Diets from Sustainable Food Systems, which was released on January 16th.

Among other things, the report recommends a dramatic reduction in consumption of meat and animal products. Here is their recommended plate.

new my plate.JPG

Much has been made on Twitter and other places about the small meat and animal product portions suggested (e.g., 1/4 egg per day), and the fact that more added sugar is suggested than most meat products.

Rather than going line-by-line through the report, I think it’s useful to take a step back and see this report as another front in what seems to be an escalating war on meat and animal food products (recall the debate surrounding the scientific advisory report on dietary guidelines back in 2015? Here were my thoughts then). What I thought I’d do in response is to provide some broader thoughts about some of the debates that have arisen about meat consumption. My purpose isn’t to defend meat and livestock industries, but to help explain the consumption patterns we see, add some important context and nuance to these discussions, and help ensure consumer welfare isn’t unduly harmed. (Full disclosure: over the years, I’ve done various consulting projects for meat and livestock groups such as the Cattlemen’s Beef Board, the Pork Board, and the North American Meat Institute. All of this work was on specific projects or data analysis related to labels or demand projections, and none of these groups support writing such as this, but I mention it here for sake of transparency).

Here are my thoughts.

  • These debates can be contentious because meat, dairy, and egg production is big business and critically important to the economic health of the agricultural sector. For example, these USDA data show in 2017 in the U.S. the value of cattle/calves was about $67 billion, poultry and eggs about $43 billion, dairy about $38 billion, and hogs about $21 billion, for a total of $176 billion at the farm gate. Contrast this with the value of corn ($46.6 billion), vegetables and melons ($19.7 billion), fruits and nuts ($31 billion), or wheat ($8.7 billion). In many ways, livestock/poultry can be seen as “value added” production because these animal products rely on corn, soy, hay, and grass.

  • Given the farm-level statistics, it shouldn’t be surprising to learn that consumers spend a lot on meat, dairy, and eggs. Data from the Bureau of Labor Statistics, Consumer Expenditure Survey suggest that in 2017 consumers spent about $181 billion on animal products eaten at home. This doesn’t count food away from home, which is 43.5% of food spending according to these data (spending on food away from home isn’t segregated into food types as is food at home). Of total spending on food at home, 32% goes toward meat, dairy, and eggs.

  • If anything, data suggest demand for meat (i.e., the amount consumers are willing to pay for a given quantity of meat) has been steady or rising over the past decade. For example, see these demand indices created by Glynn Tonsor. His data also show there has been a steady increase in demand for poultry for the past several decades. At the same time, my FooDS data suggest a slight increase in the share of people who report being vegetarian or vegan over the past five years - going from around 4% in 2013 to around 6% in 2018. So, aggregate demand for animal products is up, although there seems to be increasing polarization on both ends of the spectrum. We also find that meat consumption is increasingly related to political ideology, with conservatives having higher beef demand than liberals.

  • There are important demographic differences in meat consumption, but the results highly depend on which meat cuts we are talking about. For example, lower-income households have higher demand for ground beef and lower demand for steak than higher-income households. Broadly speaking, meat is a “normal good”, which means that consumption increases as incomes rise. This is particularly true in developing countries. One of the first things people in developing countries add to their diet when they get a little more money in their pockets is animal protein.

  • Given the high levels of aggregate meat consumption indicated above, the evidence suggests strong consumer preferences for meat and animal-based products. Taxes on such products will harm consumer welfare, and will be costly if, for no other reason, because of the size of the industry. Stated differently, consumers highly value having animal protein in their diets. This study shows the average U.S. consumer places a higher value on having meat in his or her diet than having any other food group.

  • Calls for taxes are often predicated on the notion that there are externalities from meat, egg, and dairy production that need to be internalized (otherwise, this would amount to little more than “nannying” or paternalism). The externalities on the health care front presumably come from the fact that we have Medicare and Medicaid, which socialize health care costs. As I’ve written about on many occasions (e.g., see this paper), these “externalities” do NOT create economic inefficiencies because they simply represent transfers from the healthy to the sick. Any inefficiencies that arise occur because of moral hazard (i.e., people eating unhealthy diets because they think the government/taxpayers will foot the bill), and the solution to this insurance problem is typically to require deductibles or risk-adjusted insurance pricing, which nobody seems to be proposing as a solution. As for environmental externalities, the key is to ensure prices for inputs such as water or energy, or outputs such as carbon or methane, reflect external costs. In this sense it isn’t the cow or chicken that is the “sin” but the under-priced water or carbon. Here the goal is to adopt broad policies that apply to all sectors (ag and non-ag) and that encourage and allow for innovation to reduce impacts.

  • On climate impacts of animal agriculture, it is important not to confuse global figures of climate impacts with U.S. figures, which tend to be much lower (e.g., see my piece in the WSJ a few years ago on this topic). Why would climate impacts be lower in the U.S.? Because U.S. production tends to be more intensive and productive than elsewhere in the world. I know it sounds counter-intuitive, but more intensive livestock operations (because of the massive productivity gains) can significantly reduce environmental impacts when measured on a per unit of output (e.g., pound of meat or egg) basis.

  • As for carbon impacts, the big culprit here is beef and to a lesser extent (due to the smaller cattle numbers), dairy. Why? Because cattle are ruminants. The great benefit of ruminants is that they can take foodstuffs inedible to humans (e.g., grass, hay, cottonseed) and convert them into products we like to eat (e.g., cheese, steak) (see further discussion on this here). The downside is that ruminants create methane, which is a potent greenhouse gas (GHG). The good news is that the GHG emissions from beef production have significantly fallen over time because of dramatic productivity gains (see this paper), but they’re not zero. It’s also important to note that not all greenhouse gases are created equal, and while methane is a potent greenhouse gas, my understanding is that the impacts from livestock are less persistent in the atmosphere than are other types of greenhouse gas emissions. While we can cut GHG emissions by eating less beef, at least in the U.S. the impacts are fairly small (the EPA puts contributions from livestock at around 3-4% of the total); we can also make strides by continuing to increase livestock productivity.

  • While cattle are more problematic on the GHG front, it is important to note that there are likely tradeoffs (real or perceived) on the animal welfare front in comparison with other species. Most beef cattle live most of their lives outdoors on a diet of grass or hay. Cattle often make use of marginal lands that would be environmentally degrading to bring into row crop production. By contrast, most pork and poultry live the vast majority of their lives indoors on a diet of corn and soy. See my book with Bailey Norwood on the topic of animal welfare.

  • There are some interesting innovations happening in the “lab grown meat” and “plant-based protein” space, which aim to replace protein from animal based sources. I haven’t seen these innovators make many claims about relative health benefits, but they often suggest significant benefits in terms of environmental impacts. I hope they’re ultimately right, but they’ve got a long way to go. Lab-grown meat isn’t a free lunch, and all those cells have to eat something. As I’ve also noted elsewhere, it is curious that these products (plant- or cell-based) are still more expensive than conventional meat products. If these alternative proteins are really saving resources, they should ultimately be much less expensive. Time will tell.

  • Despite the excitement around the alternative protein sources, I don’t think we’ll see an end to cattle production anytime in the near future. Why? Well, there is the aforementioned marginal land issue; many agricultural lands aren’t very productive for activities other than feeding cattle or housing other livestock or poultry. Another issue is that cattle and other livestock are food waste preventing machines. A big example here is distillers grains. What happens to all the “spent” grain that runs through ethanol plants or beer breweries? It’s fed to livestock. The same is true of “ugly fruit”, non-conforming bakery items, and more. Also, without animal agriculture, where will organic agriculture get all its fertilizer, which currently comes from the manure of conventionally raised farm animals?

  • Back to the EAT-Lancet commission, one of the big arguments for reducing meat consumption is health. While there are many studies associating meat consumption with various health problems, the strength of evidence is fairly weak. One big problem is that it’s really tough to do dietary-impact studies well and a lot of the evidence comes from fairly dubious dietary recall studies, but the other issue is that there is generally little attempt to separate correlation from causation. As I’ve written in other contexts, “It’s high time for a credibility revolution in nutrition and epidemiology.”

  • The EAT-Lancet report focuses both on health and sustainability issues. However, as I noted with regard to the 2015 dietary guidelines, which initially aimed to do the same, this conflates science and values. As I wrote then, “Tell us which foods are more nutritious. Tell us which foods are more environmentally friendly. But, don't presume to know how much one values taste vs. nutrition, or environment vs. nutrition, or price vs. environment. And, recognize that we can't have it all. Life is full of trade-offs.”

  • Finally, I’ve heard it suggested that we need new policies and regulations to offset bad farm policies, which have led to overproduction of grains and livestock. This view is widely believed and also widely discredited. For example, see this piece by Tamar Haspel in the Washington Post. In the U.S., beef, pork, broilers, and eggs receive no direct production subsidies. Yes, there are various subsidies for feedstocks like corn and soy, but there are also other policies that push the prices of these commodities up rather than down (why would farmers want policies that would dampen the prices of their outputs?). Large scale CAFOs (confined animal feeding operations) must comply with a host of rules and regulations that raise costs (it should be noted that the government provides some funding, through the Environmental Quality Incentives Program (EQIP), to incentivize certain practices by CAFOs thought to improve environmental outcomes). If the U.S. farm bill were completely eliminated, there would no doubt be some change, but it wouldn’t do much to change the volume of meat, dairy, and eggs produced.

That’s more than enough to chew on for now.

Trends in Farm Land Acreage

I hear a lot of talk about the impacts of federal farm policy on our food system. It is sometimes suggested that farm policy is to blame for “cheap food” and thus obesity (see this nice twitter response by Tamar Haspel) or that many of our purported modern day farm and food ills can be traced back to Earl Butz, who as Secretary of Agriculture in the early 1970’s encouraged producers to plant “fence row to fence row.”

One way to evaluate these sorts of claims is to look at how much (or little) crop acreage in the U.S. has changed over time. Here is data from the USDA, National Agricultural Statistics Service on the amount of land planted to nine major commodity crops over time (note: vegetable acreage, which comprises only about 1% of all acreage, is not included; nor is fruit or nut acreage, which is also a very small share of the total).

The figure below shows the cumulative acreage in the U.S. planted to nine major commodity crops over a 93-year period from 1926 to 2018. Over the entire time period, there was an average of 246 million acres planted to these nine crops each year. Seven out of the 10 highest planting years were prior to 1937, with the remaining three being in 1980, 1981, and 1982.

The coefficient of variation (the standard deviation divided by the mean) is only about 7.5%, implying relatively low variation over time (usually a figure less than 100% would be considered low variation). Since 1990, there have been relatively small year-to-year changes. Over the most recent 28-year period, about 225.7 million acres were planted each year to these nine commodity crops, with a coefficient of variation of only 1.8%. This lower variation in recent years is interesting because farm policy has been much more market-oriented since 1996, and this is precisely the period over which there has been more stability in planted acreage.
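The coefficient of variation calculation is simple to reproduce. A minimal sketch, using made-up acreage numbers rather than the actual USDA series:

```python
import statistics

# Hypothetical total planted acreage (millions of acres) for several years;
# the series discussed in the post runs 1926-2018 and averages 246 million.
acres = [255, 240, 248, 252, 235, 246, 250, 242]

mean = statistics.mean(acres)
cv = statistics.pstdev(acres) / mean  # coefficient of variation = std dev / mean
print(f"mean = {mean:.1f}M acres, CV = {cv:.1%}")
```

A CV this far below 100% says the year-to-year swings are small relative to the typical level, which is the sense in which total planted acreage has been stable.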

Total land devoted to farming (or crop acreage) today is about 12% lower than the highs of the 1930’s and the early 1980’s. This is amazing in many ways given that the U.S. population is now 130% higher than it was in the 1930’s. Stated differently, more than twice as many people are now being fed on fewer crop acres.

cum_acres.JPG

Moving away from total acreage, it is instructive to look at the mix of acreage (see the two following figures). Here, we can see some significant changes in which crops are planted in the U.S. over time. For example, in 1926, there were only 1.9 million soybean acres, but in 2018, for the first year in history, more acreage (89 million acres) was planted to soybeans than any other crop. Prior to that, corn had been king every year except 1981-1983, when more acres were devoted to wheat than corn.

Another big change was a reduction in the number of acres planted to oats. Prior to the 1960’s, more than 40 million acres of oats were routinely planted each year. In 2018, only 2.7 million acres were in oats. Why the change? One big reason is that there aren’t as many mules and horses that need to be fed. Cotton also experienced a precipitous reduction in acreage from the late 1920’s to the early 1960s, stabilizing a bit thereafter.

acres_bycrop.JPG

The following figure shows the same data, but with acreage dedicated to each crop expressed as a percentage of total acreage in a given year.

Taken together, these three figures suggest the big change hasn’t been the total farmland planted but rather the change in which crops are planted to the acres. Moreover, this crop mix issue (the rise of soy and the decline of oats) probably had little to do with farm policy.

acres_composition.JPG

Given all the concerns expressed these days about mono-cropping, it might be interesting to compare the variation in planted acreage (in terms of the mix of crops planted) today with that in the past. To see this, I calculated the coefficient of variation across the number of acres planted to each of the nine crops in each year. This gives a feel for how much crop variation there is in a given year. Here are the results plotted over time.

cropvariation.JPG

The coefficient of variation ranges from about 87% to 138%. Comparing this to the coefficient of variation for total acreage planted (which was 7.5%) implies there is more variation in which crops are planted to which acreages in a given year than there is variation in total planted acreage over time.
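This within-year CV is the same formula applied across crops rather than across years. A sketch with fabricated acreage numbers for two hypothetical years (not the actual data), showing how the statistic is computed year by year:

```python
import statistics

# Hypothetical acreage (millions) planted to several crops in two different years.
year_a = {"corn": 100, "soy": 2, "wheat": 60, "oats": 40, "cotton": 45}
year_b = {"corn": 89, "soy": 89, "wheat": 48, "oats": 3, "cotton": 14}

def crop_mix_cv(acres_by_crop):
    """CV across crops within one year: std dev of acreages / mean acreage."""
    values = list(acres_by_crop.values())
    return statistics.pstdev(values) / statistics.mean(values)

print(f"year A mix CV = {crop_mix_cv(year_a):.0%}")
print(f"year B mix CV = {crop_mix_cv(year_b):.0%}")
```

A higher within-year CV means acreage is spread more unevenly across the crops, so plotting this statistic over time is one way to track changes in the crop mix.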

The figure above shows that the crop-mix variation (at least among these nine crops) has been increasing since the 1960s, and the variation is higher in the past decade than at any point in the preceding 80 years.