Every so often, people seem to get excited about the farmer’s share of the retail dollar – particularly when USDA updates the figures or a news article mentions the issue. A couple of months ago, for example, the National Farmers Union issued a press release decrying the fact that farmers “only” receive 14.8 cents of every dollar consumers spend on food. About the same time, Food Tank put out this tweet.
The widespread implication seems to be that a lower share of the retail dollar is an unambiguous sign that farmers are worse off. But one has very little to do with the other. Let me try to illustrate with an example.
Suppose there are two countries where the farmer’s share of the retail dollar differs dramatically. In Country A, the share is only 10%, and in Country B, the share is 90%. So, when a consumer spends $1 on food, the farmer in Country A receives 10 cents and the farmer in Country B receives 90 cents. On a dollar-spent-on-food basis, it thus looks like a farmer would much rather live in Country B than in Country A. But let’s dig a little deeper.
Suppose the farmers in our two countries actually produce the same value of agricultural output. To make the math easy, let’s say farmers in Country A produce $100 billion worth of ag output and farmers in Country B do the same.
What are consumers in the two countries spending on food? By definition, consumers in Country A are spending $100 billion/0.1 = $1,000 billion, while consumers in Country B are spending only $100 billion/0.9 = $111.11 billion. For a fixed value of ag output, a smaller farmer's share of the retail dollar necessarily implies a larger total food economy. As I'll show in a minute, it matters a lot whether you're selling into a $1 trillion market or a $111 billion market.
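A quick sketch of this arithmetic (the $100 billion farm value and the 10%/90% shares are the hypothetical numbers from the example above):

```python
# Total consumer food spending implied by a fixed value of farm output and
# the farmer's share of the retail dollar: spending = farm_value / share.
farm_value = 100.0  # $ billions of ag output, same in both countries

for share in (0.1, 0.9):
    spending = farm_value / share
    print(f"share {share:.0%}: consumers spend ${spending:,.2f} billion")
# share 10%: consumers spend $1,000.00 billion
# share 90%: consumers spend $111.11 billion
```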
Why might consumers in Country A spend so much more on food than consumers in Country B despite the same value of ag output in both countries? Well, it could be that greedy agribusinesses and retailers have more market power in Country A than in Country B and are siphoning off profits (that seems to be the common layman’s interpretation). But it could also be that consumers in Country A have the preference or ability to pay more for better packaging, increased food safety, better working conditions in food processing, more convenience (they pay the processor or a restaurant to do more of the preparation for them), and so on.
So, what happens if there is a 10% increase in consumer demand for food in both Country A and Country B? This could happen, for example, if the populations of the two countries grow, if the respective food industries run advertisements, or if there are post-farm innovations that increase quality.
Now, let’s construct a very simple economic model (such as the one we use in this paper), where, in both countries, the elasticity of demand is -0.8 and the elasticity of supply is 0.2, and the farm product is supplied to the retail sector in fixed proportions.
In this situation, a 10% increase in consumer demand in Country A (with only a 10% farmer’s share of the retail dollar) will increase farmers' profits by $29 billion. However, in Country B, where farmers “get” a full 90% of the retail dollar, that same 10% increase in consumer demand only increases farmers' profits by $8.8 billion. So, for the same percentage increase in consumer demand, farmers in Country A are more than 3x better off than farmers in Country B despite the fact that their share of the retail dollar is only 10% instead of 90%.
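For readers who want to check the arithmetic, here is a minimal sketch that reproduces figures of roughly this size. It is my reconstruction, not necessarily the paper's exact model: it assumes a fixed-proportions equilibrium-displacement setup where the 10% demand increase is a vertical (willingness-to-pay) shift in retail demand, margins are constant, and the producer-surplus change is approximated linearly.

```python
# Sketch of the equilibrium-displacement arithmetic behind the two-country
# example. Assumptions (mine): fixed-proportions margins, a 10% vertical
# (willingness-to-pay) shift in retail demand, linear surplus approximation.

def farmer_profit_gain(farm_share, revenue=100.0, demand_elast=-0.8,
                       supply_elast=0.2, shift=0.10):
    """Approximate change in farm producer surplus, in $ billions."""
    # With fixed margins, retail and farm prices move together, so the
    # farm price change p solves:  eps*p = eta*(share*p - shift)
    p = -demand_elast * shift / (supply_elast - demand_elast * farm_share)
    q = supply_elast * p                    # % change in farm quantity
    return revenue * p * (1 + 0.5 * q)      # dPS = P*Q*p*(1 + q/2)

print(round(farmer_profit_gain(0.1), 1))   # Country A: 29.4
print(round(farmer_profit_gain(0.9), 1))   # Country B: 8.8
```

With these assumed parameters, the low-share Country A gains roughly $29 billion and the high-share Country B roughly $8.8 billion, matching the magnitudes in the text.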
So, here’s a fundamental lesson: a small share of a big number can be much larger than a large share of a small number.
Now, none of this means that one cannot construct scenarios in which producers are worse off when the farmer’s share of the retail dollar falls. That’s easy to do too. But, as I’ve shown here, I can easily do the opposite.
The point? Changes in the farmer’s share of the retail dollar tell us essentially nothing about whether farmers are better or worse off.
Don't believe me? Listen to other agricultural economists. Here are Gary Brester, John Marsh, and Joseph Atwood writing in a 2009 journal article: