Craig Gundersen on Food Stamps

Last week, the Washington Post ran an interview with the agricultural economist Craig Gundersen on the food stamps program (officially known as the Supplemental Nutrition Assistance Program).  Craig has been researching this topic for decades and has a lot to say - much of which runs against what we regularly read in the media.  

Here's one little snippet:

So maybe people aren’t selling their EBT cards — but could they buy groceries with benefits and then sell them for cash? I hear from readers, for instance, who say they see people buying baby formula and reselling it.

There’s a couple things going on here. The first thing, with infant formula — I think they are probably thinking about the Special Supplemental Nutrition Program for Women, Infants and Children (WIC). Theoretically, you could have someone who doesn’t tell the WIC person they’re breastfeeding so that they still get infant formula. But those are totally separate programs. Sometimes people lump the two things together.

With respect to SNAP — I can’t make sense of that, from a rational standpoint. Presumably, in this scenario, people are buying groceries with SNAP and then reselling those groceries at a lower price than what they cost in the store. Most SNAP recipients are inframarginal, meaning that they spend more than their benefit amount on food. So if someone sold their benefits, they would be giving up the opportunity to purchase food, and would have to purchase food with other funds.

The only group of people who this might make economic sense for are those who are not inframarginal — meaning they only spend $100 per month on food and they get $120 per month in SNAP benefits. They could sell that extra $20 of food they get on benefits. But even if that was happening, it would be such a small proportion of people. Who cares?
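
Gundersen's inframarginal point is easier to see with a bit of arithmetic.  Here is a minimal sketch in Python using the numbers from his example; the assumption that benefits resell for 50 cents on the dollar is purely hypothetical and only for illustration.

    # Illustrative only: the 50% resale rate is a made-up assumption.
    def net_gain_from_selling(benefit_sold, food_given_up, resale_rate=0.5):
        """Cash received from reselling benefits minus the cash needed to replace any food given up."""
        return resale_rate * benefit_sold - food_given_up

    # Inframarginal household: it spends more than its $120 benefit on food, so every
    # dollar of benefit sold is a dollar of food it must re-buy with its own cash.
    print(net_gain_from_selling(benefit_sold=120, food_given_up=120))  # -60.0, a pure loss

    # Non-inframarginal household from the quote: $100 of food spending, $120 in benefits,
    # so only the $20 surplus can be sold without giving up any food.
    print(net_gain_from_selling(benefit_sold=20, food_given_up=0))     # 10.0, a small gain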

The whole thing is worth a read.

Solar Radiation and Crop Yields

My last post discussed some recent research we conducted on the impacts of biotechnology adoption on corn yields.  A reader forwarded a link to a recent paper in the journal Nature Climate Change raising an issue I hadn't yet heard about.  Using some simulations, the authors argue that solar brightening (a decades-long increase in incident solar radiation, i.e., brighter skies) has had a big impact on recent increases in corn yields.

Here's the abstract:

Predictions of crop yield under future climate change are predicated on historical yield trends, hence it is important to identify the contributors to historical yield gains and their potential for continued increase. The large gains in maize yield in the US Corn Belt have been attributed to agricultural technologies, ignoring the potential contribution of solar brightening (decadal-scale increases in incident solar radiation) reported for much of the globe since the mid-1980s. In this study, using a novel biophysical/empirical approach, we show that solar brightening contributed approximately 27% of the US Corn Belt yield trend from 1984 to 2013. Accumulated solar brightening during the post-flowering phase of development of maize increased during the past three decades, causing the yield increase that previously had been attributed to agricultural technology. Several factors are believed to cause solar brightening, but their relative importance and future outlook are unknown, making prediction of continued solar brightening and its future contribution to yield gain uncertain. Consequently, results of this study call into question the implicit use of historical yield trends in predicting yields under future climate change scenarios.

I don't know enough about the issue to speak to the credibility of the authors' findings.  However, I'm not sure this is much of a confound for our study on biotech adoption, because our estimated effects are (partially) identified using variation in yields across states that have differential adoption rates (and yet are presumably exposed to the same solar radiation).  To the extent that identification of our biotech adoption effects comes from comparisons of yields in the same counties over time (where solar radiation varied over time), this could be an issue, but again, the time trend included in our models should pick up this effect as well.
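
To make the identification logic concrete, here is a toy simulation of my own (it is not the specification or data from our paper): counties share a smooth, common brightening trend, states adopt GE corn at different times, and a regression that includes county effects and a common time trend still recovers the assumed adoption effect because the trend soaks up the brightening.

    # Toy simulation (my own illustration, not the paper's model or data).
    import numpy as np

    rng = np.random.default_rng(0)
    n_counties, n_years = 200, 36                       # roughly 1980-2015
    state = rng.integers(0, 10, n_counties)             # 10 hypothetical states
    t = np.arange(n_years)

    start = 15 + 2 * np.arange(10)                      # adoption starts at different times by state
    adoption = np.clip((t[None, :] - start[state][:, None]) / 10.0, 0.0, 1.0)  # county-by-year shares

    true_ge_effect = 18.0                               # bu/acre, purely for illustration
    brightening = 0.5 * t                               # smooth "solar brightening" common to all counties
    county_effect = rng.normal(100.0, 10.0, n_counties)
    yields = (county_effect[:, None] + true_ge_effect * adoption
              + brightening[None, :] + rng.normal(0.0, 5.0, (n_counties, n_years)))

    # OLS of yield on adoption, a common time trend, and county dummies
    y = yields.ravel()
    X = np.column_stack([
        adoption.ravel(),                               # GE adoption share
        np.tile(t, n_counties),                         # common time trend
        np.repeat(np.eye(n_counties), n_years, axis=0), # county fixed effects
    ])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    print(round(beta[0], 1))                            # close to 18.0: the trend absorbs the brightening

Of course, if brightening had varied across counties in a way that was correlated with adoption timing, a common trend would not be enough; that is the sense in which this could still be an issue.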

The Adoption of Genetically Engineered Corn and Yield

Many readers of this blog will probably remember the article by Danny Hakim (and the associated infographics) that ran in the New York Times back in October about the "broken promises" of GMOs.  The article prompted some cheers and some legitimate criticisms.  

After that article came out, one thing bugged me a bit.  We have scores of experimental studies showing GMOs (particularly the Bt varieties) increase observed yield, so why don't we see a pronounced effect on aggregate, national yield trends?  At about the same time these thoughts were swirling around in my head, I received a note from Jesse Tack at Kansas State University asking the same question.  As it turns out, we weren't the first to wonder about this.  The most recent (2016) National Academies report on GMOs noted the following:

the nation-wide data on maize, cotton, or soybean in the United States do not show a significant signature of genetic-engineering technology on the rate of yield increase

Here is a figure illustrating the phenomenon (from the paper I'll discuss in more detail in a moment).  Despite the massive increase in biotech adoption after 2000, national trend yields don't appear to have changed much.

So we began speculating about possible factors that could be driving this seeming discrepancy between the national, aggregate data on the one hand and the findings from experimental studies on the other, and wondered whether aggregation bias might be an issue or whether the lack of controls for changes in weather and climate might be a factor.  One of Jesse's colleagues, Nathan Hendricks, also suggested variation in soil characteristics, which, when matched up with different timings of adoption in different areas, might also be an explanation.

To address these issues, Jesse, Nathan, and I wrote this working paper on the subject, which will be presented in May at a conference put on by the National Bureau of Economic Research (NBER).  Here are some of the key findings:

In this paper, we show that simple analyses of national-level yield trends mask important geographic-, weather-, and soil-related factors that influence the estimated effect of GE crop adoption on yield. Coupling county-level data on corn yields from 1980 to 2015 and state-level adoption of GE traits with data on weather variation and soil characteristics, a number of important findings emerge. First, changes in weather and climatic conditions confound yield effects associated with GE adoption. Without controlling for weather variation, adoption of GE crops appears to have little impact on corn yields; however, once temperature and precipitation controls are added, GE adoption has significant effects on corn yields. Second, the adoption of GE corn has had differential effects on crop yields in different locations even among corn-belt states. However, we find that ad hoc political boundaries (i.e., states) do not provide a credible representation of differential GE effects. Rather, alternative measures based on soil characteristics provide a broad representation of differential effects and are consistent with the data. In particular, we find that the GE effect is much larger for soils with a larger water holding capacity, as well as non-sandy soils. Overall, we find that GE adoption has increased yields by approximately 18 bushels per acre on average, but this effect varies spatially across counties ranging from roughly 5 to 25 bushels per acre. Finally, we do not find evidence that adoption of GE corn led to lower yield variability nor do we find that current GE traits mitigate the effects of heat or water stress.

To get a sense of the heterogeneity in yield effects, here is a graph of the estimated impacts of adoption of GMOs for counties that differ in terms of their soil's water holding capacity.   

There's a lot more in the paper.

When Consumers Don't Want to Know

Since I first started working on the topic of animal welfare, I've had the sense that some (perhaps many?) consumers don't want to know how farm animals are raised.  While that observation probably rings intuitively true for many readers, for an economist it sounds strange.  Whether we're talking about GMO labeling, nutritional labels, country-of-origin labels on meat, or labels on cage-free eggs, economists typically assume more information can't make a person worse off.  Either the consumer uses the information to make a better choice or they ignore it altogether.

There is a stream of literature in economics and psychology that is beginning to challenge the idea that "more information is better."  One simple explanation for the phenomenon could be that consumers, if they know for sure they will continue to consume the same amount of a good, could be better off ignoring information because the information could only lower their satisfaction (perhaps because they'll feel guilty) about doing something they've already committed to doing.  In this paper by Linda Thunstrom and co-authors, 58% of consumers making a meal choice chose to ignore free information on caloric content, a finding that Thunstrom calls "strategic self-ignorance" arising from guilt avoidance.

Another possible explanation, which I've previously published on, is that when people have limited attention, more information on topic A might distract people from topic B, even though topic B ultimately has a larger impact on the consumer's well-being.

It may also be the case that people want to believe certain things.  They derive satisfaction from holding onto certain beliefs and will avoid information that challenges them.  These ideas and more are discussed by Russell Golman, David Hagmann and George Loewenstein in a nice review paper on what they call "information avoidance" for the Journal of Economic Literature.

A graduate student in our department, Eryn Bell, has been working with Bailey Norwood to apply some of these concepts to the topic of animal welfare.  They conducted a survey of 1,000 Oklahomans and asked them one of the two simple questions shown below.  Depending on how the question was asked, from 24% to 44% of respondents self-declared that they would rather NOT know how hogs are raised.  The primary reasons given for this response were trust in farmers (a belief consumers may prefer to hold), the sense that there are more important issues to worry about (limited attention), and guilt aversion.

In the same survey, Bell and Norwood also included a set of questions based on some ideas I suggested.  The question gave respondents the option to see a picture of how sows are raised or to simply see a blank screen for a certain period of time.  People were divided into three groups that varied in how long they had to view the blank screen.  The idea was that we could use the waiting time as a "cost", which would allow us to ask: how long are people willing to wait to NOT receive free information?  As it turns out, people weren't very sensitive to the waiting time.  Nonetheless, regardless of the waiting time, about a third of respondents preferred to see an uninformative blank screen as opposed to a more informative screenshot of sow housing.  These findings suggest that at least some people, at least some of the time, would prefer not to know.

Does a Good Diet Guarantee Good Health?

To be sure, dietary factors contribute to bad health at least some of the time for some people.  But how large a role does diet play?  Stated differently: even if you eat well all the time, are you guaranteed to be free of cancer, heart disease, and diabetes?  Far from it, according to two recent studies.

The first was published Friday in Science by Tomasetti, Li, and Vogelstein, who investigated the causes of cancer.  Discussions of what causes cancer normally put the causes into one of two broad categories: nurture (environmental factors) or nature (inherited genetic factors).  These authors, however, point to a third factor: as we grow, our cells naturally replicate themselves, and in the process unavoidable DNA replication errors occur that can ultimately lead to cancer.  The authors calculate that these replication errors or

mutations are responsible for two-thirds of the mutations in human cancers.

Second, I ran across this interesting paper published a couple of weeks ago in the Journal of the American Medical Association.  The authors attempted to ferret out how many deaths from heart disease, stroke, and type 2 diabetes (what the authors call "cardiometabolic deaths") come about each year from over- or under-consumption of certain types of foods.  As this critic pointed out, it is important to note that the authors' estimates are associations/correlations, NOT causation.  As such, I'd suggest caution in placing too much weight on the estimated impacts of different types of food.  Nonetheless, there were a couple of other, less-well-publicized results that I found interesting.

First, the authors found:

In 2012, suboptimal intake of dietary factors was associated with an estimated 318 656 cardiometabolic deaths, representing 45.4% of cardiometabolic deaths.

Stated differently, 54.6% of deaths from heart disease, stroke, and type 2 diabetes seem to be caused by something other than diet.

The other result that I found interesting from this study is that there has been a big decline in so-called cardiometabolic deaths.  The authors write:

Between 2002 and 2012, population-adjusted US cardiometabolic deaths per year decreased by 26.5%.

Some of this decline, they argue, is due to reduced sugar consumption and increased nut/seed consumption from 2002 to 2012.

Why does all this matter?  Because these statistics help us understand the impacts of dietary and lifestyle changes.  To illustrate, let's take the above cancer statistic: 66.7% of cancers are caused by unavoidable replication errors.  That leaves 33.3% of cancers, some of which are diet and lifestyle related and some of which are caused by inherited genetic factors.  For the sake of simplicity, let's say you have zero risk from inherited genetic factors.  Also note that the National Cancer Institute suggests that the chances of getting a new cancer in a given year are 454.8 per 100,000 people (or a 0.45% chance).

Putting it all together, your chance of getting cancer from random errors in DNA replication is 0.667*0.45%=0.30%, and your chance of getting cancer from diet and lifestyle factors (assuming no inherited risks) is 0.333*0.45%=0.15%.  So, even if you could completely eliminate the cancer risk from diet and lifestyle factors, you'd go from a 0.45% chance of getting a new cancer to a 0.30% chance, a reduction of 0.15 percentage points.
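
For what it's worth, here is that back-of-the-envelope arithmetic spelled out in a few lines of Python (treating the two-thirds/one-third split as exact, which is of course a simplification):

    # Checks the back-of-the-envelope calculation above.
    annual_incidence = 454.8 / 100_000               # NCI figure: new cancers per person per year (~0.45%)

    replication_share = 2 / 3                        # share attributed to unavoidable replication errors
    other_share = 1 - replication_share              # diet, lifestyle, and inherited factors combined

    risk_replication = replication_share * annual_incidence
    risk_other = other_share * annual_incidence

    print(f"from replication errors: {risk_replication:.2%}")   # 0.30%
    print(f"from everything else:    {risk_other:.2%}")         # 0.15%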