Blog

More on Soda Taxes

The Huffington Post just ran a piece I wrote in response to a prior post in the same outlet by a doctor advocating for soda taxes. Here are some excerpts:

Jeff Ritterman, Vice President of the Board of Directors for his local chapter of the Physicians for Social Responsibility, recently wrote on The Huffington Post that we should "Tax a Cola, Save the Planet."

I've read lots of editorials advocating soda taxes, but this one beats them all in promising what a soda tax will deliver. He argues that:

"A simple policy change like the Soda Tax can help us waste less water, lower our GHG production, and lessen the pollution of our air, water and soil. At the same time, it can fund vital programs in our schools, parks and neighborhoods to improve nutrition and physical education opportunities for our children. It's a win-win-win: a win for the environment, a win for our children, and a win for our communities."

Wow, one tax will do all that! H.L. Mencken reportedly said that "for every complex problem there is an answer that is clear, simple, and wrong."

After discussing the literature showing how price changes and soda taxes cause people to substitute toward other beverages and foods, I argued:

Ritterman argues that a soda tax will result in less soda being consumed, and he's probably right. But does that mean that there will be less water and less aluminum consumed? Well, it depends on what consumers drink and eat instead of soda. If people instead drink more milk or more beer, as previous studies have suggested, will water consumption really be cut? If more cows are needed to produce more milk due to the increased demand caused by soda taxes, will greenhouse gases really fall?

Ritterman is right to suggest that enacting a soda tax can raise revenue for the government. But that hardly makes it an economically efficient thing to do. A tax is akin to a reduction in one's income. No one likes having less income. Using taxes to direct people to buy goods they didn't purchase before the tax cannot make people better off.
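
To make the substitution point concrete, here is a rough back-of-the-envelope sketch. The elasticity values below are purely hypothetical placeholders (not estimates from the studies mentioned above); they simply illustrate how a tax that cuts soda purchases can, at the same time, raise purchases of substitutes like milk and beer.

```python
# Back-of-the-envelope substitution sketch (all numbers hypothetical).
# A soda tax raises the price of soda; own- and cross-price elasticities
# translate that price change into quantity changes for each beverage.

soda_price_increase = 0.20  # hypothetical 20% tax-induced price rise

# Hypothetical elasticities with respect to the price of soda:
elasticities = {
    "soda": -1.2,   # own-price: soda purchases fall
    "milk": 0.35,   # cross-price: milk is a substitute, purchases rise
    "beer": 0.15,   # cross-price: beer purchases rise slightly
}

for beverage, elasticity in elasticities.items():
    pct_change = elasticity * soda_price_increase * 100
    print(f"{beverage}: {pct_change:+.1f}% change in quantity purchased")

# With these made-up numbers, soda falls ~24% while milk rises ~7% and
# beer ~3% -- whether water use or GHG emissions actually fall depends
# on what replaces the soda.
```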

Meat expenditure shares

It seems there is a constant barrage of studies, books, and media coverage critical of animal agriculture. The negative publicity is multifaceted, ranging from concerns about animal welfare and health impacts to food safety, climate change, environmental impacts, water usage, food security, and on and on.

Just to give one representative example, here is James McWilliams writing in The New York Times in a 2012 article entitled "The Myth of Sustainable Meat":

The industrial production of animal products is nasty business. From mad cow, E. coli and salmonella to soil erosion, manure runoff and pink slime, factory farming is the epitome of a broken food system.

He argues there, and in a more recent 2014 editorial in the same outlet, that the best solution is to give up eating meat. I used McWilliams as an example, but I could have picked any number of high-profile books (e.g., here or here), academics, advocacy groups (e.g., here or here), or news stories that paint conventional animal production industries in a less than favorable light.

Here is my question: how much impact, if any, has this had on consumers' demand for meat, dairy, and eggs?  

To indirectly get at this question, I turned to some data collected by the Bureau of Economic Analysis (BEA) on Personal Consumption Expenditures. The BEA reports total expenditures on food at home in a variety of categories going back to 1959. I took these data and calculated the share of total expenditures on food eaten at home (what the BEA calls "Food and beverages purchased for off-premises consumption") attributable to beef, pork, poultry, eggs, and dairy products. For reference, total expenditures on food eaten at home were about $61.5 billion in 1959 (in 1959 dollars) and about $884 billion in 2013 (in 2013 dollars).
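
For readers who want to replicate the arithmetic, here is a minimal sketch of the share calculation. The file name and column labels are placeholders (the actual BEA PCE tables use a different layout and would need to be reshaped first); the point is simply that each category's share is its expenditure divided by total food-at-home expenditure in the same year.

```python
import pandas as pd

# Sketch of the expenditure-share calculation (file and column names are
# placeholders, not the BEA's actual table layout).
# Each row: one year of nominal expenditures, in billions of dollars.
df = pd.read_csv("bea_pce_food_at_home.csv")  # hypothetical extract

categories = ["beef", "pork", "poultry", "eggs", "dairy"]

# Share = category expenditure / total food-at-home expenditure, by year.
for cat in categories:
    df[cat + "_share"] = df[cat] / df["food_at_home_total"]

print(df[["year"] + [c + "_share" for c in categories]].head())
```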

There was a remarkable downward trend in the share of consumers' food budgets going to dairy and beef from 1959 until the early 1990s, and an uptick for poultry. Consumers went from spending about 12-14% of their food budget on beef and another 12-14% on dairy in the early 1960s down to about 5-8% on each in the early 1990s. Stated differently, consumers just about halved the proportion of their food budget going toward beef and dairy over a 30-year period.

There were a lot of reasons for these changes. These industries became much more productive and prices fell, so consumers could allocate less of their budget to these items but still consume the same amount or more. The price of poultry fell much more rapidly than the price of beef, and thus some of the downward trend reflects substitution away from beef toward poultry. There were also consumer concerns during that period related to cholesterol, saturated fat, E. coli, etc. that led to less consumption of beef and dairy.

Despite all that, it is remarkable how resilient meat demand has been over the last 20 years in light of the large amount of negative publicity mentioned earlier. To illustrate, here is the graph just from 1993 to 2013.

The lines are essentially flat.  People are allocating just about the same amount of their food budget to beef, pork, dairy, poultry, and eggs today as they did 20 years ago.  
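
For anyone wanting to redraw that 1993-2013 picture from the BEA series, here is a rough plotting sketch; as with the earlier sketch, the file and column names are placeholders rather than the actual BEA table layout.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Redraw the 1993-2013 expenditure-share graph (same hypothetical
# file/column layout as in the share-calculation sketch above).
df = pd.read_csv("bea_pce_food_at_home.csv")
categories = ["beef", "pork", "poultry", "eggs", "dairy"]

recent = df[df["year"].between(1993, 2013)]
for cat in categories:
    share_pct = 100 * recent[cat] / recent["food_at_home_total"]
    plt.plot(recent["year"], share_pct, label=cat)

plt.xlabel("Year")
plt.ylabel("Share of food-at-home spending (%)")
plt.legend()
plt.show()
```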

It may be the case that all the aforementioned negative publicity in recent years will eventually cause consumers to allocate their food budget away from animal products.  But, at least so far, it doesn't seem to have had much of an impact.

Really? No conflicts of interest?

It is becoming the norm in academia to provide lists of conflicts of interest when submitting an article for review at a journal or sometimes even when speaking at conferences.  By and large, I think the move toward transparency is a good one.  

But as one set of authors points out in the Journal of the Royal Society of Medicine, many academics have too narrow a view of what constitutes an "interest." Richard Smith and colleagues write:

People who work for public sector institutions regard themselves (and are often regarded) as being neutral, disinterested, and unbiased supporters and defenders of the public interest. There is, however, a large literature by economists and political scientists known as ‘public choice theory’ (that even has its own scholarly journal, Public Choice) that demolishes this pretension. Public institutions and the individuals that work for them are found to be self-interested, much like private institutions and their employees. Government bureaucracies seek to maintain and expand their scale and influence, a reality which is captured in arguments against the ideal of impartial civil servants in the Weberian bureaucracy. United Nations agencies fight over territory and mandates. Individuals working for public institutions with a certain culture (such as the London School of Hygiene and Tropical Medicine, where one of us [RF] was the dean) know that their career prospects may be advantaged by being a part of that culture rather than iconoclasts. As others have noted, being a ‘public servant’, or an ‘international public servant’ or the employee of a university does not make one un-self-interested or un-conflicted.

Academics, especially in applied fields such as global health and medicine, often have numerous relations with not-for-profit organizations – including governments, foundations, non-governmental organizations and United Nations agencies. These relationships typically include some combination of remuneration for advice or assistance, research funding (which may also include salary support for the principal investigator) and travel support. More generally, these relationships are likely to be career enhancing, as when an academic has multiple relations with the World Health Organization (WHO) and is frequently called upon by WHO for services of various kinds. Many of the organizations with which the academic has relations have stated positions on issues affecting public health and indeed many other topics. Surely there is potential here to influence an academic's expression of views – in other words a potential conflict of interest worthy of declaration.

They conclude:

Our message is simple: we must recognize that we are all conflicted and declare accordingly. A view of the world that sees employees of private for-profit companies as conflicted and doctors, or employees of public or academic bodies, as not, is naïve, potentially deceptive and likely to distort reader response to new information.

Bt resistance

A couple of recent studies have raised concern that certain corn rootworms are becoming resistant to the Bt produced by biotech corn. See, for example, this paper in the Proceedings of the National Academy of Sciences by Iowa State University entomologists published last week (some of the same authors seem to have a similar paper published in 2011 in PLOS ONE) and this paper published in Nature Biotechnology this summer. The most recent study has prompted quite a bit of attention on the web from outlets as varied as Wired and Grist.

Any pesticide (biotech or not) has the potential to become ineffective over time due to the development of genetic resistance in insect (or weed) populations. Plant genetics companies, knowing this, have tried several strategies to slow the spread of resistance, such as developing several types of Bt that produce different insect-killing proteins (which appears to have had only limited effectiveness) and the planting of refuges. A refuge is a planting of non-Bt corn near Bt corn, which reduces the selective pressure on rootworms and other pests and thus potentially extends the effective life of the Bt trait. Originally, corn farmers were supposed to plant a certain percentage of their acreage in non-Bt corn as a refuge; more recently, we have seen "refuge in a bag," in which Bt seed is delivered to farmers premixed with non-Bt seed.

Some sources place the blame for the development of resistance on biotech companies lobbying for lower refuge requirements, or on farmers for failing to observe the requirements. That may be partially true. Any individual farmer likely faces an incentive to free-ride off their neighbor's refuge (something that can be eliminated with the "refuge in a bag" concept), but it strikes me as incredibly short-sighted that biotech companies would willfully advocate for policies that would reduce their long-run profitability (unless it is in their interest to allow Bt resistance to develop because they have other products in the pipeline that become more valuable as Bt resistance develops).
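
The free-rider point can be made with a toy payoff example. The numbers below are invented purely for illustration: planting a refuge sacrifices a little yield today, while the benefit of delayed resistance is shared with the neighbor, so each individual farmer does best by skipping the refuge no matter what the neighbor does.

```python
# Toy refuge free-rider payoffs (all numbers invented for illustration).
# Each refuge planted delays resistance, which is worth 1.5 "yield units"
# to EACH of two neighboring farmers, but costs the planting farmer 2
# units of forgone yield today.

BENEFIT_PER_REFUGE = 1.5   # shared gain each farmer gets per refuge planted
REFUGE_COST = 2.0          # private cost borne only by the farmer who plants

def my_payoff(i_plant: bool, neighbor_plants: bool) -> float:
    refuges = int(i_plant) + int(neighbor_plants)
    return BENEFIT_PER_REFUGE * refuges - (REFUGE_COST if i_plant else 0.0)

for neighbor in (True, False):
    for me in (True, False):
        print(f"neighbor plants={neighbor!s:>5}, I plant={me!s:>5} "
              f"-> my payoff = {my_payoff(me, neighbor):+.1f}")

# Whatever the neighbor does, skipping the refuge is individually better
# (1.5 vs 1.0 if the neighbor plants; 0.0 vs -0.5 if not), even though
# both farmers planting beats neither doing so -- the free-rider problem
# that "refuge in a bag" is meant to eliminate.
```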

As I see it, the real challenge here is Mother Nature herself. Agriculture is inherently a struggle against nature. We have become so accustomed to seeing crop yield gains that it is sometimes easy to forget that one of the biggest challenges is simply trying to keep up with nature's adaptations to the latest varieties. The natural state of affairs is yield decline, not yield increase. Seen in this light, science and technology seldom offer a one-time fix. It is a constant struggle. We find a solution. Nature responds. We try to find another solution. Nature adapts again. And on and on it goes.

No doubt there are many who argue that we should step off this technology treadmill. We probably can find ways to better work with (or at least accept some loss in efficiency from) natural pests, and that may be one of our adaptations. But I think it is foolish to think we can ever really step off the treadmill. There never was, nor will there be, some perfect ecosystem equilibrium. Bacteria, insects, and weeds have been and always will be evolving to get the upper hand on their competitors (that's us and our food crops), and we will do the same. Our best bet is to try to stay one step ahead, knowing our natural competitors won't be far behind.