Feeding the world: nutrition over calories, plants over animals
MFA Livestock and Food Security
Without a doubt, the most commonly repeated idea in the food security literature is the challenge of feeding the 9.7 billion people we are projected to have by 2050. To do this, we will need +50% to +69% more food[^1] in 2050 than we had in the early 2010s.
Whilst anyone going hungry is regrettable, this problem may get more attention than more pressing food security issues. As of 2022, more than 90% of the world ate enough calories, and most LMICs now simultaneously struggle with both overweight and undernutrition (the “double burden of malnutrition”). One review of 25 future scenario models found that, overall, humanity is on track to feed itself enough calories in 2050 without taking drastic measures. Many models highlighted that strong increases in crop productivity are needed, but that this seems achievable. Many models also projected the need to expand agricultural land across the world.

An influential model by Nelson et al. (2018) simulated what the world’s food situation would look like in 2050 under various assumptions about global economic growth and damage due to climate change. They found that even assuming climate change reduces agricultural yields, most of the world will likely be able to afford enough calories and protein in 2050. The share of the average household budget spent on food in low income countries was expected to drop from 44% to 17-22%. This increased affordability suggests that food availability and food access are on track to be met by 2050. Of course, any simulation of the world decades into the future should be taken with a large grain of salt, and success is far from guaranteed. Nonetheless, there may be cause for optimism on this particular issue.
Feed-food competition
Some researchers go further, arguing that we could feed 9.7 billion people now. All we would need to do is stop feeding human-edible crops to farm animals. If those crops were fed directly to humans instead, we could potentially feed an additional 4 billion people and increase the calories available to humanity by 70%. Berners-Lee, Watson and Hewitt (2018) reached a similar conclusion, noting that in 2013 we produced nearly 6000 calories of human-edible crops per person per day. This is far more than the 2400 calories per day required to sustain the average human as declared by the FAO, so even if humanity did not increase crop yields at all between 2013 and 2050, feeding everyone enough calories would still be possible. Yet around 1700 of those daily calories are fed to farm animals rather than humans, a phenomenon known as "feed-food competition". Most of these feed calories are never converted into meat or egg calories we can eat, as animals must spend calories to grow and live.
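The calorie accounting behind this argument can be sketched in a few lines. This is a back-of-envelope illustration using the per-person figures quoted from Berners-Lee, Watson and Hewitt (2018), not a reproduction of their model.

```python
# Back-of-envelope sketch of the feed-food figures cited above.
# All numbers are kcal per person per day, as quoted in the text.

CROP_CALORIES_PRODUCED = 6000   # human-edible crop calories grown (2013)
CALORIES_REQUIRED = 2400        # FAO average daily requirement
CALORIES_FED_TO_ANIMALS = 1700  # crop calories diverted to livestock feed

# Even with the feed calories set aside, enough remains for everyone:
direct_supply = CROP_CALORIES_PRODUCED - CALORIES_FED_TO_ANIMALS
assert direct_supply >= CALORIES_REQUIRED  # 4300 >= 2400

# If the diverted feed calories went to humans instead, they could cover
# the full daily requirement of roughly this fraction of extra people:
extra_people_fraction = CALORIES_FED_TO_ANIMALS / CALORIES_REQUIRED
print(f"{extra_people_fraction:.0%}")  # ≈ 71%, in line with the ~70% figure
```

The 71% figure lines up with the "increase calories available by 70%" claim; the "4 billion additional people" estimate additionally depends on the world population and feed-conversion assumptions used in the original studies.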
Feed-food competition reduces food availability, as crops are fed to farm animals rather than humans. By increasing the need for intensive crop farming, it can accelerate soil degradation, compromising the stability of the entire food system over the medium to long term. It can also reduce food access even when food is available: livestock farmers may bid up the price of grains, resulting in higher prices for everyone. The USDA has noted that the high food inflation seen in 2021 and 2022 was likely made worse by livestock farmers competing to purchase limited grain supplies. One caveat: despite its intuitive appeal, I was unable to find any report or academic paper directly demonstrating that feed-food competition has resulted in higher food prices.
Feed-food competition is a critique regularly leveled at industrial animal agriculture. While persuasive, there are three counterpoints worth bearing in mind:
- As the sections above point out, we are on track to provide enough calories for the world in 2050 even without reducing the crops we feed to animals. As such, the food availability and food access risk may be low.
- Food security researchers now acknowledge that the availability and access of calories are arguably less important to food security than getting adequate nutrition (food utilization). This is explored below.
- This argument assumes that farmers who produce crops for animal feed would simply switch to selling their crops as human food. It is unclear if this would happen, because large increases in the supply of food crops could drive prices down, reducing their profitability. Farmers may switch to growing other plants, or sell their crops for biofuels.
In response to the issues created by feed-food competition, some now advocate a "livestock on leftovers" approach. Intuitively, this strategy involves feeding livestock only crop residues and by-products that humans cannot eat, as well as uneaten human food.
Some version of this strategy is already widely applied informally across the world; it is common for ‘backyard livestock’ to be fed food scraps. It is less common in more intensive agricultural systems: in Europe it is illegal to feed livestock untreated human food waste. Livestock may only be fed by-products from food production (such as whey powder) or food that was produced but could not be sold for whatever reason (e.g. surplus or defective production). This is because of potential disease risks; feeding untreated waste is thought to have caused the 2001 outbreak of Foot and Mouth disease in the UK, for example. Despite this, a 2018 survey found that over 75% of British pig farmers supported re-legalization, indicating that livestock on leftovers may be quite popular among farmers.
There is clear potential for livestock on leftovers to provide a variety of benefits. One review explored the specific case of feeding farmed animals uneaten human food that would otherwise go to waste, focusing on stale bakery products. They found that food safety for the animals was generally satisfactory, but that the effects of low-fiber, sugary carbohydrates on livestock gut microbiomes were not well understood. They also found that feeding livestock this way could reduce greenhouse gas emissions, as the food would otherwise go to landfill.
However, large-scale assessments suggest the benefits come with substantial tradeoffs. A simulation by Shrader et al. found that for the world to produce enough food (and protein) in 2050 using a pure livestock on leftovers approach, global meat consumption would have to decrease by 71% from current levels. Similarly, another simulation found that Nordic countries would have to cut meat consumption by 89% if no livestock were fed crops, or drastically increase food imports. Considering that most of the world is increasing meat consumption, such a dramatic change seems unrealistic. Shrader et al. also noted that meat produced per animal would decrease by anywhere between 0% and 40%, as many leftovers are less calorie-dense than the concentrated animal feed typical of industrial animal agriculture. Finally, they found that ruminants fed by grazing and leftovers had higher greenhouse gas emissions per animal, though this was more than offset by reduced emissions from the dramatic reduction in meat production.

Surprisingly, one analysis of multiple studies concluded that livestock on leftovers would free up 25% more arable land than if humanity did not eat any animal products at all (i.e. world veganism). This is because feeding livestock human-inedible food would convert inedible calories into edible ones, meaning we would need to grow fewer crops, sparing land. Once again, however, Europe and America would have to reduce meat consumption by 60-70%, perhaps as much as 80%, in order for Asia and Africa to increase theirs. Whilst a livestock on leftovers approach could reduce food waste and would likely be safe for animals, the risk of greatly reduced yields and higher GHG emissions per animal means it would only be feasible if society were already making large reductions in meat consumption.
Malnutrition is a greater problem than not enough calories
Humanity might have enough calories to survive in 2050, but will we all have enough nutritious food to truly thrive? Put in food security terms, future food utilization looms larger than future food availability. While only a small fraction of the population today cannot afford enough calories, just under half the world cannot afford a nutritious diet. Globally, it’s estimated that over half of children under five years old are deficient in at least one of iron, zinc, or vitamin A. These deficiencies can result in ‘stunting’, where a child is too short for their age, indicating that they are not growing healthily. Stunting is associated with a variety of lifelong negative health and cognitive effects, so prevention is incredibly important. In addition, over two-thirds of non-pregnant women of reproductive age are deficient in at least one of iron, zinc, and folate (vitamin B9).
These statistics are alarming and, unfortunately, not trending in the right direction. Between 2017 and 2021, both the absolute number and the percentage of the world suffering from malnourishment increased. The simulations for 2050 by Nelson et al. (2018) found that in many scenarios, substantial proportions of the world remained deficient in calcium, folate, and vitamins D and E even as they reached sufficiency in calories. There was also a significant risk that, even with projected growth in wealth across the developing world, many people would still not be able to afford a diet rich in vital nutrients such as iron, potassium, or zinc. This is despite the poorest countries doubling their intake of both meat and vegetables. Likewise, Berners-Lee, Watson and Hewitt (2018) found that while we currently produce enough calories to feed everyone in 2050, we do not produce enough fruits and vegetables to meet micronutrient needs. The data show that global diets need to change course to ensure that all countries achieve food utilization through good nutrition.
Animal products and malnutrition
It is widely believed by major development organizations (the FAO, the World Bank etc) that the world's poor need to eat more animal products. This is based on the belief that animal products are uniquely good sources of key micronutrients. But is this true? Are animal products the only viable solution to micronutrient deficiencies in low and middle income countries?
It is true that animal products contain high levels of bioavailable micronutrients like iron, vitamin A, vitamin B12, iodine, and zinc. Crucially, these are often the micronutrients that the global poor are most deficient in, and it is not obvious that these deficiencies can be met with plant-based diets. Most notably, there are no significant vegan sources of B12 (see further discussion of this below). A 2024 systematic review of 14 studies found that reducing animal products in favor of plant-forward diets in the West was often associated with decreased intake of some of these nutrients, notably B12, zinc and calcium (though iron intake increased). These studies were carried out in Western populations that consume large amounts of animal products, so we should be careful generalizing to the global poor. However, they at least tentatively support the argument that animal products are in fact significant sources of these micronutrients.
There is more direct evidence that increasing consumption of animal products may boost micronutrient levels. One particularly large study of 130,000 children across 49 countries found that consumption of animal products decreases stunting. A systematic review of 14 randomized controlled trials found that infants (6-24 months) in LMICs whose diets were supplemented with animal products grew significantly more. Crucially, slow growth between 6 and 24 months is a strong signal that a child will grow up stunted. Overall, there is good evidence that small amounts of animal source foods can fight malnutrition in the world’s poor.
It is important to clarify what these findings do and do not imply. Firstly, just because animal products can be significant sources of micronutrients, does not mean they are the best source for addressing food insecurity worldwide. Second, it is clear that no country requires American/European levels of animal product consumption. There is a growing consensus that most citizens of high income countries (HICs) eat more animal products than are necessary for good health.
One argument that is commonly made is that animal foods are better because they are some of the most “nutrient dense” foods we have. This is true. Micronutrient density can be defined as the amount of micronutrients that are found in 100 calories of a food, or 100 grams of food. Foods with low levels of micronutrient density might be called "nutrient-sparse" foods. Nearly all animal products score highly on these measures of nutrient density.
But upon closer inspection, these metrics are often irrelevant to food security in developing countries. The (hypothetical) statement that "per 100 calories, beef has higher levels of vitamin A than potatoes" is only useful to someone trying to maximize their vitamin A intake while also keeping calories as low as possible. Similarly, the statement "per 100 grams, fish contains more iodine than 100 grams of spinach" is only useful to someone trying to maximize their iodine intake in as little quantity of food as possible.
Neither of these metrics is relevant to food-insecure people, who are not trying to minimize calories or the weight of their food. Those who are hungry as well as nutrient-deficient may actually want more calories and more grams of food (a higher weight of food reduces feelings of hunger). Food-insecure people care far more about how easy foods are to produce, or how cheap they are to buy. 200g of beef might meet one's daily iron needs, compared to 500g of spinach. But if 200g of beef costs 5 times as much as 500g of spinach, or the spinach can be easily grown where a cow cannot be raised, then spinach provides better food access than beef, regardless of beef's superior nutrient density (note this example uses fictitious numbers for illustration). More relevant food security metrics might be, for example, "calcium content per rupee at the local market" or "zinc content per hour of labor". It is not clear that animal products are favorable on either of these metrics.
I could find only one study that quantified this, and data were hard to come by. That study used FAOSTAT data to show that, on average in low and middle income countries, getting 10g of protein from pulses and nuts is cheaper than getting the same amount of protein from any animal source, even when taking into account plants’ lower protein bioavailability. I conducted a bespoke analysis using food price data from the World Food Programme and found that, across 191 locations around the world, beans are a cheaper source of iron (95% of comparisons in over 250 markets) and zinc (57% of locations) than beef (see Appendix 2 for methods). The analysis takes into account the lower bioavailability of zinc and iron. For example, in the Gatore region of Rwanda, 1kg of beef contains 1.3 times more iron and 3.8 times more zinc than 1kg of dried beans. However, because beef is 5.5 times more expensive than beans, it is still cheaper to meet a family’s iron needs through beans. In the town of Kerben, Kyrgyzstan, the beef-to-bean nutrient ratios are similar, but beef is only 3.5 times more expensive. As a result, beef there is a cheaper source of zinc than beans, but a more expensive source of iron.
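The logic of these comparisons can be sketched with the relative figures quoted above. This is an illustrative reconstruction, not the actual Appendix 2 analysis: it works purely in beef-to-bean ratios and assumes the bioavailability adjustments are already folded into the nutrient ratios.

```python
# Sketch of the "nutrient per unit cost" comparison, using the relative
# figures quoted for Gatore (Rwanda) and Kerben (Kyrgyzstan).

def cheaper_source(price_ratio, nutrient_ratio):
    """Return which food meets a nutrient need more cheaply.

    price_ratio:    beef price / bean price (per kg)
    nutrient_ratio: beef nutrient content / bean nutrient content (per kg)
    Beef's cost per unit of nutrient, relative to beans, is price/nutrient:
    above 1, beans are the cheaper source; below 1, beef is.
    """
    relative_cost = price_ratio / nutrient_ratio
    return "beans" if relative_cost > 1 else "beef"

# Gatore: beef is 5.5x the price, with 1.3x the iron and 3.8x the zinc
assert cheaper_source(5.5, 1.3) == "beans"  # iron: beans win
assert cheaper_source(5.5, 3.8) == "beans"  # zinc: beans win

# Kerben: similar nutrient ratios, but beef is only 3.5x the price
assert cheaper_source(3.5, 1.3) == "beans"  # iron: beans still win
assert cheaper_source(3.5, 3.8) == "beef"   # zinc: beef now wins
```

The Kerben case shows why the price ratio matters: once beef's price premium (3.5x) falls below its zinc advantage (3.8x), beef becomes the cheaper zinc source even though it remains the more expensive iron source.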
There is one extreme scenario where nutrient density matters: when someone eats a diet mostly consisting of very nutrient-sparse foods. In this case, it is possible for someone to eat enough calories and feel full whilst suffering from micronutrient deficiency. An intuitive example might be an unrealistic diet consisting entirely of cake. Many LMICs overconsume nutrient sparse foods, and this exacerbates micronutrient deficiency. However, this is not necessarily an argument in favor of animal products. It simply shows that there is a minimum micronutrient density that we need to exceed. It doesn't logically follow that we need to maximize nutrient density. If animal products are expensive or difficult to produce, or have additional environmental/health externalities, it may be more tractable to solve micronutrient deficiencies using plant sources, even if they are less nutrient dense per gram or calorie.
“Future Smart Foods” as a potential answer to nutrition insecurity
If animal products are not the solution to global micronutrient needs, what are the alternatives? As seen in the review above, simply substituting animal products with plants, especially grains, can risk micronutrient deficiencies. It is important that we explore low animal diets that are high in micronutrients.
One promising option is Future Smart Foods (FSF), championed by the FAO and Bill Gates. These are neglected and underutilized species (NUS) that are also nutrient dense, climate resilient, economically viable, and locally available or adaptable (FAO, 2018). Examples are given in Figure 2. By definition, these foods are selected to tackle food security from multiple angles:
- They improve food access for smallholders in remote rural areas. This is because they require few inputs; remote rural farmers may not have reliable access to specialist seeds, veterinarians and farming and processing machinery.
- They improve food utilization by being high in micronutrients. For example, lupin contains five times more protein, eight times more dietary fiber, four times more iron and 44 times more folate than rice. Additionally, fewer than 50 calories of Amla (Indian Gooseberry) are required to meet daily vitamin C needs.
- They bolster food stability by improving the diversity of crop species cultivated in an area, as well as being locally adapted and drought tolerant.
One example of a Future Smart Food is the Persian Moringa tree. It can grow in arid and desert regions, and its leaves and seeds contain high levels of vitamins A and C, as well as calcium and potassium. Another example is Chaya, or tree spinach, which is popular in and native to Mexico and Central America but is grown as far away as Cambodia. It grows fast, is drought- and disease-resistant, and is high in protein, iron and vitamin C, containing 2.2x more iron than cabbage, which is itself considered a good plant source of iron. It may also provide a good source of income to growers, as it is cheap to produce and appealing to tourists from high income countries.
Other examples of promising Future Smart Foods are millets such as Finger Millet, Teff and Fonio, collectively referred to as “Nutricereals”. Teff is used to make Injera, the popular sour flatbread that is the hallmark of Ethiopian cuisine; notably, it is the only grain high in vitamin C. Millets typically contain higher levels of vitamin A, calcium, iron, zinc, riboflavin, and folic acid than staple crops. The Gates Foundation, alongside CGIAR, sees millets as playing a promising role in addressing micronutrient deficiencies across the world. They are currently enjoying widespread popularity, with the UN declaring 2023 the International Year of Millets. This followed the Indian Government taking a great interest in millets as a food security solution and declaring a nationwide Year of Millets in 2018.
![][image3]
Figure 2: examples of Future Smart Foods that excel in mountainous regions where soil is often poor. From Li and Siddique (2020)
Despite these strong advantages, there are barriers to widespread adoption of FSFs. Gates notes that some types of millet are difficult to scale because we haven't yet developed machines to process them efficiently. Knez et al. (2023) present 7 case studies of FSFs, including buckwheat, lentils, green leafy vegetables, sow thistle, grass pea, cucumber melon, and eggplant, along with the key barriers to increasing their adoption. Common barriers include lower yields and a lack of genomic sequences, which prevents rapid breeding improvements. There is also a cultural element: they are often perceived as "old-fashioned" and "food for the poor". This negative image in the eyes of consumers can result in low market prices, as found in a study of farming villages across Nepal and Bangladesh, where farmers are growing less buckwheat and millet and more cash crops such as fruits and coffee. One potential solution is for governments to indirectly make Future Smart Foods more profitable by subsidizing them or guaranteeing minimum prices. The Indian Government does this for several crops, including varieties of millet, and can also reimburse farmers who are forced to sell below the minimum price. The scheme has seen mixed success, partly due to low farmer awareness, but if implemented assertively it could incentivise growing food security-boosting crops. Public education campaigns on the numerous benefits of these foods (such as those seen in India) are likewise key to ensuring we reap their full potential to improve food security.
Biofortification of fruits and vegetables to address malnutrition
Biofortification is the selective breeding (or genetic engineering) of crops to increase their micronutrient levels. Generally these are staple crops such as rice, maize and potatoes. This contrasts with standard fortification, which involves adding micronutrients to food after the fact; a common example worldwide is adding iodine to table salt. The most prominent program in biofortification is the CGIAR HarvestPlus Program, which focuses on iron, zinc, and vitamin A. HarvestPlus partners with commercial agricultural companies as well as non-profits to develop and deploy seeds to those who need them. At the time of writing, HarvestPlus has deployed 293 varieties to over 100 million people in 30 countries (see their website for an interactive map). Notable examples of biofortified crops are shown below:
![][image4]
Figure 3: examples of biofortified crops and their benefits when eaten regularly. Source: Dwivedi et al., (2023)
Compared to Future Smart Foods, biofortification requires minimal changes in diet or farming practices, which can ease adoption and improve food utilization. Farmers can continue to grow and eat rice or maize as they have done for decades, while benefiting from increased vitamins and minerals. Like FSFs, biofortified crops require few inputs, so they can be deployed in hard-to-reach rural populations, improving food access. Additionally, as exemplified by Iron Pearl Millet in Figure 3, FSFs themselves can be biofortified, enhancing their already formidable nutrient profiles.
How effective have biofortified crops been? A review of HarvestPlus programs between 2003 and 2016 indicated that biofortification programs are cost-effective and that the crops significantly reduce micronutrient deficiencies. Biofortified crops can contain substantially more nutrients than their unmodified counterparts, even compared with the naturally occurring strains with the highest levels: zinc wheat contains 50% more zinc than the best-performing wheat cultivars, and some strains of Iron Pearl Millet contain 62% more iron than the most nutrient-dense naturally occurring millet. In both cases there are no yield losses. If HarvestPlus is to be believed, regular consumption of biofortified crops can meet up to 100% of daily vitamin A requirements, 80% of iron requirements and 70% of zinc requirements.
Critics of the biofortified crops movement argue that evidence on their effectiveness has been slow to accumulate, despite $500M being invested in projects such as HarvestPlus in recent years. They argue that biofortified crops have lower genetic diversity, which makes them vulnerable to disease. Moreover, biofortification reduces the need for dietary diversity, which may lead to fewer crops being cultivated in a region and could threaten food stability. "Biofortification vs dietary diversification" appears to be a hotly debated topic amongst scholars. Critics also point out that biofortified crops typically have somewhat lower yields, though this is often also true of Future Smart Foods. Lastly, biofortification through genetic engineering (known as transgenic biofortification) has encountered significant political opposition from anti-GMO stakeholders in the EU and China.
While all these critiques have some grain of truth, none seem particularly damning, provided biofortified crops are not treated as the only solution to food security. Furthermore, biofortified crops still compare favorably to industrially farmed livestock in several ways. The disease risk from lower genetic diversity in biofortified crops likely pales in comparison to the known genetic diversity risks in industrial farm animals (see this section). Non-GMO biofortified crops should be considered as another 'tool in the box', alongside traditional fortification and supplementation, as well as dietary diversification towards Future Smart Foods.
The case of vitamin B12
As noted above, there are no significant plant sources of vitamin B12. I was also unable to find any case studies of Future Smart Foods or biofortified crops that contained B12. This means that ensuring the global population gets sufficient B12 without consuming significant amounts of animal products may be challenging. Two viable solutions exist:
- Traditional fortification: It is common in High income countries to fortify breakfast cereals and milk alternatives with Vitamin B12. Many foods can be fortified with B12. It seems possible that through high levels of fortification, 100% of a person’s B12 requirements could be met with as few as 4 slices of fortified bread (See Appendix 1).
- Supplementation: this is common in higher income countries. If B12 tablets are unsuitable for certain populations, it is possible to meet B12 requirements with a twice-annual injection. Many people are unaware that industrially farmed animals often only contain B12 because they themselves are given B12 supplements.
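The arithmetic behind a fortification claim like the one above is simple to sketch. The per-slice fortification level below is a hypothetical illustrative figure chosen for this sketch, not the value used in Appendix 1; the daily requirement is the commonly cited adult reference value.

```python
import math

# Hypothetical sketch of the fortified-bread arithmetic.
B12_REQUIREMENT_UG = 2.4  # micrograms/day; commonly cited adult reference value
B12_PER_SLICE_UG = 0.6    # ASSUMED fortification level per slice (illustrative)

slices_needed = math.ceil(B12_REQUIREMENT_UG / B12_PER_SLICE_UG)
print(slices_needed)  # 4 slices, at this assumed fortification level
```

The point of the sketch is that even a modest fortification level per slice puts full B12 coverage within reach of a staple food eaten daily.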
It should also be noted that B12 deficiency is less of a concern among the global health community (see, for example, the WHO) than deficiencies in nutrients such as iodine, iron, zinc and vitamin A. Whilst an important issue to tackle, we should not miss the bigger picture: nearly all micronutrient needs worldwide can be met through plant sources.
The risk of foodborne illness from animal products
Another vital but underappreciated aspect of food security is food safety. The global poor often face a “food security catch 22”: they are deficient in key micronutrients, but the foods that would nourish them could instead make them sick or even kill them. The World Bank's Safe Food Imperative estimates that foodborne disease (FBD) costs low and middle income countries $110B a year in medical expenses and lost productivity.
Unsafe food can directly compromise food utilization and health, often by causing diarrhea. A study in 137 developing countries found that diarrhea is the third biggest cause of stunting in children, responsible for 13.2% of cases. When nutritious foods cause diarrhea, any health benefits are lost. Lab testing reveals that just 10 pathogens may cause 61% of diarrhea cases, including Shigella, rotavirus, E. coli, Campylobacter jejuni and norovirus (though not all of these are transmitted through contaminated food).
Animal source foods (ASFs) are key sources of foodborne disease; as a result, nutrients from animal products may be lost to diarrhea. ASFs account for approximately 35% of the total burden of foodborne disease, meaning that over a third of the sickness, death and economic damage caused by foodborne disease can be traced back to animal products. As animal products make up only 19% of humanity’s calories (and 38% of protein), this arguably suggests that ASF calories carry outsized disease risk compared to other calories. Even this underestimates the risk, as ASFs represent a much smaller share of calories and protein in the countries that suffer the highest levels of foodborne disease. Dairy alone may account for over 4% of global food-related sickness, and 90% of the health burden of Campylobacter, noted above as a key cause of food-related illness, can be traced to ASFs. A study of 48 countries found that countries that produce more meat have more deaths from FBD than countries of similar wealth and population size. Common diseases caused by tainted or spoiled ASFs include non-typhoidal Salmonella, the aforementioned Campylobacter, and Taenia solium; together, these three pathogens account for 70% of the total global burden associated with ASFs.
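The "outsized risk per calorie" claim can be made concrete by dividing the burden share by the calorie share. This is a crude indicator built from the global shares quoted above, not a formal attributable-risk calculation.

```python
# Rough sketch of the disease-burden vs dietary-share comparison above.
FBD_BURDEN_SHARE_ASF = 0.35  # share of foodborne-disease burden from ASFs
CALORIE_SHARE_ASF = 0.19     # share of humanity's calories from ASFs
PROTEIN_SHARE_ASF = 0.38     # share of humanity's protein from ASFs

# ASFs carry roughly 1.8x the burden their calorie share would predict:
burden_per_calorie_share = FBD_BURDEN_SHARE_ASF / CALORIE_SHARE_ASF
print(f"{burden_per_calorie_share:.1f}x")  # 1.8x

# Per unit of protein, by contrast, the burden is roughly proportional:
burden_per_protein_share = FBD_BURDEN_SHARE_ASF / PROTEIN_SHARE_ASF
print(f"{burden_per_protein_share:.2f}x")  # 0.92x
```

As the text notes, the per-calorie ratio likely understates the disparity in the hardest-hit countries, where ASFs make up a smaller share of the diet than these global averages.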
Unsurprisingly, the majority of this burden is concentrated in lower and middle income countries, where food safety standards and food hygiene are often poor. One review revealed some startling statistics: in India, just 6% of pork samples in Nagaland were found to meet food safety standards, and 0% of milk samples in Assam. Another study found that only 2% of meat samples in Nigeria complied with safety standards.
While LMICs may struggle to produce animal products that are safe to consume, importing from richer countries with (presumably) higher standards may not help either, because HICs typically export their lowest quality animal products abroad. Brazil, the world's largest beef exporter, had a scandal in 2017 in which some companies were caught bribing inspectors to ignore quality issues, such as using chemicals on expired meat. It is unknown how much FBD this caused, but it is likely significant. And because imported meat is typically frozen, countries with poor cold supply chains present multiple opportunities for FBD to develop.
In sum, if countries choose to go down the path of meeting nutritional deficiencies through animal products, they must also be prepared to make strong investments in food safety. Otherwise, any food utilization benefits may be nullified. When considering potential nutritional benefits of increased ASF consumption in LMICs, decision makers should also factor in possibly counteracting effects of foodborne illness. Alternatively, it may be preferable to invest in plant-based nutrition solutions that are less likely to have this problem.
Other human disease implications from animal agriculture
Industrial animal agriculture can also undermine global food security by exacerbating antimicrobial resistance (AMR) and increasing the risk of pandemics. In intensive farming systems, animals are regularly given antibiotics to keep disease at bay and increase productivity. This widespread use of antibiotics accelerates the development of AMR, where bacteria evolve to withstand antibiotic treatments. As a result, infections that were once easily treatable become life-threatening, posing a severe public health risk. The World Health Organization (WHO) has warned that AMR could lead to 10 million deaths annually by 2050, surpassing the mortality rates of cancer and other major diseases. This public health crisis can strain healthcare systems and divert resources away from food production and distribution, further destabilizing food security.
Moreover, there is substantial evidence that the high density of animals in intensive systems facilitates the rapid spread and mutation of viruses, increasing the likelihood of zoonotic outbreaks that spill over into human populations. 75% of all emerging diseases are zoonotic, meaning they originated in animals. As was painfully clear during the COVID-19 pandemic, zoonotic outbreaks can disrupt food production and supply chains, leading to reduced food availability and increased food prices. The economic fallout from such health crises can also reduce household incomes and purchasing power, reducing food access. Thus, the practices inherent in industrial animal agriculture not only threaten public health but also undermine the stability and resilience of global food systems.