Tuesday, September 17, 2019

A Brief History of Nutrition in the West

I had the recent opportunity to contribute to my friend David Hauser’s new book, Unstoppable: 4 Steps to Transform Your Life. You may know him as the co-founder of Grasshopper, and he’s also an incredibly knowledgeable source on health and wellness.

The book is out now, and to celebrate I wanted to share the excerpt with you all. Pick up a copy of the book to learn more about how to optimize your life starting with your health, and enjoy!

***

To dispel the myths that fat is bad and that red meat causes cancer, it’s vital that I give you a brief history of nutrition in the West, since this will help you to understand where some of the conventional wisdom surrounding these topics originated. What you’ll see is that many of the nutrition problems we face today are directly related to what David was talking about earlier: the meteoric rise of sugar intake among Americans coinciding with the meddling of special interests and big business in our food supply. These major shifts, which I’ll cover in a moment, had a significant influence on the way we think about food, and how we consume it.

The prophets of the 1800s. Thanks to the work of people like William Banting, a popular English undertaker who found a dietary solution for his own obesity, the general consensus during this time was that someone who was overweight should reduce his or her sugar intake. It was also recommended to decrease the intake of starches, like potatoes and bread, and to focus more on meat, butter, cream, and vegetables. Through both personal experimentation, like Banting’s, and experimentation in clinical settings, people found that the low-carb, no-sugar approach did indeed work. All of this was conventional wisdom and noncontroversial. It wasn’t until the 1950s that people started to shift away from these recommendations, which were actually on the right track.

The early 1900s and the rise of industrialization. For a long time, there weren’t any dietary recommendations in Europe or the United States. Dietary recommendations finally started to emerge in the early 1900s, when people began to understand that a number of diseases, like goiter—the abnormal enlargement of the thyroid gland—were due to deficiencies, like low iodine. A number of other conditions were caused by various B vitamin deficiencies. And so, we started learning about vitamins and other essential nutrients. We started fortifying some foods. In many ways, things improved, and problems were addressed. But it was also during this time that a number of toxic things happened simultaneously, including politics getting into bed with the industrialization of our foods. There were some researchers at the time who were making progress in studying low-fat, vegetarian diets—for moral reasons, more than anything else—encouraging people to eat more seed oils and more grains. But their findings, which were pure in origin, suddenly became politicized and economically incentivized, as the government started distributing farm subsidies with a focus on producing long-shelf-life products from refined grains.

The mid-1900s and Ancel Keys. I recently wrote a piece called “Lies, Damn Lies, and Statistics” to go along with one of my books. The focus of the piece is to highlight how everything became derailed in the mid-1900s regarding dietary recommendations (my fellow contributor, Dr. Zoë Harcombe, will explore this more later). All of this starts with the well-known researcher and biochemist Ancel Keys. Keys had a charismatic personality and was a dominating force in the academic world. He produced some compelling research claiming that dietary fat intake was increasing cardiovascular disease, but he also omitted a fair amount of detail from this research. The ideological nemesis of Keys was another researcher named John Yudkin, who was producing some fascinating research saying just the opposite of his rival—that it was sugar causing cardiovascular disease, not fat. Keys, however, was a pugnacious individual who was very outspoken and intimidating, and managed to get in the ears of some folks on the governmental committee tasked with making dietary recommendations. Keys was also one of the first academic researchers who used personal attacks on his rivals instead of arguments based on facts. Because of his strong personality and research, Keys shouted the loudest, and so, his voice was heard. Amidst all this hoopla, Yudkin’s research went very much ignored. The rise of Keys was an interesting confluence of some dodgy science and a quasi-religious dietary approach, along with some people having been in the right place at the right time to push forward health policy that said a low-fat (meaning low-animal-fat) diet was the solution to being overweight.

The 1960s and 70s: farm subsidies and junk food. There was a real countercultural movement in the 1960s where people started to move away from traditional American foods like steaks, animal fats, and butter—these were foods that were held in low regard with the rise of Ancel Keys’ findings. It was also around this time that, in order to get elected, Richard Nixon, desperate to secure the conservative vote, promised to dramatically expand the farm subsidies program, a program that had been languishing since World War II, when farmers were given aggressive incentives for the war effort. When Nixon became president, farmers started producing a ton of corn and wheat, food that was high in carbs, funded by the government. This increase in production, however, resulted in too much food; around the country, there were storehouses of food just rotting away. It was also around this time that some researchers in Japan perfected a process for extracting high-fructose corn syrup from corn and began to use it as a very inexpensive sweetener—something far cheaper than cane sugar and about one and a half times sweeter than regular sugar. Cheap and tasty. So, once again, this time period was an interesting confluence: charismatic individuals in the scientific community suggesting to the public that they reduce fat and eat more grains (as long as food was “low fat,” it was deemed to be okay), the production of more grains because of government subsidies, and the subsidization of a whole process that created an inexpensive sweetener that could be funneled into the junk food industry.

The 1980s: Robert Atkins and Dean Ornish. Robert Atkins, known for the famous Atkins diet, first learned about low-carb dieting from a US Air Force manual written for pilots who were too overweight to fly and needed help slimming down. A low-carb diet meant removing things like potatoes, bread, and beer, and eating more animal fats. For Atkins, it worked, and he became a megaphone for this type of dieting. He had a decent run, too; that is, before Dean Ornish—who piggybacked off the work of Nathan Pritikin, another researcher and an advocate of low-fat, high-fiber dieting—stepped onto the scene near the end of the twentieth century. Ornish conducted some integrative studies where he performed cardiac imaging on people who had stopped smoking, were regularly meditating, regularly exercising, and were on a low-fat diet. The imaging suggested that they had found a formula to reverse cardiovascular disease. The problem, however, was that the error in this imaging was greater than the claimed improvement in the cardiovascular-diseased state. In other words, if you took one person and performed this cardiovascular imaging ten times, the variation between scans would be larger than the improvement being claimed. This should have been shot down immediately—and many in the scientific community did raise concerns—but once again, there was a reinforcing movement and understanding that eating low-fat foods was the way to defeat heart disease. Big assumptions were made. Counterpoints were ignored. Sugar intake increased under the banner of low-fat dieting. This was also the origin of vegan and plant-based diets that had the backing of large religious organizations and even hospitals with religious roots.

Gary Taubes and the 2000s. Gary Taubes made a splash in the early 2000s when he published a piece called “The Soft Science of Dietary Fat,” where he explored the claims that had been made about cardiovascular disease and fat intake. Taubes found what so many others had: that those claims about low-fat dieting didn’t measure up to the science. His hypothesis was that the main driver of obesity—and, by extension, cardiovascular disease and Type 2 diabetes—was insulin. Insulin is mainly driven by carbohydrate intake, so he became an advocate of low-carb dieting. I think that on a prescriptive level the recommendation of a low-carbohydrate diet is really on point, in that it’s a way that you can get people to reduce their total caloric intake. Taubes made some claims that calories didn’t matter if you kept carbohydrates within certain levels, which I don’t believe is backed up by science, and which flies completely in the face of our evolutionary biology. No matter; through his work and findings, Taubes has helped to bring a number of truths back into a conversation that has been hijacked by big business and politics.

Paleo and keto, the low-carb, high-fat diets. So, in the context of this history, let’s briefly touch on the two primary low-carb, high-fat diets today that I believe to be effective, paleo and ketogenic (“keto”):

Paleo. Modern researchers and medical professionals who learned about the paleo approach asked a simple question: what if features of our modern world are at odds with our ancient genetics? The so-called paleo diet concept was born through the observations of dozens, if not hundreds, of anthropologists and medical experts. They realized that hunter-gatherer groups were largely free of modern degenerative diseases. These people were remarkably healthy despite an almost complete lack of modern medical interventions. By exclusion, the paleo diet suggests one should generally minimize or avoid grains, legumes, and dairy. Why? Because these foods are “evolutionarily novel.” This means they’re relatively new to our species and therefore may present problems for some people. By inclusion, this means the diet is composed of lean meats, seafood, fruits, vegetables, roots, shoots, tubers, nuts, and seeds.

Ketogenic. To make a long story short, researchers in the 1920s and 1930s noticed that patients with severe epilepsy had remarkably fewer seizures when they were fasting. That’s because when they fasted, it depleted liver glycogen (a stored carbohydrate), shifting the body into a state of ketosis. In this state, ketone bodies (produced from fat) are used in place of glucose for most energy needs, in particular by the brain. Your brain can shift nearly two-thirds of its normal glucose-dependent metabolism to one fueled by ketones, which provide a much more stable energy source. Although the ketogenic diet was born of a need to help epilepsy, many people observed that low-carb diets were exceptionally effective for fat loss. Names like Banting, Atkins, and others have popped up over the years, offering both effective weight loss strategies and controversy. A low-carb, high-fat diet has generally contradicted the recommendations of many health authorities and governmental agencies.

Now that you have a snapshot of our dietary history, I’ll break down the myths that fat is bad and that red meat causes cancer. You’ll see that they’re directly related to where the West has gone wrong in its scientific history as it pertains to nutrition.

Myth: Fat Is Bad

The problem with a lot of the conclusions about fat that have been made in our short history as humans is that the research that has been conducted showed correlation between fat intake and cardiovascular disease, but not causation. This type of research—which is called “epidemiological research”—requires that large, diverse samples be collected, samples meant to reflect the whole population. This was incredibly useful for the study of tobacco because it revealed an insanely powerful correlation between tobacco and cancer. The impact of fat intake on cardiovascular disease, however, was much more nuanced. The conclusions that were made about causation were irresponsible, and motivated by serious financial incentives.

Here’s an example of the nuance and complexity behind the epidemiological research gathered about fat intake: in our history, people tended to eat more fat—and more animal fat—as they became wealthier. But as they became wealthier, they also exercised less and also tended to drink more alcohol because they could afford it…and they also tended to smoke more because they could afford it…and they tended to eat more sugar because they could afford it. It was true that these people were more at risk for cardiovascular disease; but this was because of the array of foods and substances they were able to consume because of their wealth, not necessarily because of their fat intake. Nonetheless, this was the only thing that was focused on in the studies of researchers who wanted to demonstrate causation. This no-fat propaganda was also pushed by a quasi-religious, vegetarian, and vegan-oriented agenda on the part of people like Ancel Keys, and then supported by shifts in governmental orientation and the subsidization of the American food supply, making for a multifactorial story as to how this “conventional wisdom” of decreasing fat intake emerged.

Reducing fat and saturated fat resulted in increases in consumption of carbohydrates, sugar, and vegetable oils. It’s generally true that people have increased fat consumption over time, but what’s missed in this narrative—and what’s new to our story as human beings—is that fat is also part of processed carbohydrate foods: things like baked goods, snack foods, and pretty much anything that is processed, filling the aisles at supermarkets. It’s much more complex than people just adding butter to a steak or broccoli. These refined carbohydrates are in the “low-fat” things that we eat. The US Dietary Guidelines—driven by the farm subsidies programs, academia, dietetics, and medical associations—have historically bought into the notion that low-fat diets are the most beneficial, healthiest ways to eat. It’s a multifactorial problem.

For example, do you remember SnackWell’s? It was a “zero-fat food” but was made of flour, high-fructose corn syrup, regular sugar, and some cocoa. It was the furthest thing from healthy, but it had the backing of a major health association because it didn’t have any fat. Well, people listened to what “experts” concluded about these foods and consumed products like this because they were endorsed by the government as healthy. In other words, people listened to the recommendations, and food producers made sure plenty of such products were available to meet the demand those recommendations created.

All of this has resulted in an increase in obesity, chronic disease, and a higher mortality rate among relatively young people. Some might argue that this is primarily driven by our carbohydrate intake and chronically elevated insulin levels, which lead to insulin insensitivity, but the best science that we have at this point, in my opinion, is that people are overeating. And the drivers for overeating are these complex, hyper-palatable, highly processed foods that are high in sugar and carbs. For example, in my second book, Wired To Eat, I talk about a principle prevalent in the processed foods industry that I call “Doritos Roulette,” the idea that within every Doritos bag are chips that taste somewhat different. You might even see a sentence on the bag that reads something like, “Caution: Some chips are extremely hot.” So, some chips are hot, some are mild, and some are medium. This is intentional on the part of the manufacturer, exploiting within you what has been called the “Buffet Effect” or the “Dessert Effect”—the idea that if people have more variety in food consumption, they will inevitably eat more. You have probably been to a dinner party before where you felt completely stuffed but then still, somehow, made room for dessert. Similarly, because of the variety, the Doritos Roulette approach makes the product incredibly addictive, which leads to overeating. Never in the history of civilization have we had so much variety so readily available to us for every meal or snack.

In lockstep with the refining of carbohydrates, particularly corn, what we ended up with was cornstarch, which was then used to produce high-fructose corn syrup and corn oil. Whereas before, butter or coconut oil might’ve been used on something like popcorn, they were replaced with highly polyunsaturated vegetable oils (canola oil, corn oil, safflower oil) because saturated fats and animal fats were deemed unhealthy. However, since these types of vegetable oils go rancid easily, a process called hydrogenation was developed, which turns these polyunsaturated fats into more stable fully hydrogenated (saturated) fats and partially hydrogenated fats. The fats created through partial hydrogenation are called “trans fats.”

Biology doesn’t make trans fats except in some very rare circumstances, like conjugated linoleic acid (CLA), found in the meat and dairy products of grass-fed ruminants, which is actually a highly beneficial fat. But those were the only kinds of trans fats that human bodies had experienced; and so, all of a sudden, people shifted from eating no—or few—trans fats in their diet, to eating massive portions of trans fats. To make things worse, these polyunsaturated fats tend to be loaded with omega-6 fats; humans haven’t historically eaten huge amounts of these either. Our biology hasn’t prepared us for the direction of the food industry. So, whether they’re hydrogenated or not, humans are suddenly eating an increased amount of these types of fats, which we now understand are very pro-inflammatory and cause a lot of metabolic problems. Yet they were incredibly inexpensive, fantastic in baked goods, helped to stabilize shelf life, tasted really good, and had a very benign, mild flavor, which wasn’t overpowering like something like coconut oil can be. They were very beneficial in the food manufacturing scene, but ended up being a disaster from a health perspective. Once again, it was money that was running the show and calling the shots.

Science has shown all along—with more focus lately—that natural fats that are solid at room temperature and don’t require processing are actually good for the body, and might even be protective to the body. But all along, America’s dietary recommendations have been derailed.

In 1961, Americans saw Ancel Keys featured on the cover of Time, which essentially communicated that fat was bad, dangerous, and even lethal. In 1984, it was a cover of a breakfast plate with two eggs as eyes and a strip of bacon as a frown, warning readers about cholesterol and its relation to fat consumption. And then, in 2014, it was a cover with a yellow headline that read, “Eat Butter,” with a sub-headline saying, “Scientists labeled fat the enemy. Why they were wrong.” If you take a closer look at the studies that were used to show that fat is bad, what you’ll find is that many of them show a stronger correlation between heart disease and sugar intake rather than fat. And the truth is that we still do not have the answers, but it’s apparent that conventional wisdom is far from 100 percent correct, as the Time covers reveal.

Myth: Red Meat Causes Cancer

The myth that red meat causes cancer can also be traced back to a poor epidemiological study. The study was rooted in data from questionnaires where people were asked a variety of questions about their diet, like, “What did you eat yesterday?” or “What did you eat last week?” or “What did you eat more than anything else last year?” All the data hinged upon people self-reporting, and answering all kinds of vague questions, where their answers were most likely affected by what they thought the researcher wanted them to answer. Studies based around self-reported feeding logs should not even be considered legitimate. And, once again, causation was argued, even though even a claim of correlation would have been irresponsible.

The claims that have been made to support this myth are interesting, to say the least. One was based on research in China that argued that, as the Chinese got wealthier, they ate more red meat, and that is why there was an increase in cancer. Once again, there could be a correlation, but causation is a foolish conclusion. As I discussed earlier, as people industrialize, they get wealthier, don’t exercise as much, don’t go outside as much, have the financial flexibility to consume alcohol, tobacco, or drugs, and tend to eat not only more meat, but also more sugar and more refined foods. As you can see, there are so many different factors.

Another claim that has been made through a recent study is the connection between red meat and colon cancer. However, there is a real difference between relative risk and absolute risk. To give some background here, within the United States, everybody, in theory, has a 5 percent risk of developing colorectal cancer in their lifetime. The claim within the anti-meat scene is that eating red meat or processed meats increases your likelihood of colorectal cancer by 20 to 25 percent. But in saying this, what they do, for the sake of making headlines and portraying things as scarier than they actually are, is ignore the absolute risk.

Well, guess what?

The difference between 5 percent and 6 percent is about 20 percent—this is the distortion that can happen when you alternate between absolute and relative numbers. That is to say, it’s easy to make a number sound very large by quoting the percentage increase or decrease. In this case, to go from 5 percent to 6 percent would most likely entail consuming large amounts of red or processed meat each day for the rest of your life, which virtually no one does. A lot of these studies have clear motives behind them, coming from proponents of vegan and plant-based diets—movements that have become socially connected with morality and that have a lot of money behind their campaigns to influence popular culture.
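The arithmetic behind that distinction is simple enough to check for yourself. This sketch uses the figures quoted above (a 5 percent baseline lifetime risk and a claimed 20 percent relative increase); the variable names are just for illustration:

```python
# Relative vs. absolute risk: how the same numbers yield different headlines.
# Figures are the ones quoted in the text above (5% baseline, ~20% relative increase).

baseline_risk = 0.05              # absolute lifetime risk of colorectal cancer
relative_increase = 0.20          # the "20 percent higher risk" from the headlines

new_risk = baseline_risk * (1 + relative_increase)   # 0.06, i.e. 6%
absolute_increase = new_risk - baseline_risk         # 0.01, i.e. 1 percentage point

print(f"Baseline absolute risk:  {baseline_risk:.1%}")
print(f"New absolute risk:       {new_risk:.1%}")
print(f"Absolute increase:       {absolute_increase:.1%}")
print(f"Relative increase:       {absolute_increase / baseline_risk:.0%}")
```

The headline quotes the last number (20 percent); the change a person actually experiences is the third one (1 percentage point).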

A related claim has also been made that eating animals is not only bad for people but is also bad for the environment. Addressing this claim can be a daunting task, as many who challenge this notion—one that, by the way, is directly related to global warming—are immediately viewed as right-wing wackos who are ignorant of science. Hopefully you can tell by now that I’m actually trying to keep the science accountable and point out the legitimate flaws in a lot of the claims that have been made. So, to dive in, there’s a claim that the production of animal products is a big vector for greenhouse gases like carbon dioxide and methane. There’s some truth to that. But this is mainly in the context of the industrialized feedlot process. When they put cows on the feedlot system, what they’re doing is feeding them grains—and these grains have been grown using fossil fuels.

How this works is that farmers grow corn, wheat, barley, or whatever it may be; it gets raised; it gets processed; it gets shipped around the country or around the world; and then the leftovers get put into these feedlots. But when you look at the flip side of this and consider a holistically managed practice of raising ruminants on grass, you have a very tight loop: the sun shines on the grass; the grass grows, sequestering carbon dioxide and taking up nutrients; and then ruminants—animals with a four-chambered stomach and a digestive system made to break down vegetation that humans can’t consume—eat the grass and grow in turn. And then, eventually, they’re either eaten by humans or predators—or the animal just dies.

When you look at the total energy input and carbon footprint of pastured, holistically managed meat and animal products, it is completely different from that of conventional animal husbandry. Part of what happens when the plants are taking in sunlight and using carbon dioxide and water to make sugars and starch is that some of those sugars and starches go underground and feed bacteria and fungi. This is soil. Part of what the fungi do is mine minerals, which they share with the bacteria and, in turn, with the plants. And so, in areas where there are grasslands, like the steppes of Siberia, which are very similar to the North American grasslands, the roots of these plants can venture many feet into the ground. The American Plains used to have deep topsoil, and much of it has been lost because of conventional farming practices, which are row-crop based—and row crops of grains and legumes are the centerpiece of the vegan diet. The way that grains and legumes are grown in row-crop fashion is unsustainable. So, the irony is that a vegan’s recommendation would accelerate climate change and accelerate the loss of topsoil. The United Nations put out a report suggesting that the world at large has sixty years of topsoil left. And, once the topsoil is gone, our ability to feed ourselves would effectively be gone.

But, if people understood the way that this holistic management could occur, we could be producing huge amounts of food and sequestering carbon, potentially returning carbon levels to pre-industrialized atmospheric levels in this virtuous cycle of food production and soil restoration. But, it’s a really complex story, and it is virtually political suicide to even suggest that you could use effective animal husbandry to address climate change. Anyway, all of this is related to the myth that red meat causes cancer and that eating animals is bad for both humans and the environment.



from The Paleo Diet https://ift.tt/305xRyz
