The Thousand Secret Ways the Food is Poison: Part II
Auxotrophs, autotrophs, and some remarkably compelling statistics.
Last time, we talked a bit about the biochemistry of how glyphosate (also known as Roundup) works, why you probably shouldn’t be eating nearly as much of it as you are, and the beginnings of how to cut it out of your diet. Eventually, this series of posts will get into other kinds of food contaminants—things like phthalates, mycoestrogens, other endocrine disruptors—but so far we’re only halfway through #1 of 1000.
To recap: glyphosate is a systemic herbicide, or weed killer. It works by blocking an enzyme that humans don’t have (our genomes don’t encode it), which is why it’s assumed to be safe for use on food crops: in theory, we should be able to eat as much of it as we want without ill effects. In a system as complex as the human body, however, theory and practice are often worlds apart.
The trouble is that a lot of our gut bacteria DO have the enzyme it interferes with, and it turns out that it’s necessary for some very important things—like producing the essential precursors for serotonin, dopamine, and norepinephrine.
But how much are we really eating? To get a sense of this, it helps to understand the way it’s used in agriculture.
Consider the Following
If I treated an apple tree with enough roundup to kill it, and then plucked one of the withered, dead apples from its branches and offered it to you…would you eat that apple? But if I sprayed a field of sugarcane with enough roundup to kill the whole thing, then chopped it down, processed it into sugar, and sold you a bag—you might stir some into your tea, if you didn’t know what I’d done to it.
This is how things are done in modern agriculture, under the euphemism of “preharvest desiccation”. It’s used for grains, oilseeds, beans, lentils—if it’s a crop that can be harvested “dead”, you kill it with weed killer first, because it reduces spoilage and improves yields.
It’s easy enough to understand why your gut bacteria have a hard time growing on something harvested this way: a plant which has absorbed enough enzyme-inhibitor to kill it necessarily contains the chemical at high enough concentrations to inhibit the enzyme that it targets. Glyphosate is a pretty stable molecule, so storage, cooking, and processing don’t seem to change that fact.
Herbicide-tolerant GMO crops present a similar issue: the only difference is that now, the crop stays healthy-looking even if it’s absorbed an amount of glyphosate that would otherwise be lethal to it. And the whole point of hacking herbicide tolerance into your crops is so that you can spray the entire field with enough to kill the weeds.
The upshot is that tens of thousands of metric tons of glyphosate make their way into the food chain every year, and into our bodies.
The Epidemiology
The introduction of roundup into the food supply en masse began in about 1993, and really kicked into high gear around 1998 with the widespread adoption of “roundup-ready” crops. This coincided with a sharp increase in the incidence of a startling number of chronic diseases. The authors of this paper enumerate 22 in total, including dementia, diabetes, autism, hypertension, gastrointestinal infections, deaths due to stroke (which I guess is only “chronic” in the sense death is the most chronic of all conditions), and several types of cancer. In a lot of these diseases, the incidence roughly doubled in the 20 years between 1990 and 2010. For some indicators, like the number of age-adjusted deaths from Alzheimer’s, there was a four-fold increase.
So the study’s authors pulled data from USDA and public health agency databases, looked at the amount of glyphosate applied to food crops, and compared it to the incidence of these various diseases.
For all of them, they found linear relationships. That is to say: when the amount of roundup in the average American’s diet changed from year to year, the incidence of the disease changed by a more-or-less proportional amount.
The correlation coefficients on these relationships are almost unbelievably strong. A correlation coefficient tells us how tightly linked two variables are; its square tells us how much of the variation in Y can be explained as a function of the variation in X. A correlation coefficient of 0 implies no relationship at all, while a correlation coefficient of 1 means a perfectly straight line, which implies that either Y or X is totally dependent on the other, or they’re both totally determined by some third thing.
Now, obviously we’ve got to pause here to give heed to the skeptic’s refrain: correlation does not prove causation. With a large enough dataset (and careful choice of parameters, like which years you include in your analysis), you can find surprisingly strong correlations between things that almost definitely have no bearing on each other.
This sort of sleight-of-hand is easiest when two things are both increasing at a relatively constant rate over time, like in the graph above, because any two perfectly straight lines going in the same direction have a correlation coefficient of 1. Natural fluctuations from year to year help weed out the strongest false signals, but still leave room for some.
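To make that last point concrete, here’s a minimal sketch in Python (the variable names and growth rates are invented purely for illustration): any two quantities that each grow at a perfectly constant rate correlate at exactly r = 1, no matter how unrelated they are.

```python
import numpy as np

years = np.arange(1990, 2011)

# Two made-up, unrelated quantities, each growing at a constant yearly rate
cell_phone_subscriptions = 10 + 5 * (years - 1990)
organic_food_sales = 200 + 37 * (years - 1990)

# Pearson correlation between the two series
r = np.corrcoef(cell_phone_subscriptions, organic_food_sales)[0, 1]
print(round(r, 6))  # 1.0: any two straight lines sloping the same way
```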
In any case, this is what the p-value is for, in statistical analysis. It tells us the probability that a correlation at least that strong would arise by chance alone, if there were no real relationship between the variables. The customary cutoff for deciding whether an association is “statistically significant” is 0.05, or one in twenty.1
When you see correlations like the one shown above, they’re usually hovering right around this significance threshold. More than anything else, the above graphs represent a lesson not to unquestioningly trust “p<0.05” as a marker of legitimacy: someone might, just for kicks, be comparing massive numbers of variables in hopes of defaming Nicolas Cage. To hit p<0.05, you might only need to check twenty random datasets.
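A quick simulation makes the point. Everything below is pure noise with no real relationship anywhere; the “target” series is just a stand-in for any yearly measurement, and the numbers are illustrative, not from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 21  # a 1990-2010 window, like the study's

target = rng.normal(size=n_years)  # stand-in for some yearly measurement

def r_of(noise):
    """Absolute correlation between the target and a random dataset."""
    return abs(np.corrcoef(target, noise)[0, 1])

# Build the null distribution of |r| against pure noise, then find the
# cutoff that only 5% of random datasets exceed -- i.e. "p < 0.05"
null = np.array([r_of(rng.normal(size=n_years)) for _ in range(10_000)])
r_crit = np.quantile(null, 0.95)
print(round(r_crit, 2))  # ~0.43 for n = 21

# Go fishing: a batch of twenty random datasets will, on average,
# yield about one "statistically significant" correlation
hits = sum(r_of(rng.normal(size=n_years)) > r_crit for _ in range(20))
```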
So typically, when someone is trying to foist a bogus correlation on you, either the correlation coefficient will be so-so, or the p-value will be right around 0.05, or both. So what do these figures look like in the glyphosate<>disease dataset?
The p-values on these correlations are all better than one-in-ten-thousand, many lower than one-in-a-hundred-million. That is to say: to find a random variable that correlated this well with the amount of roundup in the food by chance, we’d have to check anywhere from tens of thousands to around a hundred million datasets, depending on the disease. For each of these diseases independently to have correlation coefficients and p-values that strong, I’m comfortable saying the odds that we’re seeing something coincidental here are essentially zero.
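The back-of-the-envelope arithmetic here is simple enough to sketch in a few lines of Python (just the rule of thumb, nothing from the study itself): the expected number of random datasets you’d have to sift through before one clears a given p-value threshold by chance is roughly 1/p.

```python
# Expected number of random datasets to check before one clears
# a given p-value threshold by chance: roughly 1/p
for p in (0.05, 1e-4, 1e-8):
    print(f"p = {p:g}  ->  about {1 / p:,.0f} datasets")
```

So a p-value of one-in-a-hundred-million corresponds to dredging through on the order of a hundred million random datasets before chance alone hands you a match.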
These days, it seems like everyone only remembers the first half of the skeptic’s refrain, and leaves off the coda, the caveat. Correlation does not prove causation, it only suggests a causal link.
That link can be complicated, but the whole point of statistics is that it lets us de-noise the data to determine when we’re seeing a genuine signal. Statistics don’t lie when they say “there’s something here”. Figuring out what that thing is, that’s the part that’s up to us. That’s where narratives get introduced and models of causality have to be sorted through.
Consider the fact that, in 2014, the number of times Jennifer Lawrence’s name was mentioned in the media correlated pretty nicely with the Dow Jones Industrial Average.
From looking at the graph, you can see that it’s not as simple as “two perfectly straight lines pointing the same direction”—there are dips and rises, and they genuinely do seem to track one another. It’s “statistically significant”.
So: Does Jennifer Lawrence’s stardom control the future of the stock market? Should we be subsidizing films featuring her, to ensure the country’s economic prosperity? Probably not.
But what if it’s the other way around? There are only 24 hours in a cable-news day, and coverage of genuine news (like a sudden downturn in the economy) has to supplant something. Puff pieces about the celeb du jour are a logical choice.
So it’s probably not “X influences Y”, but “Y influences X” seems plausible. Even more likely is a common-cause model, where media coverage of whatever is causing the Dow to tank is also forcing out coverage of Jennifer Lawrence.
My point with all of this is that it is probably not quite so simple as “roundup causes these 22 diseases”. Maybe for some of them, it’s the mercury in high-fructose corn syrup2, and the rise of GMO agriculture facilitated the switch from sucrose to HFCS. Maybe the weird folate derivative you get when you bake enriched wheat flour with fructose impairs your body’s ability to methylate DNA and silence it, leading to those cancers. Maybe the cumulative influence of sub-toxic doses of roundup just makes you more susceptible to GI infections, which means more antibiotics, which means more multiple sclerosis.
Just like with the Jennifer Lawrence<>Dow Jones connection, understanding the mechanisms at work in the system helps us assess the potential explanations of causality for their plausibility. Which is why it’s time for…
The Biochemistry
We talked previously about tryptophan, tyrosine, and phenylalanine, the three main molecular products of the enzyme pathway that glyphosate interferes with—the shikimic acid pathway. As far as we know, glyphosate’s lethality to plants comes from its inhibition of their ability to produce these amino acids. But bacteria are different from plants in a crucial respect: bacteria have the ability to import amino acids from the environment.
Microbes are lazy (some might say efficient), so generally they won’t make something from scratch if they can just pick it up from the soup they’re swimming around in. Provided there’s adequate tryptophan, etc. in your food, you might think the amount of roundup it takes to kill a plant wouldn’t necessarily kill a bacterium.
And, to an extent, the research bears this out: a 2018 paper showed that adding tryptophan and tyrosine to the culture broth prevents roundup from being toxic to E. coli. They also did a two-week feeding study in rats, and found that—even at doses much higher than the official tolerable intake levels—it didn’t have significant effects on the abundance of most bacteria in the gut. So: case closed?
Not really. For one thing, it’s hard to generalize to humans from a two-week feeding study on rats, which eat precisely one thing—pellets of fully nutritionally balanced “chow”.3 There's also a difference between tube-feeding a rat a bunch of the stuff once a day and having it incorporated into the matrix of their food. A more recent study, which dosed glyphosate in the drinking water, found that the same doses used in the previous study DID change the gut metabolite profile and selectively alter the abundances of certain bacteria.
Another study analyzed both the fecal-microbial gene pool and the transcriptome—the landscape of genes that are actively being transcribed at a given time. On the genes side, they found that “most gut bacteria don’t have a complete shikimic acid pathway”. The problem with this is the word “most”.
Let’s say there are thirty trillion bacterial cells in my gut, from 100 different species…but twenty trillion of them are all from one Bacteroides species. Even if that’s the only species in my gut that’s sensitive to glyphosate, it’d be wrong to describe it as “1% of my microbiome”. This isn’t just a made-up example, by the way—those are close to what my actual percentages were, the one time I got sequenced.
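The arithmetic, sketched in Python with the (rounded) numbers above:

```python
# One dominant species out of a hundred: "1% of species" vs. 67% of cells
total_cells = 30e12        # thirty trillion bacterial cells in the gut
n_species = 100
bacteroides_cells = 20e12  # twenty trillion from a single Bacteroides species

share_by_species = 1 / n_species
share_by_cells = bacteroides_cells / total_cells

print(f"{share_by_species:.0%} of species, {share_by_cells:.0%} of cells")
# -> 1% of species, 67% of cells
```

The same species is “1% of the microbiome” or two-thirds of it, depending entirely on whether you count names or cells.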
Your gut probably looks somewhat similar: Bacteroides is the dominant genus in most Westerners’ microbiomes. I’m a little atypical in that it’s not usually a majority, just a plurality—more abundant than anything else. And yes, it’s got a full shikimic acid pathway—meaning roundup is not good for it.
As for the finding that the genes weren’t “turned on” in most poop samples, the problem is that the transcriptome varies from day to day. The same is true of relative abundances of various taxa. It’s unwise to draw conclusions like “ehh, they weren’t using it anyway” just because only 1 in 3 samples you tested has an active shikimic acid pathway. Scientists are starting to insist that a good microbiome study should take data at a minimum of two time points, because a gene might be inactive one day, but active the next in the same person. Consider: if the turd you’re sampling from started out as a steak, the genes for amino acid synthesis will be turned off, while salvage and uptake are going to be active. If it was an apple, which contains plenty of carbs and fiber but very little tryptophan, you’re likely to find the full biosynthesis pathway in the transcriptome of your sample.
This is part of the point of the microbiome—to act like a biochemical flywheel, to smooth out fluctuations in nutrient availability, enabling more flexibility in your diet and providing a slow-drip of things like tryptophan even if all you’ve had to eat that day is apples. In fact, because of the way the body handles tryptophan, there’s no “banking it up”: anything beyond a certain amount gets broken down by the liver before it reaches the bloodstream, so you kind of need that slow-drip.
This brings us to an important point about the physiology of depression. Remember that tryptophan and tyrosine are the essential precursors to serotonin and dopamine, respectively. Lately, people have started to question the serotonin theory of depression, but I think it’s unwise to give up the model for dead just yet. Part of the reason I say this rests on one of the most intriguing techniques in psychopharmacology: tryptophan depletion.
It's pretty much exactly what it sounds like: you take a person and put them on a tryptophan-free diet for a while. If they’re generally healthy, this won’t do too much to them; they might get a little crabby. But in patients whose depression has been put into remission by an SSRI, it’s devastating. Full relapse, it’s like the Prozac was never there. To my mind, this is consistent with a model where your average healthy person has some degree of endogenous tryptophan supply, whereas the depressed patient is fully dependent on dietary tryptophan for their serotonin production.
Depression aside, think about the implications of impairing a person’s ability to produce serotonin and dopamine except when they’ve just eaten tryptophan or tyrosine, from a classical-conditioning perspective.
You might expect something like a food addiction.
But let’s assume the most optimistic case. Let’s say we’ve got enough tryptophan, etc. in the food to keep glyphosate from being lethal to our gut bacteria, even at concentrations that inhibit EPSP synthase. And let’s say we’ve got enough left over for our bodies to make the serotonin we need. Even in this case, it’s still not a pretty picture—because amino acids aren’t the only things produced along the shikimic acid pathway.
Vitamin K
In 1967, a study on the nutrient requirements of long-term hospitalized patients found something remarkable: vitamin K is only a vitamin for some people. After a month of a fully vitamin K-free diet (patients were being fed intravenously), only some of the patients showed signs of a clotting disorder, the telltale symptom of a vitamin K deficiency.
All of the patients who got the clotting disorder were patients who’d had antibiotics during the course of their hospital stay.
As it turns out, a healthy microbiome produces enough vitamin K that it’s not necessary in the diet.4 It should be said that clotting isn’t the only thing that vitamin K is necessary for; that’s sort of the bare minimum, and higher levels may be necessary for processes like bone mineralization and a healthy antitumor immune response. Vitamin K is a downstream product of the shikimic acid pathway—so even if your microbes have survived glyphosate by scavenging the tryptophan from your diet, it might be keeping them from contributing to your circulating pool of vitamin K.
Granted, you can also get vitamin K from certain foods—but insufficiency is surprisingly common. Also, the bacteria don’t just produce it for our benefit: vitamin K and the related compound CoQ10 are both important parts of the electron transport chain, which is a big part of how gut bacteria get energy.
PABA and Folate
Have you ever wondered what happened to the other B-vitamins? If you’ve ever counted, you’ll notice some conspicuous absences—they skip right from B3 (niacin) to B5 (pantothenate), and Bs 8, 10, and 11 are also AWOL. Chemistry and nutrition science are tricky businesses, and this was doubly true in the days before we could just blast any ol’ thing through a mass spectrometer and smash it to smithereens to see what it was made of. There’s a lot of interesting history among the “missing” B-vitamins, but one we ought to talk about here is the substance formerly known as vitamin B10: PABA.
PABA (para-aminobenzoic acid) is an interesting one. It’s a byproduct of chorismate, which means that roundup prevents its biosynthesis. For a while, it was considered a vitamin, but it got bumped from the list because—as far as I can tell—nobody could pin down exactly what it does in humans.
It was one of the first chemical sunscreens, though—and since it’s a natural compound found in foods (and is produced by certain bacteria that live in the gut or on the skin), maybe there’s something to the notion that it’s part of the body’s natural defenses against UV. It lets UVA through while blocking the higher-energy UVB, meaning you can still get a tan while being protected from the most intense DNA damage.
But we know perfectly well what PABA does in bacteria and plants: it gets built into folate.
And besides being “that thing pregnant women need a lot of”, folate is a massively, critically important molecule in just about every life form.
Any time your body needs to methylate something (i.e. transfer a one-carbon methyl group from one place to another), folate is the thing that makes it possible. There’s DNA methylation—part of the epigenetic process by which your body turns genes off and on. There are the catechol-O-methyltransferases, the enzymes that deactivate adrenaline and enable you to calm down after a stressful event; a similar enzyme that acts on serotonin enables your body to produce the melatonin you need for a good night’s sleep.
All this happens via an intermediate called S-adenosylmethionine (or SAM-e), which is commonly sold as a supplement—but it won’t do shit for you if folate issues are your real problem, because SAM-e gets the methyl groups that it gives away from folate.
Problems with folate metabolism are generally more complicated than a simple deficiency, but we do need it in the diet. It’s typically found in leafy greens, hence the name—folate, from foliage, i.e. leaves.
Because a lot of the processes that it’s used in (like DNA and RNA synthesis) are universal across all domains of life, bacteria need folate too. Many can make it from scratch, but a lot of human gut bacteria have lost one part or another of the folate synthesis pathway, the same way that we have. Lactobacilli, for example, can only make folate if they’re given PABA as a precursor. And since unfiltered beer supposedly has a substantial amount of PABA, it could be argued that eating cheese and beer together is effectively a salad: the Lactobacillus in the cheese will eat the PABA in the beer and turn it into folate. (This isn’t likely to convince anyone, but it could be argued.)
Broadly, though, this is a common trend across the kingdom Bacteria. Organisms that live in symbiosis with a larger creature tend to lose genes that they’d need if they were on their own. It’s a little bit of a tragedy of the commons: an individual species can lose the gene for producing folate, and it’ll become fitter—it’s not wasting energy making something that it can just pick up from its environment. This is called auxotrophy, needing to scavenge something, as opposed to making it for yourself (strictly speaking, prototrophy, though “autotrophy” gets used loosely).
Since we’re folate auxotrophs too, a human gut bacterium can be assured of a steady supply from our diet and other microbes in their vicinity. If every species in the gut lost those biosynthesis genes in favor of scavenging it, you might have a problem—selfishness is only rewarding until you crash the entire ecosystem. So a balance is struck: some microbes like Bacteroides (which, remember, is a major component of most people’s microbiomes) remain autotrophs, meaning they’re likely either neutral or a net positive in terms of your total folate pool.
But that balance requires an ecosystem in relatively stable equilibrium: if your Bacteroides get crowded out by a folate auxotroph that happened to be resistant to an antibiotic you took, it could be bad news. That isn’t just a hypothetical: sulfonamides, the first kind of antibiotic that was ever deployed at large scale in human medicine, work by inhibiting folate biosynthesis in bacteria—meaning that folate auxotrophy is a mechanism of resistance. In other words, taking these drugs creates a huge selection pressure in the gut, in favor of folate consumers rather than producers, turning the microbiome into a “sink” for the folate in your diet, rather than a source. And as it turns out, this is one of the hallmarks of the Parkinson’s disease microbiome.
This is the other thing about doing laboratory feeding studies on things like glyphosate: something that has a relatively minor effect in a healthy microbiome may have its impact amplified exponentially by things like antibiotics. Unless we want to test every possible drug interaction as part of the safety evaluation process for a new agrochemical, we need to make decisions based on the epidemiological evidence that reflects the actual way these things are impacting people’s health.
The Chorismate Siderophores
If you’ve made it this far, to the deepest chamber of the first of the Thousand Secret Ways, I commend you. All the products of the shikimic acid pathway that we’ve talked about so far, tryptophan, tyrosine, folate, vitamin K—all of these are things you can theoretically get from your diet. But here, that’s not the case.
Behold: Enterobactin.
This is one of my favorite molecules. It’s what they call a siderophore—part of how microbes trap iron for use in their enzymes and proteins. If you’re a bacterium, you spend a lot of energy trying to get hold of iron: it’s the anvil on which many chemical bonds are forged and broken, and it’s often the rate-limiting factor in bacterial growth.
So, when iron is scarce, certain kinds of bacteria produce this little molecular fishing net and send it out into the environment. When it comes near an atom of iron (shown in green in the diagram above), the “arms” of the molecule catch it by forming a chemical bond, and then swivel so that the other two arms can do the same—holding it tight. This changes the properties of the compound’s exterior surface, so that it’ll bind to receptors on the bacterium’s surface and get pulled back in with its payload. (Interestingly enough, your mitochondria also have a binding site for enterobactin, and it’s recently been suggested that this is part of how they get the iron they need.) It’s an elegant system, and it works incredibly well—so well that there’s no way to get the iron out without taking the whole thing apart.
But siderophores don’t just fetch iron, they also chaperone it to make sure it doesn’t cause trouble. The same thing that makes iron so useful—its ability to catalyze chemical reactions—also makes it dangerous. Cells—both ours and our microbes’—are a little bit like cars, in that they’re powered by oxidation of chemical fuels. And like in an engine, it takes a lot of coordination among moving parts to make that happen in a productive way. If there’s fire anywhere in your car besides the pistons, it’s generally bad news. “Free” iron in a cell, whether bacterial or human, is like a lit match in the passenger cabin. The same goes for copper, cadmium, lead, and other heavy metals.
These other metals are also abundant in the environment and in our diets, but most are either less useful than iron, or even more dangerous. Bacteria have figured out ways to deal with them, by repurposing the machinery that they use for iron acquisition: when hit with high levels of copper, enterobactin producers like E. coli express an enzyme that oxidizes the “arms” of enterobactin to produce something that can bind up copper instead, sequestering it and protecting their cells—and ours—from damage.
The microbiome is our first line of defense against environmental toxins of all types, metals included. That’s why, when you give antibiotics to mice, they suddenly start retaining drastically more dietary mercury.
So isn’t it a funny coincidence that the number of deaths from Alzheimer’s—a disease that is clearly associated with failure of the body’s ability to handle iron and copper—rose by about 4x, right around the time we all started eating a lot of glyphosate…a molecule which should disrupt the microbiome’s ability to help us handle iron and copper?
No. It’s not funny. And it’s not a coincidence.
This piece is already long enough—tune in next time for the definitive guide to avoiding roundup in your diet, i.e. which foods it’s worth buying organic and which it doesn’t really matter for. If you’re not already on the mailing list, you can sign up below.
-🖖🏼💩
This is arbitrary, but widely accepted, and it assumes a degree of good faith on the part of the person reporting their results. In theory, it allows one in twenty erroneous results to slip through, so a malfeasant scientist could just repeat the same experiment twenty times and only report the result he wants—but if you’re that dedicated, it’s usually a lot easier to just lie.
That’s way #2 of 1000.
Fun fact: the study’s authors acknowledge that, even in the control group, they found glyphosate in the animals’ urine—because chow is made from the same things we’re all eating—making it very difficult to do a proper negative control.
This raises a sort of funny semantic point about the definition of vitamins, which holds that they are substances “normally not produced endogenously (within the body) in amounts adequate to meet normal physiologic needs”.
Is the microbiome “within” the body? I’ve heard it argued that the human body is a lot like those weird wiggly tubes full of glitter water that became a huge fad in the late ‘90s. When you stick your finger through it, is your finger “within” the tube? Or is only the glitter and water truly “within” it?