5 Fundamental Fallacies About Genetics and Epigenetics

Pop science and official ideology misrepresent the truth about DNA.

Posted Jan 23, 2019

As I suggested in previous posts, much of what the media, popular science, and official pronouncements say about genetics, epigenetics, and genomics is at best misleading and at worst downright wrong. As such, it is an obstacle to what you might call genetic literacy, and ought not to go unchallenged. The basic fallacies can be itemized as follows:


1. DNA reproduces organisms. 

In fact, this is the exact opposite of the truth. According to the modern, “selfish-gene” view of genetics, organisms evolved to copy their DNA and to be its biodegradable packaging, or vehicles. Those organisms which preserved and passed on their DNA became the ancestors of all living things alive today.

This immediately explains why the quantity of DNA preserved and passed on is vastly greater than that needed to generate the organism, and why it contains so much apparent genetic “junk”: notably disabled genes, genes copied in by viruses, repetitive sequences, and all kinds of other parasitic DNA inessential to the organism. And even if, as some claim, up to 80% of this may have some function in the human case, the quantity of DNA in an organism appears to bear no relation to its size or complexity: toads and mice have about the same amount of DNA as humans, chickens about half as much, corn almost twice as much, newts more than five times more, lungfish forty times more, and Amoeba dubia, a single-celled amoeba, two hundred times more! This is inexplicable if DNA existed to reproduce the organism, but makes perfect sense if you see organisms as having evolved to copy their DNA.

2. Epigenetic effects can “rewrite DNA” and make acquired characteristics heritable.

In mammals, all a female’s DNA is copied into the precursors of her egg cells while she is still in her mother’s womb, so there is no way that it can be changed thereafter. And although males produce sperm by the million daily, there is no known or even probable way in which they could selectively edit their DNA to pass on acquired characteristics. As Sir Francis Galton rightly realized (but his cousin, Charles Darwin, notably did not):

[Image: Sir Francis Galton (1822–1911) anticipated the modern selfish-gene view of heredity. Source: Wikimedia Commons]

"We shall … take an approximately correct view of the origin of life, if we consider our own embryos have sprung immediately from those embryos whence our parents developed and these from the embryos of their parents, and so on for ever."

So-called epigenetic markers, which modulate gene expression, are admittedly re-set at the beginning of development. But these are themselves under genetic control coded in DNA. For example, the DNA facilitating the diet-induced change of colour seen in agouti mice was copied in by a virus, as its discoverer explains.*

Additionally, the intra-uterine environment can affect gene expression. So in principle an event in the grand-maternal womb could have an effect seen in that grandmother’s grandchild via epigenetic effects imprinted in a daughter’s DNA at the time it is copied. Again, in previous posts I have speculated that X-chromosome epigenetics might be accidentally passed on to offspring from mothers—and to sons in particular, given that they inherit only one X.

But all this is a world away from Lamarckian or Lysenkoist inheritance of acquired characteristics and gives no grounds to resurrect it. On the contrary, epigenetics should be seen as derived from epigenesis understood as the non-preformed, DNA-directed generation of the organism from a set of instructions.

3. Given that thousands of genes are involved in any complex trait such as sex or mentality, single genes are irrelevant.

Here the fallacy also relates to epigenesis because, although it is true that being male or autistic ultimately involves the expression of thousands of genes (to cite two traits which often go together), the epigenetic process begins when a single gene such as SRY (perhaps appropriately read as Sorry by my computer's text-to-speech program) triggers whole cascades of other genes, which produce the final outcome.

Here single genes are crucial in just the same way that a single instruction is in a recipe or set of directions: one wrong turn in the latter will usually take you further and further away from the desired destination; and a cake recipe which misprinted a cooking time of 20 hours instead of 2 would in fact be a recipe for a brick. Single-point mutations in DNA have similar effects and have been widely described in classical, Mendelian diseases, such as muscular dystrophy or sickle-cell anaemia.
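The sickle-cell case makes the point concretely: a single A-to-T change in one codon of the beta-globin gene substitutes valine for glutamic acid, and the whole protein misfolds. A minimal sketch (the codon table below is deliberately truncated to just the two codons involved):

```python
# Illustrative sketch of how a single-letter ("point") mutation changes a
# protein, as in sickle-cell anaemia. Only the two relevant codons are
# included in this toy table.
CODON_TABLE = {
    "GAG": "Glu",  # glutamic acid (normal 6th codon of beta-globin)
    "GTG": "Val",  # valine (the sickle-cell variant)
}

def translate(codon: str) -> str:
    """Look up the amino acid encoded by a three-letter codon."""
    return CODON_TABLE[codon]

normal = "GAG"
mutant = normal[0] + "T" + normal[2]  # one letter changed: A -> T gives "GTG"

print(translate(normal))  # Glu
print(translate(mutant))  # Val -- the sickle-cell substitution
```

One changed letter out of three, and the output amino acid differs; scaled up over a whole gene, the same principle is why a single misprinted instruction can turn a cake recipe into one for a brick.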

Indeed, this is the only way that evolution could happen: multiple simultaneous mutations could not occur in thousands—or even just a few—genes to produce complex adaptive traits. But copying errors in the form of single-point mutations changing just one letter of the DNA code, or accidental deletions, duplications, or inversions of entire sequences can and do happen. And although mostly deleterious, any such changes which promoted their own survival better would be likely to be retained, producing evolution as we observe it. 

4. Nature and nurture are equally important, and can’t be distinguished.

Three things need to be said about this:

First, what passes for nurture rather than nature—everything which is not under the control of a person’s DNA—is often itself affected by heredity, as a recent, authoritative account makes clear.  To cite the crucial case, the home environment seen as the epitome of nurture is in fact normally populated by persons who share their DNA closely, such as parents and siblings. And other environments to which a person may gravitate may be chosen to suit their genetics (as I speculated in my own case in a previous post).

Second, nurture as rigorously and quantitatively defined as parental investment is under evolved, genetic control, as I have explained in some previous posts. Indeed, as I pointed out in one of them, conflicts over nurture seen as parental investment are fundamental and part of our evolved heritage of behaviour, most notably in demonstrably causing parents to demand twice as much altruism (or half as much selfishness) as children are selected to demand of themselves. This classic conflict is usually seen as epitomizing nature versus nurture, but is in fact a wholly natural one.

Finally, recent revelations about epigenetic differences between identical female twins suggest that such discrepancies are in fact the result of nature, not nurture. As Galton also rightly concluded:

"In the competition between nature and nurture, when the differences in either case do not exceed those which distinguish individuals of the same race living in the same country under no very exceptional conditions, nature certainly proves the stronger of the two."

5. Genetic determinism rules out choice and free will.

The idea of the individual as a puppet with strings pulled directly by DNA fails to take account of the fact that genes could not possibly legislate in advance for every behaviour of an organism which needed to move about to find food, a mate, or avoid predators. But as I have pointed out in a previous post, DNA could readily encode simple epigenetic rules in much the same way that computer code can control free-ranging floor-cleaning robots, such as the avoid-the-stairs rule, which infants also display.

In the human case, the pleasure-pain principle is perhaps the first and foremost, and others demonstrably exist for choice of mate, food preference, co-operative behaviour, and so on. Perhaps most importantly, DNA motivates a person to find environments which promise the most and punish the least. In fact, genetic determinism of behaviour does not mean that any particular behaviour is hard-wired (save reflexes), or that there is anything which in principle you could not do if others can. What it does mean though is that the more a behaviour goes against the grain of your innate proclivities, the harder it will be to achieve success and gratification. Not surprisingly, most people learn not to try, but instead to realize the potential of their true genetic endowment.

Isn’t that what life is all about? 

* With thanks to Prof Randy Jirtle, who added the following in an email to me: "It is clear that the initial change that occurred in the Avy mouse that enabled environmentally-induced epigenetic marks to alter coat color and disease susceptibility was genetic - the DNA insertion of a viral retroposon. Point mutations at CpG sites could also cause similar changes in the epigenetic regulation of gene transcription."