April 02, 2008


From Brett Selph:

Re: Drive to complexity seen in animal evolution (March 17): Why increasing complexity?

This principle is very familiar to programmers: Computer code tends to become more complex, and when a subroutine or module becomes useless, you usually just stop pointing to it, much like “junk DNA” gets set aside. Logic tests that were formerly meaningful may remain present in the code itself (perhaps a multi-way test of input values and responses to those input values), even after certain of those input values become logically impossible to encounter. Typically, the programmer is not quite sure that any given test has become impossible to “satisfy” (trigger) under all possible conditions, and he doesn’t have the time to prove his suspicion correct. So he leaves it in. If he’s a “neatnik”, he may promise himself and his boss that he’ll get around to removing any and all “dead code” some day, but other emergencies come along, and “later” turns into “never”.
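
Here is a minimal sketch of the kind of test I mean (the function and the version numbers are invented for illustration):

    def classify_record(version):
        # A multi-way test written back when versions 0 through 3
        # were all still in the wild.
        if version == 0:
            # Version 0 was retired years ago... probably. Nobody has
            # proven this branch unreachable under all possible
            # conditions, so it stays: silenced "junk" that costs
            # little to keep and is scary to delete.
            return "ancient-fallback"
        elif version == 1:
            return "legacy-A"
        elif version == 2:
            return "legacy-B"
        else:
            return "current"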

The evolution of genetic code shows similar patterns: the impetus to evolve new code is very strong, but the benefit of getting rid of (as opposed to merely “silencing”) unneeded code is far weaker. The presence of “dead” code in the genome is thus not surprising. In fact, getting rid of it is often quite difficult, much as old computer code becomes entangled with new code: a subroutine or function originally “evolved” for one purpose becomes useful to a second or third purpose. If the original purpose disappears, the code can’t be deleted until and unless the secondary and tertiary purposes disappear as well.
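
In programming terms, the entanglement looks something like this (again a toy example, all names invented):

    def compute_checksum(record):
        # Originally "evolved" to validate tape archives. The tape
        # drives are long gone, but by then other modules had come
        # to depend on this routine for their own purposes.
        return sum(record) % 255

    def monthly_report(records):
        # Secondary purpose: the report format bakes the checksum
        # into every row, so compute_checksum() cannot be deleted
        # until monthly_report() (and every other secondary caller)
        # disappears as well.
        return [(r, compute_checksum(r)) for r in records]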

Human programmers DO have an aesthetic sense, so sometimes we DO get around to clearing out the deadwood. But we have to make a special effort. And even a programmer who is removing dead code isn’t fired if he makes a change that doesn’t work (a “dead” program!). He just has to keep trying, and succeed in a reasonable amount of time. Living organisms don’t have such a grace period. No mulligans, no “gimmes”, no do-overs. All of your ancestors MUST survive long enough to reproduce! As ugly as computer code sometimes gets, it’s not nearly as ugly (or as wonderful) as Mother Nature’s. Mother Nature never saw a nasty kludge She didn’t like... if it arrives on the scene before the “pretty” fix can evolve.

More importantly, old code and new code get entangled. We get “partial” silencing of code, but the original code is still there, because later things built upon it are still needed. The homeobox genes and the transitional structures that gave gills to fishes are still “used”, after a fashion, in mammalian embryos to make organs and structures ultimately needed by adult mammals. Ernst Haeckel famously proclaimed “Ontogeny recapitulates phylogeny”. This was an observation about appearances, and a suggestion that there were deeper connections, but it was NOT the narrow and incorrect statement (as some imputed to him) that embryos are little “film clips” replaying their own evolution. The “deeper connection” is that more advanced (or evolutionarily more recent) structures are built upon their more ancient antecedents; the formation of adult structures remains forever dependent on at least a partial “replay” of evolution. The embryo is a “little computer” that must necessarily execute a certain amount of phylogenetically shared and very, very ancient “common code”.
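
To carry the metaphor (with grossly simplified biology and invented names), the embryo’s build process looks like a newer routine that cannot run without first executing a far older one:

    def build_pharyngeal_arches():
        # Very, very ancient "common code": in fish, this
        # scaffolding goes on to become gills.
        return ["arch1", "arch2", "arch3"]

    def build_mammalian_jaw_and_ear():
        # Newer code layered on top. It consumes the ancient
        # routine's output, so that routine can never be skipped
        # or deleted, only repurposed.
        arches = build_pharyngeal_arches()
        return {"jaw": arches[0], "middle_ear": arches[1:]}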

Stephen Jay Gould, a man with a silver tongue but few ideas of his own, is one of a number of wannabe great men who put words into the long-dead mouths of their betters, explaining that what they meant was “X”, but the facts are “Y”. (The dead guy is never around to refute it.) Gould did this so that he could “correct” the great man and take his own jealous place among the scientific pantheon by providing a “new” idea. Haeckel didn’t actually say what Gould and some others said he said, so Gould’s “correction” is wrong. Likewise, Darwin didn’t say that the rate of evolution was smooth, only that it takes a long time relative to human lifespans. In fact, Darwin anticipated “punctuated equilibrium” on page 551 of his Origin of Species (5th edition): “the periods during which species have undergone modification, though long as measured in years, have probably been short in comparison with the periods during which they retain the same form.” Gould made almost a whole career out of an idea Darwin tossed off on one page.

As for increasing complexity, “But some have disputed that such an overall trend exists. The late Harvard biologist Stephen Jay Gould argued that the assumption of ever-increasing complexity is an illusion, arising from the fact that life started with the simplest forms—as it almost had to. Life could only get more complex from there. But this doesn’t rule out that more elaborate forms can randomly fluctuate upward and downward in complexity.”

If this excerpt gives a fair representation of Gould’s position on trends in complexity, then Gould is wrong once and possibly TWICE in one statement! Firstly, it isn’t an “illusion” that life gets more complex because it can’t get any simpler than its (simple) origins. That is a logical necessity (and thus a fact), not an “illusion”.

Secondly (and I’m not quite sure whether I’m putting words into Gould’s mouth here, because he had a habit of dancing around the point): while it is true that “more elaborate forms will fluctuate upward and downward in complexity”, there are two major problems to resolve. First, apparent loss of complexity in the phenotype (“de-evolution”) does not imply a corresponding loss of complexity in the genotype. Second, at the genetic level, even if there is an actual “reversal” to a state genetically identical to an ancestral one, it wouldn’t be accurate to call the likelihood of such fluctuation “random” except at the first iteration of base-pair changes. “Random” has a special meaning, and it cannot apply wherever certain outcomes are actively selected against. It’s possible to flip “heads” ten times in a row, but not if we always throw out the third “heads”. I wouldn’t do that, but a hungry tiger might. Assuming a trillion base-pairs in an organism, then from a purely “random” and mathematically naive standpoint there might be a one-in-a-trillion chance that the NEXT random base-pair change will exactly reverse the previous one; but no such simple figure applies to a chain of such reversals. The equivalence disappears when chains of successive mutations are considered. Natural selection simply doesn’t allow it, because a series of creatures must SURVIVE for a series of base-pair changes to reverse themselves. ALL of the reversals must be allowed, not just some of them or most of them. There are tigers out there that eat the slow and the stupid. So “de-evolution”, if we may call it that, is strongly selected against under most circumstances. We don’t see much of it.
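
The arithmetic of chained reversals can be sketched in a few lines (the numbers here are pure assumptions, chosen only to show how the probabilities multiply):

    # Toy figures, NOT measurements: suppose any one specific
    # back-mutation occurs with probability p per lineage per
    # generation, and a lineage carrying the half-reversed,
    # intermediate state survives selection with probability s.
    p = 1e-9   # chance of one exact base-pair reversal
    s = 0.1    # chance the intermediate survives the tigers

    def chain_reversal_probability(k):
        # A chain of k reversals requires every step to occur AND
        # every intermediate creature to survive and reproduce,
        # so the per-step probabilities multiply.
        return (p * s) ** k

    for k in (1, 2, 5):
        print(k, chain_reversal_probability(k))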

On top of this, earlier genetic code changes get “trapped” by subsequent code changes that depend on them. Even if any given SINGLE point-change were equally likely to be reversed, “de-evolving” is considerably more difficult than evolving forward, because of the code entanglement so familiar to computer programmers.
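
The programmer’s version of a “trapped” change (another invented example): reverting an early decision silently breaks everything that later evolved on top of it.

    def store_timestamp(t_seconds):
        # An early change switched storage from seconds to
        # milliseconds. At the time, nothing depended on it.
        return int(t_seconds * 1000)

    def elapsed_days(stored):
        # Later code was written against the millisecond
        # convention. "Reversing" store_timestamp() back to
        # seconds would break this function and everything
        # downstream of it: the early change is trapped.
        return stored / (1000 * 60 * 60 * 24)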

For example, the vertebrate eye has some unfortunate design elements, such as blood vessels and nerves in front of the light receptors. How shall we evolve “properly configured” vertebrate eyes without going through some very blind (and non-reproducing) intermediate stages? The “backwardness” of the foundational structures is “trapped” by the substantial (but imperfect) utility of the end product, and likewise trapped is most of the genetic code that programs for it. Even if some rather miraculous improvement to the vertebrate eye were contemplated, and a sound evolutionary “path” to achieving it were devised, it would NOT amount to a simplification of the genetic code for making eyes... even if the final structure were phenotypically simpler and more straightforward in design.

For example, we might suppose that by FIRST adding a new light-sensitive layer of cells ABOVE the nerves and blood vessels, it might be possible to evolve a better eye without going through blind intermediate forms (because, in the interim, we’d keep the “old” layer). After the genetic code for this “new” layer was fully in place and well “tuned”, we could then get rid of the now-superfluous “old” light-sensitive layer behind the blood vessels. The final, adult structure might bear no overt evidence of the “old” layer, but you can be sure that embryological development would betray the evolutionary history of this improvement... and that the “backwards-retina” genes so necessary for the embryological development of the eye would still be present in the “new, improved” genome for as long as (indeed, longer than) the descendants of this creature need eyes and vision to survive.
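
Programmers will recognize this two-step replacement as the parallel-run (or “strangler fig”) migration pattern. A toy sketch, with everything about it hypothetical:

    def detect_light(image, occluded):
        # Stand-in for phototransduction; occlusion by vessels
        # and nerves degrades the signal.
        return sum(image) * (0.7 if occluded else 1.0)

    def old_retina(image):
        # The "backwards" layer: receptors behind the plumbing.
        return detect_light(image, occluded=True)

    def new_retina(image):
        # Hypothetical new layer evolved in front of the plumbing.
        return detect_light(image, occluded=False)

    def see(image, new_layer_tuned=False):
        # Transitional form: keep the old layer until the new one
        # is fully "tuned", so no intermediate is ever blind.
        return new_retina(image) if new_layer_tuned else old_retina(image)

Only after see() runs entirely on the new layer can old_retina() be retired, and even then its history survives in the code base, just as the “backwards-retina” genes would survive in the genome.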

This dynamic applies even when the changes aren’t improvements (better eyes), but simply the loss of phenotypic complexity that is no longer needed. While not impossible, truly shedding complexity (as opposed to masking it) is just plain difficult to achieve, and requires strong evolutionary pressure to accomplish.

To paraphrase the physicist Isidor I. Rabi (“Who ordered THAT?”, he asked of the muon, an “unnecessary” elaboration of the electron): well, who ordered all of that genetic junk? Who ordered increasing complexity in the genome rather than a continuum of increasing and decreasing “ideal” complexity? Mother Nature did, because She is as much a lazy programmer as I am. And we see it in the way programs are born, just as we see it in embryos. Just ask Ernst Haeckel, or if you prefer talking to the living, somebody working at Microsoft.

Pascal once apologized, “I made this letter longer than usual because I lack the time to make it shorter.” Equivalent statements have been variously attributed to Mark Twain, George Bernard Shaw, Benjamin Franklin, Voltaire, Winston Churchill, Marcel Proust, Rudyard Kipling, and Pliny the Younger. It applies to all who string symbols together into meaningful patterns. So, to that list of luminaries, I’d like to add the original author, Mother Nature. Nor will I apologize for the length of the present letter... because perfect editing is insanely more difficult to achieve than “good enough”. I suppose you could say: Redundancy Multiplies, and Code Entropy Rules!

Brett Selph
Winnetka, CA
