

The Unsuitability of B-Cell
Maturation as an Analogy
for Neo-Darwinian Theory

Royal Truman
  © 2001 Dr. Royal Truman, Ph.D., Germany.  All Rights Reserved.  

Introduction

Neo-Darwinian theory (NDT) claims that random mutations followed by natural selection are the major force which produced novel biological functions.  Inexact replication of DNA sequences in germlines supposedly led to thousands of novel genes, large numbers of specialized cell types, and from these complex organs, skeletons, and so on.  The feasibility of this has been examined on statistical grounds[1],[2],[3] and soundly rejected.  Dr Max[4] and many biologists have claimed that the fine-tuning of antibodies serves as an analogy for NDT.  Only vertebrates are known to develop an acquired immune response[5, p. 736], and at issue is not proof for or against macroevolution per se but whether this analogy makes NDT appear at least more reasonable.

I believe the processes are fundamentally different, and that optimization of antibody affinity reflects a deliberate and intelligently designed problem-solving scheme.

1. Apparent Randomness and Problem Solving Strategies

The evolutionary processes proposed to have produced first bacteria and eventually humans are assumed not to have been driven by intelligent guidance.  We must clearly distinguish between true randomness and a purposeful algorithm which covers a search space in order to converge on an intended goal.

(A)  Where fired shotgun pellets actually impact is only in an incomplete sense “random”.  The gun barrel, triggering mechanism, explosive mixture, size and number of pellets, etc. are organized to solve a class of problem.  Although the specific target need not be known in advance, the topology of the desired outcome (in time and space) is part of the shotgun design.  The design covers a constrained range of possibilities: it cannot kill bacteria or whales (area) nor destroy satellites (distance), and it needs a triggering mechanism (time).  This permits a non-random outcome, such as killing a bird at a specific time and place with high probability and little risk of collateral damage.

The designer of the apparatus need not specify, to the picometer, where each pellet will end up.  It suffices to ensure with high probability that, when used in the correct context and manner, the “random” behavior of the ensemble of pellets falls within the intended tolerance.

(B)  Construction workers sometimes throw debris down a chute when repairing a building.  One cannot predict the precise trajectories nor final location of every object thrown down.  Nevertheless, the range of outcomes is constrained by the design which ensures bystanders aren’t killed.
(C)  In molecular modelling studies one wishes to identify an arrangement of atoms in three dimensions with the absolute energy minimum, which would represent the thermodynamically most stable configuration.  Mathematical algorithms can guide the computer program down one of many (local) energy minima.  How may one ensure there is not a better one?  One trick is to store the best result found so far, then introduce “random” perturbations to knock the configuration out of the influence of the local minimum's energy valley, permitting further attempts to find yet deeper valleys.
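This store-the-best-and-kick strategy can be sketched in a few lines of code.  The following is a minimal illustration only, in the style of a basin-hopping search over a toy one-dimensional energy function; the function, step sizes and trial counts are arbitrary assumptions, not taken from any molecular modelling package:

    import math
    import random

    def energy(x):
        # Toy one-dimensional "energy landscape" with several local minima.
        return 0.1 * x * x + 3.0 * math.cos(2.0 * x)

    def local_descent(x, step=0.01):
        # Greedy downhill walk: take small steps while the energy decreases.
        while True:
            candidates = [x - step, x, x + step]
            best = min(candidates, key=energy)
            if best == x:
                return x
            x = best

    def basin_hopping(trials=50, kick=2.0):
        # Store the best result so far; "random" kicks escape local valleys.
        best_x = local_descent(random.uniform(-10.0, 10.0))
        for _ in range(trials):
            x = local_descent(best_x + random.uniform(-kick, kick))
            if energy(x) < energy(best_x):
                best_x = x
        return best_x, energy(best_x)

    print(basin_hopping())

The stored best-so-far result acts as memory, and the random kick deliberately covers the search space, exactly as described above: the randomness is a tool within a designed procedure, not the source of the design.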

Superficially random search strategies are commonly used by intelligent agents to explore a space of possibilities.  There are many examples.  The logic can be programmed into a robot to find its way around a room.  Once foraging bees are in the vicinity to which scout bees sent them (via a coded "waggle dance" message) to find food, they circle until the target is sighted.  The bee uses its best guess as roughly a central point from which to initiate a “random” search strategy.  Anyone lacking a full picture of what is involved, who concentrates only on the bee's flight behavior while it homes in on the target, could be excused for seeing only “random” movement.
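The circling behavior can be caricatured as an expanding ring search around the best-guess location.  The grid, ring pattern and target below are illustrative assumptions only, not a model of real bee flight:

    def ring_search(start, found, max_radius=50):
        # Sweep rings of growing radius around the best-guess starting
        # point until the target is sighted.
        x0, y0 = start
        if found(x0, y0):
            return (x0, y0)
        for r in range(1, max_radius + 1):
            for dx in range(-r, r + 1):
                for dy in range(-r, r + 1):
                    if max(abs(dx), abs(dy)) == r and found(x0 + dx, y0 + dy):
                        return (x0 + dx, y0 + dy)
        return None

    # Usage: the "food" sits a few cells away from where the scouts pointed.
    target = (7, -3)
    print(ring_search((5, 0), lambda x, y: (x, y) == target))

The individual cells visited may appear arbitrary to an observer, but the enclosing loop guarantees the target is found whenever it lies within range of the starting guess.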

None of these examples are legitimate analogies for evolution, although all are superficially based on seemingly “random” changes and selection of improved intermediate steps.  As we shall see, B-cell hypermutation, which homes in on solutions within an acceptable tolerance, is another example of only superficially “random” behavior.  All necessary equipment has been prepared in advance; the “random” hypermutations begin only after a careful process of preparation ensures suitable candidates have locked in on the goal, and they are then carefully guided towards the intended target.

There are material differences between NDT models and the immune system processes described by Dr Max[4].  He is not attempting to “prove” evolution by this example, only to provide a conceptual analogy.  I shall first summarize some observations, which are not totally independent of each other, and then put the pieces together into a more complete picture to show why the two processes of mutation and selection are fundamentally different.

2. Material Differences between Neo-Darwinian Theory
and Optimization of B-Cell Affinity

The following observations should not be considered separately.  As a whole they represent a planned convergence algorithm, unlike NDT.

(i)  Performance optimization of B-cells involves somatic mutations.
The mutational rate of the key, variable (V) regions of antibodies is about a million times greater[6, p. 1224] than that of other somatic and germline cells.  Such a mutation rate in the germline would lead to error catastrophe, especially for large, multicellular genomes (such as those of the vertebrates which produce B-cells).  A replicating fertilized mammal egg would undergo over 40 mutations on average per gene during fetal development.  Furthermore, NDT mutation rates of such magnitude would reproduce genomes unreliably, leaving natural selection with nothing consistent to identify.

This brings us to the major point of this first objection: the long-term survival of B-cells over thousands of years is ensured in spite of their having undergone mutational rates which would wipe out whole organisms.  This is because long-term B-cell survival is independent of the B-cell hypermutation process.  The original germlines, which do not undergo B-cell type hypermutations, must be and are maintained.  Any example in which the mutational rate is independent of long-term whole-organism survival is irrelevant for NDT purposes.

(ii)  B-cell optimization is ensured to occur in a relevant time frame for selective purposes.
Generating high-affinity memory and plasma cells occurs within a relevant time frame, literally within days.  As discussed below, this is possible due to many factors which have been prepared in advance to identify the desirable hypermutated variants and to deliberately accelerate their replication.  The statistical objection to NDT is that the proportion of base pair sequences able to improve biological function, in the context of everything going on in the cell, is minuscule compared to the collection of harmful or neutral alternatives.

Using relevant germline mutational rates spread out over the whole genome presents a different picture.  The B-cell maturation process is tailored to generate high-affinity antibodies with dramatic and biologically measurable advantages within the relevant time frame (a very small fraction of the normal life span of the vertebrate organism of which they are part).  This can be demonstrated empirically, even using antigens not found in nature: high-affinity antibodies are generated within days.

Mutational rates of 10⁻³ to 10⁻⁴ per base pair are based on the assumption that these cells double every 17 to 18 hours.  Should the hypermutation process be limited to a brief period of B-cell fine-tuning, the true rate would be still higher[7].

Were the hypermutation rate of maturing B-cells not about 10⁻³ but instead the roughly 10⁻¹⁰ to 10⁻⁸ per base pair (bp) per replication observed elsewhere today, and upon which NDT depends, the process would not permit selection to help combat an infection.  And since the hypermutations are concentrated precisely at the antibody-binding regions, these high mutational rates are by design prevented from causing drift across the rest of the genome.

The opposite is true for NDT.  Relying on extremely rare mutations in the same genes to improve protein performance, or to create new proteins, will not deliver results within a selectively reasonable time frame.  Note that one cannot simply assume higher mutational rates.  For example, a 20,000-gene genome undergoing 10⁻⁶ mutations per bp would generate on average about 0.84 Stop codons per cell replication within the coding regions of its genes, and would have a probability of only about 0.43 per cell replication of avoiding the introduction of any such Stop sequence[23] (assuming an average coding region of 300 codons per gene).

Assuming 40 cell replications during embryo development indicates the newborn would have about

(20,000 genes) × (900 bp/gene) × (40 cell duplications) × (10⁻⁶ mutations/bp) = 720

mutations in the coding regions, plus many damaged gene regulatory sequences.  Without efficient error-correction mechanisms, evolutionary self-destruction is guaranteed; yet the error correction needed to avoid it reduces the rate at which novelty could be generated to prohibitively low levels, which prevents NDT from being a viable theory.
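As a cross-check, the figures above can be reproduced with a short computation.  This sketch simply restates the article's own assumptions (20,000 genes, 300 codons per gene, 10⁻⁶ mutations per bp, 40 replications, and the 3/64 Stop-codon fraction of note [23]):

    import math

    GENES = 20_000
    BP_PER_GENE = 900       # 300 codons x 3 bp each
    MUT_RATE = 1e-6         # mutations per bp per cell replication
    P_STOP = 3 / 64         # fraction of random codon changes yielding a Stop
    REPLICATIONS = 40       # assumed cell replications during development

    coding_bp = GENES * BP_PER_GENE                     # 1.8e7 bp
    stops_per_division = coding_bp * MUT_RATE * P_STOP
    p_no_stop = math.exp(-stops_per_division)           # Poisson approximation
    total_mutations = coding_bp * MUT_RATE * REPLICATIONS

    print(f"Expected Stops per cell division:  {stops_per_division:.2f}")  # ~0.84
    print(f"P(no new Stop) per division:       {p_no_stop:.2f}")           # ~0.43
    print(f"Coding mutations after 40 rounds:  {total_mutations:.0f}")     # 720

Over 40 replications the chance of escaping every Stop-creating mutation is about exp(−0.84 × 40) ≈ 2 × 10⁻¹⁵, which is the force of the argument above.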

We see that NDT models face a statistical challenge incomparable to that faced by B-cells: a large number of useful mutations, each occurring about a million times less frequently than the hypermutation process, must accumulate on the same gene within a time frame in which they can realistically be selected for.  These must offer overwhelming advantages if that lineage is to become fixed in the population, for two reasons: the descendants are a minuscule proportion of the population and risk dying out[3]; and the whole organism is being selected, not merely a particular kind of cell within a host.

Critically different from the hypermutation process, NDT demands survival in the face of predominantly function-destroying mutations throughout the whole genome, whereas B-cells can only benefit from mutations: these are restricted to a minuscule portion of the cell’s DNA, and the original, pristine gene fragments are maintained in the germlines of the vertebrate population.

The B-cell maturation process easily justifies the penalty of carrying extra genetic material, since the advantage is measurable and dramatic within the organism’s lifetime.  This cannot be claimed for NDT models, where presently superfluous DNA not needed for an immediate critical function must somehow be carried to permit evolutionary experiments over millions or billions of years.

(iii)  B-cell maturation is ensured a starting point which can be refined.
A large pool of non-specialized, non-hypermutated B-cells exists as part of the vertebrate immune system, able to interact with antigens, although weakly.  Some of these are selected as part of the primary response to an attack and undergo clonal replication, each clone carrying the same antibody-producing (rearranged) gene[9].

The first class of antibody produced by a developing B-cell (IgM) forms pentameric structures[6, p. 1210] (unlike the remaining four classes, which have only two binding sites each).  This enhances binding to multivalent antigens even though each individual affinity is low.  It permits a subset of useful B-cells to be identified and multiplied for subsequent refinement before random hypermutations occur.  The number of candidates given the opportunity to fine-tune via site-restricted mutations is deliberately increased in advance to capitalize on the later hypermutation process.

Most T and B-cells flow continuously between the blood and secondary lymphoid tissues.  They must squeeze between specialized endothelial cells, passing through multiple lymph nodes[6, p. 1202].  This ensures they will have many opportunities to encounter antigens (and also each other).  Before the hypermutation process (which is proposed to serve as an analogy for NDT) is activated, those members already possessing good antigen-antibody fits will therefore have been prepared in larger proportions.

The context or environment within which hypermutations occur needs to be considered to judge their suitability as an example of Darwinian macroevolution.  NDT assumes random mutations provide variety which can be selected for.  I argue B-cell maturation is an example of a process deliberately designed to accomplish a goal, where the necessary equipment has been provided to permit the intended convergence to occur.  Let us examine further the world of B-cells to evaluate any relevance to NDT, for which no preadaptation to a long-term goal is postulated.

Although the number and organisation of the [V-(D)-J]-C gene segments (Figure 1) differ between species, they are specialized to fulfill the ultimate task of defense against bacterial and viral attack.  Spacers with strict sequence requirements, 12±1 or 23±1 nucleotides long (Figure 2), between the segments used to compose the new B-cell genes (Figure 3), serve as recombination signals[9],[10].  It has been suggested that the recombinase includes two proteins, each using either a 12 or a 23 base pair spacer[10].  These rearrangements must be limited to the correct cell types.  (Note that any process of evolutionary trial and error would be tinkering with a process dangerous to the organism as a whole, since recombination can occur over large distances, even across chromosomes[9].)

Loss or addition of 1-10 nucleotides at the recombination sites is common[9], creating genes able to code for a wide variety of proteins in the variable portion of the antibody.  The enzymatic machinery dedicated to the V(D)J type rearrangements in T and B cells is not yet understood.  Scid may be one of the factors involved[9].  Some of the nucleotide addition may be due to terminal deoxynucleotidyl transferase (TdT)[9].

Two V(D)J activation genes, RAG-1 and RAG-2, seem to be expressed only or almost exclusively in lymphoid cells.  The RAG-2 expression profile has been shown to match that of V(D)J recombinase activity.  The mRNA was found only in pre-B and pre-T cells, not in cells earlier in the B-cell lineage nor in high-affinity, mature versions[11, p. 1521].  The two genes are adjacent but sequentially unrelated[11].

These are large and complex genes.  For example, mouse RAG-1 (1040 amino acids, AAs) is 90% identical to human RAG-1 (1043 AAs).  Proteins predicted from gene sequences indicate mouse RAG-2[11] (527 AAs) and chicken RAG-2 (528 AAs) are 70% identical.  No homologies are known between RAG-2 and any other proteins or genes[9, p. 367].  The fact that so little variability is found along the long RAG-1 and RAG-2 proteins suggests they are involved in multiple interactions with other members of a complex recombination activity.  These genes must not be permitted to be expressed in other cells, nor at too high a level in immature B and T cells[9, p. 368].  They seem to be expressed only in pre-B and pre-T cells and in transfected fibroblasts.  Both are expressed independently[9, p. 369] but act together in pre-B and pre-T cell lines.  There are as yet no examples of V(D)J recombination in the absence of either one[9, p. 368].

Note that the complex genes necessary for the fine-tuning process to be viable, which precede and anticipate the hypermutation process, already exist.  Sophisticated cellular equipment has already prepared the ground for the unique hypermutations which will then occur at the right time and place.  This equipment causes specific rearrangements to occur in Ig and TCR (T cell receptor) genes[9, p. 371].  The first gene rearrangement during B-cell development is the heavy chain D-JH rearrangement[9, p. 371].  Then no more such rearrangements seem to occur.  VH to DJH rearrangement occurs next, but only on those alleles which have already undergone D-J rearrangement.  The whole heavy-chain machinery is then inactivated and the κ light-chain rearrangement is activated.  Perhaps the IgM polypeptide serves as a signal to coordinate these switches[9, p. 375].

The strategy of joining light and heavy chains to produce a huge variety of B-cells, each with a different antibody binding site, ensures with high probability that most antigens will be recognized.  As suggested already, the affinity-optimizing steps can begin immediately because the space of antigen-recognizing possibilities is covered reasonably well ab initio, after which a more optimal fit can be developed.

The millions of B-cell lines produced daily in human bone marrow have different (somatic) antibody-producing genes because the generation of such variety, through controlled combinations of V, J, D and C DNA elements, has already been programmed into the host genome.  This permits a large variety of antibodies to be generated concurrently, enhancing the chances of producing some which are able to bind to potential antigens.
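The multiplicative effect of this combinatorial scheme is easy to illustrate.  The segment counts below are commonly cited approximations for the human loci, assumed here for illustration rather than taken from this article's references; junctional insertions and deletions multiply the totals much further:

    # Approximate functional gene-segment counts (assumed, illustrative).
    HEAVY_V, HEAVY_D, HEAVY_J = 40, 23, 6    # heavy-chain locus
    KAPPA_V, KAPPA_J = 40, 5                 # kappa light-chain locus

    heavy_chains = HEAVY_V * HEAVY_D * HEAVY_J    # V-D-J joins
    light_chains = KAPPA_V * KAPPA_J              # V-J joins

    # Each B-cell pairs one heavy chain with one light chain.
    print(f"{heavy_chains} heavy x {light_chains} light = "
          f"{heavy_chains * light_chains:,} combinatorial antibodies")

Roughly a million distinct antibodies arise from only about a hundred stored gene segments, before any junctional diversity or hypermutation is counted.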

Since, with high probability, some immature antibodies will already be biologically valuable, the subsequent mutations to improve the antibody-antigen fit are spared the endless trial-and-error process NDT requires, in which random mutations must accumulate over eons before any kind of selection could become possible.  For B-cells a useful starting point is available within hours.

The scheme which ensures a starting point for intense selection must not be permitted to destroy the finely tuned genes produced.  Mature B and T cells display at most insignificant recombinase activity, and the RAG-1 and RAG-2 genes are no longer expressed.  This prevents contamination of the high affinity versions and their destruction.  Some kind of feedback inhibition by Ig on the cell surface may be involved[9, p. 377].

Thus, a process very different from crude Darwinian selection on mutant offspring has already been designed, one which permits work towards a specific goal (binding to a certain antigen) to get started as soon as the need arises.  No analogy exists in NDT: neither a random portion of non-coding DNA nor a duplicated gene is guaranteed an instantaneous, novel biological use which can be immediately and relentlessly fine-tuned.  Furthermore, one cannot argue that all genes arose from a preceding gene.  There has to be a starting point.  At some time, a section of DNA would have had to exist to permit novel gene families to begin, and it could hardly have been guaranteed an immediate biological use[3].

(iv)  Huge numbers of variants are generated concurrently from the same parent.
From the original DNA provided through a fertilized egg, about 5 × 10⁷ naive B-cells are generated daily by human[12] and mouse[6, p. 1198] bone marrow.  The preimmune antibody repertoire in humans, even in the absence of antigen stimulation, comprises about 10¹⁵ different antibody molecules[6, p. 1212].  A small minority of these are selected to enter the stable, recirculating pool (i.e., the locations of B-cells excluding the germinal centers).  For mice the sequence diversity before fine-tuning by hypermutation has been estimated at between 10⁸ and 10¹⁰ different antibodies[13].  Each B-cell uses only one specific heavy-chain and one light-chain V region on its surface.

It is known that the number of high-affinity B-cells produced in the germinal centers vastly exceeds the number which leave[12, p. 56].  This permits convergence to a good fit between antibody and antigen.  NDT does not postulate the generation of so many mutants concurrently from the same egg that several are guaranteed to identify an external need they are best able to fulfill.  This is particularly glaring when one adds the need to regenerate an identical original genome (to preserve the germline), i.e., the full complement of gene fragments.

(v) B-cell fine-tuning is restricted to a monotonic criterion.
A whole organism faces a wide range of survival challenges.  According to NDT, useful germline mutations would have to be identified correctly countless times to generate all the biological functions observed throughout nature, even though survival and reproduction are very stochastic processes.  There are many reasons an organism could die in spite of having a “better” gene or two.  Most of the necessary mutations for NDT would have to be simple base-pair mutations, to generate the thousands of precisely tooled enzymes, with exact three-dimensional folded protein cavities, which catalyze specific biochemical reactions.  Developing each enzyme would require huge numbers of mutations, each conferring a dramatic selective advantage over its predecessor, to preclude extinction of that evolutionary lineage.  This is unrealistic for NDT, which demands that selection be based on survival and reproductive advantages for the whole organism.

B-cells compete only on the basis of fit to specific antigens in what are otherwise very homogeneous microenvironments.  The 10⁵ or so antibody molecules on the plasma membrane[6, p. 1206], each with the same mutated protein variant, are one of the many factors discussed in this essay which optimize the chances of fine-tuning towards the intended goal on the basis of desirable mutations.

Not all candidates for producing memory cells are dedicated to that purpose.  Some become activated plasma cells, which can begin protecting the vertebrate organism by producing about 2000 antibodies per second[6, p. 1207].  Note the lack of parallel to NDT.  Memory and plasma cells are both necessary for the scheme to work, even though both derive from the same stem cell within hours of each other.  Plasma cells are so specialized for flooding the bloodstream with a specific antibody that they seem incapable of further reproduction and die within days[6, p. 1207].  But without such sacrifice, the vertebrate organisms within which memory cells form would have significantly decreased survival chances.  Conversely, were the memory cells not to develop, no cells would remain to combat the same antigens at a later point in time, since the plasma cells die off.

(vi) Useful B-cell mutations can be immediately recognized and acted upon.
One can state without exaggeration that B-cells are solutions waiting for the problem’s details to be revealed.  The antibodies produced by B-cells are triggers behind which a whole cascade of activities is waiting.  Only those cells whose antibodies suitably recognize an antigen are stimulated to reproduce, and in proportion to the goodness of fit.  Notice the total absence of analogy with NDT: there, a particular mutation is not matched to an external condition which, when the mutation is desirable, causes immediate and rapid reproduction.

Useful mutations can have dramatic selective advantages.  This is possible given the very narrow goal being optimized and the fact that everything has been prepared to capitalize on a fortunate mutation as soon as it occurs.  A single AA substitution in a CDR (complementarity-determining region) can result in a factor-of-10 increase in affinity[7, p. 1155].  A subsequent second useful mutation can also quickly be taken advantage of.  A hundred-fold difference in affinity due to 13 AA replacement mutations has been reported[14, p. 548].  This leads to overwhelming survival advantages for the particular line, even though the microenvironment guiding selection is very similar throughout the whole process.

Note that the large number of mutated B-cells must not be able to destroy the host tissue.  This is carefully regulated, apparently because the autoreactive T helper cells that would be needed to attack healthy tissue have already been destroyed in the thymus[9, p. 552], and because the somatic hypermutations seem to be restricted to dedicated compartments.

(vii) Subsets of mutated organisms are separated and then fine-tuned.
The cloned B-cells which underwent selective, rapid reproduction are targeted for specific mutations.  These kinds and rates of mutations are not only unrealistic for NDT purposes (as discussed above) but are restricted precisely to those regions where, on average, they can only do good and no harm.  These variable, hypermutating regions correspond to the binding location of antibody with antigen and to the surrounding areas holding these sites in place.  The number of cloned B-cells, the size of the mutated region, and the mutational rate are such that the probability is very high that enough variants will be generated to improve the binding affinity.  For NDT, which involves large, whole genomes, this is unrealistic.  A toy simulation of this targeting is sketched below.
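The following simulation is purely illustrative (the five-letter “binding site”, the mutation rate and the selection scheme are invented for the sketch; no biological parameters are implied).  It shows how confining a high mutation rate to a short binding region, and cloning the best binders each round, converges in a handful of generations:

    import random

    ALPHABET = "ACDEFGHIKLMNPQRSTVWY"   # the 20 amino acids
    TARGET = "WYRKF"                    # hypothetical ideal binding site

    def affinity(site):
        # Toy fitness: number of positions matching the ideal site.
        return sum(a == b for a, b in zip(site, TARGET))

    def hypermutate(site, rate=0.2):
        # High mutation rate, confined to the short binding region only;
        # the rest of the "genome" is never touched.
        return "".join(random.choice(ALPHABET) if random.random() < rate else aa
                       for aa in site)

    def mature(pool_size=200, rounds=10):
        # Start from a weak but real binder (cf. the primary response).
        pool = ["WARKA"] * pool_size
        for generation in range(rounds):
            pool = [hypermutate(s) for s in pool]
            pool.sort(key=affinity, reverse=True)
            pool = pool[: pool_size // 4] * 4   # clone the best quarter
            print(f"generation {generation + 1}: "
                  f"best affinity {affinity(pool[0])}/{len(TARGET)}")

    mature()

The rapid convergence depends entirely on the prepared machinery the sketch takes for granted: a working starting sequence, mutations restricted to the right region, and a selection step that measures a single criterion.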

Next, “chance” is manipulated to ensure the desired outcome:

(viii) The microenvironment is manipulated to act upon the successful somatic mutations.
The antigen and those B-cells with suitably matching antibodies, whose reproduction it stimulates, are now forced together in a unique structure.  These germinal centers begin with a small number of selected B-cells accompanied by T helper cells, follicular dendritic cells and macrophages, plus trapped antigen[15].  This ensures that the rapidly duplicating, hypermutating new B-cells will be tested against a single, monotonic survival criterion in an optimal manner.

Germinal centers are the only locations in the body where antigen is kept for years[9].  It is held by specialized, secondary follicular dendritic cells near B lymphocytes carrying antibodies on their surface.  Apparently hypermutations in the variable region of the antibodies are stimulated in this special environment[16].

An NDT analogy would require some agency to ensure that, say, cold weather is maintained in the immediate area of those animals found to have thicker-than-average fur, and that precisely this cold weather stimulates extra-rapid reproduction.  This is indeed how B-cell maturation works, but most certainly not how NDT is supposed to work.

The whole scheme must prevent autoimmune clones.  It seems memory-cell precursors go through a stage in which interaction with an antigen in the absence of a helper T cell makes them tolerant[16].

Remarkably, antigen-induced proliferation of B-cells at another site, the periarteriolar lymphocyte sheath-associated foci, was not associated with somatic hypermutation[17].  NDT offers no analogy in which mutated whole genomes within centimeters of each other respond totally differently to an identical external stimulus.

The germinal centers undergo a series of distinguishable stages[12] culminating in high-affinity memory cells, which then recirculate in anticipation of encountering the same antigen again later.  The mutated B-cell can carry out its function because it is transported to a microenvironment with many prepared components, which can select subsequent mutant offspring to optimize the antibody-antigen fit.  In a literal sense, the problem to be solved has been anticipated and a complex machinery prepared in advance to solve it.  In NDT a full organism's survival depends on many challenges, and no anticipation of a future problem exists to permit a particular germline mutation to be selected for.

To ensure resources are not wasted, another feature is observed in B-cell maturation:

(ix) A process of suicide (apoptosis) is observed for the less effective B-cells.
Now, it would be wasteful if B-cells with less precise antibody structures were allowed to compete for space and nutrients in the germinal centers.  A high death rate among germinal center cells has been reported[13, p. 19].  B-cells have an in-built mechanism which leads to suicide if their surface antibodies are no longer stimulated by interaction with antigen[16],[13].  Upon apoptosis they are taken up by macrophages[15].  The mutant variants which are stimulated by antigens appear to use surface receptors not directly part of the immunoglobulin receptor complex[15] to prevent apoptosis.  One sees again that the process does not resemble NDT, but is carefully guided.

NDT relies on a “struggle for survival”.  Under NDT, if every organism not optimally matched to any single one of the many survival challenges faced concurrently in some microenvironment were to commit suicide, the population would soon die out.  Multiple nutritional and predator/prey challenges are faced simultaneously by whole genomes.  But in immunology this is exactly what occurs.  The rate of cell suicide is carefully regulated according to whether the cell is in the pre-B-cell stage or in the germinal center.  Many complex components guide this process.  For example, the bcl-xL transgene has been shown to suppress apoptosis in germinal center B-cells[18].  In the words of those researchers,

Thus, the GC [germinal center] response appears to be regulated by factors beyond affinity-driven competition and selective apoptosis.  The rise and fall of GCs depend on the presence of antigen, sustained cell-cell interactions, and cues for cellular location.  It is not surprising that this important immunological response is controlled by finer means than that afforded by Darwinian competition alone.[18, p. 407]

Not only is negative selection observed; positive selection also appears to occur:

(x) Cells with suitable antibodies are caused to proliferate more rapidly.
Antigen binding causes B-cell proliferation[5, p. 73].  Some of these cells differentiate into plasma cells, which secrete large quantities of antibody.  The others are retained to generate secondary immune responses if needed in the future.  After a small number of B-cells are organized to form a germinal center, these reproduce more rapidly than the less specialized recirculating B-cells[15],[10, p. 580].  For example, centroblasts have a cell cycle time of merely 6-7 hours[12, p. 57].  This allows a greater number of opportunities, via hypermutation, to generate antibodies with better fits.  Isolating a subset of suitable B-cells dramatically narrows the range of AAs left to be fine-tuned by the subsequent hypermutation equipment.

No such mechanism is available for NDT.  On the contrary, for prokaryotic cells (bacteria) to evolve into multicellular organisms the opposite must be explained.  Increased complexity means such genomes must compete against the simpler variants, which every step of the way would reproduce more quickly[3].  Smaller genomes have measurably shorter DNA replication times[3], and their demands for energy and nutrients to survive are smaller.

(xi) The hypermutations are controlled.
(See the Appendix for more details.)  The variable regions of the light and heavy chains are thought to undergo mutations on the order of 10⁻³ per base pair per cell generation[6], about a million times faster than observed in germline cells (which are what is relevant for NDT).  In fact, in memory B-cells from a human donor a mutation rate as high as 5 × 10⁻² in Ig heavy-chain genes has been reported[19].  These hypermutations lead on average to at least one mutation per cell division[17], subject to intense scrutiny for suitability based on a single criterion.  As with the shotgun pellets, where each mutation ends up might seem in a superficial sense “random”.  Nevertheless, the range of behavior as a whole is restricted by design to enhance the chances of satisfying a specific task.
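These rates can be put together in a rough consistency check.  Assuming a rearranged variable region on the order of 700 bp (a figure assumed here for illustration, not taken from the references):

(10⁻³ mutations/bp) × (~700 bp variable region) ≈ 0.7 mutations per cell division,

which is of the same order as the reported figure of at least one mutation per cell division[17].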

High mutation rates are also found in a few-hundred-bp-long untranslated region surrounding the V-J and V-D-J segments.  Most of the mutations are limited to about 110 AA positions in the variable portions of the light and heavy chains of the antibody.  Within these regions most of the mutations are concentrated in three small hypervariable regions[6, p. 1217].  The hypervariable sequences consist of about 25 of the 110 variable-region AAs of the light chain and about 30 of the 120 variable-region AAs of the heavy chain[20].  These are precisely the locations, after protein folding, of the loops in contact with the antigenic determinants[6, p. 1219].  Thus, more than simply very strong selection of random mutations in the antibody-binding region is responsible.  An unusual, well-controlled mutation-inducing process is involved[10, p. 580].

This is an important difference from NDT, since B-cell hypermutations to a large extent only change the length and AAs of three loops which are held in place by the rest of the protein.  The overall three-dimensional folded structure is not disturbed[6, p. 1219].  Random NDT mutations, however, must follow some feasible path in which the intermediate genes produce polypeptides able to fold consistently into a biologically useful protein.

Since the highest rates of mutation are found in the CDRs, these hot spots are not useful as analogs for truly random mutations able to produce macroevolutionary changes.

Exactly how the signal that initiates hypermutation in these B-cells at the correct point in time works is not yet known in detail (Appendix), although it seems to be accepted that enzymes exist which direct these hypermutations to the CDR regions[12, p. 58],[21].  It has been reported[21, p. 1725] that V(D)J genes do not mutate at these high frequencies in tissue-culture cell lines, implying additional factors and enzymes are required to fine-tune antibodies in vivo.

No such guidance is available to NDT-type mutations to attain a single-criterion goal for which many cellular components are waiting to help.

Discussion

What we observe is an example of a complex machinery capable of adapting to solve a class of problem: recognition of foreign antigens to protect the vertebrate creature.  A vast number of B-cell variants are generated within a single host organism.  These have been designed not to damage self tissue.

Once an antibody interacts with an antigen, a range of processes immediately comes into play.  The cells whose antibodies bind most strongly reproduce more rapidly.  A specialized microenvironment is organized to separate the best variants and associate them with the antigen.  Then a hypermutation process is initiated, targeted to a very narrow portion of the gene, to fine-tune the binding affinity.

Strong selection based on a single criterion is accompanied by deliberate apoptosis.  The whole scheme virtually guarantees the intended goal of destroying the undesired cells or bacteria within a short period of time.  It is carefully orchestrated to act at the correct time and place without damaging the organism's own tissue.  Only somatic mutations are involved, and the following (“host”) generations retain the original germline DNA.  Only in a superficial sense can such a mutation and selection process be called random, contrary to what is assumed by NDT.

The details are relevant in determining the suitability as an analogy for NDT processes.  It is not helpful to argue that mutations have now been shown able to generate something biologically useful.  Creationists do not argue that current genomes are perfect.  They were better in the past.  Mutations damaged function, and in principle a fortunate mutation could undo the damage later.  The fact that function-improving, information-adding examples are not available simply reflects the low probability of this occurring very often.  But the sheer statistical improbability of producing new, complex multi-gene features by random mutations in the germline, as claimed by NDT, with relevant mutational rates and no prepared guidance to support useful variants, is an entirely different matter.

If we set aside the fact that germline mutations are not involved, we could ask whether the data are not better suited to an entirely different analogy.  Generating B-cell variants by combining gene elements produces many cells which are tested in several microenvironments.  The information to produce these variants was present in a stem cell ancestor.  Various lineages could develop, each tuned to a different microenvironment (antigen).  The fine-tuning occurs very rapidly, unlike NDT proposals, and the range of variability each lineage can now generate is narrower than that of the stem cell ancestor.

This resembles, if anything, the creationist proposal that the original Biblical 'kinds' had a wider range of genetic potential and that their offspring could adapt according to the particular ecological niche.  The genetic information has become fragmented in subsequent lineages, leading to dogs, wolves, dingoes and foxes, all descendants of a forefather possessing a more “pluripotential” genome.  The reasonableness of this proposal has been tested empirically[8].  In Berlin, a wolf and a (large!) poodle were mated.  Both first-generation offspring looked the same, and rather nondescript.  But the second generation produced four dramatically different pups with clear trait mixtures of their grandparents, consistent with Mendelian genetics.  One looked clearly ‘wolf’ (with all his grandmother’s killer instincts!) and one clearly ‘poodle’.

Once the genes for multiple traits are present, they can lead in a very short time to seemingly new characteristics (curly or straight hair; droopy or pointed ears; glinting eyes; black, white, gray or mixed fur; etc., in the case of the above breeding experiment, even though both first-generation parents appeared indistinguishable).  Like B-cell maturation, specialization could have occurred very rapidly, by combination of pre-existing genes or gene elements, and not through random NDT-like mechanisms.

Summary

It should be apparent that B-cell affinity maturation is not performed by “random” mutations, and that the selection process bears no resemblance to how whole-genome Darwinian macroevolution is supposed to have happened.  The necessary information to solve a specific class of problem, the binding of antibody to antigen, was already deliberately prepared for the organism to be protected.  This is an example of an intelligently designed scheme to converge on a solution by carefully generating and identifying constrained change.

Truly random mutations would occur all over the germline DNA.  They would not be focused where they can only improve matters.  The proportion of DNA base pair sequences which improve or generate brand-new functions is far lower than the proportion which worsen or destroy biological functions.  Survival and reproduction are very stochastic even for genomes which are in some sense better.  The net effect of random mutations cannot be to generate vast ensembles of genes tuned to work together; to produce complex bacteria from simpler ones, multicellular organisms from single cells, and complex vertebrate life forms from simple multicellular ones.

B-cell maturation cannot serve as an analogy to make Darwinian macroevolution appear more plausible.

Appendix

Additional details about the regulation of B-cell hypermutations.

As already mentioned, the highest mutational rates occur in the complementarity-determining regions (CDRs)[17], and antibodies with lower affinity are quickly eliminated by design.

NDT claims random mutations may occasionally perform a useful task.  In the case of B-cells, cellular equipment actually appears to guide the generation of mutations and to direct them to specific locations.  Rather than being spread randomly over the genome, there appear to be hot spots[21] preferentially targeted by the hypermutation process.  Mutations seem to occur there independently[22, p. 481],[22, p. 486] in different lines.

These hot spots may be due partially to misalignment of the template during DNA replication, given the presence of direct or inverted repeats within the same gene[22, p. 481].  This produces a mismatch with the complementary strand which the DNA repair mechanisms recognize, leading to mutations.  A high proportion of cytosine-to-thymine mutations has been observed, which can result from deamination of methylated cytosine bases.  Possibly specific methylation is targeted to certain positions to accelerate mutational rates[22, p. 486].

Not only does there seem to be a propensity for certain positions to mutate, but also for specific kinds of substitutions to occur[22, p. 486].

An enhancer element, positioned to guide where the mutations are to be concentrated, is necessary for full activation of the somatic hypermutation mechanism[15].  All Ig transgenes which mutate have an enhancer.  Deletion of the enhancer impairs somatic mutation, which suggests initiation of Ig transcription is necessary for hypermutation.  In addition, a κ promoter artificially placed upstream of the constant region was shown to accelerate mutations in the constant region[19].

Although all the regulatory details remain to be elucidated, it is observed[21] that somatic hypermutations are limited to rearranged immunoglobulin V, D and J gene elements which have combined to form new genes.  Only then does the signal become active.  The hypermutation mechanisms seem to be regulated according to the B-cell development stage[21, p. 545],[22, p. 475],[7, p. 1155].  Once a B-cell differentiates into an antibody-secreting plasma cell, the somatic mutations seem to be turned off[7, p. 1155].  Finally, the germinal centers seem to disappear with the release of stable, high-affinity memory cells[21, p. 546].

All this seems sensible, since continued high rates of mutation would destroy many cells with high-affinity antibodies[7, p. 1155].

Noteworthy is the fact that rapid somatic mutations are not observed in the T cell receptor, although it is built from genetic elements similar to those of the B-cell antibodies[7].


References

[1] Yockey, H.P., Information Theory and Molecular Biology, Cambridge University Press, Great Britain, 1992.

[2] Scherer, S. and Loewe, L., Evolution als Schöpfung? [Evolution as Creation?]  In: Weingartner, P. (Ed.), Ein Streitgespräch zwischen Philosophen, Theologen und Naturwissenschaftlern [A Debate between Philosophers, Theologians and Natural Scientists], Verlag W. Kohlhammer, Stuttgart; Berlin; Köln, pp. 160-186, 2001.

[3] Truman, R. and Heisig, M., TJ 15(3):115, 2001.

[4] Max, E. E., “The Evolution of Improved Fitness”, http://www.talkorigins.org/faqs/fitness.html (accessed September 1, 2001); Spetner, L., “Lee Spetner/Edward Max Dialogue”, http://www.trueorigin.org/spetner2.php (accessed 7 October 2001).

[5] Karp, G., Cell and Molecular Biology, 2nd Ed., John Wiley & Sons, Inc., 1999.

[6] Alberts, B., Bray, D., Lewis, J., Raff, M., Roberts, K. and Watson, J. D., Molecular Biology of the Cell, 3rd Ed., Garland Publishing, 1994.

[7] French, D. L., Laskov, R. and Scharff, M. D., Science 244:1152, 1989.

[8] Junker, R. and Scherer, S., Evolution: Ein kritisches Lehrbuch [Evolution: A Critical Textbook], Weyel Lehrmittelverlag, Gießen, Germany, 4th edition, 1998, p. 39.

[9] Schatz, D. G., Oettinger, M. A. and Schlissel, M. S., Annu. Rev. Immunol. 10:359, 1992.

[10] Tonegawa, S., Nature 302:575, 1983.

[11] Oettinger, M. A., Schatz, D. G., Gorka, C. and Baltimore, D., Science 248:1517, 1990.

[12] Gray, D., Annu. Rev. Immunol. 11:49, 1993.

[13] Liu, Y.-J., Johnson, G. D., Gordon, J. and MacLennan, I. C. M., Immunology Today 13(1):17, 1992.

[14] Kocks, C. and Rajewsky, K., Annu. Rev. Immunol. 7:53, 1989.

[15] Küppers, R., Zhao, M., Hansmann, M.-L. and Rajewsky, K., The EMBO Journal 12:4955, 1993.

[16] Nossal, G. J. V., Cell 68:1, 1992.

[17] Jacob, J., Kelsoe, G., Rajewsky, K. and Weiss, U., Nature 354:389, 1991.

[18] Takahashi, Y., Cerasoli, D. M., Dal Porto, J. M., Shimoda, M., Freund, R., Fang, W., Telander, D. G., Malvey, E.-N., Mueller, D. L., Behrens, T. W. and Kelsoe, G., J. Exp. Med. 190(3):399, 1999.

[19] Shen, H. M., Peters, A., Baron, B., Zhu, X. and Storb, U., Science 280:1750, 1998.

[20] Capra, J. D. and Edmundson, A. B., Sci. Am. 236(1):50, 1977.

[21] Lebecque, S. G. and Gearhart, P. J., J. Exp. Med. 172:1717, 1990.

[22] Levy, S., Mendel, E., Kon, S., Avnur, Z. and Levy, R., J. Exp. Med. 168:475, 1988.

[23] p = 3/64 ≈ 0.0469 is the probability that a random coding-region point mutation produces a Stop codon.  Coding base pairs: 20,000 genes × 900 bp/gene = 1.8 × 10⁷ bp.  Mutation rate: 10⁻⁶ per bp per cell division.  Expected number of Stops per cell division = 0.0469 × 1.8 × 10⁷ × 10⁻⁶ ≈ 0.84.  From the binomial distribution (Poisson approximation), p(x = 0 Stops) ≈ exp(−0.84) ≈ 0.43 per division.

