Posted by: aspidoscelis at Sat Nov 11 14:11:36 2006
The short version:
Whenever a molecular clock is used, it needs to be calibrated for the taxa being used. There is no standard rate that can simply be assumed. This means you need fossils that can be used to date nodes on the tree, to give you a calibration that can then be extended to areas of the tree for which we have no fossils. However: fossils are in short supply and accurately placing a fossil on a tree can be problematic. Fossil dates are also often misinterpreted; for instance, some researchers make simple mistakes like assuming that the age of the oldest fossil of genus X is the date of the origin of genus X, when it is really only the minimum possible age of the genus. Generally, establishing maximum possible ages, or exact dates of origin for lineages, is difficult or impossible (you can't use a sparse fossil record to establish that a genus *wasn't* around), and as a result molecular clock estimates are generally minimum possible ages.
Nodes on a tree are also sometimes dated from geological or geographic events; for instance, if we have two sister genera separated by a mountain range, we can hypothesize that they diverged at about the time the mountain range rose. This presents its own problems, but may be useful nonetheless. We also know that rates of molecular evolution vary not only between but also within lineages; we have no good way of eliminating this as a confounding factor in molecular clock estimates, short of including larger numbers of dated nodes (of course, if we had a lot of dates on the nodes to start with, we wouldn't need the molecular clock estimates!).
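The point about rate variation can be illustrated with the same arithmetic: given several independently dated nodes (fossil or geological, all hypothetical here), each yields its own rate estimate, and the spread among them shows how far the data depart from a strict clock.

```python
# Hypothetical calibrations: (label, pairwise distance in subs/site, age in Ma).
calibrations = [
    ("fossil A", 0.04, 10.0),
    ("fossil B", 0.10, 20.0),
    ("uplift C", 0.03, 5.0),   # geological calibration: mountain uplift
]

# Per-lineage rate at each calibrated node (distance spans two branches).
rates = [dist / (2.0 * age) for _, dist, age in calibrations]

mean = sum(rates) / len(rates)
spread = max(rates) - min(rates)
print(f"mean rate = {mean:.4f} subs/site/Ma, spread = {spread:.4f}")
```

A large spread relative to the mean is a warning that a single clock rate, extended across the whole tree, will misdate some nodes badly.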
Patrick Alexander