A team of international scientists launched an ambitious project on Thursday to genetically identify, or provide a barcode for, every plant and animal species on the planet. By taking a snippet of DNA from all the known species on Earth and linking them to photographs, descriptions and scientific information, the researchers plan to build the largest database of its kind.

From WordIQ.com, on the history of the barcode:
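A minimal sketch of the kind of lookup such a database would enable: a short standardized DNA snippet (for animals, typically a region of the COI gene) keyed to a species record with photo and description. The sequences, names, and field layout below are invented purely for illustration, not taken from the actual project.

```python
# Hypothetical barcode database: a short DNA snippet maps to a
# species record linking photographs, descriptions, and so on.
barcodes = {
    "ACGTTAGCCA": {
        "species": "Example species A",
        "photo": "species_a.jpg",
        "description": "Illustrative record only.",
    },
}

def identify(snippet):
    """Return the species record matching a sampled DNA snippet."""
    return barcodes.get(snippet, {"species": "unknown"})

print(identify("ACGTTAGCCA")["species"])   # Example species A
print(identify("TTTTTTTTTT")["species"])   # unknown
```

The point of the real database is the same as this toy: once each species has a unique short sequence, identification reduces to a key lookup.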
The idea for the barcode was developed by Norman Joseph Woodland and Bernard Silver. In 1948 they were graduate students at Drexel University. They developed the idea after hearing the president of a food sales company wish for a way to automate the checkout process. One of their first ideas was to use Morse code printed out and extended vertically, producing narrow and wide bars. Later, they switched to using a "bulls-eye" type barcode.

Let's see: coding applied to identifying items in a database. It's quite interesting that an entity the evolutionary paradigm regards as randomly generated and purposeless, such as DNA, could be utilized in such a manner.

----------

And then there's… Frozen Accidents (from September 1, 2004)

Back in May I wrote a post titled Ultra-conservative DNA, in which we see certain DNA sequences that supposedly evolved to a certain state (and, more importantly, function) and then stopped, or froze, in place. Per the August 3rd edition of Reasons to Believe's webcast Creation Update, we hear of a study titled Why nature chose A, C, G and U/T: An error-coding perspective of nucleotide alphabet composition. From the study's abstract:
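Woodland and Silver's first idea can be sketched in a few lines: stretch Morse code vertically so a dot becomes a narrow bar and a dash a wide one. The bar widths and the tiny Morse table below are illustrative, not their actual scheme.

```python
# Sketch of the "Morse code extended vertically" idea:
# dot -> narrow bar, dash -> wide bar.
MORSE = {"A": ".-", "B": "-...", "C": "-.-."}  # small sample table

def to_bars(text):
    """Render Morse symbols as narrow (|) and wide (|||) bars."""
    bars = []
    for ch in text.upper():
        for symbol in MORSE[ch]:
            bars.append("|" if symbol == "." else "|||")
    return " ".join(bars)

print(to_bars("ab"))  # A = .-  then  B = -...
```

Run on "ab" this prints `| ||| ||| | | |`: the narrow/wide pattern a scanner could read back into dots and dashes.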
The question of whether the size and make-up of the natural nucleotide alphabet is a consequence of selection pressure, or simply a frozen accident, is one of the fundamental questions of biology. Nucleotide replication is essentially an information transmission phenomenon, and so it seems reasonable to explore the issue from the perspective of theoretical computer science, and of error-coding theory in particular. In this analysis it is shown that the essential recognition features of nucleotides may be naturally expressed as 4-digit binary numbers, capturing the hydrogen acceptor/donor patterns (3-bits) and the purine/pyrimidine feature (1-bit). Optimal alphabets consist of nucleotides in which the purine/pyrimidine feature is related to the acceptor/donor pattern as a parity bit. Numerically interpreted, such alphabets correspond to parity check codes, simple but effective error-resistant structures. The natural alphabet appears to be an adaptation of one of two optimal solutions, constrained to its present size and composition by a combination of chemical and coding-theory factors. (emphasis added)

Given that the evolutionary paradigm posits natural selection as a blind and unguided process, it is no wonder that potential plateaus in the process are defined as frozen accidents. It's also interesting that the mechanism being addressed, that of parity check codes, is the product of intelligent action. Hardly an accident. If nucleotide replication is essentially information transfer, and if theoretical computer science and error-coding theory allow us to analyze the parity check codes contained within the nucleotide alphabet, what could be driving the conclusion that the entire process was driven by determinism and chance? From the Christian's perspective, God created mankind in His image. One of the many implications of such a doctrine is that God has endowed mankind with creative ability inasmuch as God expresses His creative ability.
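The parity-check structure the abstract describes can be sketched directly: each symbol is a 4-bit word, three bits for the hydrogen donor/acceptor pattern plus one purine/pyrimidine bit acting as an even-parity check. The specific bit assignments below are my illustration of the coding principle, not the paper's actual nucleotide mappings.

```python
# Sketch of a parity-check alphabet: 3 donor/acceptor bits plus a
# fourth purine/pyrimidine bit chosen as an even-parity bit.

def parity_bit(pattern):
    """Even-parity bit for a 3-bit donor/acceptor pattern."""
    return sum(pattern) % 2

# Build all eight 4-bit codewords; every codeword has even weight.
alphabet = {}
for p in range(8):
    bits = [(p >> i) & 1 for i in (2, 1, 0)]
    alphabet[tuple(bits)] = bits + [parity_bit(bits)]

def is_valid(word):
    """Any single bit flip breaks even parity, so it is detectable."""
    return sum(word) % 2 == 0

word = alphabet[(1, 0, 1)]   # a valid codeword: [1, 0, 1, 0]
corrupted = word.copy()
corrupted[1] ^= 1            # flip one bit ("transmission error")
print(is_valid(word), is_valid(corrupted))  # True False
```

This is the simplest error-resistant structure in coding theory: a single-parity-check code detects any one-bit error, which is the property the study attributes to the natural nucleotide alphabet.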
The pre-existence of information, alphabets, parity check codes, and the like, should not be surprising: one would expect the God of the Bible to express His creative ability in forms that mankind could not only recognize, but also have the ability to develop.

----------

As well as… On plans, templates, and similarities (from April 21, 2004)

Over at The Panda's Thumb we see a post that highlights a study done on limb loss in vertebrates. John Lynch states:
An interesting article in this week's edition of Nature suggests that, at least in some fish, alterations in a single gene bring about evolutionary change in the form of limb (fin) loss.

Two follow-up posts on TPT, each by P. Z. Myers, can be found here and here. In the first follow-up P. Z. states:
Some of the complicating features of developmental genetics are pleiotropy and multigenic effects: that is, that the genes required to build an organism are all tangled together in an intricate web, with multiple genes required to properly assemble each character (that's the multigenic part), and each gene having multiple effects on multiple characters (that's pleiotropy). One might think of the organism as a house of cards, each card supporting all of the cards above it, so that tinkering with any one piece leads to catastrophic collapse. This isn't the case, of course. While developing systems are all elaborately interlocked, they also exhibit modularity and surprisingly robust flexibility.

In the second follow-up P. Z. states, with regard to the idea that the modularity and robust flexibility of a system could be used as evidence of design:
Quite the contrary, I see evidence of mechanisms that permit integrated evolution of organisms, with no designer required.

He provides more detail via his own blog, Pharyngula, in Development. Evolution. Genes. Fish. What's not to like?, which includes an illustrative image. Essentially, what we're hearing is that the integrated complexity found within the genetic structure of these species was achieved through blind chance because... well... they're here, aren't they? Isn't it amazing how nature has solved the problem of spitting out either limbs or fins - all with the flip of a switch!

Yet imagine the power of templates. Imagine the efficiency of using a plan that allows minor alterations to garner major changes. Imagine a set of instructions, a code - if you will - that allows one to step through a financial accounting program and, depending on the desired outcome, run a report of actual cost expenditures by region vs. a report of revenue by project. Shucks, I don't have to imagine it at all - I ran a set of those reports today on a piece of software designed by semi-intelligent people.

On the morphological side, consider the skeletal and muscular structure of the human arm and hand. Now note a robotic arm and hand that mimics the same functional capabilities as its human counterpart. As the website for the Shadow Robot Company states, "The human hand has twenty-four powered movements. Shadow have implemented every single one, with all the power and range of movement, that the human hand has... The muscles in the upper arm and torso are analogous to the human's." Have the robotic designers used the basic structural and morphological elements of a human arm and hand as a guide for their design criteria?

How about a steam rotary engine? Consider the schematics for such a device. Cross-referencing with electrical rotors, we find the following from Penntex: a rotor and a stator.
Cross-referencing with pump rotors, we find at Seepex pumps: a universal joint. Common sense tells us that there are similarities in these various human designs because they are all working off the same basic template (i.e., rotary motor design). Variances within the details are due to varying specific parameters with regard to design criteria as well as to function, materials, power supply, etc. Now compare these human artifacts with a flagellar motor, per the NCBI and ARN websites. How interesting that the components of the flagellum precisely match up with the designed components of rotary motors. Additionally, note that the work in robotics, as well as the development of rotary motors, did not occur through blind, random chance, but through intentional, rationalistic thought processes - i.e., through design.
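The "same basic template, varying parameters" point can be sketched as one template instantiated three ways: the steam engine, the alternator, and the pump all fill in the same rotor/stator slots with different parts. The class name, fields, and part descriptions below are hypothetical, chosen only to mirror the comparison above.

```python
from dataclasses import dataclass

@dataclass
class RotaryMotor:
    """One shared template: every rotary design fills the same slots."""
    rotor: str
    stator: str
    power_source: str

# Three instantiations of the one template, parameters varied.
steam_engine = RotaryMotor("piston-driven rotor", "cylinder housing", "steam")
alternator = RotaryMotor("wound rotor", "stator windings", "electricity")
pump = RotaryMotor("helical rotor", "pump stator with joint", "mechanical drive")

for motor in (steam_engine, alternator, pump):
    print(motor.rotor, "/", motor.power_source)
```

Minor alterations to the parameters yield major differences in the resulting machine, while the underlying plan stays fixed.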