(Not so) Junk DNA

Discussion in 'Biology & Genetics' started by Eflex tha Vybe Scientist, Oct 22, 2003.

  1. sideshowbob Sorry, wrong number. Valued Senior Member

    The Junk DNA deniers remind me of hoarders: Everything in the house is so vitally important that nothing can be thrown away. Somewhere in that pile of old newspapers there's a valuable article that they vaguely remember wanting to reread.

    Never mind that evolution predicts junk DNA. Reality predicts redundancy.

    Everything doesn't have to have "a purpose". Everything doesn't have to be a cog in a wheel. The universe is more impressive than that.
  3. Eflex tha Vybe Scientist Registered Senior Member

    lol @ Junk Deniers. You make it sound like a cult. I'm simply of the opinion that we need more comprehensive studies of the proteome/genome before we can confidently call any DNA junk. "Junk" implies that it has no use whatsoever, and it seems a bit arrogant at this early stage in genomics to dismiss something as useless.

    At first the claim was 97% junk DNA; now it has shrunk to 50-75%. In time, as our understanding of DNA evolves and grows, I predict that the percentage of DNA deemed "junk" will continue to shrink. It's really quite simple: we don't know enough about the human genome/proteome to call anything junk. Thank goodness for peer review, so that we can jettison these overly simplified terms (like "junk DNA" or "planetary nebulae").
  5. ElectricFetus Sanity going, going, gone Valued Senior Member

    And I agree, almost all of it has one function or another.
    Structural: a large percentage of the genome consists of repeated sequences, and many of these regions have a known structural function, like the telomeres and centromeres.
    Regulatory: as this and other studies suggest, a large percentage may have a loose regulatory function, controlling the activity of genes by acting as distant promoter and repressor sites.
    RNA-only genes: genes that do not produce RNA for protein translation, but instead produce RNA strands with regulatory roles and, in some cases, enzymatic activity.
    Archaic genes: of course, a very small percentage also consists of genes that clearly once had full protein-producing function in our ancestors but no longer do, genes for thick body hair for example, where in some rare cases a back mutation reactivates them, causing werewolves... aaah, I mean hypertrichosis.
  7. Jens Registered Member

    To me this article (free!) was very helpful on this topic:

    It has become very fashionable to study long noncoding RNA (ncRNA). Given that only 1%–2% of the mammalian genome carries protein-coding information, genomicists have oft pondered the relevance of the remaining 98% which, as a whole, bear little phylogenetic conservation among any except for the most closely related species. It is by now clear, however, that the vast intergenic space is anything but quiescent. An astounding 70%–90% of our nucleotides are apparently transcribed (Kapranov et al. 2007b; Mercer et al. 2009).
    [End Quote]

    The article explains how one of the two X chromosomes is inactivated.
    If this is a typical case, then a lot of the "junk" DNA is actually transcribed and serves a function. But sometimes the effect is length-dependent, meaning that the exact sequence does not matter, only how long it is. So deleting such a region could cause severe damage, even though its precise sequence is not conserved.
  8. Jens Registered Member

    I agree with Eflex tha Vybe Scientist.

    also to consider:
    we just found out that the chromosomes in the nucleus have a specific spatial arrangement (depending on cell type!) and are not just randomly distributed. So timing effects probably do matter. Bacteria do not have this ordering and also do not have much "junk DNA". (There was a cover article on this in "Spektrum der Wissenschaft", the German edition of "Scientific American".)
    ...and in fact malignant cancers are identified by their multiple chromosome aberrations (so losing this order seems to have effects)...
  9. sideshowbob Sorry, wrong number. Valued Senior Member

    Yes, I was referring to the cult of creationism.

    "Junk" can be used as a paperweight or a doorstop. Having A use is not the same as having a "designed purpose". As I said, we would expect evolution to leave some odds and ends lying around, maybe doing something but not particularly essential to anything. That qualifies as junk in my dictionary.
  10. rpenner Fully Wired Valued Senior Member

  11. river-wind Valued Senior Member

  12. Eflex tha Vybe Scientist Registered Senior Member

  13. Eflex tha Vybe Scientist Registered Senior Member

    Long stretches of DNA previously dismissed as "junk" are in fact crucial to the way our genome works, an international team of researchers said on Wednesday.

    It is the most significant shift in scientists' understanding of the way our DNA operates since the sequencing of the human genome in 2000, when it was discovered that our bodies are built and controlled by far fewer genes than expected. Now the next generation of geneticists have updated that picture.
  14. rpenner Fully Wired Valued Senior Member


    "Despite redundancy in the genetic code, the choice of codons used is highly biased in some proteins, suggesting that additional constraints operate in certain protein-coding regions of the genome. This suggests that the preference for particular codons, and therefore amino acids in specific regions of the protein, is often determined by factors unrelated to protein structure or function. On page 1367 in this issue, Stergachis et al. reveal that transcription factors bind within protein-coding regions (in addition to nearby noncoding regions) in a large number of human genes. Thus, a transcription factor “binding code” may influence codon choice and, consequently, protein evolution. This “binding” code joins other “regulatory” codes that govern chromatin organization, enhancers, mRNA structure, mRNA splicing, microRNA target sites, translational efficiency, and cotranslational folding, all of which have been proposed to constrain codon choice, and thus protein evolution (see the figure). "


    "Genomes contain both a genetic code specifying amino acids and a regulatory code specifying transcription factor (TF) recognition sequences. We used genomic deoxyribonuclease I footprinting to map nucleotide resolution TF occupancy across the human exome in 81 diverse cell types. We found that ~15% of human codons are dual-use codons (“duons”) that simultaneously specify both amino acids and TF recognition sites. Duons are highly conserved and have shaped protein evolution, and TF-imposed constraint appears to be a major driver of codon usage bias. Conversely, the regulatory code has been selectively depleted of TFs that recognize stop codons. More than 17% of single-nucleotide variants within duons directly alter TF binding. Pervasive dual encoding of amino acid and regulatory information appears to be a fundamental feature of genome evolution."
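    The "duon" idea in the abstract above, the same bases read simultaneously as codons and as a transcription-factor recognition site, can be made concrete with a toy Python sketch. The codon subset, the 9-bp sequence, and the 5-bp motif below are all made up for illustration; the motif is not a real TF site.

```python
# Toy illustration of a "duon": one DNA stretch read two ways at once,
# as protein-coding codons and as a (hypothetical) TF recognition site.

CODON_TABLE = {"ATG": "M", "GAT": "D", "GCA": "A"}  # tiny subset of the genetic code

def translate(seq):
    """Translate an in-frame DNA string into one-letter amino acids."""
    return "".join(CODON_TABLE.get(seq[i:i + 3], "?")
                   for i in range(0, len(seq) - 2, 3))

def find_motif(seq, motif):
    """Return every offset (in any reading frame) where motif occurs."""
    return [i for i in range(len(seq) - len(motif) + 1)
            if seq[i:i + len(motif)] == motif]

coding = "ATGGATGCA"   # three codons: ATG-GAT-GCA
motif = "GGATG"        # hypothetical TF recognition sequence

print(translate(coding))          # -> MDA
print(find_motif(coding, motif))  # -> [2], the motif straddles all three codons
```

    Changing the base at offset 4, say, would simultaneously alter the GAT codon and break the motif match. That double constraint on single nucleotides is exactly what the paper argues shapes codon usage bias.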

    Main takeaway: DNA is a mess. Past successes constrain future development into a hodge-podge of "good-enough" solutions and accidental piles of complexity that just manage to get the job done, with no evidence of a plan or central organizing principle.
    Except that by going back to the ENCODE results of last year, you ignore that they vastly overstated their findings, and so you have lost the thread.


    "The Encyclopedia of DNA Elements (ENCODE) project suggested in September 2012 that over 80% of DNA in the human genome "serves some purpose, biochemically speaking". This conclusion however is strongly criticized by other scientists. The general consensus among knowledgeable scientists is that a large percentage of the human genome is junk DNA. Naturally, this junk DNA is all noncoding DNA but that does not mean that all noncoding DNA is junk.
    The term "junk DNA" became popular in the 1960s. It was formalized in 1972 by Susumu Ohno, who noted that the mutational load from deleterious mutations placed an upper limit on the number of functional loci that could be expected given a typical mutation rate. Ohno predicted that mammal genomes could not have more than 30,000 loci under selection before the "cost" from the mutational load would cause an inescapable decline in fitness, and eventually extinction. This prediction remains robust, with the human genome containing approximately 20,000 genes. Another source for Ohno's theory was the observation that even closely related species can have widely (orders-of-magnitude) different genome sizes, which had been dubbed the C value paradox in 1971."
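    Ohno's upper-limit argument in the quote above can be put in back-of-envelope form. A minimal sketch, assuming a round per-locus deleterious mutation rate of 1e-5 per generation (an illustrative figure, not a measured value) and the classic multiplicative-load approximation w = exp(-U):

```python
import math

# Back-of-envelope version of Ohno's mutational-load argument.
# The per-locus deleterious mutation rate u is an assumed round
# number for illustration, not a measured value.

u = 1e-5  # assumed deleterious mutations per locus per generation

def mean_fitness(n_loci, rate=u):
    """Equilibrium mean fitness under the multiplicative load
    approximation w = exp(-U), with genomic rate U = n_loci * rate."""
    return math.exp(-n_loci * rate)

for n in (20_000, 30_000, 300_000, 3_000_000):
    print(f"{n:>9} loci under selection -> mean fitness {mean_fitness(n):.3f}")
```

    At roughly 30,000 loci the load is tolerable (mean fitness around 0.74), but if millions of loci across the genome were under selection, mean fitness would collapse toward zero. That is the sense in which the load argument predicts that most of a large genome cannot be under sequence-level selection.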
