Entropy vs. Anti-Entropy (How DNA Defeats the Blackhole)

Discussion in 'Alternative Theories' started by tonylang, Jan 28, 2015.

Thread Status:
Not open for further replies.
  1. brucep Valued Senior Member

    Messages:
    4,098
    It's my experience that the 'bong' shouldn't be blamed for poor cognitive skills. We fail to develop them long before the 'bong' enters the picture. I'd speculate that a majority of the intellectually dishonest cranks posting on the Internet don't have much bong experience.
     
  2. tonylang Registered Member

    Messages:
    67
    True to every age of human understanding is our certainty in what we think we know: the use of prevailing accepted knowledge to defend the status quo, such as it is. But we know so little.

    Clearly, life doesn't violate any of the laws of nature; otherwise there would be no life. It isn't easy for me to know how to respond, or indeed whether I should respond, to some of these arguments. Ironically, it is a bit like arguing with religiously minded people. Only in this case the religion is a tiny bit of accepted scientific understanding: the conviction that what is known is sufficiently complete and true that it may explain all else. This is perhaps as it should be, up to a point.

    As for entropy and anti-entropy, some of you must know that these concepts have their roots in information theory, and it should be no surprise that all which exists is information. The thermodynamics we experience is essentially a macroscopic emergent phenomenon with a very limited scope, certainly useful to us in all that we do, but one which quickly becomes less significant at the sub-molecular levels of nature. It is this information regime that constitutes the software of nature. In fact, this information regime carries all the way down to the Planck scale, which, in nature, is where the tire meets the road, so to speak.
     
  3. origin Heading towards oblivion Valued Senior Member

    Messages:
    11,890
    No one is defending the status quo. New theories are not only encouraged and welcomed but are the backbone of science. But ideas without evidence or rigor are tossed on the trash heap with the other failed ideas.

    Was there someone with you when you were typing this?

    Oh my gosh, I just heard the territorial cry of the Greater Crested Pseudoscience bird.
     
  4. wellwisher Banned

    Messages:
    5,160
    What is in a name? Whether you call it anti-entropy or negative entropy, it will still measure the same. When we freeze water into ice, the entropy decreases. Below are some entropy data for water. Entropy, no matter how you wish to define it, has measurable values that are fixed for substances at given conditions. Ice Ih at 0 °C has a negative entropy relative to liquid water. This is not a big deal. Those who don't understand entropy might think negative or fixed entropy is magic.

    Ice Ih: 3.408 J mol⁻¹ K⁻¹ (0 K)
    Ice Ih: -21.99 J mol⁻¹ K⁻¹ (0 °C, IAPWS; relative to liquid water)
    63.45 J mol⁻¹ K⁻¹ (absolute entropy at the triple point)
    869 J mol⁻¹ K⁻¹ (25 °C)
    1832 J mol⁻¹ K⁻¹ (100 °C, 101.325 kPa)

    Notice that these phase conditions of water have exact entropy values. Phase entropy does not increase with time. The entropy of ice Ih at 0 °C has always been that amount, even before the dinosaurs. These are milestones for material entropy, not subject to change. Proteins form exact folds in water; their entropy is fixed. In other solvents, proteins do not have fixed entropy.
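
    As a numerical sanity check on that -21.99 figure: for a reversible phase change, the entropy change is just the latent heat over the transition temperature. A minimal Python sketch, assuming the standard enthalpy of fusion of water (about 6007 J/mol):

        # Entropy change when one mole of water freezes at its melting point.
        # dS = q_rev / T for a reversible phase change; values are assumed standards.
        H_FUS = 6007.0   # J/mol, enthalpy of fusion of water (approx.)
        T_M = 273.15     # K, normal melting point

        dS_freeze = -H_FUS / T_M
        print(f"dS(freezing) = {dS_freeze:.2f} J/(mol K)")  # ~ -21.99

    This reproduces the tabulated offset of ice below liquid water at 0 °C.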

    Water forms hydrogen bonds, which have both polar and covalent bonding character. These are close in energy and can shift back and forth. This transition in the hydrogen bond can act as a binary switch, with the water able to switch its hydrogen bonding between higher- and lower-entropy states without the hydrogen bond ever breaking. Since entropy needs energy to increase, switching to a higher-entropy state becomes an energy sink. If we switch back and entropy falls, energy is released.

    Water is important to life not only because of the water-oil effects that can force phase entropy down so water can lower its free energy; water can also switch between higher- and lower-entropy bonding states, forcing the organics to lower entropy into order or to increase entropy into a transition state.

    Where the consensus theory of life goes wrong is in assuming any solvent will work; therefore they assume you can ignore the solvent and therefore ignore water. This big mistake focuses the mind on the organics alone, as though they float in a vacuum and water is nothing but a bath rather than an entropy controller. The idea that life can emerge from only random interaction of organics is unlikely. The organics need water to regulate entropy at various phase milestones (protein folds). The protein will not do this on its own; it needs water to give it that unique push and pull.
     
    Last edited: Feb 9, 2015
  5. origin Heading towards oblivion Valued Senior Member

    Messages:
    11,890
    You really should try to understand entropy. You say absurd things like 'entropy needs energy'. Entropy is energy; it doesn't need energy. It is the energy that is unavailable to be converted into work.

    Like I said before, entropy increases and decreases all of the time in open systems. One of the first things you learn in chemical engineering thermodynamics is the use of a Mollier diagram for water and steam, which plots enthalpy vs. entropy. Notice that depending on what part of the steam cycle you are in, the entropy will increase or decrease; this is because a steam cycle is an open system.
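
    For illustration, here is a crude sketch of that entropy bookkeeping along a steam cycle, using constant-pressure approximations and assumed textbook values rather than real steam tables:

        import math

        # Entropy of the water rises in the boiler and falls in the condenser.
        CP_LIQ = 75.3     # J/(mol K), heat capacity of liquid water (approx.)
        H_VAP = 40660.0   # J/mol, enthalpy of vaporization at 373.15 K (approx.)

        dS_heat = CP_LIQ * math.log(373.15 / 298.15)  # boiler: heating the liquid
        dS_boil = H_VAP / 373.15                      # boiler: evaporation
        dS_cond = -H_VAP / 373.15                     # condenser: heat leaves the water

        print(f"heating:    {dS_heat:+.1f} J/(mol K)")
        print(f"boiling:    {dS_boil:+.1f} J/(mol K)")
        print(f"condensing: {dS_cond:+.1f} J/(mol K)")  # negative: entropy falls here

    The decrease in the condenser is allowed because the heat, and its entropy, leaves with the cooling water; the system is open.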

    I object to the term anti-entropy for the same reason that I object to the term anti-energy: it is meaningless. I also object to negative entropy for the same reason I object to negative energy: again, it is meaningless and only adds to confusion.

    Since entropy is the energy that is unavailable to be used for work, what do you suppose negative entropy is: energy that can be turned into work??
     
    wellwisher likes this.
  6. tonylang Registered Member

    Messages:
    67
  7. exchemist Valued Senior Member

    Messages:
    12,544
    Well, thanks for the reference. This indeed shows that a couple of mathematical researchers had a go at defining anti-entropy back in 1969. However, there is no clue in this paper as to what they think it can be used for, i.e. what value it adds to science. It appears to be a mathematical curiosity of geometric programming rather than a thermodynamic concept with any demonstrated utility.

    You keep mentioning entropy and anti-entropy as if they are a pair of equivalent concepts. I repeat, entropy is a well defined and fundamental thermodynamic concept, whereas anti-entropy is not and is simply not a term used by science. There is no equivalence between the two whatsoever.

    I bet you had to really scour the internet to find this reference, didn't you? Do you know what it means? I bet you don't. Whereas if you look up entropy, you will find pages and pages on this subject. If you want to use the term anti-entropy, I think you need to show first how it has proved itself useful in science in some way. Frankly I very much doubt that it has.
     
  8. wellwisher Banned

    Messages:
    5,160
    One needs energy to increase the entropy. This helps change the state. An entropy increase cannot occur without energy available. As you pointed out, entropy is often associated with the waste heat due to inefficiency; this energy output makes it possible for the entropy to increase. Look at the units of entropy in the water data above: joules per mole per kelvin. This is not exactly energy, due to the K term.

    The Gibbs free energy equation is G = H - TS. G and H are types of energy per mole. The T is temperature, which is where the kelvin in the entropy units comes from. The energy used to increase entropy comes from the free energy G or the enthalpy H. I am not defining entropy in terms of a concept, because this debate then becomes like discussing a political issue; I am sticking to hard measurements and the units. From these one can infer the relationship of energy to entropy. If there is no energy, G or H, then TS equals zero. Entropy is a state variable, in that it characterizes a given state of a system. The folded enzyme is a given state.
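
    To make the G = H - TS bookkeeping concrete, here is a minimal sketch (assumed standard values, with ΔH and ΔS treated as temperature-independent) of why ice melts above 273.15 K and not below: the sign of ΔG = ΔH - TΔS flips at the melting point.

        # Melting of ice: dG = dH - T*dS, treating dH and dS as constants (approx.).
        DH_FUS = 6007.0   # J/mol, enthalpy of fusion (assumed)
        DS_FUS = 21.99    # J/(mol K), entropy of fusion (assumed)

        for T in (263.15, 273.15, 283.15):  # -10 C, 0 C, +10 C
            dG = DH_FUS - T * DS_FUS
            state = "melts" if dG < -1 else ("equilibrium" if abs(dG) <= 1 else "stays frozen")
            print(f"T = {T:.2f} K: dG = {dG:+7.1f} J/mol -> {state}")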

    I was thinking about negative entropy and the idea of anti-entropy. Anti-entropy is not a normal science term; it is a newly coined phrase for a special case of negative entropy. As an example of this special case, say we have information, like science data, within the brain of a person. Say this information becomes changed due to entropy. However, the loss of the original order of the data is not noise but creates an innovation. This is still entropy, compared to the original data, due to the change of state and an energy input. However, now the order is better than the original. The way you infer whether the entropy goes negative is to look to see if the result of the change is an output of energy. We will not try to add energy to change the new innovation back to the original data; that would increase entropy. Rather, the excitement of the innovation and the drive to implement it will release energy.

    This is an aspect of entropy that is rarely discussed by information theory, because in information theory there are no intelligent computers which can randomize data into higher order: negative entropy induction into a stable key state that releases the Kraken. What the tradition mostly discusses is information degrading into lower quality, i.e. loss. Life is the same way, due to water, in that water forces order so that a useless protein information state becomes the basis for a reaction.
     
    Last edited: Feb 10, 2015
  9. origin Heading towards oblivion Valued Senior Member

    Messages:
    11,890
    That shows you still do not get it. Entropy is energy. You are saying you need energy to increase the waste energy.

    Joules, get it? Energy.

    As is S, which is energy per mole per degree!

    No, no, no. The entropy is already present in the system; it doesn't come from the GFE or the H! The Gibbs free energy is equal to the total energy of the system (H) minus the unusable energy of the system (TS).

    edit to add: I noticed you liked my post where I said you did not know what you were talking about. That seems strange; if it was a mistake you can unlike the post to reverse it.
     
  10. wellwisher Banned

    Messages:
    5,160
    Entropy = joules/mole/K, while energy = joules/mole. Can you see the unit difference (K)?

    As another example:

    Velocity = d/t, while acceleration = d/t/t; one extra unit makes a difference.

    These are not the same thing, even if both involve distance and time. The entropy of the universe has to increase, but energy is conserved. Their behavior is different. If you lump these together as one, you will be lost.
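
    The units argument can be made mechanical. A minimal sketch (a toy helper written for this post, not a real units library) that tracks dimensions as exponent tuples:

        # Track dimensions as (mass, length, time, temperature, amount) exponents.
        JOULE = (1, 2, -2, 0, 0)    # kg m^2 s^-2
        KELVIN = (0, 0, 0, 1, 0)
        MOLE = (0, 0, 0, 0, 1)

        def per(dim, *divisors):
            """Divide one dimension by others (subtract exponents)."""
            out = list(dim)
            for d in divisors:
                out = [a - b for a, b in zip(out, d)]
            return tuple(out)

        energy_per_mole = per(JOULE, MOLE)          # J/mol
        molar_entropy = per(JOULE, MOLE, KELVIN)    # J/(mol K)
        print(energy_per_mole == molar_entropy)     # False: the K exponent differs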
     
    exchemist likes this.
  11. origin Heading towards oblivion Valued Senior Member

    Messages:
    11,890
    No. Energy has the units of joules.
    Joules/mole is the amount of energy per mole.

    Can you see the unit difference (mole)?

    Entropy is the energy that cannot be converted into work or useful energy. The units of entropy are energy/mass/T. As the temperature is raised the entropy decreases because more energy is available to be converted to work.

    The easiest way to look at this is that there is a certain amount of energy in a process, whether that energy is chemical, thermal, mechanical or other. The useful energy available for the process is not 100%; the 'left over' energy is entropy. The reason that entropy always increases in a closed system is because no process is 100% efficient.
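
    As a concrete version of the 'left over energy' point, a minimal sketch assuming an ideal (Carnot-limited) engine between two stated reservoir temperatures:

        # Even an ideal heat engine cannot convert all heat into work.
        T_HOT, T_COLD = 600.0, 300.0   # K, assumed reservoir temperatures
        Q_IN = 1000.0                  # J drawn from the hot reservoir

        eta_max = 1.0 - T_COLD / T_HOT   # Carnot efficiency: 0.5 here
        w_max = eta_max * Q_IN           # at most 500 J of work
        q_waste = Q_IN - w_max           # at least 500 J rejected as heat

        # Entropy bookkeeping: what the hot reservoir loses, the cold one gains.
        # The total change is zero only at this reversible limit; any real
        # (irreversible) engine rejects more heat, so the total comes out positive.
        dS_hot = -Q_IN / T_HOT
        dS_cold = q_waste / T_COLD
        print(w_max, q_waste, dS_hot + dS_cold)   # 500.0 500.0 0.0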
     
    Last edited: Feb 10, 2015
  12. origin Heading towards oblivion Valued Senior Member

    Messages:
    11,890
    That makes no sense.
    The person who thought that idea up expended energy. The process was not 100% efficient, so entropy increased. No negative entropy, or even anti-entropy.

     
  13. exchemist Valued Senior Member

    Messages:
    12,544
    Origin, I have to say this is not quite my understanding. If entropy were simply leftover unavailable energy, it would have the units of energy, which, as Wellwisher says, it does not. I would agree it is a measure of the unavailability of energy to do work. But it is a quantity in its own right, not merely a subcategory of energy, surely? Sideshowbob seemed to me to capture it well in saying it was a measure of the degree of dissipation of energy. In other words, it is about how energy is distributed (e.g. S = k ln W). Isn't it?
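
    The S = k ln W picture is easy to demonstrate numerically. A minimal sketch counting the microstates of N coin flips, as a stand-in for a two-state system:

        import math

        # Boltzmann entropy S = k ln W for a toy two-state system of N 'coins'.
        K_B = 1.380649e-23   # J/K (exact, 2019 SI definition)
        N = 100

        # W = number of microstates with n heads; the even split dominates.
        for n in (0, 25, 50):
            W = math.comb(N, n)
            S = K_B * math.log(W)
            print(f"{n:>3} heads: W = {W:.3e}, S = {S:.3e} J/K")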

    (But I do recognise entropy is one of those tantalising concepts that is notoriously hard to capture accurately in words.)
     
  14. origin Heading towards oblivion Valued Senior Member

    Messages:
    11,890
    You are right, and I am simplifying the concept and losing some precision in doing that. I thought Sideshowbob's definition was really good. Wellwisher's ideas, such as 'anti-entropy' or 'entropy is the fifth force', are so off the mark that I sort of lose my mind in frustration.

    Entropy is a quantity in its own right and as such can be calculated for a system and has a real effect. But I also think of it as a subcategory of energy (entropy doesn't need energy, as WW thinks), because every process has a portion of its energy that will be 'lost'.

    The internal energy is defined as:

    U = TS - pV + G

    So this gives us the total energy of the material or process. The TS energy term is clearly dependent on the temperature. As the temperature goes up, the energy lost due to entropy increases. At absolute zero, the energy from the entropy term goes to zero. So I suppose negative entropy could be the entropy at less than absolute zero.
     
  15. arfa brane call me arf Valued Senior Member

    Messages:
    7,832
    Entropy has a much tighter definition these days in terms of information. The entropy of a closed system is equal to the amount of information in a description of that system's state.

    Which is the number of bits in the description. Thermodynamic entropy, however, has units of joules per kelvin. Temperature is the average energy per particle at equilibrium, so you're dividing total energy by average energy per particle, leaving... particle (!) times an energy ratio (which should be dimensionless).
    How much information, or what kind of algorithm, describes a particle with an average energy? Is there an algorithmic "temperature"?
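
    For the bit-counting side, here is a minimal sketch of Shannon entropy, H = -Σ p·log2(p), over some assumed distributions:

        import math

        def shannon_bits(probs):
            """Shannon entropy in bits: H = -sum(p * log2 p)."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        print(shannon_bits([0.5, 0.5]))   # 1.0 bit: a fair coin
        print(shannon_bits([0.9, 0.1]))   # ~0.47 bits: a biased coin
        print(shannon_bits([0.25] * 4))   # 2.0 bits: four equal outcomes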
     
  16. tonylang Registered Member

    Messages:
    67
    No one ought to consider themselves a person of science unless and until they themselves have conceived, or are in the process of conceiving, at least one original scientific idea.

    My leap is to suggest that what dominates in nature's software domain is the density of natural complexity of life, just as what dominates in nature's hardware domain is the hardware density of non-life. It is with this hardware density that we have been distracted for the entire duration of our scientific activity. Hardware density, or the lack thereof, is what defines our science, from the least dense particles of the standard model to the most dense black holes. Then there is life; at our local level it is the other state of nature. But what aspect of life and black holes influences the underlying software of nature? What footprint of each of these is left in the quantum states of nature's software ocean? I see evidence that the quantum profile of life is as immense in nature as is that of non-life.
     
  17. paddoboy Valued Senior Member

    Messages:
    27,543

    No, that's wrong. Not everyone will achieve greatness; that's just crazy to even contemplate....
     
  18. exchemist Valued Senior Member

    Messages:
    12,544
    I think you are speaking of "information entropy", aren't you?

    This is a concept that has become a lot more prominent since I left university, so I'm not at all an expert on it, but I have the distinct impression that information entropy cannot simply be equated with the original thermodynamic entropy. (See this Wiki article for example: http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory )
    If they are indeed not the same, then it seems to me there may not really be an answer to your question. But I'd be interested in the view of physicists.
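
    One well-established point of contact between the two is Landauer's bound: erasing one bit of information at temperature T must dissipate at least kT ln 2 of heat. A minimal sketch, assuming room temperature:

        import math

        # Landauer's bound: minimum heat dissipated to erase one bit at temperature T.
        K_B = 1.380649e-23   # J/K, Boltzmann constant
        T = 300.0            # K, assumed room temperature

        e_bit = K_B * T * math.log(2)
        print(f"{e_bit:.2e} J per bit erased")   # ~ 2.87e-21 J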
     
  19. exchemist Valued Senior Member

    Messages:
    12,544
    I'm sorry, Tony, but you are lapsing into appalling woo here. What on earth is meant by "the quantum states of nature's software ocean"?

    I continue to think you have become mesmerised by some illusory notion of balance, between animate and inanimate matter. Sort of Yin/Yang, yer know, man(?)

    It is far from clear what you mean by nature's "software" and "hardware". Language like this suggests you think nature operates as a programmable calculating machine and that this "software" is the program. Do you actually have any evidence for such an idea?
     
  20. wellwisher Banned

    Messages:
    5,160
    I am defining entropy in terms of reality, which is based on open systems. This is not the same as closed systems. The closed system is used as an introduction to entropy; in a closed system, the entropy has to increase. This example is used to prove the second law.

    An open system is different in that it can have some aspects that lower entropy while other aspects increase entropy, as long as the entropy of the entire system increases (second law). Open systems allow the transfer of energy, mass, momentum, etc., in and out of the system, which adds a wild card.

    In an open system, one part can lower entropy and another part can increase entropy, as long as the sum for the open system is positive. With information transmission, there is no such thing as a closed system, even in a single computer. Stability of information requires designing material states that are stable enough to handle low-level energy that might increase entropy, such as outside interference and heat. The signal itself may generate heat in the lines, so even that has to be addressed, for example with cooling.

    With proteins in water, the proteins fold into exact folds. These define a state of fixed entropy. The open system of the cell requires this protein stability so other information can be transferred without causing folding interference in the protein, i.e. without altering the entropy of the fold. The water shrouds the protein in a protective shell that also conducts information within the cell via hydrogen bonding. The hydrogen bonding of the water acts like a binary switch that can flip to change the entropy of the water shell, thereby changing the state of protein entropy; a reaction.

    If evolution occurred in a closed system, then entropy would need to increase, period. This is where the trial-and-error model comes from. If we use an open system, we can balance increases of entropy in one place in the cell with decreases of entropy in another, as long as the net is positive. This is where water comes in, since the continuum can switch between high and low states locally and in zones to make a balance that has bandwidth.
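
    The open-system bookkeeping can be written down directly, since the second law constrains only the total. A minimal sketch reusing the freezing example from earlier in the thread (assumed standard values): the water's entropy falls, but the surroundings gain more.

        # Second-law bookkeeping for a combined system: only the SUM must rise.
        H_FUS = 6007.0                 # J/mol, latent heat released on freezing (assumed)
        T_M, T_SURR = 273.15, 263.15   # K: water at 0 C freezing into -10 C surroundings

        dS_system = -H_FUS / T_M     # the water orders into ice: its entropy falls
        dS_surr = H_FUS / T_SURR     # the surroundings absorb the latent heat
        print(f"system: {dS_system:+.2f}, surroundings: {dS_surr:+.2f}, "
              f"total: {dS_system + dS_surr:+.2f} J/(mol K)")   # total > 0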
     
    Last edited: Feb 11, 2015