The Law of Entropy

Discussion in 'Psychology' started by andrasnm, Feb 28, 2006.

  1. I may cause a bit of friction here, but does anyone else happen to think that the second law of thermodynamics is a crock of shit?

    I mean, theoretically the concept of a closed system is simple, but in reality there is no such thing as a closed system.

    Every system within the universe is open to other systems within the universe, constantly sharing energy. The universe is the only closed system, right?

    Yet the energy within the universe is incomprehensibly massive.

    The second law of thermodynamics was a good idea at the time, but it's way past its use-by date.

    The only reason it hasn't been replaced yet is that no one has come up with anything more elegant.

    There are plenty of real-life examples of events that defy the second law of thermodynamics, like the 600-odd successful cold fusion experiments conducted since 1989.

    Closed system - excess energy. Not possible apparently. All the result of scientific error. Yeah right.

    Sorry. Didn't mean to get too deep.

    Runningbear
     
    #21     Mar 1, 2006
  2. RB, never forget: but for Lord Kelvin you would be drinking your beer warm. (The second law is what makes refrigeration possible.) Or was it the quest for cold beer that led to the discovery of the second law? It's been 43 years since I hooked thermo, so I forget. And if you don't believe in closed systems, you have never seen my ex-wife's thighs. No amount of energy input was sufficient to raise the temperature enough to increase the entropy there.
     
    #22     Mar 1, 2006
  3. Your definition of the 2nd law is contrary to what thermodynamics professors say.

    The amount of energy is constant, i.e., the first law of thermo.

    Put simply, and that is difficult to do, the 2nd law says that energy flows from concentrated to diffuse, i.e., a glass of hot water in a cool room will cool, not get hotter.
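    That hot-glass-in-a-cool-room picture can be sketched in a few lines of Python (my own toy illustration with made-up numbers, not anything from secondlaw.com):

```python
# Model the glass and the room as two lumped heat capacities that
# exchange heat until they share one temperature. Energy is conserved
# (1st law); the 2nd law only fixes the *direction* of the flow.

def equilibrate(t_hot, t_cold, c_hot, c_cold):
    """Final common temperature after the two bodies equilibrate."""
    return (c_hot * t_hot + c_cold * t_cold) / (c_hot + c_cold)

# A small glass of hot water (360 K) in a much larger cool room (295 K):
t_final = equilibrate(t_hot=360.0, t_cold=295.0, c_hot=1.0, c_cold=50.0)
print(t_final)  # about 296.3 K: the glass cools toward room temperature
```

    The final temperature always lands between the two starting temperatures; heat never flows so as to make the hot glass hotter.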

    See: http://www.secondlaw.com/
     
    #23     Mar 1, 2006
  4. Enginer

    I have to explain this to non-engineers all the time. Professors just confuse 'em.

    Think of a closed bottle. It contains some matter that is hot, and some that is cold. Relative to a heat engine in the bottle, we normally think of the "hot" matter as being more able to make the heat engine do work. (Forget the fact that the cold material is a "sink" and can force work also by extracting heat through the engine...)

    Therefore we say the "heat" is of a higher quality than the average, or (unfairly) the cold. Its entropy is >lower<.

    Now we let the contents come to an equilibrium, 'average' temperature. We define this as higher entropy, less ability to do work.
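    To put a number on "equilibrium means higher entropy," here is a hedged sketch (my own example, not Enginer's): for two equal heat capacities C at temperatures T_h and T_c, the final temperature is T_f = (T_h + T_c)/2, and the standard textbook result for the total entropy change is dS = C * ln(T_f^2 / (T_h * T_c)), which is positive whenever T_h differs from T_c.

```python
import math

def entropy_change(t_h, t_c, c=1.0):
    """Total entropy change when two equal heat capacities equilibrate.

    Standard result: dS = C * ln(T_f**2 / (T_h * T_c)),
    with T_f = (T_h + T_c) / 2. Temperatures in kelvin.
    """
    t_f = (t_h + t_c) / 2.0
    return c * math.log(t_f * t_f / (t_h * t_c))

print(entropy_change(400.0, 300.0))  # positive: equilibrating raises entropy
print(entropy_change(350.0, 350.0))  # zero: already at equilibrium
```

    The bigger the initial temperature gap, the bigger the entropy jump on equilibration - and the more work you threw away by just letting the contents mix.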

    Likewise we >can< define a work of art or a patent as having lower entropy than a pile of rust or paper dust. We can even say that 'evil' is the friend of entropy, and 'love' is the enemy of entropy.

    It's all semantics. Use it as a concept, a tool, or hate it.
     
    #24     Mar 1, 2006
  5. My thermo prof used to use the coffee mugs we brought to class as an example: the 2nd law states that the hot mug gets cooler, not vice versa. Extending the law to heat engines, etc. gets a little more complicated.

    DS
     
    #25     Mar 1, 2006
  6. nitro

    I think you mean S = K*(T^-1), or equivalently S = K / T

    nitro
     
    #26     Mar 9, 2006
  7. nitro

    It is very difficult to apply a rigorous mathematical entity like entropy to animate objects, because entropy is only rigorously defined for systems that are in [near] equilibrium. So if you have a room full of people, and 1/4 of them are insane and the other 3/4 of them are geniuses, and you try to measure the entropy of this system (based on some measure of, say, the complexity of the conversations being held), you get gibberish. Only for systems where the "temperature" differentials are small across the entire system does it make sense to measure its entropy, i.e., systems that are close to being in equilibrium.

    So a system not only has to be closed, it has to be near equilibrium. There are ways to fake the result when the system is not in equilibrium, but then you would have to know how to partition the space into sub-domains where the temperature differentials are small (group the insane people with the insane...), treating each as an "isolated" system, and then take entropy measures within each subspace. Then add at the boundaries...

    "Utter chaos" is a bad term - "unable to do work" is the proper one.

    nitro
     
    #27     Mar 9, 2006
  8. LowRisk

    Energy in a CLOSED system has to remain constant, as defined by the 1st law of thermo. Entropy by definition is the randomness of a system (open or closed). The entropy of a closed system always increases if left alone. The odds of a closed system remaining orderly are nil, because orderliness is just one state out of the effectively infinite number of permutations of states that we collectively define as randomness. Thus, to say the entropy of a closed system always increases is to state the obvious. Strictly speaking, it's a theorem, not a law or a theory. Tell others you heard it first at Elitetrader from a nerd daytrader who took thermodynamics in college just for fun. And yes, I aced it.
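    LowRisk's counting argument can be sketched in a few lines (my own coin-flip illustration, not anything from that thermo course): the "perfectly ordered" macrostate is a single arrangement, while the disordered ones own nearly all the permutations.

```python
from math import comb  # exact binomial coefficients (Python 3.8+)

N = 50                      # 50 coins, 2**N equally likely arrangements
total = 2 ** N
ordered = comb(N, N)        # "all heads": exactly one arrangement
half = comb(N, N // 2)      # "25 of 50 heads": about 1.26e14 arrangements

print(ordered / total)      # vanishingly small
print(half / total)         # roughly 0.11
```

    A closed system shuffled at random lands in a disorderly macrostate simply because there are overwhelmingly more ways to be disorderly - which is the counting behind "entropy always increases."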
     
    #28     Mar 9, 2006
  9. bellman

    Okay, finally someone who actually understands the term entropy! Most of the other posts belong in the jokes thread.

     
    #29     Mar 10, 2006
  10. Agreed. By the way, Heinz von Foerster, who is a genius physicist, describes organic self-regulating systems - like humans, for example - as entropy retarders.
    And entropy itself is a misnomer, because the physicists who coined it were too dumb to translate a simple term into plain Greek and mixed it up. The intended meaning was "inability to change" (in the transitive sense, not the intransitive... For all science-major analphabets: the transitive meaning of "inability to change" translates into "inability to perform work").

    It has NOTHING, absolutely NOTHING to do with energy. Even though it is permissible to say that the entropy of energy is increasing.
     
    #30     Mar 10, 2006