There are certain words that can embellish any speech or quotation. Entropy is one of them: we find it in the unintelligible phrases of some famous spiritual guru, and even in self-help advice and motivational coaching. And, of course, we all know what entropy means: disorder. If we don’t get our house in order, we are told, entropy will eat us alive. Except that, in reality, it won’t: physicists are constantly explaining that no, entropy does not mean disorder. And yet, with this approximate meaning, it has almost become part of the lexicon of everyday life. But what does entropy actually mean?

When the Prussian physicist Rudolf Clausius defined entropy in 1865, the idea of disorder was nowhere to be found. Clausius was seeking to explain mathematically the workings of energy in the Carnot heat engine, an optimised model of the heat engine (on which the original diesel engine was based) proposed four decades earlier by the French engineer Sadi Carnot. To express the unusable heat lost, he defined entropy (etymologically, a transformation of energy content), which measures how spontaneously a hot body gives up heat to a cold body as the system tends to equilibrium, unless interfered with to prevent it. This is why “entropy in a thermodynamic sense is an energy divided by a temperature,” chemist-physicist Emil Roduner, professor emeritus at the University of Stuttgart (Germany), summarises to OpenMind. This spontaneous behaviour of a system is the basic foundation of the second law of thermodynamics, as intuited by Clausius years before his definition of entropy.

In coining the term, the physicist ended his work by summarising the first two laws of thermodynamics in this way: “The energy of the universe is constant,” and “The entropy of the universe tends to a maximum.” But Clausius’ choice of term has not made the concept any easier to understand. As physicist and Nobel laureate Leon Cooper wrote in 1968, by choosing the term entropy “rather than extracting a name from the body of the current language (say: lost heat), he succeeded in coining a word that meant the same thing to everybody: nothing.” In 1904, the mathematician and thermodynamicist George H. Bryan wrote in Nature that entropy is “that most difficult of all physical conceptions.”

The formulation of entropy

Years after Clausius’ definition, the Austrian physicist and philosopher Ludwig Boltzmann introduced the current formulation of entropy, giving it a statistical meaning that relates the microstates of matter (atoms, molecules) to the macrostate (observable) of the system. Boltzmann’s definition refers to the statistical measure of disorder, understood as the probability distribution over the different possible microstates. “Disorder is technically wrong,” theoretical physicist Peter Watson, professor emeritus at Carleton University (Canada), tells OpenMind, “but the idea is correct, and if you replace it by probability, I don’t have any issue,” he adds.

Watson gives an example: there are six atoms of gas in a room. It is unlikely that all six will be on the same side, a low-entropy situation; a distribution of three on each side is more probable. This tendency to equilibrium is related to the arrow of time, a concept associated with entropy because it deals with irreversible processes that move in only one direction in time. This is one of the reasons why, according to Watson, we probably cannot travel back in time, as this would violate the second law of thermodynamics, the increase in entropy.

Theoretical physicist Dan Styer, of Oberlin College (Ohio, USA), offers another analogy that helps to dismantle the idea of disorder: a bottle of Italian salad dressing has separate, highly ordered layers of oil and vinegar, yet it is in thermal equilibrium, in a state of maximum entropy. Styer prefers another word to explain entropy: freedom. A club (macrostate) with more permissive rules than another allows its members (microstates) a greater variety of choices. Microstates, Styer clarifies, do not have entropy, only macrostates. “If there are more microstates corresponding to the macrostate, the macrostate has a larger entropy,” he summarises to OpenMind. So a club granting its members more freedom is analogous to a macrostate with higher entropy.
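Watson’s six-atom example can be made concrete with a little counting: each macrostate (“how many atoms are in the left half of the room”) corresponds to a number of microstates (which particular atoms are there), and Boltzmann’s formula S = k ln W assigns the largest entropy to the macrostate with the most microstates. A minimal sketch in Python — the variable names are illustrative, not from the article:

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

N = 6                       # six gas atoms, as in Watson's example
total_microstates = 2 ** N  # each atom is in the left or right half

for left in range(N + 1):
    W = comb(N, left)           # microstates for this macrostate
    p = W / total_microstates   # probability of the macrostate
    S = K_B * log(W)            # Boltzmann entropy: S = k ln W
    print(f"{left} left / {N - left} right: W = {W:2d}, p = {p:.4f}")
```

The even 3–3 split has W = 20 microstates (probability 20/64 ≈ 0.31), while all six atoms on one side has only W = 1 (probability 1/64 ≈ 0.016) — twenty times less likely, which is exactly why the gas spontaneously tends toward the high-entropy, evenly spread macrostate.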
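Roduner’s remark that thermodynamic entropy is “an energy divided by a temperature” can be illustrated with Clausius’ own quantity, ΔS = Q/T: when a hot body gives heat to a cold one, the cold body gains more entropy than the hot body loses, so the total entropy grows — the spontaneous, one-way behaviour the second law describes. A sketch with made-up numbers (the temperatures and heat value are illustrative only):

```python
# Clausius entropy change: Delta S = Q / T for heat Q exchanged at temperature T
Q = 100.0       # joules of heat flowing from hot body to cold body (illustrative)
T_hot = 400.0   # temperature of the hot body, in kelvin (illustrative)
T_cold = 300.0  # temperature of the cold body, in kelvin (illustrative)

dS_hot = -Q / T_hot    # hot body loses entropy: -0.25 J/K
dS_cold = Q / T_cold   # cold body gains entropy: +0.33 J/K
dS_total = dS_hot + dS_cold

print(f"Total entropy change: {dS_total:+.4f} J/K")  # positive, as the second law requires
```

Reversing the flow (heat passing spontaneously from cold to hot) would make this total negative, which is precisely what the second law forbids.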