Talk:Entropy/Archive 12



Reversible vs. irreversible

The article so far fails to distinguish clearly between entropy changes due to a reversible process and an irreversible process. The two are conceptually distinct, and the article should strive to provide an example of each so that the distinction is clear to the reader.--212.3.132.195 (talk) 13:35, 27 January 2013 (UTC)

I agree. There seem to be some common misconceptions about what a reversible process is, and why most processes are irreversible. There is nothing in the 2nd law that says a closed system cannot spontaneously run in reverse. All that is needed is to add all of the work to the waste. However, the universe is supposed to be the only closed system. Therefore, it is entirely possible that the universe will, one day, stop expanding, collapse in on itself and --Bang-- form the universe all over again. However, the universe is simply not running in that direction right now. Within the universe, there are only open systems. For example, I can gather a certain amount of air and compress it. This will concentrate the heat energy, adiabatically raising its temperature. This is a reversal of entropy. However, it is only a small reversal in a very large river. In actuality, energy had to be added to the compressor through the motor, and the motor's electrical energy (no matter how it was made) came from the sun on its way to the universe. The compressor is simply a small reversal along its way. Some of the energy from the compressor cannot wait, and will disperse into the universe. Therefore, if you try to power the compressor by releasing the air into an air motor, you will never be able to pump the compressor up. Ultimately, the energy is on a one-way trip, and any reversal is simply like an eddy in a river, being small and having little effect on the overall flow. Zaereth (talk) 21:47, 29 January 2013 (UTC)

Restructure

There is a serious structural problem with this article. Most of the material in this article should be moved to Entropy (classical thermodynamics) and this article should then be redirected to Entropy (disambiguation). That is the only way to make plain the other subject areas of entropy from information theory, quantum mechanics, etc. No wonder this article has been so confused and has failed to find focus.--190.204.70.243 (talk) 07:18, 9 January 2013 (UTC)

I second that motion. siNkarma86—Expert Sectioneer of Wikipedia 18:01, 9 January 2013 (UTC)
For an article such as this, I would rather see it become a parent article rather than a simple DAB page. DABs are great for completely different subjects that simply share the same name, like Mercury (planet) and mercury (element). Here we have different aspects of the same subject. Personally, I think it's better to briefly summarize them in the parent article, and provide "main article links" to the subordinate articles. A similar example is the potential energy article, which is the parent article of gravitational potential energy, nuclear potential energy, and so on. I think this subject can benefit from having a similar parent article, provided we can make it much more readable. (Actually, I think this article could also use an introduction section, briefly summarizing the introduction to entropy article, as well.) Zaereth (talk) 18:29, 9 January 2013 (UTC)
A lot of other languages point to this page. Redirecting it might be disruptive. Almost all of the classical thermodynamics should be moved to Entropy (classical thermodynamics).--61.141.152.67 (talk) 05:03, 11 January 2013 (UTC)
I agree with Zaereth and above. Have this as an introductory article for entropy in its many forms, with links to specific subjects. Then go back and fix the links to this page if appropriate. PAR (talk) 07:55, 11 January 2013 (UTC)
Scholarpedia's article on Entropy is a good example of what an overall article introducing the many meanings of entropy could look like. Note that SP is licensed under a strict non-commercial license though, which is not compatible with WP, so we can't just copy the SP article over here either in whole or part.
Note that when this has been raised in the past, there has been a vocal lobby protesting that the primary meaning of entropy is the thermodynamic one, as e.g. often first encountered in high-school chemistry, or through discussions of the Second Law in mass-audience physics material. That in the past is what has led to this article having the scope it has (i.e. entropy in thermodynamics, with non-thermodynamic understandings of the term entropy handed off to other articles).
I'm not against re-scoping this article, but we should perhaps raise this to a formal RfC, give links to past discussions, and advertise it widely, including especially at WT:PHYSICS and WT:CHEMISTRY. Jheald (talk) 14:06, 11 January 2013 (UTC)
I've been giving this a lot of thought over the last few months. Perhaps a top-down approach is not the best way to tackle this. Maybe it would be better to try reverse-engineering, from the bottom up. The more I think about it, the more it seems like we need a fresh place to start, rather than trying to reword or clarify what is already there. I don't have much spare-time to actually sit down and work it all out at once. However, I've been thinking that, within the coming weeks, I would begin an introduction section in my sandbox. (Anyone who has seen my work knows I'm fond of intros.) Intros tend to give a little more latitude, rather than trying to cram it all in the lede, giving some room in which to tie it all together.
I like a good challenge, and this should surely present one. I am also excited because I see great potential for some serious collaboration here, and am happy that we have people representing all of the various aspects of entropy, such as Jheald, PAR, Count Iblis, Dolphin51... (Forgive me if I missed anyone.) I can't promise quick turn-around but, once I get something written, I hope I can get some input from everyone. Thanks. Zaereth (talk) 22:00, 11 January 2013 (UTC)
I've started working on an intro-section, at User:Zaereth/sandbox. If anyone is interested in leaving some feedback, or helping to correct any mistakes, it would be appreciated. Zaereth (talk) 23:54, 18 January 2013 (UTC)
The initial paragraphs of your introduction confuse entropy with heat capacity. Entropy is not a rate: it is an integral. Heat capacity = dQ/dT but entropy = integral dQ/T . I have removed your first two paragraphs, which introduce the confusion.--212.3.132.195 (talk) 13:31, 27 January 2013 (UTC)
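The distinction drawn here can be made concrete numerically. Below is a minimal sketch (the water sample and its constant heat capacity are illustrative assumptions, not part of the discussion above) showing that heat capacity is the ratio dQ/dT, while the entropy change is the integral of dQ/T, which for a constant heat capacity C works out to C·ln(T2/T1):

```python
import math

# Illustrative assumption: heat roughly 1 kg of water from 300 K to 310 K.
C = 4186.0          # total heat capacity of the sample, J/K (assumed constant)
T1, T2 = 300.0, 310.0

# Heat capacity relates heat to temperature change: dQ = C * dT
Q = C * (T2 - T1)   # total heat added, in joules

# Entropy change is the integral of dQ/T, not a simple ratio:
# dS = C * dT / T  =>  delta_S = C * ln(T2 / T1)
delta_S_exact = C * math.log(T2 / T1)

# Numerical integration of dQ/T over many small steps gives the same answer
n = 100_000
dT = (T2 - T1) / n
delta_S_numeric = sum(C * dT / (T1 + (i + 0.5) * dT) for i in range(n))

print(Q)                # 41860.0 J of heat added
print(delta_S_exact)    # about 137.3 J/K
print(delta_S_numeric)  # agrees with the closed form
```

The point of the sketch is that the entropy change depends on the absolute temperatures at which the heat is delivered, not just on how much heat is added; that is exactly what separates the integral of dQ/T from the derivative dQ/dT.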

(Undent)Well, I was trying to avoid that confusion. I think you're right, though, that it did introduce some confusion. Perhaps describing it as an interval variable rather than a ratio variable is better. (I was thinking that, because both joules and kelvin are ratio variables, entropy must be a ratio as well.) However, as the new addition is written, it doesn't make much sense, especially to a newcomer. As PAR mentioned above, saying that entropy is an "abstract function of state" doesn't really say anything, and I believe this only adds to the confusion. The first sentence there should concisely state exactly what the math says entropy is. The difficulty lies in making the correct translation.

Entropy is definitely not an abstract thing, but a measurable property of heat, which I was trying to define from the macroscopic, thermodynamic standpoint first, before getting into other forms. To see this, perhaps it would be helpful to point out the difference between entropy and heat capacity. Heat capacity is the amount of energy that needs to be added to a certain amount of something to change its entire temperature a single degree. For instance, it takes a certain amount of energy to raise the temperature of a gallon of water a single degree.

On the other hand, entropy is the amount of energy that must be added to something to change its temperature at the point of energy transfer only. Entropy does not deal with the heat capacity of the entire substance, but only with the energy needed to change (or "maintain" perhaps would be a better word) the temperature at the boundary where energy is being transferred.

In other words, as energy is added to the gallon of water, the temperature of the boundary does not change instantly. If it did, the energy and temperature would be equal, and the entropy would be nothing. Instead, if adding 1000 joules only increases the boundary temperature to 800 kelvin, then logic dictates that some of that energy is being used for something else. By dividing 1000 by 800, we get 1.25. If providing 800 degrees at the boundary is 100% of the energy needed to perform work, (in this example, performing work is simply heating the entire gallon one degree), then you will actually need to add 125% of the needed amount. The rest of that energy will not be used for work (temperature change), and will only be released as waste once the gallon of water cools. I think the main thing to understand is that entropy is not just something that occurs in machinery, but it occurs anytime heat transfers. Zaereth (talk) 01:01, 29 January 2013 (UTC)

"entropy is the amount of energy that must be added to something to change its temperature at the point of energy transfer only."
What you wrote makes no sense. The correct relation is that temperature is (proportional to) the amount of energy that must be added to change the entropy by a given amount.
"If it did, the energy and temperature would be equal, and the entropy would be nothing."
????? ? Entropy is not a form of energy. Nor is temperature.
You seem quite confused. Jheald (talk) 22:17, 29 January 2013 (UTC)
Ok, so the definition you're giving me for entropy is T = dQ/dS. I don't doubt that your math skills are better than mine, but what confuses me is how a definition for temperature can be a definition for entropy. Zaereth (talk) 22:36, 29 January 2013 (UTC)
I've been trying to think about a good analogy when talking about things like temperature, entropy, or energy on the macroscopic scale. One that comes to mind is electrical energy (watt-seconds). Electrical energy is defined by the parameters "power (W)," "amps (I)," and "volts (E)." The definition of each of these parameters is: Power = W=IE, Amps = I=W/E, and Volts = E=W/I. None of these parameters are the same things, but they are all necessary dimensions of something called watt-seconds or "electrical energy." Similarly, entropy, temperature, and thermal energy are all necessary parameters of something we, as a matter of convenience, call "heat." A quick look at a TS diagram can easily show the relationship. Zaereth (talk) 19:19, 30 January 2013 (UTC)
I'm not sure that's helpful (as well as being WP:OR). You can have entropy (and energy and temperature, for that matter) without having heat.
In response to your earlier comment, it's a change people come to as their understanding of the subject develops. So when people get their first thorough introduction to entropy -- e.g. perhaps in the context of the thermodynamics lectures of a first year undergraduate chemistry course -- then what seem natural are energy, and temperature that one can measure on a thermometer. These are the familiar safe everyday touchstones that one works out from; whereas entropy can seem nebulous, abstract, artificial -- a state function (as the syllabus may require one to prove), but one for which the meaning seems rather remote.
On the other hand, by the time one has moved on to statistical mechanics, entropy is something one can calculate, something that comes to seem completely instinctive (and also very fundamental); and then it is temperature which starts to seem the derived quantity, and in extreme cases not so easy to measure, unless one comes to define it as 1/(dS/dQ). Jheald (talk) 19:41, 30 January 2013 (UTC)
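Jheald's last point, that in statistical mechanics temperature becomes the derived quantity T = 1/(dS/dQ), can be illustrated with a short sketch. The gas parameters below are illustrative assumptions; the entropy function is the standard Sackur-Tetrode expression for a monatomic ideal gas, and temperature is recovered purely as the reciprocal slope of S(E):

```python
import math

# Physical constants (SI)
k = 1.380649e-23        # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J*s

def sackur_tetrode(E, N, V, m):
    """Entropy S(E) of a monatomic ideal gas (Sackur-Tetrode equation)."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * E / (3 * N * h**2))**1.5) + 2.5)

# Illustrative sample: 1e22 helium atoms in one litre
N = 1e22
V = 1e-3                 # volume, m^3
m = 6.64e-27             # mass of a helium atom, kg
E = 1.5 * N * k * 300.0  # internal energy chosen so the slope should give 300 K

# Temperature as the derived quantity: 1/T = dS/dE (central finite difference)
dE = E * 1e-6
dSdE = (sackur_tetrode(E + dE, N, V, m) - sackur_tetrode(E - dE, N, V, m)) / (2 * dE)
T = 1.0 / dSdE

print(T)   # approximately 300 K, recovered from the slope of S(E) alone
```

Nothing in the finite-difference step "knows" the temperature in advance: 300 K emerges as the reciprocal slope of the entropy function, matching the classical result E = (3/2)NkT.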
It seems like some of the later paragraphs of the introduction are not specifically about entropy. They mention entropy and talk about heat flow, but they do not prepare the reader to, for example, actually use or even become familiar with a TS diagram. Explaining entropy should go beyond just reiterating the second law of thermodynamics. It should familiarize the reader with what it is like to solve problems with real entropy data. Even a high school student can perform a calculation with a TS diagram if properly instructed.--86.96.65.146 (talk) 20:37, 30 January 2013 (UTC)
Yes, and you can have volts and amps without having power. Forgive me for the OR, but I wasn't aware that I needed to cite talk page discussions. For a reference, the book Kinetic Theory and Thermodynamics says, "The quantity Q/T is a definite thermal property of the working substance and is called Change in entropy.... The entropy of a substance is a real physical quantity like energy, pressure, temperature that can be measured in a laboratory.... From the above relation, we can say that the dimensions of heat energy are the same as that of the product of entropy and absolute temperature." For a ref about electrical energy, see the book Basic Electrical And Electronics Engineering. I agree with you on the statistical side of things, but I fear beginning with that approach will lead to a "cannot see the forest through the trees" syndrome. Zaereth (talk) 20:54, 30 January 2013 (UTC)
It is a real physical quantity, but when it is "measured in a laboratory", what happens is that heat and temperature are more directly observed and then entropy change is computed.--86.97.247.44 (talk) 00:36, 5 February 2013 (UTC)
I understand that. Those words are from R. K. Agrawal, the author of the book. Personally, however, I'm beginning to agree with Hans Fuchs (The Dynamics of Heat: A Unified Approach to Thermodynamics and Heat Transfer): The problem appears to be one of cognitive linguistics. On the one hand, there is the language of math, and I think we all have an intuitive understanding of that language, (whereas others may speak it quite fluently). On the other hand, there is English which is far more complex, containing a much broader and subtler variety of functions and relationships. The disjoint between math (thermodynamics in particular) and other forms of language seems to be due to the inability to translate an otherwise abstract math formula, and then project that translation metaphorically in standard cognitive imagery and standard cognitive dimensions. I understood this even before I read Fuchs' book, which is why I gave the above example about electricity (eg: amps and volts are easy to measure, but watts must be calculated). These same types of cognitive structures, expressions, and equations are found in all languages, and are almost always based on the relationship between at least three characteristics. (ie: This and that are those. These are those but not that, etc...) Personally, however, I've decided to just leave well enough alone until the linguistics is sorted out. My only suggestion is to include some real-world examples, which can be found in abundance, from cosmology to pattern recognition, if one really cares to look. Zaereth (talk) 04:10, 5 February 2013 (UTC)

(undent) Getting back to the topic of restructuring. It might be helpful to see what some of the other languages do on this topic.

But English currently has:

Perhaps we should restructure along the lines of the French and German.--190.73.248.92 (talk) 14:15, 2 February 2013 (UTC)

Most of the various other uses of "entropy" have been moved out of this article. The article is now focused better on just thermodynamic entropy. We may have to settle for that for now.--76.220.18.223 (talk) 22:54, 6 February 2013 (UTC)
Mostly very sound, I think.
But in my view the article should at least touch on the view that the entropy state function is simply the amount of information (in the Shannon sense) that would be needed to specify the full microstate of the system, that is left unspecified by the macroscopic description.
Many people find that this is the way they like to think about what entropy fundamentally is (although, equally, there are others that utterly dislike it). Jheald (talk) 23:10, 6 February 2013 (UTC)
On further inspection, that is still stated pretty much word-for-word in the lead; but perhaps there does need to be some more in the article to contextualise it. Jheald (talk) 23:13, 6 February 2013 (UTC)
Mostly restored the "information theory" section. Added your description as an introductory paragraph. It still seems a bit long.--200.246.13.126 (talk) 16:48, 7 February 2013 (UTC)

Specific entropy

I have removed the anchor for "specific entropy" and put a sentence defining it in the lead. I have also updated the redirect specific entropy to just point to this article with no anchor. The anchor I removed did not make sense and seems to have been added by someone just searching for the first occurrence of the phrase, which turned out to be inappropriate.--178.210.45.18 (talk) 18:11, 6 February 2013 (UTC)

Peer review?

So what else does this article need before it is ready for peer review? One thing that would be nice is if we could have just one section that deals with the classical and statistical descriptions.--200.165.161.254 (talk) 00:01, 8 February 2013 (UTC)

Merged the two sections after the "History" section. I also added a "Function of state" section under "definitions". Much better outline: now, each element of the material has a natural home. It is really how these concepts come together in one obscure function that makes entropy difficult to grasp.--189.51.14.154 (talk) 14:17, 8 February 2013 (UTC)
Another thing about the style of the article: it really should be about the math, i.e. geared for science and engineering students at university. That is the adult version of the article. Generalizations about entropy almost all belong in the section (or article) about the second law. So... what else does this article need before it might be considered ready for peer review?--77.122.164.30 (talk) 19:28, 11 February 2013 (UTC)
Um. Are you all the same person, that keeps changing IP (which seems to shift from continent to continent). Or were you three different people? It would be a bit helpful if you could get a user-id (or user-ids plural, if you're more than one of you), so the rest of us can know which of the edits were all made by the same guiding mind...
That said, I think the idea of peer review is a really good one, because no question this should be one of the most important articles under WP:PHYSICS, and in my view it simply hasn't been up to scratch.
I have to confess I haven't read through your edits in detail. But one key question that anyone taking on to reshape this article needs to come to a view about is what relationship it is going to have to all our other articles on entropy -- in particular to Entropy (classical thermodynamics), which I think is in reasonably good shape, in terms of being relatively accessible and not over-complicating things too early, but also covering a lot of good material. So wherever you take this article, it needs to have a distinctive role, that is different from any of the other articles we have on entropy. In the past, I think, it has been to outline the idea of entropy in classical thermodynamics, and also the idea of entropy in statistical thermodynamics, and to discuss how the two are compatible, while leaving quite a lot of each subject to be addressed in more depth by the more specific articles. So this article becomes a coat-stand, if you like, to hang various more detailed more specific articles off. I think that is how it has been in the past. But it might be objected that such a structure may not be perfect, because too many people may only get to this article, and not read things that would actually be valuable to them, because they never get to the sub-articles that would discuss them. But I think the real question is first to get clear to yourself what ideally you think the scope of this article should be for it to contain.
Assuming you get past that stage, something that almost all our Physics articles are very bad at compared to the rest of Wikipedia is referencing. (Perhaps because the ethos in Physics is so much to be able to work things through for yourself from first principles, if you really understand them.) For contrast, compare one of our few entries in Physics that is rated as having Featured Article status, namely Equipartition of energy, and see just how dense the referencing is. If you really want to take this article a step further, one of the things that could help would be to take some of the favourite classic textbooks on thermodynamics -- eg Reif, Kittel etc -- and add the relevant page references to each key idea or each section in this article. That also has the advantage of helping you to really scrutinise this article, and make sure that everything we say really does solidly reflect something presented in what we would consider an authoritative source. Working through the books may also have the flipside advantage of drawing your attention to perhaps anything we really ought to be discussing or pointing out in our text but maybe are not.
As for peer review, when you feel you've got the article into a state that you feel happy with, you could post at WT:PHYSICS or WT:CHEMISTRY or perhaps WT:AST asking for an informal internal review. Don't probably expect too much from this -- while there are some very high calibre people out there (particularly in the Maths project), most of us are just run of the mill editors, and it may be a long time since we studied thermo at Uni, or alternatively typical editors may be students who are still studying their way through their first degree -- either way typical editors probably don't have the grasp of detail that a really good peer review needs. But they can be very useful in telling you how your article reads to somebody not particularly initiated -- does it communicate the big ideas well, so they stand out? Does it flow? Does it avoid getting slowed down in unnecessary technicalities, that could be left until the reader has a better grasp of the overall shape of the topic? That's the sort of quality of input maybe to expect from such an informal internal review.
There is also WP's official assessment process -- how does the article compare with the criteria to be rated B, or A, or WP:GA, or WP:FA? From what I've seen this also may tend to be judged by generalists, who may be very hot on whether a particular sentence has a supporting reference link, or on whether the references are typeset just so, but who may be very weak on the actual science, and simply not qualified to spot whether the article may have glaring inaccuracies, glaring gaps, or material that simply isn't the best way to explain concepts.
So the scary one is asking for external peer review. This is probably best done when you have got very straight what intended scope you want the article to cover; got some informal internal reviewers to tell you whether they think it flows and develops reasonably, and when you've done quite a lot of work referencing it, and making sure it reasonably tracks the principal most authoritative sources. At that stage, it becomes a question of who you know (or who anybody at WT:PHYSICS knows) who actually teaches this stuff at a university, who could be persuaded to dig into it and give it a proper critique. It's a high bar to aim for, but if you pass that, you may be well on your way to getting it acclaimed as a featured article, and seeing your article as the big draw in the hot spot of the front page for a day. Jheald (talk) 22:46, 11 February 2013 (UTC)
If it could just get to be an A-class article or maybe a Good article, that would be nice. If it were in a state where a university student looking for answers such as: "What is it? How was it discovered?" would feel that they got those answers, at least, that would be nice. The main changes made recently were to remove any suggestion that this is some sort of disambiguation page. This article is probably best named entropy (thermodynamics) since that is how the categories (and the French and German articles) are structured. Note that the first line of the article has been "This article is about entropy in thermodynamics." The fate of the other two articles ( entropy (classical thermodynamics) and entropy (statistical thermodynamics) ) seems unclear since there is considerable overlap with this one. Look at the new section named "Entropy of a system" with material that is peculiar to neither definition. The choices seem to be: 1. Just give up and do nothing (except maybe just disclaim that the other two are specialized articles with a lot of overlap with this one.). 2. Merge the other two articles back into this one 3. Merge most of this article back into the other two. Any other viable options?--189.51.14.154 (talk) 23:29, 11 February 2013 (UTC)

(undent) The person who created the two sub-articles is this person:

The eoht.info web site (of which he is the owner and primary editor) is full of original research about trying to apply the math of thermodynamics to the human relationships between men and women and making babies, etc. There was technically nothing wrong with the sub-articles when he created them in 2006, but the whole idea of having both sub-articles was, I think, not well thought-out. I think that entropy (classical thermodynamics) should be merged back into this article because otherwise I do not think that we can provide any satisfactory explanation to the reader about what distinction we are trying to make in having both articles around.--200.109.197.133 (talk) 06:25, 12 February 2013 (UTC)

Merger proposal

The Entropy (classical thermodynamics) page is completely encompassed by this article. There is no sensible explanation about why we should have both pages.--200.109.197.133 (talk) 07:30, 12 February 2013 (UTC)

Proposal noted at WT:PHYSICS Jheald (talk) 12:53, 12 February 2013 (UTC)

*Support The reason seems apt. There may be other sections that can be factored out if the article gets too large.--Probeb217 (talk) 04:49, 13 February 2013 (UTC)

Vote struck out. Elockid (Talk) 04:58, 13 February 2013 (UTC)
  • Oppose. I've deliberately tried to give time to let other people come in here, and I'm disappointed that they haven't, because this proposal deserves a fair hearing and fair consideration. It's not clear that we do have the scopes of the various articles on entropy correctly set up at the moment -- for example, the proposal above that the Entropy article, as the main entry point for people typing the word into the search box, ought to be something more like the Scholarpedia article, presenting an overview of all notions of entropy, essentially replacing our current Entropy (disambiguation) article, rather than being focussed specifically on thermodynamic entropy.
I am not sure about that proposal, because in the past people have very strongly made the argument that it is thermodynamic entropy that people are most likely to be looking for, either from general interest about the nature of the universe, or because they're specifically meeting the concept in Physics, Chemistry or Engineering classes. As a result the argument is made that the WP:PRIMARYTOPIC for entropy should specifically be thermodynamic entropy. But that may or may not be right.
However, I am also uncomfortable about the current proposal. Because I think there is still a role for what was the original intention of this article, namely to be an entry-point for all ways of thinking about the entropy of physics, chemistry and thermodynamics -- so introducing the entropy of classical thermodynamics certainly, but also introducing statistical mechanical entropy, and the von Neumann entropy of quantum statistical mechanics, and showing how they (may) all marry up consistently with each other; as well as introducing how people think physical entropy should be talked about qualitatively -- particularly whether or not it's useful to talk about the log of the number of microstates as being predominantly determined by the extent of energy "dispersal", or the pros and cons of more traditional language involving "order" and "disorder".
I think that's a useful scope for the article to have, to give people as much of a handle as possible on the question "So, what is entropy?", though it's not an easy scope to do well. It's a rather different scope to trying to write a fully detailed article just about entropy in classical thermodynamics, which is what Entropy (classical thermodynamics) sets out to do. In particular, I think what the merge proposer writes, that the scope of that page should be "completely encompassed by this article" is probably not true. For the scope above to be possible, I think this article has to be written WP:SUMMARY-style, knowing that not all the detail can be presented here, that this article itself can only present a "taster" of a full proper treatment of each domain of thinking and calculating about entropy. The approach also seems to me very much in line with WP:TECHNICAL -- to try to give people as much as you can that is accessible as an overview of the whole subject first (the skins of an onion approach), rather than expecting to march the reader in order through the full detail of each domain one after the next.
So that's what I think had been the rationale behind the article as it was (though I think it's proved a difficult brief to deliver). I still think it's a rationale that makes some sense. But if others want to argue why a different model for the article would be better, I'm not going to stand in anybody's way. (And sorry if I've been WP:TLDR for everybody). Jheald (talk) 11:46, 13 February 2013 (UTC)

Also oppose. The thermodynamic entropy article is, within its limitations, quite coherent, careful, and clear. The general entropy article has some nice attempts to include a more modern, general definition. However, the organization is a mess and various idiosyncratic and sloppy ideas dangle off various parts of it. Today I removed one section that was entirely wrong. Until this article is in better shape, I think it would be a shame to mess with the well-constructed thermodynamic article.Mbweissman (talk) 03:27, 8 March 2013 (UTC)

Delete first paragraph?

It seems to me the article might be better off without the first paragraph, which reads:

"Entropy is a measure of the number of specific ways in which a system may be arranged, often taken to be a measure of disorder. The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, which is the state of maximum entropy."

The article concerns thermodynamic, not specifically statistical mechanical, entropy, but the first sentence is more applicable to statistical mechanical interpretation of thermodynamic entropy than to the thermodynamic concept itself, which is worth understanding independent of the statistical mechanical accounts that may be given of it. It is also a specific, Boltzmannian attempt to give statistical mechanical interpretation to entropy, and may be at odds with more Gibbsian versions of entropy, so again, is probably best not to lead with. The validity of the second sentence is highly dependent on your definition of entropy, and again, it is probably best not to lead with it, but to discuss it later in the article. It is a reasonable point of view that isolated systems do not evolve toward thermodynamic equilibrium (and many attempts to prove that this is always so have failed), but rather that thermodynamic equilibrium tends to be reached through interaction with an environment.

If there's no strong objection within the next couple of weeks, I may give this a try, also checking to make sure that the points made in these sentences are addressed, with more nuance, later in the article.

The next paragraph introduces a thermodynamic definition of entropy, which seems a better starting point.

MorphismOfDoom (talk) 12:30, 5 June 2013 (UTC)
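The Boltzmannian and Gibbsian formulations contrasted in the comment above can be compared concretely. A minimal Python sketch (the number of microstates W and the uniform distribution are chosen purely for illustration), showing that the Gibbs form S = -k_B Σ p ln p reduces to the Boltzmann form S = k_B ln W when all W microstates are equally probable:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p * ln p) over microstate probabilities."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

def boltzmann_entropy(W):
    """S = k_B * ln W for W equally likely microstates."""
    return k_B * math.log(W)

# For a uniform distribution over W microstates the two forms agree:
W = 1000
uniform = [1.0 / W] * W
assert abs(gibbs_entropy(uniform) - boltzmann_entropy(W)) < 1e-30
```

For non-uniform distributions the two expressions differ, which is one way of seeing the commenter's point that leading the article with the Boltzmannian sentence prejudges the statistical interpretation.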

P vs. p: Power, pressure, probability

Today Koen Van de moortel changed one P to a p with the edit summary "P=Power, p=pressure". So I skimmed the whole article with the thought of making the notation uniform and found that

  1. power actually does not occur in this article
  2. pressure is sometimes P and sometimes p; and also
  3. probability is sometimes P and sometimes p.

So to be consistent we can choose Option 1 P = probability, p = pressure everywhere, or Option 2 P = pressure, p = probability everywhere. Opinions? Dirac66 (talk) 19:00, 21 May 2013 (UTC)

Yes, P for pressure and p for probability (Option 2) seems best to me but either seems reasonable. Go for it! MorphismOfDoom (talk) 12:32, 5 June 2013 (UTC)

I have now counted the uses of each symbol at the moment in this article. Pressure is P 9 times and p 4 times, and probability (by coincidence) is p 9 times and P 4 times. So option 2 recommended by MorphismOfDoom corresponds to what previous editors decided most often and I will revise the article to conform to that. As for other articles, their usage also seems to vary between P and p in an unsystematic way. Dirac66 (talk) 02:23, 10 June 2013 (UTC)
OK, now P is pressure and p is probability. Except that the section Entropy and other forms of energy beyond work contains two equations where P is pressure and p is ... momentum. So we have another reason to use P for pressure, to avoid confusion with momentum within the same equation. Dirac66 (talk) 02:45, 10 June 2013 (UTC)
Many classic texts use W for probability; this goes back to Boltzmann himself, since it stands for "Wahrscheinlichkeit". — Preceding unsigned comment added by 98.109.238.95 (talk) 22:08, 28 June 2013 (UTC)

biased advocacy of "energy dispersal" point of view

The entire section on energy dispersal should be deleted. All but one of the sources it references are inappropriate for a Wikipedia article, since they are not authoritative or representative of a widely accepted approach. A retired professor's personal website should not be relied on. One stray article he wrote for an education journal, even though a respectable peer-reviewed publication, is also insignificant. An unsupported, unsubstantiated and, I suspect from perusing Google Books, false statement about shifting trends in chemistry textbooks is just too remote from the purposes of this article. A direct quote from Atkins might be useful. This section is far too big in proportion to the importance of this hobby-horse point of view.98.109.238.95 (talk) 22:05, 28 June 2013 (UTC)

Keep that section. Even though it's a minority viewpoint, it should be covered, on the grounds that in Wikipedia, minority viewpoints get some coverage. See Neutral point of view for information on such policies.--Solomonfromfinland (talk) 00:24, 11 September 2013 (UTC)

Mystified

Hi,

I am mystified by this formulation (which seems to be used in quite unpredictable ways): "It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it." Are not order and disorder purely psychological (not physical) phenomena? - 92.100.165.149 (talk) 16:48, 13 December 2013 (UTC)

The word disorder has many meanings as listed on the disambiguation (dab) page Disorder. Entropy is related to the meanings discussed in the articles on Randomness and Order and disorder (physics), not to the various disorders in psychology and medicine. Dirac66 (talk) 03:27, 18 December 2013 (UTC)
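The "lack of information" reading in the quoted sentence has a precise quantitative form in the Shannon entropy, which is a property of a probability distribution rather than of anyone's psychology. A minimal sketch (the coin probabilities are invented for illustration):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p): average missing information, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit of missing
# information per toss. A heavily biased coin is far more
# predictable, i.e. "ordered", in a purely statistical sense.
fair = shannon_entropy([0.5, 0.5])      # exactly 1.0
biased = shannon_entropy([0.99, 0.01])  # about 0.08
assert fair > biased
```

The point is that "disorder" here measures unpredictability of outcomes, an objective feature of the distribution, not a state of mind.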

edit about heat, work, and transfer of energy with matter

The edit summary, and references given, state the well established reasons for the edit that said "In the model of this present account, as shown in the diagram, it is important that the work and heat transfers are by paths physically distinct from the paths of entry and exit of matter from the system.<Born, M. (1949). Natural Philosophy of Cause and Chance, Oxford University Press, London, pp. 44, 146–147.><Haase, R. (1971). Survey of Fundamental Laws, chapter 1 of Thermodynamics, pages 1–97 of volume 1, ed. W. Jost, of Physical Chemistry. An Advanced Treatise, ed. H. Eyring, D. Henderson, W. Jost, Academic Press, New York, p. 35.>"Chjoaygame (talk) 23:28, 24 May 2014 (UTC)

Why is it important to have paths for heat and work different from those for matter transfer? Paradoctor (talk) 23:38, 24 May 2014 (UTC)
The edit summary, and references given, state the well established reasons for the edit.Chjoaygame (talk) 00:54, 25 May 2014 (UTC)
a) Why isn't this in the article?
b) If this is "well established", then there should be secondary sources instead of the primary sources provided. Considering the level of generality we're talking about, this should be mentioned in the major textbooks, shouldn't it? Paradoctor (talk) 09:10, 25 May 2014 (UTC)
c) The Born reference does not seem to support the statement, could this be an instance of WP:SYN? Please provide a quote of the Haase reference supporting the claim, so we can verify. Paradoctor (talk) 10:52, 25 May 2014 (UTC)
Thank you for drawing attention to this.
The usual introductory presentations of definitions of work and heat for thermodynamics start simply by considering closed systems, that is to say systems for which matter transfer is not allowed. For this, Born was a main leader of the move to insist that heat be defined as a residual transfer of energy after mechanical transfer as work is calculated. No question is considered then of what would be so if matter transfer were allowed. Many texts are vague or uninformative about it. The question is considered at some length in the Wikipedia article on the first law of thermodynamics. Perhaps that may help. I am not sure that it is appropriate for me to try to summarize that here.
Born on page 44 writes "... thermodynamics is definitely connected with walls or enclosures. ... in free contact with its neighbourhood ... the flux of energy and material constituents through its boundary, which themselves cannot be reduced to mechanics." In an appendix, on pages 146–149, Born provides more detail. He presents a model in which transfers of matter and of energy as work pass through physically separate portals. Such a separation is also shown in the diagram for the model in the present Wikipedia article.
The present problem is to analyze a change of the entropy content of a system due to a thermodynamic process. It is due not only to entropy production within the system during the process, but also to transfer of entropy. The present article tackles the problem by considering transfers of energy as heat and work. In general, those quantities are defined only in paths in which there is no transfer of matter, as shown in the diagram in the article. In paths in which there is transfer of matter, they are in general not definable. That is what Born means when he writes "which themselves cannot be reduced to mechanics". Haase on page 34 writes: "On the other hand, work and heat are not functions of state; they essentially relate to interactions of the system with its surroundings. Hence these quantities as defined heretofore have no definite meaning for open systems. (cf. Defay (1929). See also Haase (1956a))." Haase goes on to point out "There are however, important exceptions: the external work Wa and the dissipated work Wdiss can always be calculated from external actions. If, for instance, there is flow of electricity .... ". These exceptions are for quantities of work that pass by paths separate from matter transfer. Haase then summarizes "But the total work done on the open system remains indefinite."
Born and Haase are not alone. For example, Münster writes on page 50 "As explained in §14 the classical point of view (equivalence of heat and work) as well as Carathéodory's point of view (definition of heat) are meaningless for open systems."<Münster, A. (1970), Classical Thermodynamics, translated by E.S. Halberstadt, Wiley–Interscience, London, ISBN 0-471-62430-6.> Further references about this are in the Wikipedia article about the first law.
There are those who feel that this is not a happy situation, and they offer arbitrary definitions that are not tied to the basic physics of heat and work. Such arbitrary definitions are not appropriate here, where the paths are shown to be separate in the diagram.
One needs to be careful in reading texts on non-equilibrium thermodynamics, where the word heat is used in the vague senses of the nineteenth century, when it was still ok to speak as if heat were found within a body like internal energy, rather than in the strict sense nowadays used that insists that heat is energy only in transfer. I will not here go into detail about this, beyond commenting that it does not override the above comments.Chjoaygame (talk) 23:22, 25 May 2014 (UTC)
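The entropy-balance picture sketched in the reply above (change of entropy content = internal production + transfer across the boundary) can be illustrated for the simplest irreversible case, heat conduction between two reservoirs. A minimal numeric sketch; the temperatures and heat quantity are invented, and each reservoir is idealized as large enough that its temperature stays effectively constant during the transfer:

```python
# Heat dQ flows from a hot reservoir to a cold one. Each reservoir's
# entropy change is +/- dQ/T (T approximately constant for a small
# transfer); the sum is the entropy produced by the irreversible process.
T_hot, T_cold = 400.0, 300.0   # kelvin, illustrative values
dQ = 10.0                      # joules transferred, illustrative

dS_hot = -dQ / T_hot           # hot reservoir loses entropy
dS_cold = dQ / T_cold          # cold reservoir gains more entropy
dS_total = dS_hot + dS_cold    # net entropy production

assert dS_total > 0            # second law: production is positive
```

If the two temperatures were equal the net production would vanish, which is the reversible limit; the formula says nothing about paths carrying matter, which is exactly the limitation the references above insist on.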
I see no response to the immediately foregoing. I think it justifies removal of the tag [failed verification]. As for the tag [why?], I am not persuaded that a detailed reason for the tagged sentence is needed in the article. The basic reason why it is needed is that the formula that is offered applies only to a special model as set out in the section and illustrated in the diagram in the article. Without that special model, the formula has no meaning.
I am writing in the reference to the special nature of the model, which was deleted, I think inappropriately.Chjoaygame (talk) 04:55, 30 May 2014 (UTC)

The side bar formula on work and entropy change

I agree. But the assumption of maintaining equilibrium should have started on the first equation, not the second. Otherwise it looks to the casual reader like you missed the chain rule in calculus. — Preceding unsigned comment added by 2601:C:8D80:249:250:8DFF:FEB5:7FA4 (talk) 22:24, 31 May 2014 (UTC)

Thermodynamic Systems Decay

"evolve" versus "decay"

asking for reliable source

Classical vs Quantum

notation
