Talk:Temperature/Archive 3
From Wikipedia, the free encyclopedia
| This is an archive of past discussions about Temperature. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
| Archive 1 | Archive 2 | Archive 3 | Archive 4 | Archive 5 | Archive 6 |
Mean temperature
Some climate denialists are opposed to the very notion of mean temperature.
The idea was advanced by Christopher Essex, Ross McKitrick, and Bjarne Andresen in February 2007 issue of the Journal of Non-Equilibrium Thermodynamics: http://www.uoguelph.ca/%7Ermckitri/research/globaltemp/GlobTemp.JNET.pdf
It appears the paper (and the idea) don't have any merit:
http://www.realclimate.org/index.php/archives/2007/03/does-a-global-temperature-exist/
http://rabett.blogspot.com/2007/03/once-more-dear-prof.html
http://scienceblogs.com/deltoid/2004/05/mckitrick3.php —Preceding unsigned comment added by Mihaiam (talk • contribs) 18:04, 10 May 2010 (UTC)
- "Some climate denialists are oposed to the very notion of mean temperature." Mihaiam, you are in the wrong place. Read what it says at the beginning of the article -->
- This article is about the thermodynamic property. For other uses, see Temperature (disambiguation). Temperature is an intensive thermodynamic property, meaning it is localised, i.e. it cannot be averaged (see Intensive and extensive properties). Please refrain from using terms like denialist, which may be considered abusive. --Damorbel (talk) 18:40, 10 May 2010 (UTC)
I wasn't trying to be abusive, just framing the context that generated the humorous idea that a mean temperature cannot be defined or even be useful. On the contrary, intensive thermodynamic properties are often averaged, at least for practical purposes, the above paper notwithstanding. —Preceding unsigned comment added by Mihaiam (talk • contribs) 20:10, 10 May 2010 (UTC)
- "intensive thermodynamic properties are often averaged" Before making remarks like that you should read the link I gave you. You clearly do not understand what is involved. You can see examples of intensive properties in the article here. If you try to make an 'average' temperature you have to put it in another, perhaps climate-related, place.
- In thermodynamics, e.g. this article, if there is a need for something like 'an average temperature' it means that there is disequilibrium of some sort, i.e. the temperature is not uniform, the entropy is not at a maximum and there will be energy flow of some sort. As I have pointed out before, the 'Greenhouse Effect' requires heat transfer from a cold troposphere to an Earth's surface that is already many degrees warmer. This is a massive breach of the 2nd law of thermodynamics, which says that heat always flows from a hot place (the Earth's surface) to a cold place (the troposphere). Claims for heat to go against a thermal gradient are the 'business' of perpetual motion inventors.
- What do you mean by 'practical purposes'? This article should be about the well established science of thermodynamics. I have the strong impression you are not familiar with this branch of science, a rather difficult but extremely practical one.
- Referring to other contributors as denialists means quite plainly that you define their contribution as worthless, as you do here; this is clearly abuse. Please avoid these personal attacks. --Damorbel (talk) 21:09, 10 May 2010 (UTC)
- Oh good grief, enough with the long words, try reading the article: Since thermodynamics deals entirely with macroscopic measurements, the thermodynamic definition of temperature, first stated by Lord Kelvin, is stated entirely in empirical, measurable variables - any *measurement* of temperature is inevitably an average, even if over a very short time and small region William M. Connolley (talk) 21:44, 10 May 2010 (UTC)
- William, you are clearly right out of your depth. To grasp these matters properly you need to get a grip of kinetic theory and (preferably) statistical mechanics. Kinetic theory is the fundamental mechanism of heat in atoms and molecules; from there you will get a basic understanding of heat transport at the atomic level. Without this you are totally lost when dealing with matters seemingly so simple as temperature. Sorry you find the words a problem; the concepts are far from obvious but quite essential for a competent analysis. You take a very strong line on climate matters; how this is possible when you appear to have only a passing acquaintance with thermal physics I simply do not know.--Damorbel (talk) 11:40, 11 May 2010 (UTC)
- I'll ignore the fallacious appeals to expertise William M. Connolley (talk) 11:49, 11 May 2010 (UTC)
- Then I'll explain. A mean temperature only exists in an ensemble of molecules when the average momentum of the molecules is the same, i.e. there is a characteristic Maxwell–Boltzmann distribution of velocities (or momenta for inhomogeneous ensembles). If an ensemble has a non-uniform average, i.e. there is a (macroscopic) difference in averages between the various parts (aka a temperature difference), then the concept of an average temperature is inapplicable. For a start the condition is unstable because heat will transfer from the warmer part to the cooler. The effect on the temperatures of this transfer cannot be predicted without detailed knowledge of the thermal properties of the various parts so, in this case, any figure given as 'an average temperature' is meaningless. The misunderstanding that commonly arises comes with thermometry: for practical reasons molecular temperatures are not measurable, so thermometers measure a 'bulk' temperature which averages and cannot readily detect small temperature differences; that does not mean that the concept of temperature does not extend to the microscopic level. It is similar with infrared thermometers; a mottled infrared image will always create some kind of 'temperature' reading that may be satisfactory for some purposes but scientifically it is probably worthless. --Damorbel (talk) 13:12, 11 May 2010 (UTC)
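The distinction drawn here between a genuine Maxwell–Boltzmann ensemble and a mixture of parts at different temperatures can be illustrated numerically. A minimal sketch (the choice of argon atoms is purely illustrative): a 50/50 mixture of samples equilibrated at 200 K and 400 K has a mean kinetic energy corresponding to 300 K, yet its speed distribution is not the Maxwell–Boltzmann distribution for 300 K, so the "average temperature" does not characterise the microscopic state.

```python
import math
import random

K_B = 1.380649e-23   # Boltzmann constant, J/K
M = 6.6335e-26       # mass of an argon atom, kg (illustrative choice)

def sample_ke(T, n):
    # Each Cartesian velocity component of a gas in equilibrium at T
    # is Gaussian with variance k*T/m (Maxwell-Boltzmann).
    s = math.sqrt(K_B * T / M)
    return [0.5 * M * sum(random.gauss(0, s) ** 2 for _ in range(3))
            for _ in range(n)]

random.seed(0)
# Mixture of two sub-ensembles at different temperatures:
mix = sample_ke(200.0, 50000) + sample_ke(400.0, 50000)
mean_ke = sum(mix) / len(mix)
# The mean kinetic energy corresponds to an "effective" 300 K,
# but the underlying speed distribution is bimodal, not Maxwellian.
print(mean_ke / (1.5 * K_B))  # ~300
```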
Damorbel is right in the sense that a system that is not in thermal equilibrium but which consists of parts that are (to a good approximation) in thermal equilibrium at different temperatures, is not equivalent to a system in thermal equilibrium at some average temperature. But this (well known) issue has been abused by some people to argue against Global Warming. If we want to mention something about this, I propose we stick to the relevant physics. E.g. one can mention that temperature gradients give rise to transport phenomena (heat conduction etc.). You can then delve deeper and explain that this means that a gas that is in dynamical equilibrium where you maintain a temperature gradient cannot be described by a velocity distribution function obtained by inserting the local position dependent "temperature" in the Maxwell distribution. It is in fact the deviation from this that gives rise to the heat transport. So, in a sense this can be taken to mean that even the concept of a local thermal equilibrium breaks down.
Now, we actually all use this fact when solving heat conduction problems. We don't have to bother about the fact that the local speed distribution is not precisely Maxwellian; you only need to consider that if you also want to compute the heat conduction coefficient from first principles. But you can just as well take the heat conduction coefficient as a given, and then the local state at some point is specified by a local temperature, albeit that the distribution is not Maxwellian at the local temperature. But the point is that whatever it is, is now implicitly fixed.
This fact makes it possible to do thermodynamics in practice where things are not in thermal equilibrium. You can make models of the Earth's atmosphere and describe the time evolution using differential equations. Those equations contain transport coefficients like the viscosity, heat conduction coefficient etc. Such a description arises precisely from a perturbation away from exact global thermal equilibrium and is valid if you are close to global thermal equilibrium. But here "close" means that the deviations from the Maxwell distribution are not large. This will only break down in extremely violent processes, certainly not when you model the Earth's atmosphere. Count Iblis (talk) 14:56, 11 May 2010 (UTC)
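The point that transport coefficients can simply be taken as given is what makes such modelling tractable in practice. A minimal sketch (the diffusivity value, grid spacing and time step are arbitrary illustrations) of 1D heat conduction between two fixed-temperature reservoirs:

```python
def step(T, alpha, dx, dt):
    # Explicit finite-difference update for dT/dt = alpha * d^2T/dx^2.
    # The endpoints are held fixed (constant-temperature reservoirs).
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + alpha * dt / dx**2 * (T[i-1] - 2.0 * T[i] + T[i+1])
    return new

# Hot reservoir at 400 K on the left, cold at 300 K on the right.
T = [400.0] + [300.0] * 9
for _ in range(500):
    T = step(T, alpha=1e-4, dx=0.01, dt=0.1)
# The interior relaxes toward the linear steady-state profile;
# alpha stands in for the given heat-conduction coefficient,
# with no reference to the underlying speed distribution.
print(T)
```

Note the stability condition alpha*dt/dx² ≤ 0.5 is satisfied here (it equals 0.1).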
- A recent edit by User:89.43.152.15 restored the global temperature map with the remark that the consensus in this thread is "clear enough". I see no statement here that says "okay, let's put in the temperature map". I thank Count Iblis for his/her insightful remarks, but I don't see how they address the issue of whether to have the map.
- My opinion is this: Temperature is a very fundamental property. To explain it, we should stick to simple, tangible examples. The planet Earth is arguably as far from this ideal as we could get. It would be like discussing childhood obesity in the mass article; it's an important topic, but it doesn't belong. The map does not add to the article. The text makes no reference to it. It is clutter that can be removed with no negative impact. Spiel496 (talk) 18:58, 12 May 2010 (UTC)
- I disagree with your argument, but at least it is plausible. The original argument (this article can't talk about average temperatures) clearly wasn't plausible William M. Connolley (talk) 19:09, 12 May 2010 (UTC)
- I disagree too. If we want to stick to a narrow and simple treatment we should remove the vacuum and negative temperature paragraphs, too, as they are neither fundamental nor tangible. The map is illustrative of temperature's use in other sciences, the scope of the respective paragraph.
- The very reason given for the initial removal of the map (the alleged inappropriate use of the term in most natural sciences) is an example of its importance in this article, as it is universally used nevertheless (with credible scientific motivation IMHO).
- I don't see a vote here for the removal of the map, either; user:79.113.7.217 removed it without even initiating a discussion, with a contentious reason. —Preceding unsigned comment added by Mihaiam (talk • contribs) 05:55, 14 May 2010 (UTC)
- So, to paraphrase, I said "the Earth is too complex a system" and you said "then so is the vacuum". Earth---Vacuum; Everything---Nothing. They're practically opposites. I want to hear from a different editor on this matter, because I don't follow Mihaiam's logic. Spiel496 (talk) 19:27, 15 May 2010 (UTC)
- There is no logic from Mihaiam. He has a thing about 'denialists', and his weapons are a web page and a couple of blogs against a published paper, ad hominem attacks (denialists, contentious reason = he's a telepath?) and apparently he didn't understand several clear replies about averaging the temperature, which were given to him. As for me, I decided that I have no business with this page anymore. If the editors want to mislead people with that map, keep it here. You should also add maps on averaging intensive quantities on all pages describing them, to keep things consistent. —Preceding unsigned comment added by 79.113.2.40 (talk) 12:08, 20 May 2010 (UTC)
- Mean temperature is a useful enough concept to appear in many published papers to this very day, as a quick search may reveal:
http://arxiv.org/find/all/1/all:+EXACT+mean_temperature/0/1/0/all/0/1 —Preceding unsigned comment added by Mihaiam (talk • contribs) 06:18, 21 May 2010 (UTC)
- Thank you for proving once again my point. For your information, 'useful' is not a valid logical argument. Just because some statistical figure which isn't a thermodynamical value is used in a lot of papers doesn't mean that it should be pushed into thermodynamics.
- The presence of a map showing locations with different temperatures is only justified if it illustrates the confusion that can arise in a faulty thermodynamic analysis. The property "temperature" is quite independent of size. By assigning a temperature to a particular (macroscopic and larger) object, by definition all the parts of it must have the same temperature. In the case of a gas, Maxwell and Boltzmann recognised that the heat transmission process described by kinetic theory meant that the individual particles comprising a mass of gas (the microscopic level) had different (translational) kinetic energies, even at the "degree of freedom" level. What they did was to show that, when there was thermal equilibrium (no heat transport taking place at the macroscopic level), the distribution of the microscopic energies, i.e. the energies of the individual particles, followed a statistical distribution now called the Maxwell-Boltzmann distribution; it is these statistics that connect the definition of temperature at the atomic level with the bulk temperature measured by a thermometer.
- But this is as far as you can go when assigning a temperature to a multiplicity of locations. When assigning an average temperature to a large object (it doesn't have to be as big as a planet!) that is not in thermal equilibrium, i.e. it has parts definitely at different temperatures with heat transport taking place between them according to the 2nd Law, any figure produced will be meaningless. What actually happens when defining an "average temperature" this way is that the temperatures of the different parts are added together in an undefined way, perhaps to give a publishable figure. It is fairly easy to understand why this is unreliable; should the measurement be made using infrared radiation, then two locations with the same temperature but differing emissivities will appear to have quite different temperatures. A good example is data for the Sun: its temperature is often given to four figures by measuring its radiance. But as seen from the Earth its surface is far from uniformly bright (because of limb darkening) and its spectrum is only vaguely black body, all of which makes nonsense of four-figure accuracy for the actual temperature. --Damorbel (talk) 09:47, 21 May 2010 (UTC)
Overview section overhaul
I've looked at this page for the first time today and the Overview section is in need of a major overhaul. The first sentence wasn't even complete. When I looked through the history to see what was deleted I noticed that much of the information in there is incoherent and/or incorrect. I will change a few things today to fix the incomplete sentences and I will hopefully revisit this page to give it some help. What do other people think? What should be in the Overview section anyway? The stuff that is currently in there isn't really an overview of anything. Should it just be deleted? Sirsparksalot (talk) 00:39, 13 May 2010 (UTC)
Contradiction
The following is a quotation from Temperature of vacuum "If a thermometer orbiting the Earth is exposed to sunlight, then it equilibrates at the temperature at which power received by the thermometer from the Sun is exactly equal to the power radiated away by thermal radiation of the thermometer. For a black body this equilibrium temperature is about 281 K (+8 °C)." This is correct but the "black body" reservation is misleading because the equilibrium temperature is in fact independent of the optical properties of the body.
The contribution goes on to say "Since Earth has an albedo of 30%, average temperature as seen from space is lower than for a black body, 254 K, while the surface temperature is considerably higher due to the greenhouse effect." Introducing temperature as a function of the reflected radiation (the albedo) is a contradiction since the first quoted statement makes no reference to the reflectivity of the thermometer. The first quotation remains correct because, in these conditions (no other sources or sinks of thermal energy) the absorption and emission of radiation equilibrate at the same temperature for all materials, independent of their reflectivity and transparency.--Damorbel (talk) 08:20, 21 May 2010 (UTC)
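The black-body balance quoted here is easy to sketch numerically (assuming the modern solar-constant value of roughly 1361 W/m²; the slightly higher 281 K quoted above reflects a different choice of solar constant):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0              # solar constant at 1 AU, W/m^2 (assumed value)

def equilibrium_temp(albedo):
    # Absorbed power: (1 - albedo) * S * pi * r^2 (disc cross-section)
    # Emitted power:  4 * pi * r^2 * sigma * T^4  (whole sphere)
    # Setting them equal and solving for T:
    return (S * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

print(equilibrium_temp(0.0))  # black body: ~278 K
print(equilibrium_temp(0.3))  # Earth's 30% albedo: ~255 K, matching the 254 K above
```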
- I'm afraid the equilibrium temperature actually depends on the optical properties of the body, being lower for bodies with high reflectivity (or transparency for that matter). —Preceding unsigned comment added by Mihaiam (talk • contribs) 09:50, 21 May 2010 (UTC)
- "I'm afraid the equilibrium temperature actually depends on the optical properties" Care to justify this? It's a popular fallacy stated many many times but that doesn't make it true. I take it you mean the temperature in a thermal radiation field.
- Tell me what would be the temperature of a 1m spinning ball plated so that it reflected 99% of the incident radiation, same orbit as Earth but far away?--Damorbel (talk) 10:49, 21 May 2010 (UTC)
- Given some simplifying assumptions, I'll say about 89K —Preceding unsigned comment added by Mihaiam (talk • contribs) 11:19, 21 May 2010 (UTC)
- 89K? Ok. Now if you changed one half to carbon black (emissivity 0.99) the black side would, I imagine, be about 190K warmer, giving you a nice thermal gradient which could be used for generating electricity, maybe 500W/m2? Which, when scaled up a bit and sent to Earth, could save the planet from AGW. Cheers! (You can patent it too!)
- PS Nearly as good as cold fusion! --Damorbel (talk) 12:06, 21 May 2010 (UTC)
- And your point is .... ?--Mihaiam (talk) 18:41, 21 May 2010 (UTC)
- Haven't you noticed my point? If your idea that the equilibrium temperature is dependent on reflectivity etc. was true you could be getting energy from nothing, you would have a perpetual motion scheme (or scam). The sad truth is, the equilibrium temperature of any object in a uniform radiation field is independent of its reflectivity, its colour or its transparency. History is full of such perpetual motion schemes to "get energy from nothing". Pons and Fleischmann's cold fusion scam is ever so slightly different in that they just had no way to get the energy output to exceed the input. --Damorbel (talk) 19:33, 21 May 2010 (UTC)
- The reflectivity simply doesn't play a role in all this. It just reduces the amount of incident radiation to be absorbed by the body. You do have a cold sink, as the body will radiate according to the Stefan–Boltzmann law. —Preceding unsigned comment added by Mihaiam (talk • contribs) 19:56, 21 May 2010 (UTC)
You have to make your mind up. A while ago you wrote about a body with an albedo (reflectivity) of 99% having an equilibrium temperature of 89K; that is the role I am talking about. You refer to the Stefan-Boltzmann law, fair enough, but a body that reflects 99% of the incident radiation has an emissivity of 1%, so what emissivity did you use in your calculation? --Damorbel (talk) 20:09, 21 May 2010 (UTC)
- You are right if the emissivity of the object is the same at all wavelengths (gray body). One of the assumptions I've made is that the object has a reflectivity of 99% for the solar spectrum (UV, visible and near infrared), but 0% in the far infrared. Although not really warranted from your premises, this assumption does not contradict them, since you only specified the reflection for incident solar radiation (albedo). As it happens, most materials exhibit low infrared reflectivity:
http://minerals.gps.caltech.edu/FILES/Infrared_Reflectance/index.htm FWIW, albedo is a more specific form of reflectivity, over the incident light wavelengths —Preceding unsigned comment added by Mihaiam (talk • contribs) 23:26, 21 May 2010 (UTC)
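Both positions in this exchange can be reconciled in one formula: for a gray body the absorptivity and emissivity cancel, while for the selective surface described above they do not. A sketch (assuming, as stated here, 99% solar-band reflectivity with unit far-infrared emissivity, and a solar constant of about 1361 W/m²):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0              # solar constant at 1 AU, W/m^2 (assumed value)

def equilibrium_temp(alpha_solar, eps_ir):
    # Balance: alpha * S * pi * r^2 = eps * sigma * T^4 * 4 * pi * r^2,
    # with absorptivity in the solar band and emissivity in the thermal infrared.
    return (alpha_solar * S / (4.0 * eps_ir * SIGMA)) ** 0.25

print(equilibrium_temp(0.01, 1.0))   # selective surface: ~88 K, close to the 89K above
print(equilibrium_temp(0.01, 0.01))  # gray body: alpha/eps cancels, ~278 K
```

The gray-body case illustrates the other side of the argument: when absorptivity equals emissivity at all wavelengths, the optical properties drop out entirely.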
- Please stop the discussion and think. IF you have a thermodynamical heat bath, that is a radiation gas - and IF you talk about thermodynamical EQUILIBRIUM, then you MUST have the object in equilibrium with the radiation. That means its temperature is independent of the optical properties. If it weren't so, you could easily make a perpetuum mobile of the 2nd kind. Now in the Earth's case, that's not the situation. The radiation coming from the Sun is not at equilibrium; it would have to be at 2.725K to be there. Or it would have to come from all around as the one from the Sun, that is, to be isotropic and homogeneous, which would mean the Sun's "temperature" at Earth level. In the case of NON equilibrium, in a simple system which does not rotate, you'll have to take into account the following: the albedo of the exposed part, for both incoming and outgoing; the albedo of the shadowed part, for outgoing; and the thermal properties of the body, that is, thermal conductivity. You would have in the body a thermal gradient, which means a thermodynamical non-equilibrium situation. For reasons exposed already, it's meaningless in a physical sense to talk about 'the average temperature' of the body. —Preceding unsigned comment added by 79.113.15.75 (talk) 06:41, 22 May 2010 (UTC)
Equilibrium temperature in a way does have to do with reflectivity, and in a much more meaningful way does not. As described, I gather that while the equilibrium temperature can appear to be AFFECTED by reflectivity of various energies, it is not a FUNCTION of reflectivity. In improper testing which does not take into account the reflectivity properties of the material used in the experiment, the equilibrium temperature of the material would appear higher than in actuality.
I.e., the equilibrium temperature is the POINT at which the energy absorbed by the material is exactly equal to the energy released by the material. However, a material may reflect a certain percentage of energy, never absorbing it in the first place, thereby artificially inflating the equilibrium temperature in improper observations.
However, the equilibrium temperature is actually the point at which the energy ABSORBED is the same as the energy RELEASED by the substance, so you can subtract the amount of radiation REFLECTED as irrelevant to the equation.
That is NOT to say that a reflective material will not need more energy expended to reach the equilibrium temperature, or that an object cannot be higher or lower than its equilibrium temperature. —Preceding unsigned comment added by QSquared (talk • contribs) 05:57, 22 May 2010 (UTC)
Metrology
What are we to make of the last four words of this: 'Fahrenheit's scale is still in use in the USA, with the Celsius scale in use in the rest of the world and the Kelvin scale.'? 213.122.28.238 (talk) 16:10, 24 June 2010 (UTC)
Introduction
The opening statement "In physics, temperature is the average energy in each degree of freedom of the particles in a thermodynamic system" is complete nonsense, if it were true temperature would be measured in Joules, not K.--Damorbel (talk) 15:25, 7 January 2010 (UTC)
- Absolutely right. I have changed the introduction to not only reflect this, but also to state that the scientific definition of temperature is in the realm of thermodynamics, not statistical physics. It was defined by Lord Kelvin, before the advent of statistical physics. Statistical physics provides an explanation and a deeper understanding of temperature rather than a definition. I'm just worried that the introduction now sounds too technical in the beginning. PAR (talk) 16:45, 7 January 2010 (UTC)
The opening statement "In physics, temperature is the average energy in each degree of freedom of the particles in a thermodynamic system" is complete nonsense, if it were true temperature would be measured in Joules, not K.
- Your statement is simply false as K is a unit of measure denoting average kinetic energy. Joules measure energy, K measures average energy. Therefore, the statement, "Temperature is the measure of the average kinetic energy of the particles in a substance," is absolutely correct. I have changed the first sentence to reflect this. Happy editing, hajatvrc with WikiLove @ 16:44, 3 July 2010 (UTC)
- I'm going to step in and agree with Damorbel and PAR. First, K is not a unit denoting average kinetic energy, as K is not a unit of energy. Second, consider that two completely different systems with the same temperature can have different average kinetic energies. One simple example of this would be two separate containers, one filled with iodine gas and one filled with an equal amount of hydrogen gas. Both molecules are homo-nuclear diatomics and can be treated similarly. They should each have the same average kinetic energy of translation and rotation. However, due to the different vibrational frequencies of hydrogen and iodine molecules, they will have drastically different amounts of vibrational energy. This means that the two containers, despite being at the same temperature, have very different average energies. The average energy of each system will scale with the temperature, but that does not mean that temperature is a direct measure of the average energy. To measure the average energy of the system, the heat capacity of the system (which relates the average energy to the temperature) must be known. While it may seem like semantics, the opening statement should be treated rather delicately. Personally, my vote would be that it should stress that temperature is related to the average kinetic energy and is not a direct measure of it. --Sirsparksalot (talk) 17:30, 27 July 2010 (UTC)
Temperature is the frequency of the average molecular vibration. If temperature were the measure of energy, why then does a pound of steam at 212 degrees F. and atmospheric pressure have much more energy than a pound of water at 212 degrees F. and atmospheric pressure? The molecules in the steam have a much higher velocity than the molecules in the water; however, the average frequency of vibration of the molecules in the steam and water is the same. The temperature of colors is also frequency. Geweber (talk) 16:58, 12 September 2010 (UTC)
- Amusing! The choice of units for temperature and kinetic energy is merely a historical artifact. The mean kinetic energy is only a function of temperature, not the type of gas. This is why it can be used as a definition of temperature. In fact, when using natural units, the two differ only by a factor of 2: E=T/2. History could have equally decided to call the kinetic energy itself the temperature, but instead we have the correlation through the Boltzmann constant. Kbrose (talk) 19:05, 12 September 2010 (UTC)
- The mean kinetic energy is most definitely dependent upon the identity of the substance. If it was only dependent upon the temperature, then all substances would have the same heat capacity! Also, what about phase changes? Consider a solid substance with a temperature that is just an infinitesimal amount below the melting point and compare that to the same substance as a liquid with a temperature an infinitesimal amount above the melting point. For all intents and purposes, the temperature of both systems is the same (only infinitesimally different) yet the internal energy of the substances is COMPLETELY different. This is because immense amounts of energy had to be put into the substance so that it could undergo a phase transition from solid to liquid. Here we have two systems with the same composition and mass with practically identical temperatures that have drastically different energies. The mean kinetic energy is not a function of only the temperature. Sirsparksalot (talk) 22:38, 16 September 2010 (UTC)
- You're confusing different forms of energy. Kinetic energy: E = 3/2 kT, there is no dependency on anything but T. The articles states it too, as does any thermal physics text book. You couldn't construct a useful thermometer if this weren't so. It's the zeroth law of thermodynamics. Your model of heat capacity has similar flaws, and Internal energy is not just kinetic energy. Kbrose (talk) 23:30, 16 September 2010 (UTC)
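The claim under dispute here is narrower than it may appear: the mean translational kinetic energy per molecule really is (3/2)kT for any classical ideal gas, independent of the molecule's identity; the disagreement is over whether vibrational and other internal energy belongs in the statement. A sketch of the uncontested part:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_translational_ke(T):
    # Equipartition: (1/2) k T per translational degree of freedom,
    # three in total. Note there is no dependence on molecular mass
    # or identity -- only on the temperature T.
    return 1.5 * K_B * T

print(mean_translational_ke(300.0))  # ~6.2e-21 J for any classical ideal gas at 300 K
```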
- I don't even understand what you are saying. If you are going to disagree with something, please give reasonable explanations. How does the model of heat capacity have similar flaws? Also, don't refer to the fact that the "article states it" because the fact that the article states it is the precise reason this debate is taking place. Finally, please provide a citation for one of these books. I would be more than happy to look it up to try to better understand the argument you are making. Sirsparksalot (talk) 01:58, 17 September 2010 (UTC)
I'm going to apologize for beating a dead horse here, but I am once again going to state that the kinetic energy is dependent upon the identity of the substance. Maybe my argument above was a bit unclear because I used the internal energy, but the ideas are still in order. The problem that I have with saying that the average kinetic energy is E = 3/2 kT, where 1/2 kT comes from each degree of freedom, is that it ignores quantum effects. While I do not disagree with this formulation, it has marked limitations that must be considered. This result is an application of the equipartition theorem and is shown quite clearly when considering kinetic molecular theory. The result of E = 3/2 kT only takes into account the classical motion of particles and will only work for quantum systems when kT is much larger than the energy spacing that arises from quantization. Examples of this include translational and rotational degrees of freedom, which is why E = 3/2 kT works for monatomic ideal gases; a few extra 1/2 kT can be included to account for rotational energy, where the number of 1/2 kT included depends on the number of non-equivalent axes of rotation. When vibrations are considered, however, this approximation breaks down because the spacing between adjacent vibrational energy levels can typically be much higher than kT. What this means is that the population distribution of molecules in each of the energy levels is weighted toward states of lower energies according to the relevant Boltzmann factors. Taking these Boltzmann factors into account when calculating the average kinetic energy gives a value that can be much different than the classical result of E = 3/2 kT.
Using the example I used above that compared the kinetic energy of hydrogen and iodine gases, this effect is extremely apparent. Rotation and translation of the two species can be treated classically, offering 1/2 kT to the average kinetic energy for each degree of freedom. This means that the rotational and translational energies of each gas should be the same. Vibrations, on the other hand, must be treated quantum mechanically. Hydrogen gas has a vibrational frequency of approximately 4000 cm-1. This value is far greater than kT at room temperature, which is approximately 200 cm-1. The fractional population of molecules in the first excited vibrational level will be negligible when compared to the population of molecules in the ground vibrational state (very close to unity!) and will thus contribute very little to the average kinetic energy. This case is much different from iodine, which has a vibrational frequency of approximately 200 cm-1, on the order of the 200 cm-1 of kT available at room temperature. This means that the relative populations of the ground and first excited vibrational energy levels will be approximately 3:1, which are comparable to one another (but still different enough that E cannot be approximated classically) and will contribute much more to the average kinetic energy than did the first excited state of hydrogen. This example illustrates that the two different gases, hydrogen and iodine, have drastically different energies due to the quantum mechanical treatment of vibrations.
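The population and mean-energy figures quoted above follow from the Boltzmann distribution for a harmonic oscillator. A quick numerical sketch (frequencies and kT expressed in wavenumbers, as in the text; values are the approximate ones stated above, not precise spectroscopic data):

```python
import math

kT = 200.0  # approximate thermal energy at room temperature, in cm^-1

def vib_stats(nu, kT):
    """Return (ground/first-excited population ratio, mean vibrational energy
    above the zero point) for a harmonic oscillator of frequency nu (cm^-1)."""
    ratio = math.exp(nu / kT)               # Boltzmann factor n0/n1
    mean_E = nu / (math.exp(nu / kT) - 1)   # quantum mean energy (Planck form)
    return ratio, mean_E

for name, nu in [("H2", 4000.0), ("I2", 200.0)]:
    ratio, mean_E = vib_stats(nu, kT)
    print(f"{name}: n0/n1 = {ratio:.3g}, <E_vib> = {mean_E:.3g} cm^-1 "
          f"(classical equipartition would give kT = {kT} cm^-1)")
```

For I2 the population ratio comes out to e ≈ 2.7, i.e. roughly the 3:1 mentioned above, and the mean vibrational energy is well below the classical kT; for H2 the excited-state population is utterly negligible.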
Another consideration that should be made is the effect of making the assumption that E = 3/2 kT. This is one of the underlying assumptions made in the derivation of the heat capacities of solids (Dulong-Petit). While this law does hold at higher temperatures, it utterly fails at low temperatures, because it does not take energy quantization into account. By considering these effects, the later models of Einstein and Debye did a much better job of explaining the temperature dependence of the heat capacity. A similar case is how the classical description of blackbody emission spectra failed; quantization had to be taken into account.
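The Dulong-Petit breakdown mentioned above can be illustrated with the Einstein model, whose heat capacity approaches the classical value at high temperature and vanishes as T → 0. A sketch (the Einstein temperature of 300 K is an arbitrary assumption for illustration):

```python
import math

def einstein_heat_capacity(T, theta_E):
    """Heat capacity divided by the Dulong-Petit value 3Nk, from the Einstein
    model. Tends to 1 (classical equipartition) at high T, to 0 as T -> 0."""
    x = theta_E / T
    ex = math.exp(x)
    return x * x * ex / (ex - 1.0) ** 2

theta = 300.0  # assumed Einstein temperature, in kelvin
for T in (10.0, 100.0, 300.0, 3000.0):
    print(f"T = {T:6.0f} K: C / 3Nk = {einstein_heat_capacity(T, theta):.4f}")
```

At T = 3000 K the ratio is essentially 1 (Dulong-Petit recovered); at T = 10 K it is vanishingly small, exactly the low-temperature failure of the classical assumption.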
Hopefully this does a good job at showing how temperature is not simply equal to the average kinetic energy of the system in question. Including quantization is a necessary step and it cannot be overlooked. While I am not opposed to putting E = 3/2 kT into the article, I feel that it needs to have a bit of a clause attached to it. It needs to be made clear that this result is the consequence of treating the system classically and will only hold at high temperatures where kT is much larger than the energy spacing that arises from including quantization. At low temperatures, or in systems where quantum effects are strong, this result breaks down and other methods are required to describe the average kinetic energy of a material. Sirsparksalot (talk) 16:36, 21 September 2010 (UTC)
- I agree with Sirsparksalot. Temperature is not kinetic energy. Kinetic energy is e = 1/2 m v^2. Where is the "T"? A heat source has a temperature level to it but it is not energy. In some systems the velocity varies with the temperature, but even then velocity by itself is not energy. There has to be some mass. Since absolute zero temperature is defined as no molecular motion, you could say temperature is related to the motion of molecules. It is not the speed of the molecules, since the speed of the molecules of water at saturation is different between the liquid and the gas, yet the temperature is the same. The remaining characteristic of molecular motion is the frequency of vibration. So it seems temperature is the frequency of the molecular vibration. Geweber (talk) 02:05, 22 September 2010 (UTC)
- I doubt Sirsparksalot agrees with you, however. A monatomic gas has temperature, yet it does not vibrate. A collection of identical harmonic oscillators all vibrate at a fixed frequency, yet that system can have an arbitrary temperature. Spiel496 (talk) 05:55, 22 September 2010 (UTC)
- I don't have a good answer for you, Sirsparksalot, but the kinetic energy statement confuses me as well. I'm not saying it's incorrect, but I would like to read a clarification that goes beyond "it's in a physics text book". Spiel496 (talk) 21:16, 22 September 2010 (UTC)
Definition of temperature in Lead
The definition of temperature in the lead section gets reworded from time to time, but it is generally a variation of "the measure of the average kinetic energy of the particles". I'm uncomfortable with that wording for two reasons. First, that it doesn't address the random nature of thermal energy. Second, that it ignores the potential energy in solids.
The randomness of the motion seems important if we consider, for example, a liquid-helium-cooled bullet propelled at 100 m/s. Although the individual lead (Pb) atoms have an average kinetic energy that is consistent with 300 Kelvin, one would say the temperature is really more like 4 Kelvin.
Second, the potential energy is important in solids. I remember reading that half the thermal energy of a solid is in the form of potential energy. It makes sense, considering that the energy of a harmonic oscillator cycles between all kinetic and all potential. Now, we wouldn't want to count the static energy of chemical bonds as thermal energy, so I would make the distinction that it's the component of potential energy that is rapidly being transformed from and into kinetic energy.
Taking these two points together, I would propose, "Temperature is a measure of the energy associated with the random motion of the individual particles in matter." I say "associated with" to attempt to include the transient potential energy, but not the static energy of the chemical bonds. And I specify "random" to rule out the component due to center-of-mass motion, or some weird oscillation where all the atoms move together. Does this make sense? Spiel496 (talk) 00:53, 14 September 2010 (UTC)
- No, sorry. An oscillator constantly exchanges potential energy into kinetic energy and back at its characteristic frequency. So, potential energy is a driving force, but it is only the kinetic energy that is temperature, by definition. While thermal energy is 'generated' from stored potential energy of every oscillator contributing, it is the kinetic energy (=thermal energy) that is transferred and stored as another form of potential energy in a neighboring system. The randomness is expressed in the fact that it is the mean kinetic energy of an ensemble. That term implies a statistical sampling. Your example is no different than an atom flying through a gas, internally it has oscillators and perhaps a characteristic temperature. Just like the temperature of the bullet is one thing, it is an ensemble of oscillators that have a mean kinetic energy and the bullet therefore has a temperature internally, but the bullet's speed is another story. A single flying bullet doesn't have a 'temperature', even if you can calculate one with a formula, unless it collides with something. If you want to consider a gas of such bullets, then your container, the system, has a temperature as a whole, and the situation is identical to a container of gas. The definition of temperature is based on kinetic energy, not internal energy, which is the sum you're talking about. This is described in every elementary physics book. Today's definition of temperature is merely a consequence of scientific history. It could have happened that they decided to state: mean kinetic energy = temperature and the world of physics would likely still be whole. The use of natural units comes close to this ideal, E=T/2. Kbrose (talk) 02:12, 14 September 2010 (UTC)
- I appreciate you taking the time to reply to my post. But rather than explain why my wording is wrong, you have simply repeated the statements that I called into question. For example "it is only the kinetic energy that is temperature". Well, maybe not. I'm hoping there's more to go on than a definition. And throughout your paragraph, you make some statements that are either completely false or, at best, ambiguous:
- "potential energy is a driving force" -- No. Energy is not force.
- "The term [mean] implies a statistical sampling" -- No. It is a sum of terms, divided by the number of terms.
- "A flying bullet doesn't have a temperature" -- ??. You contradict this yourself in the preceding sentence.
- "internal energy ... is the sum you're talking about" -- No, internal energy includes the energy of chemical bonds, which I specifically omitted.
- "This is described in every elementary physics book." -- There are plenty of physics books which make no mention of thermodynamics.
- Pardon me for being so nit-picky, but even if my proposed definition of temperature is lacking, it is not as wrong as some of the statements you've used to dispute it. I think I get your point about the bullets, though; their internal temperatures could differ from the temperature inferred by the CM motion of the larger bodies. It's too artificial an example.
- Maybe this would be a quicker way to get to the truth: Can someone come up with a thought experiment where I get the wrong answer if I define temperature to include the potential energy? For example, what if we had two systems exchanging thermal energy: One is a monatomic gas, in which potential energy plays no role; the other is an ensemble of harmonic oscillators (masses on springs), in which potential energy plays an important role. Transfer some thermal energy Q from the monatomic gas to the harmonic oscillators. The total kinetic energy of the gas has decreased by Q. Is it true that the kinetic energy of the harmonic oscillators has increased by Q? Spiel496 (talk) 06:25, 14 September 2010 (UTC)
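For what it's worth, standard equipartition bookkeeping suggests an answer to this thought experiment: the monatomic gas stores all its thermal energy kinetically, while each 1-D oscillator splits its kT evenly between kinetic and potential energy, so only half of Q should reappear as oscillator kinetic energy. A minimal sketch in natural units (particle numbers and Q are arbitrary assumptions):

```python
# Equipartition bookkeeping for the gas/oscillator thought experiment above.
# Units chosen so k = 1; N_gas, N_osc and Q are arbitrary illustration values.

k = 1.0
N_gas = 1000   # monatomic gas particles: 3 kinetic quadratic terms each
N_osc = 1000   # 1-D harmonic oscillators: 1 kinetic + 1 potential term each

C_gas = 1.5 * N_gas * k  # gas heat capacity (all of it kinetic energy)
C_osc = 1.0 * N_osc * k  # oscillator heat capacity (kT/2 kinetic + kT/2 potential)

Q = 300.0  # heat transferred from the gas to the oscillators

dT_gas = -Q / C_gas                  # gas cools
dT_osc = +Q / C_osc                  # oscillators warm
dKE_osc = 0.5 * N_osc * k * dT_osc   # only half of Q shows up as kinetic energy

print(f"gas temperature change:         {dT_gas:+.3f}")
print(f"oscillator temperature change:  {dT_osc:+.3f}")
print(f"oscillator kinetic-energy gain: {dKE_osc:.1f} of Q = {Q}")
```

On this accounting the oscillators' kinetic energy increases by only Q/2, the other half going into potential energy, which is the crux of the question posed above.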
- This is not a forum to philosophize about the merits of established physics. If you want to do that you can do that elsewhere. If you don't accept the definition of temperature and want to propose a new one, Wikipedia is not the place for original research, we report established knowledge. Take a physics book that has a thermodynamics chapter and find out the definition. It is already stated in this article, albeit not in great presentation. If you did notice that the definition here always involves kinetic energy, then perhaps you should take that as a signal. If you like a different but equivalent definition from state function approach, then the article also provides the differential of internal energy wrt entropy. Kbrose (talk) 14:34, 14 September 2010 (UTC)
- Please understand, I'm not trying to rewrite physics; I wanted merely to direct the conversation to a more concrete example. If it was too rambling or philosophical, then just ignore it.
- Here's my point more concisely: When a solid is heated, half of the thermal energy ends up as potential energy. Yet temperature is said to be a measure of just the kinetic energy. That sounds enough like a contradiction that I suspected that "kinetic" was being used loosely. Apparently it is literal, which surprises me.
- It is quite mistaken to consider temperature as a measure of energy. I pointed this out here http://en.wikipedia.org/wiki/Talk:Temperature#Introduction but it seems that active editors want it this way, even though few report 'temperature' as 273 J; practitioners of thermodynamics much prefer 273 K, and 0 °C is another popular way of saying the same thing. But this is Wikipedia and, in practice, anybody can write what they like about physics. 'Temperature' is inextricably bound up with the Boltzmann constant, which isn't even mentioned in the article. I would suggest that those editing or accepting the definitions in the current version of this entry should reconcile their ideas with the Boltzmann constant, otherwise they will end by making a mess like the present one.--Damorbel (talk) 07:30, 16 September 2010 (UTC)
- The state of the article leaves a lot to be desired, indeed, but this is partially the result of following advice like yours. The practice of expressing temperature in units of energy is commonplace in physics; see for example the widespread use of the unit electron volt in plasma physics. In the history of thermodynamics, temperature was originally treated as synonymous with heat (caloric). When using natural units, k=1, temperature has the unit of energy and this expresses the natural physics of temperature without human construct, and many physicists don't accept the notion that it should be a separate physical base property in a system of units. The involvement of the Boltzmann constant is only an artifact of the choice of system of units. Kbrose (talk) 19:16, 16 September 2010 (UTC)
- "the result of following advice like yours". Hmmm! You say "The practice of expressing temperature in units of energy is common-place". You can read what it says about using the electronvolt as a measure of temperature here: [] where it says "In certain fields, such as plasma physics, it is convenient to use the electronvolt as a unit of temperature. The conversion to kelvins (symbol: uppercase K) is defined by using kB, the Boltzmann constant". Please note multiplication by the Boltzmann constant which puts the temperature proportional to the energy per degree of freedom.
- Temperature is an intensive property, that is why energy flows from regions where the intensity is high to where it is low and vice-versa.
- An energy measure (joules, ergs) is not intensive; it is just a general statement about the energy in a system. It says nothing about the size of the system, in contrast to "energy per degree of freedom", which severely limits the size of the 'system'. Similarly a measurement of energy says nothing at all about the tendency of the energy to flow in one direction or another.--Damorbel (talk) 15:28, 26 September 2010 (UTC)
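The electronvolt-as-temperature convention discussed above amounts to a single division by the Boltzmann constant, T = E/k. A minimal sketch (using the exact 2019 SI values of the constants):

```python
# Conversion behind the plasma-physics habit of quoting temperatures in eV.

eV = 1.602176634e-19   # J per electronvolt (exact, 2019 SI)
k_B = 1.380649e-23     # Boltzmann constant, J/K (exact, 2019 SI)

def eV_to_kelvin(E_eV):
    """Temperature equivalent of an energy given in electronvolts: T = E / k_B."""
    return E_eV * eV / k_B

print(f"1 eV   ~ {eV_to_kelvin(1.0):.0f} K")    # the usual rule of thumb, ~11605 K
print(f"25 meV ~ {eV_to_kelvin(0.025):.0f} K")  # roughly room temperature
```

This is exactly the point both sides are circling: the two quantities are related by a fixed constant, so one can quote either, but the constant (or the choice k = 1) is doing the unit conversion.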
- I understand the point on natural units but I think the purpose of natural units is being confused. Natural units are meant to normalize physical variables to one another. This is where the k=1 comes in. It simply normalizes the magnitude of k to 1. This does not mean that the units of one quantity are the same as the units of another. It just means that the magnitude of the units are the same. Think of them as being unit vectors. Just because unit vectors all have the same magnitude, doesn't mean they all point in the same direction (which they don't - they're orthogonal). Please see the comments I made above that address how temperature is not a direct measure of the average energy of a system. (http://en.wikipedia.org/wiki/Talk:Temperature#Introduction) Sirsparksalot (talk) 00:11, 17 September 2010 (UTC)
- It is still a matter of convention to assign incompatible dimensions to different physical quantities. Of course, temperature is not the same as energy, but then potential energy is not the same as kinetic energy either. Yet, we don't measure potential energy in different units as we measure kinetic energy in. Nothing would stop one from doing that, though (you would then get a dimensionful conversion factor in the energy conservation law). Count Iblis (talk) 00:23, 17 September 2010 (UTC)
- "clarify" - "Temperature at the microscopic level is the measure of the energy per degree of freedom (the energy in a degree of freedom)." In a macroscopic system "temperature is the mean energy per degree of freedom". This is where the Maxwell-Boltzmann distribution comes in; it recognises that in a kinetic system of freely interacting particles, possibly with different numbers of degrees of freedom, they all will have different energies at any given instant, but the average over time will represent the temperature just as if all degrees of freedom had the same energy. --Damorbel (talk) 15:28, 26 September 2010 (UTC)
- Maybe my point wasn't as clear as I had hoped. I was essentially trying to suggest that using dimensionless units isn't a good way of equating energy and temperature. Count Iblis makes an excellent point. Just because two quantities have the same units doesn't mean they are the same thing. Sirsparksalot (talk) 01:25, 17 September 2010 (UTC)
The concept of equilibrium and the idea that a large number of particles is involved are both vital to the definition of temperature. Without equilibrium, temperature is undefined. The fewer the number of particles, the less useful the idea of temperature becomes.
Neglecting quantum effects, at equilibrium, for a sufficient number of particles (N), whose center of gravity is at rest, each degree of freedom will have an average energy of NkT/2. For example, if you have a monatomic gas (three degrees of freedom) and a bunch of linear harmonic oscillators (one degree of freedom) that have equilibrated, then they will all be at the same temperature and each degree of freedom will have, on average, an energy of NkT/2 where N is the number of particles, or oscillators, as the case may be. The energy of a linear oscillator is the sum of its potential and kinetic energy. Statistically, half of that total energy will be kinetic, half potential.
In the case of a solid, if the molecules are essentially point particles vibrating about a center of attraction, they will have three degrees of freedom, each degree of freedom having total energy NkT/2. At equilibrium, statistically, half of that energy will be potential, half kinetic. Again, NkT/2 will be the total energy per degree of freedom, NkT/4 will be the kinetic energy per degree of freedom.
Just as in the case of Entropy, there is a thermodynamic definition of temperature, and then there is the statistical mechanical explanation of temperature. The thermodynamic definition involves the experimental fact that all dilute gases behave as ideal gases, etc. Statistical mechanics provides an explanation of temperature by showing that, at equilibrium, the average energy (kinetic plus potential) per particle of a body with a large number of particles is equal to fkT/2, where f is the number of degrees of freedom available to that particle, k is Boltzmann's constant, and T is temperature. PAR (talk) 17:14, 25 September 2010 (UTC)
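The statistical claim above, that an oscillator's energy splits evenly between kinetic and potential shares at equilibrium, can be checked by direct sampling from the Boltzmann distribution. A rough sketch in natural units (k = m = ω = 1, with an arbitrary temperature and sample size): each quadratic term should average kT/2.

```python
import random
import statistics

random.seed(0)
kT = 2.0       # temperature in natural units (k = 1); also m = omega = 1
N = 200_000    # number of oscillators sampled (arbitrary, just for statistics)

# For a 1-D harmonic oscillator at equilibrium, momentum and position are
# independent Gaussians with variance kT (in these units); the quadratic
# terms p^2/2 and x^2/2 should each average kT/2 by equipartition.
KE = statistics.fmean(0.5 * random.gauss(0.0, kT ** 0.5) ** 2 for _ in range(N))
PE = statistics.fmean(0.5 * random.gauss(0.0, kT ** 0.5) ** 2 for _ in range(N))

print(f"<KE> = {KE:.3f}, <PE> = {PE:.3f}, kT/2 = {kT / 2}")
```

The sample averages of the kinetic and potential terms come out equal (to within sampling noise), each near kT/2, which is the "half kinetic, half potential" statement in statistical form.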
- PAR, perhaps I misunderstand you when you write "kinetic plus potential" (energy) in relation to the temperature of a particle. I do not think potential energy directly affects the temperature of particles. For example gas moving "upwards" in a gravitational field will cool because part of its thermal (kinetic) energy is converted to potential energy, it cools because there is less energy in the 'kinetic' energy component. --Damorbel (talk) 15:28, 26 September 2010 (UTC)
- Hmm - I wish I could give a good answer for that off the top of my head. But it shows why equilibrium is important. For a gas of point particles in a gravitational field, at equilibrium, I believe the temperature will be the same at any height, while the density (and therefore pressure) drops off as you go higher. The internal energy density will be higher, closer to the "ground" because of the potential, but the temperature will be the same, which means the average kinetic energy per molecule will be the same at any height. But if the molecules are point particles, they still only have three degrees of freedom. This is not the same as the case where, for example, the molecules are like two point atoms connected by a spring. Let's say, for simplicity, all the springs are oriented in the same direction. Then each molecule has four degrees of freedom, three for the center of mass, and one to describe their separation. Again, the kinetic energy of the molecular centers of mass will reflect the temperature; each of the three degrees of freedom of the center of mass will have energy kT/2 per molecule. But the molecules will also be vibrating about their center of mass, and this degree of freedom will also have energy kT/2, part kinetic, part potential. In other words, the kinetic energy of the center of mass will not be equal to the sum of the kinetic energies of each atom in the molecule. The extra kinetic energy is part of the fourth degree of freedom energy, which is the sum of the kinetic energies relative to the center of mass plus the potential energy due to the stretching or compression of the spring. I know, this is a description, rather than an explanation, but I guess I will have to think about it some more in order to give a good explanation. PAR (talk) 17:22, 28 September 2010 (UTC)
- PAR, you wrote "I believe the temperature will be the same at any height". Really? I thought it was common experience that the temperature in the troposphere falls with height at a 'Lapse Rate'. Most passenger flights experience outside temperatures below −50 °C. --Damorbel (talk) 20:31, 2 October 2010 (UTC)
- In an idealized case where there is simply a gas at equilibrium in a gravitational field, I think the temperature will be the same. Once you shine the sun on it, everything changes. The ground absorbs the sun's light and heats up, heating the gas at ground level. The gas absorbs heat from the sun, there is convection, with hot gases rising, and cooling as their density decreases, etc. etc. For the first (ideal) case, I haven't really seen the analysis, I'm only saying the temperature is the same from the idea that there can never be temperature gradients in an isolated system in equilibrium. The first case is isolated, the second case, with the sun shining on it, is not. I have a suspicion this might not be so simple for low density gases, where the molecules travel so far between collisions that their change in potential energy between collisions is of the same order as their kinetic energy. But this is straying from the subject - I still believe that the temperature of a gas is proportional to the kinetic energy of the center of mass of the molecule, but not to the kinetic energies of the individual atoms in the molecule, and this distinction should be made. PAR (talk) 23:09, 2 October 2010 (UTC)
The point I'm making is that one can do without units in physics, provided one has enough knowledge about (and access to) fundamental relations between quantities. E.g. a classical physicist will lack knowledge about the fundamental equations that relate energy to mass, distances to time intervals, mass to length, etc. Even if one has knowledge of these relations, to do accurate measurements one may need to resort to physical representations of units that cannot be related to each other. In that case, what happens is that some relations that express identities come with conversion factors that will have experimental errors. You can define a set of units where you set all these conversion factors equal to 1, but that may not yield the most accurate measurement system. Count Iblis (talk) 23:01, 2 October 2010 (UTC)
Definition with Second Law of Thermodynamics
I think we should remove that section. That definition is of the thermodynamic scale rather than the temperature itself. For example, the old Celsius scale, defined by setting water's freezing point at 0 and boiling point at 100 with the assumption that mercury expands linearly with temperature, is indeed a scale of temperature, but apparently in disagreement with the thermodynamic scale.--Netheril96 (talk) 08:50, 5 October 2010 (UTC)
Definition of temperature in Lead (2)
I have opened this section because there is at least one real problem with the whole article and to maintain any sort of intellectual respectability it should be resolved. Temperature is not a measure of energy nor is it a general statement about energy. Energy is measured in Joules, ergs, electron volts, Btu, calories, kWh whereas temperature is measured in K, oC etc., etc.
Ultimately the article should be corrected, but I have no intention of starting an edit war with so many editors convinced that temperature is a measure of energy. Lack of agreement and consequent edit warring is one of the depressing features of Wikipedia and I would like to find some way to avoid it. So if you are convinced that temperature is a measure of energy, please state your case here and at the same time explain why K, oC etc., etc. can be replaced by J, cal, erg etc. without introducing error.--Damorbel (talk) 06:33, 7 October 2010 (UTC)
- It has already been explained to you and others (above) that this is the language of physics and that even a unit of energy is an acceptable unit for measurement of temperature, and that the use of today's temperature scales is merely a historical artifact introducing a proportionality constant. You have been shown the equivalence of energy and temperature in natural units, the fundamental expression of the underlying physics of this aspect. Even the equation E=1/2 kT makes this clear, as there is no other physical concept entering but the equivalence of kinetic energy and temperature with a constant conversion factor that does nothing but reformulate the system of units. If you still don't believe this, then why don't you go and do your own private research, which you should have done when this was first explained to you, before expecting other editors to give you a personal physics lesson here. There are myriads of books that explain it in simple language and you will find the terminology used here in every physics book. This is not a discussion forum to find an audience for your views or to get a personal lesson. Kbrose (talk) 18:39, 7 October 2010 (UTC)
- Well, I hate edit wars too. A "measure" of a physical quantity is something which has a one-to-one relationship with that quantity (A bijection) for the given situation. The two quantities need not have the same physical dimensions. In the case of the Kelvin scale, at equilibrium, kT/2 is equal to the energy per degree of freedom (for any degree of freedom that is not "frozen out" by quantum considerations). The temperature T is therefore certainly a "measure" of this energy. For situations not even close to equilibrium, temperature is not defined. For a non-absolute scale like Celsius, temperature is proportional to the energy per degree of freedom plus some constant, which is also a "measure" of energy. PAR (talk) 17:50, 7 October 2010 (UTC)
- I would like to better understand your concern, Damorbel. I have a similar statement I would like your comment on: "The height of a column of mercury is a measure of the atmospheric pressure." I consider that statement to be analogous to the one "temperature is a measure of the energy per degree of freedom". Temperature, like millimeters-of-mercury, is easily measured; microscopic energy, like pressure, is a well-defined physical concept that is not directly observable. Do you consider the analogy valid? If you can explain why not, then I think I'll be closer to understanding your objection. Spiel496 (talk) 17:26, 7 October 2010 (UTC)
Kbrose, I think it would be useful if the definition of the Boltzmann constant and temperature were made consistent. The Boltzmann constant is not a simple conversion ratio like that relating two temperature scales such as oC and oF (forgive me if I have not understood what you mean here). The Boltzmann constant gives a number denoting how much energy is contained in the Degrees of freedom (physics and chemistry). For example an atom of a monatomic gas (e.g. He) at 1 K, having 3 degrees of freedom, possesses about 3 × ½ × 1.38×10−23 ≈ 2.07×10−23 J, whereas a diatomic gas molecule having 5 degrees of freedom (e.g. O2) would possess 67% more energy at 1 K. Thus two gas molecules at the same temperature contain different amounts of energy; molecules of Helium contain 60% of the energy of Oxygen at the same temperature. This simple example illustrates why temperature cannot be defined as energy.--Damorbel (talk) 20:49, 7 October 2010 (UTC)
- No, the Boltzmann constant has nothing much to do directly with degrees of freedom, and the concept of degrees of freedom doesn't make any statements about energy. The article Boltzmann constant gets along just fine defining itself without referencing degrees of freedom. 'Degrees of freedom' is a concept of multiplicity of oscillators, independent directions of movement etc, just like the term 'number of atoms' reflects the multiplicity of atomic entities. Naturally, if counting energy, one sums over all available particles and over all available degrees of freedom accessible. The Boltzmann constant is the conversion factor to relate microscopic properties to macroscopic properties. By virtue of this, it is plausible that it introduces a change in units. To understand the pure physics of it all, set k = 1. Kbrose (talk) 21:59, 7 October 2010 (UTC)
- Kbrose, you write "the Boltzmann constant has nothing much to do directly with degrees of freedom", but the article on the Boltzmann constant mentions 'degrees of freedom' at least six times in different contexts, e.g. here where it says "Monatomic ideal gases possess three degrees of freedom per atom, corresponding to the three spatial directions, which means a thermal energy of 1.5kT per atom". This simple statement makes it clear that the temperature of an atom is a function of the energy of that atom, or more accurately the energy in the available degrees of freedom of that atom. Diatomic molecules such as H2 and O2 with the same amount of energy as monatomic molecules with 3 degrees of freedom have a lower temperature because the energy is spread over more (5) degrees of freedom. If He and O2 with equal energy per molecule are mixed, the molecules of each type will exchange energy by colliding, according to kinetic theory, until on average they all have the same temperature; the O2 molecules will then have, on average, about 67% more energy than the He because they have 5 degrees of freedom, but they will, on average, have the same temperature. --Damorbel (talk) 15:17, 8 October 2010 (UTC)
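The He/O2 comparison being argued over follows directly from assigning kT/2 per active degree of freedom. A quick sketch (the temperature is an arbitrary choice; the 3-vs-5 degree-of-freedom counts are the ones used in the discussion above):

```python
k = 1.380649e-23  # Boltzmann constant, J/K

def mean_energy(dof, T):
    """Equipartition estimate of mean energy per molecule: kT/2 per degree of freedom."""
    return 0.5 * dof * k * T

T = 300.0  # arbitrary illustration temperature, K
E_He = mean_energy(3, T)  # monatomic: 3 translational degrees of freedom
E_O2 = mean_energy(5, T)  # diatomic: 3 translational + 2 rotational
print(f"He: {E_He:.3e} J, O2: {E_O2:.3e} J, He/O2 = {E_He / E_O2:.0%}")
```

At the same temperature the O2 molecule carries 5/3 (about 67% more) of the He atom's energy, i.e. He carries 60% of O2's, which is the ratio both sides of this exchange are using.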
- This is truly maddening and a waste of time. Perhaps you can stay on focus only once and not constantly slide into your favorite energy calculations. The article defines the Boltzmann constant without reference to degrees of freedom. Is that so hard to discover? Of course, in any energy calculation that relates to microscopic properties the constant will be involved, since our system of units demands it, that's the whole purpose of the constant. Kbrose (talk) 16:05, 8 October 2010 (UTC)
- WP:Waste of Time :) . Count Iblis (talk) 16:39, 8 October 2010 (UTC)
- Kbrose you will find written in the article here "temperature is a measure of the thermal energy held by a body of matter." (first sentence). You should be aware that this contradicts the introduction to the article where it says "the thermodynamic definition of temperature can be interpreted as a direct measure of the average energy in each degree of freedom of the particles in a thermodynamic system" (last paragraph), after this somebody has added a note 'dubious, discuss' which is what I am doing. As part of the invited discussion I suggest you say if you also consider these statements to be contradictory and why. --Damorbel (talk) 08:53, 9 October 2010 (UTC)
- Damorbel, you're saying the same thing as the article: Temperature is a measure of the kinetic energy per degree of freedom. The O2 molecule and He atom both have the same energy per degree of freedom, kT/2, with k = 1.38x10-23 J/K. Does the article contradict this somewhere? If so, we should fix it. Spiel496 (talk) 21:37, 7 October 2010 (UTC)
- Yeah, Damorbel you have a good point about Temperature#Overview. It leads with "temperature is a measure of the thermal energy held by a body". That's wrong, because it doesn't say "per degree of freedom". For example, because of the two rotational modes, oxygen will have more thermal energy than argon at the same temperature. In fact, the statement doesn't even say "per mole", so if taken literally, it implies temperature is proportional to quantity! There is at least one other place where the prose doesn't specifically normalize by the degrees of freedom. Damorbel (and others), would you consider the article more accurate if it consistently said "temperature is a measure of the average kinetic energy per degree of freedom per particle"? (That's horrible prose, but humor me.) Or is the problem deeper than that? Spiel496 (talk) 21:48, 8 October 2010 (UTC)
- Spiel496 - I think it's better to say energy per translational degree of freedom (i.e. kinetic energy of the whole molecule as a result of its motion). The other degrees of freedom will have kT/2 only if they are not in the quantum regime, where their energy is contained in a few energy levels, or maybe the zero level if they are frozen out. That practically never happens for the translational degrees of freedom. PAR (talk) 22:37, 8 October 2010 (UTC)
- I agree that limiting the statement to translational KE avoids quantum effects, and indeed it avoids mention of the other degrees of freedom. But then can the statement be made to apply to solids and liquids? (Assume high enough temperatures, for the moment.) Spiel496 (talk) 00:10, 9 October 2010 (UTC)
- Spiel496, PAR the crucial factor is the 'per degree of freedom'; the translational degree of freedom is the most important not just because 'monatomics' don't have any other (!) but also because 'particles' of gas can only exchange energy by collision. Liquids and solids exchange energy along the bonds that keep them together (as liquids, crystals, amorphous solids etc.). These bonds also contain energy which is seen as latent heat, heat of fusion etc. The Wiki article on heat capacity seems a good ref. to me; it also discusses quantum effects extensively, but I haven't checked it through. You should also be aware that, by explaining apparent anomalies in various specific heats, quantum theory achieved a major triumph. --Damorbel (talk) 08:53, 9 October 2010 (UTC)
Spiel496, the relationship between atmospheric pressure and the height of a column of mercury is a simple ratio of the effect of gravity on a column of mercury and a column of gas. The matter becomes far more complicated when considering the actual height of the two columns, because mercury has a density almost independent of pressure, so the force exerted by a column of mercury is very closely related to its height, whereas the height of a column of gas (should you be able to decide what its height is in the first place) depends strongly on temperature, which is not the case for mercury. --Damorbel (talk) 20:49, 7 October 2010 (UTC)
- Whoa! Stop after your first sentence. I'm talking about a simple mercury barometer -- one column of mercury, no column of gas, no temperature stuff. The point I was going to make is that your objection about units sounds equivalent to someone saying "the height of a column of mercury cannot be a measure of pressure, because the height is in meters, whereas the pressure is in Pascals". I expect your objection goes deeper than just the units of Kelvin vs Joules. Does it? Spiel496 (talk) 21:20, 7 October 2010 (UTC)
You can also take this perspective. Thermodynamical quantities like temperature stay relevant in the macroscopic realm. If you take the universe and scale it down and down until the microworld vanishes from view, you can still readily measure temperature, heat flow etc. The directly observable effects of this being due to statistical effects at the molecular level (e.g. Brownian motion) can be made arbitrarily small while scaling things down.
Observers living close to the scaling limit will likely discover thermodynamics without the proper theoretical foundations. Temperature then cannot be related in a universal way directly to the known mechanical quantities, so it will be assigned a new unit that cannot be expressed in terms of the known units. When people later do discover the fundamental relations, they will tend to stick to the old units, which means that the constant relating temperature to energy will have to cancel out the previously assigned unit for temperature.
This is why kb = 1.38...*10^(-23) Joule/Kelvin
The Kelvin cancels out the unit we've decided to give temperature. And the 1.38...*10^(-23) is small because we are pretty far removed from the microworld. We have chosen a unit for energy such that typical daily life phenomena will have energies within a few orders of magnitude of 1. In the exact scaling limit, the unit for energy would be sent to infinity and the magnitude of the Boltzmann constant would thus become zero. So exactly at the scaling limit, you lose the relation between temperature and (renormalized) energy. At that scaling limit, you really have to define a unit for temperature. The reason why we have an independent unit for temperature is because, being so close to the scaling limit, our ancestors could not readily observe the effects that demonstrate that we are not precisely at the scaling limit.
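[Editorial note: the magnitude argument above can be made concrete. A short sketch comparing kT at room temperature with an everyday macroscopic energy; the water-heating figure (~4186 J per kg per K) is a standard textbook value used purely for illustration:]

```python
# The Boltzmann constant merely converts between the kelvin (a historically
# independent unit) and the joule (a macroscopic energy unit). At everyday
# temperatures the corresponding microscopic energy scale is tiny:
k = 1.380649e-23   # Boltzmann constant, J/K

T_room = 300.0
kT = k * T_room    # microscopic thermal energy scale at room temperature
print(f"kT at {T_room} K = {kT:.3e} J")   # about 4.1e-21 J

# Compare with a macroscopic energy: heating 1 kg of water by 1 K (~4186 J).
# The enormous ratio is what Count Iblis means by being "far removed from
# the microworld".
ratio = 4186.0 / kT
print(f"macroscopic/microscopic energy ratio ~ {ratio:.1e}")
```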
See here for a similar argument about the speed of light. Count Iblis (talk) 23:30, 7 October 2010 (UTC)
- Count Iblis, if you mean that temperature is an intensive property i.e. it is defined locally (it can change markedly over small distances) and discretely (per degree of freedom), I do not see how it can 'lose the relation between temperature and (renormalized) energy' --Damorbel (talk) 15:17, 8 October 2010 (UTC)
- If e is some energy difference in some microscopic process (e.g. needed to excite a vibrational mode), then e/(your unit of energy) ---> 0 as you let "your unit of energy" ---> infinity. The larger you are, the larger the unit for mass you would use. Giants of size x will use a giant version of the kilogram of mass, proportional to x^3, and the energy needed to heat one of their kilograms of water from freezing point to boiling point scales as x^3. But that energy will be renormalized to a new unit. So, as you increase x, that amount of energy will stay constant, so e/(your unit of energy) will scale as x^(-3). Count Iblis (talk) 16:47, 8 October 2010 (UTC)
Definition of temperature in Lead (3)
I quickly scanned your discussion, and I think no one realized that the energy, in general, has no fixed relationship with temperature. Most of your formulas are only applicable to ideal gas, or even only monatomic ideal gas, but temperature is a universal property that any macroscopic body has. The only way to define it is through the zeroth law.--Netheril96 (talk) 01:09, 8 October 2010 (UTC)
- Temperature is directly proportional to the energy per (available) degree of freedom at equilibrium. That sounds very fixed to me. Just because the units of temperature are different from the units of energy does not mean that temperature is not a measure of energy. Because of the fact that temperature is proportional (not equal, proportional via the constant k/2!) to the energy per degree of freedom, it is therefore a "measure" of that energy. This follows from the definition of "measure". Why is this so hard to understand? Why do the people who argue against temperature as a measure of energy refuse to confront and deal with these simple facts? PAR (talk) 01:28, 8 October 2010 (UTC)
- Because your "facts" are oversimplified. Again, that relationship only applies to a monatomic ideal gas. Even a triatomic ideal gas like thin water vapor deviates from the theory, not to mention real gases (liquids, solids) whose molecules interact with each other. Or maybe you can tell me how to measure the energy of water in a particular degree of freedom. I can only measure U, H, G, F, S, which are functions of both T and V and sometimes other variables (for real substances).
- Even in natural units when k=1 and temperature has the same dimension as energy, that does not mean they are one thing.--Netheril96 (talk) 03:40, 8 October 2010 (UTC)
- I answered the non-monatomic gas question in the discussion that you "quickly scanned". If you disagree with something I posted, quote me and point out the error. The kT/2 formula applies to everything, unless someone brings up a counterexample or, heaven forbid, a reliable reference. Spiel496 (talk) 06:31, 8 October 2010 (UTC)
- I searched all your words in this page and didn't think that you had answered my question. Let's set it aside for now and (probably) come back later. Now I give you the mass, amount of substance, U, H, F, G of some water vapor, and you may take it as a Van der Waals gas for simplicity; please tell me T. Or can you directly measure the average kinetic energy, or the energy distributed over a particular degree of freedom?--Netheril96 (talk) 09:27, 8 October 2010 (UTC)
- I assume you include in your data other thermodynamic data like the specific heats of the substance. In that case, the internal energy is equal to pV/2 times the effective number of degrees of freedom available per molecule, and the effective number of degrees of freedom is equal to twice the molar specific heat at constant volume divided by the gas constant. (Definitely take a look at specific heat, particularly the specific heat#Energy storage mode "freeze-out" temperatures section.) I'm sure you will note that the "effective degrees of freedom" is not a fixed number except for certain limiting cases. For a thin monatomic gas at high enough temperature, it is essentially 3. For a diatomic gas at high enough temperature, it approaches 5. The reason for this fuzziness is that the energy is quantized in every degree of freedom, and when the quantum levels become too coarse in a particular degree of freedom, it will not hold an energy of kT/2 exactly. At cold enough temperatures, it will hold practically none and that degree of freedom gets frozen out. Even for a monatomic ideal gas of point particles, if you contain it in a box with one dimension much shorter than the others, at low enough temperature, that short dimension (short thermal wavelength, high thermal frequency) degree of freedom will freeze out, and the kinetic energy of the particles will not be 3/2 kT per particle, but rather 2/2 kT per particle. In short, I agree, if you take quantum considerations into account, temperature is, strictly speaking, not a good measure of energy per degree of freedom under all conditions. But I think the point that people are making is that the kinetic energy of translational motion is almost always negligibly affected by quantum considerations. The case of a small box with one side smaller than the others at extremely cold temperatures is hardly ever part of the problem.
Neglecting this bizarre case, the temperature is an excellent measure of this translational kinetic energy per degree of freedom. PAR (talk) 13:51, 8 October 2010 (UTC)
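[Editorial note: the bookkeeping in the comment above, C_v,m = (f/2)R and hence f = 2 C_v,m/R, can be sketched numerically. The heat-capacity values below are approximate room-temperature textbook figures, used only as an illustration:]

```python
# Effective degrees of freedom inferred from the molar heat capacity at
# constant volume: C_v,m = (f/2) R, hence f_eff = 2 * C_v,m / R.
R = 8.314462618  # molar gas constant, J/(mol K)

cv_molar = {
    "He (monatomic)": 1.5 * R,    # translational only -> f_eff = 3
    "N2 (diatomic)":  2.5 * R,    # + 2 rotational; vibration frozen out -> 5
    "H2O vapour":     3.04 * R,   # approximate; not a fixed integer
}

for gas, cv in cv_molar.items():
    f_eff = 2.0 * cv / R
    print(f"{gas:16s} f_eff = {f_eff:.2f}")
```

Note how the "effective degrees of freedom" comes out as a non-integer for water vapour, consistent with the freeze-out discussion above.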
- No, I didn't intend to tell you the specific heat. Without defining temperature itself first, how could you get specific heat as heat absorbed per temperature rise? You said a lot, and the only thing I learned from it is that translational kinetic energy per degree of freedom is only a rough estimate for temperature, not a definition. Besides, you still haven't answered my question: how does one measure the average translational energy when there is definitely interference from potential energy?
- And I think we may have got lost about what we want. I don't object to taking temperature as a rough measure of translational energy, --Netheril96 (talk) 01:17, 9 October 2010 (UTC)
- Oh that's right, using specific heat makes it a circular argument. I will think more about that. But why do you say rough? If you say that it's a measure, then it's a very precise measure: the constant of proportionality between translational energy and temperature is 3k/2 to within parts in billions or more for most cases. That's not rough. PAR (talk) 01:47, 9 October 2010 (UTC)
- this is more rigorous, but I haven't finished writing this up (as it stands the text so far is a bit misleading; I have to invoke ergodicity and other subtle points). Then, from working in the canonical ensemble, you can derive the equipartition theorem in the classical limit. Count Iblis (talk) 01:30, 9 October 2010 (UTC)
- What you give me IS a rigorous definition. But that is totally different from what we are talking about. And how many times must I ask before any of you will answer me how to measure the translational energy?--Netheril96 (talk) 04:52, 9 October 2010 (UTC)
PAR, your contribution, the second paragraph in this section: temperature is a measure of specific energy, not energy in general; that is what needs to be made clear. Temperature is somewhat comparable with specific gravity (SG), which is kg/m3; you would (I hope!) never quote or use SG in kilograms (the unit of mass). Temperature has a big brother called entropy, which is a measure of the distribution of energy and is by contrast an extensive property defined in the theory of statistical mechanics. Temperature does not need statistical mechanics to define it; it is one of the assumptions on which statistical mechanics is based. --Damorbel (talk) 09:15, 9 October 2010 (UTC)
- Damorbel - I WOULD use the weight (in kilograms) of a 1-inch cube of material as a "measure" of its specific gravity. If the weight is w kilograms, then the specific gravity is Kw, where K ≈ 61.02. The weight of the 1-inch cube is not EQUAL to its specific gravity, but it is a MEASURE of its specific gravity. If something is a "measure" of something else, the two things do not need to have the same dimensions. If I know the weight of the one-inch cube, then I can calculate the specific gravity. That makes the weight of the one-inch cube a "measure" of its specific gravity. I would NEVER quote the specific gravity as the weight of the cube. I WOULD say that the weight of the cube in kilograms is a "measure" of the specific gravity. By the way, entropy is not defined in statistical mechanics. It too has a thermodynamic definition which does not depend on molecules and microstates, etc. Entropy was defined by Clausius, long before Boltzmann explained it using statistical mechanics. Statistical mechanics gives an explanation of entropy, not a definition. I agree that temperature too is not defined by statistical mechanics, but is rather explained by statistical mechanics. This means that the idea that temperature is a measure of the energy per translational degree of freedom is not true by definition; it is true as a result of the assumptions of statistical mechanics. PAR (talk) 13:18, 9 October 2010 (UTC)
- Netheril96 - regarding the measurement of energy per degree of freedom. I think that it cannot be measured thermodynamically, using the parameters you have given. The idea that temperature is proportional (via Boltzmann's constant) to energy per degree of freedom is a statistical mechanical idea, not a thermodynamic idea, and so cannot be measured thermodynamically. But does this mean it is not true? The situation is the same for entropy. In statistical mechanics, entropy is proportional (via Boltzmann's constant) to the log of the number of microstates that could possibly yield the given macrostate, or thermodynamic state. Given the same thermodynamic parameters you have provided, there is no way that they can be used to calculate the number of microstates available. To be consistent, you must also say that entropy is not a measure of the number of microstates. Are you also saying this? PAR (talk) 13:38, 9 October 2010 (UTC)
- (I was editing at the same time as PAR and duplicated some of his points.) Netheril96, I think the answer is that one cannot directly measure the kinetic energy per degree of freedom or infer it from U, H, F and G. With a monatomic ideal gas, everything is proportional to temperature and it's easy. But with a non-ideal gas or a liquid, where potential energies between particles are important, there isn't any kind of simple relationship between internal energy and temperature. Is this the point you're making? I'll assume "yes" and forge ahead: Some of the energy is in chemical bonds, some of it is in the potential energy associated with Van der Waals and hydrogen-bonding forces, and some of it is in the various K.E. components. I think what Kbrose and others are saying is that, while all of that makes it very difficult to calculate the heat capacity, one relationship remains very simple: The amount of the energy in each K.E. component is proportional to that thing that becomes equal when two systems are in thermal contact -- the temperature. Spiel496 (talk) 13:40, 9 October 2010 (UTC)
- The Boltzmann relationship is different from the equipartition theorem in that the predictions of the former have been tested precisely, while the latter can only give approximate results (I mean results that can be experimentally tested). For example, it predicts that all diatomic gases have a molar specific heat capacity of 2.5R at room temperature, but in fact the molar specific heat capacities of H2, NO, and Cl2 are 2.53R, 2.57R, and 3.02R respectively. And before we continue, I think we agree to specify in the article that your definition would only be valid when quantum effects are negligible.--Netheril96 (talk) 02:11, 10 October 2010 (UTC)
- Yes, it predicts 2.5 if quantum effects are neglected, and the variations from 2.5 are due to quantum effects, none of which affect the translational degrees of freedom. Also, yes, the question remains, how do you measure the energy per translational degree of freedom? You say that the Boltzmann relationship relating entropy to the log of the number of microstates has been verified experimentally. Please explain how that has been done. Temperature and entropy are conjugate variables - their product is a measurable energy. You can say it's T dS, or you can say ε = kT/2, where ε is the translational energy per molecule per degree of freedom, and you can say S = k ln(f), where f is the number of microstates available; then k cancels and you get T dS = 2ε d(ln f). If you are saying that ε cannot be measured, then I would say neither can f, because the energy T dS CAN be measured. Or, to put it another way, the energy per degree of freedom is ε = (T dS)/(2 d(ln f)), which is measurable, if, as you say, "The Boltzmann relationship has been tested precisely", which means ln(f) is measurable. PAR (talk) 18:05, 10 October 2010 (UTC)
- OK, I admit I was wrong. But for your information, the deviation is not due to quantum effects; it is because of potential energy.--Netheril96 (talk) 05:38, 11 October 2010 (UTC)
- There's no deviation due to potential energy, and you both have the heat capacity incorrect. The prediction is 3 d.o.f. for the translational KE, 2 for the rotational modes, 1 for the vibrational KE and 1 for the vibrational potential energy. That's 7 total, so CV is 7/2*R=3.5R. That is the prediction for a diatomic gas, but quantization of the vibrational mode reduces it, especially for hydrogen. And so what? The energy in a non-frozen d.o.f. is still kT/2. Spiel496 (talk) 13:45, 11 October 2010 (UTC)
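[Editorial note: the reduction below 3.5R from vibrational freeze-out can be made quantitative with the standard harmonic-oscillator (Einstein) heat-capacity formula. The vibrational temperatures below are approximate literature values, used only to illustrate the trend, and reproduce the H2-versus-Cl2 contrast discussed above:]

```python
import math

# Vibrational contribution to C_v from one harmonic mode (one kinetic plus
# one potential quadratic term): C_vib/R = x^2 e^x / (e^x - 1)^2, x = theta/T.
# It approaches R at high T (both terms fully active) and 0 when frozen out.
def c_vib_over_R(theta_vib, T):
    x = theta_vib / T
    return x * x * math.exp(x) / math.expm1(x) ** 2

# Approximate characteristic vibrational temperatures (K), illustrative only.
theta = {"H2": 6332.0, "Cl2": 805.0}

T = 300.0
for gas, th in theta.items():
    cv = 2.5 + c_vib_over_R(th, T)  # 2.5R translation+rotation, plus vibration
    print(f"{gas}: C_v ~ {cv:.2f} R at {T} K")
```

At 300 K the H2 mode is almost completely frozen (C_v stays near 2.5R), while the softer Cl2 mode contributes about half its classical value, pushing C_v toward the ~3R figure quoted earlier in this thread.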
- The total number of degrees of freedom is 6, not 7. That the vibrational degree of freedom counts double in the energy (because of the potential energy term) doesn't mean that it should count twice as a degree of freedom. The whole point of d.o.f. is to describe in how many independent ways the molecule can move. In the case of a diatomic molecule in which the two atoms can move independently, that is clearly 6. Count Iblis (talk) 14:45, 11 October 2010 (UTC)
- That cannot be right. If it's one degree of freedom, then it gets energy kT/2, end of story. Off the top of my head, I believe the potential energy and kinetic energy together form the total energy kT/2 for the one vibrational degree of freedom, but this is at odds with Specific heat#Diatomic gas. PAR (talk) 14:53, 11 October 2010 (UTC)
- Any quadratic term in the Hamiltonian gets a kT/2 contribution to the internal energy in the classical limit by the equipartition theorem. But d.o.f. is a kinematic concept that you want to use as a bookkeeping device. So you want to say that a two-atom system has 6 d.o.f. because the two atoms can move independently, and that you can also decompose the motion in terms of center-of-mass movement, rotation and vibration, and then you get 6 = 3 + 2 + 1. Then the fact that the 1 vibrational d.o.f. counts double when the two atoms are bound is due to the potential energy term, but a potential energy term is always present, even if the two atoms are not bound.
- So awarding it an extra degree of freedom cannot be done merely because there is a potential energy term; you would have to consider the precise form of the effective potential energy function. If this were an x^n potential, the contribution to the internal energy would be kT/n, so in the end it wouldn't even satisfy the desire to make the internal energy in the classical limit equal to (d.o.f./2) kT in all cases. But to even consider the form of the effective potential energy function would be strange, because that would already in principle be based on a thermodynamic calculation (the potential is not precisely a harmonic potential; you can approximate it like that but with a cut-off, since the molecule will dissociate at high temperatures). Count Iblis (talk) 15:25, 11 October 2010 (UTC)
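[Editorial note: the kT/n claim for an x^n potential follows from the generalized equipartition theorem, since the classical thermal average is ⟨V⟩ = ∫ V e^(-V/kT) dx / ∫ e^(-V/kT) dx = kT/n for V = |x|^n. A small numerical check, in units where kT = 1:]

```python
import math

# Classical thermal average of a one-dimensional potential V(x) = |x|^n.
# Generalized equipartition predicts <V> = kT/n (kT/2 only for n = 2).
def mean_potential(n, kT=1.0, xmax=30.0, steps=200000):
    dx = xmax / steps
    num = den = 0.0
    for i in range(steps):             # midpoint sum over x > 0 (even integrand)
        x = (i + 0.5) * dx
        v = x ** n
        w = math.exp(-v / kT)
        num += v * w * dx
        den += w * dx
    return num / den

for n in (2, 4, 6):
    print(f"n = {n}: <V>/kT = {mean_potential(n):.4f}  (expected {1/n:.4f})")
```

The harmonic case n = 2 recovers the familiar kT/2, while steeper potentials hold less average potential energy, which is exactly why "one d.o.f. = kT/2 of potential energy" is not a universal rule.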
- Well, clearly I should stop talking off the top of my head and revisit this subject. But if what you are saying is true, then the statement that each molecular degree of freedom contains an energy of kT/2 per molecule is false. That's what you are saying, right? PAR (talk) 20:58, 11 October 2010 (UTC)
- Hopefully my wording won't shoot me in the foot, but I believe that this would mean that each molecular degree of freedom "contains" a 'kinetic' energy of kT/2. Consideration of potential energy as well adds an additional amount of kT/2 for each degree of freedom. As Count Iblis mentioned, the amount should depend on the nature of the potential. In the case of a diatomic ideal gas, the only dof that is associated with a potential is the vibration. Since rotations and translations (for ideal gases at least) don't have an appreciable potential, only their kinetic energy contributes to the heat capacity and thus to the total energy. Another example would be to look at a solid where the classical heat capacity (before Einstein and Debye - or at high temperatures) should be equal to 3Nk, N being the number of particles in the solid. This result appears to have a heat capacity of k for each degree of freedom and a total energy of 3NkT, which can, in principle, be interpreted as kT/2 worth of kinetic energy and kT/2 worth of potential energy for each dof. Sirsparksalot (talk) 21:59, 11 October 2010 (UTC)
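[Editorial note: the classical-solid bookkeeping above is the Dulong-Petit result, and the arithmetic is short enough to spell out with the standard values of R and Avogadro's number:]

```python
# Dulong-Petit bookkeeping for a classical solid: each atom has 3 vibrational
# degrees of freedom, each carrying kT/2 kinetic plus kT/2 potential energy,
# so U = 3NkT and the molar heat capacity is 3R.
R = 8.314462618        # molar gas constant, J/(mol K)
N_A = 6.02214076e23    # Avogadro constant, atoms per mole
k = R / N_A            # Boltzmann constant, J/K

dof_per_atom = 3
quadratic_terms_per_dof = 2   # one kinetic + one potential term
u_per_atom_per_kelvin = dof_per_atom * quadratic_terms_per_dof * 0.5 * k  # = 3k

c_molar = u_per_atom_per_kelvin * N_A   # = 3R
print(f"classical molar heat capacity = {c_molar:.1f} J/(mol K)")  # ~24.9
```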
To Netheril96 - ok, but wrong about what? I think you are wrong that the number of microstates can be directly measured, but that still leaves your original question: How do you measure translational energy per degree of freedom, and if you can't, what does the concept really mean? For this article, I would be in favor of a statement something like "If quantum effects are negligible, temperature is a measure of the energy per molecule per degree of freedom of a substance. In any macroscopic situation, the translational degrees of freedom are practically unaffected by quantum considerations, and therefore the temperature is a very accurate measure of the translational kinetic energy per molecule of a substance." But I still wonder what that means if you cannot measure it. PAR (talk) 14:48, 11 October 2010 (UTC)
- In a word, I am confused. I had thought the equipartition theorem only held precisely for gases whose molecules don't interact, before I read some books on statistical mechanics and found that it holds under any circumstances in classical statistical mechanics. The microstates in fact cannot be measured, but the corollaries derived from them work perfectly, which is why I said the Boltzmann relationship had been experimentally verified. But since the equipartition theorem is derived from the statistical mechanics postulates, I think maybe it no longer needs experimental test. So far this is what I have in mind on this topic. When I actually finish my course on statistical mechanics (maybe next year) I will have a more definite opinion.--Netheril96 (talk) 15:52, 12 October 2010 (UTC)
---
Wow! A lot has been said in the last week since I've been able to check in on the discussion. But I am glad that this debate has finally opened up, as it is LONG overdue. I agree that the article has to be heavily edited, particularly with regard to the statement relating temperature and energy. My main issue is that so many people are making so many over-generalized statements. While it is true that one can determine the mean translational kinetic energy from the temperature (and vice versa), it is only one specific case that has a simple (i.e. direct) proportionality between temperature and energy. Yes, quantum effects are negligible for translational degrees of freedom, but unless we are talking about a monatomic ideal gas (which we almost never do when considering the number of thermodynamic systems of interest!) quantum effects are VERY important and absolutely cannot be overlooked. It is perfectly OK to say that there is a relationship between energy and temperature, but there are far too many details to ever be able to put into a wiki article such as this. As nice as it would be to be comprehensive, I think it is more pertinent to get the main points across. Rather than bicker about the details of how they are related, why can't we just say that they are related and move on? We could put a few examples of these relationships for a few specific cases (e.g. the average translational kinetic energy of a gas is E_kin = (3/2)kT, or, since ideal monatomic gases only have translational degrees of freedom, E = (3/2)kT can be used for the average total energy of an ideal gas molecule). What do people think about this approach? Sirsparksalot (talk) 19:41, 11 October 2010 (UTC)
- I agree with everything you say, except I hope we can craft a statement for the Lead section that would encompass a broader class of systems than monatomic gases. Saying the temperature is a measure of the translational kinetic energy of the molecules is accurate enough for molecular gases. This statement remains true even when there are significant inter-molecular forces. For solids, yes, one must bring up a bunch of caveats for quantum effects plus the fact that the motion is solely vibrational. I say leave that level of detail out of the Lead. Spiel496 (talk) 00:51, 12 October 2010 (UTC)
- Sorry for my miscommunication; the intention of putting the monatomic gas stuff in there was for it to be one example of the kind of examples we could list. I didn't intend it to be exclusive. What about putting a new section in the article? We could clean up a lot of the stuff that is spread all over and lump it into one section like "relationship between energy and temperature", with a few examples for different systems. Sirsparksalot (talk) 16:19, 12 October 2010 (UTC)