See? We agree that there is an equilibrium macrostate. When I first asked you to clarify what you meant by macrostate you told me that “it changes with time; even at equilibrium.”
We agree that the entropy depends on the macrostate. Note that the macrostate is our description of the system and depends on how we choose to describe it, which normally depends on what the constraints are, how it was prepared, etc. It’s not just a property of the position of those balls.
It’s because we agree that Gibbs’ entropy is a function of the macrostate that I asked how you defined it in your example. You told me: “Temperature as measured by a thermometer is the macrostate. Every possible configuration of particles (microstates) that causes mercury to rise to a certain degree represents a different macrostate.”
I asked “How does your configuration where the balls were near one corner in the cube cause mercury to rise to a different level than the configuration where they occupy a larger volume near the center?” and the answer “The balls have to touch thermometer” doesn’t cut it. The balls don’t touch the thermometer in either case.
You seemed to imply that the higher concentration means a different macrostate with lower entropy. Or maybe the low entropy in your example is because the balls are near a corner?
Anyway, it would indeed have been easier to say that the definition of the macrostate included the density of particles in each octant of the cube - or something like that.
>See? We agree that there is an equilibrium macrostate. When I first asked you to clarify what you meant by macrostate you told me that “it changes with time; even at equilibrium.”
I still stand by my statement. Even at equilibrium the entropy can drop. The equilibrium is simply the highest entropy state.
>I asked “How does your configuration where the balls were near one corner in the cube cause mercury to rise to a different level than the configuration where they occupy a larger volume near the center?” and the answer “The balls have to touch thermometer” doesn’t cut it. The balls don’t touch the thermometer in either case.
I stated this is pedantry. The concept and intuition remain true. I changed the definition so that it's a volume around the thermometer: if a particle is in that volume and heading for the thermometer, it counts as a collision.
I stated all of this already.
>You seemed to imply that the higher concentration means a different macrostate with lower entropy. Or maybe the low entropy in your example is because the balls are near a corner?
Yes. The higher concentration has a lower probability of occurring. And occupies a different temperature reading on the thermometer. Each temperature reading is a different macrostate.
>Anyway, it would indeed have been easier to say that definition of macrostate included the density of particles in each octant of the cube - or something like that.
Sure. Divide the box into a bunch of cubes. If 1 or more particles are in a cube then that cube represents a 1, otherwise a 0. Add those numbers up and the sum represents a macrostate.
The intuition remains the same. For all particles to be concentrated in 1 cube is a very low probability. And the macrostate number will be quite low too. With enough cubes and boxes such a state has a very low probability of occurring.
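As a minimal back-of-envelope sketch of just how low (my own toy numbers, assuming each particle independently lands in each cube with equal probability):

```python
# Toy check of the "very low probability" claim (assumed numbers).
# N particles, box cut into M equal cubes, uniform independent positions.
N, M = 20, 8

p_one_specific_cube = (1 / M) ** N           # all N particles in one given cube
p_any_single_cube = M * p_one_specific_cube  # all N in some cube, whichever

print(p_any_single_cube)  # ~6.9e-18 already, with only 20 particles
```

With realistic particle numbers (~10^23) the exponent is astronomically larger, which is the point.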
But all of this is, again, independent of your knowledge of where the particles are in each cube.
> And occupies a different temperature reading on the thermometer.
You are unable to explain how reducing the space occupied by the initial configuration you proposed could change the temperature - which you say is 0 in all those cases - or the macrostate - which you call the absolute zero macrostate in all those cases. The entropy would be the same whenever the particles are more concentrated than in your example - even though they would have a lower probability of occurrence. Don't you agree?
> With enough cubes and boxes such a state has a very low probability of occurring.
Sure. The thing is that if you calculate an entropy using Gibbs formula from the distribution of microstates for a given macrostate the value that you obtain depends on how many cubes are used to define the macrostate. There is no entropy of the microstate - the entropy depends on how you decide to define the macrostate. If the lattice is fine enough, and the particles indistinguishable, in the limit the entropy is zero - the macrostate becomes the same as the microstate and one single microstate is possible.
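Here is a minimal numerical sketch of that limit (my own toy model, not your setup: N indistinguishable particles on M sites with at most one per site, all placements equiprobable; a macrostate at resolution m reports the particle count in each of m coarse cells):

```python
from math import comb, log

M, N = 16, 4                      # assumed toy sizes
occupied_sites = [0, 4, 8, 12]    # one example microstate: evenly spread

for m in [1, 2, 4, 16]:           # coarser to finer macrostate definitions
    sites_per_cell = M // m
    counts = [sum(1 for s in occupied_sites if s // sites_per_cell == c)
              for c in range(m)]
    omega = 1                     # microstates consistent with the macrostate
    for n in counts:
        omega *= comb(sites_per_cell, n)
    print(f"m = {m:2d} cells: S = ln(Omega) = {log(omega):.3f}")

# m = 1  -> S = ln C(16,4) ~ 7.51 (the macrostate says nothing about positions)
# m = 16 -> S = 0 (the macrostate pins down the microstate exactly)
```

Same microstate, four different entropies, depending only on how finely the macrostate is defined.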
It all depends on the description we make of the system. The "I have balls in a box and I know their positions" example was incomplete.
If for example I have N of those balls in equilibrium in an isolated box of volume V and I know the energy E I know the equilibrium macrostate. The energy won't change because it's isolated. The macrostate will not change. The (equi)probability distribution of the possible microstates corresponding to the macrostate won't change. Gibbs entropy which is calculated using that distribution won't change.
If by sheer luck all gas particles moved to the exact left side of the container the energy wouldn't change, the macrostate wouldn't change, the entropy wouldn't change - it doesn't matter how unlikely that is. It doesn't matter if you know the position of each particle. The entropy for the thermodynamical system described in the previous paragraph is independent of your knowledge of where the particles are or how unlikely you think their positions are.
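(For reference, this is just the standard microcanonical formula, nothing specific to my system:

$$ S(N,V,E) = k_B \ln \Omega(N,V,E) $$

where $\Omega(N,V,E)$ counts the microstates compatible with the fixed N, V and E. Since none of those variables change for the isolated box, S cannot change either, whichever microstate happens to be realized.)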
I agree that you could use alternative ways of defining the macrostate where that happens! (And calculating Gibbs entropy will give different results. One can even get zero entropy using the microstate as macrostate - and one may actually want to do that if the microstate is known!)
> Even at equilibrium the entropy can drop. The equilibrium is simply the highest entropy state.
I would understand that you said either:
"the equilibrium is the highest entropy state but there may be fluctuations that shift the system temporarily out equilibrium"
or
"the equilibrium is the highest entropy state and possibly a distribution of states around it."
Both concepts are used. When you say that the equilibrium is simply the highest entropy state but the equilibrium entropy can also be lower I'm not sure if it can be read as one of those or you're saying something else entirely.
>You are unable to explain how reducing the space occupied by the initial configuration you proposed could change the temperature - which you say is 0 in all those cases - or the macrostate - which you call the absolute zero macrostate in all those cases. The entropy would be the same whenever the particles are more concentrated than in your example - even though they would have a lower probability of occurrence. Don't you agree?
You're unable to read my explanation which I've already repeated twice. Go back and look it up. I even went with your cubical definition.
I don't agree. Entropy is lower when the current macrostate has a low probability of occurring.
>There is no entropy of the microstate - the entropy depends on how you decide to define the macrostate
The macrostate is defined in terms of possible microstates. Thus entropy relies on both microstate and macrostate.
>If by sheer luck all gas particles moved to the exact left side of the container the energy wouldn't change, the macrostate wouldn't change, the entropy wouldn't change - it doesn't matter how unlikely that is.
Wrong. Energy wouldn't change. Macrostate changes. Entropy changes.
>It doesn't matter if you know the position of each particle. The entropy for the thermodynamical system described in the previous paragraph is independent of your knowledge of where the particles are or how unlikely you think their positions are.
Isn't this my point? And weren't you against my point? Now you're in agreement? My point was entropy is independent of knowledge. Your point was that it is dependent.
>I agree that you could use alternative ways of defining the macrostate where that happens! (And calculating Gibbs entropy will give different results. One can even get zero entropy using the microstate as macrostate - and one may actually want to do that if the microstate is known!)
Except you don't have to do this. My point remains true given MY stated definitions of macrostate.
>Both concepts are used. When you say that the equilibrium is simply the highest entropy state but the equilibrium entropy can also be lower I'm not sure if it can be read as one of those or you're saying something else entirely.
I made a true statement that, from what I can gather, you agree is true. You're just trying to extrapolate my reasoning behind the statement. It's pointless. Example: if I give you a number, say -1, do I mean i^2 or 0 - 1? Because one of those equations is impossible to express in reality.
>>If by sheer luck all gas particles moved to the exact left side of the container the energy wouldn't change, the macrostate wouldn't change, the entropy wouldn't change - it doesn't matter how unlikely that is.
> Wrong. Energy wouldn't change. Macrostate changes. Entropy changes.
That's my thermodynamical system - that I decided to describe using the state variables N, V, E. S_kgwgk = constant
I have agreed that you can choose to define macrostates as you wish! (You cannot measure them though. You don't have access to my system. They are purely hypothetical.) And then you can say things like "If the position of the particles in your system is X, S_deltasevennine = whatever."
Hopefully you will agree that someone else could choose to define the macrostate differently and say things like "If the position of the particles in kgwgk's system is X, S_anon = somethingelse."
Let's imagine that the position of the particles is actually X. What is the entropy of the system then? S_kgwgk? S_deltasevennine? S_anon? They are all different!
If you think that you can define macrostates for my system in some arbitrary way (your own words: "I'll choose something arbitrary.") and others can't - or that yours are somehow more real - I really wonder why.
And as I said, someone could also come and say "If the position [and velocities] of the particles in kgwgk's system is X, S_vonneumann = 0." [a complete definition of the microstate requires knowing both position and momentum - there might have been some ambiguity about that before but it shouldn't distract us from the main points]
I thought we agreed on my definition. If you want to make your own definitions of macrostate sure. No rule against that, I just don't see your point.
>If you think that you can define macrostates for my system in some arbitrary way (your own words: "I'll choose something arbitrary.") and others can't - or that yours are somehow more real - I really wonder why.
I really wonder why you even say this. Anyone can make up a macrostate; we just choose to agree on one for discussion's sake. But in the real world things like pressure and temperature are some universally agreed-upon ones. I simply made one up so we could center our discussion around it and you took it into other territories.
You can switch up definitions all you want. But no matter your definition of macrostate, this remains true:
Entropy is independent of knowledge. Your initial argument was against that. I haven't seen you make any argument in your own favor. Just side discussions on whose definition of macrostate to use.
> If you want to make your own definitions of macrostate sure.
Note that I was the first one to propose a particular way to define the macrostate of the system - using energy - trying to understand what you meant. Your reply was "that's not the definition of macrostate. It's just one arbitrary choice you have chosen." And you gave your own arbitrary choice "I'll choose something arbitrary. Temperature as measured by a thermometer".
The point is that they are all arbitrary. (Energy is a conserved quantity for a closed system so it's arguably more natural - but let's say that any choice is equally arbitrary.)
> But no matter your definition of macrostate, this remains true:
> Entropy is independent of knowledge.
Maybe we can agree that [Gibbs] entropy [which is a property of the distribution of microstates corresponding to our description of the system based on some macroscopic properties] is independent of knowledge [other than about that particular macroscopic description].
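(For concreteness, the Gibbs formula in question, with $p_i$ the probability of microstate $i$ under the chosen macroscopic description:

$$ S = -k_B \sum_i p_i \ln p_i $$

which reduces to $S = k_B \ln \Omega$ when the $\Omega$ compatible microstates are equiprobable.)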
The discussion started with me trying to understand what you meant by "low entropy" when you said:
"Let's say those balls all have a random initial velocity at a random direction but all those balls are initially positioned near one corner in the cube. Thus the balls from a position stand point start with low entropy."
We agree that if the box is isolated and we define the state using N,V,E there is one single macrostate and one single value for the entropy - independent of the position of the particles. We agree that we can define thermodynamical systems using other variables and calculate other entropies. We agree that the entropy is not determined by the configuration of the particles alone.
(Even when a thermodynamical system is defined the entropy is not necessarily a function of the microstate when it refers to only a part of the system. The microstate may not fully determine the macroscopic description. If that box is in equilibrium in a heat bath the temperature is fixed and there is some corresponding entropy. But the same microstate can happen for equilibrium systems at different temperatures and therefore with different entropies.)
You said that you're using temperature as measured by a thermometer as the macroscopic property describing the system. And I'm still trying to understand how "higher concentration" means "lower entropy" even in the context of your own arbitrary choice.
Temperature works well as a state variable for a system in thermal equilibrium which has the same temperature everywhere. You mention the level of mercury and the way a thermometer works is by reaching thermal equilibrium between the mercury and the thing being measured. Note that if for some reason all the particles in the gas go away from the thermometer for a second the temperature of the mercury won't change (ignoring that it will radiate energy over time). The reading of the thermometer doesn't drop to zero just because there is nothing there.
Let's assume that you are somehow measuring the local temperature in a small region around some specific point. What you measure will not depend in any way on the particles that are elsewhere. The particles in the rest of the box could be well spread or all near one corner and you would have no reason to say that the latter configuration is lower entropy than the former based on your macrostate.
Another way to look at it: your reasoning seems to be that the state with the particles near one corner has low entropy because they are close to each other and there is some temperature reading in your far-away thermometer lower than the equilibrium temperature and then the entropy is low. If a similar cluster of particles were close to the thermometer rather than far from it I guess that you would tell me that the temperature reading would be higher than in equilibrium (and the entropy would also be lower).
But if you put that cluster of particles at some intermediate position the reading of the thermometer will be the same as in equilibrium! Your macrostate will be the same as in equilibrium. Your entropy will not be lower than in equilibrium. Even though you said -if I understood you correctly- that the entropy should be lower because the higher concentration has a lower probability of occurring.
> Note that I was the first one to propose a particular way to define the macrostate (energy) of the system to understand what you meant. Your reply was "that's not the definition of macrostate. It's just one arbitrary choice you have chosen." And you gave your own arbitrary choice "I'll choose something arbitrary. Temperature as measured by a thermometer".
No. You explicitly ASKED for my definition, then I said that.
When I said that's not the "definition of macrostate" I did not make an arbitrary choice there. I simply stated that IF you think energy was the definition of macrostate, then you are wrong.
> Maybe we can agree that [Gibbs] entropy [which is a property of the distribution of microstates corresponding to our description of the system based on some macroscopic properties] is independent of knowledge [other than about that particular macroscopic description].
This is the point of the ENTIRE thread starting from your INITIAL reply. Technically this argument is over. You weren't in agreement with me, now you are, so you were wrong and I was right. This sentence admits that.
The rest of the stuff here is side tangents and it's hard to see the main point. I can entertain it though for a little bit but this here is essentially the end.
>The discussion started with me trying to understand what you meant by "low entropy" when you said:
This discussion started with you saying that if I know the microstate of the system entropy is zero. If you wanted to understand what I thought about "low entropy" then this was NEVER stated. The thread is based off what is stated and what is not stated. It is not based off of what your internal thoughts and intentions are. If you want the topic to be based off your own thoughts, they need to be expressed explicitly in statements. You did so just now, but way too late.
>Note that if for some reason all the particles in the gas go away from the thermometer for a second the temperature of the mercury won't change (ignoring that it will radiate energy over time). The reading of the thermometer doesn't drop to zero just because there is nothing there.
This is pedantic. Obviously. I can take it into account. Let's say the thermometer drops from the maximum temperature to zero degrees in the time it takes a particle at the greatest measured speed to travel across the box. It's quite fast, but particles within the vicinity will keep the mercury level stable; if they were all concentrated in a corner, the mercury would drop fast enough to change the reading of the thermometer.
>Let's assume that you are somehow measuring the temperature in a small region somewhere with a very high reaction time. What you measure will not depend in any way on the particles that are elsewhere. The particles in the rest of the box could be well spread or all near one corner and you would have no reason to say that the latter configuration is lower entropy than the former based on your macrostate.
I specifically defined it as a thermometer to make location matter. Switching the location of the thermometer is switching the definition as well.
I really don't see what your point is. You think I'm wrong about something? What am I wrong about?
>But at some intermediate point the reading of the thermometer will be the same as in equilibrium. Your macrostate will be the same as in equilibrium. Your entropy will not be lower than in equilibrium. Even though you said -if I understood you correctly- that the entropy should be lower because the higher concentration has a lower probability of occurring.
Your point seems to be buried in here somewhere and I can't parse it. There is a macrostate that is equilibrium, yes.
Are you referring to the mercury level mid-transition? This is pedantry to the max if you are. Yes, the mercury level will display the WRONG temperature if it's mid-transition. I'm not willing to constantly adjust the model to every little flaw you find. Last time: let's switch to a digital thermometer that displays the temperature at time intervals equal to the length of time it takes for the maximum-speed particle to travel across the box. There is no transition value now. All temperature readings reflect an instantaneous observed truth at an instantaneous point in time, but that temperature is displayed at non-instantaneous intervals.
It also seems to me that your definition of macrostate is meaningless. The total energy of the universe is hypothetically the same all the time. If the macrostate was just energy there's no point to it, because entropy would then be an unchanging constant.
I think we're done here. I'm just trying to guess what you're driving at. You'll need to clarify your point if you want me to continue. What exactly are you trying to say here?
> Let's say the thermometer drops from the maximum temperature to zero degrees in the time it takes a particle at the greatest measured speed to travel across the box. It's quite fast, but particles within the vicinity will keep the mercury level stable; if they were all concentrated in a corner, the mercury would drop fast enough to change the reading of the thermometer.
That's not how a mercury thermometer works. The mercury "drops" when the temperature of the mercury goes down. This happens when there is a flow of energy between the thermometer and the matter inside the box. If all the particles were concentrated elsewhere there would be no interaction, no transfer of energy, no change in the temperature of the mercury, no change in its level. It's not a question of waiting a second to see it go to zero in a vacuum. (Again, I'm considering thermal contact, which is the main effect in play, and ignoring radiation energy losses.)
But that's an irrelevant aside and I had already accepted that you can get that kind of reading with some kind of thermometer. I'll concentrate on what I think is the main open point:
> I specifically defined it as a thermometer to make location matter. Switching the location of the thermometer is switching the definition as well.
I'm not switching the location of the thermometer. I'm switching the location of the particles. I'm trying to understand how the entropy - that you calculate from the temperature reading in one thermometer placed in some specific place - depends on the location of the particles.
You've written that: "The higher concentration has a lower probability of occurring. And occupies a different temperature reading on the thermometer. Each temperature reading is a different macrostate."
But if only the particles in the vicinity of the thermometer matter for the local temperature number that you take as describing the macrostate you would get the same reading (and the same entropy) if the rest of the particles changed positions -being more or less concentrated- as long as they remained out of the vicinity of the thermometer. Is that wrong?
> Your point seems to be buried in here somewhere and I can't parse it.
Maybe it was a mistake to talk again about equilibrium... I was trying to show that there are also concentrated configurations of the particles that you will nevertheless consider maximum entropy - using your determination of the macrostate as "what this thermometer says": if the temperature is equal to the equilibrium temperature then the entropy is the same, right?
The main point is that instead of having the balls "positioned near one corner in the cube" they could be positioned in exactly the same relative configuration (the same concentration) near any other corner and the thermometer reading would be different. The entropy is not directly related to the concentration - it depends also on the location of the particles relative to the thermometer. Or maybe the thermometer reading will be the same when the particles are in any corner, but then it will also be the same for other relative positions, so you would also have different levels of concentration corresponding to the same entropy.
I was trying to point out that having a concentration of the particles (different from the homogeneous equilibrium state) doesn't automatically mean lower entropy in your example. The same relative position of the particles among themselves can correspond to different temperatures in your thermometer - including the equilibrium temperature. (The idea was that if the concentrated particles are too far away the temperature is "too cold". And maybe if they are too close the temperature is "too hot". But there is at least some goldilocks intermediate distance such that if you put the initial concentrated state there, the temperature according to you would be "just right".)
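A toy sketch of that goldilocks claim (entirely my own simplified model: a 1-D box, and a "thermometer" that just counts particles within some radius of it):

```python
import random

random.seed(0)
T_POS, R, N = 0.5, 0.15, 1000     # assumed thermometer position, radius, count

def reading(xs):
    # The "temperature": how many particles are within R of the thermometer.
    return sum(1 for x in xs if abs(x - T_POS) < R)

equilibrium = [random.random() for _ in range(N)]    # uniform over [0, 1]
print("equilibrium reading:", reading(equilibrium))  # ~ N * 2R = 300

def cluster(center, spread=0.05):
    # The same tight concentration of particles, placed at a given spot.
    return [center + random.uniform(-spread, spread) for _ in range(N)]

for c in [0.05, 0.33, 0.5]:       # far corner / intermediate / on top of it
    print(f"cluster at {c}: reading = {reading(cluster(c))}")
# far corner -> ~0 ("too cold"), on top -> ~1000 ("too hot"), and at the
# intermediate distance the reading roughly matches equilibrium even though
# the particles are exactly as concentrated.
```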
>That's not how a mercury thermometer works. The mercury "drops" when the temperature of the mercury goes down. This happens when there is a flow of energy between the thermometer and the matter inside the box. If all the particles were concentrated elsewhere there would be no interaction, no transfer of energy, no change in the mercury level. (Again, I'm considering thermal contact, which is the main effect in play, and ignoring radiation energy losses.)
More pedantry. I'm using an example to make a point and I CLEARLY already edited the example enough times that it's obvious I'm NOT building a physically accurate model. Yes, as you said, it is completely irrelevant.
>But if only the particles in the vicinity of the thermometer matter for the local temperature number that you take as describing the macrostate you would get the same reading (and the same entropy) if the rest of the particles changed positions as long as they remained out of the vicinity of the thermometer. Is that wrong?
No, that's right. The particles are large in number. Any state where no particles are in the vicinity of the thermometer has a low probability. All microstates that have this property occupy the 0 temperature macrostate.
>The main point is that instead of having the balls "positioned near one corner in the cube" they could be positioned in exactly the same relative configuration (the same concentration) near any other corner and the thermometer reading would be different. The entropy is not directly related to the concentration - it depends also on the location of the particles relative to the thermometer. Or maybe the thermometer reading would be the same when the particles are in any corner, but then it will also be the same for other relative positions, so you would also have different levels of concentration corresponding to the same entropy.
In this definition the macrostate is related to vicinity. And for a physical thermometer in reality it is also directly related to the particles in actual contact with the thermometer. Vicinity matters in both cases.
>I was trying to point out that having a concentration of the particles (different from the homogeneous equilibrium state) doesn't automatically mean lower entropy in your example. The same relative position of the particles among themselves can correspond to different temperatures in your thermometer - including the equilibrium temperature. (The idea was that if the concentrated particles are too far away the temperature is "too cold". And maybe if they are too close the temperature is "too hot". But there is at least some goldilocks intermediate distance such that if you put the initial concentrated state there, the temperature according to you would be "just right".)
You're not getting it. The low-entropy states I'm describing are so low in probability that they will basically never happen in your lifetime or the lifetime of this universe.
The reality is, if particles are concentrated in the corner of a box then yes, a thermometer will register 0, but you will never see this.
They don't jibe with your intuition of temperature because of this. But also because the word temperature is poorly defined when applied to a box. What does it mean when you say "the temperature of the room"? What part of the room at what point in time? That's why I referenced the thermometer, to prevent this type of tangent in the discussion.
There are also ambiguities like: what is the temperature of a volume of space smaller than any particle can occupy? But it's pointless to discuss all of this because this isn't a discussion about anything other than linguistics. We are arguing about language and vocabulary here and that to me is not an interesting thing to talk about. It's an illusion. It's like when people argue about "what is life?" without realizing they're just arguing about the definition of an arbitrary vocabulary word. That's sort of what's going on here, I believe.
If you are in agreement with me that entropy is independent of knowledge then we're good; because as far as I was concerned this was the point you were trying to make.
> Any state where no particles are in the vicinity of the thermometer has a low probability. All microstates that have this property occupy the 0 temperature macrostate.
Ok. I'm glad to read that we finally agree that in your example -using the macrostate you defined- the entropy of the "particles near a corner" state is not different from the entropy of any other "particles not near the thermometer" state.
Right after you said "Temperature as measured by a thermometer is the macrostate. Every possible configuration of particles (microstates) that causes mercury to rise to a certain degree represents a different macrostate." - I asked "How does your configuration where the balls were near one corner in the cube cause mercury to rise to a different level than the configuration where they occupy a larger volume near the center?"
You could have answered "it doesn't - that's the same macrostate with the same temperature and the same entropy" back then.
Later I pointed out that "Reducing the space occupied by the initial configuration [doesn't change] the temperature - which you say is 0 in all those cases - or the macrostate - which you call the absolute zero macrostate in all those cases. The entropy would be the same whenever the particles are more concentrated than in your example - even though they would have a lower probability of occurrence. Don't you agree?"
Instead of answering "I don't agree. Entropy is lower when the current macrostate has a low probability of occurring." you could have said "I agree. I describe the temperature as zero in all those cases, it's the same macrostate and the entropy will be the same."
Better late than never, anyway.
> the word temperature is poorly defined when applied to a box.
I agree! You're the one who picked the reading of that thermometer as the thing that would determine the macrostate and the entropy - and maintained that higher concentrations would mean different thermometer readings and lower entropy.
> If you are in agreement with me that entropy is independent of knowledge then we're good; because as far as I was concerned this was the point you were trying to make.
If we agree that the entropy is not a physical property of (the microstate of) the system but a property of a particular (thermodynamical) description of the system - and that different descriptions of the same physical system are possible resulting in different entropies - we're good. That's the clarification I wanted to make.
>Ok. I'm glad to read that we finally agree that in your example -using the macrostate you defined- the entropy of the "particles near a corner" state is not different from the entropy of any other "particles not near the thermometer" state.
This has been the definition since I brought it up. That's how thermometers work. It's the obvious consequence of my description and it's how physical thermometers generally work. You never made it clear that was your problem with it. Well, now that we're both clear there is no problem with it, I think initially it most likely just didn't jibe with your intuition about temperature.
>You could have answered "it doesn't - that's the same macrostate with the same temperature and the same entropy" back then.
I did answer this: "The balls have to touch thermometer. Otherwise the thermometer observes nothing." You failed to pick up on what I meant. If the balls don't touch the thermometer it means "it doesn't" change the thermometer and therefore it doesn't change the macrostate.
You just didn't get it. OK, you can accuse me of not explaining it thoroughly. But I can equally accuse you of not being capable of interpreting what I said. I think what's going on here is that you're too stuck in your headspace of thinking that your intuition of entropy was more accurate than mine and you wanted me to come to this realization. I think what happened here is the reverse: your intuition about entropy was less crystallized than mine and most of the communication issues we had were because you thought it was the other way around.
>Instead of answering "I don't agree. Entropy is lower when the current macrostate has a low probability of occurring." you could have said "I agree. I describe the temperature as zero in all those cases, it's the same macrostate and the entropy will be the same."
I told you to refer to my explanation of the macrostate. Yeah in all those cases it will be the same. Ok fine. I could've said that. But I didn't realize that was what you were hung up on. To throw it back at you, you could've said this:
"it doesn't make sense to me that different concentrations of particles in different parts of the box yield the same temperature reading AKA the macrostate. Please explain how this can be possible"
But again I think you didn't realize this was your hang-up. I think you thought my definition of macrostate was "bad" or something and you were trying to make me realize it?
>I agree! You're the one who picked the reading of that thermometer as the thing that would determine the macrostate and the entropy - and maintained that higher concentrations would mean different thermometer readings and lower entropy.
Yeah I picked it because it's the most intuitive one to illustrate rising entropy. The "2nd law". It's the classic model everyone has in their heads of the "heat death" of the universe.
There are technical issues with the word, but it's the most commonly used one, and we only discussed those technicalities because of a fundamental misunderstanding on your end. IF the intuition was correct, the technicalities do not need to be discussed, or thought about.
>If we agree that the entropy is not a physical property of (the microstate of) the system but a property of a particular (thermodynamical) description of the system -
Sort of. Entropy is directly tied to macrostate, but macrostate is determined by microstates. There is a relationship here from entropy to microstate, but not a direct one.
Also ALL possible macrostate configurations ARE independent of knowledge of the system. There is no case where knowledge can influence it. There is no case where your initial reply: "If you know the positions and velocities of the balls at all times arguably the entropy is always zero." is correct.
Unless of course you define a macrostate where it's zero all the time. But obviously what you're saying here is that knowledge changes the entropy. Which is not true.
> Entropy is directly tied to macrostate, but macrostate is determined by microstates.
There is not one true macrostate determined by the microstate - it depends on how we choose to describe the system. Your "reading of this thermometer" macrostate is not more real than the "energy" macrostate.
> There is a relationship here from entropy to microstate, but not a direct one.
There is a relationship that depends on what is the thermodynamical model used to describe the system.
> There is no case where your initial reply: "If you know the positions and velocities of the balls at all times arguably the entropy is always zero." is correct.
Theoretically I could make the entropy go as close to zero as I want by making the macrostate more and more detailed. Quantum thermodynamics is one case in which entropy is zero when a state is perfectly known.
> But obviously what you're saying here is that knowledge changes the entropy. Which is not true.
What I'm saying is that "the entropy" doesn't exist outside of our minds. It's the product of our description of the thermodynamical system. Depending on the model that we think of we produce a different entropy.
>What I'm saying is that "the entropy" doesn't exist outside of our minds. It's the product of our description of the thermodynamical system. Depending on the model that we think of we produce a different entropy.
This whole discussion is pointless. It kind of angers me that we dived into this pointless pedantic side quest. If this is all you're trying to say then clearly when I defined a custom macrostate then you already knew I knew this. Why bother arguing about useless details? I think you're not being entirely honest here. I think you were mistaken about entropy OR you have certain issues going on with you as you spent so much time pushing this pointless thread forward. But whatever.
>What I'm saying is that "the entropy" doesn't exist outside of our minds. It's the product of our description of the thermodynamical system. Depending on the model that we think of we produce a different entropy.
No offense but this is probably the worst way to describe it. I have a mathematical model of projectile motion. The projectile motion is a product of my mind because I made up that model. This is a COMPLETELY pointless statement and is analogous to yours.
I think this is snobbery. Either you assumed I didn't know what entropy was and in your snobbery wanted to teach me with some "revelatory" counter examples (which wasted so much time) or YOU didn't know what entropy was and you're just not admitting it.
>There is a relationship that depends on what is the thermodynamical model used to describe the system.
Right before you wrote this you wrote: "entropy is not a physical property of (the microstate of) the system but a property of a particular (thermodynamical) description of the system" which is completely contradictory. There is no way for entropy to ONLY be about macrostate WHEN macrostate is defined in TERMS of microstates.
It's like saying trees have nothing to do with atoms because trees are made of wood. Well guess what? Wood is made out of atoms!
This whole line is contradictory.
>Theoretically I could make the entropy go as close to zero as I want by making the macrostate more and more detailed. Quantum thermodynamics is one case in which entropy is zero when a state is perfectly known.
Wow this whole conversation started because you wanted to make light of some pedantic detail. Yeah if you define macrostate as a microstate sure. But if you want to go pedantic we can, your statement is STILL incorrect in light of your new example. Entropy is not ALWAYS zero if you know position and velocity of the balls, simply because this is not ALWAYS the case.
There are counter examples to your ALWAYS qualifier which makes your statement definitively false.
You have not answered my question. You aren't getting it. Rising entropy is just an example for illustrative purposes; entropy is simply a probabilistic phenomenon. The question is deeper than that and is about the nature of probability and reality itself.
>Only if we keep adding uncertainty / forgetting information as we go forward will entropy increase.
Bro. Forgetting stuff doesn't change entropy. We already agreed that entropy is independent of knowledge. Yes, entropy is defined generically in that it requires an additional definition of a macrostate in order for the equation to be fully utilized... but it is still in this case independent of knowledge.
You could say that forgetting stuff requires you to use lower-resolution macrostate models in practice, but this is independent of the theoretical equation and is just a really stupid and over-complicated way of describing it.
For example: you could say that the volume of a rectangular prism is wlh. If someone forgets h, does that change the volume of the prism? Only if that someone is an asshole (aka philosopher). Don't be an asshole. This kind of "knowledge" related stuff can literally be applied to everything and is a rabbit hole with no final destination. The heart of it lies deeper.
"In every single initial configuration the particles will be at some distance from the thermometer and will cause it to read 0 according to you reasoning (until some time passes). I guess they all correspond to the same "absolute zero macrostate".
What I still don't see is how do you calculate the entropy using Gibbs entropy formula in such a way that the entropy is lower or higher for some initial configuration depending on the position of the particles."
Or somewhere up this thread I said very clearly "The balls don’t touch the thermometer in either case. You seemed to imply that the higher concentration means a different macrostate with lower entropy."
and you insisted that "Yes. The higher concentration has a lower probability of occurring. And occupies a different temperature reading on the thermometer. Each temperature reading is a different macrostate."
It's not my fault that you insisted that the higher concentration has a lower probability of occurring and occupies a different temperature reading on the thermometer even when I was clearly talking about what you say is the same macrostate with the same temperature.