
> As time increases, entropy increases. The balls as a system begin to increase in entropy. Entropy continues to rise towards an equilibrium.

If you know the positions and velocities of the balls at all times, arguably the entropy is always zero. What definition of entropy are you using?



The statistical version. The equation is independent of knowledge.

Your logic is flawed. Think about it: knowing more or less about a system does not change its entropy.

If I knew the position and velocity of every particle in a gas chamber, would that somehow lower its temperature? No.

Macrostates summarize perfect information, but even with knowledge of perfect information you still have the ability to summarize it. In other words, probability still exists even when you have perfect knowledge of everything and the future.


If you mean Boltzmann's formula it will be valid for systems in thermodynamical equilibrium. Is your system in thermodynamical equilibrium?

You talk about macrostates. How do you define a macrostate for your initial configuration? Why this one and not another? Does it change over time?

Think of it from another angle. You have a box with some of these balls inside - all you know is that they all have the same energy. Do you agree that the macrostate won't change over time and the entropy will remain constant?

However, if you knew the positions you would see that they are concentrated in some regions more than in others. That's what you called lower entropy in your example.

So what is it, does knowing more about a system change its entropy or not?


> If you mean Boltzmann's formula it will be valid for systems in thermodynamical equilibrium. Is your system in thermodynamical equilibrium?

Gibbs

>You talk about macrostates. How do you define a macrostate for your initial configuration? Why this one and not another? Does it change over time?

I'll choose something arbitrary. Temperature as measured by a thermometer is the macrostate. Every possible configuration of particles (microstates) that causes mercury to rise to a certain degree represents a different macrostate. And yes it changes with time; even at equilibrium.

>Think of it from another angle. You have a box with some of these balls inside - all you know is that they all have the same energy. Do you agree that the macrostate won't change over time and the entropy will remain constant?

No, I don't agree. If the balls are located on one side of the box and the thermometer is on the other side, the thermometer reads nothing. When the balls increase in entropy they collide with the thermometer, producing a reading.

>However, if you knew the positions they will be concentrated in some regions more than in others. That's what you called lower entropy in your example.

Yeah I already don't agree with you, so the rest of your argument is gone. If you define macrostate as total energy in a system then yeah it never changes. But that's not the definition of macrostate. It's just one arbitrary choice you have chosen.


> If you define macrostate as total energy in a system then yeah it never changes. But that's not the definition of macrostate.

The more I think about your reply the less sense it makes to me.

What do you think is the definition of macrostate?

The standard notion in statistical mechanics is that if we have, for example, a volume of gas in equilibrium at (constant) ambient temperature and we measure the pressure it doesn’t change. The macrostate doesn’t change. That’s what being in equilibrium means. The macrostate is in that case defined by the variables P,T,V. If all you knew was the value of these three variables and they didn’t change how could the macrostate - or the entropy - change?

You tell me that if you know the precise position of the particles of that gas then the macrostate and the entropy change all the time.

But you also tell me that “Knowing more or less about a system does not change its entropy.” Which is in flagrant contradiction with the two previous paragraphs.


>The more I think about your reply the less sense it makes to me.

I think the issue is more with your understanding than my explanation.

>What do you think is the definition of macrostate?

We first introduce the very fundamental statistical ideas of microstates and macrostates. Given a system (e.g., a gas), we view it as built from some elementary constituents (e.g., molecules). Each constituent has a set of possible states it can be in. The thermodynamic state of the system (which characterizes the values of macroscopic observables such as energy, pressure, volume, etc.) corresponds to many possible states of the constituents (the molecules). The collection of states of all the constituents is the microstate. To keep things clear, we refer to the macroscopic, thermodynamic state as the macrostate. The vast disparity between the number of possible macrostates versus microstates is at the heart of thermodynamic behavior! The number of distinct microstates giving the same macrostate is called the multiplicity of the macrostate. The multiplicity is a sort of microscopic observable which can be assigned to a macrostate.
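This multiplicity counting can be sketched with a toy model (my own illustration, not from the quoted definition): N particles each sit in the left or right half of a box, and the macrostate is just the count on the left.

```python
from math import comb, log

N = 100  # particles, each independently in the left or right half of the box

# Macrostate: how many particles are in the left half.
# Multiplicity: number of microstates (left/right assignments of the
# individual particles) producing that macrostate, i.e. C(N, n_left).
for n_left in (0, 25, 50):
    multiplicity = comb(N, n_left)
    entropy = log(multiplicity)  # Boltzmann entropy in units of k_B
    print(f"n_left={n_left:3d}  multiplicity={multiplicity:.3e}  S/k={entropy:.2f}")
```

The balanced macrostate (n_left = 50) has vastly more microstates than the concentrated one (n_left = 0), which is the disparity the quoted passage is pointing at.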

>The standard notion in statistical mechanics is that if we have, for example, a volume of gas in equilibrium at (constant) ambient temperature and we measure the pressure it doesn’t change. The macrostate doesn’t change. That’s what being in equilibrium means. The macrostate is in that case defined by the variables P,T,V. If all you knew was the value of these three variables and they didn’t change how could the macrostate - or the entropy - change?

This notion is wrong. It CAN change. It just has an extremely low probability of changing from equilibrium to some low entropy state. The probability is low enough that you practically don't need to consider it, but you must consider it from a technical standpoint.

If by sheer luck all gas particles moved to the exact left side of the container, and my measurement tool (the thermometer) for the macrostate was on the right side of the container, then it would register zero. There is nothing in the laws of physics that prevents this from happening. Only probability makes this situation unlikely to happen.
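That "extremely low probability" can be put into numbers with a back-of-the-envelope sketch (my numbers, assuming each particle independently lands in either half with probability 1/2):

```python
# Probability that all N particles are simultaneously in the left half,
# assuming independent, uniformly distributed positions: (1/2)^N.
for n in (10, 100, 1000):
    print(f"N={n:4d}  P(all left) = {2.0 ** -n:.3g}")
```

Nothing forbids the event; for any realistic particle count the probability is just absurdly small.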

>You tell me that if you know the precise position of the particles of that gas then the macrostate and the entropy change all the time.

No. I'm saying that the temperature reading on the thermometer, a macroscopic measurement, is INDEPENDENT of knowledge. Your mind doesn't control microstates and thus the macrostate of the system.

>But you also tell me that “Knowing more or less about a system does not change it's entropy.” Which is in flagrant contradiction with the two previous paragraphs.

Flagrant? You're offended? Well, you can leave if you're offended. Using words like this also offends me, so we can end the conversation.


> I think the issue is more with your understanding than my explanation.

That's what I said, that it doesn't make sense to me... hopefully it does make some sense to you!

> This notion is wrong. It CAN change. It just has an extreme low probability of changing from equilibrium to some low entropy state.

You also said before that the macrostate "changes with time; even at equilibrium".

There is no point in discussing fluctuations if we can't agree on the notion of equilibrium and the corresponding macrostate defined by its (average) macroscopic properties.

It's fine that you have your very own notion of things. Unfortunately it makes the discussion difficult.

BTW, no need to be offended: https://www.classicthesaurus.com/flagrant_contradiction/syno...


>There is no point in discussing fluctuations if we can't agree on the notion of equilibrium and the corresponding macrostate defined by its (average) macroscopic properties.

You mean microscopic. Equilibrium would then be the temperature that the thermometer is most likely to read at any point in time, given infinite time. Do you agree or disagree? If you agree, fine.

So with time, eventually the particles will configure themselves into a macrostate that reads 0 on the thermometer.


What I’m trying to say is this: https://books.google.ch/books?id=vsU2DwAAQBAJ&lpg=PA45&dq=eq...

“An isolated system is in equilibrium if the sharp constraints and expectation values which define its macrostate pertain to constants of the motion only.”

“Whenever the macroscopic data include an expectation value, a measurement of the pertinent observable may yield a wide range of outcomes.”

I’m not making up the idea of an equilibrium state being associated with a macrostate.


This is my definition: https://www.sciencedirect.com/topics/mathematics/equilibrium...

"The equilibrium macrostate is that with the most microstates, and this is the state of greatest entropy. A macroscopic flux is most likely in the direction of increasing entropy."

>I’m not making up the idea of an equilibrium state being associated with a macrostate.

I know you aren't. TBH I'm not even sure why equilibrium needs to be brought up. The topic is "is entropy independent of knowledge?" You disagree with that statement.

Macrostate is defined in terms of microstates and is independent of your knowledge of the exact configuration of the microstates. You have some blunt tool like the thermometer that gives you macroscopic data and that hides the microstates from you.

But if I had some precision tool that reads the position of every particle, I could still identify how that configuration of microstates will influence the blunt tool. The blunt tool hides knowledge, but your knowledge of the microstate does not change the reading on that blunt tool.

Because entropy is defined in terms of the macrostate, it stays the same regardless of which tool you use to do the measurement.


> This is my definition

See? We agree that there is an equilibrium macrostate. When I first asked you to clarify what did you mean by macrostate you told me that “it changes with time; even at equilibrium.”

We agree that the entropy depends on the macrostate. Note that the macrostate is our description of the system and depends on how we choose to describe it, which normally depends on what the constraints are, how it was prepared, etc. It’s not just a property of the position of those balls.

It’s because we agree that Gibbs’ entropy is a function of the macrostate that I asked how did you define it in your example. You told me: “Temperature as measured by a thermometer is the macrostate. Every possible configuration of particles (microstates) that causes mercury to rise to a certain degree represents a different macrostate.”

I asked “How does your configuration where the balls were near one corner in the cube cause mercury to rise to a different level than the configuration where they occupy a larger volume near the center?” and the answer “The balls have to touch thermometer” doesn’t cut it. The balls don’t touch the thermometer in either case.

You seemed to imply that the higher concentration means a different macrostate with lower entropy. Or maybe the low entropy in your example is because the balls are near a corner?

Anyway, it would indeed have been easier to say that definition of macrostate included the density of particles in each octant of the cube - or something like that.


>See? We agree that there is an equilibrium macrostate. When I first asked you to clarify what did you mean by macrostate you told me that “it changes with time; even at equilibrium.”

I still stand by my statement. Even at equilibrium it can lower in entropy. The equilibrium is simply the highest entropy state.

>I asked “How does your configuration where the balls were near one corner in the cube cause mercury to rise to a different level than the configuration where they occupy a larger volume near the center?” and the answer “The balls have to touch thermometer” doesn’t cut it. The balls don’t touch the thermometer in either case.

I stated this is pedantry. The concept and intuition remain true. I changed the definition so that it's a volume around the thermometer: if a particle is in that volume and heading for the thermometer, it counts as a collision.

I stated all of this already.

>You seemed to imply that the higher concentration means a different macrostate with lower entropy. Or maybe the low entropy in you example is because the balls are near a corner?

Yes. The higher concentration has a lower probability of occurring. And occupies a different temperature reading on the thermometer. Each temperature reading is a different macrostate.

>Anyway, it would indeed have been easier to say that definition of macrostate included the density of particles in each octant of the cube - or something like that.

Sure. Divide the box into a bunch of cubes. If 1 or more particles are in a cube then that cube represents 1, otherwise 0. Add those numbers up and that sum represents a macrostate.

The intuition remains the same. For all particles to be concentrated in 1 cube is a very low probability. And the macrostate value will be quite low too. With enough cubes and particles such a state has a very low probability of occurring.

But all of this is, again, independent of your knowledge of where the particles are in each cube.
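The cube-counting macrostate just described can be sketched directly (an illustration under the stated scheme; the helper name is made up):

```python
def occupancy_macrostate(positions, cells_per_side=2):
    """Coarse-grain positions in the unit cube into cells and count
    how many cells contain at least one particle (the macrostate)."""
    occupied = set()
    for x, y, z in positions:
        occupied.add((int(x * cells_per_side),
                      int(y * cells_per_side),
                      int(z * cells_per_side)))
    return len(occupied)

# Spread-out configuration: one particle at the center of each octant.
spread = [(0.25 + 0.5 * i, 0.25 + 0.5 * j, 0.25 + 0.5 * k)
          for i in (0, 1) for j in (0, 1) for k in (0, 1)]

# Concentrated configuration: all particles near one corner.
corner = [(0.01 * i, 0.02 * i, 0.03 * i) for i in range(1, 9)]

print(occupancy_macrostate(spread))  # 8 octants occupied
print(occupancy_macrostate(corner))  # 1 octant occupied
```

The count is computed from the positions alone: knowing the exact coordinates doesn't change it, which is the independence-of-knowledge point.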


> And occupies a different temperature reading on the thermometer.

You are unable to explain how reducing the space occupied by the initial configuration you proposed could change the temperature - which you say is 0 in all those cases - or the macrostate - which you call the absolute zero macrostate in all those cases. The entropy would be the same whenever the particles are more concentrated than in your example - even though they would have a lower probability of occurrence. Don't you agree?

> With enough cubes and boxes such a state has a very low probability of occuring.

Sure. The thing is that if you calculate an entropy using Gibbs formula from the distribution of microstates for a given macrostate the value that you obtain depends on how many cubes are used to define the macrostate. There is no entropy of the microstate - the entropy depends on how you decide to define the macrostate. If the lattice is fine enough, and the particles indistinguishable, in the limit the entropy is zero - the macrostate becomes the same as the microstate and one single microstate is possible.
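The resolution dependence can be shown with a minimal one-particle sketch (my own toy numbers): put a single particle on a fine lattice of R sites and define the macrostate as which of M coarse cells it occupies.

```python
from math import log

R = 64  # fine-lattice sites: the microstate is which site the particle is on

# Macrostate: which of M coarse cells (M divides R) the particle is in.
# Microstates compatible with a known macrostate: R // M lattice sites.
for M in (2, 8, 64):
    multiplicity = R // M
    print(f"M={M:2d}  multiplicity={multiplicity:2d}  S/k={log(multiplicity):.3f}")
# At M = R the macrostate pins down the microstate and the entropy is zero.
```

The same particle on the same site gets a different entropy depending on how coarsely the macrostate is defined, which is the point being made above.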

It all depends on the description we make of the system. The "I have balls in a box and I know their positions" example was incomplete.

If for example I have N of those balls in equilibrium in an isolated box of volume V and I know the energy E I know the equilibrium macrostate. The energy won't change because it's isolated. The macrostate will not change. The (equi)probability distribution of the possible microstates corresponding to the macrostate won't change. Gibbs entropy which is calculated using that distribution won't change.

If by sheer luck all gas particles moved to the exact left side of the container the energy wouldn't change, the macrostate wouldn't change, the entropy wouldn't change - it doesn't matter how unlikely that is. It doesn't matter if you know the position of each particle. The entropy for the thermodynamical system described in the previous paragraph is independent of your knowledge of where the particles are or how unlikely you think their positions are.

I agree that you could use alternative ways of defining the macrostate where that happens! (And calculating Gibbs entropy will give different results. One can even get zero entropy using the microstate as macrostate - and one may actually want to do that if the microstate is known!)

> Even at equilibrium it can lower in entropy. The equilibrium is simply the highest entropy state.

I would understand it if you said either:

"the equilibrium is the highest entropy state but there may be fluctuations that shift the system temporarily out equilibrium"

or

"the equilibrium is the highest entropy state and possibly a distribution of states around it."

Both concepts are used. When you say that the equilibrium is simply the highest entropy state but the equilibrium entropy can also be lower I'm not sure if it can be read as one of those or you're saying something else entirely.


>You are unable to explain how reducing the space occupied by the initial configuration you proposed could change the temperature - which you say is 0 in all those cases - or the macrostate - which you call the absolute zero macrostate in all those cases. The entropy would be the same whenever the particles are more concentrated than in your example - even though they would have a lower probability of occurrence. Don't you agree?

You're unable to read my explanation, which I've already repeated twice. Go back and look it up. I even went with your cubical definition.

I don't agree. Entropy is lower when the current macrostate has a low probability of occurring.

>There is no entropy of the microstate - the entropy depends on how you decide to define the macrostate

The macrostate is defined in terms of possible microstates. Thus entropy relies on both the microstates and the macrostate.

>If by sheer luck all gas particles moved to the exact left side of the container the energy wouldn't change, the macrostate wouldn't change, the entropy wouldn't change - it doesn't matter how unlikely that is.

Wrong. Energy wouldn't change. Macrostate changes. Entropy changes.

>It doesn't matter if you know the position of each particle. The entropy for the thermodynamical system described in the previous paragraph is independent of your knowledge of where the particles are or how unlikely you think their positions are.

Isn't this my point? And weren't you against my point? Now you're in agreement? My point was entropy is independent of knowledge. Your point was that it is dependent.

>I agree that you could use alternative ways of defining the macrostate where that happens! (And calculating Gibbs entropy will give different results. One can even get zero entropy using the microstate as macrostate - and one may actually want to do that if the microstate is known!)

Except you don't have to do this. My point remains true given MY stated definitions of macrostate.

>Both concepts are used. When you say that the equilibrium is simply the highest entropy state but the equilibrium entropy can also be lower I'm not sure if it can be read as one of those or you're saying something else entirely.

I made a true statement that, from what I can gather, you agree is true. You're just trying to extrapolate my reasoning behind the statement. It's pointless. Example: if I give you a number, say -1, do I mean i^2 or 0 - 1? Because one of those expressions is impossible to express in reality.


>>If by sheer luck all gas particles moved to the exact left side of the container the energy wouldn't change, the macrostate wouldn't change, the entropy wouldn't change - it doesn't matter how unlikely that is.

> Wrong. Energy wouldn't change. Macrostate changes. Entropy changes.

That's my thermodynamical system - that I decided to describe using the state variables N, V, E. S_kgwgk = constant

I have agreed that you can choose to define macrostates as you wish! (You cannot measure them though. You don't have access to my system. They are purely hypothetical.) And then you can say things like "If the position of the particles in your system is X, S_deltasevennine = whatever."

Hopefully you will agree that someone else could choose to define the macrostate differently and say things like "If the position of the particles in kgwgk's system is X, S_anon = somethingelse."

Let's imagine that the position of the particles is actually X. What is the entropy of the system then? S_kgwgk? S_deltasevennine? S_anon? They are all different!

If you think that you can define macrostates for my system in some arbitrary way (your own words: "I'll choose something arbitrary.") and others can't - or that yours are somehow more real - I really wonder why.

And as I said, someone could also come and say "If the position [and velocities] of the particles in kgwgk's system is X, S_vonneumann = 0." [a complete definition of the microstate requires knowing both position and momentum - there might have been some ambiguity about that before but it shouldn't distract us from the main points]


I thought we agreed on my definition. If you want to make your own definitions of macrostate sure. No rule against that, I just don't see your point.

>If you think that you can define macrostates for my system in some arbitrary way (your own words: "I'll choose something arbitrary.") and others can't - or that yours are somehow more real - I really wonder why.

I really wonder why you even say this. Anyone can make up a macrostate; we just choose to agree on one for discussion's sake. But in the real world, things like pressure and temperature are some universally agreed-upon ones. I simply made one up so we could center our discussion around it, and you took it into other territories.

You can switch up definitions all you want. But no matter your definition of macrostate, this remains true:

Entropy is independent of knowledge. Your initial argument was against that. I haven't seen you make any argument in your own favor. Just side discussions on whose definition of macrostate to use.


> If you want to make your own definitions of macrostate sure.

Note that I was the first one to propose a particular way to define the macrostate of the system - using energy - trying to understand what you meant. Your reply was "that's not the definition of macrostate. It's just one arbitrary choice you have chosen." And you gave your own arbitrary choice "I'll choose something arbitrary. Temperature as measured by a thermometer".

The point is that they are all arbitrary. (Energy is a conserved quantity for a closed system so it's arguably more natural - but let's say that any choice is equally arbitrary.)

> But no matter your definition of macrostate, this remains true:
> Entropy is independent of knowledge.

Maybe we can agree that [Gibbs] entropy [which is a property of the distribution of microstates corresponding to our description of the system based on some macroscopic properties] is independent of knowledge [other than about that particular macroscopic description].

The discussion started with me trying to understand what did you mean by "low entropy" when you said:

"Let's say those balls all have a random initial velocity at a random direction but all those balls are initially positioned near one corner in the cube. Thus the balls from a position stand point start with low entropy."

We agree that if the box is isolated and we define the state using N,V,E there is one single macrostate and one single value for the entropy - independent of the position of the particles. We agree that we can define thermodynamical systems using other variables and calculate other entropies. We agree that the entropy is not determined by the configuration of the particles alone.

(Even when a thermodynamical system is defined the entropy is not necessarily a function of the microstate when it refers to only a part of the system. The microstate may not fully determine the macroscopic description. If that box is in equilibrium in a heat bath the temperature is fixed and there is some corresponding entropy. But the same microstate can happen for equilibrium systems at different temperatures and therefore with different entropies.)

You said that you're using temperature as measured by a thermometer as the macroscopic property describing the system. And I'm still trying to understand how "higher concentration" means "lower entropy" even in the context of your own arbitrary choice.

Temperature works well as a state variable for a system in thermal equilibrium which has the same temperature everywhere. You mention the level of mercury and the way a thermometer works is by reaching thermal equilibrium between the mercury and the thing being measured. Note that if for some reason all the particles in the gas go away from the thermometer for a second the temperature of the mercury won't change (ignoring that it will radiate energy over time). The reading of the thermometer doesn't drop to zero just because there is nothing there.

Let's assume that you are somehow measuring the local temperature in a small region around some specific point. What you measure will not depend in any way on the particles that are elsewhere. The particles in the rest of the box could be well spread or all near one corner and you would have no reason to say that the latter configuration is lower entropy than the former based on your macrostate.

Another way to look at it: your reasoning seems to be that the state with the particles near one corner has low entropy because they are close to each other, there is some temperature reading in your far-away thermometer lower than the equilibrium temperature, and then the entropy is low. If a similar cluster of particles was close to the thermometer rather than far from it I guess that you will tell me that the temperature reading would be higher than in equilibrium (and the entropy would also be lower).

But if you put that cluster of particles at some intermediate position the reading of the thermometer will be the same as in equilibrium! Your macrostate will be the same as in equilibrium. Your entropy will not be lower than in equilibrium. Even though you said -if I understood you correctly- that the entropy should be lower because the higher concentration has a lower probability of occurring.


> Note that I was the first one to propose a particular way to define the macrostate (energy) of the system to understand what you meant. Your reply was "that's not the definition of macrostate. It's just one arbitrary choice you have chosen." And you gave your own arbitrary choice "I'll choose something arbitrary. Temperature as measured by a thermometer".

No. You explicitly ASKED for my definition, then I said that.

When I said that's not the "definition of macrostate" I did not make an arbitrary choice there. I simply stated that IF you think energy was the definition of macrostate, then you are wrong.

> Maybe we can agree that [Gibbs] entropy [which is a property of the distribution of microstates corresponding to our description of the system based on some macroscopic properties] is independent of knowledge [other than about that particular macroscopic description].

This is the point of the ENTIRE thread starting from your INITIAL reply. Technically this argument is over. You weren't in agreement with me, now you are, so you were wrong and I was right. This sentence admits that.

The rest of the stuff here is side tangents and it's hard to see the main point. I can entertain it though for a little bit but this here is essentially the end.

>The discussion started with me trying to understand what did you mean by "low entropy" when you said:

This discussion started with you saying that if I know the microstate of the system the entropy is zero. If you wanted to understand what I thought about "low entropy", that was NEVER stated. The thread is based off what is stated and what is not stated. It is not based off of what your internal thoughts and intentions are. If you want the topic to be based off your own thoughts, they need to be expressed explicitly in statements. You did so just now, but way too late.

>Note that if for some reason all the particles in the gas go away from the thermometer for a second the temperature of the mercury won't change (ignoring that it will radiate energy over time). The reading of the thermometer doesn't drop to zero just because there is nothing there.

This is pedantic. Obviously I can take it into account. Let's say the time it takes for a particle at the greatest measured speed to travel across the box is the time it takes for the thermometer to drop from the maximum temperature to zero degrees. It's quite fast, but particles within the vicinity will keep the mercury level stable; if they were all concentrated in a corner, the mercury would drop fast enough to change the reading of the thermometer.

>Let's assume that you are somehow measuring the temperature in a small region somewhere with a very high reaction time. What you measure will not depend in any way on the particles that are elsewhere. The particles in the rest of the box could be well spread or all near one corner and you would have no reason to say that the latter configuration is lower entropy than the former based on your macrostate.

I specifically defined it as a thermometer to make location matter. Switching the location of the thermometer is switching the definition as well.

I really don't see what your point is. You think I'm wrong about something? What am I wrong about?

>But at some intermediate point the reading of the thermometer will be the same as in equilibrium. Your macrostate will be the same as in equilibrium. Your entropy will not be lower than in equilibrium. Even though you said -if I understood you correctly- that the entropy should be lower because the higher concentration has a lower probability of occurring.

Your point seems to be buried in here somewhere and I can't parse it. There is a macrostate that is equilibrium, yes.

Are you referring to the mercury level mid-transition? This is pedantry to the max if you are. Yes, the mercury level will display the WRONG temperature if it's mid-transition. I'm not willing to constantly adjust the model to little flaws you find. Last time: let's switch to a digital thermometer that displays temperature at time intervals equal to the length of time it takes for the maximum speed particle to travel across the box. There is no transition value now. All temperature readings reflect an instantaneous observed truth at an instantaneous point in time, but that temperature is displayed at non-instantaneous intervals.

It also seems to me that your definition of macrostate is meaningless. The total energy of the universe is hypothetically the same all the time. If the macrostate was just energy there's no point to it, because entropy would then be an unchanging constant.

I think we're done here. I'm just trying to guess what you're driving at. You'll need to clarify your point if you want me to continue. What exactly are you trying to say here?


> Let's say the time it takes for a particle at the greatest measured speed to travel across the box is the time it takes for the thermometer to drop from the maximum temperature to zero degrees. It's quite fast, but particles within the vicinity will keep the mercury level stable; if they were all concentrated in a corner, the mercury would drop fast enough to change the reading of the thermometer.

That's not how a mercury thermometer works. The mercury "drops" when the temperature of the mercury goes down. This happens when there is a flow of energy between the thermometer and the matter inside the box. If all the particles were concentrated elsewhere there would be no interaction, no transfer of energy, no change in the temperature of the mercury, no change in its level. It's not a question of waiting a second to see it go to zero in a vacuum. (Again, I'm considering thermal contact, which is the main effect in play, and ignoring radiation energy losses.)

But that's an irrelevant aside and I had already accepted that you can get that kind of reading with some kind of thermometer. I'll concentrate on what I think is the main open point:

> I specifically defined it as a thermometer to make location matter. Switching the location of the thermometer is switching the definition as well.

I'm not switching the location of the thermometer. I'm switching the location of the particles. I'm trying to understand how the entropy - that you calculate from the temperature reading in one thermometer placed in some specific place - depends on the location of the particles.

You've written that: "The higher concentration has a lower probability of occurring. And occupies a different temperature reading on the thermometer. Each temperature reading is a different macrostate."

But if only the particles in the vicinity of the thermometer matter for the local temperature number that you take as describing the macrostate you would get the same reading (and the same entropy) if the rest of the particles changed positions -being more or less concentrated- as long as they remained out of the vicinity of the thermometer. Is that wrong?

> Your point seems to be buried in here somewhere and I can't parse it.

Maybe it was a mistake to talk again about equilibrium... I was trying to show that there are also concentrated configurations of the particles that you will nevertheless consider maximum entropy. Using your determination of the macrostate as "what this thermometer says": if the temperature is equal to the equilibrium temperature then the entropy is the same, right?

The main point is that instead of having the balls "positioned near one corner in the cube" they could be positioned in exactly the same relative configuration (the same concentration) near any other corner and the thermometer reading would be different. The entropy is not directly related to the concentration - it depends also on the location of the particles relative to the thermometer. Or maybe the thermometer reading will be the same when the particles are in any corner, but then it will also be the same for other relative positions, so you would also have different levels of concentration corresponding to the same entropy.

I was trying to point out that having a concentration of the particles (different from the homogeneous equilibrium state) doesn't automatically mean lower entropy in your example. The same relative position of the particles among themselves can correspond to different temperatures in your thermometer - including the equilibrium temperature. (The idea was that if the concentrated particles are too far the temperature is "too cold". And maybe if they are too close the temperature is "too hot". But there is at least some goldilocks intermediate distance such that if you put there the initial concentrated state the temperature according to you would be "just right".)


>That's not how a mercury thermometer works. The mercury "drops" when the temperature of the mercury goes down. This happens when there is a flow of energy between the thermometer and the matter inside the box. If all the particles were concentrated elsewhere there would be no interaction, no transfer of energy, no change in the mercury level. (Again, I'm considering thermal contact, which is the main effect in play, and ignoring radiation energy losses.)

More pedantry. I'm using an example to make a point, and I CLEARLY already edited the example enough times that it's obvious I'm not building a physically accurate model. Yes, as you said, it is completely irrelevant.

>But if only the particles in the vicinity of the thermometer matter for the local temperature number that you take as describing the macrostate you would get the same reading (and the same entropy) if the rest of the particles changed positions as long as they remained out of the vicinity of the thermometer. Is that wrong?

No, that's right. The particles are large in number. Any state where no particles are in the vicinity of the thermometer has a low probability. All microstates with this property belong to the 0 temperature macrostate.

>The main point is that instead of having the balls "positioned near one corner in the cube" they could be positioned in exactly the same relative configuration (the same concentration) near any other corner and the thermometer reading would be different. The entropy is not directly related to the concentration - it depends also on the location of the particles relative to the thermometer. Or maybe the thermometer reading would be the same when the particles are in any corner, but then it will also be the same for other relative positions, so you would also have different levels of concentration corresponding to the same entropy.

In this definition the macrostate is related to vicinity. And a physical thermometer in reality is also directly related to the particles in actual contact with it. Vicinity matters in both cases.

>I was trying to point out that having a concentration of the particles (different from the homogeneous equilibrium state) doesn't automatically mean lower entropy in your example. The same relative position of the particles among themselves can correspond to different temperatures in your thermometer - including the equilibrium temperature. (The idea was that if the concentrated particles are too far the temperature is "too cold". And maybe if they are too close the temperature is "too hot". But there is some goldilocks intermediate distance such that if you put there the initial concentrated state the temperature according to you would be "just right".)

You're not getting it. The entropy states I'm describing are so low in probability that they basically never will happen in your lifetime or the lifetime of this universe.

The reality is, if particles are concentrated in the corner of a box then yes, a thermometer will register 0, but you will never see this.
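To put a number on "never": assuming, as a toy model, that at equilibrium each particle's position is independent and uniform over the box, the chance that all N particles sit in one corner octant at a given instant is (1/8)^N. A quick sketch (the function name and particle counts are my own illustration, not anything from the thread):

```python
import math

def corner_probability(n_particles):
    # Toy model: independent, uniform positions. A corner octant is
    # 1/8 of the box volume, so all N inside it has probability (1/8)^N.
    return (1.0 / 8.0) ** n_particles

print(corner_probability(10))               # ~9.3e-10 for just 10 particles
print(math.log10(corner_probability(100)))  # ~ -90: never in any lifetime
```

Even for 100 particles the probability already has about 90 zeros after the decimal point; a realistic gas has ~10^23 particles.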

They don't jibe with your intuition of temperature because of this. But also because the word temperature is poorly defined when applied to a box. What does it mean when you say "the temperature of the room"? What part of the room at what point in time? That's why I referenced the thermometer, to prevent this type of tangent in the discussion.

There are also ambiguities like: what is the temperature of a volume of space smaller than any particle can occupy? But it's pointless to discuss all of this because this isn't a discussion about anything other than linguistics. We are arguing about language and vocabulary here, and that to me is not an interesting thing to talk about. It's an illusion. It's like when people argue about "what is life?" without realizing they're just arguing about the definition of an arbitrary vocabulary word. That's sort of what's going on here, I believe.

If you are in agreement with me that entropy is independent of knowledge then we're good; because as far as I was concerned this was the point you were trying to make.


> Any state where no particles are in the vicinity of the thermometer has a low probability. All microstates with this property belong to the 0 temperature macrostate.

Ok. I'm glad to read that we finally agree that in your example -using the macrostate you defined- the entropy of the "particles near a corner" state is not different from the entropy of any other "particles not near the thermometer" state.

Right after you said "Temperature as measured by a thermometer is the macrostate. Every possible configuration of particles (microstates) that causes mercury to rise to a certain degree represents a different macrostate." - I asked "How does your configuration, where the balls were near one corner of the cube, cause mercury to rise to a different level than the configuration where they occupy a larger volume near the center?"

You could have answered "it doesn't - that's the same macrostate with the same temperature and the same entropy" back then.

Later I pointed out that "Reducing the space occupied by the initial configuration [doesn't change] the temperature - which you say is 0 in all those cases - or the macrostate - which you call absolute zero macrostate in all those cases. The entropy would be the same whenever the particles are more concentrated than in your example - even though they would have lower probability of occurrence. Don't you agree?"

Instead of answering "I don't agree. Entropy is lower when the current macrostate has a low probability of occurring." you could have said "I agree. I describe the temperature as zero in all those cases, it's the same macrostate and the entropy will be the same."

Better late than never, anyway.

> the word temperature is poorly defined when applied to a box.

I agree! You're the one who picked the reading of that thermometer as the thing that would determine the macrostate and the entropy - and maintained that higher concentrations would mean different thermometer readings and lower entropy.

> If you are in agreement with me that entropy is independent of knowledge then we're good; because as far as I was concerned this was the point you were trying to make.

If we agree that the entropy is not a physical property of (the microstate of) the system but a property of a particular (thermodynamical) description of the system - and that different descriptions of the same physical system are possible resulting in different entropies - we're good. That's the clarification I wanted to make.


>Ok. I'm glad to read that we finally agree that in your example -using the macrostate you defined- the entropy of the "particles near a corner" state is not different from the entropy of any other "particles not near the thermometer" state.

This has been the definition since I brought it up. That's how thermometers work. It's the obvious consequence of my description, and it's how physical thermometers generally work. You never made it clear that was your problem with it. Well, now that we're both clear there is no problem with it, I think initially it just didn't jibe with your intuition about temperature.

>You could have answered "it doesn't - that's the same macrostate with the same temperature and the same entropy" back then.

I did answer this: "The balls have to touch the thermometer. Otherwise the thermometer observes nothing." You failed to pick up on what I meant. If the balls don't touch the thermometer, it means "it doesn't" change the thermometer and therefore it doesn't change the macrostate.

You just didn't get it. OK, you can accuse me of not explaining it thoroughly. But I can equally accuse you of not being capable of interpreting what I said. I think what's going on here is that you're too stuck in your headspace of thinking that your intuition of entropy was more accurate than mine and you wanted me to come to this realization. I think what happened here is the reverse of this. Your intuition about entropy was less crystallized than mine, and most of the communication issues we had were because you thought it was the other way around.

>Instead of answering "I don't agree. Entropy is lower when the current macrostate has a low probability of occuring." you could have said "I agree. I describe the temperature as zero in all those cases, it's the same macrostate and the entropy will be the same."

I told you to refer to my explanation of the macrostate. Yeah in all those cases it will be the same. Ok fine. I could've said that. But I didn't realize that was what you were hung up on. To throw it back at you, you could've said this:

"it doesn't make sense to me that different concentrations of particles in different parts of the box yield the same temperature reading AKA the macrostate. Please explain how this can be possible"

But again I think you didn't realize this was your hang-up. I think you thought my definition of macrostate was "bad" or something and you were trying to make me realize it?

>I agree! You're the one who picked the reading of that thermometer as the thing that would determine the macrostate and the entropy - and maintained that higher concentrations would mean different thermometer readings and lower entropy.

Yeah, I picked it because it's the most intuitive one to illustrate rising entropy. The "2nd law". It's the classic model everyone has in their heads of the "heat death" of the universe.

There are technical issues with the word, but it's the most commonly used one, and we only discussed those technicalities because of a fundamental misunderstanding on your end. If the intuition were correct, the technicalities would not need to be discussed, or thought about.

>If we agree that the entropy is not a physical property of (the microstate of) the system but a property of a particular (thermodynamical) description of the system -

Sort of. Entropy is directly tied to the macrostate, but the macrostate is determined by microstates. There is a relationship from entropy to microstate, but it is not direct.

Also, ALL possible macrostate configurations ARE independent of knowledge of the system. There is no case where knowledge can influence it. There is no case where your initial reply: "If you know the positions and velocities of the balls at all times arguably the entropy is always zero." is correct.

Unless of course you define a macrostate where it's zero all the time. But obviously what you're saying here is that knowledge changes the entropy. Which is not true.


> Gibbs

Ok. If you have a system at constant energy they are the same (all the microstates have the same probability). If you have a system at constant temperature a microstate can always correspond to different macrostates (it seems you're not concerned about that ambiguity though).

In any case, the entropy is a property of (the ensemble of microstates that conform) the macrostate and not a property of the microstate. For a given macrostate you cannot use Gibbs entropy to talk about low-entropy and high-entropy microstates.
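To make that concrete, here is a minimal sketch of the Gibbs formula S = -k_B Σ p_i ln p_i (the ensemble size is an arbitrary illustration): the entropy is a function of the whole probability distribution over microstates, so it cannot be evaluated for a single microstate in isolation.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probabilities):
    # S = -k_B * sum(p_i * ln p_i), taken over the ensemble of
    # microstates that make up the macrostate.
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# With W equally likely microstates this reduces to Boltzmann's S = k_B ln W:
W = 1000
assert math.isclose(gibbs_entropy([1.0 / W] * W), K_B * math.log(W))
```

Note that the argument is the distribution, not a microstate: asking for "the Gibbs entropy of this particular configuration of balls" is a type error.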

> Temperature as measured by a thermometer is the macrostate. Every possible configuration of particles (microstates) that causes mercury to rise to a certain degree represents a different macrostate.

How does your configuration, where the balls were near one corner of the cube, cause mercury to rise to a different level than the configuration where they occupy a larger volume near the center?

> And yes it changes with time; even at equilibrium.

What’s your definition of equilibrium? It’s hard for me to see what are we talking about, really.

> If balls are located on one side of the box and the thermometer is on the other side of the box. The thermometer reads nothing.

One could also say that if the balls are located anywhere within the box the thermometer reads nothing. The thermometer reading exists only when enough time has passed and the mercury-balls composite system is in equilibrium.

> When the balls increase in entropy they collide with the thermometer producing a reading.

Apparently you say that the entropy increases because the macrostate changes. But the macrostate is undefined until the balls hit the thermometer. But the entropy has to increase for the balls to move. It’s all a bit confusing, don’t you think?

Won’t the balls collide with the thermometer even if the entropy doesn’t change? It’s their movement that will take them there.

Consider your state - or any other state, for that matter - at t=0 and the state at t=1ps where the balls have advanced slightly. Is the macrostate different? How? Is the entropy different? How?


>How does your configuration, where the balls were near one corner of the cube, cause mercury to rise to a different level than the configuration where they occupy a larger volume near the center?

The balls have to touch the thermometer. Otherwise the thermometer observes nothing. Deriving the exact formula of this interaction is complex, but the intuition makes sense.

Interaction with the thermometer at a certain energy level produces a macrostate at temperature T that is high probability. You see this macrostate throughout your lifetime. But nothing precludes the low probability macrostate (where all balls are at another corner of the box) from occurring. It's just that this macrostate has such a low probability of occurring that you never see it. When nothing touches the thermometer, that temperature reading is the absolute zero macrostate.

>Apparently you say that the entropy increases because the macrostate changes. But the macrostate is undefined until the balls hit the thermometer. But the entropy has to increase for the balls to move. It’s all a bit confusing, don’t you think?

No. The macrostate is never undefined. I defined it as the measurement on that thermometer. Whatever you read on that thermometer (let's assume mercury levels move instantaneously and not be pedantic), THAT is the current macrostate. I chose this set of macrostates, thus I pick that definition. There is no notion of undefined; the thermometer always HAS a reading.

>Won’t the balls collide with the thermometer even if the entropy doesn’t change? It’s their movement that will take them there.

They can't collide with the thermometer if they're on the other side of the box.

>Consider your state - or any other state, for that matter - at t=0 and the state at t=1ps where the balls have advanced slightly. Is the macrostate different? How? Is the entropy different? How?

Macrostate doesn't change if they're still far away from the thermometer. Still absolute zero. Only the microstate changed. The absolute zero macrostate includes all configurations of particles that will cause the thermometer to read 0. The probability of this macrostate occurring is quite low.

Entropy in this state is low. It increases as more and more particles interact with the thermometer.

The most probable series of events is that particles in the left corner of the box will begin to spread more evenly around the box. More and more particles will begin interacting with the thermometer, causing the temperature to rise until it reaches some equilibrium. All the particles could, by sheer luck, suddenly concentrate in another corner of the box, but this is a low-probability event.
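That series of events can be sketched with a toy diffusion model (everything here - the 1-D box, the step size, and the [0.9, 1] "thermometer vicinity" - is my own illustrative choice, not a physical simulation):

```python
import random

def simulate(n=200, steps=3000, seed=42):
    # Particles start concentrated near x = 0 (the far corner); the
    # "thermometer vicinity" is the interval [0.9, 1].
    rng = random.Random(seed)
    xs = [rng.uniform(0.0, 0.05) for _ in range(n)]
    start_frac = sum(1 for x in xs if x >= 0.9) / n  # 0.0: nothing near the thermometer
    for _ in range(steps):
        for i in range(n):
            x = xs[i] + rng.gauss(0.0, 0.01)  # small random kick
            x = abs(x)                        # reflect at the wall x = 0
            if x > 1.0:
                x = 2.0 - x                   # reflect at the wall x = 1
            xs[i] = x
    end_frac = sum(1 for x in xs if x >= 0.9) / n
    return start_frac, end_frac

start, end = simulate()  # start is 0.0; end drifts toward the uniform share
```

The fraction near the thermometer starts at zero and drifts toward the uniform-box value, which is the "temperature rises until it reaches some equilibrium" story above.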

To bring it back around to the main point: all of this is independent of your knowledge of the microstate. Your knowledge does not affect the outcome.


> The absolute zero macrostate includes all configurations of particles that will cause the thermometer to read 0. The probability of this macrostate occurring is quite low.

"Let's say those balls all have a random initial velocity at a random direction but all those balls are initially positioned near one corner in the cube. Thus the balls from a position standpoint start with low entropy."

In every single initial configuration the particles will be at some distance from the thermometer and will cause it to read 0 according to your reasoning (until some time passes). I guess they all correspond to the same "absolute zero macrostate".

What I still don't see is how do you calculate the entropy using Gibbs entropy formula in such a way that the entropy is lower or higher for some initial configuration depending on the position of the particles. Gibbs entropy is not a property of the microstate - it's a property of the macrostate.
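For reference, the standard way to make entropy sensitive to particle positions is to change the macrostate: coarse-grain the box into cells and take the cell occupancies as the macrostate. Then S = k_B ln W, with W the number of ways to assign the N particles to cells with those occupancies. A sketch (the cell and particle counts are my own illustration, not something either of us specified):

```python
import math

def coarse_grained_entropy(occupancies):
    # S = k_B * ln W, where W = N! / prod(n_cell!) counts the ways to
    # place N labelled particles into cells with the given occupancies.
    k_b = 1.380649e-23
    n = sum(occupancies)
    log_w = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in occupancies)
    return k_b * log_w

concentrated = [80, 0, 0, 0, 0, 0, 0, 0]  # all particles in one octant
uniform = [10] * 8                        # spread evenly over 8 octants
assert coarse_grained_entropy(concentrated) == 0.0  # a single arrangement
assert coarse_grained_entropy(uniform) > 0.0
```

Even here the entropy is a property of the macrostate (the occupancy vector), not of any individual microstate.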



