[lammps-users] compute temp


I am using fix temp/rescale to control the temperature in my system. If I define a group of atoms and apply temp/rescale to maintain them at 300 K every timestep, and later subdivide that same group into several subgroups and compute the temperature of each subgroup with compute temp, then the weighted average (sum over subgroups of the atom-count fraction times that subgroup's temperature) should also be 300 K. However, I observe a higher value, around 310 K, in my simulation. All atoms of the main group are included in the subgroups. If I output the temperatures, the main group is always at 300 K, but the weighted average does not give me the same value.
I am not sure why.


Possibly the extra 3 dof that you asked about earlier.
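To expand on that: by default compute temp subtracts extra degrees of freedom (3 in a 3d system) from each group's dof count, so temperature is computed from dof = 3N - 3. When you split one group of N atoms into k subgroups, each subgroup subtracts its own 3 dof, removing 3k in total instead of 3, which inflates every subgroup temperature slightly. A minimal Python sketch of the arithmetic (not LAMMPS itself; it assumes, purely for illustration, equal-size subgroups with evenly distributed kinetic energy):

```python
kB = 1.0  # work in reduced units; only ratios of dof matter here

def temp(ke, n, extra_dof=3):
    """Temperature from kinetic energy, LAMMPS-style: dof = 3N - extra_dof."""
    return 2.0 * ke / (kB * (3 * n - extra_dof))

N = 1000          # atoms in the main group (illustrative)
T_target = 300.0  # temperature enforced by fix temp/rescale

# Kinetic energy consistent with compute temp reporting 300 K on the full group
ke_total = 0.5 * kB * (3 * N - 3) * T_target

# Split into 10 equal subgroups, KE shared evenly (assumption for illustration)
k = 10
n_sub, ke_sub = N // k, ke_total / k
T_sub = temp(ke_sub, n_sub)  # each subgroup loses its own 3 dof

# Number-weighted average over the subgroups
T_avg = sum((n_sub / N) * T_sub for _ in range(k))

print(round(temp(ke_total, N), 2))  # -> 300.0  (main group)
print(round(T_avg, 2))              # -> 302.73 (weighted average runs high)
```

The bias grows as the subgroups get smaller, since 3 lost dof is a larger fraction of 3N_i for small N_i, which is consistent with the weighted average coming out above 300 K. If you want the subgroup temperatures to average correctly, you can zero the extra dof with compute_modify (extra 0) on each subgroup compute.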