Temperature in NEMD method continuously rising during fix nve when running on GPU server

Dear LAMMPS-USERS:

I use the NEMD method to calculate thermal conductivity on a CPU server, and everything worked fine. But when I submit my LAMMPS script to a GPU server, the temperature rises continuously during fix nve. I am really confused by this. I did not change my script for the GPU server, and I don't know whether a GPU script needs to differ from a CPU script. Thanks a lot! My script is below.

##################################NEMD method to calculate thermal conductivity #####################################

units metal
dimension 2
boundary p p f
atom_style atomic

#------------------------------Atom Defination------------------------------------------------

lattice custom 10.35 a1 3 0 0 a2 0 0.1 0.0 a3 0.0 0.0 80 &
basis 0.3 0.85 0.5 basis 0.3 0.75 0.75 &
basis 0.7 0.75 0.95 basis 0.8 0.25 0.15
region box block 0 $L 0 $W 0 $H
create_box 2 box
create_atoms 2 region box basis 1 2 basis 2 1 basis 3 2 basis 4 1
region hot block ${HOT_L} ${HOT_R} 0 $W 0 $H
region cold block ${COLD_L} ${COLD_R} 0 $W 0 $H
group hot region hot
group cold region cold
mass 1 14.811
mass 2 24.0067

#Force-field parameters
#B=1,N=1

pair_style tersoff
pair_coeff * * BNC.tersoff B N

neighbor 9.0 nsq
neigh_modify delay 1 every 13

#---------------------------------Define Setting----------------------------------------------

variable ke equal ke
variable pe equal pe
variable press equal press
variable vol equal vol
variable etotal equal etotal
variable temp equal temp
compute tot_temp all temp
compute myKE all ke/atom
variable temp1 atom c_myKE
compute hot_temp all temp/region hot
compute cold_temp all temp/region cold

min_style cg
minimize 1.0e-9 1.0e-45 80 111
reset_timestep 0

#---------------------------------------Run---------------------------------------------------
timestep 0.01
thermo 10000

velocity all create 300 13487594375 mom yes rot yes dist gaussian units box

#-------------------------------------fix npt-------------------------------------------------
fix NPT all npt temp 300 300 1
run 15000000
unfix NPT

#-------------------------------------fix nvt-------------------------------------------------
fix NVT all nvt temp 300 300 1
run 2000000
unfix NVT

#------------------------------------fix nve--------------------------------------------------
fix NVE all nve
run 1000000

#------------------------------------fix heat--------------------------------------------------
fix hot hot heat 1 65
fix cold cold heat 1 -65
compute layers all chunk/atom bin/1d x lower 0.05 units box
fix 3 all ave/chunk 1 200 200 layers v_temp1 file profile.heat
run 20000000


What version of LAMMPS are you using (on the CPU, and on the GPU)? And how was the GPU version compiled?
If possible, please provide the first part of the screen output, which contains all kinds of helpful status and settings data (including GPU initialization).

thanks,
     axel.