EMC Release v9.4.4 20250801

Dear users,

A new version v9.4.4 of EMC (August 1, 2025) has been added to http://montecarlo.sourceforge.net/. Targets are available for Linux x86_64 and aarch64, macOS, and Windows. This version adds GROMACS as an MD engine, updates emc.pl for EMC Setup (the EMC simulation workflow environment), and updates the CHARMM c36a and OPLS force fields, both of which have been converted to use bond increments for partial charge assignment. I have also addressed issues reported on the EMC - Materials Science Community Discourse.

As of August 2, 2025, the initial issues with the Windows target (as described below) have been resolved. Please note that the Windows target is an order of magnitude slower than the other targets, which became apparent when I compared emc_linux_aarch64 (WSL Ubuntu on native ARM) and emc_linux_x86_64 (WSL Ubuntu on native Intel) with emc_win32.exe run in PowerShell on both architectures. I am somewhat puzzled as to the cause, but it originates in the mingw-w64 cross-compiler I use. I do not have a direct solution for this problem other than using the Linux targets in combination with WSL (Windows Subsystem for Linux).
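The WSL workaround above can be scripted. The following is a minimal sketch, assuming WSL and a Linux EMC binary are already installed on the Windows machine; the binary name and script name come from this thread, but your installation may differ:

```python
import subprocess

def wsl_emc_command(script="build.emc", emc_binary="emc_linux_x86_64"):
    """Build the command line that runs a Linux EMC target inside WSL.

    `wsl <command>` executes the command in the default WSL distribution.
    The binary name is illustrative; it must be on the PATH inside WSL.
    """
    return ["wsl", emc_binary, script]

cmd = wsl_emc_command("build.emc")
# On a machine with WSL and EMC installed, one would run:
# subprocess.run(cmd, check=True)
print(cmd)
```

This only constructs the command; the commented-out `subprocess.run` call would execute it on a suitably configured machine.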


Thanks for your work!

After updating to the latest version (emc_win32_v9.4.4_20250801), we encounter the following error when running “emc_win32.exe build.emc”:

"Error: core/format.c:363 FormatOpen:
       Error opening file "build.emc".
       Program aborted."

However, our pipeline was still working properly yesterday with the old version (emc_win32_v9.4.4_20240801).

We would like to seek your expertise on where the error occurs in the process described above.


Dear Siryokait,

I am sorry to hear that, and I am somewhat puzzled. Do you have an example .esh file? That would help me pinpoint the issue.

Best,
Pieter

I’ve gotten the same error message as well:

Error: core/format.c:363 FormatOpen:
Error opening file “build.emc”.
Program aborted.

My code worked perfectly with the previous version, which expired yesterday, but it breaks with the new version. For your reference, here are my example.esh and build.emc files (I use emc-pypi to run them):

example.esh:

#!/usr/bin/env emc.pl

ITEM OPTIONS

replace true
field pcff
location field
density 0.05
ntotal 1200

ITEM END

# Groups

ITEM GROUPS

DP *c1ccc(cc1)c1ccc*(cc1),1,DP:2
methyl *C,1,DP:1,1,DP:2

ITEM END

# Clusters

ITEM CLUSTERS

poly alternate 1

ITEM END

# Polymers

ITEM POLYMERS

poly
30 DP,10,methyl,2

ITEM END

build.emc:

(* EMC: Script *)
(* Created by C:\Users\jihoon\Desktop\softwares\emc-pypi\venv\Lib\site-packages\pyemc\emc\scripts\emc.pl v5.1, July 11, 2024
   on Fri Aug 01 18:07:21 KST 202 *)
(* define variables *)
variables	= {
  seed		-> -1,
  ntotal	-> 1200,
  fshape	-> 1,
  output	-> "example",
  field		-> "pcff/pcff",
  location	-> "C:/Users/jihoon/Desktop/softwares/emc_win32_v9.4.4_20250801/v9.4.4/field/",
  nav		-> 0.6022141179,
  temperature	-> 300,
  radius	-> 5,
  nrelax	-> 100,
  weight_nonbond -> 0.0001,
  weight_bond	-> 0.0001,
  weight_focus	-> 1,
  cutoff	-> 9.5,
  charge_cutoff	-> 9.5,
  kappa		-> 4,

  density1	-> 0.05,
  lprevious	-> 0,
  lphase	-> 0,

  f_poly	-> 1,

  chem_DP	-> "*c1ccc(cc1)c1ccc*(cc1)",
  chem_methyl	-> "*C"
};

output		= {
  debug		-> false,
  exit		-> true,
  info		-> true,
  warning	-> true
};

(* define force field *)

field		= {
  id		-> pcff,
  mode		-> cff,
  name		-> {location+field+".frc", location+field+"_templates.dat"},
  compress	-> false
};

(* define regular groups *)

groups		= {
  group		-> {
    id		-> DP, depth -> 8, chemistry -> chem_DP,
    connects	-> {
      {source	-> $end1, destination -> {DP, $end2}},
      {source	-> $end1, destination -> {methyl, $end1}},
      {source	-> $end2, destination -> {DP, $end1}},
      {source	-> $end2, destination -> {methyl, $end1}}
    }
  },
  group		-> {
    id		-> methyl, depth -> 8, chemistry -> chem_methyl,
    connects	-> {
      {source	-> $end1, destination -> {DP, $end1}},
      {source	-> $end1, destination -> {DP, $end2}}
    }
  }
};

field		= {
  mode		-> apply,
  check		-> {
    atomistic	-> true,
    charge	-> true
  },
  debug		-> false
};

(* determine simulation sizing *)

variables	= {

  (* lengths *)

  lg_DP		-> nsites(DP),
  lg_methyl	-> nsites(methyl),

  l_poly	-> 10*lg_DP+6*lg_TP+2*lg_methyl,

  (* masses *)

  mg_DP		-> mass(DP),
  mg_methyl	-> mass(methyl),

  m_poly	-> 10*mg_DP+6*mg_TP+2*mg_methyl,

  (* mol fractions *)

  f_poly	-> f_poly*l_poly,

  (* normalization *)

  norm		-> f_poly,

  f_poly	-> f_poly/norm,

  (* sizing *)

  n_poly	-> int(f_poly*ntotal/l_poly+0.5),


  (* system sizing *)

  ntotal	-> 0,
  mtotal	-> 0,
  vtotal	-> 0
};

field		= {
  mode		-> apply,
  check		-> {
    atomistic	-> true,
    charge	-> true
  },
  debug		-> false
};

(* define interactions *)

simulation	= {
  units		-> {
    permittivity -> 1,
    seed	-> seed
  },
  types		-> {
    coulomb	-> {
      pair	-> {active -> true, cutoff -> charge_cutoff}
    }
  }
};

(* clusters phase 1 *)

clusters	= {
  progress	-> none,
  polymer	-> {
    id		-> poly, system -> main, type -> alternate,
    n		-> n_poly,
    groups	-> {DP, TP, methyl},
    weights	-> {1, 1, 1},
    nrepeat	-> {10, 6, 2}
  }
};

field		= {
  mode		-> apply,
  check		-> {
    atomistic	-> true,
    charge	-> true
  },
  debug		-> false
};

(* build phase 1 *)

variables	= {
  nphase1	-> ntotal()-ntotal,
  mphase1	-> mtotal()-mtotal,
  vphase1	-> mphase1/nav/density1,
  lbox		-> (vphase1/fshape)^(1/3),
  lphase1	-> fshape*lbox,
  lphase	-> lphase1,
  lxx		-> lphase,
  lyy		-> lbox,
  lzz		-> lbox,
  lzy		-> 0,
  lzx		-> 0,
  lyx		-> 0,
  ntotal	-> nphase1,
  mtotal	-> mphase1,
  vtotal	-> vphase1
};

types		= {
  inverse	-> {
    cutoff	-> 0.001
  },
  cff		-> {
    pair	-> {active -> true, mode -> repulsive}
  }
};

build		= {
  system	-> {
    id		-> main,
    split	-> false,
    geometry	-> {xx -> lxx, yy -> lyy, zz -> lzz,
		    zy -> lzy, zx -> lzx, yx -> lyx},
    temperature	-> temperature,
    flag	-> {charge -> true, geometry -> true, map -> true, pbc -> true}
  },
  select	-> {
    progress	-> list,
    frequency	-> 1,
    name	-> "error",
    order	-> random,
    cluster	-> {poly},
    relax	-> {ncycles -> nrelax, radius -> radius},
    grow	-> {
      method	-> energetic,
      check	-> all,
      nbonded	-> 20,
      ntrials	-> 20,
      niterations -> 1000,
      theta	-> 0,
      weight	-> {
	bonded	-> weight_bond, nonbonded -> weight_nonbond,
	focus	-> weight_focus}
    }
  }
};

force		= {style -> none, message -> nkt};
force		= {style -> init, message -> nkt};

(* LAMMPS profile variables *)

variables	= {
  nl_poly	-> nclusters(clusters -> poly)
};

(* storage *)

put		= {name -> output, compress -> true};

lammps		= {name -> output, mode -> put, forcefield -> cff,
		   parameters -> true, types -> false, unwrap -> true,
		   charges -> true, ewald -> true};

pdb		= {name -> output, compress -> true, extend -> false,
		   forcefield -> cff, detect -> false, hexadecimal -> false,
		   unwrap -> true, pbc -> true, atom -> index, residue -> index,
		   segment -> index, rank -> false, vdw -> false, cut -> false,
		   fixed -> true, rigid -> true, connectivity -> false,
		   parameters -> false};

Dear veld,

Thank you again. The content of our build.emc is listed below:

(* EMC: Script *)

(* Created by emc.pl v5.1, July 11, 2024
   on Tue Jul 22 11:11:05 *)

(* define variables *)

variables	= {
  seed		-> -1,
  ntotal	-> 29,
  fshape	-> 1,
  output	-> "setup",
  field		-> "pcff/pcff",
  location	-> "D:/EMC/v9.4.4/field/",

  nav		-> 0.6022141179,
  temperature	-> 300,
  radius	-> 5,
  nrelax	-> 100,
  weight_nonbond -> 0.0001,
  weight_bond	-> 0.0001,
  weight_focus	-> 1,
  cutoff	-> 9.5,
  charge_cutoff	-> 9.5,
  kappa		-> 4,

  density1	-> 1,
  lprevious	-> 0,
  lphase	-> 0,

  f_Mol		-> 100,

  chem_Mol	-> "CC(C)C(P)=CC1=CCOCC1"
};

output		= {
  debug		-> false,
  exit		-> true,
  info		-> true,
  warning	-> true
};

(* define force field *)

field		= {
  id		-> pcff,
  mode		-> cff,
  name		-> {location+field+".frc", location+field+"_templates.dat"},
  compress	-> false
};

(* define regular groups *)

groups		= {
  group		-> {
    id		-> Mol,
    depth	-> 8,
    chemistry	-> chem_Mol
  }
};

field		= {
  mode		-> apply,
  check		-> {
    atomistic	-> true,
    charge	-> true
  },
  debug		-> false
};

(* determine simulation sizing *)

variables	= {

  (* lengths *)

  lg_Mol	-> nsites(Mol),

  l_Mol		-> lg_Mol,

  (* masses *)

  mg_Mol	-> mass(Mol),

  m_Mol		-> mg_Mol,

  (* mass fractions *)

  f_Mol		-> f_Mol*l_Mol/m_Mol,

  (* normalization *)

  norm		-> f_Mol,

  f_Mol		-> f_Mol/norm,

  (* sizing *)

  n_Mol		-> int(f_Mol*ntotal/l_Mol+0.5),

  (* system sizing *)

  ntotal	-> 0,
  mtotal	-> 0,
  vtotal	-> 0
};

field		= {
  mode		-> apply,
  check		-> {
    atomistic	-> true,
    charge	-> true
  },
  debug		-> false
};

(* define interactions *)

simulation	= {
  units		-> {
    permittivity -> 1,
    seed	-> seed
  },
  types		-> {
    coulomb	-> {
      pair	-> {active -> true, cutoff -> charge_cutoff}
    }
  }
};

(* clusters phase 1 *)

clusters	= {
  progress	-> none,
  cluster	-> {
    id		-> Mol, system -> main, group -> Mol, n -> n_Mol}
};

field		= {
  mode		-> apply,
  check		-> {
    atomistic	-> true,
    charge	-> true
  },
  debug		-> false
};

(* build phase 1 *)

variables	= {
  nphase1	-> ntotal()-ntotal,
  mphase1	-> mtotal()-mtotal,
  vphase1	-> mphase1/nav/density1,
  lbox		-> (vphase1/fshape)^(1/3),
  lphase1	-> fshape*lbox,
  lphase	-> lphase1,
  lxx		-> lphase,
  lyy		-> lbox,
  lzz		-> lbox,
  lzy		-> 0,
  lzx		-> 0,
  lyx		-> 0,
  ntotal	-> nphase1,
  mtotal	-> mphase1,
  vtotal	-> vphase1
};

types		= {
  cff		-> {
    pair	-> {active -> true, mode -> repulsive}
  }
};

build		= {
  system	-> {
    id		-> main,
    split	-> false,
    geometry	-> {xx -> lxx, yy -> lyy, zz -> lzz,
		    zy -> lzy, zx -> lzx, yx -> lyx},
    temperature	-> temperature,
    flag	-> {charge -> true, geometry -> true, map -> true, pbc -> true}
  },
  select	-> {
    progress	-> list,
    frequency	-> 1,
    name	-> "error",
    order	-> random,
    cluster	-> {Mol},
    relax	-> {ncycles -> nrelax, radius -> radius},
    grow	-> {
      method	-> energetic,
      check	-> all,
      nbonded	-> 20,
      ntrials	-> 20,
      niterations -> 1000,
      theta	-> 0,
      weight	-> {
	bonded	-> weight_bond, nonbonded -> weight_nonbond,
	focus	-> weight_focus}
    }
  }
};

force		= {style -> none, message -> nkt};
force		= {style -> init, message -> nkt};

(* storage *)

put		= {name -> output, compress -> true};

lammps		= {name -> output, mode -> put, forcefield -> cff,
		   parameters -> true, types -> false, unwrap -> true,
		   charges -> true, ewald -> true};

pdb		= {name -> output, compress -> true, extend -> false,
		   forcefield -> cff, detect -> false, hexadecimal -> false,
		   unwrap -> true, pbc -> true, atom -> index, residue -> index,
		   segment -> index, rank -> false, vdw -> false, cut -> false,
		   fixed -> true, rigid -> true, connectivity -> false,
		   parameters -> false};

Dear users,

I checked the code. Apparently Windows now interprets the opening of files slightly differently, causing previously working code to fail. I removed the bug, recompiled the Windows target, and uploaded it to https://montecarlo.sourceforge.net/.

As an aside, I noticed that the Windows target is about an order of magnitude slower than a native Linux target running under WSL Ubuntu (Windows Subsystem for Linux with Ubuntu as the Linux distribution). I have access to both ARM and Intel architectures and thus tested emc_linux_aarch64 and emc_linux_x86_64 on their respective architectures against emc_win32.exe on both.


Thanks for resolving the bug; it works perfectly now. However, running EMC via emc-pypi still fails with a “Validity has run out” error. I found that the internal EMC call in pyemc does not correctly locate or invoke the emc_win32.exe binary using the EMC_ROOT environment variable. To fix this, I had to set the path manually in runner.py:

emc_exe = "C:/Path-to-EMC/v9.4.4/bin/emc_win32.exe"

This change allowed pyemc.build() to work as expected.
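A minimal sketch of such a lookup, assuming the conventional EMC_ROOT directory layout (a bin/ folder containing the platform binary) and the binary names mentioned in this thread; the actual runner.py internals may differ:

```python
import os
import platform

def find_emc_executable(emc_root=None):
    """Resolve the EMC binary from EMC_ROOT (or an explicit root path).

    Binary names follow the v9.4.4 targets mentioned in this thread;
    adjust the mapping if your installation uses different names.
    """
    root = emc_root or os.environ.get("EMC_ROOT", "")
    if not root:
        raise RuntimeError("EMC_ROOT is not set and no root path was given")
    names = {
        "Windows": "emc_win32.exe",
        "Linux": "emc_linux_x86_64",
    }
    # Fall back to the Linux binary name for platforms not listed above.
    binary = names.get(platform.system(), "emc_linux_x86_64")
    return os.path.join(root, "bin", binary)
```

With a helper like this, the package could fall back to EMC_ROOT instead of its bundled (and possibly expired) copy.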

This is probably because pyemc uses its own EMC installation (for example, C:\ProgramData\miniconda3\envs\ffpenv\Lib\site-packages\pyemc\emc), which is the expired 2024 version. I assume the problem will be fixed when emc-pypi is updated.

@hadi971 @kgk emc-pypi has been updated to use the 2025 source. Feel free to update the package to continue using it.