Issue reading POSCAR file

Hello, I’m currently stuck on an error encountered while reading VASP files.

Code:

import os
import ase.io.vasp

test_path = "110 (1).vasp"
struc_file_path = os.path.join(unconverged_OH_input_dir, test_path)  # input directory defined earlier
test_atoms = ase.io.vasp.read_vasp(struc_file_path)

Error:

---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-50-9496c771ba88> in <cell line: 0>()
      4 struc_file_path = os.path.join(unconverged_OH_input_dir, test_path)
      5 print(struc_file_path)
----> 6 test_atoms = ase.io.vasp.read_vasp(struc_file_path)

/usr/local/lib/python3.11/dist-packages/ase/utils/__init__.py in iofunc(file, *args, **kwargs)
    597                 else:
    598                     fd = file
--> 599                 obj = func(fd, *args, **kwargs)
    600                 return obj
    601             finally:

/usr/local/lib/python3.11/dist-packages/ase/io/vasp.py in read_vasp(fd)
    154     """
    155     atoms = read_vasp_configuration(fd)
--> 156     velocities = read_velocities_if_present(fd, len(atoms))
    157     if velocities is not None:
    158         atoms.set_velocities(velocities)

/usr/local/lib/python3.11/dist-packages/ase/io/vasp.py in read_velocities_if_present(fd, natoms)
    286     for atom in range(natoms):
    287         words = fd.readline().split()
--> 288         assert len(words) == 3
    289         atoms_vel[atom] = (float(words[0]), float(words[1]), float(words[2]))
    290 

AssertionError:

However, my VASP files consist only of atomic coordinates and do not contain a velocity block. Here is a sample of one of my .vasp files as plain text:

Pd Au H  O
 1.0000000000000000
     5.5691730086252491    0.0000000000000000    0.0000000000000000
     0.0000000000000000    5.5691730086252491    0.0000000000000000
     0.0000000000000000    0.0000000000000000   25.8449999999999989
 Pd  Au  H   O
  18   6   1   1
Selective dynamics
Cartesian
  1.3922932521563123  1.3922932521563123  7.9999999999999840   F   F   F
  4.1768797564689368  1.3922932521563123  7.9999999999999840   F   F   F
  1.3922932521563123  4.1768797564689368  7.9999999999999840   F   F   F
  0.0000000000000000  0.0000000000000000  9.9689999999999905   F   F   F
  2.7845865043126246  0.0000000000000000  9.9689999999999905   F   F   F
  0.0000000000000000  2.7845865043126246  9.9689999999999905   F   F   F
  4.1768797564689368  1.3922932521563123 11.9379999999999971   F   F   F
  1.3922932521563123  4.1768797564689368 11.9379999999999971   F   F   F
  4.1768797564689368  4.1768797564689368 11.9379999999999971   F   F   F
  0.0000000000000000  0.0000000000000000 13.9070000000000018   F   F   F
  2.7845865043126246  0.0000000000000000 13.9070000000000018   F   F   F
  2.7845865043126246  2.7845865043126246 13.9070000000000018   F   F   F
  1.4029211019873069  1.3852510747159048 15.8773733993134503   T   T   T
  4.1649540695319152  1.3818252980473880 15.8620491422807426   T   T   T
  4.1742032551870603  4.1841260272710574 15.8774233609107878   T   T   T
  2.7902089164397355  0.0117218844916841 17.8631478750884547   T   T   T
  5.5537646063669106  2.7723249075390983 17.8299663138113829   T   T   T
  2.8103647347527998  2.7611848877951846 17.8234527575932447   T   T   T
  4.1768797564689368  4.1768797564689368  7.9999999999999840   F   F   F
  2.7845865043126246  2.7845865043126246  9.9689999999999905   F   F   F
  1.3922932521563123  1.3922932521563123 11.9379999999999971   F   F   F
  0.0000000000000000  2.7845865043126246 13.9070000000000018   F   F   F
  1.4086076651133190  4.1857036811018391 15.8940681169793656   T   T   T
  5.5502905347148177  0.0147530854615806 17.9413354810592871   T   T   T
  5.5520275705408642  1.3935389965003395 20.3856508974353332   T   T   T
  5.5520275705408642  1.3935389965003395 19.4065808974353331   T   T   T
                                                                            

Could anyone help me with this issue? Thank you

I can’t reproduce this. I copied the sample data to a file named POSCAR and opened it with ASE 3.26.0, both via ase gui from the shell and via ase.io.read and ase.io.vasp.read_vasp from Python.

Do you still have this problem?

Hello, do you mean that you did not get any errors when running this? Yes, I’m still getting the same error; could it be due to a dependency issue?

It works on my machine. To rule out a few variables, can you please try:

  • copy your sample above by clicking the clipboard icon at top right
  • paste it into a new text file named POSCAR
  • read this file with ase.io.read('POSCAR') (see the snippet below)
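
For reference, a minimal read along those lines (assuming the POSCAR file sits in your working directory) looks like:

from ase.io import read

atoms = read('POSCAR')   # ASE recognizes the POSCAR filename as the VASP format
print(atoms)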

Looking at the relevant code:

@reader
def read_vasp(fd):
    """Import POSCAR/CONTCAR type file.

    Reads unitcell, atom positions and constraints from the POSCAR/CONTCAR
    file and tries to read atom types from POSCAR/CONTCAR header, if this
    fails the atom types are read from OUTCAR or POTCAR file.
    """
    atoms = read_vasp_configuration(fd)
    velocities = read_velocities_if_present(fd, len(atoms))
    if velocities is not None:
        atoms.set_velocities(velocities)
    return atoms
...

def read_velocities_if_present(fd, natoms) -> np.ndarray | None:
    """Read velocities from POSCAR/CONTCAR if present, return in ASE units."""
    ac_type = fd.readline().strip()

    # Check if velocities are present
    if not ac_type:
        return None

    atoms_vel = np.empty((natoms, 3))
    for atom in range(natoms):
        ...

We see that read_velocities_if_present runs after the positions have been read. If the remaining lines in the file are empty, it is supposed to return None before reaching the assertion that raised the error.

That suggests two ways it might have gone wrong:

  • The positions reader exited too early
  • The velocity reader found something it shouldn’t have after the positions (a sketch of this case follows below)
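
As a minimal sketch of the second case (using a hypothetical one-atom POSCAR, not your actual file): a single stray non-whitespace character after the coordinate block makes ac_type truthy, so the reader expects velocity lines that aren’t there and the assertion fires.

import ase.io.vasp

# Hypothetical minimal POSCAR: one H atom in a cubic cell.
poscar = """H
 1.0
 3.0 0.0 0.0
 0.0 3.0 0.0
 0.0 0.0 3.0
 H
 1
Cartesian
 0.0 0.0 0.0
"""

with open('POSCAR_clean', 'w') as fd:
    fd.write(poscar)            # file ends after the coordinates

with open('POSCAR_stray', 'w') as fd:
    fd.write(poscar + 'x\n')    # stray character where a velocity block would start

print(ase.io.vasp.read_vasp('POSCAR_clean'))    # reads fine, velocities treated as absent

try:
    ase.io.vasp.read_vasp('POSCAR_stray')
except AssertionError:
    print('reproduced: assert len(words) == 3 in read_velocities_if_present')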

Hello, it works now after doing this. Thank you for the help!

How odd! It’s possible that something didn’t like the space character in your original filename, although all the ASE IO code should be resilient to this. Perhaps some strange whitespace or an invisible character sneaked into the file while viewing/editing it.
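
If it ever comes back, a quick way to spot invisible characters is to inspect the raw bytes at the end of the file (using your original filename; adjust as needed):

with open('110 (1).vasp', 'rb') as fd:
    tail = fd.read()[-40:]
print(repr(tail))   # stray bytes, e.g. a non-breaking space b'\xc2\xa0', show up in the repr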

Glad to hear you are up and running again, anyway!