Dyntools: Dealing with many .OUT files when converting to .csv

Hi everybody, I'm running into errors when dealing with a large number of .OUT files. I'm trying to convert many .OUT files to .csv files for further processing in Python, using the code below in Spyder.

At a certain point, the following error messages start popping up:

All file units in use. genroe_con01_02.out (RWFIL)

(genroe_con01_02.out is one of the filenames.) The error message then pops up for every subsequent file. It seems like Dyntools has some kind of limitation: the problems start after 64 (2^6) files, which suggests that some kind of register is filling up, though I have also seen 66 files being processed. Removing all variables and rerunning the script does not do the trick; the only option is to restart the kernel.

edit: I have also noticed that after removing all variables and rerunning, I'm also unable to load other PSSE files, receiving the error messages below. Again, this requires restarting the kernel.

All file units in use. PSSEcasedata.cnv (OpnApplFil/OPNPTI)

All file units in use. PSSEcasedata.snp (OpnApplFil/OPNPTI)

Has anybody come across this problem before?

import os
import dyntools  # PSSE's channel-file utilities; requires the PSSE Python environment

files = os.listdir(os.curdir)
for filename in files:
    if filename.endswith(".out"):
        output_obj = dyntools.CHNF(filename)
        # Empty strings keep the default channel selection and output file names
        output_obj.csvout(channels='', csvfile='', outfile='', ntimestep=1)
        del output_obj  # deleting the object does not appear to release the file unit
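If the cause really is a fixed pool of Fortran file units inside the PSSE runtime (an assumption on my part, not confirmed behaviour), one workaround is to convert the files in batches below the apparent limit and run each batch in a short-lived child process, so the units are reclaimed when the process exits. A minimal sketch, assuming `dyntools` is importable in the child process:

```python
import multiprocessing
import os

UNIT_LIMIT = 60  # stay safely below the apparent 64-unit ceiling (assumption)

def batched(items, size):
    """Yield successive chunks of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def convert_batch(filenames):
    # dyntools is imported inside the worker so each child process starts
    # with a fresh PSSE runtime and, presumably, a fresh pool of file units.
    import dyntools
    for filename in filenames:
        chnf = dyntools.CHNF(filename)
        chnf.csvout(channels='', csvfile='', outfile='', ntimestep=1)

if __name__ == '__main__':
    out_files = [f for f in os.listdir(os.curdir) if f.endswith('.out')]
    for batch in batched(out_files, UNIT_LIMIT):
        # Each batch runs in its own process; joining waits for it to finish
        # before the next batch starts, so batches never overlap.
        worker = multiprocessing.Process(target=convert_batch, args=(batch,))
        worker.start()
        worker.join()
```

Whether this helps depends on the unit pool being per-process rather than system-wide; restarting the kernel between batches would have the same effect manually.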