Ask Your Question

TBOE's profile - activity

2019-10-16 02:57:48 -0500 received badge  Famous Question (source)
2019-10-16 02:44:08 -0500 commented answer Dyntools: Dealing with many .OUT files when converting to .csv

A few more comments: I'm running PSSE v34.4 from Spyder using Python 2.7. The .OUT file sizes shouldn't be a problem, as they are only 20 kB each. For now I'm good with a workaround, but I might raise a ticket with Siemens. If I find out more, I'll post it here.

2019-10-16 02:37:35 -0500 commented answer Dyntools: Dealing with many .OUT files when converting to .csv

Thanks for this extensive response. I have tested your code on a large batch of files and was able to process 264 files using 4 processes, i.e. 66 files per process. After those, the previously mentioned error message popped up again. So the problem persists, but this is a nice workaround!

2019-10-15 01:15:57 -0500 commented question Dyntools: Dealing with many .OUT files when converting to .csv

Also, it seems like the problems start after 64 (2^6) files, which suggests that some kind of register fills up, though I have also seen 66 files being processed. (I'll edit this into the question.)

2019-10-15 01:11:43 -0500 commented question Dyntools: Dealing with many .OUT files when converting to .csv

Thanks for the suggestion. I have given it a try, but I see similar results. The problem arises when reading the .out file. It seems like some process keeps track of the .out files that have been read, even after the object is deleted.

2019-10-13 19:36:55 -0500 received badge  Notable Question (source)
2019-10-13 19:36:55 -0500 received badge  Popular Question (source)
2019-10-11 07:24:59 -0500 received badge  Editor (source)
2019-10-11 07:23:54 -0500 asked a question Dyntools: Dealing with many .OUT files when converting to .csv

Hi everybody, I'm running into errors when dealing with large numbers of .OUT files. I'm trying to convert many .OUT files to .csv files for further processing in Python. I use the code below, and I'm working in Spyder.

At a certain point, the following error messages start popping up:

All file units in use. genroe_con01_02.out (RWFIL)

(genroe_con01_02.out is one of the filenames.) The error message then pops up for all subsequent files. It seems like the problems start after 64 (2^6) files, which suggests that some kind of register is filling up, though I have also seen 66 files being processed. Removing all variables and restarting the program does not do the trick; the only option is to restart the kernel.

edit: I have also noticed that after removing all variables and restarting, I'm unable to load other PSSE files, receiving the error messages below. Again, a kernel restart is required.

All file units in use. PSSEcasedata.cnv (OpnApplFil/OPNPTI)

All file units in use. PSSEcasedata.snp (OpnApplFil/OPNPTI)

Has anybody come across this problem before?

import os
import dyntools  # PSSE's dynamics output-file module

# Convert every .out file in the current directory to .csv.
files = os.listdir(os.curdir)
for filename in files:
    if filename.endswith(".out"):
        output_obj = dyntools.CHNF(filename)
        output_obj.csvout(channels='', csvfile='', outfile='', ntimestep=1)
        del output_obj  # does not release PSSE's internal file unit
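
Since the file-unit table only resets with the interpreter, one workaround along the lines discussed in the comments is to split the files into batches below the ~64-file limit and convert each batch in a fresh Python process. This is a minimal sketch, not tested against PSSE itself; `convert_batch.py` is an assumed helper script that runs the dyntools.CHNF/csvout loop over the filenames passed as its arguments:

```python
import subprocess
import sys


def chunked(items, size):
    """Yield successive batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def convert_in_batches(out_files, batch_size=60, worker="convert_batch.py"):
    """Spawn one fresh interpreter per batch, so each batch starts
    with a clean PSSE file-unit table."""
    for batch in chunked(out_files, batch_size):
        subprocess.check_call([sys.executable, worker] + list(batch))
```

Keeping `batch_size` safely under 64 should avoid ever hitting the "All file units in use" error, at the cost of the PSSE start-up overhead once per batch.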