Loading successive large .out files with dyntools causes incomplete data
Hello all, it's been a while! I have a suite of tools, including one for batch-analyzing .out files, and I'm running into a peculiar issue. The tool lets me plot multiple channels from multiple .out files, but after one large .out file is read, reading the second one prints this message:
Exception occured while reading channel data:
(the program keeps running) and the second .out file returns a partially complete dictionary of data: the first 1200 channels and 'time' are populated lists as expected, but the remaining channels are empty lists.
This problem is not tied to a specific .out file: say I have two large .out files called out1 and out2, and I try three different scripts (a minimal sketch of the loop follows the list):
script 1: for o in [out1]: ...get_data()         # this works
script 2: for o in [out2]: ...get_data()         # this works
script 3: for o in [out1, out2]: ...get_data()   # data is returned fine for out1, but corrupt for out2
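For context, the core of the batch loop is essentially the following (the file names are placeholders, and this is the plain full-read call to get_data() as I understand the CHNF API):

    # Minimal sketch of the batch loop (Python 2.7); the .out paths are placeholders.
    import dyntools

    all_data = {}
    for o in ['out1.out', 'out2.out']:
        chnf = dyntools.CHNF(o)                          # open the .out file
        short_title, chanid, chandata = chnf.get_data()  # chandata maps channel number -> list, plus 'time'
        all_data[o] = chandata
        del chnf                                         # release the CHNF object before the next file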
I tried a few different things, like deleting the object that dyntools.CHNF creates, and deleting and reloading the dyntools module. Neither of these worked.
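Roughly, the "delete and reload" attempt looked like this (paths are placeholders; reload() is the Python 2 builtin):

    # Sketch of the clean-up attempt between reads (Python 2); paths are placeholders.
    import sys
    import dyntools

    chnf = dyntools.CHNF('out1.out')
    _, chanid, chandata = chnf.get_data()

    del chnf                            # drop the CHNF instance
    reload(dyntools)                    # re-import the module in place
    # more drastic variant: forget the module entirely and import it fresh
    del dyntools
    sys.modules.pop('dyntools', None)
    import dyntools

    chnf2 = dyntools.CHNF('out2.out')   # the second read still comes back incomplete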
I also tried calling gc.collect() and can see the memory of python.exe dropping, but the failure still occurs. Further, I tried running the load in a multiprocessing.Process(), and of course encountered a memory error (it was also painfully slow). The other option I tried was grabbing chunks of data with .get_data(channels=...), but I hit the same problem.
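For reference, the process-isolation attempt was along these lines (paths are placeholders; pickling the full channel dictionary back through the queue is presumably where the memory error and the slowness come from):

    # Rough sketch of the process-isolation attempt (Python 2.7); paths are placeholders.
    import multiprocessing

    def load_out(path, queue):
        # runs in a child process, so all memory is released when it exits
        import dyntools
        chnf = dyntools.CHNF(path)
        _, chanid, chandata = chnf.get_data()
        queue.put(chandata)              # the full channel dict is pickled back to the parent

    if __name__ == '__main__':
        results = {}
        for o in ['out1.out', 'out2.out']:
            q = multiprocessing.Queue()
            p = multiprocessing.Process(target=load_out, args=(o, q))
            p.start()
            results[o] = q.get()         # read before join() so the queue can drain
            p.join()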
I can load a 50 MB .out file any number of times, but a 150 MB file fails the second time I load it.
I'm currently running Python 2.7 (unfortunately, my colleagues are all still stuck on it).