
Help Refining Multiple ACCC Excel Report

asked 2013-07-20 19:42:32 -0600 by knucklehead, updated 2017-07-18 20:16:29 -0600

I determined that the opening and closing of pssarrays.accc_solution() for each contingency across multiple cases takes a lot of time. I even had a memory-leak problem when analyzing N-2 contingencies. I think version 34 has resolved the memory leak, but I still need to break the analysis into 'chunks' based on the number of cases, monitored elements, and contingencies to keep from hitting the 2GB limit per process. I ultimately used this logic:

from math import ceil

GroupFactor = 10                # Split very large analyses into smaller chunks to avoid memory limits
GFactor = GroupFactor * 1e6     # Decrease GroupFactor for more groups (smaller steps)
numberofcons = len(alllabels)   # How many contingencies will be processed
NumBuses = sum(len(casebuses[x]) for x in xrange(numcases))        # Monitored buses across all cases
NumBranches = sum(len(casebranches[x]) for x in xrange(numcases))  # Monitored branches across all cases
numgroups = int(ceil(1.0 * numberofcons * (NumBuses + NumBranches) / GFactor))  # Number of chunks
groupsize = int(ceil(1.0 * numberofcons / numgroups))              # Contingencies per chunk
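
For reference, a sketch of how these groups might then be consumed (the slicing loop below is my assumption, not part of the original script):

# Hypothetical driver loop: run and report each chunk of contingency labels
# separately so a single pass never approaches the 2GB process limit.
for g in xrange(numgroups):
    chunk = alllabels[g * groupsize:(g + 1) * groupsize]
    # ... solve and report ACCC results for just these contingencies ...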


3 answers


answered 2013-08-05 14:35:08 -0600 by chip

I ran the tool on an accc file I had lying around, and here's the profile dump:

accc_report.profile% sort cumtime
accc_report.profile% stats 5
Mon Aug 05 14:07:37 2013    accc_report.profile

         44061406 function calls (44061393 primitive calls) in 160.941 seconds

   Ordered by: cumulative time
   List reduced from 2713 to 5 due to restriction <5>

   ncalls  tottime  percall  cumtime  percall filename:lineno(function)
        1   64.068   64.068  161.037  161.037 accc_report.py:15(<module>)
  5786418   47.012    0.000   47.012    0.000 {method 'index' of 'tuple' objects}
     4802    2.327    0.000   25.199    0.005 .\pssarrays.py:1239(accc_solution)
     1200    0.646    0.001   11.804    0.010 accc_report.py:544(GetN1results)
        1    0.048    0.048   10.357   10.357 accc_report.py:517(BuildMasterConlist)

Things I noticed:

  • Most of the time is spent on the 5786418 calls to {method 'index' of 'tuple' objects}, at 47 sec.
  • Actually getting the data with pssarrays.accc_solution() is quicker than I remembered, at 25 sec.
  • excelpy accounts for only 7 sec, which may be completely inaccurate: Excel via COM threw some warnings when writing the xls file. I just grabbed a random accc file, so the comparisons were probably meaningless.

I'm not sure what is hammering {method 'index' of 'tuple' objects}. A small refactor to break some of the main loop into functions would give us hints, or you could figure it out just by staring at the code for a while.
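
If it turns out to be tuple.index() called inside the main loop, a dict built once outside the loop usually removes that cost. A minimal sketch with made-up names (the real tuples would come from the accc result arrays):

# Hypothetical monitored-bus tuple that gets searched repeatedly.
buses = (101, 102, 205, 310)

# Slow pattern: O(n) scan per lookup, millions of times.
# row = buses.index(bus)

# Faster: build the position map once, then every lookup is O(1).
bus_row = {bus: i for i, bus in enumerate(buses)}
row = bus_row[205]   # -> 2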


Comments

@knucklehead I agree with @chip. See how 3/4 of your time is spent in the module, but not inside any function? Chunk your program up into smaller functions; it will improve readability and also give you a better idea of where time is being wasted.

JervisW ( 2013-08-05 21:42:19 -0600 )

answered 2013-08-09 08:49:18 -0600 by SC, updated 2013-08-09 08:53:49 -0600

I can't find the excelpy module in any version of my PSSE. Even in PSSE 33, all I found were excelpy.pyc, poutexcel.py, pvexcel.py, and qv_excel.py.


Comments


excelpy.pyc is the excelpy module. It is common for software to ship a compiled .pyc instead of the source .py file.
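
For example, with only the compiled file present, the import works the same way, since Python falls back to the bytecode when there is no .py source:

# excelpy.pyc on the module search path is enough; no excelpy.py needed.
import excelpy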

chip ( 2013-08-09 11:34:09 -0600 )

answered 2013-07-31 02:59:14 -0600 by JervisW, updated 2013-08-04 17:56:17 -0600

Hi there,

I haven't run your program, but here are my thoughts on what would be taking so long:

  • Communication with Excel over COM is very slow

The less you communicate with Excel, the faster your program will run. There is a very good XlsxWriter project that writes .xlsx files without needing to open Microsoft Excel. It will run a bit faster than what you have here:

https://xlsxwriter.readthedocs.org/en/latest/index.html
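
A minimal sketch of what XlsxWriter usage looks like (the file name and row contents are made up):

import xlsxwriter

# Write the workbook directly to disk; Excel itself is never launched.
workbook = xlsxwriter.Workbook('accc_report.xlsx')
worksheet = workbook.add_worksheet('Violations')

# One write_row() call per row keeps the call count low.
worksheet.write_row(0, 0, ['Contingency', 'Bus', 'Voltage (pu)'])
worksheet.write_row(1, 0, ['SAMPLE-CON', 205, 0.92])

workbook.close()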

Edit:

The best way to check what is eating up the time in your program is to profile it. Profiling is very easy. Assuming your script file is named accc_multiple_reports.py:

python -m cProfile -o output.profile accc_multiple_reports.py
python -m pstats output.profile

The first line runs Python under the profiling tool, which measures how long every single function takes to run. The second line invokes pstats, a tool that reads the profiling results. Once you are in the profiling tool:

output.profile% sort time
output.profile% stats 10

Here output.profile% is just the prompt, kind of like >>> at the Python command line. stats 10 prints the 10 worst functions in your program, sorted by the amount of time spent in each.

You will most likely find that the 80/20 rule holds true here: 80% of your time is spent in 20% of your code. Another interesting way to sort and show the results is by cumulative time:

output.profile% sort cumulative
output.profile% stats 10

You can use both cumulative and time sorting to work out which are the worst functions in your program. To exit just type quit.
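
You can also drive pstats from a script instead of the interactive prompt; a small sketch reading the output.profile file produced above:

import pstats

# Load the data written by: python -m cProfile -o output.profile ...
stats = pstats.Stats('output.profile')

# Equivalent to 'sort cumulative' then 'stats 10' at the prompt.
stats.sort_stats('cumulative').print_stats(10)

# And the internal-time view ('sort time' then 'stats 10').
stats.sort_stats('time').print_stats(10)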

This will definitely tell you what is slow in your program, and will do so very quickly, much faster than guessing. Next you can tell me what's slow in the program :)

