
parallel dynamic simulations using python and PSS/E

asked 2013-11-15 14:22:37 -0500 by wassup_doc

Hello all,

If "best practice" types of questions are not allowed I can remove this question, but I am interested to see if there is anyone that has experience running parallel dynamic simulations in PSS/E using python.

My questions, then, are: a) is this possible, and b) what would be the most "Pythonic" way to run this type of simulation?

Here is some pseudo-code of what I am trying to do:

**python instance 1:**
1. initialize PSS/E
2. load working case and dynamic simulation models/settings
3. psspy.run(tpause = 1 sec)
4. psspy.branch_contingency()  # simulate a fault on a heavily loaded line causing a contingency
5. use the subprocess module to start 2 new instances of Python/PSS/E  # ??

**python/PSSE instance 2**
1. reclose the line at 2 sec
2. continue the simulation: psspy.run(tpause = 5 sec)
3. send results back to instance 1

**python/PSSE instance 3**
1. reclose the line at 3 sec
2. continue the simulation: psspy.run(tpause = 5 sec)
3. send results back to instance 1

**python instance 1**
6. visualize the output of instance 2 and instance 3

Is it possible to create different instances of psspy within one Python script and communicate with different PSS/E processes? How would the API know which process to issue commands to?

The obvious approach would be to serialize this process (eliminating the need for multiple processes) by saving the working case at time 1, running the first contingency, saving, then running the second contingency, and so on. However, that would lead to huge computation times: this simple example of opening and reclosing a line will hopefully be applied, en masse, with many different contingencies and RASs.
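To make the idea concrete, here is a rough sketch of the fan-out I have in mind using the standard-library multiprocessing module. The psspy calls are only placeholders (the exact signatures would come from the API manual), and the snapshot and output file names are made up:

```python
# Rough sketch only: the psspy calls are placeholders, not exact signatures.
# Assumes PSS/E's PSSBIN folder is already on the Python path.
import multiprocessing

RECLOSE_TIMES = [2.0, 3.0]   # one child process per reclose scenario

def run_reclose_case(reclose_time, result_queue):
    """Resume from the state saved at t = 1 s, reclose the line, run to 5 s."""
    import psspy                      # each child process gets its own PSS/E instance
    psspy.psseinit(50000)
    # load the case/snapshot saved just after the fault at t = 1 s,
    # run to reclose_time, close the branch, then run to 5 s
    # (exact psspy calls omitted -- see the API manual)
    result_queue.put((reclose_time, 'reclose_%.1fs.out' % reclose_time))

if __name__ == '__main__':
    results_q = multiprocessing.Queue()
    workers = [multiprocessing.Process(target=run_reclose_case, args=(t, results_q))
               for t in RECLOSE_TIMES]
    for w in workers:
        w.start()
    results = [results_q.get() for _ in RECLOSE_TIMES]   # instances 2 and 3 report back
    for w in workers:
        w.join()
    # "instance 1" would now plot the .out files listed in results
```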

Thanks for any suggestions on how to get started.


3 answers


answered 2013-11-20 21:17:45 -0500 by jconto

Have you considered re-designing the overall study to perform parallel dynamic runs by assigning an equal number of full dynamic runs to each available CPU? For large networks, hundreds of runs are required to assess system reliability. These runs use the same sequence of PSSe activities but test different faults (dynamic events). On a multi-CPU PC, several PSSe instances can run in separate folders (to avoid run conflicts), each handling an equal share of the dynamic events. For example, on a 4-CPU PC, each PSSe instance could run 250 events 'in parallel' for a grand total of 1000 simulations.
For a large number of system studies, it is good practice to set up the 'parallel' runs for overnight or weekend processing.
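As a rough sketch of that split (the load_event_list and run_dynamic_event helpers are placeholders for your own PSSe driver code, and the folder names are arbitrary):

```python
# Sketch: divide a list of dynamic events evenly across N PSSe instances,
# each working in its own folder to avoid run conflicts.
import os
import multiprocessing

def run_event_batch(batch_id, events):
    work_dir = os.path.abspath('run_%02d' % batch_id)
    if not os.path.isdir(work_dir):
        os.makedirs(work_dir)          # one folder per PSSe instance
    os.chdir(work_dir)
    import psspy                       # PSS/E is initialized once per process
    psspy.psseinit(50000)
    for event in events:
        run_dynamic_event(event)       # placeholder: load case, apply fault, run, save .out

if __name__ == '__main__':
    events = load_event_list()         # placeholder: e.g. 1000 contingencies
    n_cpu = multiprocessing.cpu_count()
    batches = [events[i::n_cpu] for i in range(n_cpu)]
    jobs = [multiprocessing.Process(target=run_event_batch, args=(i, batch))
            for i, batch in enumerate(batches)]
    for job in jobs:
        job.start()
    for job in jobs:
        job.join()
```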


Comments

@jconto -- Good thought. Unfortunately, in my scenario processes must be started mid-simulation and carried out from there. This is a good approach to keep in mind for large stability analyses of a single topology, though.

wassup_doc ( 2013-11-21 22:36:53 -0500 )

I have written a Python code ["https://drive.google.com/open?id=0B7uS9L2Woq_7YzYzcGhXT2VQYXc&authuser=0"] that allows running as many instances of PSSe as there are CPUs in a PC; it runs PSSe in parallel at the process level. This tool is suitable for repetitive studies like dynamic fault studies.

jconto ( 2015-05-13 10:05:52 -0500 )

answered 2013-11-18 22:12:46 -0500 by yfwing

I believe you can try one of Python's symmetric multiprocessing modules, such as 'processing'. Run your simulation to the 1-second mark and save a snapshot. Then you can use 'Process' or 'Pool' from the 'processing' module to parallelize the rest of the simulations by spawning multiple worker processes.

Check the manual for 'processing' at https://pypi.python.org/pypi/processing
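For illustration, a minimal sketch using the standard-library multiprocessing.Pool (the successor to 'processing', which exposes a nearly identical interface); the worker body is a placeholder for your own psspy calls, and the output file names are made up:

```python
# Sketch: map a list of reclose times onto a pool of worker processes,
# each resuming from the snapshot saved at t = 1 s.
from multiprocessing import Pool

def simulate_from_snapshot(reclose_time):
    import psspy                     # separate PSS/E instance per worker process
    psspy.psseinit(50000)
    # placeholder: load the saved case/snapshot, reclose at reclose_time,
    # run to 5 s, and write a channel output file
    return 'reclose_%.1fs.out' % reclose_time

if __name__ == '__main__':
    pool = Pool(processes=2)
    out_files = pool.map(simulate_from_snapshot, [2.0, 3.0])
    pool.close()
    pool.join()
    # out_files now lists one .out file per reclose scenario
```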


answered 2013-11-20 17:55:48 -0500 by EBahr (updated 2013-11-20 17:59:42 -0500)

I have had good luck with the multiprocessing module and use it to run 5 simulations at a time. I can get it to max out most of my processors, compared to just one processor with a single simulation. It is available for download for Python 2.5 (PSSE 32) and comes standard with Python 2.7 (PSSE 33). You can then use multiprocessing.Pipe if you wish to send data between the processes. One suggestion I have is to run each simulation in its own temporary directory and make sure to change the working directory to that directory. I ran into a few cases where two instances were trying to access the same temporary file and it crashed.
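A rough sketch of that pattern (the dynamics calls themselves are omitted; only the temporary-directory and Pipe handling is shown, and the scenario names and output file name are made up):

```python
# Sketch: one temporary working directory per simulation, plus a Pipe
# so each child can send its results back to the parent.
import os
import tempfile
import multiprocessing

def worker(conn, scenario):
    work_dir = tempfile.mkdtemp(prefix='psse_')
    os.chdir(work_dir)               # keep PSS/E temp files isolated per run
    import psspy
    psspy.psseinit(50000)
    # placeholder: load case, apply the scenario, run the dynamics, write output
    conn.send({'scenario': scenario, 'out_file': os.path.join(work_dir, 'results.out')})
    conn.close()

if __name__ == '__main__':
    scenarios = ['reclose at 2 s', 'reclose at 3 s']
    jobs = []
    for s in scenarios:
        parent_conn, child_conn = multiprocessing.Pipe()
        p = multiprocessing.Process(target=worker, args=(child_conn, s))
        p.start()
        jobs.append((p, parent_conn))
    results = [conn.recv() for (p, conn) in jobs]   # blocks until each child reports
    for (p, conn) in jobs:
        p.join()
    # 'results' holds one dict per scenario with the path to its output file
```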


Comments

@EBahr -- Great, I will look into the multiprocessing module and will try to post back when I get something figured out. Any comments on the difference in usage between this module and the 'processing' module that @yfwing suggested?

wassup_doc ( 2013-11-21 22:40:04 -0500 )

@rich It is probably a very similar module, but the good thing about the `multiprocessing` module is that it is now built into the standard Python library, so I would expect it to be better supported.

EBahr ( 2013-11-22 10:04:02 -0500 )

@EBahr, I've been using the multiprocessing module to run several dynamic studies in parallel without issue. Was the crash you mentioned caused by accessing some internal PSS/E temporary file, or are you referring to the *.out files we select at the beginning of the simulation?

liamv ( 2013-12-05 04:54:23 -0500 )

New posting at: "Python Parallel Dynamics Simulations"

jconto ( 2015-04-17 12:00:30 -0500 )
