Abaqus: merging parts with Python, and parallelizing a nested loop that drives Abaqus from a Python script


I have the nested loops below. How can I parallelize the outer loop so that it is distributed across 4 simultaneous runs, and wait for all 4 runs to complete before moving on with the rest of the script?

for r in range(4):
    for k in range(r * nAnalysis // 4, (r + 1) * nAnalysis // 4):  # // keeps the bounds integral in Python 3
        # - Write Abaqus INP file - #
        writeABQfile(ppos, props, totalTime[k], recInt[k], inpFiles[k], i, lineNum[k], aPath[k])
        # - Delete LCK file to enable another analysis - #
        delFile(aPath[k] + "/" + inpFiles[k] + ".lck")
        # - Run analysis - #
        runABQfile(inpFiles[k], aPath[k])

I tried using multiprocessing.Pool as below, but it never gets into the function:

def parRunABQfiles(nA, nP, r, ppos, prop0, prop1, totalTime2Run_, recIntervals_, inpFiles_, i, lineNumbers_, aPath_):
    import os
    from time import sleep
    from auxFunctions import writeABQfile, runABQfile
    print("I am Here")
    for k in range(r * nA // nP, (r + 1) * nA // nP):
        # - Write Abaqus INP file - #
        writeABQfile(ppos, prop0, prop1, totalTime2Run_, recIntervals_, inpFiles_, i, lineNumbers_, aPath_)
        # - Delete LCK file to enable another analysis - #
        delFile(aPath_ + "/" + inpFiles_[k] + ".lck")
        # - Run analysis - #
        runABQfile(inpFiles_, aPath_)
        # - Make sure the analysis is not bypassed - #
        while os.path.isfile(aPath_ + "/" + inpFiles_[k] + ".lck"):
            sleep(0.1)
    return k

results = zip(*pool.map(parRunABQfiles, range(0, 4, 1)))

runABQfile is just a subprocess.call to an sh script that submits the Abaqus job:

def runABQfile(inpFile, path):
    import subprocess
    prcStr1 = 'sbatch ' + path + '/runJob.sh'
    # submit the job script; call blocks only until sbatch itself returns
    process = subprocess.call(prcStr1, stdin=None, stdout=None, stderr=None, shell=True)
    return
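Since subprocess.call returns the command's exit status, the submission step can be made to fail loudly instead of silently continuing. The run_job helper and throwaway script below are hypothetical stand-ins for runABQfile/runJob.sh, not part of the original post:

```python
import os
import subprocess
import tempfile

def run_job(script_path):
    # subprocess.call blocks until the command exits and returns its status;
    # 0 conventionally means success (for sbatch, a successful submission).
    rc = subprocess.call(["sh", script_path])
    if rc != 0:
        raise RuntimeError("job script %s exited with status %d" % (script_path, rc))
    return rc

if __name__ == "__main__":
    # Demo with a trivial stand-in for runJob.sh.
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write("exit 0\n")
    demo = f.name
    print(run_job(demo))  # prints 0
    os.remove(demo)
```

Note that a zero status from sbatch only means the submission succeeded, not that the Abaqus job finished, which is presumably why the question polls the .lck file afterwards.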

No errors show up, so I am not sure why it never gets in there. I know it doesn't because writeABQfile never writes the input file. The question again is:

How can I parallelize the outer loop so that it is distributed across 4 simultaneous runs, and wait for all 4 runs to complete before moving on with the rest of the script?

Solution

Use the concurrent.futures module if multiprocessing is what you want.

from concurrent.futures import ProcessPoolExecutor

def each(r):
    for k in range(r * nAnalysis // 4, (r + 1) * nAnalysis // 4):  # // keeps the bounds integral in Python 3
        writeABQfile(ppos, props, totalTime[k], recInt[k], inpFiles[k], i, lineNum[k], aPath[k])
        delFile(aPath[k] + "/" + inpFiles[k] + ".lck")
        runABQfile(inpFiles[k], aPath[k])

with ProcessPoolExecutor(max_workers=4) as executor:
    output = executor.map(each, range(4))  # returns an iterable; leaving the with-block waits for all workers

If you just want to "do" stuff rather than "produce" results, check out the as_completed function from the same module. There are direct examples in the docs.

