This article describes how to wait for the results of subprocesses using asyncio. It should be a useful reference for anyone facing the same problem; follow along below to learn more.

Problem Description

My Python script contains a loop that uses subprocess to run commands outside the script. Each subprocess is independent. I listen for the returned message in case there is an error; I cannot ignore the result of the subprocess. Here is the script without asyncio (I have replaced my computationally expensive call with sleep):

from subprocess import PIPE  # https://docs.python.org/3/library/subprocess.html
import subprocess

def go_do_something(index: int) -> None:
    """
    This function takes a long time
    Nothing is returned
    Each instance is independent
    """
    process = subprocess.run(["sleep","2"],stdout=PIPE,stderr=PIPE,timeout=20)
    stdout = process.stdout.decode("utf-8")
    stderr = process.stderr.decode("utf-8")
    if "error" in stderr:
        print("error for "+str(index))
    return

def my_long_func(val: int) -> None:
    """
    This function contains a loop
    Each iteration of the loop calls a function
    Nothing is returned
    """
    for index in range(val):
        print("index = "+str(index))
        go_do_something(index)

# run the script
my_long_func(3) # launch three tasks

I think I could use asyncio to speed this up, since the Python script is just waiting for the external subprocess to finish. I think threading or multiprocessing is unnecessary, although either could also lead to faster execution. Using a task queue (e.g., Celery) is another option.
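For comparison, the loop above could also be parallelized with a thread pool; this is a minimal sketch assuming the same sleep-based workload, since each worker thread blocks on its own subprocess.run call while the others keep running:

```python
from concurrent.futures import ThreadPoolExecutor
from subprocess import PIPE
import subprocess

def go_do_something(index: int) -> None:
    # blocking call, but each call runs in its own worker thread
    process = subprocess.run(["sleep", "2"], stdout=PIPE, stderr=PIPE, timeout=20)
    if "error" in process.stderr.decode("utf-8"):
        print("error for " + str(index))

def my_long_func(val: int) -> None:
    # with val workers, all subprocesses run concurrently,
    # so total time is roughly one task's duration
    with ThreadPoolExecutor(max_workers=val) as pool:
        list(pool.map(go_do_something, range(val)))

my_long_func(3)  # launch three tasks concurrently
```

With three workers, the three 2-second sleeps should complete in roughly 2 seconds overall rather than 6.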

I tried to implement an asyncio approach, but something is missing, because the following attempt does not change the overall execution time:

import asyncio
from subprocess import PIPE  # https://docs.python.org/3/library/subprocess.html
import subprocess


async def go_do_something(index: int) -> None:
    """
    This function takes a long time
    Nothing is returned
    Each instance is independent
    """
    process = subprocess.run(["sleep","2"],stdout=PIPE,stderr=PIPE,timeout=20)
    stdout = process.stdout.decode("utf-8")
    stderr = process.stderr.decode("utf-8")
    if "error" in stderr:
        print("error for "+str(index))
    return

def my_long_func(val: int) -> None:
    """
    This function contains a loop
    Each iteration of the loop calls a function
    Nothing is returned
    """
    # https://docs.python.org/3/library/asyncio-eventloop.html
    loop = asyncio.get_event_loop()
    tasks = []
    for index in range(val):
        task = go_do_something(index)
        tasks.append(task)
    # https://docs.python.org/3/library/asyncio-task.html
    tasks = asyncio.gather(*tasks)
    loop.run_until_complete(tasks)
    loop.close()
    return

my_long_func(3) # launch three tasks
If I want to monitor the output of each subprocess as it runs, rather than waiting for each one to finish, can I benefit from asyncio? Or does this situation call for something like multiprocessing or Celery?

Recommended Answer

Try executing the commands with asyncio instead of subprocess.

Define a run() function:

import asyncio

async def run(cmd: str):
    proc = await asyncio.create_subprocess_shell(
        cmd,
        stderr=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE
    )

    stdout, stderr = await proc.communicate()

    print(f'[{cmd!r} exited with {proc.returncode}]')
    if stdout:
        print(f'[stdout]\n{stdout.decode()}')
    if stderr:
        print(f'[stderr]\n{stderr.decode()}')

Then you can call it like any other async function:

asyncio.run(run('sleep 2'))

#=>

['sleep 2' exited with 0]

This example is taken from the official documentation. It is also available here.

That concludes this article on waiting for the results of subprocesses with asyncio. We hope the recommended answer is helpful, and thank you for your support!
