SIGTERM handling in pebble.ProcessPool #137

Open
xiaodongsg opened this issue Aug 29, 2024 · 0 comments
xiaodongsg commented Aug 29, 2024

With the following code:

from pebble import ProcessPool
import concurrent.futures
import asyncio
import time


def blocking_sleep():
    for i in range(2):
        print('sleeping for 1 second')
        time.sleep(1)
    print("finished")

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPool(max_workers=5) as pool:
        tasks = []
        tasks.append(loop.run_in_executor(
            pool,
            blocking_sleep,
            None
            ))

        await asyncio.gather(*tasks, return_exceptions=False)

import signal
signal.signal(signal.SIGTERM, signal.default_int_handler)   # <---- signal handler installed

asyncio.run(main())
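For context, `signal.default_int_handler` is the handler Python installs for SIGINT at startup: it raises `KeyboardInterrupt`. Wiring it to SIGTERM, as the last lines above do, means any process running that handler will raise `KeyboardInterrupt` when a SIGTERM arrives. A minimal POSIX-only sketch of that behavior:

```python
import os
import signal

# default_int_handler is Python's stock SIGINT handler; installing it
# for SIGTERM makes SIGTERM raise KeyboardInterrupt as well.
signal.signal(signal.SIGTERM, signal.default_int_handler)

caught = False
try:
    os.kill(os.getpid(), signal.SIGTERM)  # deliver SIGTERM to ourselves
except KeyboardInterrupt:
    caught = True

print(caught)  # True on POSIX systems
```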

With the signal handler installed, the code above crashes with:

Process pebble_pool_worker:
Traceback (most recent call last):
  File "some/path/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "some/path/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "some/path/lib/python3.11/site-packages/pebble/pool/process.py", line 427, in worker_process
    for task in worker_get_next_task(channel, params.max_tasks):
  File "some/path/lib/python3.11/site-packages/pebble/pool/process.py", line 443, in worker_get_next_task
    yield fetch_task(channel)
          ^^^^^^^^^^^^^^^^^^^
  File "some/path/lib/python3.11/site-packages/pebble/pool/process.py", line 455, in fetch_task
    while channel.poll():
          ^^^^^^^^^^^^^^
  File "some/path/lib/python3.11/site-packages/pebble/pool/channel.py", line 59, in unix_poll
    return bool(poll.poll(timeout))
                ^^^^^^^^^^^^^^^^^^
KeyboardInterrupt
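The worker-side `KeyboardInterrupt` makes sense once you note that child processes created with the fork start method inherit the parent's signal handlers (I'm assuming here that pebble's Unix workers are fork-started). A quick stdlib-only check that a forked child really does inherit the handler installed above:

```python
import multiprocessing as mp
import signal

def report_handler(queue):
    # Runs in the child: report which SIGTERM handler it inherited.
    queue.put(signal.getsignal(signal.SIGTERM) is signal.default_int_handler)

signal.signal(signal.SIGTERM, signal.default_int_handler)  # parent handler, as in the report

ctx = mp.get_context("fork")  # force fork so the result is platform-independent
queue = ctx.Queue()
child = ctx.Process(target=report_handler, args=(queue,))
child.start()
inherited = queue.get()
child.join()
print(inherited)  # True: the forked child inherited the parent's handler
```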

By contrast, concurrent.futures.ProcessPoolExecutor works fine with the same setup:

import concurrent.futures
import asyncio
import time


def blocking_sleep():
    for i in range(2):
        print('sleeping for 1 second')
        time.sleep(1)
    print("finished")

async def main():
    loop = asyncio.get_running_loop()
    with concurrent.futures.ProcessPoolExecutor() as pool:
        tasks = []
        tasks.append(loop.run_in_executor(
            pool,
            blocking_sleep
            ))

        await asyncio.gather(*tasks, return_exceptions=False)

import signal
signal.signal(signal.SIGTERM, signal.default_int_handler)
asyncio.run(main())

Could you please help with this?
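Until a fix lands in pebble, one possible workaround is to reset SIGTERM to its default disposition inside each worker so the inherited handler never fires there; pebble's ProcessPool accepts initializer=/initargs= for per-worker setup, so a reset function like the one below could be passed there (treat that usage as an assumption and check it against your pebble version). A stdlib-only sketch of the idea:

```python
import multiprocessing as mp
import os
import signal
import time

def reset_sigterm():
    # What a pool initializer would do in each fresh worker:
    # restore the default SIGTERM disposition so the worker just dies on SIGTERM
    # instead of raising the inherited KeyboardInterrupt.
    signal.signal(signal.SIGTERM, signal.SIG_DFL)

def worker():
    reset_sigterm()
    time.sleep(30)  # stand-in for real work

signal.signal(signal.SIGTERM, signal.default_int_handler)  # parent handler, as in the report

ctx = mp.get_context("fork")
child = ctx.Process(target=worker)
child.start()
time.sleep(0.5)                        # give the child time to install SIG_DFL
os.kill(child.pid, signal.SIGTERM)     # terminate the worker, as a pool shutdown would
child.join()
print(child.exitcode)  # -15: killed cleanly by SIGTERM, no KeyboardInterrupt traceback
```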

@noxdafox noxdafox added the bug label Sep 15, 2024
noxdafox added a commit that referenced this issue Sep 15, 2024
Avoids inheriting SIGTERM handling from parent process.

Signed-off-by: Matteo Cafasso <[email protected]>