parallelization of python code on different machines on different networks

  • You need to create a distributed queue that can be reached across different networks, such as RabbitMQ.
  • Put all your tasks in the queue.
  • Create a central worker management tool that lets you create and manage workers on Computer A and Computer B. The workers will process your tasks.
  • You also need to take care of worker availability to achieve what you described: if Computer A finishes its tasks before Computer B, Computer A should take on some of Computer B’s remaining tasks.
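The last point falls out naturally when every worker pulls from one shared queue. The sketch below illustrates the principle with Python's standard library only: a thread-safe `queue.Queue` stands in for the broker, and two threads stand in for Computer A and Computer B (the names and the squaring workload are illustrative, not from the original answer).

```python
import queue
import threading

# Conceptual sketch: in a real deployment the shared queue is a broker
# (e.g. RabbitMQ) reachable from both machines over the network.
task_queue: "queue.Queue[int]" = queue.Queue()
results = []
results_lock = threading.Lock()

def worker(name: str) -> None:
    # Each worker pulls from the same queue, so whichever worker is free
    # takes the next task -- a fast worker naturally absorbs the backlog.
    while True:
        try:
            task = task_queue.get_nowait()
        except queue.Empty:
            return  # queue drained, this worker is done
        with results_lock:
            results.append((name, task * task))
        task_queue.task_done()

# Enqueue ten tasks, then let both "machines" drain the queue together.
for n in range(10):
    task_queue.put(n)

threads = [threading.Thread(target=worker, args=(f"computer-{c}",)) for c in "AB"]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(r for _, r in results))  # squares of 0..9, regardless of which worker ran them
```

The key design point is that no task is ever assigned to a specific worker up front; assignment happens at pull time, so idle capacity is used automatically.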

Luckily, Python has an excellent library, Celery, which lets you achieve exactly this. It is well documented and has a large, diverse community of users and contributors. You just need to set up a broker (the queue) and configure Celery.
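A minimal Celery setup might look like the following sketch. The broker URL, module name, and task body are placeholder assumptions, not part of the original answer; it assumes a RabbitMQ instance reachable from both machines.

```python
# tasks.py -- illustrative sketch, assuming a RabbitMQ broker reachable
# from both Computer A and Computer B (the URL below is a placeholder).
from celery import Celery

app = Celery("tasks", broker="amqp://user:password@broker-host:5672//")

@app.task
def process(item):
    # Replace with your real workload.
    return item * 2
```

You would then start a worker on each machine with `celery -A tasks worker --loglevel=info` and enqueue work from anywhere with `process.delay(some_item)`. Because both workers consume from the same broker queue, an idle worker automatically picks up whatever tasks remain.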

Celery has many other features you can adopt as your requirements grow: monitoring, job scheduling, and Celery canvas (for composing workflows), to name a few.

https://docs.celeryproject.org/en/stable/getting-started/introduction.html
https://medium.com/swlh/python-developers-celery-is-a-must-learn-technology-heres-how-to-get-started-578f5d63fab3
