The Python Oracle

Celery as networked pub/sub events

This video explains
Celery as networked pub/sub events

--

Become part of the top 3% of the developers by applying to Toptal
https://topt.al/25cXVn

--

Music by Eric Matyas
https://www.soundimage.org
Track title: Peaceful Mind

--

Chapters
00:00 Question
01:35 Accepted answer (Score 5)
02:45 Answer 2 (Score 1)
03:32 Answer 3 (Score 1)
03:57 Thank you

--

Full question
https://stackoverflow.com/questions/3106...

Accepted answer links:
[Redis]: https://pypi.org/project/redis/
[Docker image]: https://hub.docker.com/_/redis
[https://github.com/moritz-biersack/simpl...]: https://github.com/moritz-biersack/simpl...

Answer 2 links:
[celery-pubsub]: https://pypi.org/project/celery-pubsub/

Answer 3 links:
[link]: https://www.cloudamqp.com/blog/2015-05-2...

--

Content licensed under CC BY-SA
https://meta.stackexchange.com/help/lice...

--

Tags
#python #celery #publishsubscribe

#avk47



ACCEPTED ANSWER

Score 7


I would suggest skipping Celery and using Redis directly with its pub/sub functionality. You can spin up Redis, for example, by running the official Docker image. Then, whenever something is detected on your input machine, you publish a message to a channel. On your output machine you subscribe to that channel and act on the events.

For example, your input machine could use something like this:

import redis

def publish(message):
    # Connect to the Redis server (reachable here under the hostname "redis")
    r = redis.Redis(host="redis")
    # Broadcast the message to every subscriber of the channel
    r.publish("test-channel", message)

And then on the output side:

import time
import redis

def main():
    # decode_responses=True returns strings instead of raw bytes
    r = redis.Redis(host="redis", decode_responses=True)
    p = r.pubsub(ignore_subscribe_messages=True)
    p.subscribe("test-channel")

    # Poll for new messages; get_message() returns None when nothing arrived
    while True:
        message = p.get_message()
        if message:
            print(message.get("data", ""))
            # Do more things...
        time.sleep(0.001)

In this way you can send plain text or JSON data between the input and output machines.
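Since Redis pub/sub carries plain strings (or bytes), JSON payloads just need to be serialized before publishing and parsed on receipt. A minimal sketch of that, with illustrative helper names (`encode_event`/`decode_event` are not part of any library):

```python
import json

def encode_event(event_type, payload):
    # Serialize an event dict to a JSON string suitable for r.publish(...)
    return json.dumps({"type": event_type, "payload": payload})

def decode_event(message_data):
    # Parse the JSON string received in message["data"] back into a dict
    return json.loads(message_data)

# Input side would call publish(encode_event(...)); output side
# would call decode_event(message["data"]) inside the receive loop.
event = encode_event("detection", {"sensor": 3, "value": 0.87})
print(decode_event(event)["payload"]["sensor"])  # -> 3
```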

Find a sample implementation here: https://github.com/moritz-biersack/simple-async-pub-sub




ANSWER 2

Score 1


Celery is just a task manager.

RabbitMQ is your message broker. I would implement a RabbitMQ channel between your two machines and use publish/subscribe to manage your input.

Maybe this link can help you.




ANSWER 3

Score 1


I was asking myself a similar question and found that the Python package celery-pubsub brings pub/sub capabilities to Celery.

Here is an example usage from the package description:

    import celery
    import celery_pubsub
    
    @celery.task
    def my_task_1(*args, **kwargs):
        return "task 1 done"
    
    
    @celery.task
    def my_task_2(*args, **kwargs):
        return "task 2 done"
    
    
    # First, let's subscribe
    celery_pubsub.subscribe('some.topic', my_task_1)
    celery_pubsub.subscribe('some.topic', my_task_2)
    
    # Now, let's publish something
    res = celery_pubsub.publish('some.topic', data='something', value=42)
    
    # We can get the results if we want to (and if the tasks returned something)
    # But in pub/sub, usually, there's no result.
    print(res.get())
    
    # This will get nowhere, as no task subscribed to this topic
    res = celery_pubsub.publish('nowhere', data='something else', value=23)