The Python Oracle

Celery as networked pub/sub events

--------------------------------------------------
Hire the world's top talent on demand or become one of them at Toptal: https://topt.al/25cXVn
--------------------------------------------------

Music by Eric Matyas
https://www.soundimage.org
Track title: Puzzle Game 2 Looping

--

Chapters
00:00 Celery As Networked Pub/Sub Events
01:24 Answer 1 Score 1
01:42 Accepted Answer Score 7
02:34 Answer 3 Score 1
03:05 Thank you

--

Full question
https://stackoverflow.com/questions/3106...

--

Content licensed under CC BY-SA
https://meta.stackexchange.com/help/lice...

--

Tags
#python #celery #publishsubscribe

#avk47



ACCEPTED ANSWER

Score 7


I would suggest skipping Celery and using Redis directly, with its built-in pub/sub functionality. You can spin up Redis by running its Docker image, for example. Then, whenever something is detected on your input machine, you publish a message to a channel; on your output machine you subscribe to that channel and act on the events.

For example, your input machine could use something like this:

import redis

def publish(message):
    # Connect to the Redis server and broadcast on a named channel.
    r = redis.Redis(host="redis")
    r.publish("test-channel", message)

And then on the output side:

import time
import redis

def main():
    # decode_responses=True returns strings instead of raw bytes.
    r = redis.Redis(host="redis", decode_responses=True)
    p = r.pubsub(ignore_subscribe_messages=True)
    p.subscribe("test-channel")

    # Poll for new messages; get_message() returns None when idle.
    while True:
        message = p.get_message()
        if message:
            print(message.get("data", ""))
            # Do more things...
        time.sleep(0.001)

In this way you can send plain text or JSON data between the input and output machine.
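Since Redis pub/sub messages are plain strings or bytes, JSON is a convenient envelope for structured events. A minimal sketch of that idea (the function names and event fields here are illustrative, not part of the answer above):

```python
import json

def encode_event(event_type, payload):
    """Serialize an event to a JSON string suitable for r.publish()."""
    return json.dumps({"type": event_type, "payload": payload})

def decode_event(message_data):
    """Parse the "data" field of a received pub/sub message back into a dict."""
    return json.loads(message_data)

# Round trip (no Redis connection needed for the encoding itself):
raw = encode_event("motion_detected", {"camera": 2})
event = decode_event(raw)
print(event["type"], event["payload"]["camera"])  # motion_detected 2
```

On the output machine you would call `decode_event(message["data"])` inside the polling loop before acting on the event.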

Find a sample implementation here: https://github.com/moritz-biersack/simple-async-pub-sub




ANSWER 2

Score 1


Celery is just a task manager.

RabbitMQ is your message broker. I would implement a RabbitMQ channel between your two machines and use publish/subscribe to manage your input.

Maybe this link can help you




ANSWER 3

Score 1


I was asking myself a similar question and found that there is a Python package, celery-pubsub, that brings pub/sub capabilities to Celery.

Here is an example usage from the package description:

    import celery
    import celery_pubsub
    
    @celery.task
    def my_task_1(*args, **kwargs):
        return "task 1 done"
    
    
    @celery.task
    def my_task_2(*args, **kwargs):
        return "task 2 done"
    
    
    # First, let's subscribe
    celery_pubsub.subscribe('some.topic', my_task_1)
    celery_pubsub.subscribe('some.topic', my_task_2)
    
    # Now, let's publish something
    res = celery_pubsub.publish('some.topic', data='something', value=42)
    
    # We can get the results if we want to (and if the tasks returned something)
    # But in pub/sub, usually, there's no result.
    print(res.get())
    
    # This will get nowhere, as no task subscribed to this topic
    res = celery_pubsub.publish('nowhere', data='something else', value=23)