Make PuSHing the PubSubHubbub server an async task (#436, #585)

Notifying the PuSH servers had 3 problems. 

1) It was done immediately after handing the processing task to celery. So if
   celery ran in a separate process, we would notify the PuSH servers before
   the new media was processed/visible. (#436)

2) Notification code was called in submit/views.py, so submitting via the
   API never resulted in notifications. (#585)

3) If notifying the PuSH server failed, we would never retry.

The solution was to make the PuSH notification an asynchronous celery subtask.
This way: 1) it only runs once async processing has finished, 2) it lives in
the main processing code path, so even API calls result in notifications, and
3) we retry 3 times in case of failure before giving up. If celery runs in a
separate process, we will wait 2 minutes between each of the 3 retries before
abandoning the notification.
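
A minimal sketch of what such a subtask could look like, assuming celery's
shared_task decorator and the requests library; the task name, the PUSH_HUBS
placeholder, and the notification details are illustrative rather than the
exact code added here:

    from celery import shared_task
    import requests

    PUSH_HUBS = []  # assumed: hub URLs taken from the app configuration

    @shared_task(bind=True, ignore_result=True,
                 max_retries=3, default_retry_delay=120)
    def handle_push_urls(self, feed_url):
        # Ping every configured PubSubHubbub hub about the updated feed.
        for hub in PUSH_HUBS:
            try:
                requests.post(hub, data={'hub.mode': 'publish',
                                         'hub.url': feed_url})
            except requests.RequestException as exc:
                # Re-queue this task: celery waits default_retry_delay
                # (2 minutes) and gives up after max_retries (3) attempts.
                raise self.retry(exc=exc)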

The only downside is that the celery server needs internet access in order to
ping the PuSH server. If that is a problem, we need to route the task to a
special group of celery servers that does have internet access.
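
Should that become necessary, a hedged example using celery's routing
configuration (the dotted task path and the queue name are assumptions):

    # In the celery configuration: send the PuSH ping task to its own queue.
    CELERY_ROUTES = {
        'mediagoblin.notifications.handle_push_urls': {'queue': 'push'},
    }

    # Only workers that do have internet access then consume that queue,
    # e.g. started with:  celery worker -Q push
    # while the media-processing workers keep serving the default queue.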

As a side effect, I believe I removed the limitation that prevented us from
upgrading celery.

Signed-off-by: Sebastian Spaeth <Sebastian@SSpaeth.de>
Author: Sebastian Spaeth
Date:   2013-01-15 14:41:30 +01:00
parent 65969d3fb7
commit 2cfffd5ed8
4 changed files with 54 additions and 45 deletions


@@ -86,7 +86,7 @@ def post_entry(request):
     #
     # (... don't change entry after this point to avoid race
     # conditions with changes to the document via processing code)
-    run_process_media(entry)
+    run_process_media(entry, request)
     return json_response(get_entry_serializable(entry, request.urlgen))
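
The diff only shows the call-site change. A rough sketch of what
run_process_media might do with the extra request argument, assuming the
process_media task and the handle_push_urls subtask from the sketch above are
in scope; get_feed_url is a hypothetical helper:

    def run_process_media(entry, request=None):
        # Build the feed URL while we still have a request (and its urlgen)
        # at hand; both web and API submissions now pass the request in.
        feed_url = get_feed_url(entry, request) if request else None

        process_media.apply_async(
            [str(entry.id)], {},
            # only ping the PuSH hubs once processing has finished
            link=handle_push_urls.subtask((feed_url,)))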