From 3c4e3584f958d51d853472f3d5e708b30f852384 Mon Sep 17 00:00:00 2001
From: Jorran de Wit <jorrandewit@outlook.com>
Date: Mon, 29 Oct 2018 04:50:45 +0000
Subject: [PATCH] RabbitMQ readme

---
 README.md | 15 +++++++++++----
 1 file changed, 11 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 8426528f3..405aa98b7 100644
--- a/README.md
+++ b/README.md
@@ -353,19 +353,26 @@ SHELL_PLUS_POST_IMPORTS = (
 ## Scheduled tasks
 The tasks that involve large requests from CR are supposed to run in the background. For this to work, Celery is required. The following commands assume that you are in the `scipost_v1` main folder, inside the right virtual environment.
 
-Celery depends on a broker, for which we use RabbitMQ. Start it with
+Celery depends on a broker, for which we use RabbitMQ. On macOS you can install it with Homebrew:
+
+```bash
+$ brew update
+$ brew install rabbitmq
+```
+
+To start the RabbitMQ broker:
 ```bash
-nohup rabbitmq-server > ../logs/rabbitmq.log 2>&1 &
+nohup nice rabbitmq-server > ../logs/rabbitmq.log 2>&1 &
 ```
 
 Then the Celery worker itself:
 ```bash
-nohup celery -A SciPost_v1 worker --loglevel=info -E > ../logs/celery_worker.log 2>&1 &
+nohup nice celery -A SciPost_v1 worker --loglevel=info -E > ../logs/celery_worker.log 2>&1 &
 ```
 
 And finally `beat`, which enables setting up periodic tasks:
 ```bash
-nohup celery -A SciPost_v1 beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler > ../logs/celery_beat.log 2>&1 &
+nohup nice celery -A SciPost_v1 beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler > ../logs/celery_beat.log 2>&1 &
 ```
 
 Note: on the staging server, these commands are contained in two shell scripts in the `scipoststg` home folder. Just run
--
GitLab