diff --git a/README.md b/README.md
index 8426528f32c11776663634d1a51b149e360e72c7..405aa98b76940c1be82eccc9f29c84811f730edf 100644
--- a/README.md
+++ b/README.md
@@ -353,19 +353,26 @@ SHELL_PLUS_POST_IMPORTS = (
 ## Scheduled tasks
 The tasks that involve large requests from CR are supposed to run in the background. For this to work, Celery is required. The following commands assume that you are in the `scipost_v1` main folder, inside the right virtual environment.
 
-Celery depends on a broker, for which we use RabbitMQ. Start it with
+Celery depends on a broker, for which we use RabbitMQ. On macOS, it can be installed with Homebrew:
+
+```shell
+$ brew update
+$ brew install rabbitmq
+```
+
+To start the RabbitMQ broker:
 ```bash
-nohup rabbitmq-server > ../logs/rabbitmq.log 2>&1 &
+nohup nice rabbitmq-server > ../logs/rabbitmq.log 2>&1 &
 ```
 
 Then the Celery worker itself:
 ```bash
-nohup celery -A SciPost_v1 worker --loglevel=info -E > ../logs/celery_worker.log 2>&1 &
+nohup nice celery -A SciPost_v1 worker --loglevel=info -E > ../logs/celery_worker.log 2>&1 &
 ```
 
 And finally `beat`, which enables setting up periodic tasks:
 ```bash
-nohup celery -A SciPost_v1 beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler > ../logs/celery_beat.log 2>&1 &
+nohup nice celery -A SciPost_v1 beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler > ../logs/celery_beat.log 2>&1 &
 ```
 
 Note: on the staging server, these commands are contained in two shell scripts in the `scipoststg` home folder. Just run
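
All three services here rely on the same `nohup nice … > logfile 2>&1 &` pattern: `nohup` keeps the process alive after the shell exits, `nice` lowers its scheduling priority, and the redirection sends both stdout and stderr to a log file. A minimal, self-contained sketch of that pattern (using `echo` as a hypothetical stand-in for `rabbitmq-server`/`celery`, and a local `logs/` directory instead of `../logs/`):

```shell
# Demo of the background-service pattern used above; `echo` stands in
# for the real services, so this is safe to run anywhere.
mkdir -p logs
nohup nice sh -c 'echo broker started' > logs/rabbitmq.log 2>&1 &
wait $!                  # wait for the background job so the log is complete
cat logs/rabbitmq.log    # both stdout and stderr of the service land here
```

Because the processes are detached, the log files are the first place to look if a service appears not to have started.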