what is it?

the practice of recording events that occur in a celery task, such as problems, errors, or information about current operations

common logger

best practice is to create a common logger for all tasks at the top of the module with celery’s get_task_logger

from celery.utils.log import get_task_logger
 
logger = get_task_logger(__name__)
 
@app.task  # assumes `app` is the module's Celery application instance
def add(x, y):
    logger.info('Adding %s + %s', x, y)  # lazy %-formatting is idiomatic for logging
    return x + y
  • with get_task_logger(__name__) the logger name will be the module name
  • get_task_logger also makes task_name and task_id available to the log format, which helps with troubleshooting
  • celery sets up one logger called celery.task, which is the parent of all task loggers. get_task_logger(__name__) builds this parent relationship for us, so we do not need to set up a handler or formatter in the Django logging config
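the parent relationship can be illustrated with the standard logging module alone. a minimal sketch; the dotted names below are illustrative, and get_task_logger actually sets this parent explicitly rather than via dotted names:

```python
import logging

# stdlib sketch of the hierarchy described above: a module-level logger whose
# records bubble up to the shared 'celery.task' parent logger
parent = logging.getLogger('celery.task')
child = logging.getLogger('celery.task.myapp.tasks')

# handlers and formatters set on the parent apply to the child's records
print(child.parent is parent)
```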

to customize the log format, modify worker_log_format and worker_task_log_format. NB: if the CELERY namespace prefix has been enabled, these become CELERY_WORKER_LOG_FORMAT and CELERY_WORKER_TASK_LOG_FORMAT in settings.py

CELERY_WORKER_LOG_FORMAT default:

  • "[%(asctime)s: %(levelname)s/%(processName)s] %(message)s"

CELERY_WORKER_TASK_LOG_FORMAT default:

  • "[%(asctime)s: %(levelname)s/%(processName)s] %(task_name)s[%(task_id)s]: %(message)s"
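for example, to tweak both formats in settings.py — a hypothetical sketch, assuming the CELERY_ namespace is enabled; the added %(threadName)s field is illustrative:

```python
# settings.py — hypothetical sketch; assumes the CELERY_ namespace is enabled
CELERY_WORKER_LOG_FORMAT = (
    '[%(asctime)s: %(levelname)s/%(processName)s/%(threadName)s] %(message)s'
)
CELERY_WORKER_TASK_LOG_FORMAT = (
    '[%(asctime)s: %(levelname)s/%(processName)s/%(threadName)s] '
    '%(task_name)s[%(task_id)s]: %(message)s'
)
```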

custom logging

override celery's logging setup entirely using the setup_logging signal

import logging
import logging.config

from celery.signals import setup_logging
 
@setup_logging.connect()
def on_setup_logging(**kwargs):
    logging_dict = {
        'version': 1,
        'disable_existing_loggers': False,
        'handlers': {
            'file_log': {
                'class': 'logging.FileHandler',
                'filename': 'celery.log',
            },
            'default': {
                'class': 'logging.StreamHandler',
            }
        },
        'loggers': {
            'celery': {
                'handlers': ['default', 'file_log'],
                'propagate': False,
            },
        },
        'root': {
            'handlers': ['default'],
            'level': 'INFO',
        },
    }
 
    logging.config.dictConfig(logging_dict)
 
    # display task_id and task_name in task log
    from celery.app.log import TaskFormatter
    celery_logger = logging.getLogger('celery')
    for handler in celery_logger.handlers:
        handler.setFormatter(
            TaskFormatter(
                '[%(asctime)s: %(levelname)s/%(processName)s/%(thread)d] [%(task_name)s(%(task_id)s)] %(message)s'
            )
        )
  • used TaskFormatter to make task_id and task_name available in the log format
  • can also log concurrency info, like processName and threadName; the full set of available attributes is the standard library's LogRecord attributes
  • review the celery source code for the logging details
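since processName and threadName are ordinary LogRecord attributes from the standard library, a format string using them can be exercised without a worker. a minimal stdlib check:

```python
import logging

# minimal stdlib check: processName and threadName are standard LogRecord
# attributes, so any format string using them also works outside celery
fmt = logging.Formatter('[%(levelname)s/%(processName)s/%(threadName)s] %(message)s')
record = logging.LogRecord(
    name='celery', level=logging.INFO, pathname='tasks.py', lineno=1,
    msg='hello', args=(), exc_info=None,
)
out = fmt.format(record)
print(out)
```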

disable worker_hijack_root_logger

  • set worker_hijack_root_logger to False to prevent celery from removing the root logger's handlers and resetting the root log level
  • rarely needed
settings.py
CELERY_WORKER_HIJACK_ROOT_LOGGER = False
 
LOGGING = {
    'version': 1,
    # loggers which exist when this call is made are left enabled
    'disable_existing_loggers': False,
    'handlers': {
        'file_log': {
            'class': 'logging.FileHandler',
            'filename': 'celery.log',
        },
        'default': {
            'class': 'logging.StreamHandler',
        }
    },
    'loggers': {
        'celery': {
            'handlers': ['default', 'file_log'],
            'propagate': False,
        },
    },
    'root': {
        'handlers': ['default'],
        'level': 'INFO',
    },
}
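the config above can be sanity-checked outside a worker. a minimal sketch using the same handlers and loggers; delay=True is added here only so no celery.log file is created until something is actually logged:

```python
import logging
import logging.config

# minimal sanity check of the LOGGING dict above
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'file_log': {'class': 'logging.FileHandler',
                     'filename': 'celery.log', 'delay': True},
        'default': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        'celery': {'handlers': ['default', 'file_log'], 'propagate': False},
    },
    'root': {'handlers': ['default'], 'level': 'INFO'},
}
logging.config.dictConfig(LOGGING)

celery_logger = logging.getLogger('celery')
# propagate=False keeps celery records away from the root handlers,
# so they are not emitted twice
print(celery_logger.propagate)
print(len(celery_logger.handlers))
```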

augment celery’s logger using after_setup_logger

import logging
from celery.signals import after_setup_logger
 
@after_setup_logger.connect()
def on_after_setup_logger(logger, **kwargs):
    # reuse the formatter celery already configured on its first handler
    formatter = logger.handlers[0].formatter
    file_handler = logging.FileHandler('celery.log')
    file_handler.setFormatter(formatter)
    logger.addHandler(file_handler)
  • creates and writes to celery.log in the directory the worker was started from (typically the project root)
  • to change the log format (like adding info about the thread), do it in the signal handler or by setting worker_log_format and worker_task_log_format in settings.py
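the formatter-copying trick in the receiver above can be reproduced with the standard library alone. a sketch; the logger name and format string are illustrative, and delay=True avoids creating celery.log until the first write:

```python
import logging

# stdlib sketch of the after_setup_logger receiver: copy the first handler's
# formatter onto a new file handler so both emit the same format
logger = logging.getLogger('demo')
stream = logging.StreamHandler()
stream.setFormatter(logging.Formatter('[%(levelname)s/%(threadName)s] %(message)s'))
logger.addHandler(stream)

file_handler = logging.FileHandler('celery.log', delay=True)
file_handler.setFormatter(logger.handlers[0].formatter)
logger.addHandler(file_handler)

# both handlers share one formatter object
print(file_handler.formatter is stream.formatter)
```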