I've recently been assigned to a new project because I'm one of the only developers at my company who has used Python extensively in the past. My first task was to clean up the code base and make it more Pythonic – while doing so I came across this:
import logging    # The builtin logging module
import traceback  # Needed by logError below (missing in the original)

def logInfo(message):
    logger = logging.getLogger('')
    logger.info(message)

def logError():
    logger = logging.getLogger('')
    logger.error('Failed: {0}'.format(traceback.format_exc()))

def logCritical(message):
    logger = logging.getLogger('')
    logger.critical(message)

def logWarning(message):
    logger = logging.getLogger('')
    logger.warn(message)  # logger.warning is the non-deprecated spelling

def logDebug(message):
    logger = logging.getLogger('')
    logger.debug(message)
This smells pretty bad to me – it seems a much cleaner solution would be to create a global logger object at the top of the file and use the normal logger.debug|info|error|warn|critical
methods directly. Am I overthinking this?
Of course, then there is all of the hairiness of a global object – in which case maybe I'd do something like
@memoize
def getLogger(name='__main__'):
    return logging.getLogger(name)
With an appropriate memoizing function.
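One appropriate memoizer already ships with the standard library: `functools.lru_cache`. A minimal sketch of the idea:

```python
import functools
import logging

@functools.lru_cache(maxsize=None)  # caches the result per name, i.e. memoizes
def getLogger(name='__main__'):
    return logging.getLogger(name)

# Repeated calls with the same name return the exact same logger object:
a = getLogger('myapp')
b = getLogger('myapp')
assert a is b
```

Worth noting: `logging.getLogger` itself already returns the same `Logger` object for a given name, so the cache only saves the internal lookup; the memoization is more about style than necessity here.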
This is a pretty young project still, so we have a lot of freedom in how we set this up.
Best Answer
A common way is to define one global
logger
at the beginning of the module with

    logger = logging.getLogger(__name__)

so it's easier to assign different log levels to different modules. If you don't have a distinguishing logger name, you can directly use the log functions of
logging
; instead of logDebug
simply write logging.debug
.
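A brief sketch of why per-module logger names pay off (the module names below are hypothetical, chosen just for illustration):

```python
import logging

# At the top of each module: one global logger named after the module.
logger = logging.getLogger(__name__)

# In the application's entry point, levels can then be tuned per module:
logging.basicConfig(level=logging.INFO)
logging.getLogger('myapp.db').setLevel(logging.DEBUG)    # verbose for the DB layer
logging.getLogger('myapp.ui').setLevel(logging.WARNING)  # quiet for the UI layer

# The module-level logger behaves like any other Logger:
logger.info('application started')
```

With the bare `logging.debug`/`logging.info` functions you lose this granularity, since everything goes through the root logger.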