Python – How to properly handle global parameters for unit testing in python

configuration, python, unit-testing

We are implementing many algorithms which typically have lots of shared, publicly known and security-relevant parameters.

Currently, we simply use a class holding all the parameters and two predefined global objects:

class PublicParams(object):
    def __init__(self, p, q):
        self.p = p
        self.q = q

# used for tests
publicParams_test = PublicParams(15,7)               

# In production these would be e.g. 2048-bit numbers
publicParams_secure = PublicParams(128378947298374928374, 128378947298374928374)

The algorithms then take a PublicParams object as an argument, defaulting to the production publicParams_secure:

def AlgoOne(n, publicParams=publicParams_secure):
    # do stuff with publicParams.p
    # ...
    AlgoTwo(x, publicParams)  # x computed in the elided steps above

and

def AlgoTwo(x, publicParams=publicParams_secure):
    # do stuff with publicParams.q

This way we can still inject different public parameters for easier unit testing:

import unittest

class AlgoOneTest(unittest.TestCase):
    def test(self):
        # compare with a manually computed result
        self.assertEqual(AlgoOne(1, publicParams_test), 10)

What I don't like about this approach:

  • Giving publicParams a default value makes it optional when calling an algorithm. That makes it easy to forget to pass it along when AlgoOne calls AlgoTwo internally; if the test object was passed to AlgoOne, the two functions would then silently operate on two different parameter sets.
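A minimal, self-contained sketch of that failure mode (the arithmetic in the bodies is invented purely for illustration; the original bodies are elided):

```python
class PublicParams(object):
    def __init__(self, p, q):
        self.p = p
        self.q = q

publicParams_test = PublicParams(15, 7)
# stand-ins for the real 2048-bit production values
publicParams_secure = PublicParams(2**2047 + 1, 2**2047 + 3)

def AlgoTwo(x, publicParams=publicParams_secure):
    return x % publicParams.q

def AlgoOne(n, publicParams=publicParams_secure):
    x = n * publicParams.p
    return AlgoTwo(x)  # bug: publicParams is not forwarded

# The test passes the test parameters, but AlgoTwo silently falls back
# to the secure default, so two different parameter sets are mixed:
AlgoOne(1, publicParams_test)           # → 15  (15 % (2**2047 + 3))
AlgoTwo(15, publicParams_test)          # → 1   (what forwarding would give)
```

Nothing fails loudly here; the test simply computes a result against a mixture of test and production parameters.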

Is there a better way that is less error-prone but still offers flexibility for unit testing? Is this really best practice?

Best Answer

Create configuration files test_config.py and production_config.py. Select one of them using an environment variable or a command-line argument. Import it (or read and parse it, if you choose .json / .txt instead of .py), and make the result available to the entire program through a global object in a module that you can import anywhere.

This is very similar to what you're already doing, except that it goes one step further, from global scope out to the shell from which you invoke Python. The advantage is that there is no longer any risk of accidentally mixing production and test configuration: you can't load both files in the same Python session, since there is only one environment variable / command line.
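The suggestion above could be sketched as a single config module (the module name, the APP_CONFIG variable, and all parameter values are assumptions for illustration, not part of the answer):

```python
# config.py -- select the parameter set once, at import time,
# based on an environment variable set by the invoking shell.
import os

class PublicParams(object):
    def __init__(self, p, q):
        self.p = p
        self.q = q

if os.environ.get("APP_CONFIG") == "test":
    # small values whose results are easy to verify by hand
    params = PublicParams(15, 7)
else:
    # stand-ins for the real 2048-bit production values
    params = PublicParams(2**2047 + 9, 2**2047 + 13)
```

Algorithms then do `from config import params` instead of taking the parameters as an argument, so a single session can never see two different parameter sets and the forgotten-argument bug from the question cannot occur.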
