Python uses duck-typing, rather than static type checking. But many of the same concerns ultimately apply: does an object have the desired methods and attributes? Do those attributes have valid, in-range values?
Whether you're writing constraints in code, writing test cases, validating user input, or just debugging, inevitably somewhere you'll need to verify that an object is still in a proper state: that it still "looks like a duck" and "quacks like a duck."
In statically typed languages you can simply declare "int x", and anytime you create or mutate x, it will always be a valid int. It seems feasible to decorate a Python object to ensure that it is valid under certain constraints, and that every time the object is mutated it remains valid under those constraints. Ideally there would be a simple declarative syntax to express "has attribute length, and length is non-negative" (not necessarily in those words; something like Rails validators, but less human-language and more programming-language). You could think of this as an ad-hoc interface/type system, or as an ever-present object-level unit test.
Does such a library exist to declare and validate constraint/duck-checking on Python-objects? Is this an unreasonable tool to want? 🙂
(Thanks!)
Contrived example:
rectangle = {'length': 5, 'width': 10}

# We live in a fictional universe where multiplication is super expensive.
# Therefore any time we multiply, we need to cache the results.
def area(rect):
    if 'area' in rect:
        return rect['area']
    rect['area'] = rect['length'] * rect['width']
    return rect['area']

print(area(rectangle))
rectangle['length'] = 15
print(area(rectangle))  # compare expected vs. actual output!
# (Imagine the same thing with object attributes rather than dictionary keys.)
Best Answer
Today, Python offers a wide spectrum of tests that can be done for checking duck-typed code.
Type Annotations
Implementations
For starters, support for static type annotations (PEP 484) was added to Python 3.5 in 2015. The interpreter itself does not enforce the annotations; instead, multiple third-party tools provide implementations of the new type-checking hooks.
MyPy is pretty slick.
It even supports overloaded function signatures via the @overload decorator, so that a function can declare a different return type for different parameter types.
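As a minimal sketch of @overload (the function name and types here are hypothetical, just to show the mechanics mypy checks against):

```python
from typing import Union, overload

# The @overload stubs declare the signatures mypy will check callers
# against; only the final, undecorated definition runs at runtime.
@overload
def double(value: int) -> int: ...
@overload
def double(value: str) -> str: ...

def double(value: Union[int, str]) -> Union[int, str]:
    # Single runtime implementation covering both declared overloads.
    return value * 2

print(double(21))    # int in, int out
print(double("ab"))  # str in, str out
```

With this in place, mypy infers `double(21)` as int and `double("ab")` as str, and flags calls like `double([1, 2])` that match neither overload.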
Examples
If, in your example above, the 'length' and 'width' values were passed as discrete parameters rather than packed into a dict, then PyContracts could easily do individual type/value checks.
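To sketch the idea in plain Python (so it runs anywhere; with PyContracts the same constraints would be a declarative decorator, roughly @contract(length='int,>=0', width='int,>=0') rather than inline checks):

```python
def area(length, width):
    # Hand-rolled preconditions standing in for PyContracts-style
    # declarative checks: each dimension must be a non-negative int.
    for name, value in (("length", length), ("width", width)):
        if not isinstance(value, int) or value < 0:
            raise ValueError(f"{name} must be a non-negative int, got {value!r}")
    return length * width

print(area(5, 10))
```

The declarative form keeps the constraint visible in the signature area instead of buried in the function body, which is much closer to the "hasattr length and length is non-negative" syntax the question asks for.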
PyContracts can work with the dictionary you specified too, as-is.
Problems with PyDBC
Though it was slick for its time, nowadays PyDBC appears dated: it predates the modern type-annotation tooling and does not seem to be actively maintained.
Unit Testing
To take things further, into the realm of checking post-conditions, which is appropriate for testing in the fashion of TDD or BDD, one might be better off writing unit tests, particularly if the desire is to test side effects.
TDD unit tests with unittest
Python comes with a unittest package that works fine for verifying that the externally visible behavior of a class is correct.
https://docs.python.org/3/library/unittest.html
It's very handy to use. If you want to test all of your boundary cases (like dimensions <= 0), the normal cases, and any exceptions the function supports in the future, it's the right place for it.
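A brief sketch of what that could look like for the rectangle example (the area function here is a hypothetical variant that rejects non-positive dimensions, purely to give the boundary cases something to exercise):

```python
import unittest

def area(length, width):
    # Hypothetical function under test: rejects non-positive dimensions.
    if length <= 0 or width <= 0:
        raise ValueError("dimensions must be positive")
    return length * width

class AreaTest(unittest.TestCase):
    def test_normal_case(self):
        self.assertEqual(area(5, 10), 50)

    def test_boundary_rejects_nonpositive_dimensions(self):
        for bad_args in ((0, 10), (5, 0), (-1, 10)):
            with self.assertRaises(ValueError):
                area(*bad_args)
```

Saved as a module, this runs with "python -m unittest" and fails loudly the moment a future change breaks either the normal case or a boundary case.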
BDD testing
There are some third-party tools for BDD (behavior-driven development); which ones are popular and supported will probably change over time.
In my opinion, BDD test scenarios tend to be easier to read and understand. They're readable, and possibly even slightly extendable, by non-Python programmers, since the step language is pretty simple.
doctest comments/tool
Alternatively, one could include executable examples in the area function's docstring and test them by running doctest, a standard feature of Python for well over a decade.
https://en.wikipedia.org/wiki/Doctest
By using doctest, tests would be part of the documentation, inline and generated, as both useful examples and convenient tests.
Those are appropriate ways to check that external state/logic of the function is working properly, given fully met preconditions. They tend to offer different side benefits as well.
Decorators
One can also use Python 3 decorators to declare preconditions and post conditions. The @precondition would be checked on entry. The @postcondition would be checked on exit.
This would allow you to enforce design by contract (DBC) with very little code at all.
To illustrate this listing 10, the @require decorator sample function, in this Dr. Dobb's magazine article, which shows how to implement a powerfully declarative, yet fully reusable, precondition-checking decorator in just 10 lines of regular Python code.
http://www.drdobbs.com/web-development/python-24-decorators/184406073
You could write a post condition verification decorator based on this @require example, and name them @precondition and @postcondition.
Just be careful. The example uses eval, and there is no parameter sanity-checking or constraining, so someone might put something. Really awkward. Of course, that's a possibility other places as well, but I feel like I should raise awareness since eval/exec uses should always kept in mind as a concern.
Python 3 includes a library called functools that actually offers 2 capabilities handy for your problem.
https://docs.python.org/3/library/functools.html
On the 1st point, that strikes your need to test the memoization feature; supplied, pretested functionality. Memoization is a very handy efficiency tool. In addition to reusing existing code, using lru_cache decorator is going to make your area function much more transparent.
Anyone would be able to see that there is memoization (results caching) going on, without needing to read all of its source code, just by seeing the lru_cache decorator. They would know it's tested code too, not some embedded, one-off logic.
For the second, if you needed something more robust, complex, or implementation specific than the Dr. Dobb's example, functools may speed your efforts.
Particularly, you should look at the @functools.wraps decorator. It not only saves a couple of lines of code, but makes the metadata more intuitively correct an useful.
Error-Proofing
The following suggestions made above will make the example correct, less error-prone as-is not to mention if modified, easier to read/understand its usage and behavior, well-tested.