[Tutor] Should error checking be duplicated for both functions if one function calls another one?

Steven D'Aprano steve at pearwood.info
Mon Jun 1 18:36:01 CEST 2015


On Mon, Jun 01, 2015 at 09:27:07AM -0500, boB Stepp wrote:

> Suppose in a given state of a program, function 1 calls function 2.
> Function 1 includes checks for possible error conditions. If there are
> no issues, then function 2 should execute with no issues as well. The
> question is, should function 2 include the error checking done in
> function 1 if function 2 is only ever called by function 1?

The answer is, "that depends".

Suppose *both* functions are public functions, which anyone can use. 
Some people will call function 1 directly, some will call function 2 
directly. In this case, both functions need to do their own error 
checking, because they cannot trust that their arguments will be 
correct. For example:

def greet(name):
    if name == '':
        raise ValueError('you must give a name')
    return "Hello " + name

def long_greet(name):
    if name == '':
        raise ValueError('you must give a name')
    return "Greetings and salutations! " + greet(name)


Of course, if the error checking is complicated, you should factor it 
out into a third function:

def greet(name):
    check(name)
    ...

def long_greet(name):
    check(name)
    ...
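Filling in the elided bodies, a minimal runnable sketch of the factored-out 
checker might look like this (the name `check` comes from the snippet above; 
the exact rules it enforces, and the TypeError case, are my own illustration):

```python
def check(name):
    # Shared validation, written once and called from every entry point.
    if not isinstance(name, str):
        raise TypeError("name must be a string")
    if name == '':
        raise ValueError('you must give a name')

def greet(name):
    check(name)
    return "Hello " + name

def long_greet(name):
    check(name)
    return "Greetings and salutations! " + greet(name)

print(greet("Alice"))  # Hello Alice
```

Note that `long_greet` still runs the check twice (once itself, once inside 
`greet`), but the duplicated *code* has been reduced to a single call.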



There's another possibility. Suppose that only the first function is for 
public consumption, part of your library's public API. Since anyone 
might use the first function, including beginners, idiots and people who 
don't read the documentation, it needs to check the argument. But the 
second function is only used by *you*, as an internal detail.

Of course you give the second function a name starting with an 
underscore, so that others will know not to use it. (Single underscore 
names are "private, don't touch".) In this case, the second function 
doesn't need to check its arguments, because it can trust that the first 
function will always do the right thing.

def function(arg):
    if arg > 0:
        return _other_function(arg)
    else:
        raise ValueError

def _other_function(arg):
    return ...


After all, you would never make a mistake and pass the wrong value, 
would you? Hmmm... perhaps we should be a little more careful...
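To make the pattern above concrete, here is one runnable version; the body of 
`_other_function` (a square root, which genuinely needs a positive argument) 
is my own illustration, not from the sketch above:

```python
import math

def function(arg):
    # Public API: validates its argument before delegating.
    if arg > 0:
        return _other_function(arg)
    else:
        raise ValueError("arg must be positive")

def _other_function(arg):
    # Private helper: trusts the caller, performs no check of its own.
    return math.sqrt(arg)

print(function(9.0))  # 3.0
```

Calling `_other_function(-1.0)` directly would not raise a clean ValueError; 
it would fail inside `math.sqrt` instead, which is exactly the risk the next 
paragraph describes.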

(Note: in Python, this failure to check arguments is not as dangerous as 
it may be in some other languages. You typically won't crash the 
computer, or cause some horrible security vulnerability that lets 
criminals or the NSA take over your computer. You will probably just get 
an exception. So there are circumstances where you might choose to just 
completely ignore any error checking.)

What we can do is an intermediate level of error checking between the 
full checking of arguments done by public functions, and the 
unconditional trust of the private function, by using assertions. 
Assertions are checks which can be turned off. (Although, in truth, most 
people never bother.) Our public function stays the same, and the 
private one becomes:

def _other_function(arg):
    assert arg > 0
    return ...

If the assertion is ever false (that is, if arg is not larger than 0), 
Python will raise an AssertionError exception and stop the program. You 
will then be suitably embarrassed and can fix the bug before it is 
released to your customers and users.

But, unlike the regular form of error checking, the assumption here is 
that assertions should always pass. Since they will always pass, they 
don't do anything, and can be turned off safely. You do that by running 
Python with the -O (for optimize) command-line switch, which disables 
asserts. This style of coding is often called "Design By Contract", and 
it is a powerful system for ensuring the safety of error checking during 
development and the speed of skipping unnecessary checks after 
deployment.
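You can see the -O switch stripping asserts for yourself by running the same 
failing assert twice in a subprocess, once normally and once optimized (this 
demo is mine, not from the original post):

```python
import subprocess
import sys

# A tiny script whose assert is guaranteed to fail.
script = "assert 1 > 2, 'impossible'\nprint('assert was skipped')"

# Normal run: the assert fires, AssertionError aborts the script.
normal = subprocess.run([sys.executable, "-c", script],
                        capture_output=True, text=True)

# Optimized run: -O removes assert statements entirely, so it prints.
optimized = subprocess.run([sys.executable, "-O", "-c", script],
                           capture_output=True, text=True)

print(normal.returncode != 0)     # True: the assert aborted the run
print(optimized.stdout.strip())   # assert was skipped
```

This is why asserts must never guard against *user* errors, only programmer 
errors: under -O the guard silently vanishes.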


You can read more about the use of assert here:

http://import-that.dreamwidth.org/676.html



-- 
Steve

