
PEP 3124 – Overloading, Generic Functions, Interfaces, and Adaptation

PEP: 3124
Title: Overloading, Generic Functions, Interfaces, and Adaptation
Author: Phillip J. Eby <pje at telecommunity.com>
Discussions-To: Python 3000 List <python-3000 at python.org>
Status: Deferred
Type: Standards Track
Requires: 3107, 3115, 3119
Created: 28-Apr-2007
Post-History: 30-Apr-2007
Replaces: 245, 246

Deferred

See https://mail.python.org/pipermail/python-3000/2007-July/008784.html.

Abstract

This PEP proposes a new standard library module, overloading, to provide generic programming features including dynamic overloading (aka generic functions), interfaces, adaptation, method combining (ala CLOS and AspectJ), and simple forms of aspect-oriented programming (AOP).

The proposed API is also open to extension; that is, it will be possible for library developers to implement their own specialized interface types, generic function dispatchers, method combination algorithms, etc., and those extensions will be treated as first-class citizens by the proposed API.

The API will be implemented in pure Python with no C, but may have some dependency on CPython-specific features such as sys._getframe and the func_code attribute of functions. It is expected that e.g. Jython and IronPython will have other ways of implementing similar functionality (perhaps using Java or C#).

Rationale and Goals

Python has always provided a variety of built-in and standard-library generic functions, such as len(), iter(), pprint.pprint(), and most of the functions in the operator module. However, it currently:

  1. does not have a simple or straightforward way for developers to create new generic functions,
  2. does not have a standard way for methods to be added to existing generic functions (i.e., some are added using registration functions, others require defining __special__ methods, possibly by monkeypatching), and
  3. does not allow dispatching on multiple argument types (except in a limited form for arithmetic operators, where “right-hand” (__r*__) methods can be used to do two-argument dispatch).

In addition, it is currently a common anti-pattern for Python code to inspect the types of received arguments, in order to decide what to do with the objects. For example, code may wish to accept either an object of some type, or a sequence of objects of that type.

Currently, the “obvious way” to do this is by type inspection, but this is brittle and closed to extension. A developer using an already-written library may be unable to change how their objects are treated by such code, especially if the objects they are using were created by a third party.
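
For illustration, here is a minimal sketch of that brittle pattern (process() and handle() are hypothetical names, not part of this proposal):

def handle(ob):
    print("handling one object:", ob)

def process(ob):
    # Closed to extension: only the concrete types anticipated here are
    # treated as "a sequence of objects". A third-party sequence-like
    # type falls into the wrong branch, and only the author of process()
    # can fix that.
    if isinstance(ob, (list, tuple)):
        for item in ob:
            process(item)
    else:
        handle(ob)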

Therefore, this PEP proposes a standard library module to address these and related issues, using decorators and argument annotations (PEP 3107). The primary features to be provided are:

  • a dynamic overloading facility, similar to the static overloading found in languages such as Java and C++, but including optional method combination features as found in CLOS and AspectJ.
  • a simple “interfaces and adaptation” library inspired by Haskell’s typeclasses (but more dynamic, and without any static type-checking), with an extension API to allow registering user-defined interface types such as those found in PyProtocols and Zope.
  • a simple “aspect” implementation to make it easy to create stateful adapters and to do other stateful AOP.

These features are to be provided in such a way that extended implementations can be created and used. Libraries should be able to define new dispatching criteria for generic functions, and new kinds of interfaces, and use them in place of the predefined features. For example, it should be possible to use a zope.interface interface object to specify the desired type of a function argument, as long as the zope.interface package registered itself correctly (or a third party did the registration).

In this way, the proposed API simply offers a uniform way of accessing the functionality within its scope, rather than prescribing a single implementation to be used for all libraries, frameworks, and applications.

User API

The overloading API will be implemented as a single module, named overloading, providing the following features:

Overloading/Generic Functions

The @overload decorator allows you to define alternate implementations of a function, specialized by argument type(s). A function with the same name must already exist in the local namespace. The existing function is modified in-place by the decorator to add the new implementation, and the modified function is returned by the decorator. Thus, the following code:

from overloading import overload
from collections import Iterable

def flatten(ob):
    """Flatten an object to its component iterables"""
    yield ob

@overload
def flatten(ob: Iterable):
    for o in ob:
        for ob in flatten(o):
            yield ob

@overload
def flatten(ob: basestring):
    yield ob

creates a single flatten() function whose implementation roughly equates to:

def flatten(ob):
    if isinstance(ob, basestring) or not isinstance(ob, Iterable):
        yield ob
    else:
        for o in ob:
            for ob in flatten(o):
                yield ob

except that the flatten() function defined by overloading remains open to extension by adding more overloads, while the hardcoded version cannot be extended.

For example, if someone wants to use flatten() with a string-like type that doesn’t subclass basestring, they would be out of luck with the second implementation. With the overloaded implementation, however, they can either write this:

@overload
def flatten(ob: MyString):
    yield ob

or this (to avoid copying the implementation):

from overloading import RuleSet
RuleSet(flatten).copy_rules((basestring,), (MyString,))

(Note also that, although PEP 3119 proposes that it should be possible for abstract base classes like Iterable to allow classes like MyString to claim subclass-hood, such a claim is global, throughout the application. In contrast, adding a specific overload or copying a rule is specific to an individual function, and therefore less likely to have undesired side effects.)

@overload vs. @when

The @overload decorator is a common-case shorthand for the more general @when decorator. It allows you to leave out the name of the function you are overloading, at the expense of requiring the target function to be in the local namespace. It also doesn’t support adding additional criteria besides the ones specified via argument annotations. The following function definitions have identical effects, except for name binding side-effects (which will be described below):

from overloading import when

@overload
def flatten(ob: basestring):
    yield ob

@when(flatten)
def flatten(ob: basestring):
    yield ob

@when(flatten)
def flatten_basestring(ob: basestring):
    yield ob

@when(flatten, (basestring,))
def flatten_basestring(ob):
    yield ob

The first definition above will bind flatten to whatever it was previously bound to. The second will do the same, if it was already bound to the when decorator’s first argument. If flatten is unbound or bound to something else, it will be rebound to the function definition as given. The last two definitions above will always bind flatten_basestring to the function definition as given.

Using this approach allows you to both give a method a descriptive name (often useful in tracebacks!) and to reuse the method later.

Except as otherwise specified, all overloading decorators have the same signature and binding rules as @when. They accept a function and an optional “predicate” object.

The default predicate implementation is a tuple of types with positional matching to the overloaded function’s arguments. However, an arbitrary number of other kinds of predicates can be created and registered using the Extension API, and will then be usable with @when and other decorators created by this module (like @before, @after, and @around).

Method Combination and Overriding

When an overloaded function is invoked, the implementation with the signature that most specifically matches the calling arguments is the one used. If no implementation matches, a NoApplicableMethods error is raised. If more than one implementation matches, but none of the signatures are more specific than the others, an AmbiguousMethods error is raised.

For example, the following pair of implementations are ambiguous, if the foo() function is ever called with two integer arguments, because both signatures would apply, but neither signature is more specific than the other (i.e., neither implies the other):

def foo(bar:int, baz:object):
    pass

@overload
def foo(bar:object, baz:int):
    pass
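
Given these two definitions, the error behavior described above looks roughly like this (a sketch; it assumes the foo() overloads just shown):

from overloading import AmbiguousMethods, NoApplicableMethods

try:
    foo(1, 2)      # both signatures apply; neither implies the other
except AmbiguousMethods:
    print("ambiguous")

try:
    foo("x", "y")  # neither signature applies
except NoApplicableMethods:
    print("no applicable methods")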

In contrast, the following pair of implementations can never be ambiguous, because one signature always implies the other; the int/int signature is more specific than the object/object signature:

def foo(bar:object, baz:object):
    pass

@overload
def foo(bar:int, baz:int):
    pass

A signature S1 implies another signature S2, if whenever S1 would apply, S2 would also. A signature S1 is “more specific” than another signature S2, if S1 implies S2, but S2 does not imply S1.

Although the examples above have all used concrete or abstract types as argument annotations, there is no requirement that the annotations be such. They can also be “interface” objects (discussed in the Interfaces and Adaptation section), including user-defined interface types. (They can also be other objects whose types are appropriately registered via the Extension API.)

Proceeding to the “Next” Method

If the first parameter of an overloaded function is named __proceed__, it will be passed a callable representing the next most-specific method. For example, this code:

def foo(bar:object, baz:object):
    print("got objects!")

@overload
def foo(__proceed__, bar:int, baz:int):
    print("got integers!")
    return __proceed__(bar, baz)

will print “got integers!” followed by “got objects!”.

If there is no next most-specific method, __proceed__ will be bound to a NoApplicableMethods instance. When called, a new NoApplicableMethods instance will be raised, with the arguments passed to the first instance.

Similarly, if the next most-specific methods have ambiguous precedence with respect to each other, __proceed__ will be bound to an AmbiguousMethods instance, and if called, it will raise a new instance.

Thus, a method can either check if __proceed__ is an error instance, or simply invoke it. The NoApplicableMethods and AmbiguousMethods error classes have a common DispatchError base class, so isinstance(__proceed__, overloading.DispatchError) is sufficient to identify whether __proceed__ can be safely called.
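
For example, an overload can guard its call to __proceed__ like this (a sketch reusing the foo() example above):

from overloading import overload, DispatchError

def foo(bar:object, baz:object):
    print("got objects!")

@overload
def foo(__proceed__, bar:int, baz:int):
    print("got integers!")
    if isinstance(__proceed__, DispatchError):
        return None                   # no next method; stop gracefully
    return __proceed__(bar, baz)      # otherwise, continue the chain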

(Implementation note: using a magic argument name like __proceed__ could potentially be replaced by a magic function that would be called to obtain the next method. A magic function, however, would degrade performance and might be more difficult to implement on non-CPython platforms. Method chaining via magic argument names, however, can be efficiently implemented on any Python platform that supports creating bound methods from functions – one simply recursively binds each function to be chained, using the following function or error as the im_self of the bound method.)
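
The technique described in this note can be sketched roughly as follows (using types.MethodType, the modern spelling for creating a bound method from a function; this illustrates the idea and is not the proposed implementation):

import types

def chain(methods, fallback):
    """Bind each function to its successor, so that the successor (or
    the fallback error instance) arrives as __proceed__ when called."""
    nxt = fallback                    # e.g. a NoApplicableMethods instance
    for func in reversed(methods):
        nxt = types.MethodType(func, nxt)
    return nxt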

“Before” and “After” Methods

In addition to the simple next-method chaining shown above, it is sometimes useful to have other ways of combining methods. For example, the “observer pattern” can sometimes be implemented by adding extra methods to a function that execute before or after the normal implementation.

To support these use cases, the overloading module will supply @before, @after, and @around decorators, that roughly correspond to the same types of methods in the Common Lisp Object System (CLOS), or the corresponding “advice” types in AspectJ.

Like @when, all of these decorators must be passed the function to be overloaded, and can optionally accept a predicate as well:

from overloading import before, after

def begin_transaction(db):
    print "Beginning the actual transaction"

@before(begin_transaction)
def check_single_access(db: SingletonDB):
    if db.inuse:
        raise TransactionError("Database already in use")

@after(begin_transaction)
def start_logging(db: LoggableDB):
    db.set_log_level(VERBOSE)

@before and @after methods are invoked either before or after the main function body, and are never considered ambiguous. That is, it will not cause any errors to have multiple “before” or “after” methods with identical or overlapping signatures. Ambiguities are resolved using the order in which the methods were added to the target function.

“Before” methods are invoked most-specific method first, with ambiguous methods being executed in the order they were added. All “before” methods are called before any of the function’s “primary” methods (i.e. normal @overload methods) are executed.

“After” methods are invoked in the reverse order, after all of the function’s “primary” methods are executed. That is, they are executed least-specific method first, with ambiguous methods being executed in the reverse of the order in which they were added.
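
A sketch of these ordering rules, using hypothetical names and the decorators described above:

from overloading import before, after

def save(ob):
    print("primary: saving")

@before(save)
def check_any(ob: object):
    print("before: object")

@before(save)
def check_int(ob: int):
    print("before: int")     # more specific, so it runs first

@after(save)
def log_any(ob: object):
    print("after: object")   # least specific, so it runs first

@after(save)
def log_int(ob: int):
    print("after: int")

# save(1) would print: "before: int", "before: object",
# "primary: saving", "after: object", "after: int"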

The return values of both “before” and “after” methods are ignored, and any uncaught exceptions raised by any methods (primary or other) immediately end the dispatching process. “Before” and “after” methods cannot have __proceed__ arguments, as they are not responsible for calling any other methods. They are simply called as a notification before or after the primary methods.

Thus, “before” and “after” methods can be used to check or establish preconditions (e.g. by raising an error if the conditions aren’t met) or to ensure postconditions, without needing to duplicate any existing functionality.

“Around” Methods

The @around decorator declares a method as an “around” method. “Around” methods are much like primary methods, except that the least-specific “around” method has higher precedence than the most-specific “before” method.

Unlike “before” and “after” methods, however, “around” methods are responsible for calling their __proceed__ argument, in order to continue the invocation process. “Around” methods are usually used to transform input arguments or return values, or to wrap specific cases with special error handling or try/finally conditions, e.g.:

from overloading import around

@around(commit_transaction)
def lock_while_committing(__proceed__, db: SingletonDB):
    with db.global_lock:
        return __proceed__(db)

They can also be used to replace the normal handling for a specific case, by not invoking the __proceed__ function.

The __proceed__ given to an “around” method will either be the next applicable “around” method, a DispatchError instance, or a synthetic method object that will call all the “before” methods, followed by the primary method chain, followed by all the “after” methods, and return the result from the primary method chain.

Thus, just as with normal methods, __proceed__ can be checked for DispatchError-ness, or simply invoked. The “around” method should return the value returned by __proceed__, unless of course it wishes to modify or replace it with a different return value for the function as a whole.

Custom Combinations

The decorators described above (@overload, @when, @before, @after, and @around) collectively implement what in CLOS is called the “standard method combination” – the most common patterns used in combining methods.

Sometimes, however, an application or library may have use for a more sophisticated type of method combination. For example, if you would like to have “discount” methods that return a percentage off, to be subtracted from the value returned by the primary method(s), you might write something like this:

from overloading import always_overrides, merge_by_default
from overloading import Around, Before, After, Method, MethodList

class Discount(MethodList):
    """Apply return values as discounts"""

    def __call__(self, *args, **kw):
        retval = self.tail(*args, **kw)
        for sig, body in self.sorted():
            retval -= retval * body(*args, **kw)
        return retval

# merge discounts by priority
merge_by_default(Discount)

# discounts have precedence over before/after/primary methods
always_overrides(Discount, Before)
always_overrides(Discount, After)
always_overrides(Discount, Method)

# but not over "around" methods
always_overrides(Around, Discount)

# Make a decorator called "discount" that works just like the
# standard decorators...
discount = Discount.make_decorator('discount')

# and now let's use it...
def price(product):
    return product.list_price

@discount(price)
def ten_percent_off_shoes(product: Shoe):
    return Decimal('0.1')
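
A short usage sketch for the combination above (Shoe, which the example assumes to exist, might be a product class with a list_price attribute):

from decimal import Decimal

class Shoe:
    """Hypothetical product type assumed by the example above."""
    def __init__(self, list_price):
        self.list_price = list_price

# With the discount method registered, the combined function applies it:
#   price(Shoe(Decimal('100'))) == Decimal('90.0')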

Similar techniques can be used to implement a wide variety of CLOS-style method qualifiers and combination rules. The process of creating custom method combination objects and their corresponding decorators is described in more detail under the Extension API section.

Note, by the way, that the @discount decorator shown will work correctly with any new predicates defined by other code. For example, if zope.interface were to register its interface types to work correctly as argument annotations, you would be able to specify discounts on the basis of its interface types, not just classes or overloading-defined interface types.

Similarly, if a library like RuleDispatch or PEAK-Rules were to register an appropriate predicate implementation and dispatch engine, one would then be able to use those predicates for discounts as well, e.g.:

from somewhere import Pred  # some predicate implementation

@discount(
    price,
    Pred("isinstance(product,Shoe) and"
         " product.material.name=='Blue Suede'")
)
def forty_off_blue_suede_shoes(product):
    return Decimal('0.4')

The process of defining custom predicate types and dispatching engines is also described in more detail under the Extension API section.

Overloading Inside Classes

All of the decorators above have a special additional behavior when they are directly invoked within a class body: the first parameter (other than __proceed__, if present) of the decorated function will be treated as though it had an annotation equal to the class in which it was defined.

That is, this code:

class And(object):
    # ...
    @when(get_conjuncts)
    def __conjuncts(self):
        return self.conjuncts

produces the same effect as this (apart from the existence of a private method):

class And(object):
    # ...

@when(get_conjuncts)
def get_conjuncts_of_and(ob: And):
    return ob.conjuncts

This behavior is both a convenience enhancement when defining lots of methods, and a requirement for safely distinguishing multi-argument overloads in subclasses. Consider, for example, the following code:

class A(object):
    def foo(self, ob):
        print "got an object"

    @overload
    def foo(__proceed__, self, ob:Iterable):
        print "it's iterable!"
        return __proceed__(self, ob)


class B(A):
    foo = A.foo     # foo must be defined in local namespace

    @overload
    def foo(__proceed__, self, ob:Iterable):
        print "B got an iterable!"
        return __proceed__(self, ob)

Due to the implicit class rule, calling B().foo([]) will print “B got an iterable!” followed by “it’s iterable!”, and finally, “got an object”, while A().foo([]) would print only the messages defined in A.

Conversely, without the implicit class rule, the two “Iterable” methods would have the exact same applicability conditions, so calling either A().foo([]) or B().foo([]) would result in an AmbiguousMethods error.

It is currently an open issue to determine the best way to implement this rule in Python 3.0. Under Python 2.x, a class’ metaclass was not chosen until the end of the class body, which means that decorators could insert a custom metaclass to do processing of this sort. (This is how RuleDispatch, for example, implements the implicit class rule.)

PEP 3115, however, requires that a class’ metaclass be determined before the class body has executed, making it impossible to use this technique for class decoration any more.

At this writing, discussion on this issue is ongoing.

Interfaces and Adaptation

The overloading module provides a simple implementation of interfaces and adaptation. The following example defines an IStack interface, and declares that list objects support it:

from overloading import abstract, Interface

class IStack(Interface):
    @abstract
    def push(self, ob):
        """Push 'ob' onto the stack"""

    @abstract
    def pop(self):
        """Pop a value and return it"""


when(IStack.push, (list, object))(list.append)
when(IStack.pop, (list,))(list.pop)

mylist = []
mystack = IStack(mylist)
mystack.push(42)
assert mystack.pop()==42

The Interface class is a kind of “universal adapter”. It accepts a single argument: an object to adapt. It then binds all its methods to the target object, in place of itself. Thus, calling mystack.push(42) is the same as calling IStack.push(mylist, 42).

The @abstract decorator marks a function as being abstract: i.e., having no implementation. If an @abstract function is called, it raises NoApplicableMethods. To become executable, overloaded methods must be added using the techniques previously described. (That is, methods can be added using @when, @before, @after, @around, or any custom method combination decorators.)

In the example above, the list.append method is added as a method for IStack.push() when its arguments are a list and an arbitrary object. Thus, IStack.push(mylist, 42) is translated to list.append(mylist, 42), thereby implementing the desired operation.

Abstract and Concrete Methods

Note, by the way, that the @abstract decorator is not limited to use in interface definitions; it can be used anywhere that you wish to create an “empty” generic function that initially has no methods. In particular, it need not be used inside a class.
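
For example, an empty generic function can be created at module level and given implementations afterward (a sketch with hypothetical names):

from overloading import abstract, when

@abstract
def serialize(ob):
    """Serialize 'ob' to a string (no default implementation)"""

@when(serialize, (int,))
def serialize_int(ob):
    return str(ob)

# serialize(42) returns '42'; serialize([]) raises NoApplicableMethods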

Also note that interface methods need not be abstract; one could, for example, write an interface like this:

class IWriteMapping(Interface):
    @abstract
    def __setitem__(self, key, value):
        """This has to be implemented"""

    def update(self, other:IReadMapping):
        for k, v in IReadMapping(other).items():
            self[k] = v

As long as __setitem__ is defined for some type, the above interface will provide a usable update() implementation. However, if some specific type (or pair of types) has a more efficient way of handling update() operations, an appropriate overload can still be registered for use in that case.
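
For instance, one could register dict support for the abstract method, making the inherited update() immediately usable (a sketch; it assumes IReadMapping has likewise been given dict support):

when(IWriteMapping.__setitem__, (dict, object, object))(dict.__setitem__)

d = {}
IWriteMapping(d).update({'a': 1})   # falls back to the default update()
assert d == {'a': 1}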

Subclassing and Re-assembly

Interfaces can be subclassed:

class ISizedStack(IStack):
    @abstract
    def __len__(self):
        """Return the number of items on the stack"""

# define __len__ support for ISizedStack
when(ISizedStack.__len__, (list,))(list.__len__)

Or assembled by combining functions from existing interfaces:

class Sizable(Interface):
    __len__ = ISizedStack.__len__

# list now implements Sizable as well as ISizedStack, without
# making any new declarations!
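
A brief usage sketch, using the direct invocation style described earlier:

s = [1, 2, 3]
assert ISizedStack.__len__(s) == 3
assert Sizable.__len__(s) == 3   # the same rule, with no new declaration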

A class can be considered to “adapt to” an interface at a given point in time, if no method defined in the interface is guaranteed to raise a NoApplicableMethods error if invoked on an instance of that class at that point in time.

In normal usage, however, it is “easier to ask forgiveness than permission”. That is, it is easier to simply use an interface on an object by adapting it to the interface (e.g. IStack(mylist)) or invoking interface methods directly (e.g. IStack.push(mylist, 42)), than to try to figure out whether the object is adaptable to (or directly implements) the interface.

Implementing an Interface in a Class

It is possible to declare that a class directly implements an interface, using the declare_implementation() function:

from overloading import declare_implementation

class Stack(object):
    def __init__(self):
        self.data = []
    def push(self, ob):
        self.data.append(ob)
    def pop(self):
        return self.data.pop()

declare_implementation(IStack, Stack)

The declare_implementation() call above is roughly equivalent to the following steps:

when(IStack.push, (Stack,object))(lambda self, ob: self.push(ob))
when(IStack.pop, (Stack,))(lambda self: self.pop())

That is, calling IStack.push() or IStack.pop() on an instance of any subclass of Stack, will simply delegate to the actual push() or pop() methods thereof.
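
A short usage sketch of the delegation just described:

s = Stack()
IStack.push(s, 42)             # delegates to s.push(42)
assert IStack(s).pop() == 42   # the adapter (or s itself) delegates to s.pop()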

For the sake of efficiency, calling IStack(s) where s is an instance of Stack, may return s rather than an IStack adapter. (Note that calling IStack(x) where x is already an IStack adapter will always return x unchanged; this is an additional optimization allowed in cases where the adaptee is known to directly implement the interface, without adaptation.)

For convenience, it may be useful to declare implementations in the class header, e.g.:

class Stack(metaclass=Implementer, implements=IStack):
    ...

instead of calling declare_implementation() after the end of the suite.

Interfaces as Type Specifiers

Interface subclasses can be used as argument annotations to indicate what type of objects are acceptable to an overload, e.g.:

@overload
def traverse(g: IGraph, s: IStack):
    g = IGraph(g)
    s = IStack(s)
    # etc....

Note, however, that the actual arguments are not changed or adapted in any way by the mere use of an interface as a type specifier. You must explicitly cast the objects to the appropriate interface, as shown above.

Note also that other patterns of interface use are possible. For example, other interface implementations might not support adaptation, or might require that function arguments already be adapted to the specified interface. So the exact semantics of using an interface as a type specifier are dependent on the interface objects you actually use.

For the interface objects defined by this PEP, however, the semantics are as described above. An interface I1 is considered “more specific” than another interface I2, if the set of descriptors in I1’s inheritance hierarchy is a proper superset of the descriptors in I2’s inheritance hierarchy.

So, for example, ISizedStack is more specific than both Sizable and IStack, irrespective of the inheritance relationships between these interfaces. It is purely a question of what operations are included within those interfaces – and the names of the operations are unimportant.

Interfaces (at least the ones provided by overloading) are always considered less-specific than concrete classes. Other interface implementations can decide on their own specificity rules, both between interfaces and other interfaces, and between interfaces and classes.

Non-Method Attributes in Interfaces

The Interface implementation actually treats all attributes and methods (i.e. descriptors) in the same way: their __get__ (and __set__ and __delete__, if present) methods are called with the wrapped (adapted) object as “self”. For functions, this has the effect of creating a bound method linking the generic function to the wrapped object.

For non-function attributes, it may be easiest to specify them using the property built-in, and the corresponding fget, fset, and fdel attributes:

class ILength(Interface):
    @property
    @abstract
    def length(self):
        """Read-only length attribute"""

# ILength(aList).length == list.__len__(aList)
when(ILength.length.fget, (list,))(list.__len__)

Alternatively, methods such as _get_foo() and _set_foo() may be defined as part of the interface, and the property defined in terms of those methods, but this is a bit more difficult for users to implement correctly when creating a class that directly implements the interface, as they would then need to match all the individual method names, not just the name of the property or attribute.

Aspects

The adaptation system described above assumes that adapters are “stateless”, which is to say that adapters have no attributes or state apart from that of the adapted object. This follows the “typeclass/instance” model of Haskell, and the concept of “pure” (i.e., transitively composable) adapters.

However, there are occasionally cases where, to provide a complete implementation of some interface, some sort of additional state is required.

One possibility, of course, would be to attach monkeypatched “private” attributes to the adaptee. But this is subject to name collisions, and complicates the process of initialization (since any code using these attributes has to check for their existence and initialize them if necessary). It also doesn’t work on objects that don’t have a __dict__ attribute.

So the Aspect class is provided to make it easy to attach extra information to objects that either:

  1. have a __dict__ attribute (so aspect instances can be stored in it, keyed by aspect class),
  2. support weak referencing (so aspect instances can be managed using a global but thread-safe weak-reference dictionary), or
  3. implement or can be adapted to the overloading.IAspectOwner interface (technically, #1 or #2 imply this).

Subclassing Aspect creates an adapter class whose state is tied to the life of the adapted object.

For example, suppose you would like to count all the times a certain method is called on instances of Target (a classic AOP example). You might do something like:

from overloading import Aspect

class Count(Aspect):
    count = 0

@after(Target.some_method)
def count_after_call(self:Target, *args, **kw):
    Count(self).count += 1

The above code will keep track of the number of times that Target.some_method() is successfully called on an instance of Target (i.e., it will not count errors unless they occur in a more-specific “after” method). Other code can then access the count using Count(someTarget).count.

Aspect instances can of course have __init__ methods, to initialize any data structures. They can use either __slots__ or dictionary-based attributes for storage.

While this facility is rather primitive compared to a full-featured AOP tool like AspectJ, persons who wish to build pointcut libraries or other AspectJ-like features can certainly use Aspect objects and method-combination decorators as a base for building more expressive AOP tools.

XXX spec out full aspect API, including keys, N-to-1 aspects, manual
attach/detach/delete of aspect instances, and the IAspectOwner interface.

Extension API

TODO: explain how all of these work

implies(o1, o2)

declare_implementation(iface, class)

predicate_signatures(ob)

parse_rule(ruleset, body, predicate, actiontype, localdict, globaldict)

combine_actions(a1, a2)

rules_for(f)

Rule objects

ActionDef objects

RuleSet objects

Method objects

MethodList objects

IAspectOwner

Overloading Usage Patterns

In discussion on the Python-3000 list, the proposed feature of allowing arbitrary functions to be overloaded has been somewhat controversial, with some people expressing concern that this would make programs more difficult to understand.

The general thrust of this argument is that one cannot rely on what a function does, if it can be changed from anywhere in the program at any time. Even though in principle this can already happen through monkeypatching or code substitution, it is considered poor practice to do so.

However, providing support for overloading any function (or so the argument goes), is implicitly blessing such changes as being an acceptable practice.

This argument appears to make sense in theory, but it is almost entirely mooted in practice for two reasons.

First, people are generally not perverse, defining a function to do one thing in one place, and then summarily defining it to do the opposite somewhere else! The principal reasons to extend the behavior of a function that has not been specifically made generic are to:

  • Add special cases not contemplated by the original function’s author, such as support for additional types.
  • Be notified of an action in order to cause some related operation to be performed, either before the original operation is performed, after it, or both. This can include general-purpose operations like adding logging, timing, or tracing, as well as application-specific behavior.

Neither of these reasons for adding overloads implies any change to the intended default or overall behavior of the existing function, however. Just as a base class method may be overridden by a subclass for these same two reasons, so too may a function be overloaded to provide for such enhancements.

In other words, universal overloading does not equal arbitrary overloading, in the sense that we need not expect people to randomly redefine the behavior of existing functions in illogical or unpredictable ways. If they did so, it would be no less of a bad practice than any other way of writing illogical or unpredictable code!

However, to distinguish bad practice from good, it is perhaps necessary to clarify further what good practice for defining overloads is. And that brings us to the second reason why generic functions do not necessarily make programs harder to understand: overloading patterns in actual programs tend to follow very predictable patterns. (Both in Python and in languages that have no non-generic functions.)

If a module is defining a new generic operation, it will usually also define any required overloads for existing types in the same place. Likewise, if a module is defining a new type, then it will usually define overloads there for any generic functions that it knows or cares about.

As a result, the vast majority of overloads can be found adjacent to either the function being overloaded, or to a newly-defined type for which the overload is adding support. Thus, overloads are highly discoverable in the common case, as you are either looking at the function or the type, or both.

It is only in rather infrequent cases that one will have overloads in a module that contains neither the function nor the type(s) for which the overload is added. This would be the case if, say, a third party created a bridge of support between one library’s types and another library’s generic function(s). In such a case, however, best practice suggests prominently advertising this, especially by way of the module name.

For example, PyProtocols defines such bridge support for working with Zope interfaces and legacy Twisted interfaces, using modules called protocols.twisted_support and protocols.zope_support. (These bridges are done with interface adapters, rather than generic functions, but the basic principle is the same.)

In short, understanding programs in the presence of universal overloading need not be any more difficult, given that the vast majority of overloads will either be adjacent to a function, or the definition of a type that is passed to that function.

And, in the absence of incompetence or deliberate intention to be obscure, the few overloads that are not adjacent to the relevant type(s) or function(s), will generally not need to be understood or known about outside the scope where those overloads are defined. (Except in the “support modules” case, where best practice suggests naming them accordingly.)

Implementation Notes

Most of the functionality described in this PEP is already implemented in the in-development version of the PEAK-Rules framework. In particular, the basic overloading and method combination framework (minus the @overload decorator) already exists there. The implementation of all of these features in peak.rules.core is 656 lines of Python at this writing.

peak.rules.core currently relies on the DecoratorTools and BytecodeAssembler modules, but both of these dependencies can be replaced, as DecoratorTools is used mainly for Python 2.3 compatibility and to implement structure types (which can be done with named tuples in later versions of Python). The use of BytecodeAssembler can be replaced using an “exec” or “compile” workaround, given a reasonable effort. (It would be easier to do this if the func_closure attribute of function objects was writable.)

The Interface class has been previously prototyped, but is not included in PEAK-Rules at the present time.

The “implicit class rule” has previously been implemented in the RuleDispatch library. However, it relies on the __metaclass__ hook, which PEP 3115 eliminates.

I don’t currently know how to make @overload play nicely with classmethod and staticmethod in class bodies. It’s not really clear if it needs to, however.

Source: https://github.com/python/peps/blob/master/pep-3124.txt
