Based on this ticket I raised on mypy: Support for deprecating function arguments · Issue #19817 · python/mypy · GitHub
(While the below names mypy, this extends to other type checkers.)
Feature
Currently we can deprecate functions, classes, methods (modules would be nice too?), etc. There should be a clean way to support deprecating certain function arguments/parameters such that the type checker can catch their misuse accordingly.
Pitch
For a function `foo`, which may have many parameters, suppose that for an upcoming release we decide one of them (e.g. `bar`) is redundant and should be deprecated for some reason (support will be dropped, another parameter is a better hook for the desired behaviour, etc.). Currently I have to write something along the lines of:
```python
import warnings

def foo(bar: int | None = None):
    if bar is not None:
        warnings.warn(
            "The use of bar is discouraged for some good reason",
            DeprecationWarning,
        )
    ...
```
However, the caller of this function might well be calling `foo` with either `foo(bar=None)` or `foo(bar=42)`, when really we would want the type checker to warn us in both cases that `bar` is deprecated.
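At runtime, at least, the `foo(bar=None)` case can be caught with a sentinel default instead of `None` - a common workaround, though it does nothing for the static side. A minimal sketch (`_UNSET` is a hypothetical module-private sentinel, not anything standard):

```python
import warnings

# Hypothetical sentinel: unlike a default of None, it lets us tell
# "bar was not passed" apart from an explicit bar=None.
_UNSET: object = object()

def foo(bar: "int | None | object" = _UNSET):
    if bar is not _UNSET:
        # Fires for foo(bar=None) as well as foo(bar=42).
        warnings.warn(
            "The use of bar is discouraged for some good reason",
            DeprecationWarning,
            stacklevel=2,
        )
    ...
```

This still leaves type checkers silent at the call site, which is the gap this proposal is about.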
I cannot see any support for this in the documentation.
I think there is a way we could do this using `overload` and dispatch (similar to https://stackoverflow.com/a/73478371/5134817), but I don't think this solution is clean.
Instead I am imagining something along the lines of the behaviour of `auto` in an enum class, where we could write:
```python
def foo(bar: int | None = deprecated_arg(None, "Don't use bar for some good reason")):
    ...

x = foo()          # No mypy warning.
x = foo(bar=None)  # mypy warning.
x = foo(bar=42)    # mypy warning.

from functools import partial
sneaky = partial(foo, bar=42)  # mypy warning.
```
Some alternative syntaxes could be:
```python
def foo(bar: Deprecated[int | None, "Don't use bar for some good reason"]):  # NB - No default value.
    ...

@deprecate_arg(bar="Don't use bar for some good reason")  # Looks similar to functools.partial syntax.
def foo(bar: int | None):
    ...
```
Of these two alternatives, I don't especially like the decorator approach because it requires writing the argument name out twice, so it is easy for the two to go out of sync, and it adds a small extra maintenance burden.
Note - I think deprecating support for certain types could also be achieved via the type annotation, but I imagine this would be a much more complex problem. E.g.
```python
def foo(bar: int | Deprecated[None, "Don't use bar=None for some good reason"] = None):
    ...

foo(bar=42)    # No mypy warning.
foo(bar=None)  # mypy warning.
foo()          # mypy warning.
```
static or runtime warnings?
In these examples, I mention mypy warnings, meaning static warnings issued by the type checker. I don't wish to conflate these with runtime warnings. I think both are important, and perhaps there is room for a solution that supports both use cases, but for now I am mostly concerned with achieving the goal with static type checkers.
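For whole callables, PEP 702's `@deprecated` already demonstrates that a single marker can serve both use cases: type checkers flag call sites statically, and the wrapper also emits a `DeprecationWarning` at runtime. A sketch (`old_foo`/`new_foo` are made-up names for illustration):

```python
try:
    from warnings import deprecated  # Python 3.13+
except ImportError:
    from typing_extensions import deprecated

# Static: PEP 702-aware checkers flag every call to old_foo().
# Runtime: the wrapper emits DeprecationWarning when old_foo() is called.
@deprecated("Use new_foo() instead")
def old_foo() -> int:
    return 42
```

A per-argument mechanism could plausibly follow the same dual-purpose design.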
Difficulties with using `overload` or dispatch?
I think there are some difficulties with this:
- There is not much in the way of documentation. (I think I would be the first to combine these in the desired way, which I find surprising and unlikely.)
- I am not sure this would scale to the case where I have multiple possible types and multiple deprecated arguments.
To elaborate on the second point, it seems to me that this should be doable for very simple functions, but for non-trivial functions it might not scale easily. Consider for example:
```python
def big_function(
    arg1: int | str,
    arg2: int | str | None = None,
    arg3: float | int | None = None,   # <- Let's deprecate this
    arg4: float | int | None = None,   # <- Let's deprecate this also
    *args,
    kwarg1: int | str,
    kwarg2: int | str | None = None,
    kwarg3: float | int | None = None,  # <- Let's deprecate this too
    kwarg4: float | int | None = None,  # <- Let's deprecate this as well
    **kwargs,
):
    ...
```
I am trying to imagine a clean way to do this with `overload`, but it seems to me I would have to construct a huge cross product of all possible combinations of types.
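The blow-up can be made concrete. Under the assumption that each deprecated parameter must be distinguishable by its presence or absence, every such parameter doubles the number of overloads needed (one overload per subset of deprecated parameters the caller might pass):

```python
from itertools import combinations

# The four parameters marked for deprecation in big_function above.
deprecated_params = ["arg3", "arg4", "kwarg3", "kwarg4"]

# One overload per subset of deprecated parameters the caller might pass.
subsets = [
    subset
    for r in range(len(deprecated_params) + 1)
    for subset in combinations(deprecated_params, r)
]
print(len(subsets))  # 2**4 = 16 overloads, before even varying the types
```

And that count is before multiplying by the union members of each parameter's type, which is why the `overload` route feels unworkable for functions like this.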