Functools - The Power of Higher-Order Functions in Python

This article was published as a part of the Data Science Blogathon

Introduction

The Python Standard Library has many great modules that help keep your code cleaner and simpler, and functools is definitely one of them.

Caching

Let's start with some of the simplest yet most powerful functions of the functools module: the caching functions (which are also decorators) – lru_cache, cache and cached_property. The first of them, lru_cache, provides a least-recently-used cache of the most recent results of a function's execution, or in other words, remembers the results of its work:

from functools import lru_cache
import requests

@lru_cache(maxsize=32)
def get_with_cache(url):
    try:
        r = requests.get(url)
        return r.text
    except requests.RequestException:
        return "Not Found"

for url in ["https://google.com/", "https://reddit.com/", "https://google.com/", "https://google.com/"]:
    get_with_cache(url)

print(get_with_cache.cache_info())
# CacheInfo(hits=2, misses=2, maxsize=32, currsize=2)
print(get_with_cache.cache_parameters())
# {'maxsize': 32, 'typed': False}

In this example, we make GET requests and cache their results (up to 32 of them) using the @lru_cache decorator. To see whether the caching actually works, you can check the function's cache with the cache_info method, which shows the number of cache hits and misses. The decorator also provides the cache_clear and cache_parameters methods for discarding the cached results and inspecting the cache parameters, respectively.

If you need more granular caching, you can pass the optional argument typed=True, which makes arguments of different types be cached separately.
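
For instance, here is a minimal sketch (the square function is made up for illustration) showing that with typed=True the calls square(3) and square(3.0) are cached as two separate entries:

from functools import lru_cache

@lru_cache(maxsize=32, typed=True)
def square(x):
    return x * x

square(3)     # cached under the int argument 3
square(3.0)   # cached separately under the float argument 3.0
print(square.cache_info())
# CacheInfo(hits=0, misses=2, maxsize=32, currsize=2)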

Another decorator for caching in functools is the function simply called cache. It is a thin wrapper around lru_cache that sets maxsize to None, so the cache can grow without bound and old values are never evicted.
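
As a minimal sketch (note that cache was added in Python 3.9; the recursive Fibonacci function below is just a stand-in for any expensive pure function):

from functools import cache

@cache
def fibonacci(n):
    # Each fibonacci(n) is computed only once and then served from the cache
    return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(100))
# 354224848179261915075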

Another decorator you can use for caching is cached_property. As the name suggests, it is used to cache the result of a computed class attribute. This is very useful if you have a property that is expensive to compute but does not change once computed.

from functools import cached_property

class Page:
    @cached_property
    def render(self):
        # Long computation that renders the HTML page...
        html = "<html>...</html>"  # placeholder for the real rendered page
        return html

This simple example shows how cached_property can be used, for example, to cache a rendered HTML page that needs to be shown to the user over and over again. The same can be done for certain database queries or lengthy mathematical computations.

Another nice thing about cached_property is that it only runs on lookup, and the result is stored as an ordinary instance attribute, so we are allowed to modify it. Overwriting the attribute simply replaces the cached value, and clearing the cache is as easy as deleting the attribute; the next lookup will then recompute the value and cache it again.
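
Here is a short sketch of that behaviour, reusing the hypothetical Page class from above:

page = Page()
html = page.render             # first lookup computes and caches the value
page.render = "<html></html>"  # overwrites the cached value directly
del page.render                # clears the cache...
html = page.render             # ...so this lookup recomputes and caches again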

I want to end this section with a caveat about all of the above decorators: don't use them if your function has side effects or if it is supposed to create a fresh mutable object on every call, since those are clearly not the kinds of functions you want to cache.
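
To illustrate the second point with a tiny sketch: if a cached function returns a mutable object, every caller gets the same cached instance, so a mutation in one place is visible everywhere.

from functools import lru_cache

@lru_cache(maxsize=None)
def get_list():
    return []

a = get_list()
b = get_list()
a.append(1)
print(b)
# [1] – both names refer to the single cached list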

Comparison and ordering

You probably already know that you can implement comparison operators in Python, such as <, >= or ==, with the __lt__, __gt__ or __eq__ magic methods. However, it can be quite tedious to implement every one of __eq__, __lt__, __le__, __gt__ and __ge__. Fortunately, functools includes the @total_ordering decorator that can help us with this: all we need to implement is __eq__ and one of the remaining methods, and the rest will be generated by the decorator automatically.

from functools import total_ordering

@total_ordering
class Number:
    def __init__(self, value):
        self.value = value

    def __lt__(self, other):
        return self.value < other.value

    def __eq__(self, other):
        return self.value == other.value

print(Number(1) < Number(3))
# True
print(Number(1) <= Number(15))
# True
print(Number(10) <= Number(2))
# False

This way, we get all the rich comparison operations despite having implemented only __eq__ and __lt__ by hand. The most obvious benefit is the convenience of not having to write all those additional magic methods, but more important is the reduction in the amount of code and the improved readability.

Overload

We've probably all been taught that there is no function overloading in Python, but there is actually an easy way to implement it using two functions from functools, namely singledispatch and singledispatchmethod. These decorators implement what is called single dispatch: the implementation that gets called is chosen based on the runtime type of the first argument, which lets a dynamically typed language such as Python distinguish between types at runtime.
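
The original section has no snippet here, so here is a minimal sketch of singledispatch; the function name format_value and its registered implementations are made up for illustration:

from functools import singledispatch

@singledispatch
def format_value(value):
    # Fallback used for any type without a registered implementation
    return str(value)

@format_value.register
def _(value: int):
    return f"int: {value:,}"

@format_value.register
def _(value: list):
    return ", ".join(format_value(v) for v in value)

print(format_value(10000))      # int: 10,000
print(format_value([1, 2, 3]))  # int: 1, int: 2, int: 3
print(format_value(3.14))       # 3.14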

Partial

We all work with various external libraries and frameworks, many of which provide functions and interfaces that require us to pass callbacks, for example for asynchronous operations or for listening to events. This is nothing new, but what if we also need to pass some arguments along with the callback? This is where functools.partial comes in handy. partial can be used to freeze some (or all) of a function's arguments, creating a new object with a simplified function signature. Confused? Let's take a look at some practical examples:

import logging
from multiprocessing import Pool
from functools import partial

def output_result(result, log=None):
    if log is not None:
        log.debug(f"Result is: {result}")

def concat(a, b):
    return a + b

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("default")

p = Pool()
p.apply_async(concat, ("Hello ", "World"), callback=partial(output_result, log=logger))
p.close()
p.join()

The code above shows how you can use partial to pass a function (output_result) along with an argument (log=logger) as a callback. In this case, we use Pool.apply_async from multiprocessing, which asynchronously computes the result of the function (concat) and passes that result to the callback. However, apply_async will always pass the result as the first argument, and if we want to include any additional arguments, as is the case with log=logger, we need to use partial.
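
To make the mechanics clearer, the partial object built here can also be called directly; it behaves like output_result with log already frozen (this small sketch assumes the logger from the snippet above):

callback = partial(output_result, log=logger)
callback("Hello World")  # same as output_result("Hello World", log=logger)
# DEBUG:default:Result is: Hello World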

We have considered a fairly advanced use case; a simpler example would be creating a print function that writes to stderr instead of stdout:

import sys
from functools import partial

print_stderr = partial(print, file=sys.stderr)
print_stderr("This goes to standard error output")

With this simple trick, we created a new callable that always passes file=sys.stderr as a keyword argument to print, which simplifies our code because we no longer have to specify the keyword argument every time.

And one last good example. We can use partial in conjunction with the little-known two-argument form of iter to create an iterator by passing a callable object and a sentinel value to iter, which can be applied like this:

from functools import partial

RECORD_SIZE = 64

# Read binary file in fixed-size records...
with open("file.data", "rb") as file:
    records = iter(partial(file.read, RECORD_SIZE), b'')
    for r in records:
        # Do something with the record...
        pass

Usually, when reading a file, we want to iterate over lines, but in the case of binary data, we may need to iterate over records of a fixed size. You can do this by using partial to create a callable object that reads the specified chunk of data and passing it to iter to create an iterator. This iterator then calls the read function until the end of the file is reached, always reading only the specified chunk size (RECORD_SIZE). Finally, when the end of the file is reached, the sentinel value (b'') is returned and the iteration stops.
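
If the two-argument form of iter looks too magical, it is roughly equivalent to this explicit loop (a sketch using the same hypothetical file and RECORD_SIZE as above):

with open("file.data", "rb") as file:
    while True:
        r = file.read(RECORD_SIZE)
        if r == b'':  # the sentinel value – end of file reached
            break
        # Do something with the record...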

Decorators

We've already talked about some decorators in the previous sections, but not about decorators that help you build other decorators. One such decorator is functools.wraps. To understand why you need it, let's just look at an example:

def decorator(func):
    def actual_func(*args, **kwargs):
        """Inner function within decorator, which does the actual work"""
        print(f"Before Calling {func.__name__}")
        func(*args, **kwargs)
        print(f"After Calling {func.__name__}")
    return actual_func

@decorator
def greet(name):
    """Says hello to somebody"""
    print(f"Hello, {name}!")

greet("Martin")
# Before Calling greet
# Hello, Martin!
# After Calling greet

This example shows how a simple decorator can be implemented. The inner function that performs the actual work (actual_func) is wrapped by the outer decorator function, which can then be applied to other functions, as is the case with greet. When you call greet, you will see that it prints messages both from actual_func and from greet itself. Looks okay, doesn't it? But what happens if we do this:

print(greet.__name__)
# actual_func
print(greet.__doc__)
# Inner function within decorator, which does the actual work

When we inspect the name and docstring of the decorated function, we realize that they have been replaced with the values from the inner decorator function. This is bad: we don't want the names and documentation of all our functions overwritten whenever we apply a decorator. How can this problem be solved? Of course, with functools.wraps:

from functools import wraps

def decorator(func):
    @wraps(func)
    def actual_func(*args, **kwargs):
        """Inner function within decorator, which does the actual work"""
        print(f"Before Calling {func.__name__}")
        func(*args, **kwargs)
        print(f"After Calling {func.__name__}")
    return actual_func

@decorator
def greet(name):
    """Says hello to somebody"""
    print(f"Hello, {name}!")

print(greet.__name__)
# greet
print(greet.__doc__)
# Says hello to somebody

The sole purpose of wraps is to copy the name, documentation, argument list, etc., to prevent them from being overwritten. Considering that wraps is itself a decorator, you can simply add it to our actual_func, and the problem is solved!
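
If you are curious what exactly gets copied, functools exposes the list of attributes as WRAPPER_ASSIGNMENTS, and wraps additionally stores a reference to the original function in the __wrapped__ attribute (the snippet below assumes the decorated greet from the example above):

import functools

# Attributes copied from the wrapped function by default
print(functools.WRAPPER_ASSIGNMENTS)
# e.g. ('__module__', '__name__', '__qualname__', '__annotations__', '__doc__')

# @wraps also keeps a reference to the original, undecorated function
print(greet.__wrapped__)
# <function greet at 0x...>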

Reduce

Last but not least in the functools module is reduce. You might know it from other languages as fold (Haskell). This function takes an iterable and folds (accumulates) all of its values into one. There are many applications for this, for example:

from functools import reduce
import operator

def product(iterable):
    return reduce(operator.mul, iterable, 1)

def factorial(n):
    return reduce(operator.mul, range(1, n + 1))

def sum(numbers):  # Use `sum` function from standard library instead
    return reduce(operator.add, numbers, 0)

def reverse(iterable):
    return reduce(lambda x, y: y + x, iterable)

print(product([1, 2, 3]))
# 6
print(factorial(5))
# 120
print(sum([2, 6, 8, 3]))
# 19
print(reverse("hello"))
# olleh

As you can see from the code, reduce can simplify or condense the code into one line, which would otherwise be much longer. With that said, it is usually a bad idea to abuse this function just for the sake of shrinking code, making it “smarter”, as it quickly becomes scary and unreadable. For this reason, in my opinion, it should be used sparingly.

And since reduce often condenses everything into a single line, it can be nicely combined with partial:

product = partial(reduce, operator.mul)

print(product([1, 2, 3]))
# 6

And finally, if you need more than just the final "collapsed" result, you can use accumulate from another great module, itertools. To compute a running maximum, it can be used as follows:

from itertools import accumulate

data = [3, 4, 1, 3, 5, 6, 9, 0, 1]

print(list(accumulate(data, max)))
# [3, 4, 4, 4, 5, 6, 9, 9, 9]

Conclusion

As you can see, functools has many useful functions and decorators that can make your life easier, but this is just the tip of the iceberg. As I said at the beginning, the Python standard library contains many functions that help you write better code, so in addition to the functions covered here, you may want to look at other modules, such as operator or itertools. For any queries, drop them in the comment box and I will do my best to answer. You can also reach me on LinkedIn:

https://www.linkedin.com/in/shivani-sharma-aba6141b6/

The media shown in this article are not owned by Analytics Vidhya and are used at the Author’s discretion.

Source: https://www.analyticsvidhya.com/blog/2021/08/functools-the-power-of-higher-order-functions-in-python/
