This article is an edited version of an article on the Finxter blog.

The `math.factorial()` function is one of many functions in the `math` module. In this article, we will explore the mathematical properties of the factorial function using Python’s Matplotlib and NumPy libraries.

## What is the Factorial Function?

The factorial of a positive integer *n* is just the product of all the integers from 1 to *n*. The standard shorthand for this is to write an exclamation mark after *n* (read *“n factorial”*):

**n! = 1 · 2 · ⋯ · n**

For instance, *4!* is just

**1 · 2 · 3 · 4 = 24**

We can rewrite the above formula in a recursive form:

**n! := n · (n−1)!**

Actually, we *define* *0! = 1* to make this formula hold for all positive integers *n*.

This formula provides a “naive” way of computing the factorial in Python:

```python
def funct(n):
    # the factorial of n, calculated using recursion
    if n == 0:
        return 1
    else:
        return n * funct(n - 1)

print(funct(4))
```

Are there faster ways of calculating the factorial? In fact, `math.factorial()` in Python 3 uses a “divide-and-conquer” algorithm, which is significantly faster than the “naive” recursive algorithm above.
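To get a feel for the idea, here is a simplified sketch of a divide-and-conquer factorial (this illustrates the binary-splitting idea only; it is not CPython’s exact implementation). The product over 1…n is split recursively in half, which keeps the intermediate products more balanced in size:

```python
import math

def product_range(lo, hi):
    # product of the integers lo, lo+1, ..., hi via binary splitting
    if lo > hi:
        return 1          # empty product (this also makes factorial_dc(0) == 1)
    if lo == hi:
        return lo
    mid = (lo + hi) // 2
    return product_range(lo, mid) * product_range(mid + 1, hi)

def factorial_dc(n):
    return product_range(1, n)

print(factorial_dc(10))    # 3628800
print(math.factorial(10))  # 3628800
```

Multiplying two numbers of similar size is cheaper (per digit) than repeatedly multiplying a huge running product by a small factor, which is where the speedup comes from for large *n*.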

The factorial function is used in *combinatorics*, a branch of mathematics concerned with counting discrete structures. Below, we consider two very important examples from introductory combinatorics.

For our first example, we count the number of ways to rearrange four books, labeled 1 through 4. Let’s think about how many ways we can place a book in a given slot on the bookshelf. In the first slot, there are four options, since we can put any of the four books there. There are only three options for the next slot, since we’ve already put a book in the first slot. By the same argument, the third and fourth slots have even fewer options: two and one, respectively. Thus, the number of ways to arrange the four books is

**1 · 2 · 3 · 4 = 24**

Or simply, **4!**. Each such ordering is called a *permutation* of the four books.
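We can verify this count directly with the standard library’s `itertools.permutations`, which generates every ordering of a sequence:

```python
from itertools import permutations

books = [1, 2, 3, 4]
orderings = list(permutations(books))
print(len(orderings))  # 24, i.e. 4!
```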

Let’s think of another problem. Suppose now that we have six balls, 2 black and 4 white. We will assume that two balls with the same color are identical, so we cannot tell apart one black ball from the other. The same is true for the white ones as well.

How many ways are there to arrange the balls in a straight line? This is a different problem from before since we were able to distinguish each book.

To think about this problem, let’s suppose we label the balls, say **b1, b2** and **w1, w2, w3, w4**. (We will take the labels off at the end of the calculation to make the balls indistinguishable again.) Then how many ways are there to rearrange the labeled balls? By the same argument as in the bookshelf example, there are **6!** ways of arranging the balls. However, since we said the balls with the same colors are indistinguishable, we must account for this in our calculation.

To do this, we need to think about how many ways we can arrange the labeled balls for a given configuration of the unlabeled balls. For example, if we have the configuration

**bbwwww**

after erasing the labels, then what are the possible ways that the balls could have been labeled? Some possibilities include

*b1 b2 w4 w1 w2 w3*

and

*b2 b1 w2 w1 w3 w4*

You can see after a while that if you just arrange the black balls in any way you like, arrange the white balls in any way you like, and then put the two together, you get a valid configuration. But we can use the argument from the bookshelf example to calculate the number of ways we can arrange the white and black balls, respectively. Therefore, the number of labeled ball configurations corresponding to **bbwwww** is just

**2! · 4! = 48**

Going back to the original problem, we see that the number of ways of rearranging the *unlabeled* balls is

$$\frac{6!}{2! \cdot 4!} = \frac{720}{48} = 15$$

In textbooks, you will see this written as

**C(6, 2)**

or sometimes

$$\binom{6}{2}$$

(The second one is read *“six choose two”* precisely because we are choosing where the two balls go out of six possible spots.) This is called a *binomial coefficient* because it is the coefficient of $x^2 y^4$
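We can check this count in Python in two ways: `math.comb` computes binomial coefficients directly (available since Python 3.8), and a brute-force enumeration of the distinct arrangements agrees:

```python
import math
from itertools import permutations

# binomial coefficient "6 choose 2"
print(math.comb(6, 2))

# brute force: the number of distinct arrangements of "bbwwww"
arrangements = set(permutations("bbwwww"))
print(len(arrangements))
```

Both print 15: the `set` collapses labeled arrangements that look identical once the labels are erased, which is exactly the division by 2! · 4! in the formula.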

## Asymptotic Behavior of the Factorial Function

Computer scientists often care about the run times of algorithms. To study this, they consider the *asymptotic behavior* of a given function *f(n)*: how quickly or slowly the function *f* grows for large *n*. In this section, we think about the asymptotic behavior of the factorial function.

Let’s start with something simple, and try comparing the growth of the factorial function with the linear, quadratic, and exponential functions:

```python
import math
import numpy as np
import matplotlib.pyplot as plt

linear = list(range(1, 11))
quadratic = [n**2 for n in linear]
exponential = [2**n for n in linear]
factorial = [math.factorial(n) for n in linear]

data = np.array([linear, quadratic, exponential, factorial])
fig = plt.figure(figsize=(8, 2))
ax = fig.add_subplot(111)
table = ax.table(cellText=data,
                 rowLabels=["$n$", "$n^2$", "$2^n$", "$n!$"],
                 loc='center')
table.set_fontsize(60)
table.scale(3, 5)
ax.axis('off')
plt.show()
```

We can see from the table that the factorial function grows very quickly, in fact much faster than the exponential function.

Let’s try to study the factorial function more closely. We want to find a formula that gives a sense of *how fast* the factorial function grows. In this section, we will calculate a formula which is “good enough” for many computer science calculations. (For a more technical discussion, see the appendix.)

Instead of working with **n!**, we will study **ln n!**. From the definition of the factorial and a basic property of the natural logarithm, we can rewrite this as

$$\ln n! = \ln 1 + \ln 2 + \cdots + \ln n = \sum_{k=1}^{n} \ln k$$

But the above sum is a good approximation to the integral of **ln x**, so the above is *approximately* equal to

$$\int_1^n \ln x \, dx = n \ln n - n + 1$$

(We can make this into an airtight proof by observing that the sum is a Riemann sum of the integral.) Since the dominant term on the right is **n ln n**, we expect **ln n!** and **n ln n** to grow at the same speed.
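We can check numerically how close the sum is to the integral, using `math.lgamma` from the standard library: `lgamma(x)` returns $\ln \Gamma(x)$, so `lgamma(n + 1)` equals $\ln n!$ without ever computing the huge integer $n!$:

```python
import math

for n in [10, 100, 1000]:
    log_fact = math.lgamma(n + 1)        # ln(n!) without computing n! itself
    integral = n * math.log(n) - n + 1   # closed form of the integral of ln x from 1 to n
    print(n, round(log_fact, 2), round(integral, 2))
```

The two columns stay close, and their relative difference shrinks as *n* grows.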

The technical term for this “sameness” is *asymptotic equality*. For two sequences **a_n**, **b_n** (with **b_n** nonzero after some large enough **n**), we say **a_n** and **b_n** are asymptotically equal (written **a_n ∼ b_n**) if their ratio approaches 1 for large **n**. In calculus notation, we can write this as:

$$a_n \sim b_n \iff \lim_{n \to \infty} \frac{a_n}{b_n} = 1$$

Using this terminology, we have our asymptotic description of **ln n!**:

$$\ln n! \sim n \ln n$$

Notice that this does *not* mean **n!** is asymptotically equal to **n^n**. More generally, **a_n ∼ b_n** does *not* imply asymptotic equality of **e^(a_n)** and **e^(b_n)**. Try taking, for example, **a_n = n** and **b_n = n + ln n**: the ratio **a_n / b_n** tends to 1, yet **e^(b_n) / e^(a_n) = n** grows without bound.
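This counterexample is easy to check numerically with **a_n = n** and **b_n = n + ln n** (we compute **e^(b_n − a_n)** rather than the individual exponentials, which would overflow for large *n*):

```python
import math

for n in [10, 1000, 10**6]:
    a, b = n, n + math.log(n)
    # a/b tends to 1, but exp(b)/exp(a) = exp(b - a) = n grows without bound
    print(n, a / b, math.exp(b - a))
```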

Let’s confirm our calculation by generating a plot:

```python
import matplotlib.pyplot as plt
import numpy as np
import math

n = np.arange(2, 140, 1)
fn = [k * np.log(k) / np.log(float(math.factorial(k))) for k in n]

plt.title(r"Plot of $\frac{n \ln n}{\ln n!}$")
plt.scatter(n, fn)
plt.show()
```

Observe how the plot approaches 1 for large values of **n**. This is consistent with our calculation that **ln n! ∼ n ln n**.

## Appendix: Stirling’s Formula.

This section covers some technical aspects of the factorial function.

In the previous section, we could only calculate the asymptotic behavior of **ln n!** and not of **n!** itself. This appendix will discuss an important formula that precisely describes the asymptotic behavior of **n!**.

Before diving into the main discussion, we mention a second way of describing the factorial. The *gamma function* is given by the improper integral

$$\Gamma(x) = \int_0^\infty t^{x-1} e^{-t} \, dt$$

The gamma function is part of the `math` module of the Python Standard Library. You can compute it using `math.gamma()`:

```python
import math

for k in range(1, 7):
    print("Γ(" + str(k) + ")= " + str(math.gamma(k)))
```

Output:

```
Γ(1)= 1.0
Γ(2)= 1.0
Γ(3)= 2.0
Γ(4)= 6.0
Γ(5)= 24.0
Γ(6)= 120.0
```

Looking at the numbers carefully, you notice that the first six values are precisely the factorials of 0 through 5. You can show (using either integration by parts from high-school calculus, or alternatively differentiation under the integral sign) that

$$\Gamma(n + 1) = n!$$

This is our second description of the factorial function. Since the integral of a function is just the area under its graph, **n!** is the area under the graph of **f_n(t) = t^n e^(−t)** for **t** from 0 to ∞.

What does this look like? Using Matplotlib, we can plot the functions out for the first few values of **n**:

```python
import matplotlib.pyplot as plt
import numpy as np
import seaborn as sns; sns.set()

vals = np.linspace(0, 10, 100)
for n in range(1, 5):
    plt.plot(vals, vals**n * np.exp(-vals), label=f'n = {n}')

plt.title("$f_n(t) = t^n e^{-t}$ for small $n$.")
plt.legend()
plt.show()
```

If you are familiar with statistics, you may notice that these graphs look somewhat similar to the normal distribution, especially for larger values of **n**. This is a crucial observation for getting an asymptotic formula for **n!**.

```python
import matplotlib.pyplot as plt
import numpy as np
import seaborn as sns; sns.set()

vals1 = np.linspace(-3, 3, 100)
plt.plot(vals1, np.exp(-vals1**2))
plt.title("The Normal Distribution.")
plt.show()
```

There are various tricks for evaluating the integral of normal distribution curves. If we apply those tricks to the function **f_n(t)** (with appropriate modifications, of course), we get the asymptotic formula for **n!**:

$$n! \sim \sqrt{2 \pi n} \left( \frac{n}{e} \right)^n$$

This formula is called *Stirling’s formula*. It is very useful for getting approximate values of **n!** for large values of **n**:

```python
import math
import numpy as np
import matplotlib.pyplot as plt

lst1 = list(range(0, 10))
factorial = [math.factorial(n) for n in lst1]
stirling = [round(np.sqrt(2 * np.pi * n) * (n / math.e)**n, 1) for n in lst1]
stirling_error = [str(round(100 * abs(stirling[n] - factorial[n]) / factorial[n], 2)) + "%"
                  for n in range(0, 10)]

data = np.array([lst1, factorial, stirling, stirling_error])
fig = plt.figure(figsize=(8, 2))
ax = fig.add_subplot(111)
table = ax.table(cellText=data,
                 rowLabels=["$n$", "$n!$", "Stirling", "Percent Error"],
                 loc='center')
table.set_fontsize(60)
table.scale(3, 5)
ax.axis('off')
plt.show()
```

We can see from the above Python code that the percent error of Stirling’s formula falls well under 1% after the first few values of **n**. This is quite remarkable, since **n!** can be difficult to compute directly for larger values of **n**, but Stirling’s formula is fairly easy to evaluate.
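As a final check at a single, arbitrarily chosen larger value, n = 100, the ratio of Stirling’s approximation to the exact value is already within about a tenth of a percent of 1:

```python
import math

n = 100
exact = math.factorial(n)
approx = math.sqrt(2 * math.pi * n) * (n / math.e)**n
print(approx / exact)  # slightly below 1
```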