After learning about orbital angular momentum in my most recent quantum lecture, I decided to have a go at modelling the solutions and the resulting 3D probability density functions using Python. After a fair bit of trial and error, it turned out a lot better than I thought it would. Surprisingly, I had not encountered higher-order derivatives in numerical analysis before (probably because I am quite new to any kind of non-trivial modelling in physics), so I had to write a recursive function to implement them, which I thought was neat.

Briefly, just as Heisenberg’s uncertainty principle limits simultaneous knowledge of momentum and position, the three components of angular momentum cannot all be known simultaneously: the best we can do is one component together with the magnitude of the angular momentum. This leads to a series of equations for the simultaneous eigenfunctions of these two operators which, when squared, give the corresponding probability density functions in space. These can be plotted in 3D, resulting in the familiar electron orbitals for the hydrogen atom as shown in the image below (http://qa.cnx.org/contents/maDiKOfP@10/Development-of-Quantum-Theory).

My goal was to recreate these orbitals using Python.

Before writing the functions for this program, we import the necessary libraries.

```
from mpl_toolkits.mplot3d import Axes3D
import matplotlib.pyplot as plt
import numpy as np
```

I had all of these libraries on my computer already because Anaconda comes preloaded with a bunch of handy scientific tools. Next, I knew I would need a general-purpose higher-order derivative function, so I wrote the following function, recursive_deriv, which, given a function f and an order k, returns a function df(x) of one parameter that computes the k-th derivative.

```
# recursively apply a forward-difference derivative k times
def recursive_deriv(f, k):
    def compute(x, dx=0.001):
        return (f(x + dx) - f(x)) / dx
    if k == 0:
        return f
    if k == 1:
        return compute
    else:
        return recursive_deriv(compute, k - 1)
```
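As a quick sanity check (my own addition, with the function repeated so the snippet runs on its own): the second derivative of x^3 at x = 2 should come out near 6x = 12. The forward-difference scheme accumulates O(dx) error with each application, so expect only rough agreement for higher orders.

```python
def recursive_deriv(f, k):
    # forward-difference derivative, applied k times
    def compute(x, dx=0.001):
        return (f(x + dx) - f(x)) / dx
    if k == 0:
        return f
    if k == 1:
        return compute
    return recursive_deriv(compute, k - 1)

d2 = recursive_deriv(lambda x: x**3, 2)
print(d2(2.0))  # ~12.006: exact answer is 12, plus O(dx) error
```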

Throughout this post I may be quite liberal about units, since they are not important as long as they are kept consistent. Working in spherical polar coordinates, the eigenvalue equations for which we are trying to find simultaneous solutions are:

`$\hat{L}^2\, Y(\theta, \phi) = l(l+1)\hbar^2\, Y(\theta, \phi) \qquad (1)$`

`$\hat{L}_z\, Y(\theta, \phi) = m\hbar\, Y(\theta, \phi) \qquad (2)$`

In general, when we have an equation asserting that an operator acting on a vector (or, in this case, a function) equals that same vector multiplied by some constant, we call it an eigenvalue equation. For example, the energy eigenvalues are given by

`$H|\psi\rangle = E|\psi\rangle$`

where E are the possible energies and `$|\psi\rangle$` are the possible state vectors. The reason the constants m and l in equations (1) and (2) don’t seem quite so straightforward is that, in writing these constants, it is convenient to anticipate what the answer will be ahead of time and account for the possible eigenvalues by writing their form explicitly.
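As a concrete instance of such an eigenvalue equation (my own worked example): in spherical coordinates `$\hat{L}_z = -i\hbar\, \partial/\partial\phi$`, so the `$e^{im\phi}$` factor that shows up in the solutions below is an eigenfunction of `$\hat{L}_z$` with eigenvalue `$m\hbar$`:

```latex
\hat{L}_z\, e^{im\phi}
  = -i\hbar \frac{\partial}{\partial \phi} e^{im\phi}
  = -i\hbar \,(im)\, e^{im\phi}
  = m\hbar\, e^{im\phi}
```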

Now, since there are always deeper levels of understanding one may pursue in a subject, it should be sufficient to just write down the solutions to the above equations; the main attraction of this post is, after all, the modelling part.

Considering the quantum numbers `$l$` and `$m \geq 0$`, the solutions are the spherical harmonics

`$Y_l^m(\theta, \phi) = (-1)^m \sqrt{\frac{(2l+1)}{4\pi}\frac{(l-m)!}{(l+m)!}}\; P_l^m(\cos\theta)\, e^{im\phi}$`

where `$P_l^m$` are the associated Legendre polynomials,

`$P_l^m(x) = (-1)^m (1-x^2)^{m/2} \frac{d^m}{dx^m} P_l(x), \qquad P_l(x) = \frac{1}{2^l\, l!} \frac{d^l}{dx^l}(x^2-1)^l,$`

whose explicit form I looked up on Wikipedia. This is where the first instance of the recursive_deriv function is used.

```
# Legendre polynomial P_n(x) via the Rodrigues formula
def legendre(x, n):
    def f(x):
        return (x**2 - 1)**n
    df = recursive_deriv(f, n)
    p_n = 1.0 / (2**n * np.math.factorial(n)) * df(x)
    return p_n
```
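To check the numerical Legendre routine against a known value, here is a self-contained version of the same Rodrigues-formula computation (my own addition; I use the standard library’s math.factorial so the snippet runs on its own). Since `$P_2(x) = (3x^2 - 1)/2$`, evaluating at 0.5 should give something close to -0.125, up to finite-difference error.

```python
import math

def recursive_deriv(f, k):
    # forward-difference derivative, applied k times
    def compute(x, dx=0.001):
        return (f(x + dx) - f(x)) / dx
    if k == 0:
        return f
    if k == 1:
        return compute
    return recursive_deriv(compute, k - 1)

def legendre(x, n):
    # Rodrigues formula: P_n(x) = 1/(2^n n!) d^n/dx^n (x^2 - 1)^n
    df = recursive_deriv(lambda t: (t**2 - 1)**n, n)
    return 1.0 / (2**n * math.factorial(n)) * df(x)

print(legendre(0.5, 2))  # close to P_2(0.5) = -0.125
```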

So altogether I divided the problem into components (which is what physics is all about) and came up with the following functions to compute the spherical harmonic solutions.

```
# theta-dependent part: the associated Legendre polynomial P_l^m(cos(theta))
def compute_theta_term(theta, l, m):
    def f(x):
        return legendre(x, l)
    df = recursive_deriv(f, m)
    # earlier I made the mistake of assuming m is a power - it has a meaning
    result = (-1)**m * (1 - np.cos(theta)**2)**(m / 2.0) * df(np.cos(theta))
    return result

# phi-dependent part
def compute_phi_term(phi, m):
    return np.exp(1j * m * phi)

# normalisation factor
def compute_factor(l, m):
    a = (-1)**m
    s = (2 * l + 1) * np.math.factorial(l - m)
    t = 4 * np.pi * np.math.factorial(l + m)
    b = np.sqrt(s / t)
    return a * b

# full spherical harmonic Y_l^m(theta, phi)
def Y_l_m(theta, phi, l, m):
    a = compute_factor(l, m)
    b = compute_theta_term(theta, l, m)
    c = compute_phi_term(phi, m)
    if m == 0:
        return a * legendre(np.cos(theta), l) * c
    else:
        return a * b * c
```
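If you want to spot-check the pipeline in isolation, here is a condensed, self-contained restatement of the m = 0 branch checked against known values (my own addition; I use math.factorial from the standard library, since np.math was removed in NumPy 2.0). `$Y_0^0$` should be exactly the constant `$1/\sqrt{4\pi} \approx 0.28209$` (no numerical derivative is taken for l = 0), and `$Y_1^0(0) = \sqrt{3/4\pi} \approx 0.48860$` up to finite-difference error.

```python
import math
import numpy as np

def recursive_deriv(f, k):
    # forward-difference derivative, applied k times
    def compute(x, dx=0.001):
        return (f(x + dx) - f(x)) / dx
    if k == 0:
        return f
    if k == 1:
        return compute
    return recursive_deriv(compute, k - 1)

def legendre(x, n):
    # Rodrigues formula for P_n(x)
    df = recursive_deriv(lambda t: (t**2 - 1)**n, n)
    return 1.0 / (2**n * math.factorial(n)) * df(x)

def compute_factor(l, m):
    s = (2 * l + 1) * math.factorial(l - m)
    t = 4 * np.pi * math.factorial(l + m)
    return (-1)**m * np.sqrt(s / t)

# the m = 0 branch of Y_l_m from the post
def Y_l0(theta, l):
    return compute_factor(l, 0) * legendre(np.cos(theta), l)

print(Y_l0(0.7, 0))  # 1/sqrt(4*pi) ~ 0.28209, exact
print(Y_l0(0.0, 1))  # sqrt(3/(4*pi)) ~ 0.48860, approximate
```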

We’re getting closer to the orbital pictures shown at the top of the post.

Now the last step is to graph the probability density function using matplotlib’s mplot3d toolkit. I managed to do this by applying the transformation from spherical coordinates back to Cartesian coordinates and plotting a surface, as shown in the code below.

```
# make data
l = int(input("l quantum number: "))
m = int(input("m quantum number: "))
if m < 0 or m > l:
    # only m >= 0 is implemented here, as noted above
    print("Invalid quantum numbers")
else:
    theta = np.linspace(0, 2 * np.pi, 50)
    phi = np.linspace(-np.pi, np.pi, 50)
    THETA, PHI = np.meshgrid(theta, phi)
    R = np.absolute(Y_l_m(THETA, PHI, l, m))**2
    # R = np.real(Y_l_m(THETA, PHI, l, m))**2
    X = R * np.sin(THETA) * np.cos(PHI)
    Y = R * np.sin(THETA) * np.sin(PHI)
    Z = R * np.cos(THETA)
    fig = plt.figure()
    ax = fig.add_subplot(111, projection='3d')
    plot = ax.plot_surface(
        X, Y, Z, rstride=1, cstride=1, cmap=plt.get_cmap('jet'),
        linewidth=0, antialiased=False, alpha=0.5)
    plt.show()
```

Taking the real parts of these solutions yields some beautiful results. Now, the real utility of these solutions lies in the probability density function, which is obtained by taking the squared modulus of the wavefunction. This tells you the probability of finding the electron in a given direction from the nucleus - it isn’t quite the electron cloud orbitals, but some of the resulting shapes might look familiar, because the orbital clouds are just the places where the electron is very likely to be found. The likelihood of the electron being found between `$\phi \rightarrow \phi + d\phi$` and `$\theta \rightarrow \theta + d\theta$` is given by

`$P(\theta, \phi)\, d\theta\, d\phi = |Y_l^m(\theta, \phi)|^2 \sin\theta\, d\theta\, d\phi$`

where the factor of `$\sin\theta$` is present when integrating over solid angles. This equation for the probability can be used to derive the familiar spdf orbitals in chemistry.
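That `$\sin\theta$` weighting is exactly what makes the probabilities integrate to one over the sphere. As a numerical illustration (my own addition, using the closed-form `$Y_1^0(\theta) = \sqrt{3/4\pi}\cos\theta$` rather than the numerical routines above), integrating `$|Y_1^0|^2 \sin\theta$` over all angles should give 1:

```python
import numpy as np

# closed-form Y_1^0(theta) = sqrt(3/(4*pi)) * cos(theta)
def Y10(theta):
    return np.sqrt(3.0 / (4.0 * np.pi)) * np.cos(theta)

# integrate |Y|^2 sin(theta) over the sphere with the trapezoidal rule;
# |Y_1^0|^2 has no phi dependence, so the phi integral is just 2*pi
theta = np.linspace(0.0, np.pi, 2001)
dtheta = theta[1] - theta[0]
integrand = np.abs(Y10(theta))**2 * np.sin(theta)
total = 2 * np.pi * np.sum((integrand[:-1] + integrand[1:]) / 2) * dtheta
print(total)  # ~1.0
```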

And there you have it. Whew. I think I worked on this for a bit longer than would be considered reasonable. Ah well.