Anyway, in a burst of structured procrastination, I hammered out a short Matlab program to do just this for simple one-dimensional problems. It's great—type in an arbitrary potential, and a few seconds later you can see the eigenstates and eigenvalues.

The "time independent Schrödinger equation" is simply the idea that there exists an operator `H` whose eigenstates are states of definite energy, and whose eigenvalues are those energies:

`H |ψ> = E |ψ>`

where `H` is an operator and `E` is a number. `|ψ>` is just a way of writing a vector; we might as well write it **v** or `ψ(x)`, the latter notation making it clear that `ψ` is a function over a continuous coordinate `x` (making it an infinite-dimensional vector).

The basic problem is to find a set of eigenvalues `E_n` and eigenstates `ψ_n(x)` given the operator `H`, which is called the Hamiltonian.

The Hamiltonian (for a one-dimensional problem) typically has the form:

`H = p^2/(2m) + V(x)`

where `p` is an operator that gives the momentum (making `p^2/(2m)` the kinetic energy), and `V` is an operator that gives the potential energy. It turns out that

`p = -i ℏ (∂/∂x)`

where ℏ is the reduced Planck constant, and `(∂/∂x)` is the derivative with respect to x.

From linear algebra we know that there is a one-to-one mapping between matrices and operators. To solve this sort of problem numerically, we just have to discretize everything and write our operators as matrices.

The main ingredient we need is a discrete derivative. This is not so hard: we can just estimate the derivative by computing the difference between two points and dividing by Δx.

`f'(x) = lim_{h→0} (f(x+h) - f(x)) / h`

`f'(x) ≈ (f(x+Δx) - f(x)) / Δx`

Now if you imagine `f(x)` as a column vector over our discretized values of x,

**f** = [f(x_1) f(x_2) f(x_3) ... f(x_N)]^T

then it's clear that the derivative operator will look something like:

`D = [-1 1 0 0 ...; 0 -1 1 0 ...; 0 0 -1 1 ...] / Δx`

where `Δx = x_{n+1} - x_n` is the grid spacing.

It turns out that this *almost* works, but we really need the derivative operator to be symmetric. I'm not sure exactly why, but the practical problem is that `D^n` gets shifted farther and farther to the right as we raise n with this asymmetric version. A better form is to use a symmetric difference:

`f'(x) = lim_{h→0} (f(x+h) - f(x-h)) / (2h)`

`f'(x) ≈ (f(x+Δx) - f(x-Δx)) / (2 Δx)`
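A side benefit of the symmetric form, beyond what the post mentions: it is second-order accurate, so halving h cuts the error by about 4x rather than 2x. A quick hypothetical check in Python/NumPy (the post's program is Matlab):

```python
import numpy as np

# Compare forward and centered differences for f = sin at x0 = 1,
# where the true derivative is cos(1).
f, x0 = np.sin, 1.0
for h in (1e-2, 5e-3):
    fwd = (f(x0 + h) - f(x0)) / h
    ctr = (f(x0 + h) - f(x0 - h)) / (2 * h)
    print(h, abs(fwd - np.cos(x0)), abs(ctr - np.cos(x0)))
```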

That does work, but unfortunately it doesn't give a very good form of `D^2` for the second derivative, which is what we need. I'll just punt and define the second-derivative estimator directly:

`f''(x) = lim_{h→0} (f(x+h) - 2 f(x) + f(x-h)) / h^2`

`f''(x) ≈ (f(x+Δx) - 2 f(x) + f(x-Δx)) / (Δx)^2`

which, in matrix form, looks like

`D^2 = [-2 1 0 0; 1 -2 1 0; 0 1 -2 1; 0 0 1 -2] / (Δx)^2`

which is symmetric, as desired.
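In code, this matrix can be built and its symmetry checked directly (a Python/NumPy sketch, with signs chosen so that applying it to a sampled function approximates `f''` itself: -2 on the diagonal, 1 on the off-diagonals):

```python
import numpy as np

# Three-point second derivative: (f(x+dx) - 2 f(x) + f(x-dx)) / dx^2.
N = 200
x = np.linspace(0, np.pi, N)
dx = x[1] - x[0]
D2 = (np.diag(-2 * np.ones(N)) +
      np.diag(np.ones(N - 1), 1) +
      np.diag(np.ones(N - 1), -1)) / dx ** 2

print(np.allclose(D2, D2.T))      # symmetric, as desired

f = np.sin(x)                     # true second derivative is -sin(x)
err = D2 @ f + np.sin(x)
print(np.max(np.abs(err[1:-1])))  # small away from the boundary rows
```

The first and last rows are missing a neighbor, which is effectively a zero (Dirichlet) boundary condition on the wavefunction.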

(I'm not sure if there's a good form of the first derivative operator `D` that will give a nice form of the second derivative `D^2`. Perhaps having `D` and `D^2` is enough to compute nice versions of all higher derivatives.)

One of the simplest problems in one-dimensional quantum mechanics is the "square well", where the potential energy is zero inside of a certain range and non-zero (or infinite) outside:

V(x) = 0 if |x| < L, and V(x) = V_0 otherwise

Now we have everything we need to put together

`H = -D^2 + V`

(ignoring all physical constants). Then we solve for the eigenfunctions and eigenvalues of H, as described in my previous entry.

Without further ado, we find (numerically!) the usual eigenfunctions:
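A minimal end-to-end sketch of the same computation in Python/NumPy (the post's program is Matlab; the grid, `L`, and `V0` here are arbitrary illustrative choices, not values from the post):

```python
import numpy as np

# Square-well Hamiltonian H = -D^2 + V, physical constants dropped as in the post.
N = 500
x = np.linspace(-2.0, 2.0, N)
dx = x[1] - x[0]
D2 = (np.diag(-2 * np.ones(N)) +
      np.diag(np.ones(N - 1), 1) +
      np.diag(np.ones(N - 1), -1)) / dx ** 2

L, V0 = 1.0, 1e4                    # well half-width and wall height (assumed)
V = np.where(np.abs(x) < L, 0.0, V0)
H = -D2 + np.diag(V)

E, psi = np.linalg.eigh(H)          # eigenvalues ascending; columns of psi are eigenstates
print(E[:4])                        # lowest few energies
```

For a deep well the energy ratios come out close to 1 : 4 : 9 : 16, i.e. the `n^2` scaling mentioned below.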

And the eigenvalues scale like `n^2`, as expected:

Anyway, try playing with the complete Matlab program. It's fun! Just put in a potential, run the program, and watch the eigenfunctions converge...

Notes:

- I assume this would be totally impractical for problems in more than one dimension, though of course for separable problems you can solve for each separable part individually.
- There's no reason, necessarily, to do this all in the x basis. We could instead solve for the Fourier or Laguerre series of the eigenfunction, which might converge faster and/or have a nicer form for the derivative.
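On the last point, the Fourier basis is especially nice because the second derivative is diagonal there: `d^2/dx^2 e^{ikx} = -k^2 e^{ikx}`. A quick FFT check of this (a hypothetical Python/NumPy sketch assuming periodic boundary conditions, not from the post):

```python
import numpy as np

# Differentiate f(x) = sin(3x) spectrally: FFT, multiply by -k^2, inverse FFT.
N = 128
Lbox = 2 * np.pi
x = np.arange(N) * Lbox / N
k = 2 * np.pi * np.fft.fftfreq(N, d=Lbox / N)   # angular wavenumbers

f = np.sin(3 * x)                                # true f'' = -9 sin(3x)
fpp = np.fft.ifft(-(k ** 2) * np.fft.fft(f)).real
print(np.max(np.abs(fpp - (-9 * np.sin(3 * x)))))
```

The error is at machine precision rather than some power of Δx, which is the sense in which such a basis "might converge faster."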