(Damped) Random Walks
Inspired by reading http://arxiv.org/pdf/1312.3966v1.pdf. This notebook demonstrates some one-dimensional (1D) random walks in Python using NumPy arrays, which is hopefully adequately fast and clear.
I'll try to be efficient when operating on arrays, but not when creating them: Python is pretty awkward about array creation, and we should only have to create each array once anyway.
In [2]:
%pylab inline
from IPython.display import Image
from __future__ import division
This is just a simple test to see what the built-in rand() numbers look like.
In [3]:
listplot = lambda l: plot(*(zip(*l))) # so idiomatic I need to splat twice
listplot([(i,rand()) for i in range(100)])
Out[3]:
Here we define the partial sum \(P\) of a sequence \(S\):
\[ P_i=\sum_{k\leq i} S_k \]
In [5]:
def partialsums(a): # take the partial sums of some sequence (probably a numpy array)
    a = a.copy() # preserve the original array; without this the input is modified in place
    for i in range(a.size-1):
        a[i+1] += a[i]
    return a
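For reference (my note, not part of the original notebook), NumPy already provides this operation as `np.cumsum`, which computes the same partial sums without an explicit Python loop and leaves the input untouched:

```python
import numpy as np

s = np.array([1.0, -2.0, 3.0, 0.5])
p = np.cumsum(s)  # partial sums: p[i] = s[0] + ... + s[i]
# p is [1.0, -1.0, 2.0, 2.5]; s is unchanged
```

This is also much faster than the hand-rolled loop for large arrays, since the summation happens in compiled code.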
Now we can go ahead and take an actual random walk. Here we generate two independent one-dimensional walks and plot one against the other.
In [11]:
from numpy.random import choice
nsteps = int(1e5) # slows down after 1e6
def walk1d(dist='uniform'): # random walk assuming isotropy and homogeneity
    steps = { # if this dict is built outside the function, every call reuses the same steps
        'uniform' : uniform(-1,1,nsteps),
        'gaussian' : normal(size=nsteps),
        'discrete' : choice(array([-1,1]),nsteps)}
    return partialsums(steps[dist])
xs = walk1d('uniform')
ys = walk1d('uniform')
plot(xs,ys)
Out[11]:
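A quick sanity check (my addition, with arbitrary parameter choices): for an unbiased walk the mean squared displacement should grow linearly with the number of steps, \(E[X_n^2] = n\) for the discrete \(\pm 1\) case.

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded so the result is reproducible
n, trials = 10_000, 200
steps = rng.choice([-1, 1], size=(trials, n))  # discrete +/-1 steps
endpoints = steps.sum(axis=1)                  # position after n steps, per trial
msd = np.mean(endpoints.astype(float) ** 2)    # mean squared displacement
# For an unbiased +/-1 walk, E[X_n^2] = n, so msd/n should be near 1
```

This is the diffusive scaling that distinguishes a free random walk from the damped variety below, whose variance stays bounded.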
And finally we can implement the type of walk referred to in the paper as a "Damped Random Walk", or Ornstein-Uhlenbeck process. In fact we use its discrete analogue, known as an autoregressive process, because I'm not up for numerically solving a stochastic differential equation today. I also don't know how yet.
\[X_t = c + \sum_{i=1}^p \varphi_i X_{t-i}+ \varepsilon_t \]
In [13]:
nsteps = 100
def walk1dspatial(f = lambda x: 0.3*x + normal() - x):
    walk = zeros(nsteps)
    for i in range(walk.size-1):
        walk[i+1] = f(walk[i])
    return walk
xs = walk1dspatial()
ys = walk1dspatial()
plot(xs,ys)
Out[13]:
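To see the "damping" explicitly, here is a minimal AR(1) sketch (my parameter choices, not the notebook's): with \(|\varphi| < 1\), the process mean-reverts and its variance saturates at \(1/(1-\varphi^2)\) instead of growing without bound like a free walk.

```python
import numpy as np

rng = np.random.default_rng(1)  # seeded for reproducibility
n, phi = 5000, 0.5
x = np.zeros(n)
for i in range(n - 1):
    x[i + 1] = phi * x[i] + rng.normal()  # X_t = phi * X_{t-1} + eps_t

# After the process has settled, its variance should be near the
# stationary value 1 / (1 - phi**2) = 4/3
late_var = x[1000:].var()
```

Compare with the free walk above, where the same variance calculation would grow linearly with the number of steps.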
A quick test to see that the gaussian numbers really do follow a bell curve. It's not a good test; in fact I think the two distributions have different \(\sigma\).
In [8]:
binsize = 0.1
samples = 1000
domain = arange(-5,5,binsize)
r = lambda x: round(x,ndigits=int(-log10(binsize))) # binning function
bins = list(map(r,domain))
plot(bins,[list(map(r,normal(size=samples))).count(x) for x in bins])
# The above method is very bad and slow, but pretty neat
plot(domain,samples*exp(-domain**2)*binsize/2)
Out[8]:
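The mismatch is real: the curve plotted above uses \(e^{-x^2}\), which is a gaussian with \(\sigma = 1/\sqrt{2}\), while `normal()` draws from the standard normal, whose density is \(e^{-x^2/2}/\sqrt{2\pi}\). A sketch of a corrected (and faster) comparison using `np.histogram` (my version, not the notebook's):

```python
import numpy as np

rng = np.random.default_rng(2)  # seeded for reproducibility
binsize = 0.1
samples = rng.normal(size=100_000)
counts, edges = np.histogram(samples, bins=np.arange(-5, 5, binsize))
centers = (edges[:-1] + edges[1:]) / 2
# expected bin counts for the standard normal: N * binwidth * pdf(center)
expected = len(samples) * binsize * np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
# counts and expected now agree closely bin by bin
```

With the correct \(\sigma = 1\) density, the histogram and the analytic curve line up, which the original plot couldn't show.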