author     arokem <arokem@berkeley.edu>    2011-03-19 11:21:51 -0700
committer  arokem <arokem@berkeley.edu>    2011-05-31 17:14:53 -0700
commit     723782ce4771cc4426d30274b97c91012563b70c
tree       a8503fe4721b8fa40d606a6a0c54a22dfbf0c515
parent     4e24d0a65a9502c29509a2968636d1ac07f3c939
More work on ar/mar examples, including moving the ar1 file.
Also - docstring of the ar generator utils function.
Diffstat (limited to 'doc')
-rwxr-xr-x  doc/examples/ar_est_1var.py   90
-rw-r--r--  doc/examples/ar_est_2vars.py  10
2 files changed, 98 insertions, 2 deletions
diff --git a/doc/examples/ar_est_1var.py b/doc/examples/ar_est_1var.py
new file mode 100755
index 0000000..d203065
--- /dev/null
+++ b/doc/examples/ar_est_1var.py
@@ -0,0 +1,90 @@
+"""
+
+.. _ar:
+
+=============================
+Auto-regressive model fitting
+=============================
+
+Auto-regressive (AR) processes are stochastic processes that obey the
+following equation:
+
+.. math::
+
+   x_t = \sum_{i=1}^{n} a_i x_{t-i} + \epsilon_t
+
+In this example, we demonstrate how to estimate the coefficients of an AR model
+from a realization of the process, and how to estimate the spectrum of the
+process based on the estimated coefficients.
+
+We start with imports from numpy and matplotlib, and import :mod:`nitime.utils`
+as well as :mod:`nitime.algorithms`:
+
+"""
+
+import numpy as np
+from matplotlib import pyplot as plt
+
+from nitime import utils
+from nitime import algorithms as alg
+from nitime.timeseries import TimeSeries
+from nitime.viz import plot_tseries
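+
+"""
+
+As a minimal illustration of the recursion above (a sketch with hypothetical
+coefficients, not the process estimated below), an AR(2) series can be
+generated directly with numpy:
+
+"""
+
+a_1, a_2 = 0.9, -0.5  # hypothetical AR(2) coefficients, for illustration only
+x_demo = np.zeros(32)
+eps_demo = np.random.randn(32)  # white innovation noise
+for t in range(2, 32):
+    # x_t = a_1 * x_{t-1} + a_2 * x_{t-2} + epsilon_t
+    x_demo[t] = a_1 * x_demo[t - 1] + a_2 * x_demo[t - 2] + eps_demo[t]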
+
+"""
+
+We define some variables, which will be used in generating the AR process:
+
+"""
+
+npts = 2048
+sigma = 0.1
+drop_transients = 128
+
+"""
+
+In this case, we generate an order-4 AR process, with the following coefficients:
+
+
+"""
+
+
+coefs = np.array([2.7607, -3.8106, 2.6535, -0.9238])
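+
+"""
+
+As a quick sanity check (a small sketch using only numpy), these coefficients
+define a stationary process: all roots of the characteristic polynomial
+:math:`z^4 - a_1 z^3 - a_2 z^2 - a_3 z - a_4` should lie strictly inside the
+unit circle.
+
+"""
+
+# Characteristic polynomial coefficients are [1, -a_1, -a_2, -a_3, -a_4]:
+char_roots = np.roots(np.concatenate(([1.], -coefs)))
+print(np.abs(char_roots))  # all moduli should be smaller than 1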
+
+
+"""
+
+This generates the AR(4) time series:
+
+"""
+
+X, noise, _ = utils.ar_generator(npts, sigma, coefs, drop_transients)
+
+ts_x = TimeSeries(X, sampling_rate=1000, time_unit='s')
+ts_noise = TimeSeries(noise, sampling_rate=1000, time_unit='s')
+
+"""
+
+We use the plot_tseries function to visualize the AR process together with the
+noise that drives it:
+
+
+"""
+
+fig01 = plot_tseries(ts_x,label='AR signal')
+fig01 = plot_tseries(ts_noise,fig=fig01,label='Noise')
+fig01.axes[0].legend()
+
+"""
+
+.. image:: fig/ar_est_1var_01.*
+
+
+Now we estimate the model parameters back from the simulated data, using the
+Yule-Walker estimation algorithm at several candidate model orders:
+
+
+"""
+
+# Fit AR models of increasing order with the Yule-Walker estimator and inspect
+# the estimated noise variance and coefficients at each order (the true model
+# has order 4):
+for order in [1, 2, 3, 4]:
+    sigma_est, coefs_est = alg.AR_est_YW(X, order)
+    print('Order %d: %s %s' % (order, sigma_est, coefs_est))
+
+plt.show()
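+
+"""
+
+The introduction also mentioned estimating the spectrum of the AR process from
+its coefficients. As a small sketch (using only numpy, the true coefficients
+for simplicity, and ignoring overall normalization), the power spectrum of an
+AR process is :math:`\sigma^2 / |1 - \sum_{k=1}^{n} a_k e^{-i\omega k}|^2`,
+which we can evaluate on a grid of frequencies:
+
+"""
+
+w = np.linspace(0, np.pi, 1024)  # angular frequencies, in radians per sample
+k = np.arange(1, len(coefs) + 1)
+# Denominator of the AR transfer function, evaluated over the frequency grid:
+denom = 1 - np.exp(-1j * np.outer(w, k)).dot(coefs)
+psd_true = sigma ** 2 / np.abs(denom) ** 2
+print('Spectral peak at %.3f radians per sample' % w[np.argmax(psd_true)])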
diff --git a/doc/examples/ar_est_2vars.py b/doc/examples/ar_est_2vars.py
index 0053af8..08eba32 100644
--- a/doc/examples/ar_est_2vars.py
+++ b/doc/examples/ar_est_2vars.py
@@ -7,6 +7,7 @@
Multivariate auto-regressive modeling
=====================================
+Multivariate auto-regressive modeling uses a simple extension of the
+univariate auto-regressive model to capture the mutual dependence between
+several simultaneously measured time series.
This example is based on Ding, Chen and Bressler 2006 [Ding2006]_.
@@ -47,8 +48,7 @@ We will generate an AR(2) model, with the following coefficients (taken from
\begin{array}{ccc}
x_t &=& 0.9x_{t-1} - 0.5 x_{t-2} + \epsilon_t\\
- y_t &=& 0.8Y_{t-1} - 0.5 y_{t-2} + 0.16 x_{t-1} - 0.2 x_{t-2} + \eta_t
- \end{array}
+ y_t &=& 0.8y_{t-1} - 0.5 y_{t-2} + 0.16 x_{t-1} - 0.2 x_{t-2} + \eta_t\end{array}
Or more succinctly, if we define:
@@ -331,6 +331,12 @@ ax02.legend()
.. image:: fig/ar_est_2vars_02.png
+
+Note that these results make intuitive sense when you look at the equations
+governing the mutual influences: X is influenced only by its own past (there is
+no effect of Y on X in :ref:`eq1`), while X does influence Y (:ref:`eq2`),
+which is exactly the pattern seen here. The coefficient matrices written out
+below make this structure explicit.
+
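+As a small sketch (not taken from the example itself, and assuming the sign
+convention x_t = A1 [x, y]_{t-1} + A2 [x, y]_{t-2} + noise; the example's own
+arrays may use the opposite sign), the lag-1 and lag-2 coefficient matrices
+implied by :ref:`eq1` and :ref:`eq2` are:
+
+"""
+
+import numpy as np  # numpy is already imported at the top of this example
+
+# Rows correspond to (x, y), columns to the lagged (x, y) values; the zeros in
+# the first row of each matrix encode the absence of any influence of Y on X:
+A1_sketch = np.array([[0.9, 0.0],
+                      [0.16, 0.8]])
+A2_sketch = np.array([[-0.5, 0.0],
+                      [-0.2, -0.5]])
+
+"""
+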
Finally, we calculate the total causality, which is the sum of all the above
causalities. We compare this to the interdependence between the processes. This is the
measure of total dependence and is closely akin to the coherence between the