Timestamp:
09/12/09 11:41:43 (15 years ago)
Author:
smidl
Message:

doc

Files:
1 modified

  • library/doc/html/tut_arx.html

    r591 r608  
    6868<p>The <code>ARX</code> (AutoRegressive with eXogenous input) model is defined as follows: </p> 
    6969<p class="formulaDsp"> 
    70 <img class="formulaDsp" alt="\[ y_t = \theta' \psi_t + \rho e_t \]" src="form_115.png"/> 
     70<img class="formulaDsp" alt="\[ y_t = \theta' \psi_t + \rho e_t \]" src="form_126.png"/> 
    7171</p> 
    72 <p> where <img class="formulaInl" alt="$y_t$" src="form_3.png"/> is the system output, <img class="formulaInl" alt="$[\theta,\rho]$" src="form_116.png"/> is a vector of unknown parameters, <img class="formulaInl" alt="$\psi_t$" src="form_117.png"/> is a vector of data-dependent regressors, and the noise <img class="formulaInl" alt="$e_t$" src="form_24.png"/> is assumed to be normally distributed, <img class="formulaInl" alt="$\mathcal{N}(0,1)$" src="form_118.png"/>.</p> 
     72<p> where <img class="formulaInl" alt="$y_t$" src="form_9.png"/> is the system output, <img class="formulaInl" alt="$[\theta,\rho]$" src="form_127.png"/> is a vector of unknown parameters, <img class="formulaInl" alt="$\psi_t$" src="form_128.png"/> is a vector of data-dependent regressors, and the noise <img class="formulaInl" alt="$e_t$" src="form_32.png"/> is assumed to be normally distributed, <img class="formulaInl" alt="$\mathcal{N}(0,1)$" src="form_129.png"/>.</p> 
    7373<p>Special cases include: </p> 
    7474<ul> 
     
    8282<dt>Information matrix </dt> 
    8383<dd>which is a sum of outer products <p class="formulaDsp"> 
    84 <img class="formulaDsp" alt="\[ V_t = \sum_{i=0}^{n} \left[\begin{array}{c}y_{t}\\ \psi_{t}\end{array}\right] \begin{array}{c} [y_{t}',\,\psi_{t}']\\ \\\end{array} \]" src="form_119.png"/> 
     84<img class="formulaDsp" alt="\[ V_t = \sum_{i=0}^{n} \left[\begin{array}{c}y_{t}\\ \psi_{t}\end{array}\right] \begin{array}{c} [y_{t}',\,\psi_{t}']\\ \\\end{array} \]" src="form_130.png"/> 
    8585</p> 
    8686 </dd> 
    8787<dt>"Degree of freedom" </dt> 
    8888<dd>which is an accumulator of the number of data records <p class="formulaDsp"> 
    89 <img class="formulaDsp" alt="\[ \nu_t = \sum_{i=0}^{n} 1 \]" src="form_120.png"/> 
     89<img class="formulaDsp" alt="\[ \nu_t = \sum_{i=0}^{n} 1 \]" src="form_131.png"/> 
    9090</p> 
    9191 </dd> 
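The two statistics above can be accumulated directly from the data. Below is a minimal standalone sketch in plain C++ (it does not use the bdm classes; the dimensions and data values are made up for illustration) that collects the information matrix and the degree of freedom from a batch of records.

    // Accumulate V_t = sum over records of [y;psi][y;psi]' and nu_t = number of records.
    #include <array>
    #include <cstdio>
    #include <vector>

    int main() {
        const int d = 3;                              // size of the extended vector [y; psi]
        std::array<std::array<double, d>, d> V{};     // information matrix, starts at zero
        double nu = 0.0;                              // "degree of freedom"

        // hypothetical data records, each stored as [y_t, psi_t(1), psi_t(2)]
        std::vector<std::array<double, d>> records = {
            {1.0, 0.9, 1.0}, {1.2, 1.0, 1.0}, {0.8, 1.2, 1.0}
        };

        for (const auto& r : records) {
            for (int i = 0; i < d; ++i)
                for (int j = 0; j < d; ++j)
                    V[i][j] += r[i] * r[j];           // outer product [y;psi][y;psi]'
            nu += 1.0;                                // one more data record
        }
        std::printf("nu = %.0f, V[0][0] = %.3f\n", nu, V[0][0]);
        return 0;
    }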
     
    9494On-line estimation</a></h2> 
    9595<p>Online estimation with stationary parameters is easily achieved by collecting the sufficient statistics described above recursively.</p> 
    96 <p>Extension to non-stationary parameters, <img class="formulaInl" alt="$ \theta_t , r_t $" src="form_121.png"/>, can be achieved by an operation called forgetting. This is an approximation of Bayesian filtering, see [Kulhavy]. The resulting algorithm is defined by manipulation of the sufficient statistics: </p> 
     96<p>Extension to non-stationary parameters, <img class="formulaInl" alt="$ \theta_t , r_t $" src="form_132.png"/>, can be achieved by an operation called forgetting. This is an approximation of Bayesian filtering, see [Kulhavy]. The resulting algorithm is defined by manipulation of the sufficient statistics: </p> 
    9797<dl> 
    9898<dt>Information matrix </dt> 
    9999<dd>which is a sum of outer products <p class="formulaDsp"> 
    100 <img class="formulaDsp" alt="\[ V_t = \phi V_{t-1} + \left[\begin{array}{c}y_{t}\\ \psi_{t}\end{array}\right] \begin{array}{c} [y_{t}',\,\psi_{t}']\\ \\\end{array} +(1-\phi) V_0 \]" src="form_122.png"/> 
     100<img class="formulaDsp" alt="\[ V_t = \phi V_{t-1} + \left[\begin{array}{c}y_{t}\\ \psi_{t}\end{array}\right] \begin{array}{c} [y_{t}',\,\psi_{t}']\\ \\\end{array} +(1-\phi) V_0 \]" src="form_133.png"/> 
    101101</p> 
    102102  </dd> 
    103103<dt>"Degree of freedom" </dt> 
    104104<dd>which is an accumulator of the number of data records <p class="formulaDsp"> 
    105 <img class="formulaDsp" alt="\[ \nu_t = \phi \nu_{t-1} + 1 + (1-\phi) \nu_0 \]" src="form_123.png"/> 
     105<img class="formulaDsp" alt="\[ \nu_t = \phi \nu_{t-1} + 1 + (1-\phi) \nu_0 \]" src="form_134.png"/> 
    106106</p> 
    107107  </dd> 
    108108</dl> 
    109 <p>where <img class="formulaInl" alt="$ \phi $" src="form_124.png"/> is the forgetting factor, typically <img class="formulaInl" alt="$ \phi \in [0,1]$" src="form_125.png"/>, roughly corresponding to the effective length of an exponential window via the relation:</p> 
     109<p>where <img class="formulaInl" alt="$ \phi $" src="form_135.png"/> is the forgetting factor, typically <img class="formulaInl" alt="$ \phi \in [0,1]$" src="form_136.png"/>, roughly corresponding to the effective length of an exponential window via the relation:</p> 
    110110<p class="formulaDsp"> 
    111 <img class="formulaDsp" alt="\[ \mathrm{win_length} = \frac{1}{1-\phi}\]" src="form_126.png"/> 
     111<img class="formulaDsp" alt="\[ \mathrm{win_length} = \frac{1}{1-\phi}\]" src="form_137.png"/> 
    112112</p> 
    113 <p> Hence, <img class="formulaInl" alt="$ \phi=0.9 $" src="form_127.png"/> corresponds to estimation on an exponential window of effective length 10 samples.</p> 
    114 <p>The statistics <img class="formulaInl" alt="$ V_0 , \nu_0 $" src="form_128.png"/> are called alternative statistics; their role is to stabilize the estimation. It is easy to show that for zero data, the statistics <img class="formulaInl" alt="$ V_t , \nu_t $" src="form_129.png"/> converge to the alternative statistics.</p> 
     113<p> Hence, <img class="formulaInl" alt="$ \phi=0.9 $" src="form_138.png"/> corresponds to estimation on an exponential window of effective length 10 samples.</p> 
     114<p>The statistics <img class="formulaInl" alt="$ V_0 , \nu_0 $" src="form_139.png"/> are called alternative statistics; their role is to stabilize the estimation. It is easy to show that for zero data, the statistics <img class="formulaInl" alt="$ V_t , \nu_t $" src="form_140.png"/> converge to the alternative statistics.</p> 
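The forgetting recursion above is a one-line update of each statistic. Below is a minimal standalone sketch of a single update in plain C++ (not the bdm API; the alternative statistics and the data record are made-up placeholders), using phi = 0.9, i.e. an effective window of about 10 samples.

    // V_t = phi*V_{t-1} + [y;psi][y;psi]' + (1-phi)*V_0,  nu_t = phi*nu_{t-1} + 1 + (1-phi)*nu_0
    #include <array>
    #include <cstdio>

    int main() {
        const int d = 3;
        const double phi = 0.9;                                 // forgetting factor
        std::printf("effective window length = %.0f samples\n", 1.0 / (1.0 - phi));

        std::array<std::array<double, d>, d> V{};               // V_{t-1}, here starting from zero
        std::array<std::array<double, d>, d> V0{};              // alternative statistic V_0
        for (int i = 0; i < d; ++i) V0[i][i] = 1e-3;            // small regularising diagonal (assumed)
        double nu = 0.0, nu0 = 1.0;                             // nu_{t-1} and alternative nu_0 (assumed)

        std::array<double, d> r = {1.0, 0.9, 1.0};              // one data record [y_t, psi_t']

        // one forgetting update of both statistics
        for (int i = 0; i < d; ++i)
            for (int j = 0; j < d; ++j)
                V[i][j] = phi * V[i][j] + r[i] * r[j] + (1.0 - phi) * V0[i][j];
        nu = phi * nu + 1.0 + (1.0 - phi) * nu0;

        std::printf("nu = %.2f, V[0][0] = %.4f\n", nu, V[0][0]);
        return 0;
    }

With zero data records the V recursion contracts toward V_0, which illustrates the stabilising role of the alternative statistics noted above.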
    115115<h2><a class="anchor" id="str"> 
    116116Structure estimation</a></h2> 
    117 <p>For this model, structure estimation is a form of model selection. Specifically, we compare the hypothesis that the data were generated by the full model with hypotheses that some regressors in the vector <img class="formulaInl" alt="$\psi$" src="form_33.png"/> are redundant. The number of possible hypotheses is then the number of all possible combinations of the regressors.</p> 
    118 <p>However, due to a property known as nesting in the exponential family, these hypotheses can be tested using only the posterior statistics. (This property does not hold for forgetting <img class="formulaInl" alt="$ \phi<1 $" src="form_130.png"/>.) Hence, for low-dimensional problems, this can be done by a tree search (method <a class="el" href="classbdm_1_1ARX.html#a16b02ae03316751664c22d59d90c1e34" title="Brute force structure estimation.">bdm::ARX::structure_est()</a>), or by a more sophisticated algorithm [ref Ludvik].</p> 
     117<p>For this model, structure estimation is a form of model selection. Specifically, we compare the hypothesis that the data were generated by the full model with hypotheses that some regressors in the vector <img class="formulaInl" alt="$\psi$" src="form_41.png"/> are redundant. The number of possible hypotheses is then the number of all possible combinations of the regressors.</p> 
     118<p>However, due to a property known as nesting in the exponential family, these hypotheses can be tested using only the posterior statistics. (This property does not hold for forgetting <img class="formulaInl" alt="$ \phi<1 $" src="form_141.png"/>.) Hence, for low-dimensional problems, this can be done by a tree search (method <a class="el" href="classbdm_1_1ARX.html#a16b02ae03316751664c22d59d90c1e34" title="Brute force structure estimation.">bdm::ARX::structure_est()</a>), or by a more sophisticated algorithm [ref Ludvik].</p> 
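As a rough illustration of the size of that hypothesis space, a brute-force search enumerates every subset of the regressors in psi. The following C++ sketch shows only that enumeration; the regressor count is a placeholder, and the scoring of each hypothesis from the posterior statistics is deliberately omitted.

    // Enumerate all 2^n subsets of candidate regressors (the hypotheses compared by
    // a brute-force structure search); evaluating each hypothesis is not shown here.
    #include <cstdio>

    int main() {
        const int n_psi = 3;                            // number of candidate regressors (assumed)
        for (unsigned mask = 0; mask < (1u << n_psi); ++mask) {
            std::printf("hypothesis %u keeps regressors:", mask);
            for (int i = 0; i < n_psi; ++i)
                if (mask & (1u << i)) std::printf(" %d", i);
            std::printf("\n");
        }
        return 0;
    }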
    119119<h2><a class="anchor" id="soft"> 
    120120Software Image</a></h2> 
     
    128128<h2><a class="anchor" id="try"> 
    129129How to try</a></h2> 
    130 <p>The best way to experiment with this object is to run the Matlab script <code>arx_test.m</code> located in the directory <code>./library/tutorial</code>. See <a class="el" href="arx_ui.html">Running experiment <code>estimator</code> with ARX data fields</a> for a detailed description.</p> 
     130<p>The best way to experiment with this object is to run the Matlab script <code>arx_test.m</code> located in the directory <code>./library/tutorial</code>. See <a class="el" href="arx_ui.html">Running experiment <code>estimator</code> with ARX data fields (this page is out of date, as the user info concept has been changed)</a> for a detailed description.</p> 
    131131<ul> 
    132132<li>In the default setup, the parameters converge to the true values as expected. </li> 
     
    135135</ul> 
    136136</div> 
    137 <hr size="1"/><address style="text-align: right;"><small>Generated on Sun Aug 30 22:10:50 2009 for mixpp by&nbsp; 
     137<hr size="1"/><address style="text-align: right;"><small>Generated on Tue Sep 8 22:11:32 2009 for mixpp by&nbsp; 
    138138<a href="http://www.doxygen.org/index.html"> 
    139139<img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.6.1 </small></address>