More advanced filtering algorithms for non-linear, non-Gaussian models can be derived; see ...
Kalman filtering operates on the linear Gaussian state-space model

\[ x_t = A x_{t-1} + B u_t + v_t, \]
\[ y_t = C x_t + D u_t + w_t, \]

where $x_t$ is the state, $y_t$ is the system output, $A, B, C, D$ are state matrices of appropriate dimensions, and $v_t, w_t$ are zero mean Gaussian noises with covariance matrices $Q$ and $R$, respectively.
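For orientation, the recursion implied by this model can be written in a standard textbook form; the symbols $\hat{x}$, $P$ (posterior mean and covariance) and $K_t$ (Kalman gain) are introduced here only for illustration and are not part of the library's notation:

\[ \hat{x}_{t|t-1} = A \hat{x}_{t-1|t-1} + B u_t, \qquad P_{t|t-1} = A P_{t-1|t-1} A^T + Q, \]
\[ K_t = P_{t|t-1} C^T \left( C P_{t|t-1} C^T + R \right)^{-1}, \]
\[ \hat{x}_{t|t} = \hat{x}_{t|t-1} + K_t \left( y_t - C \hat{x}_{t|t-1} - D u_t \right), \qquad P_{t|t} = \left( I - K_t C \right) P_{t|t-1}. \]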
Both prior and posterior densities on the state are Gaussian, i.e. of the class enorm.
There is a range of classes that implement this functionality.
Extended Kalman filtering applies the same approach to the non-linear state-space model

\[ x_t = g(x_{t-1}, u_t) + v_t, \]
\[ y_t = h(x_t, u_t) + w_t, \]

where $g, h$ are general non-linear functions which have finite derivatives. The remaining variables have the same meaning as in the Kalman filter.
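The extended filter works by linearizing $g$ and $h$ around the current estimate. In a standard textbook formulation (notation again introduced only for illustration), the Jacobians

\[ G_t = \left. \frac{\partial g(x, u_t)}{\partial x} \right|_{x = \hat{x}_{t-1|t-1}}, \qquad H_t = \left. \frac{\partial h(x, u_t)}{\partial x} \right|_{x = \hat{x}_{t|t-1}} \]

take the roles of $A$ and $C$ in the recursion above, while the predicted state and the innovation use the non-linear functions directly:

\[ \hat{x}_{t|t-1} = g(\hat{x}_{t-1|t-1}, u_t), \qquad \hat{x}_{t|t} = \hat{x}_{t|t-1} + K_t \left( y_t - h(\hat{x}_{t|t-1}, u_t) \right). \]

This is why the functions must have finite derivatives.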
In order to use these filters, the non-linear functions and their derivatives must be defined as an instance of class diffbifn.
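To illustrate what such an object must provide, the following is a minimal, hypothetical sketch of a non-linear state function together with its Jacobian. The class name, method names, and the particular non-linearity are assumptions made for illustration only; they do not reproduce the actual bdm::diffbifn interface, which should be consulted in the library headers.

// Hypothetical sketch only -- NOT the real bdm::diffbifn interface.
// It merely shows the two ingredients an extended Kalman filter needs
// from the user: the value of g(x,u) and its Jacobian dg/dx.
#include <itpp/itbase.h>
#include <cmath>
using namespace itpp;

class my_nonlin_fn {                     // name assumed for illustration
public:
    // value of g(x,u): a mildly non-linear random walk (example choice)
    vec eval(const vec &x, const vec &u) const {
        vec g(x.length());
        for (int i = 0; i < x.length(); i++) {
            g(i) = x(i) + 0.1 * std::sin(x(i)) + (u.length() > 0 ? u(0) : 0.0);
        }
        return g;
    }
    // Jacobian dg/dx evaluated at (x,u); all entries are finite
    mat jacobian(const vec &x, const vec &u) const {
        (void)u;                         // u not used by this example Jacobian
        mat G = eye(x.length());
        for (int i = 0; i < x.length(); i++) {
            G(i, i) += 0.1 * std::cos(x(i));
        }
        return G;
    }
};

Instances of diffbifn playing this role are then passed to the extended filter classes; consult the class documentation for the exact signatures.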
Two classes are defined for this purpose. The following example demonstrates a complete estimation loop using the linear KalmanCh filter:
#include <estim/libKF.h>
using namespace bdm;

// estimation of AR(0) model
int main() {
    // dimensions
    int dx = 3, dy = 3, du = 1;
    // matrices
    mat A = eye(dx);
    mat B = zeros(dx, du);
    mat C = eye(dx);
    mat D = zeros(dy, du);
    mat Q = eye(dx);
    mat R = 0.1 * eye(dy);
    // prior
    mat P0 = 100 * eye(dx);
    vec mu0 = zeros(dx);
    // Estimator
    KalmanCh KF;
    KF.set_parameters(A, B, C, D, /*covariances*/ Q, R);
    KF.set_statistics(mu0, P0);
    // Estimation loop
    for (int i = 0; i < 100; i++) {
        KF.bayes(randn(dx + du));
    }
    // print results
    cout << "Posterior estimate of x is: " << endl;
    cout << "mean: " << KF.posterior().mean() << endl;
    cout << "variance: " << KF.posterior().variance() << endl;
}