Cooperative Localisation Filters 0.0.1
Filter Class Reference [abstract]

#include <filter.h>

Inheritance diagram for Filter: [inheritance graph not reproduced]

Classes

struct  EstimationParameters
 Houses the parameters required for performing Bayesian filtering. More...
 

Public Member Functions

 Filter (DataHandler &data)
 Assigns the class fields based on the DataHandler input.
 
virtual ~Filter ()
 
void performInference ()
 Performs robot state inference using the EKF Bayesian inference framework for all robots provided.
 

Protected Types

enum  { FORWARD_VELOCITY = 0 , ANGULAR_VELOCITY = 1 }
 
enum  { RANGE = 0 , BEARING = 1 }
 
enum  { X = 0 , Y = 1 , ORIENTATION = 2 }
 
typedef Eigen::Matrix< double, 2, 1 > vector2D_t
 
typedef Eigen::Matrix< double, total_states, 1 > vector3D_t
 
typedef Eigen::Matrix< double, 2 * total_states, 1 > vector6D_t
 
typedef Eigen::Matrix< double, 2, 2 > matrix2D_t
 
typedef Eigen::Matrix< double, total_states, total_states > matrix3D_t
 
typedef Eigen::Matrix< double, 2 * total_states, 2 * total_states > matrix6D_t
 
typedef vector3D_t state_t
 
typedef vector6D_t augmentedState_t
 
typedef matrix3D_t covariance_t
 
typedef matrix6D_t augmentedCovariance_t
 
typedef vector3D_t information_t
 
typedef vector6D_t augmentedInformation_t
 
typedef matrix3D_t precision_t
 
typedef matrix6D_t augmentedPrecision_t
 
typedef vector2D_t measurement_t
 
typedef matrix2D_t measurementCovariance_t
 
typedef Eigen::Matrix< double, 2 * total_states, total_measurements > kalmanGain_t
 
typedef Eigen::Matrix< double, total_states, total_states > motionJacobian_t
 
typedef Eigen::Matrix< double, total_states, total_inputs > processJacobian_t
 
typedef Eigen::Matrix< double, total_measurements, 2 * total_states > measurementJacobian_t
 
typedef Eigen::Matrix< double, total_inputs, total_inputs > processCovariance_t
 
typedef Eigen::Matrix< double, total_measurements, 1 > huberMeasurementThresholds_t
 
typedef Eigen::Matrix< double, total_measurements, total_measurements > huberMeasurementWeights_t
 
typedef Eigen::Matrix< double, 2 * total_states, 1 > huberStateThresholds_t
 
typedef Eigen::Matrix< double, 2 * total_states, 2 * total_states > huberStateWeights_t
 

Protected Member Functions

virtual void prediction (const Robot::Odometry &, EstimationParameters &)=0
 
virtual void correction (EstimationParameters &, const EstimationParameters &)=0
 
void motionModel (const Robot::Odometry &, EstimationParameters &, const double)
 The unicycle motion model used to perform motion predictions.
 
void calculateMotionJacobian (const Robot::Odometry &, EstimationParameters &, const double)
 Calculates the Jacobian matrix of the unicycle motion model in terms of the system's states (x, y, orientation).
 
void calculateProcessJacobian (EstimationParameters &, const double)
 Calculates the Jacobian matrix of the motion model evaluated in terms of the process inputs.
 
measurement_t measurementModel (EstimationParameters &, const EstimationParameters &)
 Uses the non-linear measurement model to predict what the measurement from the system would be given the state estimates of the ego robot and measured agent.
 
void calculateMeasurementJacobian (EstimationParameters &, const EstimationParameters &)
 Jacobian of the measurement model evaluated in terms of the system's states: x, y, and heading.
 
matrix3D_t marginalise (const matrix6D_t &)
 Schur complement-based marginalisation that marginalises a 6x6 matrix into a 3x3 matrix.
 
state_t marginalise (const vector6D_t &, const matrix6D_t &)
 Schur complement-based marginalisation that marginalises a 6x1 vector and 6x6 matrix into a 3x1 vector.
 
augmentedInformation_t createAugmentedVector (const state_t &, const state_t &)
 Combines the 3 states of the ego vehicle (x,y,orientation) with the state of the measured agent to create an augmented state vector.
 
augmentedCovariance_t createAugmentedMatrix (const covariance_t &, const covariance_t &)
 Combines the covariance/precision matrix of the 3 states of the ego vehicle (x,y,orientation) with the covariance/precision of the agent measured.
 
void normaliseAngle (double &)
 Normalise an angle to the interval \((-\pi, \pi]\).
 
measurement_t calculateNormalisedInnovation (const EstimationParameters &)
 Calculates the normalised residual of the value produced by the sensor measurement and the prior estimate passed through the non-linear measurement model.
 
augmentedState_t calculateNormalisedEstimationResidual (const EstimationParameters &)
 Calculates the normalised residual of the initial estimate and updated estimate.
 
huberMeasurementWeights_t HuberMeasurement (const measurement_t &, const huberMeasurementThresholds_t &)
 Huber measurement cost function.
 
huberStateWeights_t HuberState (const augmentedState_t &, const huberStateThresholds_t &)
 Huber state cost function.
 
matrix3D_t computePseudoInverse (const matrix3D_t &)
 
matrix6D_t computePseudoInverse (const matrix6D_t &)
 

Protected Attributes

DataHandler & data_
 Data class housing all the data pertaining to cooperative localisation (positioning).
 
std::vector< EstimationParameters > robot_parameters
 Houses all estimation parameters for all robots.
 
std::vector< EstimationParameters > landmark_parameters
 Houses all estimation parameters for all landmarks.
 
const huberMeasurementThresholds_t measurement_thresholds
 The thresholds for the Huber measurement cost function.
 
const huberStateThresholds_t state_thresholds
 The thresholds for the Huber state cost function.
 

Static Protected Attributes

static const unsigned short total_states = 3
 The total number of system states: x coordinate [m], y-coordinate [m], and orientation (heading) [rad].
 
static const unsigned short total_inputs = 2
 The total number of system inputs: forward velocity [m/s] and angular velocity [rad/s].
 
static const unsigned short total_measurements = 2
 The total number of system measurements: range [m] and bearing [rad].
 

Member Typedef Documentation

◆ augmentedCovariance_t

typedef matrix6D_t Filter::augmentedCovariance_t
protected

◆ augmentedInformation_t

typedef vector6D_t Filter::augmentedInformation_t
protected

◆ augmentedPrecision_t

typedef matrix6D_t Filter::augmentedPrecision_t
protected

◆ augmentedState_t

typedef vector6D_t Filter::augmentedState_t
protected

◆ covariance_t

typedef matrix3D_t Filter::covariance_t
protected

◆ huberMeasurementThresholds_t

typedef Eigen::Matrix<double, total_measurements, 1> Filter::huberMeasurementThresholds_t
protected

◆ huberMeasurementWeights_t

typedef Eigen::Matrix<double, total_measurements, total_measurements> Filter::huberMeasurementWeights_t
protected

◆ huberStateThresholds_t

typedef Eigen::Matrix<double, 2 * total_states, 1> Filter::huberStateThresholds_t
protected

◆ huberStateWeights_t

typedef Eigen::Matrix<double, 2 * total_states, 2 * total_states> Filter::huberStateWeights_t
protected

◆ information_t

typedef vector3D_t Filter::information_t
protected

◆ kalmanGain_t

typedef Eigen::Matrix<double, 2 * total_states, total_measurements> Filter::kalmanGain_t
protected

◆ matrix2D_t

typedef Eigen::Matrix<double, 2, 2> Filter::matrix2D_t
protected

◆ matrix3D_t

typedef Eigen::Matrix<double, total_states, total_states> Filter::matrix3D_t
protected

◆ matrix6D_t

typedef Eigen::Matrix<double, 2 * total_states, 2 * total_states> Filter::matrix6D_t
protected

◆ measurement_t

typedef vector2D_t Filter::measurement_t
protected

◆ measurementCovariance_t

typedef matrix2D_t Filter::measurementCovariance_t
protected

◆ measurementJacobian_t

typedef Eigen::Matrix<double, total_measurements, 2 * total_states> Filter::measurementJacobian_t
protected

◆ motionJacobian_t

typedef Eigen::Matrix<double, total_states, total_states> Filter::motionJacobian_t
protected

◆ precision_t

typedef matrix3D_t Filter::precision_t
protected

◆ processCovariance_t

typedef Eigen::Matrix<double, total_inputs, total_inputs> Filter::processCovariance_t
protected

◆ processJacobian_t

typedef Eigen::Matrix<double, total_states, total_inputs> Filter::processJacobian_t
protected

◆ state_t

typedef vector3D_t Filter::state_t
protected

◆ vector2D_t

typedef Eigen::Matrix<double, 2, 1> Filter::vector2D_t
protected

◆ vector3D_t

typedef Eigen::Matrix<double, total_states, 1> Filter::vector3D_t
protected

◆ vector6D_t

typedef Eigen::Matrix<double, 2 * total_states, 1> Filter::vector6D_t
protected

Member Enumeration Documentation

◆ anonymous enum

anonymous enum
protected
Enumerator
FORWARD_VELOCITY 
ANGULAR_VELOCITY 

◆ anonymous enum

anonymous enum
protected
Enumerator
RANGE 
BEARING 

◆ anonymous enum

anonymous enum
protected
Enumerator
X 
Y 
ORIENTATION 

Constructor & Destructor Documentation

◆ Filter()

Filter::Filter ( DataHandler &  data)
explicit

Assigns the class fields based on the DataHandler input.

Parameters
[in]  data  Class containing all robot data.

◆ ~Filter()

Filter::~Filter ( )
virtual default

Member Function Documentation

◆ calculateMeasurementJacobian()

void Filter::calculateMeasurementJacobian ( EstimationParameters &  ego_robot,
const EstimationParameters &  other_agent 
)
protected

Jacobian of the measurement model evaluated in terms of the system's states: x, y, and heading.

Parameters
[in,out]  ego_robot  The estimation parameters of the ego robot.
[in]  other_agent  The estimation parameters of the agent that was measured by the ego robot.

The formula used for the calculation of the Jacobian of the measurement model between ego vehicle \(i\) and measured agent \(j\) takes the form

\[ H = \begin{bmatrix} \frac{-\Delta x}{d} & \frac{-\Delta y}{d} & 0 & \frac{\Delta x}{d} & \frac{\Delta y}{d} & 0\\ \frac{\Delta y}{d^2} & \frac{-\Delta x}{d^2} & -1 & \frac{-\Delta y}{d^2} & \frac{\Delta x}{d^2} & 0\end{bmatrix} \]

where \(\Delta x = x_j - x_i\); \(\Delta y = y_j - y_i\); and \(d = \sqrt{\Delta x^2 + \Delta y^2}\).
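
As a rough illustration (not the library's implementation), the matrix above can be assembled with Eigen as follows; the free function, its name, and the explicit pose arguments are assumptions made for this example only:

#include <Eigen/Dense>
#include <cmath>

// Illustrative sketch: assembles the 2x6 range-bearing Jacobian H for the
// augmented state ordering [x_i, y_i, theta_i, x_j, y_j, theta_j].
Eigen::Matrix<double, 2, 6> rangeBearingJacobian(double x_i, double y_i,
                                                 double x_j, double y_j) {
  const double dx = x_j - x_i;
  const double dy = y_j - y_i;
  const double d2 = dx * dx + dy * dy;  // squared distance
  const double d = std::sqrt(d2);       // distance between the two agents

  Eigen::Matrix<double, 2, 6> H;
  H << -dx / d,  -dy / d,   0.0,  dx / d,   dy / d,   0.0,   // range row
        dy / d2, -dx / d2, -1.0, -dy / d2,  dx / d2,  0.0;   // bearing row
  return H;
}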

◆ calculateMotionJacobian()

void Filter::calculateMotionJacobian ( const Robot::Odometry &  odometry,
EstimationParameters &  estimation_parameters,
const double  sample_period 
)
protected

Calculates the Jacobian matrix of the unicycle motion model in terms of the system's states (x, y, orientation).

Parameters
[in]  odometry  The prior inputs into the system comprising a forward and angular velocity.
[in,out]  estimation_parameters  The estimation parameters of the ego robot.
[in]  sample_period  The period between odometry measurements.

The formula used for the calculation of the motion model Jacobian takes the form:

\[ F = \begin{bmatrix} 1 & 0 & -\tilde{v}\Delta t \sin(\theta) \\ 0 & 1 & \tilde{v} \Delta t \cos(\theta) \\ 0 & 0 & 1 \end{bmatrix}, \]

where \(\theta\) denotes the heading (orientation) of the ego vehicle; and \(\tilde{v}\) denotes the forward velocity. The forward velocity is a random variable with Gaussian distributed noise \(\mathcal{N}(0,w)\), where \(w\) is defined by the covariance matrix Filter::EstimationParameters.process_noise. See Filter::motionModel for information on the motion model from which this was derived.
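
A minimal sketch of how \(F\) could be evaluated with Eigen, assuming the heading and odometry values have already been extracted from the estimation parameters (the function name and arguments are illustrative):

#include <Eigen/Dense>
#include <cmath>

// Illustrative sketch: 3x3 unicycle motion Jacobian evaluated at the
// current heading estimate.
Eigen::Matrix3d unicycleMotionJacobian(double forward_velocity,
                                       double heading,
                                       double sample_period) {
  Eigen::Matrix3d F = Eigen::Matrix3d::Identity();
  F(0, 2) = -forward_velocity * sample_period * std::sin(heading);
  F(1, 2) =  forward_velocity * sample_period * std::cos(heading);
  return F;
}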

◆ calculateNormalisedEstimationResidual()

Filter::augmentedState_t Filter::calculateNormalisedEstimationResidual ( const EstimationParameters &  filter)
protected

Calculates the normalised residual of the initial estimate and updated estimate.

Parameters
[in]  filter  The estimation parameters of the filter.
Returns
The normalised estimation residual.

◆ calculateNormalisedInnovation()

Filter::measurement_t Filter::calculateNormalisedInnovation ( const EstimationParameters &  filter)
protected

Calculates the normalised residual of the value produced by the sensor measurement and the prior estimate passed through the non-linear measurement model.

Parameters
[in]  filter  The estimation parameters of the filter.
Returns
The normalised innovation.

◆ calculateProcessJacobian()

void Filter::calculateProcessJacobian ( EstimationParameters &  estimation_parameters,
const double  sample_period 
)
protected

Calculates the Jacobian matrix of the motion model evaluated in terms of the process inputs.

Parameters
[in,out]  estimation_parameters  The estimation parameters of the ego robot.
[in]  sample_period  The period between odometry measurements.

The formula used for the calculation of the process noise Jacobian takes the form

\[L = \begin{bmatrix}\Delta t \cos(\theta) & 0 \\ \Delta t \sin(\theta) & 0 \\ 0 & \Delta t \end{bmatrix}, \]

where \(\Delta t\) denotes the sample period; and \(\theta\) denotes the heading (orientation) of the ego robot. See EKF::prediction for information on the motion model from which this was derived.
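
A comparable sketch for \(L\), again with illustrative names rather than the actual member signature:

#include <Eigen/Dense>
#include <cmath>

// Illustrative sketch: 3x2 process Jacobian; columns correspond to the
// forward velocity and angular velocity inputs.
Eigen::Matrix<double, 3, 2> unicycleProcessJacobian(double heading,
                                                    double sample_period) {
  Eigen::Matrix<double, 3, 2> L;
  L << sample_period * std::cos(heading), 0.0,
       sample_period * std::sin(heading), 0.0,
       0.0, sample_period;
  return L;
}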

◆ computePseudoInverse() [1/2]

Filter::matrix3D_t Filter::computePseudoInverse ( const matrix3D_t &  matrix_3d)
protected

◆ computePseudoInverse() [2/2]

Filter::matrix6D_t Filter::computePseudoInverse ( const matrix6D_t &  matrix_6d)
protected

◆ correction()

virtual void Filter::correction ( EstimationParameters &  ,
const EstimationParameters &  
)
protected pure virtual

Implemented in EKF, IEKF, and InformationFilter.

◆ createAugmentedMatrix()

Filter::augmentedCovariance_t Filter::createAugmentedMatrix ( const covariance_t &  ego_robot,
const covariance_t &  other_agent 
)
protected

Combines the covariance/precision matrix of the 3 states of the ego vehicle (x,y,orientation) with the covariance/precision of the agent measured.

Parameters
[in]  ego_robot  The structure containing the estimation parameters of the ego vehicle.
[in]  other_agent  The structure containing the estimation parameters of the agent measured by the ego vehicle.

Cooperative Localisation (Positioning) involves robots that share their state and estimation error covariances/precisions when one robot measures the other. As a result, the estimation error covariance/precision needs to be augmented from a 3x3 to a 6x6 matrix to house the error covariance of both the ego vehicle ( \(i\)) and the measured agent ( \(j\)):

\[\mathbf{P} = \begin{bmatrix} \mathbf{P}_i & \mathbf{0} \\ \mathbf{0} & \mathbf{P}_j \end{bmatrix}, \]

where \(\mathbf{P}_i\) and \(\mathbf{P}_j\) are the estimation error covariance of the ego robot \(i\) and the observed agent \(j\) respectively.
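
A minimal Eigen sketch of this block-diagonal augmentation (the free-standing function is an assumption for illustration; the member itself operates on the class typedefs):

#include <Eigen/Dense>

// Illustrative sketch: places the ego and agent 3x3 blocks on the diagonal
// of a 6x6 matrix, leaving the cross-covariance blocks at zero.
Eigen::Matrix<double, 6, 6> augmentMatrix(const Eigen::Matrix3d &P_i,
                                          const Eigen::Matrix3d &P_j) {
  Eigen::Matrix<double, 6, 6> P = Eigen::Matrix<double, 6, 6>::Zero();
  P.topLeftCorner<3, 3>() = P_i;      // ego robot block
  P.bottomRightCorner<3, 3>() = P_j;  // measured agent block
  return P;
}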

◆ createAugmentedVector()

Filter::augmentedInformation_t Filter::createAugmentedVector ( const state_t &  ego_robot,
const state_t &  other_agent 
)
protected

Combines the 3 states of the ego vehicle (x,y,orientation) with the state of the measured agent to create an augmented state vector.

Parameters
[in]  ego_robot  The structure containing the estimation parameters of the ego vehicle.
[in]  other_agent  The structure containing the estimation parameters of the agent measured by the ego vehicle.
Returns
A 6x1 state vector.
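
A corresponding sketch of the state stacking, with illustrative names:

#include <Eigen/Dense>

// Illustrative sketch: stacks the ego and agent states into a single
// 6x1 augmented vector [x_i, y_i, theta_i, x_j, y_j, theta_j].
Eigen::Matrix<double, 6, 1> augmentState(const Eigen::Vector3d &x_i,
                                         const Eigen::Vector3d &x_j) {
  Eigen::Matrix<double, 6, 1> x;
  x << x_i, x_j;
  return x;
}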

◆ HuberMeasurement()

Filter::huberMeasurementWeights_t Filter::HuberMeasurement ( const measurement_t &  measurement_residual,
const huberMeasurementThresholds_t &  tau 
)
protected

Huber measurement cost function.

Parameters
[in]  measurement_residual  The difference between the measurement and the predicted measurement based on the states of the robots.
[in]  tau  Tunable parameter \(\tau\) that is used to determine if the residual is too large.
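
The weighting rule itself is not documented here; a plausible sketch, assuming the textbook Huber weight \(w_k = \min(1, \tau_k / |r_k|)\) applied per component and returned as a diagonal matrix, is:

#include <Eigen/Dense>
#include <cmath>

// Illustrative sketch of a standard Huber weighting: components whose
// residual exceeds the threshold tau are down-weighted by tau/|r|. The
// actual HuberMeasurement implementation may differ.
Eigen::Matrix2d huberMeasurementWeights(const Eigen::Vector2d &residual,
                                        const Eigen::Vector2d &tau) {
  Eigen::Matrix2d W = Eigen::Matrix2d::Identity();
  for (int k = 0; k < residual.size(); ++k) {
    const double r = std::abs(residual(k));
    if (r > tau(k)) W(k, k) = tau(k) / r;
  }
  return W;
}

HuberState presumably follows the same pattern on the 6x1 normalised estimation residual, producing a 6x6 weight matrix.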

◆ HuberState()

Filter::huberStateWeights_t Filter::HuberState ( const augmentedState_t &  error_residual,
const huberStateThresholds_t &  tau 
)
protected

Huber state cost function.

Parameters
[in]  error_residual  The difference between the initial state estimate and the updated state estimate.
[in]  tau  Tunable parameter \(\tau\) that is used to determine if the residual is too large.

◆ marginalise() [1/2]

Filter::matrix3D_t Filter::marginalise ( const matrix6D_t &  matrix_6d)
protected

Schur complement-based marginalisation that marginalises a 6x6 matrix into a 3x3 matrix.

Parameters
[in]  matrix_6d  A 6x6 matrix.
Returns
A 3x3 matrix.

◆ marginalise() [2/2]

Filter::state_t Filter::marginalise ( const vector6D_t &  vector_6d,
const matrix6D_t &  matrix_6d 
)
protected

Schur complement-based marginalisation that marginalises a 6x1 vector and 6x6 matrix into a 3x1 vector.

Parameters
[in]  vector_6d  A 6x1 vector.
[in]  matrix_6d  A 6x6 matrix.
Returns
A 3x1 vector.
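
A sketch of both overloads, assuming the first 3x3 block (the ego robot) is retained and the second block of variables is marginalised out; whether the class uses a plain inverse or its computePseudoInverse helper is not stated, so a plain inverse is shown:

#include <Eigen/Dense>

// Illustrative sketch of Schur-complement marginalisation. The 6x6 matrix
// is partitioned into 3x3 blocks [[A, B], [C, D]] and the second block of
// variables is marginalised out.
Eigen::Matrix3d marginaliseMatrix(const Eigen::Matrix<double, 6, 6> &M) {
  const Eigen::Matrix3d A = M.topLeftCorner<3, 3>();
  const Eigen::Matrix3d B = M.topRightCorner<3, 3>();
  const Eigen::Matrix3d C = M.bottomLeftCorner<3, 3>();
  const Eigen::Matrix3d D = M.bottomRightCorner<3, 3>();
  return A - B * D.inverse() * C;  // Schur complement of D in M
}

Eigen::Vector3d marginaliseVector(const Eigen::Matrix<double, 6, 1> &v,
                                  const Eigen::Matrix<double, 6, 6> &M) {
  const Eigen::Matrix3d B = M.topRightCorner<3, 3>();
  const Eigen::Matrix3d D = M.bottomRightCorner<3, 3>();
  return v.head<3>() - B * D.inverse() * v.tail<3>();
}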

◆ measurementModel()

Filter::measurement_t Filter::measurementModel ( EstimationParameters &  ego_robot,
const EstimationParameters &  other_agent 
)
protected

Uses the non-linear measurement model to predict what the measurement from the system would be given the state estimates of the ego robot and measured agent.

Parameters
[in,out]  ego_robot  The estimation parameters of the ego robot.
[in]  other_agent  The estimation parameters of the agent that was measured by the ego robot.

The measurement model for the measurement taken from ego vehicle \(i\) to agent \(j\) used for the correction step takes the form

\[ \begin{bmatrix} r_{ij}^{(t)} \\ \phi_{ij}^{(t)}\end{bmatrix} = \begin{bmatrix}\sqrt{(x_j^{(t)} - x_i^{(t)})^2 + (y_j^{(t)} - y_i^{(t)})^2} + q_r \\ \text{atan2}\left(y_j^{(t)}-y_i^{(t)},\; x_j^{(t)}-x_i^{(t)}\right) - \theta_i^{(t)} + q_\phi\end{bmatrix}, \]

where \(x\) and \(y\) denote the robots' coordinates; \(\theta\) denotes the ego robot's orientation (heading); and \(q_r\) and \(q_\phi\) denote the Gaussian distributed measurement noise (see Filter::EstimationParameters.measurement_noise).
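
A noise-free sketch of this prediction (the additive noise terms \(q_r\) and \(q_\phi\) are omitted; names are illustrative):

#include <Eigen/Dense>
#include <cmath>

// Illustrative sketch: predicted range and bearing from the ego pose
// [x, y, theta] to the measured agent's position.
Eigen::Vector2d predictMeasurement(const Eigen::Vector3d &ego,
                                   const Eigen::Vector3d &agent) {
  const double dx = agent(0) - ego(0);
  const double dy = agent(1) - ego(1);

  Eigen::Vector2d z;
  z(0) = std::sqrt(dx * dx + dy * dy);  // range
  z(1) = std::atan2(dy, dx) - ego(2);   // bearing relative to the heading
  // A full implementation would wrap z(1) to (-pi, pi]
  // (see Filter::normaliseAngle).
  return z;
}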

◆ motionModel()

void Filter::motionModel ( const Robot::Odometry &  odometry,
EstimationParameters &  estimation_parameters,
const double  sample_period 
)
protected

The unicycle motion model used to perform motion predictions.

Parameters
[in]  odometry  The prior inputs into the system comprising a forward and angular velocity.
[in,out]  estimation_parameters  The estimation parameters of the ego robot.
[in]  sample_period  The period between odometry measurements.

The motion model used for each robot takes the form:

\[\begin{bmatrix} x_i^{(t+1)} \\ y_i^{(t+1)} \\ \theta_i^{(t+1)}\end{bmatrix} = \begin{bmatrix} x_i^{(t)} + \tilde{v}_i^{(t)}\Delta t\cos(\theta_i^{(t)}) \\ y_i^{(t)} + \tilde{v}_i^{(t)}\Delta t\sin(\theta_i^{(t)})\\ \theta_i^{(t)} + \tilde{\omega}_i^{(t)}\Delta t \end{bmatrix}, \]

where \(i\) denotes the robot's ID; \(t\) denotes the current timestep; \(\Delta t\) denotes the sample period; \(x\) and \(y\) are the robot's coordinates; \(\theta\) denotes the robot's heading (orientation); \(\tilde{v}_i\) denotes the forward velocity input; and \(\tilde{\omega}_i\) denotes the angular velocity input. Both \(\tilde{v}_i\) and \(\tilde{\omega}_i\) are random variables with Gaussian distributed noise \(\mathcal{N}(0,w)\) (see Filter::EstimationParameters::process_noise).
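
A sketch of one prediction step using this model, with the odometry noise handling omitted and illustrative names:

#include <Eigen/Dense>
#include <cmath>

// Illustrative sketch: propagates a pose [x, y, theta] through the
// unicycle model for one sample period.
Eigen::Vector3d propagateState(const Eigen::Vector3d &state,
                               double forward_velocity,
                               double angular_velocity,
                               double sample_period) {
  Eigen::Vector3d next = state;
  next(0) += forward_velocity * sample_period * std::cos(state(2));
  next(1) += forward_velocity * sample_period * std::sin(state(2));
  next(2) += angular_velocity * sample_period;
  // A full implementation would wrap next(2) to (-pi, pi]
  // (see Filter::normaliseAngle).
  return next;
}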

◆ normaliseAngle()

void Filter::normaliseAngle ( double &  angle)
protected

Normalise an angle to the interval \((-\pi, \pi]\).

Parameters
[in,out]  angle  Angle in radians.
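
One common way to implement this wrap (the actual branch handling inside Filter::normaliseAngle may differ):

#include <cmath>

// Illustrative sketch: wraps an angle to the interval (-pi, pi].
void wrapAngle(double &angle) {
  constexpr double kPi = 3.14159265358979323846;
  angle = std::fmod(angle + kPi, 2.0 * kPi);  // into (-2*pi, 2*pi)
  if (angle <= 0.0) angle += 2.0 * kPi;       // into (0, 2*pi]
  angle -= kPi;                               // into (-pi, pi]
}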

◆ performInference()

void Filter::performInference ( )

Performs robot state inference using the EKF Bayesian inference framework for all robots provided.

◆ prediction()

virtual void Filter::prediction ( const Robot::Odometry &  ,
EstimationParameters &  
)
protected pure virtual

Implemented in EKF and InformationFilter.

Member Data Documentation

◆ data_

DataHandler& Filter::data_
protected

Data class housing all the data pertaining to cooperative localisation (positioning).

◆ landmark_parameters

std::vector<EstimationParameters> Filter::landmark_parameters
protected

Houses all estimation parameters for all landmarks.

◆ measurement_thresholds

const huberMeasurementThresholds_t Filter::measurement_thresholds
protected
Initial value:
=
(huberMeasurementThresholds_t() << 0.2, 0.01).finished()

The thresholds for the Huber measurement cost function.

The vector is of the form: range and bearing.

◆ robot_parameters

std::vector<EstimationParameters> Filter::robot_parameters
protected

Houses all estimation parameters for all robots.

◆ state_thresholds

const huberStateThresholds_t Filter::state_thresholds
protected
Initial value:
=
(huberStateThresholds_t() << 0.15, 0.154, 0.255, 0.0104, 0.0104, 0.0)
.finished()

The thresholds for the Huber state cost function.

The vector is of the form: ego x, ego y, ego orientation, agent x, agent y, and agent orientation.

◆ total_inputs

const unsigned short Filter::total_inputs = 2
static protected

The total number of system inputs: forward velocity [m/s] and angular velocity [rad/s].

◆ total_measurements

const unsigned short Filter::total_measurements = 2
static protected

The total number of system measurements: range [m] and bearing [rad].

◆ total_states

const unsigned short Filter::total_states = 3
static protected

The total number of system states: x coordinate [m], y-coordinate [m], and orientation (heading) [rad].

