diff --git a/dev/.documenter-siteinfo.json b/dev/.documenter-siteinfo.json index 4b0a461..ca795c1 100644 --- a/dev/.documenter-siteinfo.json +++ b/dev/.documenter-siteinfo.json @@ -1 +1 @@ -{"documenter":{"julia_version":"1.10.4","generation_timestamp":"2024-08-03T14:12:43","documenter_version":"1.5.0"}} \ No newline at end of file +{"documenter":{"julia_version":"1.10.4","generation_timestamp":"2024-08-03T15:27:24","documenter_version":"1.5.0"}} \ No newline at end of file diff --git a/dev/examples/kalman-filter/index.html b/dev/examples/kalman-filter/index.html index 490385e..7e860ac 100644 --- a/dev/examples/kalman-filter/index.html +++ b/dev/examples/kalman-filter/index.html @@ -95,155 +95,155 @@ ) - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

This page was generated using Literate.jl.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

This page was generated using Literate.jl.

diff --git a/dev/extras/index.html b/dev/extras/index.html index 808f901..5d03dae 100644 --- a/dev/extras/index.html +++ b/dev/extras/index.html @@ -1,2 +1,2 @@ -Control Variables and Extras · SSMProblems

Control Variables and Extras

All functions that form part of the SSMProblems model interface require a final positional argument called extra.

This argument has multiple uses, but is generally used to pass in additional information to the model at inference time. Although this might seem unnecessary and clunky for simple models, this addition provides a great deal of flexibility, allowing complex and exotic models to be implemented with little effort or performance penalty.

If your model does not require any extras, you can simply use Nothing as the type for this argument.

When forward-simulating, filtering or smoothing from a model, a vector of extras is passed to the sampler, with each element corresponding to the extra argument for that timestep. Some advanced algorithms may also augment the extra vector with additional information.

Use as Control Variables

In simple cases extra can be treated as a control (or input) vector. For example, for data arriving at irregular time intervals, the extra argument could be the time deltas between observations. Or, in control engineering, the extra argument could be the control inputs to the system.

Note that it is also possible to store this data in the latent dynamics struct and extract it during a transition (e.g. dyn.dts[timestep]). However, this approach has the disadvantage that the control variables must be defined when the model is instantiated, so re-runs with new control variables require re-instantiating the model.

Using extra for control variables allows for a separation between the abstract definition of the state space model and the concrete simulation or inference given specific data.

Use with Streaming Data

The de-coupling of model definition and data that comes from using extra makes it easy to use SSMProblems with streaming data. As control variables arrive, they can be passed to the model distributions via the extra argument.

Use in Rao-Blackwellisation

Briefly, a Rao-Blackwellised particle filter is an efficient variant of the generic particle filter that can be applied to state space models that have an analytically tractable sub-model. The filter behaves as two nested filters, a regular particle filter for the outer model, and an analytic filter (e.g. Kalman filter) for the inner sub-model.

Since the value of the extra argument can be defined at inference time, the outer filter can pass information to the inner filter via the extra argument. This leads to a clean and generic interface for Rao-Blackwellised filtering which is not possible with other state space model packages.

+Control Variables and Extras · SSMProblems

Control Variables and Extras

All functions that form part of the SSMProblems model interface require a final positional argument called extra.

This argument has multiple uses, but is generally used to pass in additional information to the model at inference time. Although this might seem unnecessary and clunky for simple models, this addition provides a great deal of flexibility, allowing complex and exotic models to be implemented with little effort or performance penalty.

If your model does not require any extras, you can simply use Nothing as the type for this argument.

When forward-simulating, filtering or smoothing from a model, a vector of extras is passed to the sampler, with each element corresponding to the extra argument for that timestep. Some advanced algorithms may also augment the extra vector with additional information.
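As an illustration, the per-timestep layout of such a vector might look like the following plain-Julia sketch (the NamedTuple layout and the `forward_simulate` helper are hypothetical, not part of the SSMProblems interface):

```julia
# Hypothetical layout: one extra per timestep, here a NamedTuple carrying
# the time delta between observations.
extras = [(dt = 0.1,), (dt = 0.5,), (dt = 0.2,)]

# A sampler would consume one element per transition, e.g. a random walk
# whose variance scales with the elapsed time:
function forward_simulate(x0, extras)
    x = x0
    for extra in extras
        x += sqrt(extra.dt) * randn()
    end
    return x
end

x = forward_simulate(0.0, extras)
```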

Use as Control Variables

In simple cases extra can be treated as a control (or input) vector. For example, for data arriving at irregular time intervals, the extra argument could be the time deltas between observations. Or, in control engineering, the extra argument could be the control inputs to the system.

Note that it is also possible to store this data in the latent dynamics struct and extract it during a transition (e.g. dyn.dts[timestep]). However, this approach has the disadvantage that the control variables must be defined when the model is instantiated, so re-runs with new control variables require re-instantiating the model.

Using extra for control variables allows for a separation between the abstract definition of the state space model and the concrete simulation or inference given specific data.
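To make this separation concrete, here is a minimal plain-Julia sketch (the `RandomWalk` type, the hand-rolled Gaussian log-density, and the method names are illustrative stand-ins, not the actual interface) in which the time delta arrives through `extra` rather than being stored in the dynamics struct:

```julia
# Illustrative stand-in for a latent dynamics type; σ is the only model
# parameter, while the time grid stays out of the struct entirely.
struct RandomWalk
    σ::Float64
end

# Transition x_t | x_{t-1} ~ Normal(x_{t-1}, σ√dt), with dt read from `extra`.
simulate_step(dyn::RandomWalk, state, extra) = state + dyn.σ * sqrt(extra.dt) * randn()

# Hand-rolled Gaussian log-density of the same transition.
function logdensity_step(dyn::RandomWalk, state, new_state, extra)
    s = dyn.σ * sqrt(extra.dt)
    return -0.5 * ((new_state - state) / s)^2 - log(s) - 0.5 * log(2π)
end

dyn = RandomWalk(0.5)
lp = logdensity_step(dyn, 0.0, 0.1, (dt = 0.25,))
```

Re-running with a new observation grid only requires passing different `extra` values; the `RandomWalk` instance itself is untouched.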

Use with Streaming Data

The de-coupling of model definition and data that comes from using extra makes it easy to use SSMProblems with streaming data. As control variables arrive, they can be passed to the model distributions via the extra argument.

Use in Rao-Blackwellisation

Briefly, a Rao-Blackwellised particle filter is an efficient variant of the generic particle filter that can be applied to state space models that have an analytically tractable sub-model. The filter behaves as two nested filters, a regular particle filter for the outer model, and an analytic filter (e.g. Kalman filter) for the inner sub-model.

Since the value of the extra argument can be defined at inference time, the outer filter can pass information to the inner filter via the extra argument. This leads to a clean and generic interface for Rao-Blackwellised filtering which is not possible with other state space model packages.
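Schematically, the mechanism can be sketched in plain Julia as follows (the `InnerKalman` type, the additive role of the outer particle's state, and all names are invented for illustration; a real Rao-Blackwellised filter conditions the inner model on the outer particle in a model-specific way):

```julia
# Invented inner sub-model: a scalar linear-Gaussian system tracked exactly
# by a Kalman filter.
struct InnerKalman
    q::Float64  # process noise variance
    r::Float64  # observation noise variance
end

# One Kalman predict/update step for the inner model; the outer particle's
# state enters as a known offset through `extra`.
function kalman_step(f::InnerKalman, μ, σ², y, extra)
    μp, σp² = μ + extra.outer_state, σ² + f.q  # predict, shifted by outer state
    k = σp² / (σp² + f.r)                      # Kalman gain
    return μp + k * (y - μp), (1 - k) * σp²    # update
end

# The outer particle filter would set `extra` per particle at inference time:
μ, σ² = kalman_step(InnerKalman(0.1, 0.2), 0.0, 1.0, 0.5, (outer_state = 0.3,))
```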

diff --git a/dev/index.html b/dev/index.html index 773c46b..c1c282c 100644 --- a/dev/index.html +++ b/dev/index.html @@ -43,7 +43,7 @@ see [AbstractMCMC.sample(::StateSpaceModel)](@ref). For most regular use-cases, the predefined `StateSpaceModel` type, documented below, -should be sufficient.source
SSMProblems.LatentDynamicsType
Latent dynamics of a state space model.
 
 Any concrete subtype of `LatentDynamics` should implement the functions `logdensity` and
 `simulate`, by defining two methods as documented below, one for initialisation and one
@@ -51,21 +51,21 @@
 exact inference algorithm that is intended to be used.
 
 Alternatively, you may specify methods for the function `distribution` which will be
-used to define the above methods.
source
SSMProblems.ObservationProcessType
Observation process of a state space model.
 
 Any concrete subtype of `ObservationProcess` must implement the `logdensity`
 method, as defined below. Optionally, it may also implement `simulate` for use in
 forward simulation of the state space model.
 
 Alternatively, you may specify a method for `distribution`, which will be used to define
-both of the above methods.
source
SSMProblems.StateSpaceModelType
A state space model.
 
 A vanilla implementation of a state space model, composed of a latent dynamics and an
 observation process.
 
 # Fields
 - `dyn::LD`: The latent dynamics of the state space model.
-- `obs::OP`: The observation process of the state space model.
source
SSMProblems.distributionMethod
distribution(dyn::LatentDynamics, extra)

Return the initialisation distribution for the latent dynamics.

The method should return the distribution of the initial state of the latent dynamics. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.

See also LatentDynamics.

Returns

  • Distributions.Distribution: The distribution of the initial state.
source
SSMProblems.distributionMethod
distribution(dyn::LatentDynamics, step::Integer, state, extra)

Return the transition distribution for the latent dynamics.

The method should return the distribution of the state for the next time step given the current state `state` at time step `step`. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.

See also LatentDynamics.

Returns

  • Distributions.Distribution: The distribution of the new state.
source
SSMProblems.distributionMethod
distribution(obs::ObservationProcess, step::Integer, state, extra)

Return the observation distribution for the observation process.

The method should return the distribution of an observation given the current state `state` at time step `step`. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.

See also ObservationProcess.

Returns

  • Distributions.Distribution: The distribution of the observation.
source
SSMProblems.logdensityMethod
logdensity(dyn::LatentDynamics, new_state, extra)

Compute the log-density of an initial state for the latent dynamics.

The method should return the log-density of the initial state `new_state` for the initial time step of the latent dynamics.

The default behaviour is to compute the log-density of the distribution returned by the corresponding `distribution()` method.

See also LatentDynamics.

source
SSMProblems.logdensityMethod
logdensity(dyn::LatentDynamics, step::Integer, state, new_state, extra)

Compute the log-density of a transition of the latent dynamics.

The method should return the log-density of the new state `new_state` given the current state `state` at time step `step`.

The default behaviour is to compute the log-density of the distribution returned by the corresponding `distribution()` method.

See also LatentDynamics.

source
SSMProblems.logdensityMethod
logdensity(obs::ObservationProcess, step::Integer, state, observation, extra)

Compute the log-density of an observation given the current state.

The method should return the log-density of the observation `observation` given the current state `state` at time step `step`.

The default behaviour is to compute the log-density of the distribution returned by the corresponding `distribution()` method.

See also ObservationProcess.

source
SSMProblems.simulateMethod
simulate([rng::AbstractRNG], dyn::LatentDynamics, extra)
+- `obs::OP`: The observation process of the state space model.
source
SSMProblems.distributionMethod
distribution(dyn::LatentDynamics, extra)

Return the initialisation distribution for the latent dynamics.

The method should return the distribution of the initial state of the latent dynamics. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.

See also LatentDynamics.

Returns

  • Distributions.Distribution: The distribution of the initial state.
source
SSMProblems.distributionMethod
distribution(dyn::LatentDynamics, step::Integer, state, extra)

Return the transition distribution for the latent dynamics.

The method should return the distribution of the state for the next time step given the current state `state` at time step `step`. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.

See also LatentDynamics.

Returns

  • Distributions.Distribution: The distribution of the new state.
source
SSMProblems.distributionMethod
distribution(obs::ObservationProcess, step::Integer, state, extra)

Return the observation distribution for the observation process.

The method should return the distribution of an observation given the current state `state` at time step `step`. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.

See also ObservationProcess.

Returns

  • Distributions.Distribution: The distribution of the observation.
source
SSMProblems.logdensityMethod
logdensity(dyn::LatentDynamics, new_state, extra)

Compute the log-density of an initial state for the latent dynamics.

The method should return the log-density of the initial state `new_state` for the initial time step of the latent dynamics.

The default behaviour is to compute the log-density of the distribution returned by the corresponding `distribution()` method.

See also LatentDynamics.

source
SSMProblems.logdensityMethod
logdensity(dyn::LatentDynamics, step::Integer, state, new_state, extra)

Compute the log-density of a transition of the latent dynamics.

The method should return the log-density of the new state `new_state` given the current state `state` at time step `step`.

The default behaviour is to compute the log-density of the distribution returned by the corresponding `distribution()` method.

See also LatentDynamics.

source
SSMProblems.logdensityMethod
logdensity(obs::ObservationProcess, step::Integer, state, observation, extra)

Compute the log-density of an observation given the current state.

The method should return the log-density of the observation `observation` given the current state `state` at time step `step`.

The default behaviour is to compute the log-density of the distribution returned by the corresponding `distribution()` method.

See also ObservationProcess.

source
SSMProblems.simulateMethod
simulate([rng::AbstractRNG], dyn::LatentDynamics, extra)
 
 Simulate an initial state for the latent dynamics.
 
@@ -75,4 +75,4 @@
 The default behaviour is to generate a random sample from the distribution returned by the
 corresponding `distribution()` method.
 
-See also [`LatentDynamics`](@ref).
source
SSMProblems.simulateMethod
simulate([rng::AbstractRNG], dyn::LatentDynamics, step::Integer, state, extra)

Simulate a transition of the latent dynamics.

The method should return a random state for the next time step given the state `state` at the current time step, `step`.

The default behaviour is to generate a random sample from the distribution returned by the corresponding `distribution()` method.

See also LatentDynamics.

source
SSMProblems.simulateMethod
simulate([rng::AbstractRNG], process::ObservationProcess, step::Integer, state, extra)

Simulate an observation given the current state.

The method should return a random observation given the current state `state` at time step `step`.

The default behaviour is to generate a random sample from the distribution returned by the corresponding `distribution()` method.

See also ObservationProcess.

source
  • Murray

    Murray, Lawrence & Lee, Anthony & Jacob, Pierre. (2013). Rethinking resampling in the particle filter on graphics processing units.

+See also [`LatentDynamics`](@ref).source
SSMProblems.simulateMethod
simulate([rng::AbstractRNG], dyn::LatentDynamics, step::Integer, state, extra)

Simulate a transition of the latent dynamics.

The method should return a random state for the next time step given the state `state` at the current time step, `step`.

The default behaviour is to generate a random sample from the distribution returned by the corresponding `distribution()` method.

See also LatentDynamics.

source
SSMProblems.simulateMethod
simulate([rng::AbstractRNG], process::ObservationProcess, step::Integer, state, extra)

Simulate an observation given the current state.

The method should return a random observation given the current state `state` at time step `step`.

The default behaviour is to generate a random sample from the distribution returned by the corresponding `distribution()` method.

See also ObservationProcess.

source
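Pulling the pieces above together, the distribution-based route through the interface can be sketched in plain Julia (with a hand-rolled `Gaussian` standing in for a `Distributions.Distribution`; the type and function names here are illustrative, and real code would subtype `LatentDynamics` and `ObservationProcess`):

```julia
# Hand-rolled stand-in for a Distributions.Distribution: just enough to show
# how `simulate` and `logdensity` fall out of `distribution`.
struct Gaussian
    μ::Float64
    σ::Float64
end
sample(d::Gaussian) = d.μ + d.σ * randn()
logpdf(d::Gaussian, x) = -0.5 * ((x - d.μ) / d.σ)^2 - log(d.σ) - 0.5 * log(2π)

struct Dyn end
struct Obs end

# The three distributions that fully specify the model:
init_distribution(::Dyn, extra) = Gaussian(0.0, 1.0)
distribution(::Dyn, step::Integer, state, extra) = Gaussian(state, 0.1)
distribution(::Obs, step::Integer, state, extra) = Gaussian(state, 0.5)

# Default-style fallbacks: simulate/logdensity defined in terms of distribution.
simulate(dyn::Dyn, step, state, extra) = sample(distribution(dyn, step, state, extra))
logdensity(obs::Obs, step, state, y, extra) = logpdf(distribution(obs, step, state, extra), y)

x0 = sample(init_distribution(Dyn(), nothing))
x1 = simulate(Dyn(), 1, x0, nothing)
ll = logdensity(Obs(), 1, 0.0, 0.25, nothing)
```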
  • Murray

    Murray, Lawrence & Lee, Anthony & Jacob, Pierre. (2013). Rethinking resampling in the particle filter on graphics processing units.

diff --git a/dev/search_index.js b/dev/search_index.js index 38115b0..170d4ce 100644 --- a/dev/search_index.js +++ b/dev/search_index.js @@ -1,3 +1,3 @@ var documenterSearchIndex = {"docs": -[{"location":"extras/#Control-Variables-and-Extras","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"","category":"section"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"All functions that form part of the SSMProblems model interface demand that a final positional argument called extra is included.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"This argument has multiple uses, but is generally used to pass in additional information to the model at inference time. Although this might seem unnecessary and clunky for simple models, this addition leads to a great amount of flexibility which allows complex and exotic models to be implemented with little effort or performance penalty.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"If your model does not require any extras, you can simply using Nothing as the type for this argument.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"When forward-simulating, filtering or smoothing from a model, a vector a extras is passed to the sampler, with each element corresponding to the extra argument for each timestep. 
Some advanced algorithms may also augment the extra vector with additional information.","category":"page"},{"location":"extras/#Use-as-Control-Variables","page":"Control Variables and Extras","title":"Use as Control Variables","text":"","category":"section"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"In simple cases extra can be treated as a control (or input) vector. For example, for data arriving at irregular time intervals, the extra argument could be the time deltas between observations. Or, in control engineering, the extra argument could be the control inputs to the system.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"Note, that it is also possible to store this data in the latent dynamic's struct and extract it during a transition (e.g. dyn.dts[timestep]). However, this approach has the disadvantage that the control variables must be defined when the model is instantiated. Further, this means that re-runs with new control variables require a re-instantiation of the model.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"Using extra for control variables allows for a separation between the abstract definition of the state space model and the concrete simulation or inference given specific data.","category":"page"},{"location":"extras/#Use-with-Streaming-Data","page":"Control Variables and Extras","title":"Use with Streaming Data","text":"","category":"section"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"The de-coupling of model definition and data that comes from using extra makes it easy to use SSMProblems with streaming data. 
As control variables arrive, these can be passed to the model distributions via the extra argument.","category":"page"},{"location":"extras/#Use-in-Rao-Blackwellisation","page":"Control Variables and Extras","title":"Use in Rao-Blackwellisation","text":"","category":"section"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"Briefly, a Rao-Blackwellised particle filter is an efficient variant of the generic particle filter that can be applied to state space models that have an analytically tractable sub-model. The filter behaves as two nested filters, a regular particle filter for the outer model, and an analytic filter (e.g. Kalman filter) for the inner sub-model.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"Since the value of the extra argument can be defined at inference time, the outer filter can pass information to the inner filter via the extra argument. This leads to a clean and generic interface for Rao-Blackwellised filtering which is not possible with other state space model packages.","category":"page"},{"location":"#SSMProblems","page":"Home","title":"SSMProblems","text":"","category":"section"},{"location":"#Installation","page":"Home","title":"Installation","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"In the julia REPL:","category":"page"},{"location":"","page":"Home","title":"Home","text":"] add SSMProblems","category":"page"},{"location":"#Documentation","page":"Home","title":"Documentation","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"SSMProblems defines a generic interface for state space models (SSMs). 
Its main objective is to provide a consistent interface for filtering and smoothing algorithms to interact with.","category":"page"},{"location":"","page":"Home","title":"Home","text":"Consider a standard (Markovian) state-space model from[Murray]: (Image: state space model)","category":"page"},{"location":"","page":"Home","title":"Home","text":"[Murray]: Murray, Lawrence & Lee, Anthony & Jacob, Pierre. (2013). Rethinking resampling in the particle filter on graphics processing units.","category":"page"},{"location":"","page":"Home","title":"Home","text":"The following three distributions fully specify the model:","category":"page"},{"location":"","page":"Home","title":"Home","text":"The initialisation distribution, f_0, for the initial latent state X_0\nThe transition distribution, f, for the latent state X_t given the previous X_t-1\nThe observation distribution, g, for an observation Y_t given the state X_t","category":"page"},{"location":"","page":"Home","title":"Home","text":"The dynamics of the model are given by,","category":"page"},{"location":"","page":"Home","title":"Home","text":"beginaligned\nx_0 sim f_0(x_0) \nx_t x_t-1 sim f(x_t x_t-1) \ny_t x_t sim g(y_t x_t)\nendaligned","category":"page"},{"location":"","page":"Home","title":"Home","text":"and the joint law is,","category":"page"},{"location":"","page":"Home","title":"Home","text":"p(x_0T y_0T) = f_0(x_0) prod_t g(y_t x_t) f(x_t x_t-1)","category":"page"},{"location":"","page":"Home","title":"Home","text":"We can consider a state space model as being made up of two components:","category":"page"},{"location":"","page":"Home","title":"Home","text":"A latent Markov chain describing the evolution of the latent state\nAn observation process describing the relationship between the latent states and the observations","category":"page"},{"location":"","page":"Home","title":"Home","text":"Through this lens, we see that the distributions f_0, f fully describe the latent Markov chain, whereas g describes the 
observation process.","category":"page"},{"location":"","page":"Home","title":"Home","text":"A user of SSMProblems may define these three distributions directly. Alternatively, they can define a subset of methods for sampling and evaluating log-densities of the distributions, depending on the requirements of the filtering/smoothing algorithms they intend to use.","category":"page"},{"location":"","page":"Home","title":"Home","text":"Using the first approach, we can define a simple linear state space model as follows:","category":"page"},{"location":"","page":"Home","title":"Home","text":"using Distributions\nusing SSMProblems\n\nstruct SimpleLatentDynamics <: LatentDynamics end\n\nfunction distribution(rng::AbstractRNG, dyn::SimpleLatentDynamics, extra::Nothing)\n return Normal(0.0, 1.0)\nend\n\nfunction distribution(rng::AbstractRNG, dyn::SimpleLatentDynamics, step::Int, state::Float64, extra::Nothing)\n return Normal(state, 0.1)\nend\n\nstruct SimpleObservationProcess <: ObservationProcess end\n\nfunction distribution(\n obs::SimpleObservationPRocess, step::Int, state::Float64, observation::Float64, extra::Nothing\n)\n return Normal(state, 0.5)\nend\n\n# Construct an SSM from the components\ndyn = SimpleLatentDynamics()\nobs = SimpleObservationProcess()\nmodel = StateSpaceModel(dyn, obs)","category":"page"},{"location":"","page":"Home","title":"Home","text":"There are a few things to note here:","category":"page"},{"location":"","page":"Home","title":"Home","text":"Two methods must be defined for the LatentDynamics, one containing step/state arguments and use for transitioning, and one without these, used for initialisation. the model is time-homogeneous, these are not required in the function bodies.\nEvery function takes an extra argument. This is part of the \"secret sauce\" of SSMProblems that allows it to flexibly represent more exotic models without any performance penalty. 
You can read more about it here.\nIf your latent dynamics and observation process cannot be represented as a Distribution object, you may implement specific methods for sampling and log-density evaluation as documented below.","category":"page"},{"location":"","page":"Home","title":"Home","text":"These distribution definitions are used to define simulate and logdensity methods for the latent dynamics and observation process. Package users can then interact with the state space model through these functions.","category":"page"},{"location":"","page":"Home","title":"Home","text":"For example, a bootstrap filter targeting the filtering distribution p(x_t y_0t) using N particles would roughly follow:","category":"page"},{"location":"","page":"Home","title":"Home","text":"dyn, obs = model.dyn, model.obs\n\nfor (i, observation) in enumerate(observations)\n idx = resample(rng, log_weights)\n particles = particles[idx]\n for i in 1:N\n particles[i] = simulate(rng, dyn, i, particles[i], nothing)\n log_weights[i] += logdensity(obs, i, particles[i], observation, nothing)\n end\nend","category":"page"},{"location":"","page":"Home","title":"Home","text":"For more thorough examples, see the provided example scripts.","category":"page"},{"location":"#Interface","page":"Home","title":"Interface","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"Modules = [SSMProblems]\nOrder = [:type, :function, :module]","category":"page"},{"location":"#SSMProblems.AbstractStateSpaceModel","page":"Home","title":"SSMProblems.AbstractStateSpaceModel","text":"An abstract type for state space models.\n\nAny concrete subtype of `AbstractStateSpaceModel` should implement a method for\n`AbstractMCMC.sample` which performs forward simulation. 
For an example implementation,\nsee [AbstractMCMC.sample(::StateSpaceModel)](@ref).\n\nFor most regular use-cases, the predefined `StateSpaceModel` type, documented below,\nshould be sufficient.\n\n\n\n\n\n","category":"type"},{"location":"#SSMProblems.LatentDynamics","page":"Home","title":"SSMProblems.LatentDynamics","text":"Latent dynamics of a state space model.\n\nAny concrete subtype of `LatentDynamics` should implement the functions `logdensity` and\n`simulate`, by defining two methods as documented below, one for initialisation and one\nfor transitioning. Whether each of these functions need to be implemented depends on the\nexact inference algorithm that is intended to be used.\n\nAlternatively, you may specify methods for the function `distribution` which will be\nused to define the above methods.\n\n\n\n\n\n","category":"type"},{"location":"#SSMProblems.ObservationProcess","page":"Home","title":"SSMProblems.ObservationProcess","text":"Observation process of a state space model.\n\nAny concrete subtype of `ObservationProcess` must implement the `logdensity`\nmethod, as defined below. 
Optionally, it may also implement `simulate` for use in\nforward simulation of the state space model.\n\nAlternatively, you may specify a method for `distribution`, which will be used to define\nboth of the above methods.\n\n\n\n\n\n","category":"type"},{"location":"#SSMProblems.StateSpaceModel","page":"Home","title":"SSMProblems.StateSpaceModel","text":"A state space model.\n\nA vanilla implementation of a state space model, composed of a latent dynamics and an\nobservation process.\n\n# Fields\n- `dyn::LD`: The latent dynamics of the state space model.\n- `obs::OP`: The observation process of the state space model.\n\n\n\n\n\n","category":"type"},{"location":"#SSMProblems.distribution-Tuple{LatentDynamics, Any}","page":"Home","title":"SSMProblems.distribution","text":"distribution(dyn::LatentDynamics, extra)\n\nReturn the initialisation distribution for the latent dynamics.\n\nThe method should return the distribution of the initial state of the latent dynamics. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.\n\nSee also LatentDynamics.\n\nReturns\n\nDistributions.Distribution: The distribution of the initial state.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.distribution-Tuple{LatentDynamics, Integer, Any, Any}","page":"Home","title":"SSMProblems.distribution","text":"distribution(dyn::LatentDynamics, step::Integer, state, extra)\n\nReturn the transition distribution for the latent dynamics.\n\nThe method should return the distribution of the state for the next time step given the current state state at time step step. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations. 
\n\nSee also LatentDynamics.\n\nReturns\n\nDistributions.Distribution: The distribution of the new state.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.distribution-Tuple{ObservationProcess, Integer, Any, Any}","page":"Home","title":"SSMProblems.distribution","text":"distribution(obs::ObservationProcess, step::Integer, state, extra)\n\nReturn the observation distribution for the observation process.\n\nThe method should return the distribution of an observation given the current state state at time step step. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.\n\nSee also ObservationProcess.\n\nReturns\n\nDistributions.Distribution: The distribution of the observation.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.logdensity-Tuple{LatentDynamics, Any, Any}","page":"Home","title":"SSMProblems.logdensity","text":"logdensity(dyn::LatentDynamics, new_state, extra)\n\nCompute the log-density of an initial state for the latent dynamics.\n\nThe method should return the log-density of the initial state new_state for the initial time step of the latent dynamics.\n\nThe default behaviour is to compute the log-density of the distribution return by the corresponding distribution() method.\n\nSee also LatentDynamics.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.logdensity-Tuple{LatentDynamics, Integer, Any, Any, Any}","page":"Home","title":"SSMProblems.logdensity","text":"logdensity(dyn::LatentDynamics, step::Integer, state, new_state, extra)\n\nCompute the log-density of a transition of the latent dynamics.\n\nThe method should return the log-density of the new state new_state given the current state state at time step step.\n\nThe default behaviour is to compute the log-density of the distribution return by the corresponding distribution() method.\n\nSee also 
LatentDynamics.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.logdensity-Tuple{ObservationProcess, Integer, Any, Any, Any}","page":"Home","title":"SSMProblems.logdensity","text":"logdensity(obs::ObservationProcess, step::Integer, state, observation, extra)\n\nCompute the log-density of an observation given the current state.\n\nThe method should return the log-density of the observation observation given the current state state at time step step.\n\nThe default behaviour is to compute the log-density of the distribution returned by the corresponding distribution() method.\n\nSee also ObservationProcess.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.simulate-Tuple{Random.AbstractRNG, LatentDynamics, Any}","page":"Home","title":"SSMProblems.simulate","text":"simulate([rng::AbstractRNG], dyn::LatentDynamics, extra)\n\nSimulate an initial state for the latent dynamics.\n\nThe method should return a random initial state for the first time step of the latent\ndynamics.\n\nThe default behaviour is to generate a random sample from the distribution returned by the\ncorresponding `distribution()` method.\n\nSee also [`LatentDynamics`](@ref).\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.simulate-Tuple{Random.AbstractRNG, LatentDynamics, Integer, Any, Any}","page":"Home","title":"SSMProblems.simulate","text":"simulate([rng::AbstractRNG], dyn::LatentDynamics, step::Integer, state, extra)\n\nSimulate a transition of the latent dynamics.\n\nThe method should return a random state for the next time step given the state state at the current time step, step.\n\nThe default behaviour is to generate a random sample from the distribution returned by the corresponding distribution() method.\n\nSee also LatentDynamics.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.simulate-Tuple{Random.AbstractRNG, ObservationProcess, Integer, Any, Any}","page":"Home","title":"SSMProblems.simulate","text":"simulate([rng::AbstractRNG], process::ObservationProcess, 
step::Integer, state, extra)\n\nSimulate an observation given the current state.\n\nThe method should return a random observation given the current state state at time step step.\n\nThe default behaviour is to generate a random sample from the distribution returned by the corresponding distribution() method.\n\nSee also ObservationProcess.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.SSMProblems","page":"Home","title":"SSMProblems.SSMProblems","text":"A unified interface to define state space models in the context of particle MCMC algorithms.\n\n\n\n\n\n","category":"module"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"EditURL = \"../../../examples/kalman-filter/script.jl\"","category":"page"},{"location":"examples/kalman-filter/#Kalman-Filter","page":"Kalman Filter","title":"Kalman Filter","text":"","category":"section"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"This example implements a Kalman filter for a linear Gaussian state space model using the SSMProblems interface.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"using AbstractMCMC\nusing Distributions\nusing LinearAlgebra\nusing Plots\nusing Random\nusing UnPack\n\nusing SSMProblems","category":"page"},{"location":"examples/kalman-filter/#Model-Definition","page":"Kalman Filter","title":"Model Definition","text":"","category":"section"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We start by defining structs to store the parameters for our specific latent dynamics and observation process.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"The latent dynamics have the following form:","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"x[0] = z + ϵ, ϵ ∼ N(0, P)\nx[k] = 
Φx[k-1] + b + w[k], w[k] ∼ N(0, Q)","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We store all of these parameters in a struct.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"struct LinearGaussianLatentDynamics{T<:Real} <: LatentDynamics\n z::Vector{T}\n P::Matrix{T}\n Φ::Matrix{T}\n b::Vector{T}\n Q::Matrix{T}\nend","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"Similarly, the observation process is defined by the following equation:","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"y[k] = Hx[k] + v[k], v[k] ∼ N(0, R)","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"struct LinearGaussianObservationProcess{T<:Real} <: ObservationProcess\n H::Matrix{T}\n R::Matrix{T}\nend","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We then define general transition and observation distributions to be used in forward simulation. Since our model is homogeneous (doesn't depend on control variables), we use Nothing for the type of extra.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"Even if we did not require forward simulation (e.g. we were given observations), it is still useful to define these methods as they allow us to run a particle filter on our model with no additional implementation required. 
Although a Kalman filter would generally be preferred in this linear Gaussian case, it may be of interest to compare the sampling performance with a general particle filter.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"function SSMProblems.distribution(model::LinearGaussianLatentDynamics, extra::Nothing)\n return MvNormal(model.z, model.P)\nend\nfunction SSMProblems.distribution(\n model::LinearGaussianLatentDynamics{T},\n step::Int,\n state::AbstractVector{T},\n extra::Nothing,\n) where {T}\n return MvNormal(model.Φ * state + model.b, model.Q)\nend\nfunction SSMProblems.distribution(\n model::LinearGaussianObservationProcess{T},\n step::Int,\n state::AbstractVector{T},\n extra::Nothing,\n) where {T}\n return MvNormal(model.H * state, model.R)\nend","category":"page"},{"location":"examples/kalman-filter/#Filtering-Algorithm","page":"Kalman Filter","title":"Filtering Algorithm","text":"","category":"section"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We define a concrete type to represent our sampling algorithm. 
This is used for dispatch, for example, to distinguish this algorithm from a generic particle filter.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"struct KalmanFilter end","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"A Kalman filter is only valid for linear Gaussian state space models, so we define an alias for an SSM with linear Gaussian latent dynamics and observation process, which will be used to dispatch to the correct method.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"const LinearGaussianSSM{T} = StateSpaceModel{\n <:LinearGaussianLatentDynamics{T},<:LinearGaussianObservationProcess{T}\n};","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We then define a method for the sample function. This is a standardised interface which requires the model we are sampling from and the sampling algorithm, as well as the observations and any extras.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"function AbstractMCMC.sample(\n model::LinearGaussianSSM{U},\n ::KalmanFilter,\n observations::AbstractVector,\n extras::AbstractVector,\n) where {U}\n T = length(observations)\n x_filts = Vector{Vector{U}}(undef, T)\n P_filts = Vector{Matrix{U}}(undef, T)\n\n @unpack z, P, Φ, b, Q = model.dyn ## Extract parameters\n @unpack H, R = model.obs\n\n for t in 1:T\n x_pred, P_pred = if t == 1\n z, P\n else\n Φ * x_filts[t - 1] + b, Φ * P_filts[t - 1] * Φ' + Q ## Prediction step\n end\n\n y = observations[t] ## Update step\n K = P_pred * H' / (H * P_pred * H' + R)\n x_filt = x_pred + K * (y - H * x_pred)\n P_filt = P_pred - K * H * P_pred\n\n x_filts[t] = x_filt\n P_filts[t] = P_filt\n end\n\n return x_filts, 
P_filts\nend","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"In this specific case, however, since our model is homogeneous, we do not expect to have any extras passed in. For convenience, we create a method without the extras argument which then replaces them with a vector of nothings. This pattern is not specific to linear Gaussian models or Kalman filters so we define it with general types.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"function AbstractMCMC.sample(\n model::StateSpaceModel, algorithm, observations::AbstractVector\n)\n extras = fill(nothing, length(observations))\n return sample(model, algorithm, observations, extras)\nend","category":"page"},{"location":"examples/kalman-filter/#Simulation-and-Filtering","page":"Kalman Filter","title":"Simulation and Filtering","text":"","category":"section"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We define the parameters for our model as follows.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"SEED = 1;\nT = 100;\nz = [-1.0, 1.0];\nP = Matrix(1.0I, 2, 2);\nΦ = [0.8 0.2; -0.1 0.8];\nb = zeros(2);\nQ = [0.2 0.0; 0.0 0.5];\nH = [1.0 0.0;];\nR = Matrix(0.3I, 1, 1);","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We can then construct the latent dynamics and observation process, before combining these to form a state space model.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"dyn = LinearGaussianLatentDynamics(z, P, Φ, b, Q);\nobs = LinearGaussianObservationProcess(H, R);\nmodel = StateSpaceModel(dyn, obs);","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"Synthetic data can be 
generated by directly sampling from the model. This calls a utility function from the SSMProblems package, which, in turn, calls the three distribution functions we defined above.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"rng = MersenneTwister(SEED);\nxs, ys = sample(rng, model, T);","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We can then run the Kalman filter and plot the filtering results against the ground truth.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"x_filts, P_filts = AbstractMCMC.sample(model, KalmanFilter(), ys);","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"Plot the trajectory for the first dimension.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"p = plot(; title=\"First Dimension Kalman Filter Estimates\", xlabel=\"Step\", ylabel=\"Value\")\nplot!(p, 1:T, first.(xs); label=\"Truth\")\nscatter!(p, 1:T, first.(ys); label=\"Observations\")\nplot!(\n p,\n 1:T,\n first.(x_filts);\n ribbon=sqrt.(first.(P_filts)),\n label=\"Filtered (±1σ)\",\n fillalpha=0.2,\n)","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"This page was generated 
using Literate.jl.","category":"page"}] +[{"location":"extras/#Control-Variables-and-Extras","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"","category":"section"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"All functions that form part of the SSMProblems model interface demand that a final positional argument called extra is included.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"This argument has multiple uses, but is generally used to pass in additional information to the model at inference time. Although this might seem unnecessary and clunky for simple models, this addition leads to a great amount of flexibility which allows complex and exotic models to be implemented with little effort or performance penalty.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"If your model does not require any extras, you can simply use Nothing as the type for this argument.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"When forward-simulating, filtering or smoothing from a model, a vector of extras is passed to the sampler, with each element corresponding to the extra argument for each timestep. Some advanced algorithms may also augment the extra vector with additional information.","category":"page"},{"location":"extras/#Use-as-Control-Variables","page":"Control Variables and Extras","title":"Use as Control Variables","text":"","category":"section"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"In simple cases extra can be treated as a control (or input) vector. 
For example, for data arriving at irregular time intervals, the extra argument could be the time deltas between observations. Or, in control engineering, the extra argument could be the control inputs to the system.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"Note that it is also possible to store this data in the latent dynamics struct and extract it during a transition (e.g. dyn.dts[timestep]). However, this approach has the disadvantage that the control variables must be defined when the model is instantiated. Further, this means that re-runs with new control variables require a re-instantiation of the model.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"Using extra for control variables allows for a separation between the abstract definition of the state space model and the concrete simulation or inference given specific data.","category":"page"},{"location":"extras/#Use-with-Streaming-Data","page":"Control Variables and Extras","title":"Use with Streaming Data","text":"","category":"section"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"The de-coupling of model definition and data that comes from using extra makes it easy to use SSMProblems with streaming data. As control variables arrive, these can be passed to the model distributions via the extra argument.","category":"page"},{"location":"extras/#Use-in-Rao-Blackwellisation","page":"Control Variables and Extras","title":"Use in Rao-Blackwellisation","text":"","category":"section"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"Briefly, a Rao-Blackwellised particle filter is an efficient variant of the generic particle filter that can be applied to state space models that have an analytically tractable sub-model. 
The filter behaves as two nested filters, a regular particle filter for the outer model, and an analytic filter (e.g. Kalman filter) for the inner sub-model.","category":"page"},{"location":"extras/","page":"Control Variables and Extras","title":"Control Variables and Extras","text":"Since the value of the extra argument can be defined at inference time, the outer filter can pass information to the inner filter via the extra argument. This leads to a clean and generic interface for Rao-Blackwellised filtering which is not possible with other state space model packages.","category":"page"},{"location":"#SSMProblems","page":"Home","title":"SSMProblems","text":"","category":"section"},{"location":"#Installation","page":"Home","title":"Installation","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"In the Julia REPL:","category":"page"},{"location":"","page":"Home","title":"Home","text":"] add SSMProblems","category":"page"},{"location":"#Documentation","page":"Home","title":"Documentation","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"SSMProblems defines a generic interface for state space models (SSMs). Its main objective is to provide a consistent interface for filtering and smoothing algorithms to interact with.","category":"page"},{"location":"","page":"Home","title":"Home","text":"Consider a standard (Markovian) state-space model from [Murray]: (Image: state space model)","category":"page"},{"location":"","page":"Home","title":"Home","text":"[Murray]: Murray, Lawrence & Lee, Anthony & Jacob, Pierre. (2013). 
Rethinking resampling in the particle filter on graphics processing units.","category":"page"},{"location":"","page":"Home","title":"Home","text":"The following three distributions fully specify the model:","category":"page"},{"location":"","page":"Home","title":"Home","text":"The initialisation distribution, f_0, for the initial latent state X_0\nThe transition distribution, f, for the latent state X_t given the previous X_t-1\nThe observation distribution, g, for an observation Y_t given the state X_t","category":"page"},{"location":"","page":"Home","title":"Home","text":"The dynamics of the model are given by,","category":"page"},{"location":"","page":"Home","title":"Home","text":"beginaligned\nx_0 sim f_0(x_0) \nx_t x_t-1 sim f(x_t x_t-1) \ny_t x_t sim g(y_t x_t)\nendaligned","category":"page"},{"location":"","page":"Home","title":"Home","text":"and the joint law is,","category":"page"},{"location":"","page":"Home","title":"Home","text":"p(x_0T y_0T) = f_0(x_0) prod_t g(y_t x_t) f(x_t x_t-1)","category":"page"},{"location":"","page":"Home","title":"Home","text":"We can consider a state space model as being made up of two components:","category":"page"},{"location":"","page":"Home","title":"Home","text":"A latent Markov chain describing the evolution of the latent state\nAn observation process describing the relationship between the latent states and the observations","category":"page"},{"location":"","page":"Home","title":"Home","text":"Through this lens, we see that the distributions f_0, f fully describe the latent Markov chain, whereas g describes the observation process.","category":"page"},{"location":"","page":"Home","title":"Home","text":"A user of SSMProblems may define these three distributions directly. 
Alternatively, they can define a subset of methods for sampling and evaluating log-densities of the distributions, depending on the requirements of the filtering/smoothing algorithms they intend to use.","category":"page"},{"location":"","page":"Home","title":"Home","text":"Using the first approach, we can define a simple linear state space model as follows:","category":"page"},{"location":"","page":"Home","title":"Home","text":"using Distributions\nusing SSMProblems\n\nstruct SimpleLatentDynamics <: LatentDynamics end\n\nfunction distribution(dyn::SimpleLatentDynamics, extra::Nothing)\n return Normal(0.0, 1.0)\nend\n\nfunction distribution(dyn::SimpleLatentDynamics, step::Int, state::Float64, extra::Nothing)\n return Normal(state, 0.1)\nend\n\nstruct SimpleObservationProcess <: ObservationProcess end\n\nfunction distribution(\n obs::SimpleObservationProcess, step::Int, state::Float64, extra::Nothing\n)\n return Normal(state, 0.5)\nend\n\n# Construct an SSM from the components\ndyn = SimpleLatentDynamics()\nobs = SimpleObservationProcess()\nmodel = StateSpaceModel(dyn, obs)","category":"page"},{"location":"","page":"Home","title":"Home","text":"There are a few things to note here:","category":"page"},{"location":"","page":"Home","title":"Home","text":"Two methods must be defined for the LatentDynamics: one containing step/state arguments, used for transitioning, and one without these, used for initialisation. Since the model is time-homogeneous, these arguments are not used in the function bodies.\nEvery function takes an extra argument. This is part of the \"secret sauce\" of SSMProblems that allows it to flexibly represent more exotic models without any performance penalty. 
You can read more about it here.\nIf your latent dynamics and observation process cannot be represented as a Distribution object, you may implement specific methods for sampling and log-density evaluation as documented below.","category":"page"},{"location":"","page":"Home","title":"Home","text":"These distribution definitions are used to define simulate and logdensity methods for the latent dynamics and observation process. Package users can then interact with the state space model through these functions.","category":"page"},{"location":"","page":"Home","title":"Home","text":"For example, a bootstrap filter targeting the filtering distribution p(x_t y_0t) using N particles would roughly follow:","category":"page"},{"location":"","page":"Home","title":"Home","text":"dyn, obs = model.dyn, model.obs\n\nfor (t, observation) in enumerate(observations)\n idx = resample(rng, log_weights)\n particles = particles[idx]\n for i in 1:N\n particles[i] = simulate(rng, dyn, t, particles[i], nothing)\n log_weights[i] += logdensity(obs, t, particles[i], observation, nothing)\n end\nend","category":"page"},{"location":"","page":"Home","title":"Home","text":"For more thorough examples, see the provided example scripts.","category":"page"},{"location":"#Interface","page":"Home","title":"Interface","text":"","category":"section"},{"location":"","page":"Home","title":"Home","text":"Modules = [SSMProblems]\nOrder = [:type, :function, :module]","category":"page"},{"location":"#SSMProblems.AbstractStateSpaceModel","page":"Home","title":"SSMProblems.AbstractStateSpaceModel","text":"An abstract type for state space models.\n\nAny concrete subtype of `AbstractStateSpaceModel` should implement a method for\n`AbstractMCMC.sample` which performs forward simulation. 
For an example implementation,\nsee [AbstractMCMC.sample(::StateSpaceModel)](@ref).\n\nFor most regular use-cases, the predefined `StateSpaceModel` type, documented below,\nshould be sufficient.\n\n\n\n\n\n","category":"type"},{"location":"#SSMProblems.LatentDynamics","page":"Home","title":"SSMProblems.LatentDynamics","text":"Latent dynamics of a state space model.\n\nAny concrete subtype of `LatentDynamics` should implement the functions `logdensity` and\n`simulate`, by defining two methods as documented below, one for initialisation and one\nfor transitioning. Whether each of these functions need to be implemented depends on the\nexact inference algorithm that is intended to be used.\n\nAlternatively, you may specify methods for the function `distribution` which will be\nused to define the above methods.\n\n\n\n\n\n","category":"type"},{"location":"#SSMProblems.ObservationProcess","page":"Home","title":"SSMProblems.ObservationProcess","text":"Observation process of a state space model.\n\nAny concrete subtype of `ObservationProcess` must implement the `logdensity`\nmethod, as defined below. 
Optionally, it may also implement `simulate` for use in\nforward simulation of the state space model.\n\nAlternatively, you may specify a method for `distribution`, which will be used to define\nboth of the above methods.\n\n\n\n\n\n","category":"type"},{"location":"#SSMProblems.StateSpaceModel","page":"Home","title":"SSMProblems.StateSpaceModel","text":"A state space model.\n\nA vanilla implementation of a state space model, composed of a latent dynamics and an\nobservation process.\n\n# Fields\n- `dyn::LD`: The latent dynamics of the state space model.\n- `obs::OP`: The observation process of the state space model.\n\n\n\n\n\n","category":"type"},{"location":"#SSMProblems.distribution-Tuple{LatentDynamics, Any}","page":"Home","title":"SSMProblems.distribution","text":"distribution(dyn::LatentDynamics, extra)\n\nReturn the initialisation distribution for the latent dynamics.\n\nThe method should return the distribution of the initial state of the latent dynamics. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.\n\nSee also LatentDynamics.\n\nReturns\n\nDistributions.Distribution: The distribution of the initial state.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.distribution-Tuple{LatentDynamics, Integer, Any, Any}","page":"Home","title":"SSMProblems.distribution","text":"distribution(dyn::LatentDynamics, step::Integer, state, extra)\n\nReturn the transition distribution for the latent dynamics.\n\nThe method should return the distribution of the state for the next time step given the current state state at time step step. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations. 
\n\nSee also LatentDynamics.\n\nReturns\n\nDistributions.Distribution: The distribution of the new state.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.distribution-Tuple{ObservationProcess, Integer, Any, Any}","page":"Home","title":"SSMProblems.distribution","text":"distribution(obs::ObservationProcess, step::Integer, state, extra)\n\nReturn the observation distribution for the observation process.\n\nThe method should return the distribution of an observation given the current state state at time step step. The returned value should be a Distributions.Distribution object that implements sampling and log-density calculations.\n\nSee also ObservationProcess.\n\nReturns\n\nDistributions.Distribution: The distribution of the observation.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.logdensity-Tuple{LatentDynamics, Any, Any}","page":"Home","title":"SSMProblems.logdensity","text":"logdensity(dyn::LatentDynamics, new_state, extra)\n\nCompute the log-density of an initial state for the latent dynamics.\n\nThe method should return the log-density of the initial state new_state for the initial time step of the latent dynamics.\n\nThe default behaviour is to compute the log-density of the distribution returned by the corresponding distribution() method.\n\nSee also LatentDynamics.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.logdensity-Tuple{LatentDynamics, Integer, Any, Any, Any}","page":"Home","title":"SSMProblems.logdensity","text":"logdensity(dyn::LatentDynamics, step::Integer, state, new_state, extra)\n\nCompute the log-density of a transition of the latent dynamics.\n\nThe method should return the log-density of the new state new_state given the current state state at time step step.\n\nThe default behaviour is to compute the log-density of the distribution returned by the corresponding distribution() method.\n\nSee also 
LatentDynamics.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.logdensity-Tuple{ObservationProcess, Integer, Any, Any, Any}","page":"Home","title":"SSMProblems.logdensity","text":"logdensity(obs::ObservationProcess, step::Integer, state, observation, extra)\n\nCompute the log-density of an observation given the current state.\n\nThe method should return the log-density of the observation observation given the current state state at time step step.\n\nThe default behaviour is to compute the log-density of the distribution returned by the corresponding distribution() method.\n\nSee also ObservationProcess.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.simulate-Tuple{Random.AbstractRNG, LatentDynamics, Any}","page":"Home","title":"SSMProblems.simulate","text":"simulate([rng::AbstractRNG], dyn::LatentDynamics, extra)\n\nSimulate an initial state for the latent dynamics.\n\nThe method should return a random initial state for the first time step of the latent\ndynamics.\n\nThe default behaviour is to generate a random sample from the distribution returned by the\ncorresponding `distribution()` method.\n\nSee also [`LatentDynamics`](@ref).\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.simulate-Tuple{Random.AbstractRNG, LatentDynamics, Integer, Any, Any}","page":"Home","title":"SSMProblems.simulate","text":"simulate([rng::AbstractRNG], dyn::LatentDynamics, step::Integer, state, extra)\n\nSimulate a transition of the latent dynamics.\n\nThe method should return a random state for the next time step given the state state at the current time step, step.\n\nThe default behaviour is to generate a random sample from the distribution returned by the corresponding distribution() method.\n\nSee also LatentDynamics.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.simulate-Tuple{Random.AbstractRNG, ObservationProcess, Integer, Any, Any}","page":"Home","title":"SSMProblems.simulate","text":"simulate([rng::AbstractRNG], process::ObservationProcess, 
step::Integer, state, extra)\n\nSimulate an observation given the current state.\n\nThe method should return a random observation given the current state state at time step step.\n\nThe default behaviour is to generate a random sample from the distribution returned by the corresponding distribution() method.\n\nSee also ObservationProcess.\n\n\n\n\n\n","category":"method"},{"location":"#SSMProblems.SSMProblems","page":"Home","title":"SSMProblems.SSMProblems","text":"A unified interface to define state space models in the context of particle MCMC algorithms.\n\n\n\n\n\n","category":"module"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"EditURL = \"../../../examples/kalman-filter/script.jl\"","category":"page"},{"location":"examples/kalman-filter/#Kalman-Filter","page":"Kalman Filter","title":"Kalman Filter","text":"","category":"section"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"This example implements a Kalman filter for a linear Gaussian state space model using the SSMProblems interface.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"using AbstractMCMC\nusing Distributions\nusing LinearAlgebra\nusing Plots\nusing Random\nusing UnPack\n\nusing SSMProblems","category":"page"},{"location":"examples/kalman-filter/#Model-Definition","page":"Kalman Filter","title":"Model Definition","text":"","category":"section"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We start by defining structs to store the parameters for our specific latent dynamics and observation process.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"The latent dynamics have the following form:","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"x[0] = z + ϵ, ϵ ∼ N(0, P)\nx[k] = 
Φx[k-1] + b + w[k], w[k] ∼ N(0, Q)","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We store all of these parameters in a struct.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"struct LinearGaussianLatentDynamics{T<:Real} <: LatentDynamics\n z::Vector{T}\n P::Matrix{T}\n Φ::Matrix{T}\n b::Vector{T}\n Q::Matrix{T}\nend","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"Similarly, the observation process is defined by the following equation:","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"y[k] = Hx[k] + v[k], v[k] ∼ N(0, R)","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"struct LinearGaussianObservationProcess{T<:Real} <: ObservationProcess\n H::Matrix{T}\n R::Matrix{T}\nend","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We then define general transition and observation distributions to be used in forward simulation. Since our model is homogeneous (it does not depend on control variables), we use Nothing for the type of extra.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"Even if we did not require forward simulation (e.g. if we were given observations), it is still useful to define these methods, as they allow us to run a particle filter on our model with no additional implementation required. 
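Before wiring the model into the SSMProblems interface, it can help to see what the model equations above amount to when sampled directly. The following standalone sketch draws one forward step and one observation by hand using Distributions.jl; the parameter values are the same ones used in the simulation section later in this example, and nothing here depends on SSMProblems.

```julia
using Distributions, LinearAlgebra, Random

# Parameter values from the simulation section of this example
z = [-1.0, 1.0];          P = Matrix(1.0I, 2, 2)
Φ = [0.8 0.2; -0.1 0.8];  b = zeros(2);  Q = [0.2 0.0; 0.0 0.5]
H = [1.0 0.0;];           R = Matrix(0.3I, 1, 1)

rng = MersenneTwister(1)
x0 = rand(rng, MvNormal(z, P))           # x[0] = z + ϵ,         ϵ ∼ N(0, P)
x1 = rand(rng, MvNormal(Φ * x0 + b, Q))  # x[1] = Φx[0] + b + w, w ∼ N(0, Q)
y1 = rand(rng, MvNormal(H * x1, R))      # y[1] = Hx[1] + v,     v ∼ N(0, R)
```

Each rand call corresponds to one of the three distribution methods defined next: the initialisation, transition, and observation distributions respectively.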
Although a Kalman filter would generally be preferred in this linear Gaussian case, it may be of interest to compare the sampling performance with a general particle filter.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"function SSMProblems.distribution(model::LinearGaussianLatentDynamics, extra::Nothing)\n return MvNormal(model.z, model.P)\nend\nfunction SSMProblems.distribution(\n model::LinearGaussianLatentDynamics{T},\n step::Int,\n state::AbstractVector{T},\n extra::Nothing,\n) where {T}\n return MvNormal(model.Φ * state + model.b, model.Q)\nend\nfunction SSMProblems.distribution(\n model::LinearGaussianObservationProcess{T},\n step::Int,\n state::AbstractVector{T},\n extra::Nothing,\n) where {T}\n return MvNormal(model.H * state, model.R)\nend","category":"page"},{"location":"examples/kalman-filter/#Filtering-Algorithm","page":"Kalman Filter","title":"Filtering Algorithm","text":"","category":"section"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We define a concrete type to represent our sampling algorithm. 
This is used for dispatch, for example to distinguish the Kalman filter from a generic particle filter.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"struct KalmanFilter end","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"A Kalman filter is only valid for linear Gaussian state space models, so we define an alias for an SSM with linear Gaussian latent dynamics and observation process, which will be used to dispatch to the correct method.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"const LinearGaussianSSM{T} = StateSpaceModel{\n <:LinearGaussianLatentDynamics{T},<:LinearGaussianObservationProcess{T}\n};","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We then define a method for the sample function. This is a standardised interface which requires the model we are sampling from, the sampling algorithm, as well as the observations and any extras.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"function AbstractMCMC.sample(\n model::LinearGaussianSSM{U},\n ::KalmanFilter,\n observations::AbstractVector,\n extras::AbstractVector,\n) where {U}\n T = length(observations)\n x_filts = Vector{Vector{U}}(undef, T)\n P_filts = Vector{Matrix{U}}(undef, T)\n\n @unpack z, P, Φ, b, Q = model.dyn ## Extract parameters\n @unpack H, R = model.obs\n\n for t in 1:T\n x_pred, P_pred = if t == 1\n z, P\n else\n Φ * x_filts[t - 1] + b, Φ * P_filts[t - 1] * Φ' + Q ## Prediction step\n end\n\n y = observations[t] ## Update step\n K = P_pred * H' / (H * P_pred * H' + R)\n x_filt = x_pred + K * (y - H * x_pred)\n P_filt = P_pred - K * H * P_pred\n\n x_filts[t] = x_filt\n P_filts[t] = P_filt\n end\n\n return x_filts, 
P_filts\nend","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"In this specific case, however, since our model is homogeneous, we do not expect any extras to be passed in. For convenience, we create a method without the extras argument, which replaces them with a vector of nothings. This pattern is not specific to linear Gaussian models or Kalman filters, so we define it with general types.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"function AbstractMCMC.sample(\n model::StateSpaceModel, algorithm, observations::AbstractVector\n)\n extras = fill(nothing, length(observations))\n return sample(model, algorithm, observations, extras)\nend","category":"page"},{"location":"examples/kalman-filter/#Simulation-and-Filtering","page":"Kalman Filter","title":"Simulation and Filtering","text":"","category":"section"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We define the parameters for our model as follows.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"SEED = 1;\nT = 100;\nz = [-1.0, 1.0];\nP = Matrix(1.0I, 2, 2);\nΦ = [0.8 0.2; -0.1 0.8];\nb = zeros(2);\nQ = [0.2 0.0; 0.0 0.5];\nH = [1.0 0.0;];\nR = Matrix(0.3I, 1, 1);","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We can then construct the latent dynamics and observation process, before combining these to form a state space model.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"dyn = LinearGaussianLatentDynamics(z, P, Φ, b, Q);\nobs = LinearGaussianObservationProcess(H, R);\nmodel = StateSpaceModel(dyn, obs);","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"Synthetic data can be 
generated by directly sampling from the model. This calls a utility function from the SSMProblems package, which in turn calls the three distribution functions we defined above.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"rng = MersenneTwister(SEED);\nxs, ys = sample(rng, model, T);","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We can then run the Kalman filter and plot the filtering results against the ground truth.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"x_filts, P_filts = AbstractMCMC.sample(model, KalmanFilter(), ys);","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"We then plot the trajectory for the first dimension.","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"p = plot(; title=\"First Dimension Kalman Filter Estimates\", xlabel=\"Step\", ylabel=\"Value\")\nplot!(p, 1:T, first.(xs); label=\"Truth\")\nscatter!(p, 1:T, first.(ys); label=\"Observations\")\nplot!(\n p,\n 1:T,\n first.(x_filts);\n ribbon=sqrt.(first.(P_filts)),\n label=\"Filtered (±1σ)\",\n fillalpha=0.2,\n)","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"","category":"page"},{"location":"examples/kalman-filter/","page":"Kalman Filter","title":"Kalman Filter","text":"This page was generated 
using Literate.jl.","category":"page"}] }