
Commit

WIP: reviewing new api (#95)
* Equivalent to, not equivalent with

* Include dispersion entropy

* Clearer wording

* Symbolization routines need to be imported before estimators using them

* Export in-place version.

* Follow Julia's linting conventions

* Run CI on all pull requests

* Don't export in-place version

* Missing a letter
kahaaga authored Sep 19, 2022
1 parent 608a804 commit aace207
Showing 6 changed files with 15 additions and 12 deletions.
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
@@ -3,6 +3,7 @@ on:
  pull_request:
    branches:
      - main
+     - '**' # matches every branch
  push:
    branches:
      - main
2 changes: 1 addition & 1 deletion docs/src/index.md
@@ -50,4 +50,4 @@ The input data type typically depend on the probability estimator chosen. In gen

- _Timeseries_, which are `AbstractVector{<:Real}`, used in e.g. with [`WaveletOverlap`](@ref).
- _Multi-dimensional timeseries, or datasets, or state space sets_, which are `Dataset`, used e.g. with [`NaiveKernel`](@ref).
-- _Spatial data_, which are higher dimensional standard `Array`, used e.g. with [`SpatialSymbolicPermutation`](@ref).
+- _Spatial data_, which are higher dimensional standard `Array`s, used e.g. with [`SpatialSymbolicPermutation`](@ref).
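The bullet list above maps onto the `probabilities` API roughly as in the sketch below. This is a hedged illustration only: the estimator arguments (the `NaiveKernel` radius and the `SpatialSymbolicPermutation` stencil) are invented for the example and are not part of this commit.

```julia
# Minimal sketch of the three input kinds; constructor arguments are
# illustrative assumptions, so check each estimator's docstring.
using Entropies

# Timeseries: AbstractVector{<:Real}
x = rand(1000)
p_wavelet = probabilities(x, WaveletOverlap())

# Multi-dimensional timeseries / state space set: Dataset
D = Dataset(rand(1000, 3))
p_kernel = probabilities(D, NaiveKernel(0.2))

# Spatial data: a higher-dimensional standard Array
A = rand(50, 50)
stencil = CartesianIndex.([(0, 1), (1, 0), (1, 1)])   # assumed stencil format
p_spatial = probabilities(A, SpatialSymbolicPermutation(stencil, A))
```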
4 changes: 2 additions & 2 deletions src/Entropies.jl
@@ -1,7 +1,7 @@
"""
A Julia package that provides estimators for probabilities, entropies,
and complexity measures for timeseries, nonlinear dynamics and complex systems.
-It can be used as standalone or part of several projects in the JuliaDynamics organization,
+It can be used as a standalone package, or as part of several projects in the JuliaDynamics organization,
such as [DynamicalSystems.jl](https://juliadynamics.github.io/DynamicalSystems.jl/dev/)
or [CausalityTools.jl](https://juliadynamics.github.io/CausalityTools.jl/dev/).
"""
@@ -12,10 +12,10 @@ using DelayEmbeddings: AbstractDataset, Dataset, dimension
export AbstractDataset, Dataset
const Array_or_Dataset = Union{<:AbstractArray, <:AbstractDataset}

include("symbolization/symbolize.jl")
include("probabilities.jl")
include("probabilities_estimators/probabilities_estimators.jl")
include("entropies/entropies.jl")
include("symbolization/symbolize.jl")
include("deprecations.jl")


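The include reordering above matters because `include` evaluates a file immediately: type annotations in the estimator files are resolved at load time, so the symbolization definitions must already exist. A self-contained sketch with invented names (nothing here is actual Entropies.jl code):

```julia
# Hypothetical module illustrating the load-order constraint.
module IncludeOrderDemo

# What a "symbolize.jl"-like file would contribute first:
abstract type SymbolizationScheme end
struct OrdinalPattern <: SymbolizationScheme end

# What a later estimator file would contribute. The annotations below require
# SymbolizationScheme to exist already; swapping the include order would
# raise an UndefVarError when the module is loaded.
struct DemoEstimator{S<:SymbolizationScheme}
    scheme::S
end
DemoEstimator() = DemoEstimator(OrdinalPattern())

end # module

IncludeOrderDemo.DemoEstimator()  # constructs with the default scheme
```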
4 changes: 3 additions & 1 deletion src/entropies/entropies.jl
@@ -3,4 +3,6 @@ include("tsallis.jl")
include("shannon.jl")
include("convenience_definitions.jl")
include("direct_entropies/nearest_neighbors/nearest_neighbors.jl")
-# TODO: What else is included here from direct entropies?
+include("direct_entropies/entropy_dispersion.jl")
+
+# TODO: What else is included here from direct entropies?
2 changes: 1 addition & 1 deletion src/entropies/shannon.jl
@@ -2,7 +2,7 @@ export entropy_shannon

"""
entropy_shannon(args...; base = MathConstants.e)
-Equivalent with `entropy_renyi(args...; base, q = 1)` and provided solely for convenience.
+Equivalent to `entropy_renyi(args...; base, q = 1)` and provided solely for convenience.
Compute the Shannon entropy, given by
```math
H(p) = - \\sum_i p[i] \\log(p[i])
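The equivalence stated in the docstring can be checked directly. A hedged sketch, reusing the `VisitationFrequency`/`RectangularBinning` estimator that appears in the tests below; the bin size and data are arbitrary:

```julia
# Sketch: entropy_shannon should agree with entropy_renyi at q = 1.
using Entropies

D = Dataset(rand(10_000, 2))                        # arbitrary test data
est = VisitationFrequency(RectangularBinning(0.1))  # arbitrary bin size

h_sh = entropy_shannon(D, est; base = 2)
h_r1 = entropy_renyi(D, est; q = 1, base = 2)
h_sh ≈ h_r1  # expected to hold
```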
14 changes: 7 additions & 7 deletions test/runtests.jl
@@ -158,16 +158,16 @@ end
@test sum(p2) ≈ 1.0

# Entropies
-@test entropy_renyi!(s, x, est, q = 1) ≈ 0 # Regular order-1 entropy
-@test entropy_renyi!(s, y, est, q = 1) >= 0 # Regular order-1 entropy
-@test entropy_renyi!(s, x, est, q = 2) ≈ 0 # Higher-order entropy
-@test entropy_renyi!(s, y, est, q = 2) >= 0 # Higher-order entropy
+@test Entropies.entropy_renyi!(s, x, est, q = 1) ≈ 0 # Regular order-1 entropy
+@test Entropies.entropy_renyi!(s, y, est, q = 1) >= 0 # Regular order-1 entropy
+@test Entropies.entropy_renyi!(s, x, est, q = 2) ≈ 0 # Higher-order entropy
+@test Entropies.entropy_renyi!(s, y, est, q = 2) >= 0 # Higher-order entropy

# For a time series
sz = zeros(Int, N - (est.m-1)*est.τ)
@test probabilities!(sz, z, est) isa Probabilities
@test probabilities(z, est) isa Probabilities
-@test entropy_renyi!(sz, z, est) isa Real
+@test Entropies.entropy_renyi!(sz, z, est) isa Real
@test entropy_renyi(z, est) isa Real
end
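With the in-place version no longer exported, the mutating call goes through the module name while `probabilities!` stays exported. A sketch of the pattern used in the hunk above; the `SymbolicPermutation` estimator is an assumption, since the tests construct `est` outside the shown lines:

```julia
# Hedged sketch of the in-place pattern; the estimator choice is assumed.
using Entropies

x = rand(1000)
est = SymbolicPermutation(m = 3, τ = 1)

# Preallocate the symbol vector, sized as in the tests: N - (m - 1)τ entries.
s = zeros(Int, length(x) - (est.m - 1) * est.τ)

p = probabilities!(s, x, est)                   # exported, fills s in place
h = Entropies.entropy_renyi!(s, x, est, q = 1)  # in-place entropy, now qualified
```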

@@ -290,7 +290,7 @@ end
RectangularBinning([0.2, 0.3, 0.3])
]

@testset "Binning test $i" for i in 1:length(binnings)
@testset "Binning test $i" for i in eachindex(binnings)
est = VisitationFrequency(binnings[i])
@test probabilities(D, est) isa Probabilities
@test entropy_renyi(D, est, q=1, base = 3) isa Real # Regular order-1 entropy
@@ -310,7 +310,7 @@ end
RectangularBinning([0.2, 0.3, 0.3])
]

@testset "Binning test $i" for i in 1:length(binnings)
@testset "Binning test $i" for i in eachindex(binnings)
to = Entropies.transferoperator(D, binnings[i])
@test to isa Entropies.TransferOperatorApproximationRectangular

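The `eachindex` change is the linting fix from the commit message: it iterates the container's own indices rather than assuming they run from 1 to `length`. A tiny sketch with placeholder values:

```julia
binnings_demo = ["a", "b", "c"]   # placeholder standing in for the binnings above

for i in 1:length(binnings_demo)      # old pattern: assumes 1-based indices
    println(i, " => ", binnings_demo[i])
end

for i in eachindex(binnings_demo)     # preferred: generic over index styles
    println(i, " => ", binnings_demo[i])
end
```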
