Fluctuation complexity, restrict possibilities to formally defined self-informations #413
base: main
@@ -1,4 +1,5 @@
export information, information_maximum, information_normalized, convert_logunit
export self_information
export entropy

###########################################################################################

@@ -279,6 +280,39 @@ function information(::InformationMeasure, ::DifferentialInfoEstimator, args...)
    ))
end

"""
    self_information(measure::InformationMeasure, pᵢ)

Compute the "self-information"/"surprisal" of a single probability `pᵢ` under the given
information measure.

This function assumes `pᵢ > 0`, so make sure to pre-filter your probabilities.
Review thread:

- "Just require it and throw an error in the function body."
- "Ah, I see the problem here. You want to define all information content functions with their simple syntax, since we filter out zero probabilities anyway when we compute entropy. Okay, let's say then: 'This function requires `pᵢ > 0`'."
- "Yep, we can say that."

## Definition

We use the term "self-information" very loosely here, and define it as the functional
``I_M(p_i)`` that satisfies ``\\sum_i p_i I_M(p_i) = I_M``, where ``I_M`` is the given
information measure.

If `measure` is [`Shannon`](@ref), then this is the
[Shannon self-information](https://en.wikipedia.org/wiki/Information_content), which
fulfils a set of axioms. If `measure` is some other information measure, then it is not
guaranteed that these axioms are fulfilled. We *only* guarantee that the
probability-weighted sum of the self-informations equals the information measure.

!!! note "Motivation for this definition"
    This definition is motivated by the desire to compute generalized
    [`FluctuationComplexity`](@ref), which is a measure of the fluctuations of local
    self-information relative to some information-theoretic summary statistic of a
    distribution. Defining these self-information functions has, as far as we know,
    not been treated in the literature before, and will be part of an upcoming paper
    we're writing!
""" | ||||||
function self_information(measure::InformationMeasure, pᵢ) | ||||||
throw(ArgumentError( | ||||||
"""`InformationMeasure` $(typeof(measure)) does not implement `self_information`.""" | ||||||
)) | ||||||
end | ||||||
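
To make the weighted-sum identity concrete, here is a minimal, self-contained Julia sketch of the Shannon case. The `Shannon` struct, its `base` field, and the `self_information` method below are illustrative stand-ins defined locally for this snippet, not the package's actual implementations; the final lines use one plausible form of the generalized fluctuation complexity mentioned in the docstring note (the standard deviation of self-information around the entropy).

```julia
# Illustrative sketch only; `Shannon` and `self_information` here are local
# stand-ins for the package's types, defined so the snippet runs on its own.
struct Shannon
    base::Float64
end

# Shannon self-information ("surprisal") of a single probability p > 0.
self_information(m::Shannon, p) = -log(m.base, p)

probs = [0.5, 0.25, 0.25]   # pre-filtered: all pᵢ > 0
m = Shannon(2.0)

# Probability-weighted sum of self-informations = Shannon entropy (1.5 bits here).
H = sum(p * self_information(m, p) for p in probs)
@assert H ≈ 1.5

# One plausible generalized fluctuation complexity: the standard deviation
# of the local self-information around the measure's value.
fluct = sqrt(sum(p * (self_information(m, p) - H)^2 for p in probs))
@assert fluct ≈ 0.5
```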

###########################################################################################
# Utils
@@ -57,3 +57,8 @@ function information_maximum(e::TsallisExtropy, L::Int)

    return ((L - 1) * L^(q - 1) - (L - 1)^q) / ((q - 1) * L^(q - 1))
end

function self_information(e::TsallisExtropy, pᵢ, N) # must have N
Review thread:

- "What is `N`? If …"
- "I'll see if I can define a functional that doesn't depend on `N`."
    k, q = e.k, e.q
    return (N - 1) / (q - 1) - (1 - pᵢ)^q / (q - 1)
end
Review thread:

- "I'd suggest that we use just `p` here and make it clear in the docstring that this is a number. We use `probs` for `Probabilities` in most places in the library."
- "Another argument is that the subscript `i` is out of context here, and may confuse instead of clarify."
- "Agreed, we can call this simply `p`. It is clear that `p` is from a distribution."