
Commit

different prompt
DominiqueMakowski committed May 6, 2022
1 parent 8463114 commit 5f98eed
Showing 10 changed files with 126 additions and 27 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -200,3 +200,4 @@ data/mit_long-term/ECGs.csv
data/mit_long-term/ECGs.csv
.DS_Store
docs_wip/readme/README_popularity.py
+paper/OHBM2022/OHBM2022_Makowski.mp4
4 changes: 4 additions & 0 deletions docs_wip/conf.py
@@ -93,6 +93,10 @@ def find_version():
napoleon_use_rtype = False
add_module_names = False # If true, the current module name will be prepended to all description

# -- Options for ipython directive ----------------------------------------

ipython_promptin = ">" # "In [%d]:"
ipython_promptout = ">" # "Out [%d]:"

# -- Options for myst_nb ---------------------------------------------------
nb_execution_mode = "force"
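The two prompt settings added above belong to IPython's Sphinx extension (`IPython.sphinxext.ipython_directive`): they replace the default `In [%d]:` / `Out [%d]:` prompts with a bare `>` in rendered code blocks. A minimal sketch of a directive these options would affect (the block content is hypothetical):

```
.. ipython:: python

    import neurokit2 as nk
    nk.__version__
```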
15 changes: 7 additions & 8 deletions neurokit2/complexity/complexity.py
@@ -32,8 +32,8 @@
def complexity(signal, which=["fast", "medium"], delay=1, dimension=2, tolerance="sd", **kwargs):
"""**Complexity and Chaos Analysis**
-Measuring complexity refers to the quantification of various concepts, such as **chaos, entropy,
-unpredictability, and fractal dimension**.
+Measuring the complexity of a signal refers to the quantification of various aspects related to
+concepts such as **chaos**, **entropy**, **unpredictability**, and **fractal dimension**.
.. tip::
@@ -50,11 +50,11 @@ def complexity(signal, which=["fast", "medium"], delay=1, dimension=2, tolerance
separately, to gain more control over the parameters and information that you get.
The categorization by "computation time" is based on our preliminary `benchmarking study
-<https://neurokit2.readthedocs.io/en/latest/studies/complexity_benchmark.html>`_ results:
+<https://neuropsychology.github.io/NeuroKit/studies/complexity_benchmark.html>`_ results:
.. figure:: ../../studies/complexity_benchmark/figures/computation_time-1.png
:alt: Complexity Benchmark (Makowski).
-:target: https://neurokit2.readthedocs.io/en/latest/studies/complexity_benchmark.html
+:target: https://neuropsychology.github.io/NeuroKit/studies/complexity_benchmark.html
Parameters
----------
@@ -68,11 +68,11 @@ def complexity(signal, which=["fast", "medium"], delay=1, dimension=2, tolerance
See :func:`complexity_delay` to estimate the optimal value for this parameter.
dimension : int
Embedding Dimension (*m*, sometimes referred to as *d* or *order*). See
-:func:`complexity_dimension()` to estimate the optimal value for this parameter.
+:func:`complexity_dimension` to estimate the optimal value for this parameter.
tolerance : float
Tolerance (often denoted as *r*), distance to consider two data points as similar. If
``"sd"`` (default), will be set to :math:`0.2 * SD_{signal}`. See
-:func:`complexity_tolerance()` to estimate the optimal value for this parameter.
+:func:`complexity_tolerance` to estimate the optimal value for this parameter.
Returns
--------
@@ -83,8 +83,7 @@ def complexity(signal, which=["fast", "medium"], delay=1, dimension=2, tolerance
See Also
--------
entropy_permutation, entropy_differential, entropy_svd, fractal_katz, fractal_petrosian,
fractal_sevcik, fisher_information, complexity_hjorth, complexity_rqa
complexity_delay, complexity_dimension, complexity_tolerance
Examples
----------
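As a rough illustration of the `delay`, `dimension`, and `tolerance` parameters documented in this docstring (a minimal sketch, not NeuroKit2's implementation; the helper names are hypothetical, and whether the SD is the population or sample estimate is an assumption here):

```python
import statistics

def delay_embedding(signal, delay=1, dimension=2):
    # Build the m-dimensional delayed vectors
    # [x[i], x[i + tau], ..., x[i + (m - 1) * tau]].
    n = len(signal) - (dimension - 1) * delay
    return [[signal[i + j * delay] for j in range(dimension)] for i in range(n)]

def default_tolerance(signal):
    # The "sd" default described above: r = 0.2 * SD(signal)
    # (population SD assumed here).
    return 0.2 * statistics.pstdev(signal)

signal = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0]
print(delay_embedding(signal, delay=1, dimension=2)[0])  # [0.0, 1.0]
print(default_tolerance([0.0, 2.0]))  # 0.2
```

Tuning these values per signal is what the `complexity_delay`, `complexity_dimension`, and `complexity_tolerance` functions referenced above are for.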
Binary file added paper/OHBM2022/OHBM2022_Makowski.pptx
Binary file removed paper/OHBM2022/PowerPitch.pptx
30 changes: 11 additions & 19 deletions studies/complexity_benchmark/README.Rmd
@@ -56,7 +56,7 @@ The script to generate the data can be found at ...

Generated 5 types of signals, to which we added different types of noise.

-```{r message=FALSE, warning=FALSE, fig.height=20, fig.width=12}
+```{r message=FALSE, warning=FALSE, fig.height=20, fig.width=12, cache=TRUE}
library(tidyverse)
library(easystats)
library(patchwork)
@@ -268,7 +268,7 @@ colors <- c(

### Computation Time

-```{r computation_time, message=FALSE, warning=FALSE, fig.width=16*1.25, fig.height=10*1.25}
+```{r computation_time, message=FALSE, warning=FALSE, fig.width=16*1.25, fig.height=10*1.25, cache=TRUE}
order <- df |>
group_by(Index) |>
summarize(Duration = median(Duration)) |>
@@ -392,7 +392,7 @@ estimate_relation(model) |>
### Correlation


-```{r message=FALSE, warning=FALSE, fig.width=16, fig.height=15}
+```{r message=FALSE, warning=FALSE, fig.width=16, fig.height=15, cache=TRUE}
data <- df |>
mutate(i = paste(Signal, Length, Noise_Type, Noise_Intensity, sep = "__")) |>
select(i, Index, Result) |>
@@ -461,7 +461,7 @@ cor <- get_cor(data)
- Remove **RR** because it's slower.


-```{r message=FALSE, warning=FALSE, include=FALSE, eval=FALSE}
+```{r message=FALSE, warning=FALSE, include=FALSE, eval=FALSE, cache=TRUE}
cor |>
cor_lower() |>
filter(Parameter1 %in% names(data), Parameter2 %in% names(data)) |>
@@ -487,21 +487,10 @@ filter(averagetime, Index %in% c("FuzzyEn", "FuzzyApEn"))
filter(averagetime, Index %in% c("SVDEn", "FuzzycApEn"))
filter(averagetime, Index %in% c("CPEn", "CRPEn"))
filter(averagetime, Index %in% c("NLDFD", "RR"))
```


# NLFD | RR
# NLFD | RR
# - Drop RR because it's slower
# H (uncorrected) | H (corrected)
# - ??
# SVDEn | FuzzyEn
# - Drop FuzzyEn because it's slower
# Hasselman positively correlated with most of the others
# - RR: much slower
```{r message=FALSE, warning=FALSE, cache=TRUE}
data <- data |>
select(
-`CREn (B)`,
@@ -517,12 +506,15 @@ data <- data |>
-FuzzycApEn,
-CPEn,
-RR,
-MFDFA_HDelta,
-FuzzyRCMSEn,
-`CREn (1000)`, -`CREn (100)`,
-RQA_VEn, -RQA_LEn
)
cor <- get_cor(data)
```


<!-- ### Hierarchical Clustering -->


@@ -544,7 +536,7 @@ cor <- get_cor(data)
### Factor Analysis


-```{r message=FALSE, warning=FALSE, include=FALSE, eval=FALSE}
+```{r message=FALSE, warning=FALSE, fig.width=12, fig.height=18}
r <- correlation::cor_smooth(as.matrix(cor))
plot(parameters::n_factors(data, cor = r))
103 changes: 103 additions & 0 deletions studies/complexity_benchmark/README.md
@@ -246,6 +246,33 @@ cor <- get_cor(data)
- **NLDFD**, and **RR**
- Remove **RR** because it’s slower.

``` r
data <- data |>
select(
-`CREn (B)`,
-`CREn (D)`, -`ShanEn (D)`,
-`CREn (r)`, -`ShanEn (r)`,
# -`CREn (C)`, -`ShanEn (C)`,
-`PSDFD (Voss1998)`,
-`RangeEn (A)`, -`RangeEn (Ac)`,
-FI,
-MMSEn,
-`H (corrected)`,
-FuzzyApEn,
-FuzzycApEn,
-CPEn,
-RR,
-MFDFA_HDelta,
-FuzzyRCMSEn,
-`CREn (1000)`, -`CREn (100)`,
-RQA_VEn, -RQA_LEn
)

cor <- get_cor(data)
```

![](../../studies/complexity_benchmark/figures/unnamed-chunk-9-1.png)<!-- -->

<!-- ### Hierarchical Clustering -->
<!-- ```{r message=FALSE, warning=FALSE} -->
<!-- n <- parameters::n_clusters(as.data.frame(t(data)), standardize = FALSE) -->
@@ -262,4 +289,80 @@ cor <- get_cor(data)

### Factor Analysis

``` r
r <- correlation::cor_smooth(as.matrix(cor))

plot(parameters::n_factors(data, cor = r))
```

![](../../studies/complexity_benchmark/figures/unnamed-chunk-10-1.png)<!-- -->

``` r
rez <- parameters::factor_analysis(data, cor = r, n = 14, rotation = "varimax", sort = TRUE, fm="wls")
# rez <- parameters::principal_components(data, n = 5, sort = TRUE)
# rez

col <- gsub('[[:digit:]]+', '', names(rez)[2])
closest <- colnames(select(rez, starts_with(col)))[apply(select(rez, starts_with(col)), 1, \(x) which.max(abs(x)))]

loadings <- attributes(rez)$loadings_long |>
mutate(
Loading = abs(Loading),
Component = fct_relevel(Component, rev(names(select(rez, starts_with(col))))),
Variable = fct_rev(fct_relevel(Variable, rez$Variable))
)

colors <- setNames(see::palette_material("rainbow")(length(levels(loadings$Component))), levels(loadings$Component))


p1 <- loadings |>
# filter(Variable == "CD") |>
ggplot(aes(x = Variable, y = Loading)) +
geom_bar(aes(fill = Component), stat = "identity") +
geom_vline(xintercept = c("SD", "Length", "Noise", "Random"), color = "red") +
geom_vline(xintercept = head(cumsum(sort(table(closest))), -1) + 0.5) +
scale_y_continuous(expand = c(0, 0)) +
scale_fill_material_d("rainbow") +
coord_flip() +
theme_minimal() +
guides(fill = guide_legend(reverse = TRUE)) +
labs(x = NULL) +
theme(
axis.text.y = element_text(
color = rev(colors[closest]),
face = rev(ifelse(rez$Variable %in% c("SD", "Length", "Noise", "Random"), "italic", "plain")),
hjust = 0.5
),
axis.text.x = element_blank(),
plot.title = element_text(hjust = 0.5),
panel.grid.major = element_blank(),
panel.grid.minor = element_blank()
)

p2 <- order |>
mutate(Duration = 1 + Duration * 10000) |>
filter(Index %in% loadings$Variable) |>
mutate(Index = fct_relevel(Index, levels(loadings$Variable))) |>
ggplot(aes(x = log10(Duration), y = Index)) +
geom_bar(aes(fill = log10(Duration)), stat = "identity") +
geom_hline(yintercept = head(cumsum(sort(table(closest))), -1) + 0.5) +
scale_x_reverse(expand = c(0, 0)) +
# scale_x_log10(breaks = 10**seq(0, 4), labels = function(x) sprintf("%g", x), expand=c(0, 0)) +
scale_y_discrete(position = "right") +
scale_fill_viridis_c(guide = "none") +
labs(x = "Computation Time", y = NULL) +
theme_minimal() +
theme(
axis.text.y = element_blank(),
axis.text.x = element_blank(),
plot.title = element_text(hjust = 0.5),
panel.grid.major = element_blank(),
panel.grid.minor = element_blank()
)

(p2 | p1) + patchwork::plot_annotation(title = "Computation Time and Factor Loading", theme = theme(plot.title = element_text(hjust = 0.5, face = "bold")))
```

![](../../studies/complexity_benchmark/figures/unnamed-chunk-10-2.png)<!-- -->
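For readers more comfortable in Python, the varimax factor analysis above can be roughly approximated with scikit-learn (a sketch assuming scikit-learn ≥ 0.24, which added `rotation="varimax"`; the random matrix here is a stand-in, not the benchmark's data):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 10))  # stand-in: 200 signals x 10 complexity indices

fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(X)

# components_ is (n_factors, n_variables); transpose to get the usual
# variables-by-factors loading table, analogous to the R loadings above.
loadings = fa.components_.T
print(loadings.shape)  # (10, 3)
```

Unlike `parameters::factor_analysis`, this fits from the raw data matrix rather than a smoothed correlation matrix, so loadings will not match the R output exactly.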

## References
The remaining 3 changed files could not be displayed.
