Rework presentation of keywords #393

Merged: 61 commits from `kellertuer/modularize-docstrings` into `master`, Aug 11, 2024.

Commits (all by kellertuer):
- `175e2cf` Sketch an approach to reuse docstring snippets – and try another form… (Jun 7, 2024)
- `c783273` update Changelog. (Jun 7, 2024)
- `d010df5` Merge branch 'master' into kellertuer/modularize-docstrings (Jun 9, 2024)
- `0bd85f2` I think I like the form. (Jun 9, 2024)
- `4ec846e` remove a function. (Jun 9, 2024)
- `13c569e` got confused, adds a version back. (Jun 9, 2024)
- `29de40f` continue with docs. (Jun 12, 2024)
- `a00202c` run formatter (Jun 12, 2024)
- `a842a0c` Merge branch 'master' into kellertuer/modularize-docstrings (Jun 14, 2024)
- `02e4a31` Finish gradient descent rework. (Jun 18, 2024)
- `e3dbb03` Update src/plans/docstring_snippets.jl (Jun 18, 2024)
- `538110f` Work on doc strings. (Jun 19, 2024)
- `4220b6b` Merge branch 'kellertuer/modularize-docstrings' of github.com:JuliaMa… (Jun 19, 2024)
- `a66efcf` fine tune doc string snipped definitions. (Jun 19, 2024)
- `7ca676f` allow variables in strings for vale. (Jun 19, 2024)
- `77d5544` Revise FW docs. (Jun 20, 2024)
- `1a41d23` Refactor further keyword arguments. (Jun 22, 2024)
- `b57e907` Work through further docstrings (Jun 22, 2024)
- `5bc2e95` Fix a typo. (Jun 22, 2024)
- `520121b` rewrite AdaptiveRegularizationState (Jun 23, 2024)
- `666a994` Further work on keywords. (Jun 23, 2024)
- `1d18804` Fix a typo in the docs that issued a warning, finish reworking ARC (Jun 24, 2024)
- `adc29b3` Finish Lanczos. (Jun 26, 2024)
- `67056e8` Start ALM. (Jun 26, 2024)
- `7db62eb` Update src/plans/docstring_snippets.jl (Jun 26, 2024)
- `f3e152f` Update src/solvers/Lanczos.jl (Jun 26, 2024)
- `e1fecb6` Rework docs of ALM. (Jun 28, 2024)
- `9e1e6e1` Finish ALM. (Jun 28, 2024)
- `a53b4a5` Merge branch 'master' into kellertuer/modularize-docstrings (Jun 28, 2024)
- `da38048` Reformulate the NM solver. (Jul 4, 2024)
- `832f484` Write Nelder Mead. (Jul 4, 2024)
- `2da37f3` Write a detailed NelderMead algorithm into the docs. (Jul 4, 2024)
- `57b6539` Rework ChambollePock docs. (Jul 24, 2024)
- `571ada7` Merge branch 'master' into kellertuer/modularize-docstrings (Jul 25, 2024)
- `cd34c0b` Update the CG solver. (Jul 25, 2024)
- `36d1c88` Rewrite CBM docs. (Jul 26, 2024)
- `6038f2e` Add a new sections on the about. (Jul 26, 2024)
- `28e4241` Merge branch 'kellertuer/about-update' into kellertuer/modularize-doc… (Jul 26, 2024)
- `0c227d6` Escaping newlines does not seem to work on Julia 1.6 (Jul 26, 2024)
- `cbd2f8c` rework CPPA docs. (Jul 27, 2024)
- `73cf968` Update DoC docs. (Jul 27, 2024)
- `f05a3ef` Finish DCPPA docs rework (Jul 27, 2024)
- `0182d58` fix DR docs. (Jul 28, 2024)
- `f236171` Rework EPM docs. (Jul 28, 2024)
- `e641231` fix the more precise constructor. (Jul 28, 2024)
- `ab9510a` Update LM docs. (Jul 29, 2024)
- `1953ad2` Rework PSO docs. (Jul 29, 2024)
- `b3e2d80` Finish reworm of PDSN (Aug 1, 2024)
- `47112a2` Finish reworking the stochastic gradient. (Aug 1, 2024)
- `760ebe0` finish SGM. (Aug 1, 2024)
- `eed2c16` Merge branch 'master' into kellertuer/modularize-docstrings (Aug 3, 2024)
- `83c61f9` Adapt PBM. (Aug 3, 2024)
- `ef49863` Merge branch 'master' into kellertuer/modularize-docstrings (Aug 3, 2024)
- `2a03cf1` Update QN. (Aug 3, 2024)
- `b91f76e` Fix that stop shall not print at init. (Aug 4, 2024)
- `56ad6a8` Finish adapting TCG. (Aug 4, 2024)
- `c64f94a` Finish TR. (Aug 4, 2024)
- `dd8e9e1` Fix a typo. (Aug 4, 2024)
- `062d463` Fix a typo. (Aug 6, 2024)
- `195a745` Merge branch 'master' into kellertuer/modularize-docstrings (Aug 10, 2024)
- `65d6381` Check / unify iterates / indexing. (Aug 10, 2024)
12 changes: 12 additions & 0 deletions Changelog.md
@@ -5,6 +5,17 @@ All notable Changes to the Julia package `Manopt.jl` will be documented in this
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.4.70] WIP

### Added

* Unify doc strings and presentation of keyword arguments
* general indexing, for example in a vector, uses `i`
* index for inequality constraints is unified to `i` running from `1,...,m`
* index for equality constraints is unified to `j` running from `1,...,n`
* iterations now use `k`
* Doc strings are unified and now reuse shared docstring snippets.

## [0.4.69] – August 3, 2024

### Changed
@@ -40,6 +51,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html)
* a few typos in the documentation
* `WolfePowellLinesearch` no longer uses `max_stepsize` with invalid point by default.


## [0.4.66] June 27, 2024

### Changed
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,7 +1,7 @@
name = "Manopt"
uuid = "0fc0a36d-df90-57f3-8f93-d78a9fc72bb5"
authors = ["Ronny Bergmann <[email protected]>"]
version = "0.4.69"
version = "0.4.70"

[deps]
ColorSchemes = "35d6a980-a343-548e-a6ea-1d62b119f2f4"
18 changes: 14 additions & 4 deletions docs/src/about.md
@@ -3,7 +3,10 @@
Manopt.jl inherited its name from [Manopt](https://manopt.org), a Matlab toolbox for optimization on manifolds.
This Julia package was started and is currently maintained by [Ronny Bergmann](https://ronnybergmann.net/).

The following people contributed
## Contributors

Thanks to the following contributors to `Manopt.jl`:

* [Constantin Ahlmann-Eltze](https://const-ae.name) implemented the [gradient and differential `check` functions](helpers/checks.md)
* [Renée Dornig](https://github.com/r-dornig) implemented the [particle swarm](solvers/particle_swarm.md), the [Riemannian Augmented Lagrangian Method](solvers/augmented_Lagrangian_method.md), the [Exact Penalty Method](solvers/exact_penalty_method.md), as well as the [`NonmonotoneLinesearch`](@ref)
* [Willem Diepeveen](https://www.maths.cam.ac.uk/person/wd292) implemented the [primal-dual Riemannian semismooth Newton](solvers/primal_dual_semismooth_Newton.md) solver.
@@ -14,21 +17,28 @@ The following people contributed
* [Markus A. Stokkenes](https://www.linkedin.com/in/markus-a-stokkenes-b41bba17b/) contributed most of the implementation of the [Interior Point Newton Method](solvers/interior_point_Newton.md)
* [Manuel Weiss](https://scoop.iwr.uni-heidelberg.de/author/manuel-weiß/) implemented most of the [conjugate gradient update rules](@ref cg-coeffs)

as well as various [contributors](https://github.com/JuliaManifolds/Manopt.jl/graphs/contributors) providing small extensions, finding small bugs and mistakes and fixing them by opening [PR](https://github.com/JuliaManifolds/Manopt.jl/pulls)s.
as well as various [contributors](https://github.com/JuliaManifolds/Manopt.jl/graphs/contributors) providing small extensions, finding small bugs and mistakes and fixing them by opening [PR](https://github.com/JuliaManifolds/Manopt.jl/pulls)s. Thanks to all of you.

If you want to contribute a manifold or algorithm or have any questions, visit
the [GitHub repository](https://github.com/JuliaManifolds/Manopt.jl/)
to clone/fork the repository or open an issue.

## Work using Manopt.jl

* [ExponentialFamilyProjection.jl](https://github.com/ReactiveBayes/ExponentialFamilyProjection.jl) projects distributions
* [Caesar.jl](https://github.com/JuliaRobotics/Caesar.jl) within non-Gaussian factor graph inference algorithms

Is a package missing? [Open an issue](https://github.com/JuliaManifolds/Manopt.jl/issues/new)!
It would be great to collect everything and everyone using Manopt.jl here.

# Further packages
## Further packages

`Manopt.jl` belongs to the Manopt family:

* [manopt.org](https://www.manopt.org) The Matlab version of Manopt, see also their :octocat: [GitHub repository](https://github.com/NicolasBoumal/manopt)
* [pymanopt.org](https://www.pymanopt.org/) The Python version of Manopt providing also several AD backends, see also their :octocat: [GitHub repository](https://github.com/pymanopt/pymanopt)

but there are also more packages providing tools on manifolds:
but there are also more packages providing tools on manifolds in other languages:

* [Jax Geometry](https://github.com/ComputationalEvolutionaryMorphometry/jaxgeometry) (Python/Jax) for differential geometry and stochastic dynamics with deep learning
* [Geomstats](https://geomstats.github.io) (Python with several backends) focusing on statistics and machine learning :octocat: [GitHub repository](https://github.com/geomstats/geomstats)
2 changes: 2 additions & 0 deletions docs/src/notation.md
@@ -5,4 +5,6 @@ with the following additional parts.

| Symbol | Description | Also used | Comment |
|:--:|:--------------- |:--:|:-- |
| ``\operatorname{arg\,min}`` | argument of a function ``f`` where a local or global minimum is attained | | |
| ``k`` | the current iterate | ``i`` | the goal is to unify this to `k` |
| ``∇`` | The [Levi-Civita connection](https://en.wikipedia.org/wiki/Levi-Civita_connection) | | |
4 changes: 2 additions & 2 deletions docs/src/plans/debug.md
@@ -24,6 +24,6 @@ automatically available, as explained in the [`gradient_descent`](@ref) solver.

```@docs
initialize_solver!(amp::AbstractManoptProblem, dss::DebugSolverState)
step_solver!(amp::AbstractManoptProblem, dss::DebugSolverState, i)
stop_solver!(amp::AbstractManoptProblem, dss::DebugSolverState, i::Int)
step_solver!(amp::AbstractManoptProblem, dss::DebugSolverState, k)
stop_solver!(amp::AbstractManoptProblem, dss::DebugSolverState, k::Int)
```
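These hooks now receive the iteration number uniformly as `k`. In user code they are typically driven through the `debug=` keyword of a high-level solver; the following is a minimal sketch, where the cost, gradient, and start point are illustrative assumptions:

```julia
using Manifolds, Manopt

M = Sphere(2)
f(M, p) = p[1]^2                                 # illustrative cost
grad_f(M, p) = project(M, p, [2p[1], 0.0, 0.0])  # Riemannian gradient via projection
p0 = [1.0, 1.0, 0.0] ./ sqrt(2)                  # start away from the critical points

# print the iteration number k and the cost every 25th iteration
q = gradient_descent(M, f, grad_f, p0; debug=[:Iteration, :Cost, "\n", 25])
```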
2 changes: 1 addition & 1 deletion docs/src/plans/index.md
@@ -26,7 +26,7 @@ The following symbols are used.
| Symbol | Used in | Description |
| :----------- | :------ | :--------------------------------------------------------- |
| `:Activity` | [`DebugWhenActive`](@ref) | activity of the debug action stored within |
| `:Basepoint` | [`TangentSpace`]() | the point the tangent space is at |
| `:Basepoint` | [`TangentSpace`](@extref ManifoldsBase `ManifoldsBase.TangentSpace`) | the point the tangent space is at |
| `:Cost` | generic |the cost function (within an objective, as pass down) |
| `:Debug` | [`DebugSolverState`](@ref) | the stored `debugDictionary` |
| `:Gradient` | generic | the gradient function (within an objective, as pass down) |
4 changes: 2 additions & 2 deletions docs/src/plans/record.md
@@ -41,6 +41,6 @@ Further specific [`RecordAction`](@ref)s can be found when specific types of [`A

```@docs
initialize_solver!(amp::AbstractManoptProblem, rss::RecordSolverState)
step_solver!(p::AbstractManoptProblem, s::RecordSolverState, i)
stop_solver!(p::AbstractManoptProblem, s::RecordSolverState, i)
step_solver!(p::AbstractManoptProblem, s::RecordSolverState, k)
stop_solver!(p::AbstractManoptProblem, s::RecordSolverState, k)
```
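As with the debug hooks, the record hooks receive the iteration as `k` and are typically driven through the `record=` keyword; a minimal sketch with an illustrative cost (retrieving the records requires `return_state=true`):

```julia
using Manifolds, Manopt

M = Sphere(2)
f(M, p) = p[1]^2
grad_f(M, p) = project(M, p, [2p[1], 0.0, 0.0])
p0 = [1.0, 1.0, 0.0] ./ sqrt(2)

s = gradient_descent(M, f, grad_f, p0; record=[:Iteration, :Cost], return_state=true)
rec = get_record(s)  # one (k, cost) tuple per iteration
```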
2 changes: 1 addition & 1 deletion docs/src/plans/stepsize.md
@@ -33,7 +33,7 @@ Tangent bundle with the Sasaki metric has 0 injectivity radius, so the maximum s
`Hyperrectangle` also has 0 injectivity radius and an estimate based on maximum of dimensions along each index is used instead.
For manifolds with corners, however, a line search capable of handling break points along the projected search direction should be used, and such algorithms do not call `max_stepsize`.

Some solvers have a different iterate from the one used for linesearch. Then the following state can be used to wrap
Some solvers have a different iterate from the one used for the line search. Then the following state can be used to wrap
these locally

```@docs
20 changes: 10 additions & 10 deletions docs/src/solvers/DouglasRachford.md
@@ -6,44 +6,44 @@ manifolds in [BergmannPerschSteidl:2016](@cite).
The aim is to minimize the sum

```math
F(p) = f(p) + g(p)
f(p) = g(p) + h(p)
```

on a manifold, where the two summands have proximal maps
``\operatorname{prox}_{λ f}, \operatorname{prox}_{λ g}`` that are easy
``\operatorname{prox}_{λ g}, \operatorname{prox}_{λ h}`` that are easy
to evaluate (maybe in closed form, or not too costly to approximate).
Further, define the reflection operator at the proximal map as

```math
\operatorname{refl}_{λ f}(p) = \operatorname{retr}_{\operatorname{prox}_{λ f}(p)} \bigl( -\operatorname{retr}^{-1}_{\operatorname{prox}_{λ f}(p)} p \bigr).
\operatorname{refl}_{λ g}(p) = \operatorname{retr}_{\operatorname{prox}_{λ g}(p)} \bigl( -\operatorname{retr}^{-1}_{\operatorname{prox}_{λ g}(p)} p \bigr).
```

Let ``\alpha_k ∈ [0,1]`` with ``\sum_{k ∈ ℕ} \alpha_k(1-\alpha_k) = \infty``
and ``λ > 0`` (which might depend on iteration ``k`` as well) be given.

Then the (P)DRA algorithm for initial data ``x_0 ∈ \mathcal H`` as
Then the (P)DRA algorithm for initial data ``p^{(0)} ∈ \mathcal M`` reads as follows.

## Initialization

Initialize ``t_0 = x_0`` and ``k=0``
Initialize ``q^{(0)} = p^{(0)}`` and ``k=0``

## Iteration

Repeat until a convergence criterion is reached

1. Compute ``s_k = \operatorname{refl}_{λ f}\operatorname{refl}_{λ g}(t_k)``
2. Within that operation, store ``p_{k+1} = \operatorname{prox}_{λ g}(t_k)`` which is the prox the inner reflection reflects at.
3. Compute ``t_{k+1} = g(\alpha_k; t_k, s_k)``, where ``g`` is a curve approximating the shortest geodesic, provided by a retraction and its inverse
1. Compute ``r^{(k)} = \operatorname{refl}_{λ g}\operatorname{refl}_{λ h}(q^{(k)})``
2. Within that operation, store ``p^{(k+1)} = \operatorname{prox}_{λ h}(q^{(k)})``, which is the prox the inner reflection reflects at.
3. Compute ``q^{(k+1)} = g(\alpha_k; q^{(k)}, r^{(k)})``, where ``g`` is a curve approximating the shortest geodesic, provided by a retraction and its inverse
4. Set ``k = k+1``

## Result

The result is given by the last computed ``p_K``.
The result is given by the last computed ``p^{(K)}``, where ``K`` denotes the final iteration number.

For the parallel version, the first proximal map is a vectorial version where
in each component one prox is applied to the corresponding copy of ``q^{(k)}`` and
the second proximal map corresponds to the indicator function of the set,
where all copies are equal (in ``\mathcal H^n``, where ``n`` is the number of copies),
where all copies are equal (in ``\mathcal M^n``, where ``n`` is the number of copies),
leading to the second prox being the Riemannian mean.
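To make the scheme concrete, here is a minimal sketch of one iteration in Julia, assuming proximal maps with the Manopt.jl calling convention `prox(M, λ, p)`; the helper `dr_step` is hypothetical, not part of Manopt.jl:

```julia
using Manifolds, Manopt

# One Douglas-Rachford step (steps 1-3 above); Manopt's reflect(M, p, x)
# reflects x at p using a retraction and its inverse, as defined above.
function dr_step(M, q, prox_g, prox_h, λ, α)
    p_next = prox_h(M, λ, q)              # step 2: the prox the inner reflection reflects at
    s = reflect(M, p_next, q)             # refl_{λh}(q^{(k)})
    r = reflect(M, prox_g(M, λ, s), s)    # step 1: outer reflection refl_{λg}
    X = inverse_retract(M, q, r)
    q_next = retract(M, q, α * X)         # step 3: relax along the curve towards r^{(k)}
    return q_next, p_next
end
```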

## Interface
2 changes: 1 addition & 1 deletion docs/src/solvers/adaptive-regularization-with-cubics.md
@@ -64,7 +64,7 @@ of a manifold to be available
* By default the tangent vector storing the gradient is initialized calling [`zero_vector`](@extref `ManifoldsBase.zero_vector-Tuple{AbstractManifold, Any}`)`(M,p)`.
* [`inner`](@extref `ManifoldsBase.inner-Tuple{AbstractManifold, Any, Any, Any}`)`(M, p, X, Y)` is used within the algorithm step

Furthermore, within the Lanczos subsolver, generating a random vector (at `p`) using [`rand!`](@extref Base.rand-Tuple{AbstractManifold})(M, X; vector_at=p)` in place of `X` is required
Furthermore, within the Lanczos subsolver, generating a random vector (at `p`) using [`rand!`](@extref Base.rand-Tuple{AbstractManifold})`(M, X; vector_at=p)` in place of `X` is required
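A short sketch of these three primitives on a concrete manifold; the sphere here is purely an illustrative choice:

```julia
using Manifolds, Random

M = Sphere(2)
p = [1.0, 0.0, 0.0]

X = zero_vector(M, p)     # tangent-vector storage for the gradient
v = inner(M, p, X, X)     # the inner product used in the algorithm step
rand!(M, X; vector_at=p)  # random tangent vector at p, as Lanczos requires
```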

## Literature

10 changes: 8 additions & 2 deletions docs/src/solvers/conjugate_residual.md
@@ -1,4 +1,4 @@
# Conjugate Residual Solver in a Tangent space
# Conjugate residual solver in a tangent space

```@meta
CurrentModule = Manopt
@@ -14,7 +14,7 @@ conjugate_residual
ConjugateResidualState
```

## Objetive
## Objective

```@docs
SymmetricLinearSystemObjective
@@ -26,6 +26,12 @@ SymmetricLinearSystemObjective
StopWhenRelativeResidualLess
```

## Internal functions

```@docs
Manopt.get_b
```

## Literature

```@bibliography
2 changes: 1 addition & 1 deletion docs/src/solvers/interior_point_Newton.md
@@ -1,4 +1,4 @@
# Interior Point Newton method
# Interior point Newton method

```@meta
CurrentModule = Manopt
10 changes: 10 additions & 0 deletions docs/styles/config/vocabularies/Manopt/accept.txt
@@ -1,3 +1,10 @@
_field_.*\b
_arg_.*\b
_kw_.*\b
_l_.*\b
_math_.*\b
_problem_.*\b
_doc_.*\b
Absil
Adagrad
[A|a]djoint
@@ -62,6 +69,7 @@ Lui
Manifolds.jl
ManifoldsBase.jl
[Mm]anopt(:?.org|.jl)?
Markus
Marquardt
Moakher
Munkvold
@@ -90,6 +98,7 @@ Riemer
Riemopt
Riesz
Rosenbrock
Sasaki
semicontinuous
Steihaug
Stiefel
@@ -98,6 +107,7 @@ Souza
Steidl
Stephansen
[Ss]tepsize
Stokkenes
[Ss]ubdifferential
[Ss]ubgradient
subsampled
14 changes: 9 additions & 5 deletions ext/ManoptLRUCacheExt.jl
@@ -27,11 +27,15 @@ Given a vector of symbols `caches`, this function sets up the
# Keyword arguments
* `p`: (`rand(M)`) a point on a manifold, to both infer its type for keys and initialize caches
* `value`: (`0.0`) a value both typing and initialising number-caches, the default is for (Float) values like the cost.
* `X`: (`zero_vector(M, p)` a tangent vector at `p` to both type and initialize tangent vector caches
* `cache_size`: (`10`) a default cache size to use
* `cache_sizes`: (`Dict{Symbol,Int}()`) a dictionary of sizes for the `caches` to specify different (non-default) sizes
* `p=`$(Manopt._link_rand()): a point on a manifold, to both infer its type for keys and initialize caches
* `value=0.0`:
  a value used both for typing and for initialising number caches; the default suits (float) values like the cost.
* `X=zero_vector(M, p)`:
a tangent vector at `p` to both type and initialize tangent vector caches
* `cache_size=10`:
a default cache size to use
* `cache_sizes=Dict{Symbol,Int}()`:
a dictionary of sizes for the `caches` to specify different (non-default) sizes
"""
function Manopt.init_caches(
M::AbstractManifold,
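A hedged usage sketch of these keywords: it assumes `LRUCache.jl` is loaded so that this extension is active, and the cache keys `:Cost` and `:Gradient` are illustrative examples:

```julia
using Manifolds, Manopt, LRUCache

M = Sphere(2)
caches = Manopt.init_caches(M, [:Cost, :Gradient], LRU;
    p=rand(M),                          # infers key/point types and initializes caches
    cache_size=25,                      # default size for every cache
    cache_sizes=Dict(:Gradient => 50),  # individual override
)
```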
2 changes: 1 addition & 1 deletion ext/ManoptLineSearchesExt.jl
@@ -15,7 +15,7 @@ end
function (cs::Manopt.LineSearchesStepsize)(
mp::AbstractManoptProblem,
s::AbstractManoptSolverState,
i::Int,
k::Int,
η=-get_gradient(s);
fp=get_cost(mp, get_iterate(s)),
kwargs...,
6 changes: 6 additions & 0 deletions ext/ManoptManifoldsExt/ManoptManifoldsExt.jl
@@ -2,6 +2,12 @@ module ManoptManifoldsExt

using ManifoldsBase: exp, log, ParallelTransport, vector_transport_to
using Manopt
using Manopt:
_l_refl,
_l_retr,
_kw_retraction_method_default,
_kw_inverse_retraction_method_default,
_kw_X_default
import Manopt:
max_stepsize,
alternating_gradient_descent,
10 changes: 5 additions & 5 deletions ext/ManoptManifoldsExt/alternating_gradient.jl
@@ -16,18 +16,18 @@ function get_gradient(
end

@doc raw"""
X = get_gradient(M::AbstractManifold, p::ManifoldAlternatingGradientObjective, p, k)
get_gradient!(M::AbstractManifold, p::ManifoldAlternatingGradientObjective, X, p, k)
X = get_gradient(M::AbstractManifold, mago::ManifoldAlternatingGradientObjective, p, i)
get_gradient!(M::AbstractManifold, mago::ManifoldAlternatingGradientObjective, X, p, i)
Evaluate one of the component gradients ``\operatorname{grad}f_k``, ``k ∈ \{1,…,n\}``, at `x` (in place of `Y`).
Evaluate one of the component gradients ``\operatorname{grad}f_i``, ``i ∈ \{1,…,n\}``, at `x` (in place of `Y`).
"""
function get_gradient(
M::ProductManifold,
mago::ManifoldAlternatingGradientObjective{AllocatingEvaluation,TC,<:Function},
p,
k,
i,
) where {TC}
return get_gradient(M, mago, p)[M, k]
return get_gradient(M, mago, p)[M, i]
end
function get_gradient!(
M::AbstractManifold,
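A small sketch of the indexing pattern the allocating variant above relies on: on a product manifold the full gradient is indexed with `[M, i]` to obtain the component for factor `i`. The manifold and vectors are illustrative assumptions:

```julia
using Manifolds, RecursiveArrayTools

M = Sphere(2) × Sphere(2)
X = ArrayPartition([0.0, 0.1, 0.0], [0.2, 0.0, 0.0])  # a full (product) tangent vector

X1 = X[M, 1]  # the component belonging to the first factor
```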
21 changes: 12 additions & 9 deletions ext/ManoptManifoldsExt/manifold_functions.jl
@@ -108,28 +108,31 @@ function reflect!(M::AbstractManifold, q, pr::Function, x; kwargs...)
return reflect!(M, q, pr(x), x; kwargs...)
end

@doc raw"""
@doc """
reflect(M, p, x, kwargs...)
reflect!(M, q, p, x, kwargs...)

Reflect the point `x` from the manifold `M` at point `p`, given by

````math
\operatorname{refl}_p(x) = \operatorname{retr}_p(-\operatorname{retr}^{-1}_p x).
````
```math
$_l_refl
```

where ``\operatorname{retr}`` and ``\operatorname{retr}^{-1}`` denote a retraction and an inverse
where ``$_l_retr`` and ``$_l_retr^{-1}`` denote a retraction and an inverse
retraction, respectively.
This can also be done in place of `q`.

## Keyword arguments

* `retraction_method`: (`default_retraction_metiod(M, typeof(p))`) the retraction to use in the reflection
* `inverse_retraction_method`: (`default_inverse_retraction_method(M, typeof(p))`) the inverse retraction to use within the reflection
* $_kw_retraction_method_default
the retraction to use in the reflection
* $_kw_inverse_retraction_method_default
the inverse retraction to use within the reflection

and for the `reflect!` additionally

* `X`: (`zero_vector(M,p)`) a temporary memory to compute the inverse retraction in place.
* $_kw_X_default
  a temporary memory to compute the inverse retraction in place;
  otherwise this is the memory that would be allocated anyway.
"""
function reflect(
@@ -149,7 +152,7 @@ function reflect!(
q,
p,
x;
retraction_method=default_retraction_method(M),
retraction_method=default_retraction_method(M, typeof(p)),
inverse_retraction_method=default_inverse_retraction_method(M),
X=zero_vector(M, p),
)
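A short usage sketch of `reflect` with the default retraction on the sphere; point and manifold are illustrative:

```julia
using Manifolds, Manopt

M = Sphere(2)
p = [1.0, 0.0, 0.0]
x = [0.0, 1.0, 0.0]

q = reflect(M, p, x)  # retract(M, p, -inverse_retract(M, p, x)); here q ≈ [0.0, -1.0, 0.0]
```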