LRMoE.jl

LRMoE.jl is a Julia implementation of the Logit-weighted Reduced Mixture-of-Experts (LRMoE) model. The package is introduced in Tseung et al. (2021).

To install the stable version of the package, type the following in the Julia REPL:

```julia
] add LRMoE
```
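
Equivalently, the registered release can be installed through the `Pkg` standard library (a minimal sketch; it assumes the registered package name `LRMoE` shown above):

```julia
using Pkg

# Add the registered (stable) release of LRMoE.jl from the General registry
Pkg.add("LRMoE")
```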

To install the latest development version, type the following in the Julia REPL:

```julia
] add https://github.com/sparktseung/LRMoE.jl
```
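
The same can be done with the `Pkg` API (a sketch assuming the repository URL above):

```julia
using Pkg

# Track the package directly from its GitHub repository
Pkg.add(url="https://github.com/sparktseung/LRMoE.jl")
```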

The full documentation is available here.