MixedLRMoE.jl

MixedLRMoE.jl is a Julia implementation of the Mixed Logit-weighted Reduced Mixture-of-Experts (Mixed LRMoE) model. The theoretical development of the Mixed LRMoE is given in Fung and Tseung (2022+), and an application to automobile insurance is given in Tseung et al. (2023).

To install the stable version of the package, type the following in the Julia REPL:

] add MixedLRMoE
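Equivalently, if you prefer the functional Pkg API (for example, inside a script), the registered package can be installed as sketched below; the package name is taken from the REPL command above.

```julia
# Install the stable (registered) version via the Pkg API,
# equivalent to `] add MixedLRMoE` in the REPL.
using Pkg
Pkg.add("MixedLRMoE")
```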

To install the latest development version, type the following in the Julia REPL:

] add https://github.com/sparktseung/MixedLRMoE.jl
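The same can be done through the Pkg API, using the repository URL from the command above.

```julia
# Install the latest version directly from GitHub via the Pkg API,
# equivalent to `] add https://github.com/sparktseung/MixedLRMoE.jl`.
using Pkg
Pkg.add(url="https://github.com/sparktseung/MixedLRMoE.jl")
```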

The full documentation is available here.