From 794e7e266b21df9ffb7321d9b5fbbfca7bd3b51c Mon Sep 17 00:00:00 2001
From: Maxim Ziatdinov <34245227+ziatdinovmax@users.noreply.github.com>
Date: Mon, 4 Nov 2024 22:11:49 -0800
Subject: [PATCH 1/2] Update README.md

---
 README.md | 34 ++++++++++++++++++++++++++++++++++
 1 file changed, 34 insertions(+)

diff --git a/README.md b/README.md
index 19085f1..81e01a6 100644
--- a/README.md
+++ b/README.md
@@ -132,3 +132,37 @@ This approach allows you to incorporate domain knowledge or theoretical models i
 ![pretrained_priors](https://github.com/user-attachments/assets/33f80877-4a5c-46d2-ba5d-ee540418e21b)
 
 See example [here](https://github.com/ziatdinovmax/NeuroBayes/blob/main/examples/pretrained_priors.ipynb).
+
+### Comparison with GP-based methods
+NeuroBayes provides implementations of Gaussian Process (GP) and Deep Kernel Learning (DKL) models for comparison with BNN approaches. These implementations support both fully Bayesian and variational inference methods.
+
+Gaussian Process:
+
+```python3
+# Specify kernel
+kernel = nb.kernels.MaternKernel
+# Initialize GP model
+model = nb.GP(kernel)
+# Train the same way as a BNN
+model.fit(X_measured, y_measured, num_warmup=1000, num_samples=1000)
+# Make predictions the same way as with a BNN
+posterior_mean, posterior_var = model.predict(X_domain)
+```
+
+Deep Kernel Learning:
+
+```python3
+# Set the number of latent dimensions
+latent_dim = 2
+# Initialize NN architecture for the feature extractor part of the DKL
+architecture = nb.FlaxMLP(hidden_dims=[64, 32, 16, 8], target_dim=latent_dim)
+# Specify kernel for the GP part of DKL
+kernel = nb.kernels.MaternKernel
+
+# Initialize DKL model
+model = nb.DKL(architecture, kernel)
+# Train and predict the same way as with GP and BNN
+model.fit(X_measured, y_measured, num_warmup=1000, num_samples=1000)
+posterior_mean, posterior_var = model.predict(X_domain)
+```
+The training and prediction interface is consistent across all model types (BNN, PBNN, GP, and DKL) in NeuroBayes, 
making it easy to compare different approaches for your specific use case.

From e6332828de77fc220b1f898cb0e180423b33ec85 Mon Sep 17 00:00:00 2001
From: Maxim Ziatdinov <34245227+ziatdinovmax@users.noreply.github.com>
Date: Fri, 8 Nov 2024 07:59:11 -0800
Subject: [PATCH 2/2] Add a link to surrogate model selector app

---
 README.md | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/README.md b/README.md
index 81e01a6..850d91f 100644
--- a/README.md
+++ b/README.md
@@ -166,3 +166,11 @@ model.fit(X_measured, y_measured, num_warmup=1000, num_samples=1000)
 posterior_mean, posterior_var = model.predict(X_domain)
 ```
 The training and prediction interface is consistent across all model types (BNN, PBNN, GP, and DKL) in NeuroBayes, making it easy to compare different approaches for your specific use case.
+
+### Surrogate model recommender
+
+I made a simple tool to guide the selection of a surrogate model (BNN, PBNN, DKL, or GP) in the active learning setting:
+
+https://surrogate-model-selector.vercel.app/
+
+Note: The tool reflects typical model behavior with respect to active learning requirements such as training time per iteration; actual performance can vary significantly with implementation details, hyperparameter tuning, and the characteristics of a specific problem.
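
Because the fit/predict interface is shared, the surrogates above can be swapped behind a single loop. A minimal sketch, assuming `nb` is imported and `X_measured`, `y_measured`, and `X_domain` are defined as in the snippets above (the dictionary and loop are illustrative, not part of the NeuroBayes API):

```python3
# Build both surrogates with the constructors shown above
kernel = nb.kernels.MaternKernel
surrogates = {
    "GP": nb.GP(kernel),
    "DKL": nb.DKL(nb.FlaxMLP(hidden_dims=[64, 32, 16, 8], target_dim=2), kernel),
}

# Identical training and prediction calls for every model type
for name, model in surrogates.items():
    model.fit(X_measured, y_measured, num_warmup=1000, num_samples=1000)
    posterior_mean, posterior_var = model.predict(X_domain)
```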