
make batch prediction able to not retrain using predicted guesses #118

Open
ardunn opened this issue Jul 31, 2020 · 2 comments
Labels
enhancement v1.1 updates for version 1.1

Comments

ardunn (Collaborator) commented Jul 31, 2020

Right now, batch prediction removes data points from all_xz_unsearched to simulate not having those points, retraining the model n times for a batch size of n. So when predicting on 100 points, it will predict point 1, recompute the acquisition function for point 2, and so on. The problem is that recomputing the acquisition function for each point is expensive, especially if each recomputation uses a bootstrap (e.g., RF). However, there's no need for the model to actually be retrained for every point in the batch; maybe the model can be trained once per batch and the code rewritten to compute only the acquisition function for each new point in the batch.
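A minimal sketch of the train-once-per-batch idea. All names here (`train_surrogate`, `acquire_batch`, the toy 1-NN surrogate) are assumptions for illustration, not rocketsled's actual API; the acquisition step is plain greedy minimization of the prediction:

```python
# Hypothetical sketch (names are assumptions, not rocketsled's API):
# pay the expensive training cost once per batch, then run only the
# cheap acquisition step as points are drawn from the unsearched set.

def train_surrogate(x_train, y_train):
    """Toy 1-nearest-neighbor surrogate standing in for an expensive model."""
    def predict(x):
        # predict the y of the closest training point
        i = min(range(len(x_train)), key=lambda j: abs(x_train[j] - x))
        return y_train[i]
    return predict

def acquire_batch(x_train, y_train, x_unsearched, batch_size):
    """Select a batch: one training step, then repeated acquisition only."""
    model = train_surrogate(x_train, y_train)  # expensive step, done once
    remaining = list(x_unsearched)
    chosen = []
    for _ in range(batch_size):
        # acquisition here is greedy minimization of the predicted value
        best = min(remaining, key=model)
        chosen.append(best)
        remaining.remove(best)  # simulate not having the point; no retrain
    return chosen
```

With a real bootstrapped model (e.g., an RF ensemble) the same structure applies: the fit happens once and only the per-point acquisition scores are recomputed inside the loop.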

ardunn added the enhancement and v1.1 labels Jul 31, 2020
ardunn (Collaborator, Author) commented Jul 31, 2020

Alternatively, just store the predictions temporarily and draw from them until the batch runs out.
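A sketch of the caching alternative, assuming hypothetical helper names (`cache_batch_predictions`, `next_from_batch`) that are not part of rocketsled: compute all predictions once, keep them in a heap for the batch, and pop the best remaining guess with no retraining:

```python
# Hypothetical caching alternative (names are assumptions, not
# rocketsled's API): score every unsearched point once, then serve
# guesses from the stored scores until the batch is exhausted.
import heapq

def cache_batch_predictions(points, predictions):
    """Store (prediction, point) pairs in a min-heap for cheap batch draws."""
    heap = [(p, x) for x, p in zip(points, predictions)]
    heapq.heapify(heap)
    return heap

def next_from_batch(heap):
    """Pop the best (lowest-prediction) stored guess; no retraining needed."""
    _, x = heapq.heappop(heap)
    return x
```

The heap would be discarded and rebuilt once the batch runs out and the model is retrained.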


ardunn commented Jul 31, 2020

This can be done either by separating rocketsled.acq.acquire into two functions, train and acquire, or by moving all the training code into OptTask.
