forked from mikekestemont/pandora
Develop a tuning toolkit #81
Other idea: the current tuning.py could have a second option to generate such configs using a CSV file:

name,include_lemma,include_pos,nb_left_tokens...
config1,True,True,2
config2,True,True,1
Given a dev set, there is nothing stopping us from doing hyperparameter optimization. There are a couple of non-Bayesian methods worth trying and offering to people with GPUs and time to invest in squeezing the most out of their data, such as Hyperband or successive halving.
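A successive-halving loop is simple to sketch. The following is a generic illustration, not Pandora code; `evaluate(config, budget)` is a hypothetical callback that trains `config` for `budget` units (e.g. epochs) and returns its dev-set loss:

```python
def successive_halving(configs, evaluate, budget=1, eta=2):
    """Train every surviving config with a growing budget and keep the
    best 1/eta fraction each round, until one config remains."""
    survivors = list(configs)
    while len(survivors) > 1:
        # Lower evaluate() score is better (e.g. dev-set loss).
        scored = sorted(survivors, key=lambda c: evaluate(c, budget))
        survivors = scored[:max(1, len(scored) // eta)]
        budget *= eta  # survivors get eta times more training next round
    return survivors[0]
```

Hyperband then wraps such a loop, rerunning it with different trade-offs between the number of configs and the starting budget.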
On Mon, 23 Oct 2017 at 12:45, Thibault Clérice wrote:

> Other idea:
> The actual tuning.py could have a second option to generate such configs
> using a csv file:
>
> name,include_lemma,include_pos,nb_left_tokens...
> config1,True,True,2
> config2,True,True,1
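A sketch of the parsing side of that CSV-driven mode (a hypothetical helper, not tuning.py's real API; it reads one config per row, using the header row for option names, and coerces booleans and integers):

```python
import csv
from io import StringIO

def configs_from_csv(csv_text):
    """Parse one tuning configuration per CSV row; the header row
    supplies the option names, the `name` column labels the config."""
    configs = []
    for row in csv.DictReader(StringIO(csv_text)):
        name = row.pop("name")
        options = {}
        for key, value in row.items():
            if value in ("True", "False"):
                options[key] = value == "True"   # coerce booleans
            else:
                try:
                    options[key] = int(value)    # coerce integers
                except ValueError:
                    options[key] = value         # keep other values as strings
        configs.append((name, options))
    return configs
```

Each `(name, options)` pair could then be expanded into one full config file per row.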
I am all for method-based hyperparameter optimization. The only thing is that it might take more time than just what I described. But please, if you find the time to do this one, I'd be glad to test it :)
Note: we could also simply use existing libraries such as https://github.com/hyperopt/hyperopt