Botorch with cardinality constraint via sampling #301
base: main
Conversation
Force-pushed from 0be3285 to 1f7783d
Hi @Waschenbacher, thanks for the work. This is not yet a review but only a first batch of very high-level comments that I would like you to address before we can go into the actual review process. The reason is that the main functionality brought by this PR is currently rather convoluted and hard to parse, so I'd prefer to work with a more readable version, tbh.
Force-pushed from 1f7783d to 9d28b49
Force-pushed from f68ce4f to fc52875
Hi @Waschenbacher, I've finally managed to spend some time on this important PR, thanks again for your preparation work. I've already refactored some parts that were clear to me. However, I'm not yet certain about the design at the remaining places. I have the feeling that it can potentially be simplified a lot, depending on whether it is possible to reuse the idea of reduced subspaces. I have marked the corresponding places with comments. I think we need to discuss this part first before we can continue.
First high-level review regarding points that we should discuss. Not a full review yet, as I think that the code might change depending on what is decided regarding `min_cardinality`.
@@ -76,6 +85,13 @@ class BotorchRecommender(BayesianRecommender):
        optimization. **Does not affect purely discrete optimization**.
    """

    max_n_subspaces: int = field(default=10, validator=[instance_of(int), ge(1)])
I think that it should be linked to `cardinality` somehow: for people that are not interested in cardinality constraints, it is only clear after reading the docstring that this is not interesting for them. This would however make the name quite long :/ So although I am not perfectly happy with the name, we could keep it, as I do not see a better alternative while keeping it here.
    subspace_continuous: SubspaceContinuous,
    batch_size: int,
) -> tuple[Tensor, Tensor]:
    """Recommend from a continuous search space with cardinality constraints.
I like it, only some information about how `max_n_subspaces` comes into play here is still missing for me :)
@@ -301,6 +334,45 @@ def _drop_parameters(self, parameter_names: Collection[str]) -> SubspaceContinuo
            ],
        )

    def _enforce_cardinality_constraints_via_assignment(
Since there is only one way of enforcing cardinality constraints, why not simply `enforce_cardinality_constraints`? If people are interested in the details of the how, they can read the docstring.
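To illustrate the "via assignment" idea being discussed: a minimal sketch, assuming (hypothetically) that inactivity is enforced by simply assigning zero to the chosen inactive parameters of each candidate point. This is a simplification, not the PR's actual implementation:

```python
import pandas as pd

def enforce_cardinality_constraints(
    points: pd.DataFrame, inactive_parameters: list[str]
) -> pd.DataFrame:
    """Fix all inactive parameters to zero (simplified illustrative sketch)."""
    out = points.copy()
    out[inactive_parameters] = 0.0
    return out

points = pd.DataFrame({"x1": [0.3, 0.7], "x2": [0.5, 0.1]})
fixed = enforce_cardinality_constraints(points, ["x2"])
```

After the call, every row has `x2 == 0.0`, so at most one parameter (`x1`) remains active.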
- Replace redundant function
- Ensure near-zero range being an open interval
- Add to-do related to customized error in botorch
- Add to-do related to active parameters guarantee in random sampler
@AdrianSosic This is the updated PR according to our discussion in the baybathon. The main changes are below:
@@ -41,6 +49,9 @@ def validate_constraints(  # noqa: DOC101, DOC103
    param_names_discrete = [p.name for p in parameters if p.is_discrete]
    param_names_continuous = [p.name for p in parameters if p.is_continuous]
    param_names_non_numerical = [p.name for p in parameters if not p.is_numerical]
    params_continuous: list[NumericalContinuousParameter] = [
        p for p in parameters if isinstance(p, NumericalContinuousParameter)
Sorry for my confusion; the method exists in this PR (https://github.com/emdgroup/baybe/pull/291/files#diff-9b02c8d8e9e86b086ea306806ebc5e47435ac5045557f8eb138a7be22c3cb0e8R461), which is however on hold.
This is an incomplete review as I was not aware that there is still stuff being worked on. Feel free to either already include my comments or just resolve them.
baybe/parameters/numerical.py (outdated)

    Important:
        Value in the open interval (-near_zero_threshold, near_zero_threshold)
        will be treated as near_zero.
Inconsistencies regarding the use of `near_zero` and `near-zero` in this docstring.
It is renamed to `zeros` in the updated version. Resolving it for now. If `near_zero` is preferred, I'm open to renaming it back.
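The open-interval semantics from the docstring above are easy to state precisely. A minimal sketch (the function name `is_near_zero` is illustrative, not necessarily the name used in the PR):

```python
def is_near_zero(value: float, near_zero_threshold: float) -> bool:
    """True iff value lies in the OPEN interval (-threshold, threshold).

    The endpoints themselves are excluded, so a value exactly equal to the
    threshold is NOT treated as near-zero.
    """
    return -near_zero_threshold < value < near_zero_threshold

print(is_near_zero(0.05, 0.1))   # → True
print(is_near_zero(0.1, 0.1))    # → False (endpoint excluded)
```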
cardinality is violated in `BotorchRecommender`
- Attribute `max_n_subspaces` to `BotorchRecommender`, allowing to control optimization behavior in the presence of multiple subspaces
- Utilities `inactive_parameter_combinations` and `n_inactive_parameter_combinations`
I think these, as well as the utilities noted below, are not user-facing, are they? In that case, I do not think it is necessary to include them in the CHANGELOG.
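For intuition on what utilities like the ones named above could do, here is a hedged sketch, assuming (simplifying) that only a maximum cardinality applies: with at most `max_cardinality` active parameters, at least `n - max_cardinality` must be inactive, and the admissible inactive sets can be enumerated and counted combinatorially. This is an illustration, not the PR's actual code:

```python
from itertools import combinations
from math import comb

def inactive_parameter_combinations(names: list[str], max_cardinality: int):
    """Yield every admissible set of inactive parameters (simplified sketch)."""
    min_inactive = len(names) - max_cardinality
    for size in range(min_inactive, len(names) + 1):
        yield from (set(c) for c in combinations(names, size))

def n_inactive_parameter_combinations(n_parameters: int, max_cardinality: int) -> int:
    """Count the combinations without materializing them."""
    min_inactive = n_parameters - max_cardinality
    return sum(comb(n_parameters, k) for k in range(min_inactive, n_parameters + 1))

combos = list(inactive_parameter_combinations(["a", "b", "c"], max_cardinality=1))
# With 3 parameters and at most 1 active: 3 sets of size 2, plus the full set.
print(n_inactive_parameter_combinations(3, 1))  # → 4
```

Having a closed-form count available is what makes a cap like `max_n_subspaces` cheap to check before any enumeration happens.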
        return pd.DataFrame(points, columns=subspace_continuous.parameter_names)

    def _recommend_continuous_torch(
Why do we need an additional function for that? In my opinion, this check could be done in the original function, and this function just adds another layer of complexity. Also, I think the name is weird, why the explicit mention of `torch`?
`_recommend_continuous` is the outer layer, does some checks, and returns a `pd.DataFrame`, while `_recommend_continuous_torch` is only responsible for returning `points, acqf_value` with type `Tensor`. Moreover, `_recommend_continuous_torch` is needed in `_optimize_continuous_subspaces` (see https://github.com/emdgroup/baybe/pull/301/files#r1825774190). Correct me if I'm wrong @AdrianSosic
Yes, exactly, it's to fit the required interfaces on both sides. `_recommend_continuous` still operates on the dataframe level as it needs to return outputs that are then shipped to the user. But there are also places like `_optimize_continuous_subspaces` where you still need access to the low-level output like the corresponding acqf values. The only (sort of reasonable) way I saw to achieve both here is to extract that inner part and declare a separate function for it. But I'd be very happy if you see a more elegant alternative 👍🏼
    subspace_continuous: SubspaceContinuous,
    batch_size: int,
) -> tuple[Tensor, Tensor]:
    """Recommend from a continuous search space with cardinality constraints.
This is still the case.
- Support checking minimum cardinality or maximum cardinality
- Adapt to threshold per cardinality
- Update related tests
Force-pushed from ee54c89 to bb1fc3d
* Assure parameter bounds cover zero
* Check invalid "activate_parameter" option first

Force-pushed from bb1fc3d to bddab62
Since there seems to be the wish to merge this soon, please take this out of draft mode if it is indeed ready for review. I will not have a look at this earlier, since I assume that being in draft mode means that it is not yet ready (which, however, conflicts with some of the comments that I see here).
(edited by @AdrianSosic)
This PR adds support for cardinality constraints to `BotorchRecommender`. The core idea is to tackle the problem in an exhaustive-search-like manner, i.e. by enumerating the subspaces induced by the possible sets of inactive parameters and optimizing them separately. The PR implements two mechanisms for determining the configuration of inactive parameters:
The current aggregation step is to simply optimize all subspaces independently of each other and then return the batch from the subspace where the highest acquisition value is achieved. This has the side-effect that the set of inactive parameters is the same across the entire recommendation batch. This can be a desirable property in many use cases but potentially higher acquisition values can be obtained by altering the in-/activity sets across the batch. A simple way to achieve this (though out of scope for this PR) is by generalizing the sequential greedy principle to multiple subspaces.
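The aggregation step described above (optimize each subspace independently, then keep the batch from the subspace with the highest acquisition value) can be sketched as follows. The data layout is a hypothetical stand-in: each subspace label maps to a `(points, acqf_value)` pair produced by per-subspace optimization:

```python
def pick_best_subspace(subspace_results: dict) -> tuple[str, list]:
    """Return (label, points) of the subspace with the highest acqf value.

    Because one whole batch is taken from a single subspace, the set of
    inactive parameters is the same across the entire recommendation batch.
    """
    label, (points, _) = max(subspace_results.items(), key=lambda kv: kv[1][1])
    return label, points

# Two candidate subspaces, each with its optimized batch and acqf value:
results = {
    "x1_inactive": ([[0.0, 0.4]], 1.2),
    "x2_inactive": ([[0.7, 0.0]], 2.5),
}
best_label, best_points = pick_best_subspace(results)
print(best_label)  # → "x2_inactive"
```

The generalization mentioned in the text (sequential greedy across subspaces) would instead pick one point at a time, allowing different rows of the batch to come from different subspaces.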
Out of scope