SQP Method with HiGHS #10
Conversation
For checking the two methods within Gurobi, aside from verifying output from …, I've been using:

```python
import alphargs

# key problem variables loaded from standard format txt files
sigma, mubar, omega, n = alphargs.load_problem(
    "examples/04/A04.txt",
    "examples/04/EBV04.txt",
    "examples/04/S04.txt"
)

sires = range(0, n, 2)
dams = range(1, n, 2)
lam = 0.5
kap = 1

# robust direct optimization using Gurobi
w_rbs, z_rbs, obj_rbs = alphargs.gurobi_robust_genetics(
    sigma, mubar, omega, sires, dams, lam, kap, n)

# robust SQP optimization using Gurobi
w_sqp, z_sqp, obj_sqp = alphargs.gurobi_robust_genetics_sqp(
    sigma, mubar, omega, sires, dams, lam, kap, n)

# compare the robust solutions from the direct and SQP methods
alphargs.print_compare_solutions(
    w_rbs, w_sqp, obj_rbs, obj_sqp, z1=z_rbs, z2=z_sqp,
    name1="w_rbs", name2="w_sqp")

print("\nDirect:")
if not alphargs.check_uncertainty_constraint(z_rbs, w_rbs, omega, debug=True):
    raise ValueError
print("\nSQP:")
if not alphargs.check_uncertainty_constraint(z_sqp, w_sqp, omega, debug=True):
    raise ValueError

print("\nDone!")
```

which produces the output

```
i  w_rbs    w_sqp
1 0.38200 0.38214
2 0.38200 0.38214
3 0.11800 0.11786
4 0.11800 0.11786
w_rbs objective: 0.77684 (z = 0.37924)
w_sqp objective: 0.77684 (z = 0.37892)
Maximum change: 0.00014
Average change: 0.00014
Minimum change: 0.00014
Direct:
z: 0.37923871642022844
w'*Ω*w: 0.3792386953366983
Diff: 2.1083530143961582e-08
SQP:
z: 0.37892405482420255
w'*Ω*w: 0.37892405801748014
Diff: -3.1932775867993257e-09
```

I'm not sure why there's a slight (but not insignificant) difference between the direct robust optimization and the SQP version, though I haven't yet checked the MPS files to see whether there's a model error or similar.
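For anyone following along, here is a minimal sketch of the kind of consistency check being printed above, assuming it compares the auxiliary variable z against sqrt(wᵀΩw) with dense NumPy inputs; the function name and tolerance are illustrative, not necessarily what `alphargs.check_uncertainty_constraint` actually does:

```python
import numpy as np

def check_uncertainty_sketch(z, w, omega, tol=1e-6, debug=False):
    # Assumed check: compare the auxiliary variable z with sqrt(w' * Omega * w).
    # This is a reconstruction for illustration, not alphargs' actual code.
    value = np.sqrt(w @ omega @ w)
    diff = z - value
    if debug:
        print(f"    z: {z}")
        print(f"    w'*Ω*w: {value}")
        print(f"    Diff: {diff}")
    return abs(diff) < tol
```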
Using a similar test to the one above, but just looking at the standard (non-robust) problem,

```python
import alphargs

sigma, mubar, omega, n = alphargs.load_problem(
    "examples/50/A50.txt",
    "examples/50/EBV50.txt",
    "examples/50/S50.txt",
    issparse=True  # loads into SciPy sparse matrices, not NumPy arrays
)

sires = range(0, n, 2)
dams = range(1, n, 2)
lam = 0.5
kap = 1

w_gurobi, obj_gurobi = alphargs.gurobi_standard_genetics(
    sigma, mubar, sires, dams, lam, n)
w_highs, obj_highs = alphargs.highs_standard_genetics(
    sigma, mubar, sires, dams, lam, n)

alphargs.print_compare_solutions(
    w_gurobi, w_highs, obj_gurobi, obj_highs, name1="Gurobi", name2="HiGHS")
```

gives weights that differ by order …
These option values force writing of the model to a file only when running HiGHS from the command line, which allows conversion from …
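As an aside, the same conversion can probably also be done from Python; a minimal sketch, assuming highspy binds `readModel` and `writeModel` as in the C++ API (file names are illustrative):

```python
import highspy

# Sketch: convert a model file between formats via highspy instead of the
# command-line binary.
h = highspy.Highs()
h.setOptionValue("output_flag", False)  # silence the solver log
h.readModel("model.lp")    # read a model written in LP format
h.writeModel("model.mps")  # write the same model back out in MPS format
```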
… was quickly becoming necessary for large examples.
I say "working" in ba6d088 because the HiGHS SQP function gives the correct answer for the Between the two, it points to some issue with how I've defined the |
@jajhall Once …, can I use

```python
h.addVar(0, highspy.kHighsInf)
h.changeColCost(dimension, kappa)
```

to add an additional variable? I tried doing this to incorporate …
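In case it's useful, here is a minimal, self-contained sketch of that pattern, assuming highspy's `addVar`/`changeColCost` bindings; the starting model and the `dimension`/`kappa` values are illustrative only:

```python
import highspy

h = highspy.Highs()
h.setOptionValue("output_flag", False)

# illustrative starting point: a model that already has `dimension` columns
dimension = 4
for _ in range(dimension):
    h.addVar(0, 1)

kappa = 1.0
# append one extra variable z with bounds [0, +inf) ...
h.addVar(0, highspy.kHighsInf)
# ... and give it objective coefficient kappa; the new column sits at
# index `dimension`, so that is the index whose cost is changed
h.changeColCost(dimension, kappa)
```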
I've checked out ba6d088, but … What version of …?
According to …
Yeah, off the back of what you said earlier I've been trying to avoid committing changes to the example file, to avoid flooding the history with minor changes while debugging. I've tried to make usage of the two similar, so it just replaces … This is what I'm using to compare HiGHS-HiGHS:

```python
import alphargs

sigma, mubar, omega, n = alphargs.load_problem(
"examples/04/A04.txt",
"examples/04/EBV04.txt",
"examples/04/S04.txt",
issparse=True
)
sires = range(0, n, 2)
dams = range(1, n, 2)
lam = 0.5
kap = 1
# computes the standard and robust genetic selection solutions
w_std, obj_std = alphargs.highs_standard_genetics(
sigma, mubar, sires, dams, lam, n)
w_rbs, z_rbs, obj_rbs = alphargs.highs_robust_genetics_sqp(
sigma, mubar, omega, sires, dams, lam, kap, n)
alphargs.print_compare_solutions(
    w_std, w_rbs, obj_std, obj_rbs, z2=z_rbs, name1="w_std", name2="w_rbs")
```

And this is what I'm using to compare Gurobi-HiGHS:

```python
import time

import alphargs

sigma, mubar, omega, n = alphargs.load_problem(
"examples/50/A50.txt",
"examples/50/EBV50.txt",
"examples/50/S50.txt",
issparse=True
)
sires = range(0, n, 2)
dams = range(1, n, 2)
lam = 0.5
kap = 1
t0 = time.time()
w_grb, z_grb, obj_grb = alphargs.gurobi_robust_genetics_sqp(
sigma, mubar, omega, sires, dams, lam, kap, n, debug=False)
t1 = time.time()
print(f"Gurobi took {t1-t0:.5f} seconds")
t0 = time.time()
w_his, z_his, obj_his = alphargs.highs_robust_genetics_sqp(
sigma, mubar, omega, sires, dams, lam, kap, n, debug=False)
t1 = time.time()
print(f"HiGHS took {t1-t0:.5f} seconds")
alphargs.print_compare_solutions(
w_grb, w_his, obj_grb, obj_his, z1=z_grb, z2=z_his,
name1="Gurobi", name2="HiGHS", tol=1e-6
)
```
I'm not certain yet, but I think the issue for the pushed method comes from not extending …
Commenting out the …
I immediately see reported …
Your …
Correcting the …
That's because … not …
Correcting that, I see that HiGHS runs for …
If I test against …
My instinct is that, once the gap gets to …

It would be good to know why you get a segfault in …

The good news is that …
A take-home from this is that, when you're developing model-building code, keep the logging on and test the return code from …
Ach damn! I'd suspected an issue with how I'd extended it but couldn't find which values were wrong before leaving. Thanks!
Is there a way to tell HiGHS what level of logging to provide? It would be useful to print warnings and errors but skip everything else. Since it's solving a problem at each iteration, there's a lot of terminal output, which can obscure the important information.
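For what it's worth, here is a minimal sketch of silencing the solver log via highspy's option interface. I'm not aware of a built-in warnings-only level, so this turns logging off entirely; `output_flag` and `log_to_console` are documented HiGHS options:

```python
import highspy

h = highspy.Highs()

# turn off all HiGHS logging (console and file)
h.setOptionValue("output_flag", False)

# or, to keep any log file but quieten the terminal:
# h.setOptionValue("log_to_console", False)
```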
Useful to know that … provides a return code, will incorporate that! Are these the values it returns? I tried to find the docs for …
I've kept a minimal example of this error to one side in case you want it later. Specifically, in the situation where the matrix pointers/indices are set up wrong and it's non-diagonal, there's a segfault if … and an aborted core dump if …
Thanks. If …
The most complete documentation is for the C API, https://ergo-code.github.io/HiGHS/stable/interfaces/c_api/ (since it's the one most used by developers of interfaces to HiGHS). Unfortunately, since you can't pass Python parameters by reference, some of the …

The rule of thumb with HiGHS is that if the method doesn't return a structure like …, it returns a HighsStatus: https://ergo-code.github.io/HiGHS/dev/structures/enums/#HighsStatus
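A minimal sketch of acting on that rule of thumb from Python, assuming highspy exposes the enum as `highspy.HighsStatus` and that `readModel`/`run` return it (the file name is illustrative):

```python
import highspy

h = highspy.Highs()

# readModel and run each return a HighsStatus rather than raising on failure
status = h.readModel("model.mps")
if status != highspy.HighsStatus.kOk:
    raise RuntimeError(f"readModel returned {status}")

status = h.run()
if status != highspy.HighsStatus.kOk:
    raise RuntimeError(f"run returned {status}")
```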
No, the …
Yeah, that doesn't seem to be working, at least when I first tried playing around with it. These are all good to know though, thanks!
Regression introduced, …

```
free(): invalid pointer
Aborted (core dumped)
```

while …

```
WARNING: LP matrix packed vector contains 1 |values| in [0, 0] less than or equal to 1e-09: ignored
ERROR: Matrix dimension validation fails on index size = 4 < 16 = number of nonzeros
ERROR: Matrix dimension validation fails on value size = 4 < 16 = number of nonzeros
```

so it's likely the same problem as last time: I've just messed up a CSR index somewhere. I think it should be `np.append(sigma.indptr, sigma.indptr[-1])`, not …

Update: right place, wrong correction. Hopefully about to fix it.
AKA I wasn't actually extending sigma correctly in the first place
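For the record, a minimal sketch of the corrected extension, assuming the aim is to pad the n×n CSR matrix `sigma` with an empty final row and column for the auxiliary variable (the helper name is illustrative, not alphargs' actual code):

```python
import numpy as np
from scipy.sparse import csr_matrix

def extend_sigma(sigma):
    """Return sigma padded to (n+1) x (n+1) with an empty last row and column."""
    n = sigma.shape[0]
    # the appended row contributes no nonzeros, so indptr just repeats its
    # final value; data and indices are unchanged because the new column is
    # empty as well
    indptr = np.append(sigma.indptr, sigma.indptr[-1])
    return csr_matrix((sigma.data, sigma.indices, indptr), shape=(n + 1, n + 1))
```

Because the appended row and column contain no nonzeros, only `indptr` changes, gaining one repeated final entry.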
An SQP method using Gurobi was added as a proof of concept while I was working out the HiGHS documentation. Now that it's clarified that this is HiGHS' reference QP example (with this using `passModel` as an alternative), we will seek to implement this method in HiGHS instead.

The initial draft of the PR just templates the functions, nothing more. Aside from implementing the method, it'll be necessary to write some new loaders to get HiGHS what it needs. The key steps to completion (in no particular order) are: