
XGBoost 2.0.0 breaking changes #651

Closed
ksaur opened this issue Sep 20, 2023 · 3 comments


@ksaur
Contributor

ksaur commented Sep 20, 2023

Hi there, XGBoost 2.0.0 was released last week (Sept 12, 2023), and it breaks some things.

Here is an example:


import numpy as np
import xgboost as xgb
from onnxmltools.convert import convert_xgboost
from onnxmltools.convert.common.data_types import FloatTensorType

# Build a small random binary-classification dataset
n_features = 28
n_total = 100
np.random.seed(0)
X = np.random.rand(n_total, n_features).astype(np.float32)
y = np.random.randint(2, size=n_total)

# Train an XGBoost classifier
model = xgb.XGBClassifier()
model.fit(X, y)

# Convert to ONNX; this is the call that fails under xgboost 2.0.0
conv_model = convert_xgboost(model, initial_types=[("input", FloatTensorType(shape=[None, None]))])

With xgboost==1.7.6:
Output: no error

With xgboost==2.0.0:
Output:
TypeError: unsupported operand type(s) for //: 'int' and 'NoneType'

cc: microsoft/hummingbird#732
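Until a fix lands in the converter, callers could guard against the incompatible version up front. The sketch below is only an illustration: the helper names, the `(2, 0)` threshold, and the warning text are my own, not part of onnxmltools.

```python
import warnings


def xgboost_version_tuple(version_string):
    # Parse the leading major/minor components of a version string,
    # e.g. "2.0.0" -> (2, 0) and "1.7.6" -> (1, 7).
    parts = version_string.split(".")
    return tuple(int(p) for p in parts[:2])


def warn_if_unsupported_xgboost(version_string):
    # Hypothetical guard: flag xgboost >= 2.0, which breaks
    # convert_xgboost at the time of this issue. Returns True if
    # the installed version is in the known-broken range.
    if xgboost_version_tuple(version_string) >= (2, 0):
        warnings.warn(
            "xgboost >= 2.0.0 is known to break convert_xgboost; "
            "consider pinning xgboost<2.0 until the converter is updated."
        )
        return True
    return False
```

A caller would invoke `warn_if_unsupported_xgboost(xgb.__version__)` before `convert_xgboost`, so the failure mode is announced instead of surfacing as an opaque TypeError.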

@ParanoidAltoid

I found a really pernicious bug: 2.0.0 produces incorrect output, with no errors.

Running sklearn-onnx's own tutorial on serializing xgboost models fails:
http://onnx.ai/sklearn-onnx/auto_examples/plot_pipeline_xgboost.html
Downloading the file at the bottom of that page and running with 2.0.0 gives this:

predict [162.61285 102.44898 163.59125 112.00508 122.19469]
predict [ 11.50652  -48.65735   12.484911 -39.10126  -28.911642]

(Both outputs should be identical; the second comes from the same model after serialization.)

Maybe I'm missing something obvious, but this was brutal to track down. Until it's fixed, would it be worth adding a warning when this function is called under 2.0.0?
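Silent numeric drift like this is easy to guard against with a round-trip check: run the same inputs through the native model and through the serialized/converted one, and compare. A minimal sketch (the helper name and tolerances are my own, not from any library):

```python
import numpy as np


def predictions_match(native_preds, roundtrip_preds, rtol=1e-4, atol=1e-4):
    # Compare predictions from the original model against those from the
    # serialized-and-reloaded (or ONNX-converted) model. A check like this
    # would have caught the silent 2.0.0 regression reported above.
    native = np.asarray(native_preds, dtype=np.float64)
    roundtrip = np.asarray(roundtrip_preds, dtype=np.float64)
    if native.shape != roundtrip.shape:
        return False
    return bool(np.allclose(native, roundtrip, rtol=rtol, atol=atol))
```

With the two `predict` arrays printed above, this returns False, flagging the regression instead of letting wrong numbers flow downstream.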

@xadupre
Collaborator

xadupre commented Nov 17, 2023

Your example is passing now.

@xadupre
Collaborator

xadupre commented Nov 17, 2023

I'll close the issue.

@xadupre xadupre closed this as completed Nov 17, 2023