
Float64 replaced by AbstractFloat for regression #226

Merged
merged 4 commits into from
Oct 16, 2023

Conversation

xinadi
Contributor

@xinadi xinadi commented Oct 14, 2023

Based on #225.
Occurrences of Float64 related to labels were replaced by AbstractFloat.
The problem noted in #225 (comment) is still not solved, as it requires more refactoring of the parameters. But the original issue is solved: float types other than Float64 now lead to regression, not classification.
I also see a small performance boost and a reduced number of allocations on my data.
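The dispatch change described above can be sketched as follows (an illustrative predicate, not the actual DecisionTree.jl code; the function name is an assumption):

```julia
# Before the PR, only Float64 labels were routed to regression; Float16 or
# Float32 labels fell through to classification. Widening the annotation to
# AbstractFloat makes any floating-point label type count as a regression target.
is_regression_labels(labels::AbstractVector{T}) where {T} = T <: AbstractFloat

is_regression_labels(Float16[0.5, 1.0])   # true: now treated as regression
is_regression_labels(Float64[0.5, 1.0])   # true: unchanged behavior
is_regression_labels([1, 2, 3])           # false: Int labels mean classification
```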

@codecov

codecov bot commented Oct 15, 2023

Codecov Report

All modified lines are covered by tests ✅

Comparison is base (605e4d4) 87.50% compared to head (4ccba62) 87.65%.

❗ Current head 4ccba62 differs from pull request most recent head 00fd6cb. Consider uploading reports for the commit 00fd6cb to get more accurate results

Additional details and impacted files
@@            Coverage Diff             @@
##              dev     #226      +/-   ##
==========================================
+ Coverage   87.50%   87.65%   +0.15%     
==========================================
  Files          10       10              
  Lines        1328     1329       +1     
==========================================
+ Hits         1162     1165       +3     
+ Misses        166      164       -2     
Files Coverage Δ
src/classification/main.jl 95.88% <100.00%> (-0.04%) ⬇️
src/measures.jl 97.39% <ø> (ø)
src/regression/main.jl 92.59% <100.00%> (+3.70%) ⬆️
src/regression/tree.jl 95.07% <100.00%> (+0.03%) ⬆️

... and 2 files with indirect coverage changes


@rikhuijzer
Member

Thanks for opening this pull request @xinadi. I was trying to reproduce the situations this PR fixes. Could you add a test that fails on the current version of DecisionTree.jl and succeeds on this PR?

@xinadi
Contributor Author

xinadi commented Oct 16, 2023

@rikhuijzer I added a test starting from the line

# Test Float16 labels, and Float16 features

because Float16 is checked there, and it seems the best place for this test.
The idea of the test is as follows: if classification takes place, each prediction will be equal to some value from the training label set, because the estimate is based on votes. For regression, on the other hand, predictions are means and will not exactly match values from the training label set. In theory, even with this PR the test could fail in the situation where, for each prediction, all the trees estimate the same value. But for 10^3 predictions and machine precision of the mean, the probability of this should be very small. Also, as I understand it, the seed of the rng guarantees the same numbers. So I see the test working properly on my PC.

@@ -293,11 +293,7 @@ function apply_tree(tree::LeafOrNode{S,T}, features::AbstractMatrix{S}) where {S
for i in 1:N
predictions[i] = apply_tree(tree, features[i, :])
end
if T <: Float64
Member

This is a weird piece of code. If Float64, then convert to Float64. I agree with deletion and we'll see whether a bug pops up.

@rikhuijzer rikhuijzer merged commit 2c75d94 into JuliaAI:dev Oct 16, 2023
@rikhuijzer
Member

Thank you @xinadi!
