Feature Request: An option to disable flushing to zero on CPU #22858

Open
Zentrik opened this issue Feb 19, 2025 · 0 comments
Labels: CPU (Related to XLA on CPU), enhancement (New feature or request)

Comments

Zentrik (Contributor) commented Feb 19, 2025

I would like a way to disable the code below:

function->addFnAttr("denormal-fp-math", "preserve-sign");

XLA already supports disabling fast math (e.g. via the --xla_cpu_enable_fast_math flag), and the code below shows that we deliberately turn off flushing to zero in the compiler's own threads. So controlling whether we set denormal-fp-math seems reasonable to me.

// Tensorflow tries to enable the following behaviors in all its threads:
//
// - Denormals are zero (DAZ): roughly, operations treat denormal floats as
// zero.
// - Flush denormals to zero (FTZ): roughly, operations produce zero instead
// of denormal floats.
//
// In theory enabling these shouldn't matter since the compiler should ideally
// not leak its environment into generated code, but we turn off DAZ and FTZ
// to get some defense-in-depth.
tsl::port::ScopedDontFlushDenormal dont_flush_denormals;
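
For context, here is a rough sketch of what such a guard conceptually does on x86_64. This is not the actual TSL implementation (which also handles non-x86 platforms); only the standard MXCSR FTZ/DAZ bit positions are assumed.

#include <xmmintrin.h>  // _mm_getcsr / _mm_setcsr

// RAII guard: clear the MXCSR FTZ (bit 15) and DAZ (bit 6) bits on entry and
// restore the saved control/status word on exit.
class ScopedDontFlushDenormalSketch {
 public:
  ScopedDontFlushDenormalSketch() : saved_csr_(_mm_getcsr()) {
    _mm_setcsr(saved_csr_ & ~(kFtzBit | kDazBit));
  }
  ~ScopedDontFlushDenormalSketch() { _mm_setcsr(saved_csr_); }

 private:
  static constexpr unsigned kFtzBit = 1u << 15;  // flush-to-zero
  static constexpr unsigned kDazBit = 1u << 6;   // denormals-are-zero
  unsigned saved_csr_;
};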

Based on llvm/llvm-project#81204 (comment) and https://groups.google.com/g/llvm-dev/c/TDGKHFU4hzE/m/k-LEa3NvBQAJ, I don't believe that setting denormal-fp-math actually changes the floating-point control register on x86_64; it may only affect constant folding.
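
As a standalone illustration (ordinary C++, not XLA-generated code), the flush-to-zero behaviour of SSE arithmetic follows the thread's MXCSR state at run time, which is consistent with the attribute only influencing the optimizer:

#include <xmmintrin.h>  // _MM_SET_FLUSH_ZERO_MODE
#include <cstdio>
#include <limits>

int main() {
  // volatile keeps the compiler from constant-folding the multiplications.
  volatile float tiny = std::numeric_limits<float>::min();  // smallest normal float

  _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_OFF);
  volatile float kept = tiny * 0.5f;     // subnormal result is preserved

  _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_ON);
  volatile float flushed = tiny * 0.5f;  // same computation, now flushed to zero

  std::printf("FTZ off: %g\nFTZ on:  %g\n", (double)kept, (double)flushed);
}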

EDIT: Also, on GPU the xla_gpu_ftz flag can already be used to control this.
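
To make the request concrete, here is a hypothetical sketch of the CPU-side knob I have in mind. The helper name and boolean option are invented for illustration and are not an existing XLA API; "ieee" is LLVM's default denormal mode, i.e. no assumed flushing.

#include "llvm/IR/Function.h"

// Hypothetical: choose the "denormal-fp-math" mode from an option instead of
// hard-coding "preserve-sign". With the option off, generated code keeps the
// default IEEE denormal semantics.
void SetDenormalMode(llvm::Function* function, bool flush_denormals_to_zero) {
  function->addFnAttr("denormal-fp-math",
                      flush_denormals_to_zero ? "preserve-sign" : "ieee");
}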

@aniruthraj added the enhancement (New feature or request) and CPU (Related to XLA on CPU) labels on Feb 26, 2025