Compare flux skewers from 2LPT #93
Comments
Maybe @slosar has an idea here.
I don't think exactly the same transformations will work, but it doesn't hurt to try. You'll probably have to retune them.
Hi @slosar - let me explain the concern a bit more:
Current setting:
Potential future setting:
In the long term we'd want to re-tune the parameters in the last two transformations, but given that most of the variance comes from the extra power, I do not expect the statistics in the resulting skewers to be off by order unity. My concern is whether applying a lognormal transformation to the 2LPT fields was the right thing to do, or whether there was something smarter.
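For concreteness, here is a minimal sketch of the kind of transformation chain under discussion for the current Gaussian skewers: a lognormal map followed by an FGPA-style flux mapping. The function names, the `tau0` and `beta` values, and the fake skewer are illustrative placeholders, not the actual code or tuned parameters in this repo:

```python
import numpy as np

def lognormal_density(delta_g, sigma_g):
    """Map a zero-mean Gaussian overdensity skewer to a zero-mean lognormal one."""
    return np.exp(delta_g - 0.5 * sigma_g**2) - 1.0

def fgpa_flux(delta, tau0=0.5, beta=1.6):
    """FGPA-style mapping: tau = tau0 * (1 + delta)^beta, F = exp(-tau).
    tau0 and beta stand in for the tuned parameters mentioned above."""
    return np.exp(-tau0 * np.clip(1.0 + delta, 0.0, None) ** beta)

# Current setting (illustrative): Gaussian skewer -> lognormal density -> flux
sigma_g = 1.2
delta_g = np.random.default_rng(0).normal(0.0, sigma_g, size=2048)  # fake Gaussian skewer
flux = fgpa_flux(lognormal_density(delta_g, sigma_g))
```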
So Gaussian + lognormal is supposed to approximate matter. LPT gives matter already, so I don't think you should be lognormalling that once more. I would apply FGPA to LPT + Gaussian noise, or perhaps LPT + lognormal(Gaussian noise).
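One possible reading of these two options, sketched with placeholder fields and an illustrative FGPA mapping (the actual 2LPT skewers, noise amplitude, and `tau0`/`beta` values would come from the pipeline and would still need tuning):

```python
import numpy as np

def fgpa_flux(delta, tau0=0.5, beta=1.6):
    """Illustrative FGPA mapping: tau = tau0 * (1 + delta)^beta, F = exp(-tau)."""
    return np.exp(-tau0 * np.clip(1.0 + delta, 0.0, None) ** beta)

rng = np.random.default_rng(1)
delta_2lpt = rng.normal(0.0, 0.8, size=2048)   # placeholder for a real 2LPT density skewer
delta_noise = rng.normal(0.0, 0.5, size=2048)  # placeholder extra small-scale Gaussian noise
sigma_noise = delta_noise.std()

# Option 1: FGPA applied to 2LPT + Gaussian noise
flux_option1 = fgpa_flux(delta_2lpt + delta_noise)

# Option 2: FGPA applied to 2LPT + lognormal(Gaussian noise)
delta_noise_ln = np.exp(delta_noise - 0.5 * sigma_noise**2) - 1.0
flux_option2 = fgpa_flux(delta_2lpt + delta_noise_ln)
```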
On Slack we had suggested 2LPT + lognormal(Gaussian noise), but I wasn't quite sure whether this would need to be re-normalised later, to have bias = 1 on linear scales. We could do some simple algebra, or run both and compare :-p
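For what it's worth, one version of that simple algebra, assuming the lognormal piece is defined with the usual mean-preserving convention (a sketch of the argument, not something settled in this thread):

```latex
% Mean-preserving lognormal map of a zero-mean Gaussian field \delta_G with variance \sigma^2:
\delta_{\rm LN} = \exp\!\left(\delta_G - \tfrac{\sigma^2}{2}\right) - 1 ,
\qquad \langle \delta_{\rm LN} \rangle = 0 .
% For any field \delta_L jointly Gaussian with \delta_G (e.g. the large-scale linear density):
\langle \delta_{\rm LN}\,\delta_L \rangle
  = e^{-\sigma^2/2}\,\langle e^{\delta_G}\,\delta_L \rangle
  = e^{-\sigma^2/2}\,e^{\sigma^2/2}\,\langle \delta_G\,\delta_L \rangle
  = \langle \delta_G\,\delta_L \rangle .
```

If that's right, the cross-correlation with the linear field (and hence the linear-scale bias of the lognormal piece) is unchanged by the lognormal map, so no re-normalisation would be needed for that; its auto-correlation does change, though (ξ_LN = e^{ξ_G} − 1), which is where re-tuning would still come in.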
Once we are happy with the 2LPT density skewers, we should go ahead and extract flux skewers.
We could start by applying the same transformations as in the case of the Gaussian skewers, without re-tuning the parameters, and see how it goes.
The only thing we'd need to do is to think about how to actually combine the 2LPT skewers with the extra Gaussian noise.
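One possible way to do that combination would be to draw the extra noise along each skewer with the line-of-sight power that the 2LPT skewers are missing, P_extra(k) ≈ P_target(k) − P_2LPT(k), and add it to the 2LPT overdensity before the flux transformations. A rough sketch with made-up power spectra, ignoring Nyquist/DC-bin subtleties; everything here is a placeholder, including the normalisation convention P(k) = (dz/N)|δ_k|²:

```python
import numpy as np

def draw_noise_skewer(p_extra, dz, rng):
    """Draw a 1D Gaussian noise skewer whose power spectrum is (approximately) p_extra(k),
    with p_extra given on the rfft frequencies of an N-pixel skewer of spacing dz and the
    convention P(k) = (dz / N) * |delta_k|^2."""
    n = 2 * (len(p_extra) - 1)                       # number of real-space pixels (even N)
    amp = np.sqrt(np.clip(p_extra, 0.0, None) * n / (2.0 * dz))
    modes = amp * (rng.normal(size=len(p_extra)) + 1j * rng.normal(size=len(p_extra)))
    modes[0] = 0.0                                   # no DC mode -> zero-mean noise
    return np.fft.irfft(modes, n=n)

# Hypothetical 1D power spectra on the skewer's k-grid (would come from measurements)
n_pix, dz = 2048, 0.25
k = 2.0 * np.pi * np.fft.rfftfreq(n_pix, d=dz)
p_target = 10.0 / (1.0 + (k / 1.0) ** 2)             # stand-in for the target 1D power
p_2lpt   = 10.0 / (1.0 + (k / 0.3) ** 2)             # stand-in for the measured 2LPT 1D power
p_extra  = np.clip(p_target - p_2lpt, 0.0, None)     # power the extra noise has to supply

rng = np.random.default_rng(2)
delta_2lpt = rng.normal(0.0, 0.8, size=n_pix)        # placeholder 2LPT density skewer
delta_total = delta_2lpt + draw_noise_skewer(p_extra, dz, rng)
```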