Extremely large values from estimates.C #1273
Replies: 2 comments
-
I'm assuming this is 1p data. The numbers can be very high sometimes and are in arbitrary units. I would not delete them; I would just approach this using common sense: do they look qualitatively reasonable, in that they track the fluorescence fluctuations in the movie? If so, they are reasonable and adjusting parameters isn't needed. The units are simply arbitrary, and you probably just need to normalize the values. Note that negative dF/F is OK: it results from subtracting a baseline "zero" value, and traces can go below baseline.
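The normalization mentioned above can be sketched as follows. This is a minimal illustration, not part of CaImAn itself: `normalize_traces` is a hypothetical helper, and it assumes the traces are available as a NumPy array of shape `(n_components, n_frames)`, as `estimates.C` typically is.

```python
import numpy as np

def normalize_traces(C):
    """Rescale each component's trace to the [0, 1] range.

    Hypothetical helper: since the units are arbitrary, only the
    shape of each trace matters, so per-row min-max scaling is a
    reasonable way to put components on a common scale.
    """
    C = np.asarray(C, dtype=float)
    mins = C.min(axis=1, keepdims=True)
    ranges = np.ptp(C, axis=1, keepdims=True)  # peak-to-peak per row
    ranges[ranges == 0] = 1.0  # avoid division by zero for flat traces
    return (C - mins) / ranges

# Two traces with wildly different absolute magnitudes normalize
# to the same curve, because only the fluctuations matter.
traces = np.array([[0.0, 5.0, 10.0],
                   [0.0, 5e5, 1e6]])
normalized = normalize_traces(traces)
```

After this, a value like 1685333.455 is no longer alarming: it simply maps to 1.0 within its own trace.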
-
Thank you! I understand that negative dF/F is OK. But I wonder: is it normal that estimates.C gives the same result as detrend_df_f, except that some values become negative? Thank you very much!
-
Hello!!
I have some questions about the output.
First, I tried detrend_df_f to correct for bleaching, then selected the accepted components:

```python
F_dff = cnm.estimates.F_dff.copy()
idx_components = cnm.estimates.idx_components
good_components = F_dff[idx_components]
```
But there seems to be no difference between estimates.C and the detrend_df_f output, except that the latter changes some values to negative.
Second, I checked the time-series values: the largest is 1685333.455 (the others range from the thousands down to around ten), and the largest z-score is 15. Many of the values exceed the maximum pixel intensity I measured in the movie before running CNMF-E.
Should I treat these extreme values as outliers and delete them, or should I adjust my parameters?
Thank you very much!