Does prior containment work? #44
Open
maxbiostat opened this issue Apr 6, 2022 · 0 comments
maxbiostat commented Apr 6, 2022

Grinsztajn et al. (2021) argue (p. 6217) that [1, 10] is a set of loose bounds for R0. I broadly agree. But what I want to know is this: if I write

Pr(R0 \in [1, 10] | w) >= alpha,

where w denotes the prior hyperparameters and alpha \in (0, 1) is a prior probability level:
(i) Is it easy to set w so as to achieve a given alpha? How does that look for each of the {gamma, log-normal, half-normal} priors?
(ii) Does this "work", in the sense of guaranteeing non-degenerate prior predictives and posterior inferences?
One reason to believe (ii) is shaky is that for (certain) gamma priors and for half-normal priors, the induced prior on R0 is heavy-tailed, so it may still place non-trivial mass on intervals in the vicinity of 100, say. A quick numerical check of both points is sketched below.
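Here is a minimal sketch of (i), assuming for concreteness that the prior is placed directly on R0 (for an induced prior, e.g. R0 = beta/gamma, the same containment check can be run by Monte Carlo on prior draws). The target alpha = 0.95, the anchoring choices (log-normal median and gamma mean fixed at sqrt(10)) and the scipy-based setup are all illustrative assumptions, not anything from the paper or this repo; the last printed column is a crude check on the tail-mass worry in (ii).

```python
import numpy as np
from scipy import stats, optimize

ALPHA = 0.95              # illustrative containment target
LOWER, UPPER = 1.0, 10.0  # loose bounds for R0 from Grinsztajn et al. (2021)

def containment(dist):
    """Pr(R0 in [LOWER, UPPER]) under a frozen scipy distribution."""
    return dist.cdf(UPPER) - dist.cdf(LOWER)

# --- Log-normal: fix the prior median at the geometric midpoint sqrt(10);
#     the log-scale sigma hitting containment ALPHA then has a closed form,
#     since Pr(1 <= R0 <= 10) = 2 * Phi((log(10)/2) / sigma) - 1.
median = np.sqrt(LOWER * UPPER)
sigma = (np.log(UPPER) - np.log(median)) / stats.norm.ppf((1 + ALPHA) / 2)
lognormal = stats.lognorm(s=sigma, scale=median)

# --- Gamma: fix the prior mean at sqrt(10) and solve numerically for the
#     shape k (with scale = mean / k) that hits the containment target.
mean = np.sqrt(LOWER * UPPER)
k = optimize.brentq(
    lambda k: containment(stats.gamma(a=k, scale=mean / k)) - ALPHA, 0.5, 200.0
)
gamma_prior = stats.gamma(a=k, scale=mean / k)

# --- Half-normal: only one scale parameter, and containment of [1, 10] is
#     maximized at a finite scale, so report the best achievable level rather
#     than assuming a root exists (it stays well below 0.95 here).
res = optimize.minimize_scalar(
    lambda s: -containment(stats.halfnorm(scale=s)),
    bounds=(0.1, 50.0), method="bounded"
)
halfnorm_best = stats.halfnorm(scale=res.x)

# Containment achieved, plus tail mass beyond 100 as a rough look at (ii).
for name, dist in [("log-normal", lognormal), ("gamma", gamma_prior),
                   ("half-normal (best)", halfnorm_best)]:
    print(f"{name:20s} Pr(R0 in [1,10]) = {containment(dist):.3f}"
          f"   Pr(R0 > 100) = {dist.sf(100):.2e}")
```

Under this direct-on-R0 reading, the half-normal case is the telling one: being anchored at zero, its containment of [1, 10] tops out well below 0.95 no matter how the scale is chosen, so for that family a high alpha may simply be unattainable and one can only report the best achievable level.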

maxbiostat self-assigned this Apr 6, 2022
maxbiostat added this to the full draft milestone Apr 6, 2022