
lambda defaults #15

Open
seiflotfy opened this issue May 13, 2022 · 0 comments

@seiflotfy (Member)

taken from #13

@a-khaledf :

The default Lambda memory allocation is 128 MB. I am afraid that if we are batching 10,000 events, this might sometimes exceed memory and crash the Lambda function. If we are doing that, I would say let's make the batch size configurable and add a note that the memory allocation should be increased in line with the batching configuration.
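A minimal sketch of what a configurable batch size could look like, assuming a Python Lambda handler; the environment variable name AXIOM_BATCH_SIZE and the chunking helper are illustrative assumptions, not the project's actual configuration:

```python
import os

# Hypothetical knob: read the batch size from the function's environment,
# falling back to the 10,000-event default discussed above. The variable
# name AXIOM_BATCH_SIZE is an assumption for illustration only.
DEFAULT_BATCH_SIZE = 10_000
BATCH_SIZE = int(os.environ.get("AXIOM_BATCH_SIZE", DEFAULT_BATCH_SIZE))

def chunk_events(events, batch_size=BATCH_SIZE):
    """Yield successive batches so a single flush never holds more than
    batch_size events at once."""
    for i in range(0, len(events), batch_size):
        yield events[i:i + batch_size]
```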

@seiflotfy:

I tried it already and did some backfilling. Memory did not exceed the limit and I was hitting the 10k batch size. But you might be right...
I am making it configurable...
However, if we are doing multiple requests back to Axiom because of the batch sizes, I don't think the default 300ms TTL of Lambda would suffice!
WDYT?
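If multiple requests per invocation are on the table, one hedged way to avoid being killed by the configured timeout is to check the remaining execution time before each flush. The sketch below assumes a Python handler; `send_batch` and the margin variable are placeholders, not the project's API, while `context.get_remaining_time_in_millis()` is part of the standard Lambda runtime context:

```python
import os

# Hypothetical safety margin (in ms) kept free before the Lambda timeout;
# the name and default value are assumptions for illustration.
TIMEOUT_MARGIN_MS = int(os.environ.get("AXIOM_TIMEOUT_MARGIN_MS", "500"))

def flush_batches(batches, send_batch, context):
    """Send batches one by one, stopping early if the invocation is about
    to hit its configured timeout instead of being killed mid-request."""
    sent = 0
    for batch in batches:
        if context.get_remaining_time_in_millis() < TIMEOUT_MARGIN_MS:
            # Not enough time left for another request; exit cleanly.
            break
        send_batch(batch)
        sent += 1
    return sent
```

Raising the function's memory and timeout alongside the batch size, as suggested above, remains the simpler fix; the guard only limits the damage when the configuration and the batch size drift apart.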
