
[Nginx] Fix stubstatus benchmark #8914

Merged
merged 1 commit into from
Jan 17, 2024

Conversation

ruflin
Collaborator

@ruflin ruflin commented Jan 17, 2024

When ingesting data with the `stream` command, the current track failed with the following error:

```
{"errors":true,"took":0,"ingest_took":1,"items":[{"create":{"_index":"metrics-nginx.stubstatus-ep","_id":null,"status":400,"error":{"type":"illegal_argument_exception","reason":"the document timestamp [2024-01-17T13:58:56.000Z] is outside of ranges of currently writable indices [[2024-01-17T10:46:15.000Z,2024-01-17T13:33:47.000Z]]"}}}
```

The reason was that the timestamp used did not contain any timezone, so it was interpreted as UTC. For the local setup, TSDB automatically sets up an accepted range based on `now`, so if the local time was too far from UTC, ingestion failed. Adding the local timezone to the timestamp fixes this and also ensures that data ingested with `stream` shows up as "current" data in Discover.
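The difference can be sketched in Python (a minimal illustration, not the actual benchmark code): a naive timestamp serializes without an offset and is read as UTC, while a timezone-aware one carries the local offset explicitly.

```python
from datetime import datetime

# Naive timestamp: no offset is serialized, so Elasticsearch reads it as UTC.
naive = datetime(2024, 1, 17, 13, 58, 56)
print(naive.isoformat())  # 2024-01-17T13:58:56

# Timezone-aware timestamp: astimezone() attaches the system's local zone,
# so the serialized value includes the local offset (e.g. +01:00).
aware = naive.astimezone()
print(aware.isoformat())
```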

To test this, run the following command:

```
elastic-package benchmark stream --benchmark stubstatus-benchmark -v
```

This should also work with the 15min prefill as the timestamp set by TSDB for the first index is in the past.
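The failing check can be mimicked with a short sketch (hypothetical helper, using the timestamps from the error message above): TSDB only accepts a document if its timestamp falls inside the writable index's time window.

```python
from datetime import datetime, timezone

def in_writable_range(ts, start, end):
    """Accept a document only if its timestamp lies within the
    writable index's [start, end] window, as TSDB does."""
    return start <= ts <= end

# Window and document timestamp taken from the error message above.
start = datetime(2024, 1, 17, 10, 46, 15, tzinfo=timezone.utc)
end = datetime(2024, 1, 17, 13, 33, 47, tzinfo=timezone.utc)
doc_ts = datetime(2024, 1, 17, 13, 58, 56, tzinfo=timezone.utc)

print(in_writable_range(doc_ts, start, end))  # False -> the 400 error above
```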
@ruflin ruflin self-assigned this Jan 17, 2024
@ruflin ruflin requested a review from a team as a code owner January 17, 2024 14:07
@ruflin ruflin requested a review from ali786XI January 17, 2024 14:07
@ruflin
Collaborator Author

ruflin commented Jan 17, 2024

Do we also have to adjust some other benchmark configs with this change in case it is missing in some?

@ruflin
Collaborator Author

ruflin commented Jan 17, 2024

I followed up with elastic/elastic-package#1625 to make these kinds of issues easier to debug.

@elasticmachine

🚀 Benchmarks report

To see the full report comment with /test benchmark fullreport

@elasticmachine

💚 Build Succeeded

cc @ruflin

@ali786XI
Contributor

Do we also have to adjust some other benchmark configs with this change in case it is missing in some?

Yes, I think so. At least this one would be a candidate; I raised #8761.

Contributor

@ali786XI ali786XI left a comment


LGTM. The changes work fine, and the error described above no longer occurs, as shown here:

(screenshot)

@ruflin ruflin merged commit 61aad07 into elastic:main Jan 17, 2024
1 check passed
@ruflin ruflin deleted the nginx-timezone-timestamp branch January 17, 2024 14:48
@ruflin
Collaborator Author

ruflin commented Jan 17, 2024

@aliabbas-elastic Could you do a follow-up PR for the other ones that use metrics, adding the time zone part?

@ali786XI
Contributor

Sure @ruflin, I am raising them.
