
Snowflake destination sync fails with custom namespace #26373

Open
1 task
remisalmon opened this issue May 22, 2023 · 3 comments
Labels
area/connectors (Connector related issues), autoteam, community, Icebox, Stale, team/destinations (Destinations team's backlog), type/bug (Something isn't working)

Comments

@remisalmon (Contributor)

Connector Name

destination-snowflake

Connector Version

1.0.4

What step did the error happen in?

During the sync

Relevant information

Syncs to Snowflake fail when using a custom namespace (a schema name in the destination database) because the connector looks in the PUBLIC schema instead of the <custom namespace> schema.

Example with an S3 source, with destination database = AIRBYTE and custom namespace = S3 (see the SQL sketch after the list):

  • a stage gets created in AIRBYTE.S3.<stage name>
  • the sync fails and the log shows Stage 'AIRBYTE.PUBLIC.<stage name>' does not exist or not authorized.
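
A minimal SQL sketch of the apparent mismatch (the stage name and file path below are hypothetical placeholders; the log does not show the exact statements the connector issues):

-- The stage is created fully qualified under the custom namespace:
CREATE STAGE IF NOT EXISTS "AIRBYTE"."S3"."airbyte_stage";

-- ...but the upload appears to resolve the stage against the default
-- PUBLIC schema, which matches the error message in the log:
PUT file:///tmp/records.csv.gz @"AIRBYTE"."PUBLIC"."airbyte_stage";
-- SQL compilation error:
-- Stage 'AIRBYTE.PUBLIC.airbyte_stage' does not exist or not authorized.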

Relevant log output

2023-05-22 19:31:13 source > finished reading a stream slice
2023-05-22 19:31:13 destination > INFO i.a.i.d.b.BufferedStreamConsumer(periodicBufferFlush):218 Periodic buffer flush started
2023-05-22 19:31:13 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushAllBuffers):133 Flushing all 1 current buffers (349 KB in total)
2023-05-22 19:31:13 destination > INFO i.a.i.d.r.SerializedBufferingStrategy(flushAllBuffers):137 Flushing buffer of stream <stream name> (349 KB)
2023-05-22 19:31:13 destination > INFO i.a.i.d.s.StagingConsumerFactory(lambda$flushBufferFunction$3):211 Flushing buffer for stream <stream name> (349 KB) to staging
2023-05-22 19:31:13 destination > INFO i.a.i.d.r.BaseSerializedBuffer(flush):131 Wrapping up compression and write GZIP trailer data.
2023-05-22 19:31:13 destination > INFO i.a.i.d.r.BaseSerializedBuffer(flush):138 Finished writing data to cebeee30-96b4-4f27-a79f-cd0be3b2a7242582027464504370606.csv.gz (362 KB)
2023-05-22 19:31:13 destination > ERROR i.a.i.d.s.SnowflakeInternalStagingSqlOperations(uploadRecordsToStage):75 Failed to upload records into stage 2023/05/22/19/D57E552A-5226-4E65-B1A0-87F0D3CE187C/ net.snowflake.client.jdbc.SnowflakeSQLException: SQL compilation error:
Stage 'AIRBYTE.PUBLIC.<stage name>' does not exist or not authorized.
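
One way to confirm where the stage actually lives (hypothetical verification queries, assuming the AIRBYTE database and S3 schema from the example above):

SHOW STAGES IN SCHEMA AIRBYTE.S3;      -- the stage shows up here
SHOW STAGES IN SCHEMA AIRBYTE.PUBLIC;  -- ...but not here, hence the error
LIST @AIRBYTE.S3.<stage name>;         -- listing the fully qualified stage succeeds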

Contribute

  • Yes, I want to contribute

@remisalmon (Contributor, Author)

This happens with specific source connectors only; I had this issue with the Sentry and S3 sources but not with others.

@girarda added the Icebox label on Jun 19, 2024
@girarda added the team/destinations (Destinations team's backlog) label and removed the team/connectors-python label on Aug 21, 2024
@octavia-squidington-iii (Collaborator)

At Airbyte, we seek to be clear about the project priorities and roadmap. This issue has not had any activity for 180 days, suggesting that it's not as critical as others. It's possible it has already been fixed. It is being marked as stale and will be closed in 20 days if there is no activity. To keep it open, please comment to let us know why it is important to you and if it is still reproducible on recent versions of Airbyte.
