Escaping Data for Data Types BOOLEAN / SUPER / GEOMETRY in Hash-Function #240

Merged

Conversation

MatheHostel
Contributor

Description

This PR takes the correct data_type for ghost records and escapes the values of the BOOLEAN, GEOMETRY, and SUPER data types for the hashing function, because these data types cannot be passed directly to the TRIM() function used inside the hashing macro.
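For illustration, the core idea can be sketched as a small dbt macro that casts these data types to text before trimming. This is a minimal sketch under assumed internals; the VARCHAR cast target and the exact dispatch are not necessarily what this PR implements:

    {#- Sketch only: cast data types that TRIM() cannot handle to text first. -#}
    {%- macro get_field_hash_by_datatype(column_name, data_type) -%}
        {%- if (data_type | lower) in ['boolean', 'super', 'geometry'] -%}
            TRIM(CAST({{ column_name }} AS VARCHAR))
        {%- else -%}
            TRIM({{ column_name }})
        {%- endif -%}
    {%- endmacro -%}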

Fixes #(issue)

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

How Has This Been Tested?

Configure hash columns via "is_hashdiff" with BOOLEAN / GEOMETRY / SUPER data types and create the stage. Without this fix, the error function pg_catalog.btrim(boolean, "unknown") does not exist occurs.
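For illustration, the failure can be reproduced outside dbt with a plain query against a BOOLEAN column (table and column names below are made up):

    -- On Postgres/Redshift, TRIM() resolves to pg_catalog.btrim, which has no BOOLEAN overload:
    SELECT TRIM(is_active) FROM customer_stage;                    -- fails to resolve btrim(boolean)

    -- Casting to text first, as the patched hashing logic does, avoids the error:
    SELECT TRIM(CAST(is_active AS VARCHAR)) FROM customer_stage;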

  • Tests you ran

Test Configuration:

  • datavault4dbt-Version:
  • dbt-Version: cloud/core, version number
  • dbt-adapter-Version:

Checklist:

  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation or included information that needs updates (e.g. in the Wiki)

@tkirschke
Member

Hi @MatheHostel ,

Thanks for your contribution!
Can you please move your new macro "get_field_hash_by_datatype" into a new file within "macros/supporting/field_hash_by_datatype"?

Best regards
Tim

@MatheHostel
Contributor Author

Done.
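For illustration, a hypothetical call site for the relocated macro could look like this (the column list and the call through the datavault4dbt package namespace are assumptions made for the sketch, not the actual package internals):

    {#- Hypothetical call site: with the macro in macros/supporting/field_hash_by_datatype.sql,
        it can be invoked per column from the hashing logic. Columns are made up. -#}
    {%- set hashdiff_columns = [('is_active', 'boolean'), ('shape', 'geometry'), ('payload', 'super')] -%}
    {%- for col, dtype in hashdiff_columns %}
    {{ datavault4dbt.get_field_hash_by_datatype(col, dtype) }}{{ ',' if not loop.last }}
    {%- endfor -%}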

@tkirschke added the label "testing" (to trigger the automated test workflow as internal user) on Sep 2, 2024
empty commit to trigger test pipeline
@remoteworkflow

remoteworkflow bot commented Sep 2, 2024

Link to workflow summary: https://github.com/ScalefreeCOM/datavault4dbt-ci-cd/actions/runs/10668880822


RESULTS for Synapse:
❌ dbt-tests
❌ dbt-macro-tests


RESULTS for Postgres:
❌ dbt-tests
❌ dbt-macro-tests


RESULTS for BigQuery:
❌ dbt-tests
❌ dbt-macro-tests


RESULTS for Redshift:
❌ dbt-tests
❌ dbt-macro-tests


RESULTS for Snowflake:
❌ dbt-tests
❌ dbt-macro-tests


RESULTS for Exasol:
❌ dbt-tests
❌ dbt-macro-tests

@remoteworkflow

remoteworkflow bot commented Sep 3, 2024

Link to workflow summary: https://github.com/ScalefreeCOM/datavault4dbt-ci-cd/actions/runs/10678132620


RESULTS for Synapse:
✅ dbt-tests
✅ dbt-macro-tests


RESULTS for Postgres:
✅ dbt-tests
✅ dbt-macro-tests


RESULTS for BigQuery:
✅ dbt-tests
✅ dbt-macro-tests


RESULTS for Redshift:
✅ dbt-tests
✅ dbt-macro-tests


RESULTS for Snowflake:
✅ dbt-tests
✅ dbt-macro-tests


RESULTS for Exasol:
❌ dbt-tests
✅ dbt-macro-tests

@tkirschke merged commit bd35a70 into ScalefreeCOM:main on Sep 3, 2024
1 check passed