destination sqlalchemy with Exasol and its sqlalchemy-exasol backend does not work in dlt #2249
Comments
@JZ-poc thanks for this report, it was very helpful! Regarding the issues:
* adds Exasol entity-not-found cases to recognized exceptions
* makes sqlalchemy indexes optional and off by default
@rudolfix: Thank you. Regarding point 3, I exchanged read_csv with read_csv_duckdb; same error. Note that the CSV file does not contain any dates at all, so dlt internals actually produced the error while trying to write data to _dlt_version, _dlt_pipeline_state, or _dlt_loads. Any idea? Sorry to ask: I can see that your commit was merged to the devel branch. How do I see when it is in master/main? Here is the log in DEBUG mode:

The above exception was the direct cause of the following exception: Traceback (most recent call last): <class 'TypeError'>
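The TypeError in that traceback belongs to a well-known error class: Python's standard json module refuses to serialize datetime-like objects. This is a minimal stdlib sketch of the failure mode, not dlt's actual serialization path; the row contents are made up for illustration.

```python
import json
from datetime import datetime, timezone

# A row containing a datetime value, as a dlt metadata table row might.
row = {"inserted_at": datetime(2024, 1, 1, tzinfo=timezone.utc)}

try:
    json.dumps(row)
except TypeError as exc:
    # json.dumps has no default encoding for datetime objects.
    print(exc)  # Object of type datetime is not JSON serializable

# A common workaround: encode datetimes as ISO-8601 strings.
encoded = json.dumps(row, default=lambda o: o.isoformat())
print(encoded)
```

Since the CSV contains no dates, a datetime created by dlt itself (e.g. a load timestamp for its metadata tables) reaching an unprepared JSON encoder would be consistent with this error.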
dlt version
1.5.0
Describe the problem
I am experiencing multiple issues when using dlt with the SQLAlchemy destination for Exasol. My expectation was that dlt would successfully generate a table for the CSV file (test_data.csv) in Exasol. However, the process encounters multiple failures, including:
1. Missing dlt metadata tables, causing dlt to issue SELECT queries on non-existent tables.
2. Incorrect table-creation SQL, leading to an error due to an unsupported UNIQUE constraint on the _dlt_id column.
3. A serialization error (TypeError: Object of type DateTime is not JSON serializable), preventing further execution.
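The UNIQUE-constraint failure can be illustrated in isolation: declaring a column with unique=True makes SQLAlchemy include a UNIQUE constraint in the generated CREATE TABLE DDL, which a backend that rejects such constraints will refuse. A sketch assuming only that sqlalchemy is installed (the table and column names mirror this report; no Exasol connection is needed to see the DDL):

```python
from sqlalchemy import Column, MetaData, Table, Text
from sqlalchemy.schema import CreateTable

# unique=True on a column produces a table-level UNIQUE constraint
# in the generic DDL that SQLAlchemy compiles.
metadata = MetaData()
table = Table("test_data", metadata, Column("_dlt_id", Text, unique=True))

ddl = str(CreateTable(table))
print(ddl)  # the emitted DDL contains "UNIQUE (_dlt_id)"
```

This matches the fix mentioned in the maintainer's commit ("makes sqlalchemy indexes optional and off by default"): if dlt stops requesting the constraint, the offending clause never reaches the backend.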
A similar issue has been reported for SAP HANA (Issue #2110), and a fix has been proposed in PR #2213, which might also resolve the problem for Exasol.
Errors Encountered
Expected behavior
My expectation was that dlt would properly generate a table for the CSV file (test_data.csv) in Exasol and load the data without errors. However, the process fails due to multiple errors.
Steps to reproduce
The Python script test_pipeline.py executes the pipeline:
The following secrets.toml is used to configure the sqlalchemy destination:
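The actual secrets.toml was not captured in this page. As a hedged sketch, a sqlalchemy destination configuration for Exasol might look like the following; all values are placeholders, and the exa+websocket driver name is the dialect provided by sqlalchemy-exasol:

```toml
[destination.sqlalchemy.credentials]
# Placeholder values for illustration only.
drivername = "exa+websocket"   # sqlalchemy-exasol websocket dialect
username = "sys"
password = "<password>"
host = "localhost"
port = 8563                    # Exasol's default port
database = "TEST_SCHEMA"
```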
The csv file test_data.csv is used:
Then run the pipeline with:
python test_pipeline.py
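The body of test_pipeline.py was also not captured here. A hypothetical reconstruction, assuming dlt's filesystem source with the read_csv transformer (mentioned later in this thread) and the sqlalchemy destination configured via secrets.toml; paths and names are placeholders:

```python
import dlt
from dlt.sources.filesystem import filesystem, read_csv

# Hypothetical reconstruction of test_pipeline.py (the original script
# is not shown in this issue). Reads the local CSV and loads it into
# the sqlalchemy destination configured in secrets.toml.
files = filesystem(bucket_url=".", file_glob="test_data.csv")
reader = (files | read_csv()).with_name("test_data")

pipeline = dlt.pipeline(
    pipeline_name="test_pipeline",
    destination="sqlalchemy",
    dataset_name="test_schema",
)
print(pipeline.run(reader))
```

Running this requires dlt, sqlalchemy-exasol, and a reachable Exasol instance, so it is a sketch of the reproduction, not a verified script.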
Operating system
Linux
Runtime environment
Docker, Docker Compose
Python version
3.10
dlt data source
verified CSV source:
The csv file test_data.csv is used:
dlt destination
No response
Other deployment details
I ran everything in a Docker container.
Additional information
The following workarounds were executed:
Creating the tables _dlt_loads, _dlt_pipeline_state, _dlt_version, target table test_data: