[SUPPORT] Can't redefine: array #12751
Comments
@rangareddy, thank you for the answer!
Hi @DontSingRus, is there any update with Spark 3.5? If it is working, we will close this issue.
@rangareddy On Spark 3.5 with parquet-common-1.13.1 and parquet-hadoop-1.13.1, everything works fine.
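For reference, a minimal sketch of how the upgraded Parquet artifacts mentioned above might be pinned when building the session. This is an assumption, not the reporter's actual setup: the Maven coordinates and app name are placeholders chosen to match the versions named in the comment.

```python
# Hypothetical sketch: pinning the newer Parquet jars via spark.jars.packages.
# Coordinates/versions mirror the comment above; adjust to your environment.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hudi-parquet-1.13.1")  # placeholder app name
    .config(
        "spark.jars.packages",
        "org.apache.parquet:parquet-common:1.13.1,"
        "org.apache.parquet:parquet-hadoop:1.13.1",
    )
    .getOrCreate()
)
```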
Describe the problem you faced
I have a protobuf source. After parsing it in PySpark into a DataFrame with a fixed structure, I start writing to a Hudi table. The first batch of data writes successfully, but the second write fails with the error: org.apache.avro.SchemaParseException: Can't redefine: array
df structure:
write options:
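The reporter's DataFrame structure and write options were not captured above. Purely as an illustration, here is a hedged sketch of a typical PySpark-to-Hudi upsert of this shape; the table name, key fields, schema (including the array-typed column), and target path are all placeholders, not the reporter's actual configuration.

```python
# Hypothetical sketch only: the reporter's real schema and options are not shown
# in the issue. Illustrates a common Hudi upsert from PySpark with an array column.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hudi-protobuf-ingest").getOrCreate()

# In the real job, df comes from parsing the protobuf source; structure assumed here.
df = spark.createDataFrame(
    [("id-1", "2024-01-01T00:00:00Z", [1, 2, 3])],
    ["record_id", "event_ts", "values"],  # "values" is an array-typed column
)

hudi_options = {
    "hoodie.table.name": "example_table",                    # placeholder
    "hoodie.datasource.write.recordkey.field": "record_id",  # placeholder
    "hoodie.datasource.write.precombine.field": "event_ts",  # placeholder
    "hoodie.datasource.write.operation": "upsert",
}

(
    df.write.format("hudi")
    .options(**hudi_options)
    .mode("append")
    .save("s3a://bucket/path/example_table")                 # placeholder path
)
```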
To Reproduce
I couldn't reproduce the problem. I tried it locally and on a similar server with the same data, and everything was written successfully. On the same server where the error occurred, using different data and a different schema but the same code, writes also still succeed.
Expected behavior
I expected the write to succeed as it did before.
Environment Description
Hudi version : 0.15.0
Spark version : 3.0.3
Storage (HDFS/S3/GCS..) : s3
Running on Docker? (yes/no) : no
jars: parquet-common-1.10.1, parquet-hadoop-1.10.1, spark-avro_2.12-3.0.3
Stacktrace