Hi there,
I'm running a Hadoop cluster (v3.2.1) using https://github.com/big-data-europe/docker-hadoop.
I can run a Java program to test the existence of a file on the cluster's HDFS.
But I cannot do this using Elly.jl:
yields
I'm using Julia 1.10.4 and Elly 0.5.1.
To access HDFS from Java I had to add some dependencies (hadoop-common, hadoop-hdfs, hadoop-hdfs-client) and copy core-site.xml and hdfs-site.xml from the container into the resources directory of the Java application. Optionally, I added lib/native from the container's /opt/hadoop directory to LD_LIBRARY_PATH to prevent the warning WARN NativeCodeLoader:60 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable.
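For reference, here is a minimal sketch of the kind of Java exists-check I mean, assuming core-site.xml and hdfs-site.xml are on the classpath (e.g. in src/main/resources) and the Hadoop dependencies above are on the build path. The class name and the path /user/para/test.txt are placeholders, not my actual code. The NativeCodeLoader warning only concerns the native libraries and does not stop this check from working.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExistsCheck {
    public static void main(String[] args) throws Exception {
        // Configuration() reads core-site.xml / hdfs-site.xml from the classpath,
        // so fs.defaultFS points at the docker-hadoop namenode.
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(conf)) {
            // Placeholder path; pass a real HDFS path as the first argument.
            Path p = new Path(args.length > 0 ? args[0] : "/user/para/test.txt");
            System.out.println(p + " exists: " + fs.exists(p));
        }
    }
}
```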
I suppose I have to perform at least some of the above Java steps when using Elly.jl ... but which ones, and how?
Please help, I would LOVE to use hadoop from Julia.
Greetings
Para