Connecting to the Hive metastore on HDFS using Hive.jl or Spark.jl

Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend

This looks like a mismatch between your Hadoop/Spark installation and the versions of the base libraries that Spark.jl was built against. Please ask your Hadoop admin about:

  • Spark version
  • Hadoop version
  • YARN version

If they are different from what Spark.jl uses, try changing the versions in Spark.jl's build file to the ones in your cluster, rebuild the package (Pkg.build("Spark")) and try again.
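If you have shell access to a cluster node, you can usually check these versions yourself instead of waiting for the admin. This is a sketch assuming the standard Spark/Hadoop CLIs are on your PATH; the fallback messages are just for the case where a tool is missing:

```shell
# Query the cluster's component versions (run on a cluster/edge node).
# Assumes spark-submit, hadoop and yarn CLIs are installed and on PATH.
spark-submit --version 2>&1 | head -n 5 || echo "spark-submit not found"
hadoop version 2>&1 | head -n 1       || echo "hadoop not found"
yarn version 2>&1 | head -n 1         || echo "yarn not found"
```

Compare the printed versions against the ones Spark.jl builds with; any difference in the major/minor Spark or Hadoop version is a likely cause of the CoarseGrainedExecutorBackend error.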

It's quite unlikely to work, but you may also try the solution from here, e.g.:

sess = SparkSession(master="yarn-client", 
                    config=Dict("spark.driver.extraJavaOptions" => "-Diop.version=4.1.0.0"))