Spark.jl install error

Good evening. I am attempting to install Spark.jl and an error pops up during the build step. I am not sure what to look for in the following error message. Any help would be appreciated. Thank you.

(v1.0) pkg> build Spark
  Building Spark → `~/.julia/packages/Spark/kFCaM/deps/build.log`
┌ Error: Error building `Spark`: 
│ /usr/bin/mvn
│ WARNING: An illegal reflective access operation has occurred
│ WARNING: Illegal reflective access by com.google.inject.internal.cglib.core.$ReflectUtils$1 (file:/usr/share/maven/lib/guice.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
│ WARNING: Please consider reporting this to the maintainers of com.google.inject.internal.cglib.core.$ReflectUtils$1
│ WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
│ WARNING: All illegal access operations will be denied in a future release
│ [INFO] Scanning for projects...
│ [WARNING] 
│ [WARNING] Some problems were encountered while building the effective model for sparkjl:sparkjl:jar:0.1
│ [WARNING] 'build.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.apache.maven.plugins:maven-enforcer-plugin @ line 338, column 15
│ [WARNING] 'build.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.codehaus.mojo:build-helper-maven-plugin @ line 342, column 15
│ [WARNING] 'build.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin net.alchim31.maven:scala-maven-plugin @ line 372, column 15
│ [WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-source-plugin is missing. @ line 376, column 15
│ [WARNING] 
│ [WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
│ [WARNING] 
│ [WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
│ [WARNING] 
│ [INFO] 
│ [INFO] --------------------------< sparkjl:sparkjl >---------------------------
│ [INFO] Building sparkjl 0.1
│ [INFO] --------------------------------[ jar ]---------------------------------
│ [INFO] 
│ [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ sparkjl ---
│ [INFO] Deleting /home/justin/.julia/packages/Spark/kFCaM/jvm/sparkjl/target
│ [INFO] 
│ [INFO] --- maven-enforcer-plugin:1.1.1:enforce (enforce-versions) @ sparkjl ---
│ [INFO] 
│ [INFO] --- build-helper-maven-plugin:1.7:add-source (add-scala-sources) @ sparkjl ---
│ [INFO] Source directory: /home/justin/.julia/packages/Spark/kFCaM/jvm/sparkjl/src/main/scala added.
│ [INFO] 
│ [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ sparkjl ---
│ [INFO] Using 'UTF-8' encoding to copy filtered resources.
│ [INFO] skip non existing resourceDirectory /home/justin/.julia/packages/Spark/kFCaM/jvm/sparkjl/src/main/resources
│ [INFO] 
│ [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ sparkjl ---
│ [WARNING]  Expected all dependencies to require Scala version: 2.11.8
│ [WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7
│ [WARNING] Multiple versions of scala libraries detected!
│ [INFO] /home/justin/.julia/packages/Spark/kFCaM/jvm/sparkjl/src/main/scala:-1: info: compiling
│ [INFO] Compiling 6 source files to /home/justin/.julia/packages/Spark/kFCaM/jvm/sparkjl/target/scala-2.11/classes at 1573556739370
│ [WARNING] OpenJDK 64-Bit Server VM warning: Ignoring option PermSize; support was removed in 8.0
│ [WARNING] OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
│ [ERROR] error: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
│ [ERROR] 	at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
│ [ERROR] 	at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
│ [ERROR] 	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
│ [ERROR] 	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
│ [ERROR] 	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
│ [ERROR] 	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
│ [ERROR] 	at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
│ [ERROR] 	at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
│ [ERROR] 	at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
│ [ERROR] 	at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
│ [ERROR] 	at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
│ [ERROR] 	at scala.tools.nsc.Global$Run.<init>(Global.scala:1215)
│ [ERROR] 	at scala.tools.nsc.Driver.doCompile(Driver.scala:31)
│ [ERROR] 	at scala.tools.nsc.MainClass.doCompile(Main.scala:23)
│ [ERROR] 	at scala.tools.nsc.Driver.process(Driver.scala:51)
│ [ERROR] 	at scala.tools.nsc.Driver.main(Driver.scala:64)
│ [ERROR] 	at scala.tools.nsc.Main.main(Main.scala)
│ [ERROR] 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
│ [ERROR] 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
│ [ERROR] 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
│ [ERROR] 	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
│ [ERROR] 	at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
│ [ERROR] 	at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
│ [INFO] ------------------------------------------------------------------------
│ [INFO] BUILD FAILURE
│ [INFO] ------------------------------------------------------------------------
│ [INFO] Total time:  3.300 s
│ [INFO] Finished at: 2019-11-12T03:05:39-08:00
│ [INFO] ------------------------------------------------------------------------
│ [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project sparkjl: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
│ [ERROR] 
│ [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
│ [ERROR] Re-run Maven using the -X switch to enable full debug logging.
│ [ERROR] 
│ [ERROR] For more information about the errors and possible solutions, please read the following articles:
│ [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
│ ERROR: LoadError: failed process: Process(`mvn clean package -Dspark.version=2.1.0`, ProcessExited(1)) [1]
│ Stacktrace:
│  [1] error(::String, ::Base.Process, ::String, ::Int64, ::String) at ./error.jl:42
│  [2] pipeline_error at ./process.jl:705 [inlined]
│  [3] #run#505(::Bool, ::Function, ::Cmd) at ./process.jl:663
│  [4] run at ./process.jl:661 [inlined]
│  [5] (::getfield(Main, Symbol("##3#4")))() at /home/justin/.julia/packages/Spark/kFCaM/deps/build.jl:13
│  [6] cd(::getfield(Main, Symbol("##3#4")), ::String) at ./file.jl:96
│  [7] top-level scope at none:0
│  [8] include at ./boot.jl:317 [inlined]
│  [9] include_relative(::Module, ::String) at ./loading.jl:1044
│  [10] include(::Module, ::String) at ./sysimg.jl:29
│  [11] include(::String) at ./client.jl:392
│  [12] top-level scope at none:0
│ in expression starting at /home/justin/.julia/packages/Spark/kFCaM/deps/build.jl:12
└ @ Pkg.Operations /buildworker/worker/package_linux32/build/usr/share/julia/stdlib/v1.0/Pkg/src/Operations.jl:1096

Not sure, but it looks like a mismatch in the Scala/Java/sbt versions. The "object java.lang.Object in compiler mirror not found" error in your trace typically shows up when an older Scala compiler is run on a newer JDK. Which version of Java are you using?
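
If it helps, you can check what the build would pick up directly from the Julia REPL. This is just a quick sketch, assuming java and mvn are on your PATH (it only shells out to them):

# Print the Java and Maven versions that Spark.jl's build would see.
run(`java -version`)   # note: java prints its version to stderr
run(`mvn -version`)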

A quick Google search brings me to this: https://github.com/sbt/sbt/issues/2958, which has some suggestions for troubleshooting.
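
If that issue is the same root cause (older Scala tooling running on a Java 9+ JDK), the common workaround mentioned there is to build with Java 8. A rough sketch of how you could try that from Julia, assuming a Java 8 JDK is installed (the path below is only an example, adjust it for your system):

# Point the build at a Java 8 JDK before rebuilding (hypothetical path).
ENV["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
ENV["PATH"] = joinpath(ENV["JAVA_HOME"], "bin") * ":" * ENV["PATH"]

using Pkg
Pkg.build("Spark")   # the build script shells out to mvn, which respects JAVA_HOME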

Was this issue resolved?