Spark Shell "Failed to Initialize Compiler" Error on a Mac

I just installed Spark on my new machine and got the following error after installing Java, Scala, and Apache Spark using Homebrew. The install steps were:

$ brew cask install java
$ brew install scala
$ brew install apache-spark

Once installed, when I try to run a basic example using spark-shell, I get the following error. Any help is greatly appreciated.

$ spark-shell
 Using Spark's default log4j profile: org/apache/spark/
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.NullPointerException
    at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:98)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at org.apache.spark.repl.Main$.doMain(Main.scala:70)
    at org.apache.spark.repl.Main$.main(Main.scala:53)
    at org.apache.spark.repl.Main.main(Main.scala)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(
    at java.base/java.lang.reflect.Method.invoke(
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


Spark is incompatible with Java 9, which is the version brew cask install java installs if it is up to date. If you did install Java 9, you need to install Java 8 instead:

brew cask uninstall java
brew tap caskroom/versions
brew cask search java
brew cask install java8
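If both JDKs end up installed side by side, you may also need to point JAVA_HOME at the 1.8 JDK explicitly. A sketch for ~/.bash_profile, using /usr/libexec/java_home (a standard macOS tool that prints the home directory of a matching JDK):

```shell
# Pin JAVA_HOME to the installed JDK 8 so spark-shell picks it up,
# even if a newer JDK is also present.
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
export PATH="$JAVA_HOME/bin:$PATH"
```

Open a new terminal (or `source ~/.bash_profile`) for the change to take effect.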

As ZackK wrote, Spark is incompatible with Java 9, so you can check which Java versions are installed on your machine and choose a compatible one, assuming you have one.

$ sudo update-alternatives --config java

Which in my case returned:

There are 2 choices for the alternative java (providing /usr/bin/java).

    0   /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java   1081   auto mode
  * 1   /opt/java/jdk-9.0.4/bin/java                     1      manual mode

The asterisk in front of 1 denotes the active version. Choosing 0 changed it to a compatible version.
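If you would rather skip the interactive prompt, the same switch can be done non-interactively with --set (a sketch; the path below is the choice-0 path from the listing above and will differ on your machine):

```shell
# Select the JDK 8 binary directly instead of answering the menu prompt.
sudo update-alternatives --set java /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
```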

$ java -version

which returned: openjdk version "1.8.0_151"
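To check the major version in a script rather than by eye, you could parse the version string. A small sketch: java_major_version is a hypothetical helper (not a real command), and it relies on the fact that pre-Java-9 version strings look like "1.8.0_151" while Java 9+ strings look like "9.0.4":

```shell
# Hypothetical helper: extract the major Java version from a version string.
java_major_version() {
  ver="$1"
  case "$ver" in
    1.*) echo "$ver" | cut -d. -f2 ;;  # "1.8.0_151" -> 8
    *)   echo "$ver" | cut -d. -f1 ;;  # "9.0.4"     -> 9
  esac
}

java_major_version "1.8.0_151"  # prints 8
java_major_version "9.0.4"      # prints 9
```

A result other than 8 means spark-shell will likely fail with the error above.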

After the change spark-shell worked.

I faced the same problem. When I checked my laptop's Java version, it was 9. I changed to Java 8 and found everything working fine.

Just check this solution. Hope it works if you are getting the exact same error as at the start of this thread.

On Windows 10 you have to switch to JDK 8: set JAVA_HOME to the JDK 8 path, and remove C:\ProgramData\Oracle\Java\javapath from PATH (it always points to the JDK 9 binaries).

Problem: Spark incompatible with current Java version

Here is another solution that uses SDKMAN.

Install SDKMAN:

curl -s "" | bash

Then close and open another terminal.

After that, install Java 8:

sdk install java 8.0.181-zulu

Now, test if it works. Go to your spark/bin directory, then run:

./spark-shell

You should not see that error again.

Updating the alternatives is required not only for java but also for javac, javap, etc.:

$ sudo update-alternatives --config java
$ sudo update-alternatives --config javac
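Since the same switch has to be repeated per tool, it can be sketched as a loop (the exact tool list is an assumption; add or drop tools to match what your JDK packages register):

```shell
# Switch every JDK tool to the chosen alternative, not just java.
for tool in java javac javap jar; do
  sudo update-alternatives --config "$tool"
done
```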

So it is better to remove the Java versions newer than 8, and then install Java 8.
