Running spark scala example fails

Asked on January 8, 2019 in Apache-spark.


  • 3 Answers

    spark-core_2.10 is built for use with 2.10.x versions of Scala.

    Use this instead:

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
    

    With the %% operator, sbt appends the Scala binary version for you, so the correct _2.10 or _2.11 artifact is selected automatically.

    Note: always compile against the same versions of Scala and Spark as the ones running on the cluster.
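As a quick sanity check (a small sketch, not part of the original answer), you can read the Scala version your program actually runs on via the standard library's scala.util.Properties and compare its binary part against the _2.10 / _2.11 suffix in build.sbt:

```scala
// Derive the Scala binary version (e.g. "2.11") from a full version
// string (e.g. "2.11.12"); the artifact suffix in spark-core_2.11
// must match this value.
object VersionCheck {
  def binaryVersion(full: String): String =
    full.split('.').take(2).mkString(".")

  def main(args: Array[String]): Unit = {
    // Version of the Scala library this program is running against.
    val running = scala.util.Properties.versionNumberString
    println(s"Scala binary version: ${binaryVersion(running)}")
  }
}
```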

    Answered on January 8, 2019.

    Alternatively, the problem can be solved by downgrading the Scala version to 2.10.4:

    name := "test-one"
     
    version := "1.0"
     
    //scalaVersion := "2.11.2"
    scalaVersion := "2.10.4"
     
    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
    
    Answered on January 8, 2019.

    Alternatively, try the following build settings:

    scalaVersion := "2.11.1"
    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.11" % "2.2.0",
      "org.apache.spark" % "spark-sql_2.11" % "2.2.0"
    )
    

    This works because the _2.11 artifact suffix matches the declared scalaVersion.
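For reference, the same dependencies can be written with the %% operator so sbt appends the binary-version suffix itself, keeping it in sync with scalaVersion (a sketch of equivalent sbt settings, not taken from the answer):

```scala
// build.sbt fragment: %% appends "_2.11" automatically,
// derived from the scalaVersion setting below.
scalaVersion := "2.11.1"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql"  % "2.2.0"
)
```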

    Answered on January 8, 2019.

