build.sbt: how to add spark dependencies

Asked on December 21, 2018 in Apache-spark.


  • 2 Answer(s)

    The issue here is a mix-up between Scala 2.11 and 2.10 artifacts. We have:

    scalaVersion := "2.11.8"
    

    followed by:

    libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
    

    which explicitly requests the 2.10 artifact. On top of that, we are mixing several Spark versions instead of using one consistent version:

    // spark 1.6.1
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
     
    // spark 1.4.1
    libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
     
    // spark 0.9.0-incubating
    libraryDependencies += "org.apache.spark" % "spark-streaming-twitter_2.10" % "0.9.0-incubating"
    

    Both issues can be solved with a build.sbt like this:

    name := "hello"
     
    version := "1.0"
     
    scalaVersion := "2.11.8"
     
    val sparkVersion = "1.6.1"
     
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion,
      "org.apache.spark" %% "spark-streaming" % sparkVersion,
      "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
    )
    

    There is no need to add the twitter4j dependencies manually, since they are pulled in transitively by spark-streaming-twitter.
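
    As a side note, the `%%` operator is what keeps the Scala binary version consistent: sbt appends the `scalaVersion` binary suffix to the artifact name automatically. A small sketch (assuming `scalaVersion := "2.11.8"` as in the build above), where the two declarations resolve to the same artifact:

    ```scala
    // With scalaVersion := "2.11.8", %% appends the Scala binary
    // version suffix (_2.11) to the artifact name, so these two
    // lines request the exact same artifact:
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.6.1"
    ```

    Using `%%` everywhere means a later change of `scalaVersion` cannot silently reintroduce the 2.10/2.11 mismatch.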

     

    Answered on December 21, 2018.

    Try the build.sbt below; it works:

    name := "spark_local"
     
    version := "0.1"
     
    scalaVersion := "2.11.8"
     
    libraryDependencies ++= Seq(
      "org.twitter4j" % "twitter4j-core" % "3.0.5",
      "org.twitter4j" % "twitter4j-stream" % "3.0.5",
      "org.apache.spark" %% "spark-core" % "2.0.0",
      "org.apache.spark" %% "spark-sql" % "2.0.0",
      "org.apache.spark" %% "spark-mllib" % "2.0.0",
      "org.apache.spark" %% "spark-streaming" % "2.0.0"
    )
    
    Answered on December 21, 2018.
