Warnings while building Scala/Spark project with SBT

Asked on December 12, 2018 in Apache-spark.


  • 3 Answer(s)

    One way is to manually tell sbt which dependency versions to use, for instance:

    dependencyOverrides ++= Set(
      "io.netty" % "netty" % "3.9.9.Final",
      "commons-net" % "commons-net" % "2.2",
      "com.google.guava" % "guava" % "11.0.2"
    )
    

    For further details, refer to this link: conflict management in sbt.
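
    If you are on sbt 1.x, note that dependencyOverrides is a Seq rather than a Set there, so a minimal sketch of the equivalent override (reusing the versions from the example above, which are illustrative rather than recommendations) would be:

    dependencyOverrides ++= Seq(
      "io.netty" % "netty" % "3.9.9.Final",
      "commons-net" % "commons-net" % "2.2",
      "com.google.guava" % "guava" % "11.0.2"
    )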

    In such a case, no: the conflicts stem from using only Spark-related artifacts released under the same version. Spark is a project with a large user base, and the possibility of jar hell introduced by transitive dependencies is rather low (although technically not guaranteed).

    Here there is only a small probability of an issue that would need careful manual dependency resolution (if that is possible at all). In such cases it is hard to tell whether there is a real problem before running the app and hitting something like a missing class or method, a mismatched method signature, or a reflection-related error.

    Answered on December 12, 2018.

    In sbt, Spark is normally listed as a Provided dependency, i.e.:

    "org.apache.spark" %% "spark-core" % sparkVersion % Provided
    

    There are unnecessary and conflicting transitive dependencies.
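
    As a rough sketch of how this looks in a full build.sbt (the sparkVersion value and the extra spark-sql module below are illustrative assumptions, not part of the original answer):

    // mark Spark artifacts as Provided so they are compiled against
    // but not packaged into the assembled jar
    val sparkVersion = "2.4.0" // assumed version, adjust to your cluster

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
      "org.apache.spark" %% "spark-sql"  % sparkVersion % Provided
    )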

     

    Answered on December 12, 2018.

    Alternatively, you can disable these eviction warnings by adding this to the build settings:

    evictionWarningOptions in update := EvictionWarningOptions.default
      .withWarnTransitiveEvictions(false)
    
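    If you want to silence the other eviction warning categories as well, here is a sketch using the sbt 1.x slash syntax (this particular combination of options is an assumption, not part of the original answer):

    // disable direct, transitive and Scala-version eviction warnings (sbt 1.x)
    update / evictionWarningOptions :=
      EvictionWarningOptions.default
        .withWarnDirectEvictions(false)
        .withWarnTransitiveEvictions(false)
        .withWarnScalaVersionEviction(false)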

     

    Answered on December 12, 2018.

