Warnings while building Scala/Spark project with SBT
One way is to manually tell sbt which dependency versions to use, for instance:
dependencyOverrides ++= Set(
  "io.netty" % "netty" % "3.9.9.Final",
  "commons-net" % "commons-net" % "2.2",
  "com.google.guava" % "guava" % "11.0.2"
)
For further details, refer to this link: conflict management in sbt.
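Note that in sbt 1.x the `dependencyOverrides` key changed from a `Set[ModuleID]` to a `Seq[ModuleID]`, so on newer sbt versions the equivalent override would look like this (same artifacts as above, shown only as a sketch):

```scala
// build.sbt (sbt 1.x) -- dependencyOverrides is a Seq here, not a Set
dependencyOverrides ++= Seq(
  "io.netty"         % "netty"       % "3.9.9.Final",
  "commons-net"      % "commons-net" % "2.2",
  "com.google.guava" % "guava"       % "11.0.2"
)
```

Overrides only pin the version chosen during conflict resolution; they do not add the dependency if nothing else pulls it in.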
In this case – no, since the conflicts stem solely from Spark-related artifacts released under the same version. Spark is a project with a large user base, and the possibility of jar hell introduced through transitive dependencies is rather low (although technically not guaranteed).
Here there is a small probability of an issue that may need careful manual dependency resolution (if that is possible at all). In such cases it is really hard to tell whether there is a problem before running the app and hitting something like a missing class or method, a mismatched method signature, or a reflection-related failure.
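One way to surface such conflicts before runtime is sbt's built-in `evicted` task, which lists dependencies whose requested versions were replaced during resolution and warns about potentially binary-incompatible evictions. On sbt 1.4+ the `dependencyTree` task is also available out of the box for tracing where each conflicting version comes from (on older sbt versions it requires the sbt-dependency-graph plugin):

```
sbt> evicted
sbt> dependencyTree
```

Neither task proves the app will run cleanly, but a suspicious eviction (e.g. a major-version jump in guava or netty) is a good hint that a manual override is worth considering.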