How to work efficiently with SBT, Spark and “provided” dependencies?
Update: newer IntelliJ versions offer an ‘Include dependencies with “Provided” scope’ checkbox in the run configuration, which solves this directly.
With IntelliJ, the easy way to make provided dependencies available when debugging is:
- Right-click src/main/scala.
- Select Mark Directory as… > Test Sources Root.
This tells IntelliJ to treat src/main/scala as a test folder, so all dependencies tagged as provided are added to the classpath of any run/debug configuration.
Repeat these steps whenever SBT is refreshed, as IntelliJ will reset the folder to a regular source folder. For reference, a typical provided setup is sketched below.
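For context, this is roughly what a build.sbt marking Spark as provided looks like (a minimal sketch; the version number and artifact list are illustrative):

```scala
// build.sbt — minimal sketch; the version and artifacts are illustrative
val sparkVersion = "3.5.0"

libraryDependencies ++= Seq(
  // "provided" keeps the Spark jars out of the assembly, since the
  // cluster supplies them at runtime
  "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
)
```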
An alternative approach is to create a separate subproject for running the project locally, as follows:
Change the build.sbt file as follows:
```scala
// Spark dependencies, declared once so the scope can vary per project
lazy val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

// The main project uses "provided", keeping Spark out of the assembly
libraryDependencies ++= sparkDependencies.map(_ % "provided")

// A helper subproject that re-adds Spark at "compile" scope for local runs
lazy val localRunner = project.in(file("mainRunner"))
  .dependsOn(RootProject(file(".")))
  .settings(
    libraryDependencies ++= sparkDependencies.map(_ % "compile")
  )
```
After that, run the application locally through the new subproject by selecting “Use classpath of module: localRunner” in the Run Configuration.
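For completeness, here is a minimal sketch of a main class in the root project that localRunner could launch; the package and class names (example.StreamingApp) are hypothetical, and local[*] is used so the job runs inside the IDE without a cluster:

```scala
// src/main/scala/example/StreamingApp.scala — hypothetical main class
package example

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingApp {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark in-process, which works because localRunner
    // pulls the Spark dependencies in at "compile" scope
    val conf = new SparkConf().setMaster("local[*]").setAppName("StreamingApp")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Trivial pipeline: count lines arriving on a local socket
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

The same class should also be launchable from the sbt shell via runMain, which resolves main classes from the subproject’s full classpath:

```
sbt "localRunner/runMain example.StreamingApp"
```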