libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
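For reference, a minimal build.sbt might look like the following. The project name, version, and Scala version here are placeholders, not values from the original post; Spark 1.5.1 was built against Scala 2.10 by default, so the Scala version should match your Spark distribution.

name := "spark-example"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"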
Furthermore, since a Spark application must be packaged, together with its dependencies, into a single jar before it can be submitted to a cluster, an sbt plugin must be added for the packaging. Create a file named "assembly.sbt" in the "project" folder under the root directory and put in the following content:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
Now, in the command line, navigate to the project root directory. Once your Spark Scala code is in place, run "sbt compile" and "sbt package" (or "sbt assembly" if you want the fat jar produced by the plugin above), then either "sbt run" to run locally or "spark-submit" to submit the jar to a Spark cluster.
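As a sketch of what such a Spark Scala program might look like, here is a hypothetical application (the object name SimpleApp and the input path argument are illustrative, not from the original post) that counts the lines of a text file:

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical example application: counts the lines of a text file.
object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val lines = sc.textFile(args(0))   // input path passed on the command line
    println(s"Number of lines: ${lines.count()}")
    sc.stop()
  }
}

A submission to a cluster could then look roughly like the line below; the master URL, jar path, and input path are assumptions that depend on your cluster setup and on the project name and version in build.sbt.

spark-submit --class SimpleApp --master spark://master:7077 target/scala-2.10/spark-example-assembly-1.0.jar /path/to/input.txt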