HmFlashy
(Hugo Maitre)
December 6, 2018, 10:54am
1
Hello,
For a project we need to create a Play web server that uses the Apache Spark libraries.
When launching the server after simply creating a SparkContext, we get this error:
We saw that we would need to run spark-submit to use the Spark libraries, but that requires a main class that a Play application doesn't have.
Hope someone can figure it out.
Best Regards,
Hugo Maitre
aditya
(Aditya Athalye)
December 6, 2018, 12:46pm
2
The SparkContext class is provided by the spark-core library.
Check that you have the following dependency in your build.sbt:
"org.apache.spark" % "spark-core" % "2.4.0"
Version number may be different.
HTH
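One detail worth noting: Spark artifacts are published separately per Scala version, so the `%%` operator matters here. It tells sbt to append the project's Scala binary version to the artifact name. A sketch of the usual form (version numbers are examples):

```scala
// build.sbt
scalaVersion := "2.11.12"

// With %%, sbt resolves this to the artifact "spark-core_2.11",
// matching the scalaVersion above. A plain % would look for an
// artifact literally named "spark-core", which does not exist.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"
```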
HmFlashy
(Hugo Maitre)
December 6, 2018, 1:32pm
3
I was using the org.spark-packages plugin.
I am trying to import the packages one by one.
Thank you
HmFlashy
(Hugo Maitre)
December 6, 2018, 1:53pm
4
Okay,
I used this plugin.
It was too old and its build was failing, so that was probably the cause.
My build.sbt now looks like this.
The dependencyOverrides entries are there because I got an error about incompatible Jackson library versions:
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.9.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.9.7"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.7"
libraryDependencies ++= Seq(
jdbc,
ehcache,
ws,
specs2 % Test,
guice,
"org.apache.spark" %% "spark-core" % "2.4.0",
"org.apache.spark" %% "spark-sql" % "2.4.0",
"org.apache.spark" %% "spark-mllib" % "2.4.0",
"org.apache.spark" %% "spark-streaming" % "2.4.0"
)
And now it works like a charm.
Thank you!
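For anyone landing here later: once spark-core is on the classpath, a SparkContext can be created in local mode directly from the application, with no spark-submit step. A minimal sketch (the object and app names here are hypothetical, not from the thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkSmokeTest extends App {
  // "local[*]" runs Spark inside this JVM using all available cores,
  // so no external cluster and no spark-submit are needed.
  val conf = new SparkConf()
    .setAppName("play-spark-demo") // arbitrary app name
    .setMaster("local[*]")
  val sc = new SparkContext(conf)

  // Quick sanity check that the context actually works.
  val sum = sc.parallelize(1 to 100).sum()
  println(s"sum = $sum") // prints "sum = 5050.0"

  sc.stop()
}
```

In a real Play application the context would typically be created once (for example in a Guice-provided singleton) rather than in a throwaway `App` object, since SparkContext is expensive to construct and only one may be active per JVM.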