In this post I'll share how to integrate Apache Spark into a Spring Boot application.

Spark is usually run in a multi-node/cluster environment. But if you are running a single-node cluster and want to use Spring Boot to submit jobs and show the workflow results somewhere in your web application, this approach is a straightforward way to do it: a Spring REST API launches Spark jobs and returns the computed results as the response.
yashwanth2804 / spring-spark-example
An example of setting up Spring-Boot with Spark.
Job Submissions
Typical way
- CLI
```shell
# Run application locally on 8 cores
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master local[8] \
  /path/to/examples.jar \
  100
```
- CURL
```shell
curl -X POST http://master-host:6066/v1/submissions/create \
  --header "Content-Type:application/json" \
  --data '{
    "action": "CreateSubmissionRequest",
    "appResource": "hdfs://localhost:9000/user/spark-examples_2.11-2.0.0.jar",
    "clientSparkVersion": "2.0.0",
    "appArgs": [ "10" ],
    "environmentVariables": { "SPARK_ENV_LOADED": "1" },
    "mainClass": "org.apache.spark.examples.SparkPi",
    "sparkProperties": {
      "spark.jars": "hdfs://localhost:9000/user/spark-examples_2.11-2.0.0.jar",
      "spark.driver.supervise": "false",
      "spark.executor.memory": "512m",
      "spark.driver.memory": "512m",
      "spark.submit.deployMode": "cluster",
      "spark.app.name": "SparkPi",
      "spark.master": "spark://master-host:6066"
    }
  }'
```

(Note the URL follows `-X POST` directly; passing it to `-d` would send it as the request body instead of using it as the target.)
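The same `CreateSubmissionRequest` body can be built from application code instead of a shell. A minimal plain-Java sketch, assuming the same placeholder jar path, class name, and master host as the curl example above (the class and method names here are hypothetical; the resulting string could be POSTed with `java.net.http.HttpClient`):

```java
// Builds the JSON body accepted by Spark's standalone REST submission endpoint
// (http://master-host:6066/v1/submissions/create). All paths/hosts are the
// placeholders from the curl example, not real endpoints.
public class SubmissionRequest {

    public static String buildBody(String appResource, String mainClass, String arg) {
        return "{\n"
            + "  \"action\": \"CreateSubmissionRequest\",\n"
            + "  \"appResource\": \"" + appResource + "\",\n"
            + "  \"clientSparkVersion\": \"2.0.0\",\n"
            + "  \"appArgs\": [\"" + arg + "\"],\n"
            + "  \"environmentVariables\": {\"SPARK_ENV_LOADED\": \"1\"},\n"
            + "  \"mainClass\": \"" + mainClass + "\",\n"
            + "  \"sparkProperties\": {\n"
            + "    \"spark.jars\": \"" + appResource + "\",\n"
            + "    \"spark.driver.supervise\": \"false\",\n"
            + "    \"spark.executor.memory\": \"512m\",\n"
            + "    \"spark.driver.memory\": \"512m\",\n"
            + "    \"spark.submit.deployMode\": \"cluster\",\n"
            + "    \"spark.app.name\": \"SparkPi\",\n"
            + "    \"spark.master\": \"spark://master-host:6066\"\n"
            + "  }\n"
            + "}";
    }
}
```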
Spring Boot REST API
http://localhost:8056/api/mongodb
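Behind an endpoint like this, one simple way for the Spring service to launch a job is to shell out to `spark-submit`, mirroring the CLI example earlier. A minimal plain-Java sketch, where the Spark home and jar paths are placeholders (Spark also ships an `org.apache.spark.launcher.SparkLauncher` API for doing this without assembling a command line by hand):

```java
import java.util.ArrayList;
import java.util.List;

// Assembles the spark-submit command line; a Spring @Service could run it
// with ProcessBuilder and return the job's output as the REST response.
public class SparkSubmitCommand {

    public static List<String> build(String sparkHome, String mainClass,
                                     String master, String appJar, String... args) {
        List<String> cmd = new ArrayList<>();
        cmd.add(sparkHome + "/bin/spark-submit");
        cmd.add("--class");
        cmd.add(mainClass);
        cmd.add("--master");
        cmd.add(master);
        cmd.add(appJar);
        for (String a : args) {
            cmd.add(a);
        }
        return cmd;
    }

    // Launches the job and blocks until it exits (requires a local Spark install,
    // so it is only a sketch here).
    public static int run(List<String> cmd) throws Exception {
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        return p.waitFor();
    }
}
```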