spark-scala-k8-app

A simple example showing how to package Spark Scala code and deploy it on Kubernetes using the spark-on-k8s-operator.

Project structure

(The repository includes a screenshot of the project layout.)
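For orientation, a typical layout for this kind of sbt project might look like the following; the src path is an assumption based on sbt's default conventions, while the other files are the ones referenced in this README.

    spark-scala-k8-app/
    ├── build.sbt
    ├── project/
    │   ├── build.properties      <- pins the sbt version
    │   └── plugins.sbt           <- adds sbt-assembly
    ├── src/
    │   └── main/scala/           <- Spark application code (assumed sbt default layout)
    ├── examples/
    │   ├── spark-scala-k8-app.yaml
    │   └── spark-scala-file-k8-app.yaml
    ├── Dockerfile                <- Spark base image
    └── Dockerfile-app            <- application image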

Set up for packaging

  1. Create a project folder.
  2. Create a plugins.sbt file inside the project folder and add the sbt-assembly plugin. This is required for building a fat jar:
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
    
  3. Create a build.properties file inside the project folder and pin the sbt version:
    sbt.version=1.3.9
    
  4. In build.sbt, add the merge strategy for the assembly task (a complete minimal build.sbt is sketched after this list):
    assemblyMergeStrategy in assembly := {
      case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case "application.conf" => MergeStrategy.concat
      case x => MergeStrategy.first
    }
    

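For reference, a minimal build.sbt that ties these pieces together might look like this. The project name matches the repository; the Scala and Spark versions are assumptions for a Spark 2.4-era project.

    // Minimal build.sbt sketch; Scala and Spark versions are assumptions
    name := "spark-scala-k8-app"
    version := "0.1"
    scalaVersion := "2.11.12"

    // Provided scope: the Spark distribution inside the Docker image supplies
    // these classes at runtime, which keeps the fat jar small and avoids conflicts.
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5" % Provided

    assemblyMergeStrategy in assembly := {
      case "META-INF/services/org.apache.spark.sql.sources.DataSourceRegister" => MergeStrategy.concat
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case "application.conf" => MergeStrategy.concat
      case x => MergeStrategy.first
    }

Running sbt assembly then produces the fat jar under target/scala-2.11/.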
Build Docker images (or skip this step and use a local volume)

  1. Build the Spark base image: docker build -t test/spark-operator:latest .
  2. Build the application image: docker build -f Dockerfile-app -t test/spark-scala-k8-app:latest .
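Dockerfile-app is expected to layer the assembled jar on top of the Spark base image. A minimal sketch, assuming the jar path from the sbt layout above:

    # Hypothetical Dockerfile-app sketch; base image tag and jar path are assumptions
    FROM test/spark-operator:latest

    # Copy the fat jar produced by `sbt assembly` into the image
    COPY target/scala-2.11/spark-scala-k8-app-assembly-0.1.jar /opt/spark/jars/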

Deploy the code on K8s using the Docker image

Execute the following command to run the code on K8s:

kubectl apply -f examples/spark-scala-k8-app.yaml
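The manifest describes a SparkApplication custom resource that the operator picks up and runs. A minimal sketch of what examples/spark-scala-k8-app.yaml might contain follows; the main class, jar path, Spark version, and service account are assumptions.

    # Hypothetical SparkApplication sketch for the image-based deployment
    apiVersion: "sparkoperator.k8s.io/v1beta2"
    kind: SparkApplication
    metadata:
      name: spark-scala-k8-app
      namespace: default
    spec:
      type: Scala
      mode: cluster
      image: "test/spark-scala-k8-app:latest"   # image built from Dockerfile-app
      imagePullPolicy: Never                    # assumes a locally built image
      mainClass: com.example.SparkApp           # assumption: the application's main class
      mainApplicationFile: "local:///opt/spark/jars/spark-scala-k8-app-assembly-0.1.jar"
      sparkVersion: "2.4.5"
      restartPolicy:
        type: Never
      driver:
        cores: 1
        memory: "512m"
        serviceAccount: spark                   # assumption: account allowed to create executor pods
      executor:
        cores: 1
        instances: 1
        memory: "512m"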

(Screenshot: the running application in the Kubernetes Dashboard.)

Deploy the code on K8s using a volume

Execute the following command to run the code on K8s:

kubectl apply -f examples/spark-scala-file-k8-app.yaml
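The volume-based variant mounts the jar from the host instead of baking it into the image. The relevant difference in examples/spark-scala-file-k8-app.yaml would be along these lines; the host path and mount point are assumptions, and the rest of the resource matches the image-based example above.

    # Hypothetical volume section of the SparkApplication
    spec:
      mainApplicationFile: "local:///mnt/jars/spark-scala-k8-app-assembly-0.1.jar"
      volumes:
        - name: app-jars
          hostPath:
            path: /path/to/local/jars   # assumption: host directory containing the fat jar
            type: Directory
      driver:
        volumeMounts:
          - name: app-jars
            mountPath: /mnt/jars
      executor:
        volumeMounts:
          - name: app-jars
            mountPath: /mnt/jars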
