Introduction to Spark with Scala
Himanshu Gupta, Software Consultant, Knoldus Software LLP
Who am I?
Himanshu Gupta (@himanshug735)
Software Consultant at Knoldus Software LLP
Spark & Scala enthusiast
Agenda
● What is Spark?
● Why do we need Spark?
● Brief introduction to RDD
● Brief introduction to Spark Streaming
● How to install Spark?
● Demo
What is Apache Spark?
A fast and general engine for large-scale data processing, with libraries for SQL, streaming, and advanced analytics.
Spark History
2009: Project begins at UC Berkeley AMP Lab
2010: Open sourced
2013: Enters the Apache Incubator; Cloudera announces support; Spark Summit 2013
2014: Becomes an Apache top-level project; Spark Summit 2014
2015: DataFrames
Spark Stack
Img src - http://spark.apache.org/
Fastest Growing Open Source Project
Img src - https://databricks.com/blog/2015/03/31/spark-turns-five-years-old.html
Agenda
● What is Spark?
● Why do we need Spark?
● Brief introduction to RDD
● Brief introduction to Spark Streaming
● How to install Spark?
● Demo
Code Size
Img src - http://spark-summit.org/wp-content/uploads/2013/10/Zaharia-spark-summit-2013-matei.pdf
Word Count Example

Hadoop MapReduce (Java):

public static class WordCountMapClass extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, IntWritable> {

  private final static IntWritable one = new IntWritable(1);
  private Text word = new Text();

  public void map(LongWritable key, Text value,
                  OutputCollector<Text, IntWritable> output,
                  Reporter reporter) throws IOException {
    String line = value.toString();
    StringTokenizer itr = new StringTokenizer(line);
    while (itr.hasMoreTokens()) {
      word.set(itr.nextToken());
      output.collect(word, one);
    }
  }
}

public static class WordCountReduce extends MapReduceBase
    implements Reducer<Text, IntWritable, Text, IntWritable> {

  public void reduce(Text key, Iterator<IntWritable> values,
                     OutputCollector<Text, IntWritable> output,
                     Reporter reporter) throws IOException {
    int sum = 0;
    while (values.hasNext()) {
      sum += values.next().get();
    }
    output.collect(key, new IntWritable(sum));
  }
}

Spark (Scala):

val file = spark.textFile("hdfs://...")
val counts = file.flatMap(line => line.split(" "))
                 .map(word => (word, 1))
                 .reduceByKey(_ + _)
counts.saveAsTextFile("hdfs://...")
Daytona GraySort Record: Data to sort = 100 TB
Hadoop (2013): 2100 nodes, 72 minutes
Spark (2014): 206 nodes, 23 minutes
Img src - http://www.slideshare.net/databricks/new-directions-for-apache-spark-in-2015
Runs Everywhere
Img src - http://spark.apache.org/
Who is using Apache Spark?
Img src - http://www.slideshare.net/datamantra/introduction-to-apache-spark-45062010
Agenda
● What is Spark?
● Why do we need Spark?
● Brief introduction to RDD
● Brief introduction to Spark Streaming
● How to install Spark?
● Demo
Brief Introduction to RDD
● RDD stands for Resilient Distributed Dataset
● An RDD is a fault-tolerant, distributed collection of objects
● In Spark, all work is expressed in one of the following ways:
  1) Creating new RDD(s)
  2) Transforming existing RDD(s)
  3) Calling operations on RDD(s)
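For instance, here is a minimal sketch of those three kinds of work (not from the slides; it assumes a SparkContext named sc is already available, as set up in the example that follows):

val nums  = sc.parallelize(1 to 100) // 1) create a new RDD from a local collection
val evens = nums.filter(_ % 2 == 0)  // 2) transform it into a new RDD (lazy, nothing runs yet)
val total = evens.count()            // 3) call an action, which actually executes the job

Transformations such as filter are lazy; work happens only when an action such as count is called.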
Example (RDD)

val master = "local"
val conf = new SparkConf().setMaster(master)

This is the Spark configuration.
Example (RDD)

val master = "local"
val conf = new SparkConf().setMaster(master)
val sc = new SparkContext(conf)

This is the Spark context.
Contd...
Example (RDD)

val master = "local"
val conf = new SparkConf().setMaster(master)
val sc = new SparkContext(conf)
val lines = sc.textFile("demo.txt")

Extract lines from the text file.
Contd...
Example (RDD)

val master = "local"
val conf = new SparkConf().setMaster(master)
val sc = new SparkContext(conf)
val lines = sc.textFile("demo.txt")
val words = lines.flatMap(_.split(" ")).map((_,1))

Split each line into words and map each word to a (word, 1) pair.
Contd...
Example (RDD)

val master = "local"
val conf = new SparkConf().setMaster(master)
val sc = new SparkContext(conf)
val lines = sc.textFile("demo.txt")
val words = lines.flatMap(_.split(" ")).map((_,1))
val wordCountRDD = words.reduceByKey(_ + _)

reduceByKey sums the counts per word, producing the word-count RDD.
Contd...
Example (RDD)

val master = "local"
val conf = new SparkConf().setMaster(master)
val sc = new SparkContext(conf)
val lines = sc.textFile("demo.txt")
val words = lines.flatMap(_.split(" ")).map((_,1))
val wordCountRDD = words.reduceByKey(_ + _)
val wordCount = wordCountRDD.collect

collect is an action: it starts the computation and returns the result as an array of (word, count) pairs.
Contd...
Example (RDD)

val master = "local"
val conf = new SparkConf().setMaster(master)
val sc = new SparkContext(conf)
val lines = sc.textFile("demo.txt")
val words = lines.flatMap(_.split(" ")).map((_,1))
val wordCountRDD = words.reduceByKey(_ + _)
val wordCount = wordCountRDD.collect

flatMap, map and reduceByKey are transformations; collect is an action.
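Assembled into a self-contained program, the example looks roughly as follows (a sketch based on the slides above; the object wrapper, setAppName, the final println and sc.stop() are additions, and Spark requires an application name to be set in the configuration):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._ // pair-RDD operations such as reduceByKey (needed on Spark 1.2.x)

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("Word Count")
    val sc = new SparkContext(conf)

    val lines = sc.textFile("demo.txt")
    val words = lines.flatMap(_.split(" ")).map((_, 1))
    val wordCountRDD = words.reduceByKey(_ + _)
    val wordCount = wordCountRDD.collect() // action: triggers the whole computation

    wordCount.foreach(println)
    sc.stop()
  }
}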
Agenda
● What is Spark?
● Why do we need Spark?
● Brief introduction to RDD
● Brief introduction to Spark Streaming
● How to install Spark?
● Demo
Brief Introduction to Spark Streaming
Img src - http://spark.apache.org/
How Does Spark Streaming Work?
Img src - http://spark.apache.org/
Why Do We Need Spark Streaming?

High-level API:

TwitterUtils.createStream(...)
  .filter(_.getText.contains("Spark"))
  .countByWindow(Seconds(10), Seconds(5)) // counting tweets on a sliding window

Fault tolerant:
Integration: Integrated with Spark SQL, MLlib, GraphX...
Img src - http://spark.apache.org/
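In countByWindow(Seconds(10), Seconds(5)), the first argument is the window length and the second is the slide interval: each result counts the tweets of the last 10 seconds, recomputed every 5 seconds. Stateful window operations like this also need a checkpoint directory to be set, along these lines (the path here is illustrative):

ssc.checkpoint("checkpoint") // enables the state tracking that windowed counting relies on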
Example (Spark Streaming)

val master = "local"
val conf = new SparkConf().setMaster(master)

Specify the Spark configuration.
Example (Spark Streaming)

val master = "local"
val conf = new SparkConf().setMaster(master)
val ssc = new StreamingContext(conf, Seconds(10))

Set up the streaming context with a 10-second batch interval.
Contd...
Example (Spark Streaming)

val master = "local"
val conf = new SparkConf().setMaster(master)
val ssc = new StreamingContext(conf, Seconds(10))
val lines = ssc.socketTextStream("localhost", 9999)

socketTextStream returns a ReceiverInputDStream: a DStream holding the lines received in each batch interval (time 0-1, 1-2, 2-3, ...).
Contd...
Example (Spark Streaming)

val master = "local"
val conf = new SparkConf().setMaster(master)
val ssc = new StreamingContext(conf, Seconds(10))
val lines = ssc.socketTextStream("localhost", 9999)
val words = lines.flatMap(_.split(" ")).map((_, 1))

Creates a new DStream (a sequence of RDDs) of (word, 1) pairs.
Contd...
Example (Spark Streaming)

val master = "local"
val conf = new SparkConf().setMaster(master)
val ssc = new StreamingContext(conf, Seconds(10))
val lines = ssc.socketTextStream("localhost", 9999)
val words = lines.flatMap(_.split(" ")).map((_, 1))
val wordCounts = words.reduceByKey(_ + _)

reduceByKey runs on each batch, producing the word-count DStream.
Contd...
Example (Spark Streaming)

val master = "local"
val conf = new SparkConf().setMaster(master)
val ssc = new StreamingContext(conf, Seconds(10))
val lines = ssc.socketTextStream("localhost", 9999)
val words = lines.flatMap(_.split(" ")).map((_, 1))
val wordCounts = words.reduceByKey(_ + _)
ssc.start()

Starts the streaming computation.
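As a runnable program, the streaming example would look roughly like this (a sketch; the object wrapper, the "local[2]" master, print() and awaitTermination() are additions needed to actually run it: a socket receiver occupies one thread, so at least two local threads are required, and a DStream needs an output operation such as print()):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._ // pair-DStream operations such as reduceByKey (Spark 1.2.x)

object StreamingWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("Streaming Word Count")
    val ssc = new StreamingContext(conf, Seconds(10))

    val lines = ssc.socketTextStream("localhost", 9999)
    val words = lines.flatMap(_.split(" ")).map((_, 1))
    val wordCounts = words.reduceByKey(_ + _)
    wordCounts.print() // output operation: without one, the job has nothing to compute

    ssc.start()            // start receiving and processing
    ssc.awaitTermination() // keep the application alive
  }
}

For a quick test, feed text to the socket with nc -lk 9999 in another terminal.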
Agenda
● What is Spark?
● Why do we need Spark?
● Brief introduction to RDD
● Brief introduction to Spark Streaming
● How to install Spark?
● Demo
How to Install Spark?
● Download Spark from http://spark.apache.org/downloads.html
● Extract it to a suitable directory.
● If you downloaded the source package, go to that directory in a terminal and build it with:
  mvn -DskipTests clean package
  (pre-built packages can skip this step)
● Now Spark is ready to run in interactive mode:
  ./bin/spark-shell
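Inside the shell, the earlier word-count example can be tried interactively; spark-shell pre-creates a SparkContext bound to the name sc (a sketch, with demo.txt standing in for any local text file):

scala> val lines = sc.textFile("demo.txt")
scala> val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
scala> counts.collect().foreach(println)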
sbt Setup

name := "Spark Demo"

version := "1.0"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.1",
  "org.apache.spark" %% "spark-streaming" % "1.2.1",
  "org.apache.spark" %% "spark-sql" % "1.2.1",
  "org.apache.spark" %% "spark-mllib" % "1.2.1"
)
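Here the %% operator appends the Scala binary version from scalaVersion to the artifact name, which keeps the Spark artifacts in sync with the project's Scala version; the two lines below resolve to the same dependency:

"org.apache.spark" %% "spark-core" % "1.2.1"
"org.apache.spark" %  "spark-core_2.10" % "1.2.1"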
Agenda
● What is Spark?
● Why do we need Spark?
● Brief introduction to RDD
● Brief introduction to Spark Streaming
● How to install Spark?
● Demo
Demo
Download Code
https://github.com/knoldus/spark-scala
References
http://spark.apache.org/
http://spark-summit.org/2014
http://spark.apache.org/docs/latest/quick-start.html
http://stackoverflow.com/questions/tagged/apache-spark
https://www.youtube.com/results?search_query=apache+spark
http://apache-spark-user-list.1001560.n3.nabble.com/
http://www.slideshare.net/paulszulc/apache-spark-101-in-50-min
Thanks
Presenter: himanshu@knoldus.com, @himanshug735
Organizer: @Knolspeak, http://www.knoldus.com, http://blog.knoldus.com
