
@RAbraham
Last active August 3, 2021 22:32
Execute Apache Spark in a Scala IDE worksheet
package org.apache.spark.graphx

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark._

object repl {
  val sc = new SparkContext("local", "test")
  //> sc : org.apache.spark.SparkContext = org.apache.spark.SparkContext@3724af13

  val vertices = sc.parallelize(1L to 5L)
  //> vertices : org.apache.spark.rdd.RDD[Long] = ParallelCollectionRDD[0] at parallelize at org.apache.spark.graphx.repl.scala:15

  println(vertices.count) //> 5
}
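For reference, the same example can also be run as a standalone application instead of a worksheet. This is a minimal sketch, assuming a Spark version matching the imports above; the object name `ReplApp` is made up for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ReplApp {
  def main(args: Array[String]): Unit = {
    // "local" runs Spark in-process with one thread; "test" is just the app name
    val conf = new SparkConf().setMaster("local").setAppName("test")
    val sc = new SparkContext(conf)

    // Distribute the range 1..5 as an RDD[Long] and count its elements
    val vertices = sc.parallelize(1L to 5L)
    println(vertices.count) // 5

    sc.stop() // release the context when done
  }
}
```

Unlike the worksheet, a standalone app needs the explicit `sc.stop()` so the local Spark context shuts down cleanly.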
@aloneguid

@detectivebag Eclipse compatibility mode solves the issue, thanks!

@gjke

gjke commented Nov 21, 2017

Yes, it solved it for me too. There are four options there (IntelliJ IDEA Community Edition 2016.3 on macOS):

  1. Run worksheet in the compiler
  2. Run worksheet in the interactive mode
  3. Use "eclipse compatibility" mode
  4. Treat Scala scratch files as worksheet files

It worked for me when I checked only the third option and left the other three unchecked.

@paulochf

paulochf commented Aug 3, 2021

Still works! Thank you!
