Spark 2.0 missing spark implicits

Asked on January 11, 2019 in Apache-spark.


  • 2 Answers

    Here spark.implicits is not a package; spark is the identifier of a SparkSession instance, so the import only compiles once such an instance is in scope.

    Once spark is defined, the following import works:

    import spark.implicits._
    

    For example, when the SparkSession is bound to the name mySpark, import the implicits from that instance:

    val mySpark = SparkSession
      .builder()
      .appName("Spark SQL basic example")
      .config("spark.some.config.option", "some-value")
      .getOrCreate()
     
    // For implicit conversions like converting RDDs to DataFrames
    import mySpark.implicits._
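
    As a minimal sketch of what these implicits enable (the Person case class and sample rows below are assumptions for illustration, not part of the question):

    // Illustrative sketch: Person and the sample data are assumptions
    case class Person(name: String, age: Int)

    val people = Seq(Person("Ann", 30), Person("Bob", 25))

    // toDS() on a local Seq is provided by mySpark.implicits._
    val peopleDS = people.toDS()

    // converting an RDD to a DataFrame also relies on these implicits
    val peopleDF = mySpark.sparkContext.parallelize(people).toDF()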
    
    Answered on January 11, 2019.

    The identifier for the SparkSession does not have to be spark; in this answer it is bound to ss:

    val ss = SparkSession
      .builder()
      .appName("test")
      .master("local[2]")
      .getOrCreate()
    

    The implicits are then imported from that instance:

    import ss.implicits._
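
    As a quick illustration (the tuple data and column names here are assumptions), the implicits now make toDF available on local collections:

    // Illustrative: toDF on a local Seq, enabled by ss.implicits._
    val df = Seq(("a", 1), ("b", 2)).toDF("letter", "number")
    df.show()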
    
    Answered on January 11, 2019.

