“INSERT INTO …” with SparkSQL HiveContext



  • 4 Answer(s)

    Use append mode on the DataFrameWriter so the data is appended to an existing Hive table:

    data = hc.sql("select 1 as id, 10 as score")
    data.write.mode("append").saveAsTable("my_table")
    

    The result is the same as an INSERT.

    Answered on January 12, 2019.

    I hit the same problem on Spark 1.5.1 and tried several variants.

    Given the following table definition:

    sqlContext.sql("create table my_table(id int, score int)")
    

    The only variants that worked were:

    sqlContext.sql("insert into table my_table select t.* from (select 1, 10) t")
    sqlContext.sql("insert into my_table select t.* from (select 2, 20) t")
    
    Answered on January 12, 2019.

    Here the saveAsTable solution failed with an AnalysisException. The following works instead:

    data = hc.sql("select 1 as id, 10 as score")
    data.write.mode("append").insertInto("my_table")
    

    This was on Spark 2.1.0.

    Answered on January 12, 2019.

    Two examples:

    1. hiveContext.sql('insert into my_table (id, score) values (1, 10)')

    2. data = hc.sql("select 1 as id, 10 as score")
       data.write.mode("append").insertInto("my_table")
    Answered on January 13, 2019.

