How to load IPython shell with PySpark

Asked on January 11, 2019 in Apache-spark.

  • 3 Answer(s)

    For Spark versions older than 1.2, bin/pyspark is launched with the environment variable IPYTHON=1:

    IPYTHON=1 /path/to/bin/pyspark
    

    or

    export IPYTHON=1
    /path/to/bin/pyspark
    

    This also works in Spark 1.2. For Spark 1.2 and later, the variable for setting the Python driver environment is PYSPARK_DRIVER_PYTHON:

    PYSPARK_DRIVER_PYTHON=ipython /path/to/bin/pyspark
    

    or

    export PYSPARK_DRIVER_PYTHON=ipython
    /path/to/bin/pyspark
    

    In either case, ipython can be replaced with the full path to the interpreter you want to use.
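
    For example, a minimal sketch that points the driver at an interpreter inside a virtualenv (the path below is hypothetical; substitute your own):

    # hypothetical virtualenv location; use the path to your own IPython
    export PYSPARK_DRIVER_PYTHON=/opt/venvs/pyspark/bin/ipython
    /path/to/bin/pyspark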

    Answered on January 11, 2019.

    Alternatively, the command below can be used.

    # if you run IPython for Python 2.7 via the ipython2 command
    # whatever command you use to launch the IPython shell goes after the '=' sign
    export PYSPARK_DRIVER_PYTHON=ipython2
    

    Then, from the SPARK_HOME directory:

    ./bin/pyspark
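
    Put together, a minimal sketch of one session (assuming SPARK_HOME already points at your Spark installation) looks like this:

    # choose the IPython shell for the PySpark driver, then launch from the Spark installation
    export PYSPARK_DRIVER_PYTHON=ipython2
    cd "$SPARK_HOME"
    ./bin/pyspark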
    
    Answered on January 11, 2019.

    For Spark >= 2.0, add the following configuration to .bashrc:

    export PYSPARK_PYTHON=/data/venv/your_env/bin/python
    export PYSPARK_DRIVER_PYTHON=/data/venv/your_env/bin/ipython
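
    As a rough sketch, once these lines are saved you can reload the configuration and start PySpark (this assumes the pyspark launcher is on your PATH, for example via $SPARK_HOME/bin):

    # pick up the new variables in the current shell
    source ~/.bashrc
    # PySpark now starts with the IPython driver from the virtualenv
    pyspark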
    
    Answered on January 11, 2019.

