Problem Description
I am trying to import and use pyspark with Anaconda.
After installing Spark and setting the $SPARK_HOME variable, I tried:
$ pip install pyspark
This won't work (of course) because I discovered that I need to tell Python to look for pyspark under $SPARK_HOME/python/. The problem is that, to do that, I need to set $PYTHONPATH, but Anaconda doesn't use that environment variable.
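For context, one way around this is to skip $PYTHONPATH entirely and extend sys.path from inside Python before importing. A minimal sketch, assuming $SPARK_HOME is set and that the bundled py4j zip under $SPARK_HOME/python/lib/ matches your Spark build (the exact py4j-*-src.zip filename varies by version):

import glob
import os
import sys

# Point Python at the pyspark sources shipped with Spark.
spark_home = os.environ["SPARK_HOME"]
sys.path.insert(0, os.path.join(spark_home, "python"))

# pyspark also needs its bundled py4j bridge on the path.
sys.path.extend(glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")))

import pyspark  # should now resolve without a pip/conda install

This is essentially what the findspark package automates.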
I tried copying the contents of $SPARK_HOME/python/ to ANACONDA_HOME/lib/python2.7/site-packages/, but it didn't work.
Is there any solution for using pyspark with Anaconda?
Recommended Answer
This may have only become possible recently, but I used the following and it worked perfectly. After this, I am able to 'import pyspark as ps' and use it with no problems.
conda install -c conda-forge pyspark
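As a quick check that the install worked, here is a minimal local-mode smoke test; the master and appName values are arbitrary choices for illustration, and no cluster is required:

import pyspark as ps

# Run Spark locally on all available cores.
sc = ps.SparkContext(master="local[*]", appName="smoke-test")
print(sc.parallelize(range(10)).sum())  # should print 45
sc.stop()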