Issue
I've been following this tutorial to install Spark for Scala: https://www.tutorialspoint.com/apache_spark/apache_spark_installation.htm
However, when I try to run spark-shell I receive this error in my console:
/usr/local/spark/bin/spark-shell: line 57: /usr/local/spark/bin/bin/spark-submit: No such file or directory
My .bashrc looks like this:
export PATH = $PATH:/usr/local/spark/bin
export SCALA_HOME=/usr/local/scala/bin
export PYTHONPATH=$SPARK_HOME/python
So what am I getting wrong? I've installed Spark for Python before, but now I'm trying to use Scala. Is Spark confusing the variables? Thanks.
Solution
You have one bin too many in the path it's searching:
/usr/local/spark/bin/bin/spark-submit
should be
/usr/local/spark/bin/spark-submit
Your SPARK_HOME should be /usr/local/spark/ in your case, not /usr/local/spark/bin/ as it seems to be the case now.
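As a minimal sketch, assuming Spark is unpacked at /usr/local/spark and Scala at /usr/local/scala (the paths shown in the question; adjust them to your actual layout), the corrected .bashrc entries could look like this:

# Assumed install locations taken from the question; change if yours differ
export SPARK_HOME=/usr/local/spark            # the Spark root directory, not .../bin
export SCALA_HOME=/usr/local/scala            # likewise the Scala root, not .../bin
export PATH=$PATH:$SPARK_HOME/bin:$SCALA_HOME/bin   # note: no spaces around '='
export PYTHONPATH=$SPARK_HOME/python

After editing, run source ~/.bashrc (or open a new terminal) so the changes take effect; spark-shell should then find spark-submit at /usr/local/spark/bin/spark-submit.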