java -version
c:
cd c:\dev\spark-1.2.0-bin-hadoop2.4
bin\spark-shell
...
15/01/17 23:17:46 INFO HttpServer: Starting HTTP Server
15/01/17 23:17:46 INFO Utils: Successfully started service 'HTTP class server' on...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.2.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-bit Server VM, Java 1.7.0_71)
Type in expressions to have them evaluated.
Type :help for more information.
...
15/01/17 23:17:53 INFO BlockManagerMaster: Registered BlockManager
15/01/17 23:17:46 INFO SparkILoop: Created spark context...
Spark context available as sc.
sc.version or sc.appName
:quit
Before installing Apache Spark on Ubuntu/Debian, update the system packages.
sudo apt update
sudo apt -y upgrade
Apache Spark requires Java to run, so make sure Java is available on your Ubuntu/Debian system.
sudo apt install default-jdk
java -version
Download the latest release of Apache Spark from the Apache Spark downloads page.
Extract the downloaded archive:
tar xvf spark-*
Move the extracted directory to /opt/spark:
sudo mv spark-2.4.5-bin-hadoop2.7/ /opt/spark
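With Spark moved to /opt/spark, its binaries are not yet on the PATH. A minimal setup, assuming a bash-compatible shell (the SPARK_HOME value matches the move above; append these lines to ~/.profile to make them permanent):

```shell
# Point SPARK_HOME at the install location used above and put the
# Spark binaries on the PATH so spark-shell can be run from anywhere.
export SPARK_HOME=/opt/spark
export PATH=$PATH:$SPARK_HOME/bin
```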
Verify that Spark was installed correctly by launching the Spark Shell:
spark-shell
You should see output similar to the following:
# /opt/spark/bin/spark-shell
19/04/25 21:48:59 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 116.203.127.13 instead (on interface eth0)
19/04/25 21:48:59 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark/jars/spark-unsafe_2.11-2.4.1.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
19/04/25 21:49:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://static.13.127.203.116.clients.your-server.de:4040
Spark context available as 'sc' (master = local[*], app id = local-1556221755866).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.1
      /_/

Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 11.0.2)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
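Beyond checking the banner interactively, the shell can also be exercised non-interactively by piping a Scala expression into it on stdin; a sketch, assuming the /opt/spark install location used above:

```shell
# Feed a one-line Scala expression to spark-shell on stdin; the shell
# evaluates it (printing the version reported by the SparkContext)
# and exits when stdin is exhausted.
echo 'println("Spark version: " + sc.version)' | /opt/spark/bin/spark-shell
```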