2016-11-10

I'm using Spark 2 on EMR. When I ssh into the master node and run spark-shell, I can't access sqlContext. Am I missing something? Error: not found: value sqlContext on EMR

[[email protected] ~]$ spark-shell 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). 
16/11/10 21:07:05 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME. 
16/11/10 21:07:14 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect. 
Spark context Web UI available at http://172.31.13.180:4040 
Spark context available as 'sc' (master = yarn, app id = application_1478720853870_0003). 
Spark session available as 'spark'. 
Welcome to 
     ____    __ 
    /__/__ ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/ '_/ 
    /___/ .__/\_,_/_/ /_/\_\ version 2.0.1 
     /_/ 

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111) 
Type in expressions to have them evaluated. 
Type :help for more information. 

scala> import org.apache.spark.sql.SQLContext 
import org.apache.spark.sql.SQLContext 

scala> sqlContext 
<console>:25: error: not found: value sqlContext 
     sqlContext 
    ^

I get the same error on my local machine, so I also tried the following, to no avail: exporting SPARK_LOCAL_IP

➜ play grep "SPARK_LOCAL_IP" ~/.zshrc 
export SPARK_LOCAL_IP=127.0.0.1 
➜ play source ~/.zshrc 
➜ play spark-shell 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
Setting default log level to "WARN". 
To adjust logging level use sc.setLogLevel(newLevel). 
16/11/10 16:12:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
16/11/10 16:12:19 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect. 
Spark context Web UI available at http://127.0.0.1:4040 
Spark context available as 'sc' (master = local[*], app id = local-1478812339020). 
Spark session available as 'spark'. 
Welcome to 
     ____    __ 
    /__/__ ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/ '_/ 
    /___/ .__/\_,_/_/ /_/\_\ version 2.0.1 
     /_/ 

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79) 
Type in expressions to have them evaluated. 
Type :help for more information. 

scala> sqlContext 
<console>:24: error: not found: value sqlContext 
     sqlContext 
    ^

scala> 

My /etc/hosts contains the following:
127.0.0.1  localhost 
255.255.255.255 broadcasthost 
::1    localhost 

Answer


Spark 2.0 no longer uses SQLContext (spark-shell no longer creates one at initialization):

  • Use SparkSession instead (available as `spark` in the shell).
  • For existing applications you can do:

    val sqlContext = spark.sqlContext 
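For instance, in a Spark 2.x shell session the two styles can be sketched like this (the JSON file path is hypothetical, just to illustrate the call shape):

```scala
// In spark-shell 2.x, `spark` (a SparkSession) is already defined.

// Spark 1.x style, recovered through the session for legacy code:
val sqlContext = spark.sqlContext
val dfOld = sqlContext.read.json("people.json") // hypothetical input file

// Preferred Spark 2.x style: read through the session directly.
val dfNew = spark.read.json("people.json")
```

Both readers return the same kind of DataFrame; the session-based call is simply the idiomatic entry point in Spark 2.x.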
    

So with `val sqlContext = spark.sqlContext`, legacy code uses `sqlContext.read...`, while new applications should use `spark.read`? – Omnipresent


@Omnipresent Correct. Though strictly speaking you don't need `spark.sqlContext` at all unless you're calling a method that expects a `SQLContext`. –
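The point in that comment can be sketched as follows: `spark.sqlContext` only matters as a bridge to older APIs whose signatures still take a `SQLContext` (the helper method here is hypothetical):

```scala
import org.apache.spark.sql.{DataFrame, SQLContext, SparkSession}

// A legacy helper that still takes a SQLContext (hypothetical):
def legacyLoad(ctx: SQLContext, path: String): DataFrame =
  ctx.read.parquet(path)

val spark = SparkSession.builder
  .appName("demo")
  .master("local[*]") // local master, only for this standalone sketch
  .getOrCreate()

// Bridge into the old API:
val df1 = legacyLoad(spark.sqlContext, "data.parquet")

// New code reads through the session directly, no SQLContext needed:
val df2 = spark.read.parquet("data.parquet")
```

In other words, keep writing `spark.read...` everywhere, and only reach for `spark.sqlContext` at the boundary with legacy code.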
