Can't connect to a Spark cluster programmatically, but spark-shell can?
I'm running an application and having trouble connecting to Spark. Here is the relevant code:
import spark.SparkContext
System.setProperty("spark.akka.frameSize", "200")
System.setProperty("spark.akka.timeout", "100")
System.setProperty("spark.default.parallelism", "288")
var sparkmaster = "spark://1.17.8.5:7098"
var sc = new SparkContext(sparkmaster, "tool", "/opt/spark")
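For context, this is the shape of the connection attempt as a complete program. This is only a sketch under the assumption of a pre-0.8 Spark (the logs suggest the `spark.SparkContext` package), where the four-argument constructor also ships the application jar to the workers; the jar path below is a placeholder taken from the log line, not a verified build artifact.

```scala
import spark.SparkContext

object ConnectTest {
  def main(args: Array[String]): Unit = {
    // Properties must be set before the SparkContext is created,
    // otherwise they are ignored.
    System.setProperty("spark.akka.frameSize", "200")
    System.setProperty("spark.akka.timeout", "100")
    System.setProperty("spark.default.parallelism", "288")

    val sc = new SparkContext(
      "spark://1.17.8.5:7098",                     // master URL from the question
      "tool",                                       // application name
      "/opt/spark",                                 // Spark home on the workers
      Seq("target/scala-2.9.1/tool_2.9.1-1.0.jar")) // jar shipped to executors
  }
}
```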
Here is the relevant output:
13/08/14 17:15:30 INFO storage.BlockManagerUI: Started BlockManager web UI at ----
13/08/14 17:15:30 INFO spark.SparkContext: Added JAR target/scala-2.9.1/pivot-spark_2.9.1-1.0.jar at http://1.17.8.5:50262/jars/tool_2.9.1-1.0.jar with timestamp 1376500530641
13/08/14 17:15:30 INFO cluster.FairSchedulableBuilder: Create default pool with name:default,schedulingMode:FIFO,minShare:2,weight:1
13/08/14 17:15:30 INFO client.Client$ClientActor: Connecting to master spark://1.17.8.5:7098
As you can see, it never actually connects to the workers. But if I just run spark-shell
from the terminal, I get:
13/08/14 17:25:53 INFO BlockManagerUI: Started BlockManager web UI at ---
13/08/14 17:25:53 INFO Client$ClientActor: Connecting to master spark://1.17.8.5:7098
Spark context available as sc.
13/08/14 17:25:53 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20130814172553-0029
13/08/14 17:25:53 INFO Client$ClientActor: Executor added: app-20130814172553-0029/0 on worker-20130808012122----42908 with 4 cores
13/08/14 17:25:53 INFO SparkDeploySchedulerBackend: Granted executor ID app-20130814172553-0029/0 on host --- with 4 cores, 8.0 GB RAM
13/08/14 17:25:53 INFO Client$ClientActor: Executor added: app-20130814172553-0029/1 on worker-20130808012122-----59902 --- with 4 cores
...etc
What is the problem here?