
Can't debug my program with IntelliJ IDEA CE

I get "Disconnected from the target VM, address: '127.0.0.1:39989', transport: 'socket'" in IntelliJ IDEA CE and cannot debug my program. Any suggestions? Here is the full console output:

Connected to the target VM, address: '127.0.0.1:39989', transport: 'socket' 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
17/12/29 17:29:47 INFO SparkContext: Running Spark version 2.1.2 
17/12/29 17:29:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
17/12/29 17:29:49 WARN Utils: Your hostname, ashfaq-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface enp0s3) 
17/12/29 17:29:49 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address 
17/12/29 17:29:49 INFO SecurityManager: Changing view acls to: ashfaq 
17/12/29 17:29:49 INFO SecurityManager: Changing modify acls to: ashfaq 
17/12/29 17:29:49 INFO SecurityManager: Changing view acls groups to: 
17/12/29 17:29:49 INFO SecurityManager: Changing modify acls groups to: 
17/12/29 17:29:49 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ashfaq); groups with view permissions: Set(); users with modify permissions: Set(ashfaq); groups with modify permissions: Set() 
17/12/29 17:29:51 INFO Utils: Successfully started service 'sparkDriver' on port 46133. 
17/12/29 17:29:51 INFO SparkEnv: Registering MapOutputTracker 
17/12/29 17:29:51 INFO SparkEnv: Registering BlockManagerMaster 
17/12/29 17:29:51 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 
17/12/29 17:29:51 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 
17/12/29 17:29:51 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-b3b48105-28be-4781-a395-c7e83cc72e8c 
17/12/29 17:29:51 INFO MemoryStore: MemoryStore started with capacity 393.1 MB 
17/12/29 17:29:51 INFO SparkEnv: Registering OutputCommitCoordinator 
17/12/29 17:29:53 INFO Utils: Successfully started service 'SparkUI' on port 4040. 
17/12/29 17:29:53 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.2.15:4040 
17/12/29 17:29:53 INFO Executor: Starting executor ID driver on host localhost 
17/12/29 17:29:54 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33583. 
17/12/29 17:29:54 INFO NettyBlockTransferService: Server created on 10.0.2.15:33583 
17/12/29 17:29:54 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 
17/12/29 17:29:54 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.0.2.15, 33583, None) 
17/12/29 17:29:54 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.2.15:33583 with 393.1 MB RAM, BlockManagerId(driver, 10.0.2.15, 33583, None) 
17/12/29 17:29:54 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.0.2.15, 33583, None) 
17/12/29 17:29:54 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.0.2.15, 33583, None) 
17/12/29 17:29:58 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 236.5 KB, free 392.8 MB) 
17/12/29 17:29:58 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 22.9 KB, free 392.8 MB) 
17/12/29 17:29:58 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.0.2.15:33583 (size: 22.9 KB, free: 393.1 MB) 
17/12/29 17:29:59 INFO SparkContext: Created broadcast 0 from textFile at scalaApp.scala:13 
Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/home/ashfaq/Desktop/saclaAPP/data/UserPurchaseHistory.csv 
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:287) 
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229) 
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315) 
    at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:202) 
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252) 
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:250) 
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35) 
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252) 
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:250) 
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35) 
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252) 
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:250) 
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35) 
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252) 
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250) 
    at scala.Option.getOrElse(Option.scala:121) 
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:250) 
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1968) 
    at org.apache.spark.rdd.RDD.count(RDD.scala:1158) 
    at ScalaApp$.main(scalaApp.scala:18) 
    at ScalaApp.main(scalaApp.scala) 
17/12/29 17:29:59 INFO SparkContext: Invoking stop() from shutdown hook 
17/12/29 17:29:59 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040 
17/12/29 17:29:59 INFO BlockManagerInfo: Removed broadcast_0_piece0 on 10.0.2.15:33583 in memory (size: 22.9 KB, free: 393.1 MB) 
17/12/29 17:29:59 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped! 
17/12/29 17:30:00 INFO MemoryStore: MemoryStore cleared 
17/12/29 17:30:00 INFO BlockManager: BlockManager stopped 
17/12/29 17:30:00 INFO BlockManagerMaster: BlockManagerMaster stopped 
17/12/29 17:30:00 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 
17/12/29 17:30:00 INFO SparkContext: Successfully stopped SparkContext 
17/12/29 17:30:00 INFO ShutdownHookManager: Shutdown hook called 
Disconnected from the target VM, address: '127.0.0.1:39989', transport: 'socket' 
17/12/29 17:30:00 INFO ShutdownHookManager: Deleting directory /tmp/spark-58667739-7c15-4665-8ede-fde9c3ff1d83 

Process finished with exit code 1 

Check whether the VM has an IP address and whether it is correct. If not, power it off and back on (you need to shut the machine down fully, not just restart it).

Answer


You are trying to open a file that does not exist. The first line of the error message shows this:

Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/home/ashfaq/Desktop/saclaAPP/data/UserPurchaseHistory.csv 
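Because Spark builds RDDs lazily, the missing file only surfaces when the first action runs, so checking the path up front makes the failure obvious. Here is a minimal sketch, assuming the program reads the CSV with SparkContext.textFile at scalaApp.scala:13 and calls count() at line 18, as the stack trace suggests; the SparkConf setup and local[*] master are assumptions, not the asker's actual code:

    import java.nio.file.{Files, Paths}
    import org.apache.spark.{SparkConf, SparkContext}

    object ScalaApp {
      def main(args: Array[String]): Unit = {
        // Path taken from the stack trace; adjust it to where the file
        // actually lives on disk ("saclaAPP" may itself be a typo for the
        // real directory name).
        val path = "/home/ashfaq/Desktop/saclaAPP/data/UserPurchaseHistory.csv"

        // Fail fast with a clear message instead of letting Spark throw
        // InvalidInputException later, when the first action runs.
        require(Files.exists(Paths.get(path)), s"Input file not found: $path")

        // Assumed setup: app name from the trace, local master for debugging.
        val conf = new SparkConf().setAppName("ScalaApp").setMaster("local[*]")
        val sc = new SparkContext(conf)

        val data = sc.textFile(path)            // lazily builds the RDD
        println(s"Line count: ${data.count()}") // the action that actually reads the file
        sc.stop()
      }
    }

Note also that the "Disconnected from the target VM" line is just the debugger reporting that the JVM exited (with code 1). It is a consequence of the exception above, not a debugger or VM networking problem.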