I have a Spark Streaming + Kafka example. It works fine in the IDE, but when I try to compile it from the console with sbt, the compilation fails with the errors below.
Main class:
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

val conf = new SparkConf().setMaster("local[*]").setAppName("KafkaReceiver")
val ssc = new StreamingContext(conf, Seconds(5))
// Receiver-based stream: ZooKeeper quorum, consumer group, Map(topic -> numThreads)
val kafkaStream1 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("t1" -> 5))
//val kafkaStream2 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("topic2" -> 5))
//kafkaStream.fla
kafkaStream1.print()
ssc.start()
ssc.awaitTermination()
Error message:
[error] bad symbolic reference. A signature in package.class refers to type compileTimeOnly
[error] in package scala.annotation which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling package.class.
[error] Reference to method any2ArrowAssoc in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error] val kafkaStream1 = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("t1" -> 5))
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
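
Side note: as far as I understand, the compileTimeOnly / any2ArrowAssoc errors usually indicate a Scala binary version mismatch, e.g. the code being compiled with Scala 2.10 while the _2.11 artifacts on the classpath expect 2.11. A standard sbt command (not specific to this build) to check which Scala version the build actually resolves:

$ sbt "show scalaVersion"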
build.sbt:
name := "test"
val sparkVersion = "2.0.0"
lazy val commonSettings = Seq(
organization := "com.test",
version := "1.0",
scalaVersion := "2.11.8",
test in assembly := {}
)
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-streaming_2.11" % sparkVersion,
"org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % sparkVersion
)
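
One thing I am unsure about: commonSettings is declared above but never attached to any project, so I do not know whether the scalaVersion := "2.11.8" inside it actually takes effect. For reference, a minimal sketch of how I believe such a settings sequence is normally wired up in sbt 0.13 (root is just a placeholder name, not something from my current build):

// Sketch (assumption): attach commonSettings, which carries scalaVersion := "2.11.8",
// to the root project so sbt does not fall back to its default Scala version.
lazy val root = (project in file("."))
  .settings(commonSettings: _*)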
Does anyone have an idea how to fix this problem?
This gives some clues: http://stackoverflow.com/questions/24472645/why-does-sbt-say-bad-symbolic-reference-for-test-with-scalatest – kosa