
Apache Spark MLlib 2.1.0 with Scala sbt error

My small Apache Spark project in Scala worked fine until I added MLlib.

My sbt build file is shown below the error output, but compilation fails with the error that follows. Can't Apache Spark MLlib be built with Scala 2.11.x? Any pointers would help.

[error] Modules were resolved with conflicting cross-version suffixes in {file:: 
[error] org.apache.spark:spark-launcher _2.11, _2.10 
[error] org.apache.spark:spark-sketch _2.11, _2.10 
[error] org.json4s:json4s-ast _2.11, _2.10 
[error] org.apache.spark:spark-catalyst _2.11, _2.10 
[error] org.apache.spark:spark-network-shuffle _2.11, _2.10 
[error] org.scalatest:scalatest _2.11, _2.10 
[error] com.twitter:chill _2.11, _2.10 
[error] org.apache.spark:spark-sql _2.11, _2.10 
[error] org.json4s:json4s-jackson _2.11, _2.10 
[error] com.fasterxml.jackson.module:jackson-module-scala _2.11, _2.10 
[error] org.json4s:json4s-core _2.11, _2.10 
[error] org.apache.spark:spark-unsafe _2.11, _2.10 
[error] org.apache.spark:spark-tags _2.11, _2.10 
[error] org.apache.spark:spark-core _2.11, _2.10 
[error] org.apache.spark:spark-network-common _2.11, _2.10 
[trace] Stack trace suppressed: run last *:update for the full output. 
[error] (*:update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.apache.spark:spark-sketch, org.json4s:json4s-ast, org.apache.spark:spark-catalyst, org.apache.spark:spark-network-shuffle, org.scalatest:scalatest, com.twitter:chill, org.apache.spark:spark-sql, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-tags, org.apache.spark:spark-core, org.apache.spark:spark-network-common 
[error] Total time: 18 s, completed 10-Mar-2017 20:41:51 


version := "1.0" 
scalaVersion := "2.11.8" 

libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1" 
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test" 
//libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.5.0" 
//libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.7" 
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6" 
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" 
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0" 
// https://mvnrepository.com/artifact/org.apache.spark/spark-mllib_2.10 
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "2.1.0" 

Answer


You can definitely build Apache Spark MLlib with Scala 2.11.x. The conflict comes from pulling a _2.10 artifact into an otherwise _2.11 build, so change

libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "2.1.0" 

to

libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0" 

or, equivalently, use %% and let sbt append the suffix that matches your scalaVersion:

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.1.0" 

Thanks @himanshullltian. So for this I need to change the spark-mllib library dependency. I tried the last option but it did not work; let me try the first one. – user3341078
