Project setup: SBT cannot import the Kafka encoder/decoder classes
- 1 producer - serializes objects and sends the bytes to Kafka
- 1 Spark consumer - needs to use DefaultDecoder from the kafka.serializer package to consume the bytes (a sketch of this consumer follows below)
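For reference, a minimal sketch of what the consumer side would look like with a direct stream; the actual consumer code isn't shown in the question, so the broker address ("localhost:9092"), topic name ("rss-items"), and object name are hypothetical placeholders:

import kafka.serializer.{DefaultDecoder, StringDecoder} // the import that fails to resolve
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object RssConsumer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("RssReaderDemo").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // hypothetical broker and topic names
    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    val topics = Set("rss-items")

    // DefaultDecoder passes the raw Array[Byte] through unchanged, so the
    // Avro payload produced by the producer can be deserialized downstream
    val stream = KafkaUtils.createDirectStream[String, Array[Byte], StringDecoder, DefaultDecoder](
      ssc, kafkaParams, topics)

    stream.map(_._2.length).print()
    ssc.start()
    ssc.awaitTermination()
  }
}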
Problem:
- SBT resolves the correct libraries (kafka-clients + kafka_2.10), but it cannot find any of the classes in the kafka_2.10 jar.
- It appears to be looking under the wrong path (org.apache.spark.streaming.kafka instead of the top-level kafka package).
Error message:
[error] object serializer is not a member of package org.apache.spark.streaming.kafka
[error] import kafka.serializer.DefaultDecoder
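Note: this particular compiler message usually means Scala is resolving the import relative to an enclosing scope (a wildcard import such as org.apache.spark.streaming._ brings the kafka subpackage into scope and shadows the top-level kafka package), not that the jar is missing. A minimal sketch of a way to rule that out is to anchor the import at the root package:

// _root_ forces resolution from the top-level package, bypassing any
// enclosing org.apache.spark.streaming scope that shadows "kafka"
import _root_.kafka.serializer.DefaultDecoder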
SBT dependency tree:
[info] +-org.apache.spark:spark-streaming-kafka_2.10:1.6.1
[info] | +-org.apache.kafka:kafka_2.10:0.8.2.1 [S] <-- DefaultDecoder lives in this jar
(as kafka.serializer.DefaultDecoder), but SBT can't find it
[info] | | +-org.apache.kafka:kafka-clients:0.8.2.1
build.sbt:
lazy val commonSettings = Seq(
  organization := "org.RssReaderDemo",
  version := "0.1.0",
  scalaVersion := "2.10.6"
)

resolvers += "Artima Maven Repository" at "http://repo.artima.com/releases"

val spark = "org.apache.spark" % "spark-core_2.10" % "1.6.1"
val sparkStreaming = "org.apache.spark" % "spark-streaming_2.10" % "1.6.1"
val sparkStreamKafka = "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1"
// Needed to be able to parse the generated Avro JSON schema
val jacksonMapperAsl = "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
val scalactic = "org.scalactic" %% "scalactic" % "2.2.6"
val scalatest = "org.scalatest" %% "scalatest" % "2.2.6" % "test"
val avro = "org.apache.avro" % "avro" % "1.8.0"

lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    libraryDependencies += spark,
    libraryDependencies += sparkStreaming,
    libraryDependencies += sparkStreamKafka,
    libraryDependencies += jacksonMapperAsl,
    libraryDependencies += scalactic,
    libraryDependencies += scalatest,
    libraryDependencies += avro
  )
The code that triggers the error in SBT: import kafka.serializer.DefaultDecoder – mds91
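One quick way to confirm that kafka_2.10 really is on the compile classpath (a hedged check, not part of the original thread) is to load the class reflectively from an sbt console session:

// run inside `sbt console`; throws ClassNotFoundException only if the
// kafka_2.10 jar is genuinely missing from the classpath
Class.forName("kafka.serializer.DefaultDecoder")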