Connecting Spark Streaming to Flume

1. The program is the FlumeEventCount example from Spark's examples:
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Milliseconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumeEventCount {
  def main(args: Array[String]) {

    // Helper from Spark's examples source tree (org.apache.spark.examples.streaming)
    StreamingExamples.setStreamingLogLevels()

    // In the original example, host and port come from the command line:
    // val Array(host, IntParam(port)) = args
    val host = "localhost"
    val port = 19999
    val batchInterval = Milliseconds(2000)

    // Create the context and set the batch interval
    val sparkConf = new SparkConf().setAppName("FlumeEventCount")
    val ssc = new StreamingContext(sparkConf, batchInterval)

    // Create a Flume stream: Spark starts an Avro server on host:port,
    // and the Flume avro sink pushes events to it
    val stream = FlumeUtils.createStream(ssc, host, port, StorageLevel.MEMORY_ONLY_SER_2)

    // Print out the count of events received from this server in each batch
    stream.count().map(cnt => "Received " + cnt + " flume events.").print()

    ssc.start()
    ssc.awaitTermination()
  }
}
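
To compile this outside the Spark source tree, the spark-streaming-flume artifact must be on the classpath. A minimal build.sbt sketch, assuming Spark 1.4.0 (the version referenced later in this post) and Scala 2.10, both of which are my assumptions:

// build.sbt -- a minimal sketch, assuming Spark 1.4.0 on Scala 2.10
scalaVersion := "2.10.5"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"            % "1.4.0",
  "org.apache.spark" %% "spark-streaming"       % "1.4.0",
  "org.apache.spark" %% "spark-streaming-flume" % "1.4.0"
)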

Flume configuration file, spark_avro.conf:
a1.channels = c1
a1.sinks = k1
a1.sources = r1

a1.sinks.k1.type = avro
a1.sinks.k1.channel = c1
a1.sinks.k1.hostname = localhost
a1.sinks.k1.port = 19999

a1.sources.r1.type = avro
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
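
With this configuration, data flows: avro-client → avro source on port 44444 → memory channel c1 → avro sink, which pushes the events to the Spark receiver listening on port 19999. The example above only counts events; to see the payloads themselves, each SparkFlumeEvent wraps the underlying Avro event. A minimal sketch (my addition, not part of the original example; assumes the event body is UTF-8 text on a heap-backed buffer):

// Print each event body as a UTF-8 string (add alongside the count above)
stream.map { e =>
  new String(e.event.getBody.array(), java.nio.charset.StandardCharsets.UTF_8)
}.print()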
2. If you get errors about missing classes at runtime, you need to add the required jar packages: look for them in Flume's lib directory. After adding them, start the program again.
3. Start the Flume agent: ../bin/flume-ng agent --conf conf --conf-file ./spark_avro.conf --name a1 -Dflume.root.logger=INFO,console
4. Send test data with the Avro client: ./flume-ng avro-client --conf ../conf/ -H localhost -p 44444 -F /usr/local/spark-1.4.0/conf/spark-env.sh.template -Dflume.root.logger=DEBUG,console
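
avro-client reads the file given with -F and sends its lines as events to the avro source on port 44444, so any readable text file works as test input here; spark-env.sh.template is just a convenient file that ships with Spark.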
5. The events can now be seen in the IDEA console window. There is one problem, though: when running the FlumeEventCount example, nothing was printed while it was running; the output only appeared after the job was stopped.
Fix: in the run configuration, set -Dspark.master=local[n] with n > 1. With local[1], the single available thread is occupied by the Flume receiver, leaving no thread to process the batches.
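
Equivalently, the master can be set in code instead of via the JVM flag (a sketch; setMaster is a standard SparkConf method):

// At least 2 threads: one for the Flume receiver, one for batch processing
val sparkConf = new SparkConf()
  .setAppName("FlumeEventCount")
  .setMaster("local[2]")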

Note: start the Spark program first, since it is the Avro server that the Flume sink connects to. Otherwise Flume will report errors about being unable to connect to port 19999.
Original article: https://www.cnblogs.com/wuyida/p/6300269.html