spark-shell

scala> val file = sc.textFile("/workspace/bpUserinfo_logs/bpUserinfo_20160212.log")

scala> val count = file.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)

scala> count.collect().foreach(println)
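
The same word count can be packaged as a standalone application instead of being typed into the shell. Below is a minimal sketch, assuming the Spark 1.x-era SparkConf/SparkContext API to match the session above; the object name WordCount and the hard-coded input path are illustrative, not taken from the original post.

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // In a packaged application the SparkContext is created explicitly,
    // whereas spark-shell provides it as the predefined value sc
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    // Load the log file as an RDD of lines
    val file = sc.textFile("/workspace/bpUserinfo_logs/bpUserinfo_20160212.log")

    // Split each line into words, pair each word with 1, then sum the 1s per word
    val count = file.flatMap(line => line.split(" "))
                    .map(word => (word, 1))
                    .reduceByKey(_ + _)

    // Collect the (word, count) pairs to the driver and print them
    count.collect().foreach(println)

    sc.stop()
  }
}

Such an application would typically be built into a jar and launched with spark-submit rather than run line by line in the shell.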

Original article: https://www.cnblogs.com/tugeler/p/5192628.html