Flink (2): Reading Data from Kafka

1. Dependencies

First, configure the Maven dependency for the Kafka connector:

<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.12</artifactId>
    <version>1.12.3</version>
</dependency>

The _2.12 suffix in the artifact name is the Scala version, and 1.12.3 is the Flink version; all Flink artifacts in one project should share the same version. When choosing versions, it is best to match what your cluster is running. This matters especially for Kafka, since its internal wire protocol has compatibility issues between older and newer broker versions.
Here is a complete dependency file:
pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>BigData</artifactId>
        <groupId>org.example</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.asiainfo.flink</groupId>
    <artifactId>Flink</artifactId>

    <properties>
        <maven.compiler.source>8</maven.compiler.source>
        <maven.compiler.target>8</maven.compiler.target>
    </properties>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.2</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                    <encoding>UTF-8</encoding>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>2.4</version>
                <configuration>
                    <archive>
                        <!-- Do not include pom.xml and pom.properties in the generated jar -->
                        <addMavenDescriptor>false</addMavenDescriptor>
                        <manifest>
                            <!-- Whether to add third-party jars to the Class-Path entry of the manifest -->
                            <addClasspath>true</addClasspath>
                            <!-- Prefix for Class-Path entries in the generated manifest; third-party jars go under lib/, hence the lib/ prefix -->
                            <classpathPrefix>lib/</classpathPrefix>
                            <!-- The application's main class
                            <mainClass>com.yourClass</mainClass>-->
                        </manifest>
                    </archive>
                    <!-- Exclude files that should not end up in the jar -->
                    <excludes>
                        <exclude>${project.basedir}/xml/*</exclude>
                    </excludes>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>3.3.0</version>
                <configuration>
                    <archive>
                        <manifest>
                            <!-- <mainClass>com.asiainfo.wc.WordCount</mainClass> -->
                        </manifest>
                    </archive>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

        </plugins>
    </build>
    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-scala -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_2.12</artifactId>
            <version>1.12.3</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-core -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.12.3</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-scala -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-scala_2.12</artifactId>
            <version>1.12.3</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-clients -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.12</artifactId>
            <version>1.12.3</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_2.12</artifactId>
            <version>1.12.3</version>
        </dependency>
    </dependencies>
</project>
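Since all Flink artifacts should share one version, it also helps to centralize the version in a Maven property, so an upgrade becomes a one-line change. A minimal sketch (the property names flink.version and scala.binary.version are illustrative choices, not part of the original pom):

    <properties>
        <!-- illustrative property names, referenced as ${flink.version} and ${scala.binary.version} below -->
        <flink.version>1.12.3</flink.version>
        <scala.binary.version>2.12</scala.binary.version>
    </properties>

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>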

2. Consuming Kafka Data

The following Scala job creates a FlinkKafkaConsumer for the topic yjt and prints every record it receives:

package com.asiainfo.apitest

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

import java.util.Properties

object KafkaSource {
  def main(args: Array[String]): Unit = {
    // Read data from Kafka
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Kafka consumer configuration
    val props = new Properties()
    props.put("bootstrap.servers", "hadoop4:9092")
    props.put("group.id", "consumer1")
    props.put("enable.auto.commit", "true")
    props.put("auto.commit.interval.ms", "1000")
    // Subscribe to the topic "yjt"; the constructor also accepts a java.util.List of topics
    val stream = env.addSource(new FlinkKafkaConsumer[String]("yjt", new SimpleStringSchema(), props))

    // Processing logic; here we simply print each record
    stream.print()

    env.execute("KafkaSource")
  }

}
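As noted in the comment above, FlinkKafkaConsumer can also subscribe to several topics with a single source by passing a java.util.List, and it can be told where to start reading. A minimal sketch under the same broker and group settings (the topic names topic-a and topic-b are placeholders):

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

import java.util.{Arrays, Properties}

object MultiTopicKafkaSource {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val props = new Properties()
    props.put("bootstrap.servers", "hadoop4:9092")
    props.put("group.id", "consumer1")

    // A java.util.List of topic names subscribes one consumer to several topics
    val consumer = new FlinkKafkaConsumer[String](
      Arrays.asList("topic-a", "topic-b"), // placeholder topic names
      new SimpleStringSchema(),
      props)

    // Start from the beginning of each partition instead of the committed group offsets
    consumer.setStartFromEarliest()

    env.addSource(consumer).print()
    env.execute("MultiTopicKafkaSource")
  }
}

One note on offsets: enable.auto.commit only takes effect when Flink checkpointing is disabled; with env.enableCheckpointing(...) the connector instead commits offsets back to Kafka when a checkpoint completes.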

Original post: https://www.cnblogs.com/yjt1993/p/14724717.html