Merge pull request #39 from waitspring/Kafka
Kafka
waitspring authored May 7, 2024
2 parents 358f89a + 3c3242b commit 6529527
Showing 9 changed files with 316 additions and 24 deletions.
2 changes: 1 addition & 1 deletion Kafka/01 Preparation
@@ -65,7 +65,7 @@ The development history of the Kafka software is clear and simple
│ 2020-12-21 │ 2.7.0 │ • PEM format for SSL │
│ │ │ • add sliding-window support for aggregations │
├────────────┼──────────┼─────────────────────────────────────────────────────────────────────────────────────────────┤
│ │ │ • Json request & response debug log │
│ │ │ • JSON request & response debug log │
│ 2021-04-19 │ 2.8.0 │ • limit broker connection creation rate │
│ │ │ • KRaft metadata mode │
├────────────┼──────────┼─────────────────────────────────────────────────────────────────────────────────────────────┤
2 changes: 1 addition & 1 deletion Kafka/02 Introduction
@@ -15,7 +15,7 @@ The basic architecture of the Kafka software

• Message [ 消息 ]     The smallest, finest-grained unit of data, made up of two parts: a message header and a message body
• Batch [ 批次 ]       A collection of messages of variable size; producers and consumers read and write the messages held in Kafka batch by batch
• Schema [ 模式 ]      The template for the data format used by messages; the most common schemas include the Apache Avro serializer
• Schema [ 模式 ]      The template for the data format used by messages; the most common schemas include the JSON / Apache Avro / Protobuf serializers
• Topic [ 主题 ]       The category of messages, a logical concept shared by producers and consumers; a data propagation channel is established when producers and consumers subscribe to the same topic
• Partition [ 分区 ]   A partition of a topic, numbered from 0 by default, mapped to a business data directory on the node (/path/to/data/topic-[0-9])
• Replica [ 副本 ]     A copy of a partition; the replica count ranges from 1 to the number of nodes (maximum 32767)
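
The Topic / Partition / Replica terms above map directly onto the stock kafka-topics.sh tool. The lines below are only an illustrative sketch, not part of the changed file, and assume a broker reachable at kafka-0.season.com:9092, the hostname used by the templates later in this commit.

# Illustrative sketch: create a topic with 3 partitions, each kept as 2 replicas
kafka-topics.sh --bootstrap-server kafka-0.season.com:9092 \
  --create --topic kafka-file --partitions 3 --replication-factor 2
# Each partition then maps to a broker data directory such as .../kafka-file-0, .../kafka-file-1, ...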
22 changes: 0 additions & 22 deletions Kafka/05 Kafka Connect

This file was deleted.

239 changes: 239 additions & 0 deletions Kafka/05 Request Message

Large diffs are not rendered by default.

15 changes: 15 additions & 0 deletions Kafka/Template/configure/connect-file-sink.properties
@@ -0,0 +1,15 @@
#
# File location: /etc/kafka/connect-file-sink.properties
#


# The connector's name is kafka-file-sink
name=kafka-file-sink
# The connector class is FileStreamSink
connector.class=FileStreamSink
# The connector runs at most 1 concurrent task
tasks.max=1
# The connector writes its text output to /tmp/kafka-config.txt
file=/tmp/kafka-config.txt
# The connector subscribes to the kafka-file topic
topics=kafka-file
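
As a rough check that this sink is doing its job (a sketch, not part of the commit): once the source connector and the standalone worker defined below are running, records flowing through the kafka-file topic should be appended to the sink's output file.

# Hypothetical check after the connectors below have been started
tail -n 5 /tmp/kafka-config.txt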
15 changes: 15 additions & 0 deletions Kafka/Template/configure/connect-file-source.properties
@@ -0,0 +1,15 @@
#
# File location: /etc/kafka/connect-file-source.properties
#


# The connector's name is kafka-file-source
name=kafka-file-source
# The connector class is FileStreamSource
connector.class=FileStreamSource
# The connector runs at most 1 concurrent task
tasks.max=1
# The connector scans the text file /etc/kafka/server.properties
file=/etc/kafka/server.properties
# The connector publishes to the kafka-file topic
topic=kafka-file
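
To verify that the source connector is actually publishing (a sketch assuming the standalone worker defined below is running), the records it writes to the kafka-file topic can be inspected with the stock console consumer; with the JSON converter configured in connect-standalone.properties, each record appears as a JSON envelope carrying schema and payload fields.

kafka-console-consumer.sh --bootstrap-server kafka-0.season.com:9092 \
  --topic kafka-file --from-beginning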
25 changes: 25 additions & 0 deletions Kafka/Template/configure/connect-standalone.properties
@@ -0,0 +1,25 @@
#
# File location: /etc/kafka/connect-standalone.properties
#


# The Kafka Connect component connects to the Kafka cluster
bootstrap.servers=kafka-0.season.com:9092,kafka-1.season.com:9092,kafka-2.season.com:9092

# Converter configuration for the Kafka Connect component
# Load the JSON converter as the converter for message keys
key.converter=org.apache.kafka.connect.json.JsonConverter
# Allow the JSON converter to embed schema information in message keys
key.converter.schemas.enable=true
# Load the JSON converter as the converter for message values
value.converter=org.apache.kafka.connect.json.JsonConverter
# Allow the JSON converter to embed schema information in message values
value.converter.schemas.enable=true

# Kafka Connect stores offset metadata in the /var/kafka/data/connect.offsets file
offset.storage.file.filename=/var/kafka/data/connect.offsets
# Kafka Connect flushes the offset file every 10000 milliseconds (10 seconds)
offset.flush.interval.ms=10000

# Kafka Connect loads plugin library files from this fixed path
plugin.path=/usr/local/kafka/libs
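
With the three files above in place, the whole file-to-file pipeline can be started with the standalone launcher shipped in the Kafka distribution; the paths below assume the /etc/kafka locations noted in the file headers.

connect-standalone.sh /etc/kafka/connect-standalone.properties \
    /etc/kafka/connect-file-source.properties \
    /etc/kafka/connect-file-sink.properties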
10 changes: 10 additions & 0 deletions Kafka/Template/kafka-rest/kafka-rest.properties
@@ -0,0 +1,10 @@
# Set the Kafka REST component's id to kafka-rest
id=kafka-rest
# Have the Kafka REST component listen on port 8082
listeners=http://0.0.0.0:8082
# Configure the Kafka REST component's connection to the Kafka Schema Registry component
schema.registry.url=http://kafka-0.season.com:8081,http://kafka-1.season.com:8081,http://kafka-2.season.com:8081
# Connect the Kafka REST component to the ZooKeeper cluster
zookeeper.connect=zookeeper-0.season.com:2181,zookeeper-1.season.com:2181,zookeeper-2.season.com:2181
# Connect the Kafka REST component to the Kafka cluster
bootstrap.servers=PLAINTEXT://kafka-0.season.com:9092,kafka-1.season.com:9092,kafka-2.season.com:9092
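
A quick smoke test for this configuration (a sketch, assuming the command runs on the host where kafka-rest listens on port 8082) is the REST Proxy's topic-listing endpoint:

curl http://localhost:8082/topics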
10 changes: 10 additions & 0 deletions Kafka/Template/schema-registry/schema-registry.properties
@@ -0,0 +1,10 @@
# Have the Kafka Schema Registry component listen on port 8081
listeners=http://0.0.0.0:8081
# Connect the Kafka Schema Registry component to the Kafka cluster
kafkastore.bootstrap.servers=PLAINTEXT://kafka-0.season.com:9092,kafka-1.season.com:9092,kafka-2.season.com:9092
# Have the Kafka Schema Registry component use the _schemas topic for schema storage
kafkastore.topic=_schemas
# Disable DEBUG logging for the Kafka Schema Registry component
debug=false
# Secret used by the metadata encoder; the placeholder must be replaced with a high-entropy string
metadata.encoder.secret=REPLACE_ME_WITH_HIGH_ENTROPY_STRING
# Load the DEK Registry resource extension
resource.extension.class=io.confluent.dekregistry.DekRegistryResourceExtention
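
A matching smoke test for the Schema Registry (again a sketch, run on the host where it listens on port 8081) is to list the registered subjects and the global compatibility setting:

curl http://localhost:8081/subjects
curl http://localhost:8081/config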
