Using Kafka Connect

 

Connector hub address:

https://www.confluent.io/hub

 

Installation and startup

https://docs.confluent.io/current/connect/userguide.html#connect-userguide-distributed-config

 

https://docs.confluent.io/current/connect/managing/install.html#install-connectors
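Following the guides above, a minimal command sketch for installing a connector and starting a distributed worker (assumes a Confluent Platform install with `confluent-hub` on the PATH; the connector name and file paths are illustrative):

```shell
# Install the HDFS sink connector from Confluent Hub
confluent-hub install confluentinc/kafka-connect-hdfs:latest

# Start a distributed Connect worker (path assumes a stock Kafka layout)
bin/connect-distributed.sh config/connect-distributed.properties
```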

 

 

 

Managing Connect via the REST API

Reference: https://docs.confluent.io/current/connect/references/restapi.html#connect-userguide-restapi
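A sketch of the common REST calls against a running worker (the default REST port is 8083, set by `rest.port`; the host and connector name here are illustrative):

```shell
curl -s http://localhost:8083/connectors                      # list connectors
curl -s http://localhost:8083/connectors/hdfs-sink/status     # status of one connector
curl -s -X DELETE http://localhost:8083/connectors/hdfs-sink  # remove a connector
```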

 

 

 

 

Problems encountered

 

1. At startup, the log keeps repeating the following line:

 Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available.

Fix:

The Connect config file sets bootstrap.servers=localhost:9092, which does not match what is configured in Kafka's server.properties. Change it so the two agree.
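The relevant fragment of the worker config (file name assumes the stock distributed setup):

```properties
# connect-distributed.properties (or connect-standalone.properties)
# Must point at the same listener that Kafka's server.properties advertises
bootstrap.servers=localhost:9092
```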

 

2. Couldn't start HdfsSinkConnector due to configuration error.

 

Caused by: org.apache.kafka.common.config.ConfigException: Invalid value  for configuration locale: Locale cannot be empty

 

Fix:

This means a required parameter is missing. Checking the source code and the connector's configurable properties shows it is locale; for China, set locale=zh_CN. This parameter is used by the time-based file partitioner.

 

 

Invalid value  for configuration timezone: Timezone cannot be empty

 

Fix: set timezone=Asia/Shanghai
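Both settings go into the connector's configuration. A sketch of the relevant part of an HdfsSinkConnector config (the connector name, topic, URLs, and partitioner settings here are illustrative assumptions):

```json
{
  "name": "hdfs-sink",
  "config": {
    "connector.class": "io.confluent.connect.hdfs.HdfsSinkConnector",
    "topics": "test_hdfs",
    "hdfs.url": "hdfs://localhost:9000",
    "partitioner.class": "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
    "partition.duration.ms": "3600000",
    "path.format": "'year'=YYYY/'month'=MM/'day'=dd",
    "locale": "zh_CN",
    "timezone": "Asia/Shanghai"
  }
}
```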

 

3. JsonConverter with schemas.enable requires "schema" and "payload" fields and may not contain additional fields. If you are trying to deserialize plain JSON data, set schemas.enable=false in your converter configuration.

 

Fix:

The JSON must follow the required envelope format; see: https://cwiki.apache.org/confluence/display/KAFKA/KIP-301%3A+Schema+Inferencing+for+JsonConverter and https://github.com/confluentinc/kafka-connect-jdbc/issues/574

An example taken from the source code:

{
  "schema": {
    "type": "struct",
    "fields": [
      {"type": "boolean", "optional": true, "field": "booleanField"},
      {"type": "int32", "optional": true, "field": "intField"},
      {"type": "int64", "optional": true, "field": "longField"},
      {"type": "string", "optional": false, "field": "stringField"}
    ]
  },
  "payload": {"booleanField": "true", "intField": 88, "longField": 32, "stringField": "str"}
}
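The envelope can also be produced programmatically. A minimal Python sketch (the helper name and the type mapping are illustrative assumptions, not part of any Kafka API) that wraps a flat record in the schema/payload structure JsonConverter expects when schemas.enable=true:

```python
import json

# Map Python types to Connect schema types (a subset, for illustration)
TYPE_MAP = {bool: "boolean", int: "int64", str: "string"}

def to_connect_envelope(record: dict) -> str:
    """Wrap a flat record in the schema/payload envelope
    required by JsonConverter with schemas.enable=true."""
    fields = [
        {"type": TYPE_MAP[type(v)], "optional": True, "field": k}
        for k, v in record.items()
    ]
    envelope = {
        "schema": {"type": "struct", "fields": fields},
        "payload": record,
    }
    return json.dumps(envelope)

msg = to_connect_envelope({"stringField": "str", "longField": 32})
```

The resulting string can be produced to the topic as-is; the converter will strip the envelope and hand only the payload to the connector.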

 

4. io.confluent.connect.storage.errors.HiveMetaStoreException: Hive MetaStore exception

 

The last Caused by in the stack trace is:

 

Caused by: InvalidObjectException(message:default.test_hdfs table not found)

 

Fix:
