Re: Kafka Avro Table Source
The community is currently working on improving the Kafka Avro integration
for Flink SQL.
There's a PR. If you like, you could try it out and give some feedback.
Timo (in CC) has been working on Kafka Avro and should be able to help with
any specific questions.
2018-07-03 3:02 GMT+02:00 Will Du <willddy@xxxxxxxxx>:
> Hi folks,
> I am working on using avro table source mapping to kafka source. By
> looking at the example, I think the current Flink v1.5.0 connector is not
> flexible enough. I wonder if I have to specify the avro record class to
> read from Kafka.
> For withSchema, the schema can get from schema registry. However, the
> class of avro seems hard coded.
> KafkaTableSource source = Kafka010AvroTableSource.builder()
>     // set Kafka topic
>     // set Kafka consumer properties
>     // set Table schema
>     .field("sensorId", Types.LONG())
>     .field("temp", Types.DOUBLE())
>     .field("time", Types.SQL_TIMESTAMP())
>     // set class of Avro record
>     .forAvroRecordClass(SensorReading.class) // ? Any way to get this without a hard-coded class
>     .build();
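As a sketch of the idea behind the question: the Avro library itself can parse a schema obtained at runtime (for example, fetched from a schema registry) and expose its field names and types, without any generated `SpecificRecord` class. This is not the Flink connector API; it is a minimal, hedged illustration using plain Apache Avro, and the schema JSON string below is a hypothetical stand-in for a registry lookup result.

```java
import java.util.ArrayList;
import java.util.List;
import org.apache.avro.Schema;

public class SchemaFields {

    // Returns "name : TYPE" strings for each top-level field of an Avro record schema.
    static List<String> describeFields(String schemaJson) {
        Schema schema = new Schema.Parser().parse(schemaJson);
        List<String> out = new ArrayList<>();
        for (Schema.Field field : schema.getFields()) {
            out.add(field.name() + " : " + field.schema().getType());
        }
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical schema JSON, as it might come back from a schema registry.
        String schemaJson = "{\"type\":\"record\",\"name\":\"SensorReading\","
            + "\"fields\":[{\"name\":\"sensorId\",\"type\":\"long\"},"
            + "{\"name\":\"temp\",\"type\":\"double\"},"
            + "{\"name\":\"time\",\"type\":\"long\"}]}";
        for (String line : describeFields(schemaJson)) {
            System.out.println(line);
        }
    }
}
```

Fields derived this way could then feed a table schema built at runtime, which is essentially what the improved connector work aims to make unnecessary to do by hand.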