Kafka source and sink connectors for LaminarDB.
Provides a KafkaSource that consumes from Kafka topics and
produces Arrow RecordBatch data through the SourceConnector
trait, and a KafkaSink that writes Arrow RecordBatch data
to Kafka topics through the SinkConnector trait.
Both connectors support JSON, CSV, Raw, Debezium, and Avro formats, with full Confluent Schema Registry integration for Avro.
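For context, an Avro message in the Confluent wire format carries a 5-byte header (magic byte 0x00, then a 4-byte big-endian schema ID) ahead of the Avro-encoded payload. The sketch below frames and unframes that header; it is illustrative only, not this crate's serializer code:

```rust
/// Frame an Avro payload in the Confluent wire format:
/// [0x00 magic byte][4-byte big-endian schema ID][payload].
fn frame(schema_id: u32, payload: &[u8]) -> Vec<u8> {
    let mut out = Vec::with_capacity(5 + payload.len());
    out.push(0x00); // magic byte
    out.extend_from_slice(&schema_id.to_be_bytes());
    out.extend_from_slice(payload);
    out
}

/// Split a framed message back into (schema ID, payload).
/// Returns None for short messages or a wrong magic byte.
fn unframe(msg: &[u8]) -> Option<(u32, &[u8])> {
    if msg.len() < 5 || msg[0] != 0x00 {
        return None;
    }
    let id = u32::from_be_bytes([msg[1], msg[2], msg[3], msg[4]]);
    Some((id, &msg[5..]))
}

fn main() {
    let framed = frame(42, b"avro-bytes");
    let (id, body) = unframe(&framed).unwrap();
    assert_eq!(id, 42);
    assert_eq!(body, b"avro-bytes");
}
```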
§Features
- Per-partition offset tracking with checkpoint/restore (source)
- At-least-once and exactly-once delivery (sink)
- Confluent Schema Registry with caching and compatibility checking
- Avro serialization/deserialization via arrow-avro (Confluent wire format)
- Configurable partitioning: key-hash, round-robin, sticky (sink)
- Backpressure control with high/low watermark hysteresis (source)
- Consumer group rebalance tracking (source)
- Dead letter queue for failed records (sink)
- Atomic metrics counters
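The high/low watermark hysteresis named above can be sketched as follows. This is a minimal illustration of the technique, not this crate's backpressure module; the type, field names, and thresholds are made up. Consumption pauses once buffered bytes exceed the high watermark and resumes only after they drop below the low watermark, which avoids pause/resume flapping around a single threshold:

```rust
/// Illustrative high/low watermark hysteresis (not crate code).
struct Backpressure {
    high: usize,  // pause when buffered bytes exceed this
    low: usize,   // resume only once buffered bytes fall below this
    paused: bool,
}

impl Backpressure {
    /// Report the current buffered size; returns true while paused.
    fn on_buffered(&mut self, buffered: usize) -> bool {
        if self.paused {
            if buffered < self.low {
                self.paused = false; // drained enough: resume
            }
        } else if buffered > self.high {
            self.paused = true; // over the high watermark: pause
        }
        self.paused
    }
}

fn main() {
    let mut bp = Backpressure { high: 100, low: 50, paused: false };
    assert!(!bp.on_buffered(80));  // below high: keep consuming
    assert!(bp.on_buffered(120));  // above high: pause
    assert!(bp.on_buffered(70));   // still above low: stay paused
    assert!(!bp.on_buffered(40));  // below low: resume
}
```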
§Usage
use laminar_connectors::kafka::{KafkaSource, KafkaSourceConfig};
use laminar_connectors::kafka::{KafkaSink, KafkaSinkConfig};
// Source (assumes a `connector_config` and `schema` in scope)
let config = KafkaSourceConfig::from_config(&connector_config)?;
let source = KafkaSource::new(schema, config);

// Sink
let config = KafkaSinkConfig::from_config(&connector_config)?;
let sink = KafkaSink::new(schema, config);
Re-exports§
pub use avro::AvroDeserializer;
pub use config::AssignmentStrategy;
pub use config::CompatibilityLevel;
pub use config::IsolationLevel;
pub use config::KafkaSourceConfig;
pub use config::OffsetReset;
pub use config::SaslMechanism;
pub use config::SecurityProtocol;
pub use config::SrAuth;
pub use config::StartupMode;
pub use config::TopicSubscription;
pub use metrics::KafkaSourceMetrics;
pub use offsets::OffsetTracker;
pub use source::KafkaSource;
pub use watermarks::AlignmentCheckResult;
pub use watermarks::KafkaAlignmentConfig;
pub use watermarks::KafkaAlignmentMode;
pub use watermarks::KafkaWatermarkTracker;
pub use watermarks::WatermarkMetrics;
pub use watermarks::WatermarkMetricsSnapshot;
pub use avro_serializer::AvroSerializer;
pub use partitioner::KafkaPartitioner;
pub use partitioner::KeyHashPartitioner;
pub use partitioner::RoundRobinPartitioner;
pub use partitioner::StickyPartitioner;
pub use sink::KafkaSink;
pub use sink_config::Acks;
pub use sink_config::CompressionType;
pub use sink_config::DeliveryGuarantee;
pub use sink_config::KafkaSinkConfig;
pub use sink_config::PartitionStrategy;
pub use sink_metrics::KafkaSinkMetrics;
pub use schema_registry::CachedSchema;
pub use schema_registry::CompatibilityResult;
pub use schema_registry::SchemaRegistryClient;
pub use schema_registry::SchemaType;
Modules§
- avro - Avro deserialization using arrow-avro with Confluent Schema Registry.
- avro_serializer - Avro serialization using arrow-avro with Confluent Schema Registry.
- backpressure - Backpressure controller for Kafka source consumption.
- config - Kafka source connector configuration.
- discovery - Kafka-based discovery for delta nodes.
- metrics - Kafka source connector metrics.
- offsets - Kafka offset tracking for per-partition consumption progress.
- partitioner - Kafka partitioning strategies.
- rebalance - Kafka consumer group rebalance state tracking.
- schema_registry - Confluent Schema Registry client.
- sink - Kafka sink connector implementation.
- sink_config - Kafka sink connector configuration.
- sink_metrics - Kafka sink connector metrics.
- source - Kafka source connector implementation.
- watermarks - Kafka watermark integration.
Functions§
- register_kafka_sink - Registers the Kafka sink connector with the given registry.
- register_kafka_source - Registers the Kafka source connector with the given registry.