Kafka Destination Plugin

Latest: v3.0.1

This destination plugin lets you sync data from a CloudQuery source to Kafka in various formats such as CSV and JSON. Each table is pushed to a separate topic.

Example

This example configures a Kafka destination with no authentication and pushes messages in JSON format. Note that the plugin only supports the append write mode.

The (top level) spec section is described in the Destination Spec Reference.

kind: destination
spec:
  name: "kafka"
  path: "cloudquery/kafka"
  version: "v3.0.1"
  write_mode: "append" # only supports 'append'

  spec:
    brokers: ["<broker-host>:<broker-port>"]
    format: "json"
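
After saving the configuration (for example as kafka.yml), you can run a sync with the CloudQuery CLI. The source file name below is a placeholder for your own source configuration:

cloudquery sync source.yml kafka.yml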

Plugin Spec

This is the (nested) plugin spec.

  • brokers (list(string)) (required)

    List of brokers to connect to.

  • sasl_username (string) (optional)

    If connecting via SASL/PLAIN, the username to use.

  • sasl_password (string) (optional)

    If connecting via SASL/PLAIN, the password to use.

  • verbose (bool) (optional)

    If true, the plugin will log all underlying Kafka client messages to the log.

  • format (string) (required)

    Format of the output messages. json and csv are supported.

  • format_spec (map format_spec) (optional)

    Optional parameters to change the format of the messages.

format_spec

  • delimiter (string) (optional) (default: ,)

    Character that will be used as the delimiter (when format is csv).

  • skip_header (bool) (optional) (default: false)

    Specifies whether to skip the header row (when format is csv).
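
For reference, here is a sketch of a nested spec that combines SASL/PLAIN authentication with CSV output and a format_spec. The broker address and credentials are placeholders:

spec:
  brokers: ["<broker-host>:<broker-port>"]
  sasl_username: "<username>"
  sasl_password: "<password>"
  format: "csv"
  format_spec:
    delimiter: ","
    skip_header: false
  verbose: false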