| Name | Description | Values |
|------|-------------|--------|
| name | Name of the connector | <name_connector> |
| connector.class | Connector class that handles the integration with Snowflake | com.onibex.connect.datalake.jdbc.OnibexSnowflakeSinkConnector |
| connection.url | JDBC URL for the Snowflake database, specifying the database, warehouse, and schema | jdbc:snowflake://<Server_Hostname>:443/?db=<Database_Name>&warehouse=<COMPUTE_WH>&schema=<Schema_Name> |
| connection.user | Snowflake user | <User_snow> |
| connection.password | Snowflake password | <Password> |
| connection.privateKey | Private key used for key-pair authentication with Snowflake | <PrivateKey> |
| connection.privateKeyPassphrase | Passphrase for the private key used in Snowflake authentication | <PrivateKeyPassphrase> |
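Assembled into a connector configuration, the connection block might look like the following sketch (properties format; every angle-bracketed value is a placeholder for your environment, and typically only one of password or key-pair authentication is supplied):

```properties
name=<name_connector>
connector.class=com.onibex.connect.datalake.jdbc.OnibexSnowflakeSinkConnector
connection.url=jdbc:snowflake://<Server_Hostname>:443/?db=<Database_Name>&warehouse=<COMPUTE_WH>&schema=<Schema_Name>
connection.user=<User_snow>
# Password authentication:
connection.password=<Password>
# Or key-pair authentication (omit the mechanism you do not use):
connection.privateKey=<PrivateKey>
connection.privateKeyPassphrase=<PrivateKeyPassphrase>
```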
| Name | Description | Values |
|------|-------------|--------|
| tasks.max | Maximum number of tasks the connector will run in parallel | Positive integer ≥ 1 |
| batch.size | Number of records to batch together in a single SQL transaction, when possible | Positive integer ≥ 1 |
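As an illustration only (the numbers below are assumptions, not tuned recommendations for this connector), these properties could be set as:

```properties
# Illustrative values, not recommendations; effective parallelism is
# also bounded by the number of partitions in the consumed topics.
tasks.max=4
# Up to 3000 records grouped into one SQL transaction, when possible.
batch.size=3000
```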
| Name | Description | Values |
|------|-------------|--------|
| table.name.format | Format string for the destination table name; ${topic} is replaced with the originating topic name | ${topic} |
| pk.mode | Where to find the primary key for the records being inserted | record_key |
| insert.mode | Insertion mode for the records | insert / upsert / update |
| delete.enabled | Enables deletion of records in the target database | true / false |
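A sketch of the write-behavior section, assuming upserts keyed on the Kafka record key:

```properties
# Destination table named after the originating topic.
table.name.format=${topic}
# The Kafka record key supplies the primary key; upsert and delete
# semantics rely on it to identify the affected row.
pk.mode=record_key
insert.mode=upsert
delete.enabled=true
```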
| Name | Description | Values |
|------|-------------|--------|
| auto.create | Allows automatic creation of tables if they do not exist | true / false |
| auto.evolve | Allows automatic evolution of tables when the schema changes | true / false |
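For example, to let the connector both create missing tables and alter them as the record schema evolves:

```properties
# Create the destination table if it does not exist.
auto.create=true
# Alter the table automatically when the record schema changes.
auto.evolve=true
```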
| Name | Description | Values |
|------|-------------|--------|
| topics | List of Kafka topics that this connector will consume | <Topic_Name> |
| Name | Description | Values |
|------|-------------|--------|
| key.converter | Converter for record keys, in Avro format | io.confluent.connect.avro.AvroConverter |
| value.converter | Converter for record values, in Avro format | io.confluent.connect.avro.AvroConverter |
| key.converter.schema.registry.url | URL of the Confluent Schema Registry for keys | http://<Ip_Host>:<Port> |
| value.converter.schema.registry.url | URL of the Confluent Schema Registry for values | http://<Ip_Host>:<Port> |
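A sketch of the topic and converter settings, with placeholder topic, host, and port values:

```properties
topics=<Topic_Name>
# Avro converters backed by the Confluent Schema Registry.
key.converter=io.confluent.connect.avro.AvroConverter
value.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://<Ip_Host>:<Port>
value.converter.schema.registry.url=http://<Ip_Host>:<Port>
```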
| Name | Description | Values |
|------|-------------|--------|
| transforms | Transformations applied to the data | ExtractTimestamp, InsertTimezone |
| transforms.ExtractTimestamp.type | Transformation type that adds the record timestamp to the value | org.apache.kafka.connect.transforms.InsertField$Value |
| transforms.ExtractTimestamp.timestamp.field | Field where the event timestamp will be inserted | timestamp |
| transforms.InsertTimezone.type | Transformation type that adds the timezone | org.apache.kafka.connect.transforms.InsertField$Value |
| transforms.InsertTimezone.static.field | Static field where the timezone will be inserted | timezone |
| transforms.InsertTimezone.static.value | Value written to the timezone field | America/Mexico_City |
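Assembled, the transform chain from the table above reads as follows; both aliases use Kafka Connect's stock InsertField SMT applied to the record value:

```properties
transforms=ExtractTimestamp,InsertTimezone
# Insert the record's event timestamp into a "timestamp" field.
transforms.ExtractTimestamp.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.ExtractTimestamp.timestamp.field=timestamp
# Insert a static "timezone" field with a fixed value.
transforms.InsertTimezone.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.InsertTimezone.static.field=timezone
transforms.InsertTimezone.static.value=America/Mexico_City
```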
| Field | Type | Ordinal | Description |
|-------|------|---------|-------------|
| raw_message_header | Varchar | 4 | Kafka message headers in text format |
| raw_message_info | Varchar | 1 | Additional information or metadata about the message; typically contains identification or type data |
| raw_message_key | Varchar | 2 | Kafka message key in text format, aiding record searches and joins |
| raw_message_timestamp | Number | 5 | Timestamp of the message, stored as a number to facilitate temporal ordering and filtering |
| raw_message_value | Varchar | 3 | Main content of the message in `String` format, representing the message body |
1. `raw_message_header` (Varchar): Stores any header information accompanying the Kafka message. This may include context data such as the event type, partition identifier, or additional metadata needed to interpret the message.
2. `raw_message_info` (Varchar): Holds supplementary information or metadata related to the message. Depending on the connector configuration, this could include application-specific identifiers or additional tags classifying the message.
3. `raw_message_key` (Varchar): Contains the message key in `String` format. This field is essential for unique message identification and for operations such as `UPSERT`, as it serves to uniquely identify each record in the Snowflake table.
4. `raw_message_timestamp` (Number): Stores the message's timestamp in numeric format. Useful for audits, ordering, and temporal filtering in Snowflake, enabling queries based on when the message was sent or processed.
5. `raw_message_value` (Varchar): The primary field storing the message content or value in `String` format. It represents the body of the message and holds the data transmitted from Kafka.