Onibex Kafka Connector APP - Snowflake Native APP


The Onibex Kafka Connector App is a Snowflake Native Application that provides a fully integrated framework for managing Snowflake connectors in Confluent Cloud directly from Snowflake. It allows users to create, delete, and update connectors using the official Onibex Snowflake Sink Connector, simplifying end-to-end data streaming management.

1. Application Architecture Overview

The application is composed of several logical components that work together to ensure secure, reliable, and efficient operation:

  • Streamlit-based User Interface:
    Delivers an intuitive and interactive interface within Snowflake, enabling users to easily create, update, and delete Confluent connectors without leaving the Snowflake environment.

  • Snowpark Core Layer:
    Implements the core business logic of the application, validating configurations and managing the interaction between Snowflake and Confluent Cloud through the Onibex Kafka Connector. This layer ensures operational consistency, reliability, and transparency.

  • External Access Integrations (EAIs):
    Provide secure and controlled communication between Snowflake and Confluent Cloud. These integrations safeguard credentials, enforce strict access controls, and maintain compliance with security standards.

  • Configuration and Metadata Storage:
    Stores all connector-related metadata, configuration parameters, and environment details in Snowflake tables such as core.app_config. This design enables centralized management, persistence, and auditability of all connector operations.

This architecture achieves an optimal balance between automation, security, and maintainability, empowering organizations to efficiently manage Confluent connectors directly within Snowflake while leveraging the robustness of the Onibex platform.

2. Setup Wizard Functionalities

The Onibex Kafka Connector App includes a guided setup wizard designed to simplify onboarding and ensure that all required configurations and integrations between Snowflake and Confluent Cloud are properly established.
The process consists of eight intuitive steps, enabling users to securely and efficiently manage their Snowflake connectors in Confluent Cloud.

1. Terms and Conditions
Users must review and accept the Onibex licensing terms and responsibilities before proceeding with the installation and configuration process.

2. Cloud Provider Selection
Allows users to specify their preferred Confluent Cloud deployment environment (AWS, Azure, or GCP), ensuring compatibility with their existing infrastructure.

3. Confluent API Access
Configures authentication by securely registering the user’s Confluent Cloud API key and secret within Snowflake using an External Access Integration (EAI). This step validates access and establishes a secure communication channel with Confluent Cloud.

4. Environment Selection
Enables users to select the target Confluent environment in which Snowflake connectors will be created, updated, or deleted.

5. Kafka Cluster Configuration
Automatically retrieves the available Kafka clusters from the selected environment, validates user permissions, and verifies connectivity between Snowflake and Confluent Cloud through the Onibex connector.

6. Schema Registry Setup
Integrates with the Confluent Schema Registry when required, enabling schema-based data management for compatible connectors and ensuring reliable data serialization workflows.

7. Database and Warehouse Configuration
Requests and validates the required Snowflake account-level privileges to create database objects. Users define the database, schema, and warehouse that will be used to store and process Kafka data. Once approved, the wizard automatically creates these objects in Snowflake.

8. Custom Connector Plugin Deployment
Handles the secure deployment of the Onibex Snowflake Sink Connector plugin. This step requests the necessary license and cloud storage permissions, generates provider-specific presigned URLs, and uploads the connector plugin to Confluent Cloud. Upon successful deployment, the plugin is registered and made available for connector creation.

Once the setup is completed, all configuration details are securely stored within Snowflake, allowing users to access and manage their connectors in future sessions without repeating the setup process.
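
As a concrete illustration of step 3, Confluent Cloud's REST APIs accept the registered API key and secret as HTTP Basic credentials. A minimal sketch (the helper name and the placeholder key/secret are illustrative):

```python
import base64

def confluent_auth_header(api_key: str, api_secret: str) -> dict:
    """Build the HTTP Basic Authorization header Confluent Cloud's REST APIs expect."""
    token = base64.b64encode(f"{api_key}:{api_secret}".encode("utf-8")).decode("ascii")
    return {"Authorization": f"Basic {token}"}

# Placeholder credentials; real values come from the wizard's secure registration step.
hdr = confluent_auth_header("MYKEY", "MYSECRET")
```

In the app itself the key and secret never appear in plain text; they are resolved at runtime from Snowflake Secret References through the EAI.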

3. Connector Management Interface

The Connector Management Interface in the Onibex Kafka Connector App provides complete visibility and control over all Snowflake connectors deployed in Confluent Cloud, allowing users to efficiently monitor, configure, and maintain their streaming integrations directly from Snowflake.

  • Consolidated View:
    Displays all active and inactive connectors in a single, unified interface. Key attributes such as connector name and status are clearly presented for quick reference.

  • Detailed View:
    Offers an in-depth overview of each connector, including configuration parameters and status updates retrieved directly from Confluent Cloud via the Onibex integration layer.

  • Search and Filtering:
    Enables users to quickly locate specific connectors by name, environment, or cluster, streamlining navigation and simplifying management at scale.

  • Edit and Maintenance:
    Provides full lifecycle management capabilities, allowing users to update configurations or permanently delete connectors. All operations are executed securely through Snowpark functions interacting with Confluent APIs.

  • Real-Time Refresh:
    Ensures that connector information displayed in the interface always reflects the latest operational state in Confluent Cloud. Updates are fetched dynamically to maintain accuracy and reliability.
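
Status refreshes like these are served by the Confluent Cloud Connect REST API. A sketch of the endpoint URL such a lookup would target (the environment, cluster, and connector identifiers are placeholders):

```python
BASE = "https://api.confluent.cloud"

def connector_status_url(env_id: str, cluster_id: str, connector: str) -> str:
    """Confluent Cloud Connect API path for a single connector's status."""
    return (f"{BASE}/connect/v1/environments/{env_id}"
            f"/clusters/{cluster_id}/connectors/{connector}/status")

# Placeholder IDs in the shape Confluent uses (env-..., lkc-...).
url = connector_status_url("env-abc123", "lkc-xyz789", "onibex-sink")
```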

This interface empowers users to manage all aspects of their Snowflake-to-Confluent connectivity from within the Snowflake environment, minimizing context switching while maintaining full administrative control.

4. Connector Creation and Configuration

The Connector Configuration Interface in the Onibex Kafka Connector App enables the creation and modification of Snowflake connectors in Confluent Cloud with full control over connection parameters. It offers two modes to accommodate different user preferences: Guided and Advanced.

Guided Mode

In Guided Mode, users can easily configure a new connector through a structured form without manually editing JSON configurations. Required parameters include:

  • Connector Name

  • Kafka Topic

  • Snowflake Database, Warehouse, and Schema

  • Snowflake Authentication Method

  • Endpoints retrieved from SYSTEM$ALLOWLIST()

This mode automatically:

  • Inserts the Onibex Kafka Connector license, ensuring proper operation.

  • Includes the plugin ID corresponding to the connector provided by Onibex.

  • Performs real-time validation to confirm all parameters are compatible with both Snowflake and Confluent Cloud before deployment.
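
Conceptually, Guided Mode collapses the form fields into the JSON payload sent to Confluent. A simplified sketch, assuming illustrative property names (the Onibex license key shown is not the connector's actual property name; `confluent.custom.plugin.id` follows Confluent's custom-connector convention):

```python
def build_guided_config(name, topic, database, schema, warehouse, plugin_id, license_key):
    """Assemble a connector configuration from Guided Mode form fields (illustrative keys)."""
    required = {"name": name, "topic": topic, "database": database,
                "schema": schema, "warehouse": warehouse}
    missing = [k for k, v in required.items() if not v]
    if missing:
        raise ValueError(f"missing required parameters: {missing}")
    return {
        "name": name,
        "config": {
            "topics": topic,
            "snowflake.database": database,
            "snowflake.schema": schema,
            "snowflake.warehouse": warehouse,
            "confluent.custom.plugin.id": plugin_id,  # plugin registered during setup step 8
            "onibex.license": license_key,            # inserted automatically by the app
        },
    }
```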

Advanced Mode

Advanced Mode allows users to paste a complete JSON configuration for full customization and flexibility. This mode is ideal for advanced users who need to fine-tune connector behavior or deploy complex configurations. It also:

  • Automatically embeds the Onibex license and plugin ID.

  • Performs syntax and compatibility checks in real time to prevent misconfiguration.

  • Validates essential parameters to ensure the connector can be successfully deployed in Confluent Cloud.

Additional Features

  • General Configuration: Define connector name, associated Kafka topics, and number of parallel tasks.

  • Snowflake Target Parameters: Specify target database, warehouse, schema, and automatically generate the JDBC URL.

  • Endpoint Configuration: Transform JSON data from SYSTEM$ALLOWLIST() into validated endpoint formats for secure connection setup.

  • Authentication: Supports OAuth as the default authentication method, with private key (PEM) authentication as an option.

  • Advanced Settings: Configure transaction behavior, batch sizes, deletion rules, and table naming conventions.
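
The endpoint transformation above can be pictured as follows: SYSTEM$ALLOWLIST() returns a JSON array of {type, host, port} entries, which are flattened into host:port pairs for the connector. A sketch with sample data (real output comes from Snowflake):

```python
import json

def allowlist_to_endpoints(allowlist_json: str) -> list[str]:
    """Flatten SYSTEM$ALLOWLIST() output into host:port strings."""
    endpoints = []
    for entry in json.loads(allowlist_json):
        host, port = entry.get("host"), entry.get("port")
        if host and port:
            endpoints.append(f"{host}:{port}")
    return endpoints

# Sample payload in the shape SYSTEM$ALLOWLIST() produces; hostnames are placeholders.
sample = json.dumps([
    {"type": "SNOWFLAKE_DEPLOYMENT", "host": "myorg-myacct.snowflakecomputing.com", "port": 443},
    {"type": "STAGE", "host": "sfc-stage.s3.amazonaws.com", "port": 443},
])
print(allowlist_to_endpoints(sample))
```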

This dual-mode design allows both novice and expert users to create and manage connectors efficiently while maintaining security, compliance, and reliability.

5. Schema Registry and Data Format Handling

The Onibex Kafka Connector App includes intelligent schema and format handling capabilities to ensure seamless data integration between Snowflake and Confluent Cloud (Kafka).

  • Automatic Format Detection:
    Automatically detects the message format (Avro, JSON, or String) and applies the appropriate Kafka converter.

  • Schema Registry Verification:
    Validates Schema Registry credentials and network connectivity before schema-based processing is enabled.

  • Compatibility Checks:
    Ensures that incoming data conforms to expected serialization structures, preventing schema drift or misalignment between Snowflake and Kafka.
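
The format-to-converter mapping can be sketched as a lookup onto the standard Kafka Connect converter classes (the detection logic itself is simplified away here):

```python
# Standard Kafka Connect converter classes for each supported message format.
CONVERTERS = {
    "AVRO":   "io.confluent.connect.avro.AvroConverter",
    "JSON":   "org.apache.kafka.connect.json.JsonConverter",
    "STRING": "org.apache.kafka.connect.storage.StringConverter",
}

def converter_for(fmt: str) -> str:
    """Return the Kafka Connect converter class for a detected message format."""
    try:
        return CONVERTERS[fmt.upper()]
    except KeyError:
        raise ValueError(f"unsupported format: {fmt}") from None
```

The Avro converter is the schema-aware path; selecting it is what triggers the Schema Registry credential and connectivity checks described above.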

This system guarantees robust, schema-aware data flow while minimizing manual configuration and reducing the risk of serialization errors.

6. Security and Permissions Model

The Onibex Kafka Connector App enforces a strong security framework to protect both Snowflake and Confluent Cloud environments, ensuring compliance, data privacy, and operational integrity.

  • Secure Credential Management:
    All credentials are managed through Snowflake Secret References, keeping sensitive information encrypted and isolated.

  • Controlled Network Access:
    External Access Integrations (EAIs) strictly limit outbound communication to pre-approved Confluent Cloud endpoints.

  • Encrypted Data Transfers:
    All communication between Snowflake and Confluent Cloud uses TLS encryption to maintain data confidentiality.

  • License Protection:
    The Onibex license is securely stored, encrypted, and validated at runtime to prevent unauthorized use of the application.

  • Data Isolation:
    The entire app operates within the user’s Snowflake account, ensuring complete separation from Onibex infrastructure and maintaining tenant-level data isolation.

This security model ensures trust, compliance, and integrity across all operations, allowing organizations to safely manage Confluent connectors from within Snowflake.

7. Multi-Cloud Plugin Deployment

The Onibex Kafka Connector App supports multi-cloud plugin deployment, enabling users to deploy connectors seamlessly across AWS, Azure, and GCP environments.

Each provider utilizes its respective External Access Integration:

  • AWS: S3_EXTERNAL_ACCESS

  • Azure: AZURE_EXTERNAL_ACCESS

  • GCP: GCP_EXTERNAL_ACCESS
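
The provider-to-integration selection above amounts to a simple lookup; a sketch:

```python
# External Access Integration per cloud provider, as listed above.
EAI_BY_PROVIDER = {
    "AWS":   "S3_EXTERNAL_ACCESS",
    "AZURE": "AZURE_EXTERNAL_ACCESS",
    "GCP":   "GCP_EXTERNAL_ACCESS",
}

def integration_for(provider: str) -> str:
    """Select the External Access Integration for the chosen cloud provider."""
    eai = EAI_BY_PROVIDER.get(provider.upper())
    if eai is None:
        raise ValueError(f"unsupported provider: {provider}")
    return eai
```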

Connector plugins are stored in Snowflake stages and registered directly within the application, enabling:

  • Version control and rollback support for plugin updates.

  • Secure and auditable access to connector artifacts.

  • Cross-cloud compatibility, simplifying deployment across heterogeneous infrastructures.

This capability ensures consistent and flexible connector management in any cloud environment, maintaining the same reliability and governance standards across all platforms.

8. Full Endpoint Connections

The core.get_configuration(ref_name STRING) procedure provides configuration and access control information for external service endpoints used by the Onibex Kafka Connector App.
It plays a critical role in managing secure network connections between Snowflake and external systems such as Confluent Cloud, Kafka clusters, the Schema Registry, cloud storage services, and the Onibex license server.


General Functionality

  • Returns a JSON configuration object containing:

    • host_ports – The list of URLs or endpoints that the application is authorized to access.

    • allowed_secrets – Indicates whether credentials are permitted for each endpoint (ALL or NONE).

  • Serves as a configuration callback for secure references such as:
    CONFLUENT_CLOUD_API_EXTERNAL_ACCESS,
    AZURE_EXTERNAL_ACCESS,
    AWS_EXTERNAL_ACCESS,
    GCP_EXTERNAL_ACCESS,
    and other external integrations managed by the Onibex Kafka Connector App.

This design ensures that only approved and validated endpoints are used, maintaining full compliance with Snowflake’s External Access Integration (EAI) security model.

Key Endpoints and Reference Mapping

  • CONFLUENT_CLOUD_API_EXTERNAL_ACCESS
    Access to the Confluent Cloud API.
    Host(s): api.confluent.cloud. Allowed secrets: ALL.

  • KAFKA_CLUSTER_EXTERNAL_ACCESS
    Network access to the Kafka cluster (extracted from the core.app_config table).
    Host(s): variable, depending on the configured cluster. Allowed secrets: ALL.

  • SCHEMA_REGISTRY_EXTERNAL_ACCESS
    Access to the Confluent Schema Registry.
    Host(s): variable, depending on the configured Schema Registry. Allowed secrets: ALL.

  • AZURE_EXTERNAL_ACCESS
    Access to presigned URLs in Azure Blob Storage.
    Host(s): byocprodcentralus.blob.core.windows.net. Allowed secrets: ALL.

  • ONIBEX_CONNECTOR_LICENSE
    Access to the Onibex license service.
    Host(s): apigateway.onibex2.com. Allowed secrets: ALL.

  • GCP_EXTERNAL_ACCESS
    Access to presigned URLs in Google Cloud Storage.
    Host(s): storage.googleapis.com. Allowed secrets: ALL.

  • AWS_EXTERNAL_ACCESS
    Access to presigned URLs in AWS S3 (extracted from core.app_config).
    Host(s): variable, depending on the configured bucket. Allowed secrets: ALL.

  • S3_EXTERNAL_ACCESS
    Direct access to multiple S3 buckets.
    Host(s): e.g., *.s3.dualstack.us-east-1.amazonaws.com, *.s3.dualstack.eu-west-1.amazonaws.com. Allowed secrets: NONE.

Access Credentials

The following credentials are used to authenticate and authorize secure communication with external Confluent components:

  • CONFLUENT_CLOUD_API_CREDENTIALS

  • KAFKA_CLUSTER_API_CREDENTIALS

  • SCHEMA_REGISTRY_API_CREDENTIALS

Type: PASSWORD
These credentials are securely managed using Snowflake Secret References, ensuring encrypted storage and controlled runtime access.

Additional Notes

  • Fixed URLs restrict network access to approved domains, preventing unauthorized external communication.

  • Dynamic endpoints are resolved at runtime from the core.app_config table based on the user’s connector configuration.

  • The procedure returns a standardized JSON object containing:

    • type – Defines the configuration type.

    • payload – Contains the detailed endpoint and secret configuration.
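
For a fixed-endpoint reference such as CONFLUENT_CLOUD_API_EXTERNAL_ACCESS, the returned object would look roughly like this. A minimal sketch, assuming the standard Snowflake configuration-callback shape (the actual procedure also resolves dynamic endpoints from core.app_config, which is omitted here):

```python
import json

def get_configuration(ref_name: str) -> str:
    """Sketch of the core.get_configuration callback for fixed-endpoint references."""
    known = {
        "CONFLUENT_CLOUD_API_EXTERNAL_ACCESS":
            {"host_ports": ["api.confluent.cloud"], "allowed_secrets": "ALL"},
        "S3_EXTERNAL_ACCESS":
            {"host_ports": ["*.s3.dualstack.us-east-1.amazonaws.com"], "allowed_secrets": "NONE"},
    }
    payload = known.get(ref_name)
    if payload is None:
        raise ValueError(f"unknown reference: {ref_name}")
    return json.dumps({"type": "CONFIGURATION", "payload": payload})

print(get_configuration("CONFLUENT_CLOUD_API_EXTERNAL_ACCESS"))
```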

This endpoint management model ensures security, flexibility, and compliance, providing a robust foundation for all network interactions within the Onibex Kafka Connector App.


Azure OAuth Integration with Snowflake – Generic Guide

This guide explains how to configure Azure Active Directory (Azure AD) OAuth authentication in Snowflake to allow external systems (such as Kafka connectors or other services) to authenticate securely.

All values in this guide are generic placeholders and must be replaced with your organization’s actual configuration.
The steps focus on what is being configured and why, not on specific credentials.

Prerequisites

Before starting, ensure the following prerequisites are met:

Azure Active Directory

  • An Azure AD application configured for OAuth authentication

  • A service principal associated with the application

  • OAuth tokens issued by Azure AD that include a stable user identifier claim

Snowflake

  • Access to a Snowflake account with the ACCOUNTADMIN role

  • An existing warehouse, database, and schema (or a plan to create them)

  • A clear definition of which Snowflake objects external systems should be allowed to access

Step 1: Connect to the Target Snowflake Account

Connect to the Snowflake account where the OAuth integration will be configured and switch to the administrative role required for security configuration.

USE ROLE ACCOUNTADMIN;

Optionally, confirm the current account and region to ensure changes are applied in the correct environment.

SELECT CURRENT_ACCOUNT_NAME();
SELECT CURRENT_REGION();

Step 2: Create the Azure OAuth Security Integration

Create an External OAuth Security Integration in Snowflake.
This integration establishes trust between Snowflake and Azure AD, allowing Snowflake to validate OAuth tokens issued by Azure.

Key concepts configured in this step:

  • The external identity provider (Azure AD)

  • The token issuer and audience

  • How token claims are mapped to Snowflake users

  • Whether OAuth-authenticated users can assume roles dynamically

CREATE OR REPLACE SECURITY INTEGRATION <OAUTH_INTEGRATION_NAME>
TYPE = EXTERNAL_OAUTH
ENABLED = TRUE
EXTERNAL_OAUTH_TYPE = AZURE
EXTERNAL_OAUTH_ISSUER = '<AZURE_OAUTH_ISSUER_URL>'
EXTERNAL_OAUTH_JWS_KEYS_URL = '<AZURE_JWKS_ENDPOINT>'
EXTERNAL_OAUTH_AUDIENCE_LIST = ('<APPLICATION_ID_URI>')
EXTERNAL_OAUTH_TOKEN_USER_MAPPING_CLAIM = '<TOKEN_CLAIM>'
EXTERNAL_OAUTH_SNOWFLAKE_USER_MAPPING_ATTRIBUTE = '<SNOWFLAKE_USER_ATTRIBUTE>'
EXTERNAL_OAUTH_ANY_ROLE_MODE = 'ENABLE'
COMMENT = 'Azure AD OAuth integration';

Verify the integration configuration:

DESC SECURITY INTEGRATION <OAUTH_INTEGRATION_NAME>;

Step 3: Create a Dedicated Snowflake Role

Create a role that represents the permissions granted to OAuth-authenticated users.
This role defines what actions external systems are allowed to perform in Snowflake.

CREATE ROLE IF NOT EXISTS <OAUTH_ROLE_NAME>;

Step 4: Grant Required Permissions to the Role

Grant the minimum set of permissions required for the integration to function correctly.

OAuth Integration Usage

Allow the role to use the OAuth security integration:

GRANT USAGE ON INTEGRATION <OAUTH_INTEGRATION_NAME>
TO ROLE <OAUTH_ROLE_NAME>;

Warehouse Permissions

Grant access to the warehouse used for data processing:

GRANT USAGE ON WAREHOUSE <WAREHOUSE_NAME>
TO ROLE <OAUTH_ROLE_NAME>;

GRANT OPERATE ON WAREHOUSE <WAREHOUSE_NAME>
TO ROLE <OAUTH_ROLE_NAME>;

Database and Schema Permissions

Grant access to the database and schema where data will be written or queried:

GRANT USAGE ON DATABASE <DATABASE_NAME>
TO ROLE <OAUTH_ROLE_NAME>;

GRANT USAGE ON SCHEMA <DATABASE_NAME>.<SCHEMA_NAME>
TO ROLE <OAUTH_ROLE_NAME>;

GRANT CREATE TABLE ON SCHEMA <DATABASE_NAME>.<SCHEMA_NAME>
TO ROLE <OAUTH_ROLE_NAME>;

Table-Level Access

Grant permissions on both existing and future tables:

GRANT SELECT, INSERT, UPDATE, DELETE
ON ALL TABLES IN SCHEMA <DATABASE_NAME>.<SCHEMA_NAME>
TO ROLE <OAUTH_ROLE_NAME>;

GRANT SELECT, INSERT, UPDATE, DELETE
ON FUTURE TABLES IN SCHEMA <DATABASE_NAME>.<SCHEMA_NAME>
TO ROLE <OAUTH_ROLE_NAME>;

Review the assigned privileges:

SHOW GRANTS TO ROLE <OAUTH_ROLE_NAME>;

Step 5: Create the OAuth-Mapped Snowflake User

Create a Snowflake user that represents the external identity (for example, an Azure service principal).
This user will not authenticate with a password; instead, it will be mapped to an OAuth token.

CREATE USER IF NOT EXISTS <OAUTH_USER_NAME>
LOGIN_NAME = '<EXTERNAL_IDENTITY_IDENTIFIER>'
DEFAULT_ROLE = <OAUTH_ROLE_NAME>
DEFAULT_WAREHOUSE = <WAREHOUSE_NAME>
DEFAULT_NAMESPACE = '<DATABASE_NAME>.<SCHEMA_NAME>'
DEFAULT_SECONDARY_ROLES = ('ALL')
MUST_CHANGE_PASSWORD = FALSE
COMMENT = 'User mapped to external OAuth identity';

Validate the user configuration:

DESC USER <OAUTH_USER_NAME>;

Step 6: Assign the Role to the User

Associate the previously created role with the OAuth-mapped user.

GRANT ROLE <OAUTH_ROLE_NAME>
TO USER <OAUTH_USER_NAME>;

Confirm the role assignment:

SHOW GRANTS TO USER <OAUTH_USER_NAME>;

Step 7: End-to-End Verification

Verify that all components are correctly configured and connected:

DESC SECURITY INTEGRATION <OAUTH_INTEGRATION_NAME>;
SHOW GRANTS ON INTEGRATION <OAUTH_INTEGRATION_NAME>;

SHOW GRANTS TO ROLE <OAUTH_ROLE_NAME>;
SHOW GRANTS OF ROLE <OAUTH_ROLE_NAME>;

DESC USER <OAUTH_USER_NAME>;

Outcome

After completing these steps:

  • Snowflake trusts Azure AD as an OAuth identity provider

  • OAuth tokens are mapped to Snowflake users

  • Users assume a controlled role with scoped permissions

  • External systems can securely authenticate and interact with Snowflake without storing credentials
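
As a concrete illustration, an external system would obtain its token from Azure AD with a client-credentials request before connecting to Snowflake. A sketch of the request URL and form body, assuming the Azure AD v2.0 token endpoint (tenant, client, and Application ID URI values are placeholders; the `/.default` scope suffix follows Azure AD's convention for application permissions):

```python
from urllib.parse import urlencode

def token_request(tenant_id: str, client_id: str, client_secret: str, app_id_uri: str):
    """Build the Azure AD v2.0 client-credentials token request (URL + form body)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Application ID URI from the integration's EXTERNAL_OAUTH_AUDIENCE_LIST
        "scope": f"{app_id_uri}/.default",
    })
    return url, body

url, body = token_request("<TENANT_ID>", "<CLIENT_ID>", "<CLIENT_SECRET>", "api://my-app")
```

The access token returned by this request carries the claim configured in EXTERNAL_OAUTH_TOKEN_USER_MAPPING_CLAIM, which Snowflake matches against the mapped user's attribute.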


