One Connect by Onibex (Delta Sharing - Databricks)
Onibex curates SAP table data in Databricks and shares it as Bronze, Silver, and Gold datasets via Delta Sharing.
This tool allows you to consume data shared via Delta Sharing and store it inside your own Databricks Unity Catalog.
Why is this important?
- Delta Sharing data is read-only
- To analyze, transform, or govern the data, you must import it into your own catalog
1. Connection (Data Provided by Onibex)
- Delta Sharing access token provided by the data owner
- Delta Sharing API URL
- Name of the shared dataset (Share)
- Schema inside the share
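The first two values map onto a standard Delta Sharing profile file; the share and schema names are passed later, when listing or loading tables. A minimal profile might look like this (all values are placeholders for the ones Onibex provides):

```json
{
  "shareCredentialsVersion": 1,
  "endpoint": "https://sharing.example.com/delta-sharing/",
  "bearerToken": "<token-from-Onibex>"
}
```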
2. Destination
- Target Unity Catalog
- Schema where imported tables will be stored

3. Action Buttons
- 🚀 Connect to Share: validates the connection and discovers tables
- 📥 Refresh Catalog: reloads the available shared tables
- 🏗️ Create Target Schema: creates the schema in Unity Catalog if missing

4. Process
Step 1: Enter Connection Details
- Paste the Delta Sharing token
- Paste the endpoint URL
- Enter the Share name
- Enter the Schema name
These values are provided by Onibex.
Step 2: Connect to Delta Sharing
Click 🚀 Connect to Share
✔ What happens:
- Token and endpoint are validated
- Share and schema are verified
- Available tables are discovered
If successful, the catalog is automatically refreshed.
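Under the hood, this step is equivalent to what the open-source delta-sharing Python client does: write a profile file from the token and endpoint, then list the tables in the share. A minimal sketch, assuming the `delta-sharing` package is installed; the endpoint, token, and file names below are placeholders:

```python
import json
import os
import tempfile
from pathlib import Path

def write_profile(path, endpoint, token):
    """Build and persist a Delta Sharing profile file for the client."""
    profile = {
        "shareCredentialsVersion": 1,
        "endpoint": endpoint,
        "bearerToken": token,
    }
    Path(path).write_text(json.dumps(profile))
    return profile

def list_shared_tables(profile_path):
    """Validate the connection and discover all tables in the share."""
    import delta_sharing  # deferred import: pip install delta-sharing
    client = delta_sharing.SharingClient(profile_path)
    return client.list_all_tables()

# Placeholder values -- substitute the ones provided by Onibex.
profile_path = os.path.join(tempfile.gettempdir(), "onibex.share")
profile = write_profile(profile_path,
                        "https://sharing.example.com/delta-sharing/",
                        "<token-from-Onibex>")
# tables = list_shared_tables(profile_path)  # requires a live endpoint
```

If the token or endpoint is wrong, `list_all_tables()` fails with an authentication or connection error, which is what the 🚀 Connect to Share validation surfaces.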
Step 3: Review Available Tables
After connection, the interface displays:
📚 TABLES AVAILABLE IN DELTA SHARING
Each entry in this list is a shared table that can be imported into your catalog.
Step 4: Create Target Schema (First Time Only)
Click 🏗️ Create Target Schema
✔ This creates the target schema in your Unity Catalog if it does not already exist.
⚠️ This step is required before saving tables.
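In SQL terms, this button likely amounts to a one-time DDL statement; the catalog and schema names below are placeholders:

```sql
-- Create the target schema once, before any tables are saved.
CREATE SCHEMA IF NOT EXISTS my_catalog.onibex_import;
```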
Step 5: Load a Table from Delta Sharing
For each table you want to import:
- Click 📥 Refresh Catalog

The system:
- Reads data from Delta Sharing
- Loads it temporarily into memory
- Displays row and column counts
Step 6: Save the Table into Unity Catalog
Click 💾 Save to Catalog
✔ What happens: the data loaded in memory is written as a managed Delta table in the target catalog and schema.
✔ Result: the table now lives in your own Unity Catalog, where it can be analyzed, transformed, and governed.
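Steps 5 and 6 together are a read-then-write round trip. A minimal sketch, assuming a Databricks notebook (where `spark` is defined) with the `delta-sharing` package installed; every catalog, schema, share, and table name is a placeholder:

```python
def qualified_name(catalog, schema, table):
    """Three-part Unity Catalog name for the managed copy."""
    return f"{catalog}.{schema}.{table}"

def import_shared_table(spark, profile_path, share, schema, table,
                        target_catalog, target_schema):
    """Copy one read-only shared table into your own Unity Catalog."""
    import delta_sharing  # deferred import: pip install delta-sharing

    # Step 5: load the shared table into memory (pandas DataFrame).
    coordinate = f"{profile_path}#{share}.{schema}.{table}"
    pdf = delta_sharing.load_as_pandas(coordinate)
    print(f"Loaded {len(pdf)} rows x {len(pdf.columns)} columns")

    # Step 6: persist it as a managed Delta table you can govern.
    target = qualified_name(target_catalog, target_schema, table)
    spark.createDataFrame(pdf).write.mode("overwrite").saveAsTable(target)
    return target

# Example (requires a live share and a Databricks session):
# import_shared_table(spark, "/dbfs/tmp/onibex.share",
#                     "onibex_share", "gold", "sales",
#                     "my_catalog", "onibex_import")
```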
Step 7: Verify Imported Tables
Click 📋 View My Catalog
This displays the tables you have already imported into your Unity Catalog schema.
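The same check can be run directly in SQL; the names below are placeholders:

```sql
-- List the tables imported so far into the target schema.
SHOW TABLES IN my_catalog.onibex_import;
```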
Delta Sharing
- Secure data sharing mechanism
- Token-based authentication
- Read-only access
- Data does not appear automatically in Unity Catalog
Unity Catalog
- Central governance layer in Databricks
- Stores managed Delta tables
- Supports permissions, lineage, and auditing
Related Articles
One Connect - Pricing Manual
How One Connect Pricing Works: One Connect uses a T-shirt sizing model: Small, Medium, and Large. Each component of the solution is quoted independently based on the size that best matches the data volume, processing load, or consumption of the ...
Onibex Databricks JDBC Connector for Confluent Cloud
JDBC Onibex Connector for Databricks The JDBC Onibex connector for Databricks sends real-time data from Kafka to write into live DeltaLake tables. Idempotent writes can be achieved using upserts. Automatic table creation and schema evolution are ...
Onibex Clickhouse Sink Connector
The Onibex Clickhouse JDBC connector sends real-time data from Kafka to write to Tables based on the topics subscription. It is possible to achieve idempotent writes with upserts. Auto-creation of tables and auto-evolution is supported using the ...
AWS - EKS EC2 One Connect Deployment Manual with Terraform
This manual provides step-by-step instructions for deploying an Amazon EKS (Elastic Kubernetes Service) cluster using Terraform. It includes the configuration of essential components such as AWS credentials, infrastructure provisioning, EBS CSI ...
One Connect Platform - Functionalities
Introduction The table below outlines key features of the OneConnect platform, showcasing out of the box functionalities that support user management, system monitoring, SAP connectivity, and real-time data operations. No Function Description 1 ...