One Connect Manual in Azure
Establishment of the Database and Requirements
Prerequisites
For the SQL database, the following is recommended:
- Download the sql.zip file (Database Deployment)
Deployment Steps
1. Extract the sql.zip File
Before proceeding, extract the sql.zip file by running the following command:
Navigate to the extracted folder:
cd sql
2. Edit the .env File
The .env file contains the necessary variables for the database setup. Edit this file as needed before running the script.

For deployment in Kubernetes, set the variable KUBERNETES_MODE to true.
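A minimal sketch of what the .env file might contain. Apart from KUBERNETES_MODE, every variable name and value below is an assumption for illustration; the file shipped inside sql.zip is authoritative:

```shell
# Illustrative .env sketch - actual variable names may differ
DB_TYPE=postgres          # assumed: database engine selector
DB_HOST=localhost         # assumed: database host
DB_PORT=5432              # assumed: database port
DB_USER=oneconnect        # assumed: database user
DB_PASSWORD=changeme      # assumed: database password
KUBERNETES_MODE=false     # set to true for Kubernetes deployments
```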
3. Run the run-sql.sh Script
Before executing the script, assign execution permissions:
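Assuming run-sql.sh sits in the current sql folder, the permission change looks like this:

```shell
# Make the deployment script executable
chmod +x run-sql.sh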
Run the Script
This process loads the variables from the .env file, verifies the database type and the corresponding container, and executes the SQL scripts to create and populate the database.

If you are unable to execute the script, the usual cause is Windows-style line endings. Install dos2unix, convert the script, and run it again:
sudo apt-get install dos2unix   # Debian/Ubuntu
sudo yum install dos2unix       # RHEL/CentOS
dos2unix run-sql.sh
./run-sql.sh