One Connect Cloud Deployment
Prerequisites
Download the required .zip files attached at the end of the document:
- sql.zip
- one-connect.zip
- kafka-compose.zip
Requirements for the Virtual Machine
- System: Linux
- Architecture: 64-bit processors (x86_64) support
- Instance Specifications: 4 vCPUs, 16 GiB of RAM, network bandwidth up to 12.5 Gbps
- Storage Specifications: 280 GiB of disk capacity, 6000 IOPS (Input/Output Operations Per Second)
- Network Configuration: The following ports must be enabled in the security group to ensure the correct operation of the associated services:
- Kafka Ports:
- 9093 – Broker
- 9102 – Broker
- 8086 – Schema Registry
- 8083 – Connect
- 8085 – Control Center
- 8089 – KsqlDB
- One Connect Ports:
- 5050 – Frontend
- 9000 – API Gateway
- 7070 – Metrics
- 7072 – Logs
- 50501 – Producer
- Database Ports:
- 3306 – MySQL
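As an illustration only, a host-level firewall could be opened for these ports as in the sketch below (this assumes ufw on an Ubuntu host; in the cloud, the equivalent inbound rules must still be created in the instance's security group):
# Sketch: open the required ports with ufw (complements, not replaces, security-group rules)
sudo ufw allow 9093,9102,8086,8083,8085,8089/tcp   # Kafka components
sudo ufw allow 5050,9000,7070,7072,50501/tcp       # One Connect services
sudo ufw allow 3306/tcp                            # MySQL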
Database creation
Initial conditions
The Onibex OneConnect Cloud platform has a number of microservices that must persist for the workspaces' operation. Although workspaces can run standalone, certain functionalities of the platform provide additional value to the customer.
The services are Java Spring projects that establish a JDBC connection, implementing JPA and Hibernate for SQL queries.
It is suggested to use MySQL as the SQL database.
Note: Theoretically, the JPA standard and the Hibernate implementation make it possible to use any supported database, such as PostgreSQL or SQL Server. Although the platform has not been tested with these databases, they should work properly. It is important to take this into account during implementation.
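As a minimal sketch of what this portability means in practice, switching databases amounts to changing the JDBC URL and driver. The property keys below are standard Spring Boot datasource settings; the host, database name, and credentials are placeholders, not values shipped with the platform:
# Standard Spring Boot datasource properties; values are illustrative placeholders
spring.datasource.url=jdbc:mysql://<db-host>:3306/<database>
spring.datasource.username=<user>
spring.datasource.password=<password>
# A PostgreSQL setup would, in principle, only swap the URL and driver:
# spring.datasource.url=jdbc:postgresql://<db-host>:5432/<database>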
Prerequisites
- Download the following sql.zip file: https://oneconnectdeploymentaks.s3.us-east-2.amazonaws.com/database/sql.zip
Deployment Steps
1. Extract the sql.zip file
Before proceeding, extract the sql.zip file and enter the extracted folder (the folder name below assumes the archive extracts to sql):
unzip sql.zip
cd sql
2. .env File configuration
The .env file contains the variables required for database setup. Edit this file as needed before running the script.

For deployment in Docker, set the variable KUBERNETES_MODE to false.
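A minimal sketch of the relevant entry (KUBERNETES_MODE is the only variable named here; keep any other keys as shipped in the .env file):
# Illustrative .env excerpt for a Docker deployment
KUBERNETES_MODE=false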
3. Run the run-sql.sh Script
Before executing the script, assign execution permissions:
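chmod +x run-sql.sh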
Run the script:
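./run-sql.sh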
This process loads the variables from the .env file, verifies the database type and the corresponding container, and executes the SQL scripts that create and populate the database.

If you are unable to execute the script, make sure dos2unix is installed and then run the script again. You can install it (use the command that matches your distribution) and convert the script using the following commands:
sudo apt-get install dos2unix   # Debian/Ubuntu
sudo yum install dos2unix       # RHEL/CentOS
dos2unix setup_db.sh
./setup_db.sh
Initial Credentials
Administrator:
Password: mzve$JQ@bg#zWmiDC3G$Jt
Default user:
Password: mzve$JQ@bg#zWmiDC3G$Jt
Preparation
Download the kafka-compose folder: https://oneconnectdeploymentaks.s3.us-east-2.amazonaws.com/oneconnectdocker/kafka-compose-latest.zip
It contains the files necessary for deploying Kafka and its components. Inside, you will find:
- docker-compose.yml file
- setup_kafka_dirs.sh script
- .env file
- control-center folder
If you want to check the configured credentials, you can review the password.properties file located in the control-center folder.
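Entries in that file follow the property-file format used by Control Center basic authentication, along the lines of the sketch below (the user, password, and role shown are placeholders, not the shipped credentials):
# Illustrative password.properties entry: "user: password,role"
admin: <password>,Administrators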
.env File configuration
Configure the .env file by entering the address of the server where Kafka will be deployed.
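A minimal sketch of the expected entry (the variable name shown is an assumption for illustration; edit the key already defined in the shipped .env rather than adding a new one):
# Illustrative .env excerpt; replace with the actual address of the Kafka server
KAFKA_HOST=<Server_IP>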
Once the files are configured, transfer the kafka-compose folder to the desired location on the server.
Run the script
Inside the kafka-compose folder, grant execution permissions to the setup_kafka_dirs.sh script using the following command:
chmod +x setup_kafka_dirs.sh
Run the script to create the volume folders and deploy Kafka using Docker Compose with the following command:
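./setup_kafka_dirs.sh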
Access to Control Center
Access Control Center from your preferred browser by entering the IP address of the server where Kafka was deployed, followed by port 9022, using the following format:
<Server_IP>:9022
Enter the credentials configured in the password.properties file.
Once authenticated, you will be able to view all deployed Kafka components.
Requirements:
- Deployed Confluent Platform
- Configured Database
Deployment configuration
The one-connect-compose folder (from the one-connect.zip download) includes three internal folders:
- Frontend
- Platform
- Workspace
And the following files:
- .env file
- run_compose.sh script
.env File configuration
Before uploading the folder to the server, it is necessary to configure the .env file, which contains the environment variables required for the system’s operation. It is recommended to use a code editor to complete this task.
To enable email services, you must configure the following variables with the details of your SMTP server:
- EMAIL_PASSWORD
- EMAIL_SMTPAUTH
- EMAIL_SMTPFROM
- EMAIL_SMTPHOST
- EMAIL_SMTPPORT
- EMAIL_SMTPSSLPROTOCOLS
- EMAIL_SMTPSTARTTLSENABLE
- EMAIL_USER
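For example, a STARTTLS-based setup might look like the following sketch (all values are placeholders; substitute the details of your own SMTP provider):
# Illustrative SMTP configuration; every value below is a placeholder
EMAIL_USER=notifications@example.com
EMAIL_PASSWORD=<smtp-password>
EMAIL_SMTPAUTH=true
EMAIL_SMTPFROM=notifications@example.com
EMAIL_SMTPHOST=smtp.example.com
EMAIL_SMTPPORT=587
EMAIL_SMTPSSLPROTOCOLS=TLSv1.2
EMAIL_SMTPSTARTTLSENABLE=true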
Run the run_compose.sh Script
Upload the one-connect-compose folder, with the configured files, to the destination server.
Once the folder is on the server, navigate to its directory.
Run the following command to grant execution permissions to the script:
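chmod +x run_compose.sh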
Start the deployment of the containers by running:
./run_compose.sh
Access to the interface
Wait about a minute for all containers to start up completely. Then, open a browser and enter the IP address of the server where OneConnect was deployed, followed by port 5050, using the following format:
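<Server_IP>:5050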
Initial Credentials
Administrator:
Password: mzve$JQ@bg#zWmiDC3G$Jt
Default user:
Password: mzve$JQ@bg#zWmiDC3G$Jt
If you are unable to execute the script, make sure dos2unix is installed and then run the script again. You can install it (use the command that matches your distribution) and convert the script using the following commands:
sudo apt-get install dos2unix   # Debian/Ubuntu
sudo yum install dos2unix       # RHEL/CentOS
dos2unix setup.sh
./setup.sh
Related Articles
- OneConnect Deployment and Configuration
- Onibex Databricks JDBC Connector for Confluent Cloud
- Onibex Snowflake Sink Connector for Confluent Platform and Cloud
- Establishment of the Database and Requirements
- OneConnect General Architecture