One Connect - Pricing Manual

How One Connect Pricing Works

One Connect uses a T-shirt sizing model: Small, Medium, and Large. Each component of the solution is quoted independently based on the size that best matches the data volume, processing load, or consumption of the customer's SAP environment.

How to Use the Pricing Calculator (Excel)

Onibex provides a pricing Excel file divided into four blocks, one for each One Connect component. The process is as follows (a scripted equivalent of the subtotal logic appears after the note below):

  1. Identify the correct size (t-shirt) for each component based on the criteria described in this article.
  2. Fill in only the cells marked in blue in each block. These cells correspond to the Quantity field for the selected size.
  3. The spreadsheet automatically calculates the subtotal for each block from the size and quantity entered.
Note: Only fill in the blue cells (Quantity field). The formulas are already configured and will calculate the subtotal automatically.
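
For teams that want to reproduce the calculator outside Excel, here is a minimal Python sketch of the same subtotal logic. The unit prices below are placeholders, not Onibex list prices; the real figures live in the pricing Excel.

```python
# Minimal sketch of the calculator's subtotal logic.
# UNIT_PRICE values are placeholders, not Onibex list prices;
# the real figures live in the pricing Excel.
UNIT_PRICE = {
    "data_modeler":            {"Small": 1_000, "Medium": 2_500, "Large": 5_000},
    "smart_gateway":           {"Small": 1_500, "Medium": 3_000, "Large": 6_000},
    "data_streaming_platform": {"Small": 1_200, "Medium": 2_800, "Large": 5_500},
    "connectors":              {"Small":   800, "Medium": 1_800, "Large": 3_500},
}

def block_subtotal(block: str, size: str, quantity: int) -> int:
    """Subtotal for one block: unit price for the chosen size times quantity."""
    return UNIT_PRICE[block][size] * quantity

# Example: one Medium Smart Gateway plus two Small connectors.
quote = (
    block_subtotal("smart_gateway", "Medium", 1)
    + block_subtotal("connectors", "Small", 2)
)
print(quote)
```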

The Three Sizes

All components use the same three standard sizes. The criteria for choosing each size vary by component:

  • Small: Smaller-scale environments, low data volumes, or a reduced number of integrations.
  • Medium: Mid-scale environments, moderate volumes, and higher concurrency.
  • Large: High-scale environments, large SAP databases, and high event throughput.

Block 1: Data Modeler

The One Connect SAP Data Modeler is the low-code/no-code tool that defines SAP entities (tables, CDS Views, documents) and extracts them to the Smart Gateway.

How is the size defined?

The size is selected based on the total size of the customer's SAP database (a small helper implementing these bands follows the list):

  • Small (0 TB - 2 TB): Smaller SAP databases (small environments or subsidiaries).
  • Medium (2 TB - 5 TB): Mid-sized SAP databases (regional operations or multi-module deployments).
  • Large (5 TB+): Large-scale SAP databases (enterprise environments with high transactional volume).
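
The bands above can be expressed as a tiny helper. Treating exactly 2 TB and 5 TB as the top of the smaller band is an assumption; confirm boundary cases with Sales.

```python
def data_modeler_size(db_size_tb: float) -> str:
    """Map total SAP database size (TB) to the Data Modeler t-shirt size,
    following the 0-2 TB / 2-5 TB / 5+ TB bands above. Boundary handling
    (exactly 2 TB or 5 TB falling in the smaller band) is an assumption."""
    if db_size_tb <= 2:
        return "Small"
    if db_size_tb <= 5:
        return "Medium"
    return "Large"

print(data_modeler_size(3.2))  # -> "Medium"
```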

Block 2: Smart Gateway

The One Connect Smart Gateway receives data from the Data Modeler, translates it into Kafka Schemas and Topics, and serializes it in Avro format. It is built on Kubernetes.

This component is available in two deployment modes: SaaS (managed by Onibex) and Bring Your Own Cloud (deployed on the customer's infrastructure).

How is the size defined?

The size is determined by two main factors (see the sketch after this list):

  • Volume of records processed per second (SAP event throughput).
  • Number of active entities and Data Products running simultaneously.
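
A minimal sketch of how the two factors could combine: size each factor independently and take the larger result, since either one can drive load on the Kubernetes cluster. The numeric thresholds below are placeholders, not official Onibex sizing bands.

```python
# Illustrative only: the throughput and entity-count thresholds are
# placeholders, not official Onibex bands. Confirm the real bands with Sales.
SIZES = ["Small", "Medium", "Large"]

def smart_gateway_size(records_per_sec: int, active_entities: int) -> str:
    """Pick the larger of the two single-factor sizes, since either factor
    can drive resource consumption on the cluster."""
    by_throughput = ("Small" if records_per_sec < 500
                     else "Medium" if records_per_sec < 5_000 else "Large")
    by_entities = ("Small" if active_entities < 10
                   else "Medium" if active_entities < 50 else "Large")
    return max(by_throughput, by_entities, key=SIZES.index)

print(smart_gateway_size(records_per_sec=1_200, active_entities=8))  # Medium
```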

For a detailed sizing guide based on Kubernetes resources (CPU, memory, replicas), refer to the following article:


Note: The Smart Gateway has differentiated pricing based on the deployment mode (SaaS vs. Bring Your Own Cloud).

Block 3: Data Streaming Platform

The Data Streaming Platform is the Kafka-based messaging layer that connects the Smart Gateway with the final destinations (Databricks, Snowflake, HANA, etc.).

Note: This component applies only in SaaS mode. If the customer uses their own Kafka platform (Bring Your Own Cloud or self-managed cloud), this block does not apply to the quote.

How is the size defined?

The size is determined by the data volume and the number of topics/partitions required (a rough partition-count heuristic follows the list):

  • Small: A small number of topics and low message throughput.
  • Medium: Moderate flows, several SAP modules integrated simultaneously.
  • Large: High messages-per-second volume, multiple active sources and destinations.
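
As a rough capacity illustration (not an Onibex formula), partition counts are often estimated from peak throughput divided by a measured per-partition rate, plus headroom:

```python
import math

# Rough capacity heuristic, not an Onibex formula: assume each Kafka partition
# sustains ~1,000 messages/sec for this workload (measure your own baseline).
MSGS_PER_PARTITION_PER_SEC = 1_000

def partitions_needed(peak_msgs_per_sec: int, headroom: float = 1.5) -> int:
    """Estimate partition count from peak throughput plus growth headroom."""
    return math.ceil(peak_msgs_per_sec * headroom / MSGS_PER_PARTITION_PER_SEC)

print(partitions_needed(4_000))  # -> 6 partitions with 50% headroom
```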

Block 4: Connectors (Premium Kafka Connectors)

The One Connect Premium Kafka Connectors are Confluent Verified connectors that move data from Kafka to the analytics and AI destinations.

Like the Smart Gateway, connectors are available in both SaaS and Bring Your Own Cloud modes.

How is the size defined?

The size is defined by the connector's consumption volume, that is, the amount of data the connector must write to the destination (a toy illustration of idempotent UPSERT writes follows the note below):

  • Small: Few active Data Products and low write volume to the destination.
  • Medium: Multiple Data Products, frequent writes with UPSERT support.
  • Large: High volume of idempotent writes, multiple destinations active simultaneously.
Note: Connectors have differentiated pricing based on the deployment mode (SaaS vs. Bring Your Own Cloud).
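
To make the "idempotent writes" criterion concrete, here is a toy sketch of UPSERT semantics: re-delivering the same record leaves the destination unchanged, which is what lets a connector retry safely at high volume. The in-memory table stands in for a real destination.

```python
# Toy model of UPSERT semantics at a destination table keyed by primary key.
# Re-delivering the same record is a no-op, which is why idempotent writes
# let a connector retry safely at high volume.
destination: dict[str, dict] = {}

def upsert(record: dict, key_field: str = "id") -> None:
    """Insert the record, or overwrite the existing row with the same key."""
    destination[record[key_field]] = record

upsert({"id": "4711", "status": "OPEN"})
upsert({"id": "4711", "status": "OPEN"})    # retry: state is unchanged
upsert({"id": "4711", "status": "CLOSED"})  # update: row is overwritten
print(destination)  # {'4711': {'id': '4711', 'status': 'CLOSED'}}
```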

Ready to Get Your Quote?

Once you have identified the correct t-shirt size for each component, reach out to the Onibex Sales Team to request your official quote. To speed up the process, please have the following information ready (the sketch after this list shows one way to capture it):

  • SAP database size (GB/TB) and version (ECC or S/4HANA).
  • Estimated volume of records processed per second (event throughput).
  • Number of Data Products and SAP modules to be integrated.
  • Preferred deployment mode for each component (SaaS or Bring Your Own Cloud).
  • Target destination platform (Databricks, Snowflake, and/or ClickHouse).
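
One convenient way to collect this checklist before contacting Sales is a simple structure like the sketch below; the field names are illustrative, not an official Onibex format.

```python
from dataclasses import dataclass

@dataclass
class QuoteRequest:
    """Sizing inputs the Onibex Sales Team asks for. Field names are
    illustrative placeholders, not an official Onibex format."""
    sap_db_size_tb: float
    sap_version: str                 # "ECC" or "S/4HANA"
    records_per_sec: int             # estimated event throughput
    data_products: int
    sap_modules: list[str]
    deployment_mode: dict[str, str]  # component -> "SaaS" / "BYOC"
    destinations: list[str]          # e.g. ["Databricks", "Snowflake"]

request = QuoteRequest(
    sap_db_size_tb=3.5,
    sap_version="S/4HANA",
    records_per_sec=1_200,
    data_products=12,
    sap_modules=["SD", "MM", "FI"],
    deployment_mode={"smart_gateway": "SaaS", "connectors": "BYOC"},
    destinations=["Databricks"],
)
```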
Tip: Contact our Sales Team at contact@onibex.com and we will guide you through the sizing process and prepare a tailored proposal for your organization.