At a glance
WSO2 Streaming Integrator
A streaming data processing runtime that lets you integrate event streams and act on them. Its sophisticated, web-based IDE provides an enhanced developer experience.
Streaming Integrator allows you to connect any data source to any destination. It has 60+ prebuilt, production-grade connectors for sources and destinations such as Kafka, files, databases, and HTTP endpoints. The component offers a web-based IDE that makes it easy to build complex streaming data processing logic using a drag-and-drop graphical editor or a streaming SQL editor. Container-friendly by design, Streaming Integrator can be deployed in VM, Docker, and Kubernetes environments.
Provides a rich and agile developer experience along with a graphical drag-and-drop editor that enables faster development for non-technical users.
Native Support for Streaming Systems
Fully compatible with messaging systems such as Kafka and NATS, letting you leverage the full potential of your streaming data.
Provides advanced integration capabilities for developing robust data integration flows that process events from a wide variety of sources.
Easy and Simple Deployment
Achieve higher SLAs with small, lightweight, and easy-to-maintain deployments.
Includes over 60 pre-built connectors for well-known sources and destinations and out-of-the-box support for various transports, protocols, and data formats.
An overview of WSO2 Streaming Integrator
Consume streaming data and apply stream processing techniques to process it. Integrate the processed data with one or more destinations and trigger integrations.
Seamless Integration with WSO2 Micro Integrator
Trigger complex integration flows with WSO2 Micro Integrator based on the results yielded by processing streaming data. This makes it possible to build stateful integration flows and robust data processing pipelines that can both integrate data and act on it.
Real-time ETL and Streaming Data Integration
Use pre-built connectors to extract and load data by connecting to various databases, cloud storage services, and endpoints that communicate over different protocols and data formats. Siddhi’s sophisticated data and stream processing capabilities allow you to transform, enrich, correlate, analyze, cleanse, and aggregate data on the fly.
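As an illustrative sketch (stream names, the HTTP endpoint, and the schema are assumptions, not taken from the source), a minimal Siddhi application that cleanses and aggregates an incoming event stream on the fly might look like this:

```sql
@App:name('ProductionStatsApp')

-- Assumed source: JSON events arriving over HTTP
-- (URL, stream, and attribute names are illustrative).
@source(type='http', receiver.url='http://0.0.0.0:8006/production', @map(type='json'))
define stream ProductionStream (name string, amount double);

-- Log the aggregated results (any supported sink could be used instead).
@sink(type='log')
define stream ProductionTotalsStream (name string, totalAmount double);

-- Cleanse (drop non-positive readings) and aggregate per product
-- over a sliding one-minute window.
from ProductionStream[amount > 0]#window.time(1 min)
select name, sum(amount) as totalAmount
group by name
insert into ProductionTotalsStream;
```

The same pattern extends to enrichment (joining against a table) and correlation (pattern and sequence queries) without changing the overall structure of the application.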
Fully Compatible with Streaming Messaging Systems, such as Kafka and NATS
Streaming Integrator is designed to pair messaging-based streaming systems with advanced stream processing and data processing capabilities. It supports all major functionalities of Kafka and NATS out of the box, allowing you to develop complex scenarios with minimal time and effort.
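For example, bridging one Kafka topic to another with a filter in between can be expressed as a short Siddhi application (the topic names, server address, and schema below are illustrative assumptions):

```sql
@App:name('KafkaBridgeApp')

-- Assumed Kafka topic, bootstrap server, and event schema.
@source(type='kafka', topic.list='sensor-readings',
        bootstrap.servers='localhost:9092', group.id='si-group',
        threading.option='single.thread', @map(type='json'))
define stream SensorStream (deviceId string, temperature double);

-- Publish filtered events back to another Kafka topic.
@sink(type='kafka', topic='high-temperature-alerts',
      bootstrap.servers='localhost:9092', @map(type='json'))
define stream AlertStream (deviceId string, temperature double);

-- Forward only readings above the threshold.
from SensorStream[temperature > 90.0]
select deviceId, temperature
insert into AlertStream;
```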
A Sophisticated Web-based Development IDE
A state-of-the-art, web-based IDE for creating Siddhi Applications, using graphical drag-and-drop and streaming SQL editors, including smart editing, event replay and simulation, and debugging capabilities to support the complete development workflow.
Deploy in VM, Docker, or Kubernetes
Native support for Kubernetes with a Kubernetes Operator designed to provide a convenient way of deploying Streaming Integrator directly on a Kubernetes cluster. Streaming Integrator is container friendly by design with small image sizes and a low resource footprint.
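As a hedged sketch of what operator-based deployment looks like, a Siddhi application can be declared as a `SiddhiProcess` custom resource and applied with `kubectl` (the CRD version and field layout may differ between operator releases; the app itself is illustrative):

```yaml
apiVersion: siddhi.io/v1alpha2   # CRD version may vary by operator release
kind: SiddhiProcess
metadata:
  name: sample-app
spec:
  apps:
    - script: |
        @App:name('SampleApp')
        @source(type='http', receiver.url='http://0.0.0.0:8006/events', @map(type='json'))
        define stream InputStream (name string, amount double);
        @sink(type='log')
        define stream OutputStream (name string, amount double);
        from InputStream select * insert into OutputStream;
```

Applying this resource lets the operator create and manage the pods, services, and ingress needed to run the application on the cluster.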
Handles 100K TPS Without Data Loss with Just Two Nodes
Handles large loads with a simple deployment: just two nodes are enough to process data with high availability while eliminating the risk of data loss.
WSO2 Streaming Integrator in an Integration Environment
- Integrate streams of data published by external apps, streaming systems such as Kafka, and data at rest in databases and log files. Streaming Integrator lets you treat every data source as a stream.
- Load data into various destinations, such as data warehouses, legacy systems, HTTP endpoints, files, databases, message brokers, etc.
- Build complex business logic with the power of stream processing and act based on the results by executing complex integration flows through Micro Integrator.
- Expose processed and stored data via a REST API to be fetched on demand with ad hoc queries.
- Front streaming systems, such as Kafka, to realize the full potential of streaming data with advanced stream processing capabilities.
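To illustrate the ad hoc query point above: processed data stored in a table or aggregation can be fetched on demand through Streaming Integrator's store query REST API. The request below is a hedged sketch; the port, credentials, and the app and table names are assumptions, not from the source.

```shell
# Query a table defined in a deployed Siddhi app via the store query API
# (default HTTPS port and admin credentials assumed; names are illustrative).
curl -k -u admin:admin -X POST https://localhost:7443/stores/query \
  -H "Content-Type: application/json" \
  -d '{"appName": "ProductionStatsApp", "query": "from ProductionTotalsTable select *;"}'
```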