
Kinetica Joins Confluent Partner Program and Releases Confluent Certified Connector for Apache Kafka®

This guest post is written by Chris Prendergast, VP of Business Development and Alliances at Kinetica.

Today, we’re excited to announce that we have joined the Confluent Partner Program and completed development and certification of our Apache Kafka® connector. The connector lets you read and write data directly between Kafka and Kinetica’s GPU-accelerated, in-memory analytics database, so you can ingest real-time data streams from Apache Kafka and take immediate action on incoming data.

Joint customers can now ingest streaming data from sensors, mobile apps, connected devices, and social media into Kinetica, combine it with data at rest, and analyze it in real time to improve customer experience, deliver targeted marketing offers, and increase operational efficiency.

The Certified Kinetica Connector enables you to:

  • Easily leverage Kinetica’s GPU-accelerated, in-memory analytics database with Kafka for streaming analytics, so you can power real-time decision making.
  • Gain powerful insights by using Kinetica for machine learning, deep learning, and OLAP on real-time, streaming data.
  • Develop robust data integration based on the Kafka Connect API.
  • Build stream processing applications with the Kafka Streams API (see the sketch after this list).
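
As a concrete illustration of the last point, here is a minimal sketch of a Kafka Streams application that pre-processes events before they reach a sink connector. The topic names (`sensor-events`, `sensor-events-clean`), application ID, and broker address are placeholders for illustration only, not part of the Kinetica connector itself.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class SensorStreamApp {

    public static void main(String[] args) {
        // Basic Streams configuration; application id and broker address are placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sensor-stream-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read raw sensor events, drop empty records, and write the rest to the
        // topic that a downstream sink connector is configured to consume.
        KStream<String, String> events = builder.stream("sensor-events");
        events.filter((key, value) -> value != null && !value.isEmpty())
              .to("sensor-events-clean");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Shut the Streams application down cleanly on exit.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```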

The source code for the connector is available here: https://github.com/kineticadb/kinetica-connector-kafka

The connector provides two classes that integrate the Kinetica database with Kafka:

  • KineticaSourceConnector: A Kafka source connector, which receives a data stream from the Kinetica database via a table monitor. Data is streamed in flat Kafka Connect “Struct” format with one field for each table column. A separate Kafka topic is created for each configured database table.
  • KineticaSinkConnector: A Kafka sink connector, which receives a data stream from a Kafka source connector and writes it to the Kinetica database. Streamed data must be in a flat Kafka Connect “Struct” that uses only supported data types for fields (BYTES, FLOAT64, FLOAT32, INT32, INT64, and STRING); see the sketch after this list. No translation is performed on the data; it is streamed directly into a table. The target table and collection will be created if they do not exist.
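
To make the expected record shape concrete, the sketch below builds a flat Kafka Connect Struct using only the supported field types. The schema and field names (`sensor_reading`, `device_id`, and so on) are illustrative examples, not names dictated by the connector; each field maps one-to-one onto a column of the target table.

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

public class FlatStructExample {

    public static void main(String[] args) {
        // A flat schema: one field per target table column, using only the
        // field types the sink connector accepts.
        Schema schema = SchemaBuilder.struct().name("sensor_reading")
                .field("device_id", Schema.STRING_SCHEMA)    // STRING
                .field("event_time", Schema.INT64_SCHEMA)    // INT64 (epoch millis)
                .field("temperature", Schema.FLOAT64_SCHEMA) // FLOAT64
                .field("battery", Schema.FLOAT32_SCHEMA)     // FLOAT32
                .field("signal", Schema.INT32_SCHEMA)        // INT32
                .field("payload", Schema.BYTES_SCHEMA)       // BYTES
                .build();

        // A record in that schema; no nesting, so it streams straight into a table row.
        Struct record = new Struct(schema)
                .put("device_id", "sensor-42")
                .put("event_time", System.currentTimeMillis())
                .put("temperature", 21.5)
                .put("battery", 0.87f)
                .put("signal", -63)
                .put("payload", new byte[] {0x01, 0x02});

        System.out.println(record);
    }
}
```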

The Kinetica connector can be deployed into any Confluent cluster from the Control Center GUI or from the command line using the Kafka Connect REST API. The Kafka Connect API ensures fault-tolerant integration between the Kafka topic stream and the Kinetica database. For example, retailers can use the Kinetica connector to capture real-time, streaming geospatial data from shoppers’ mobile phones as Kafka streams, combine it with customer loyalty data in Kinetica, and push out targeted, personalized, location-based offers through mobile apps.
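
For the command-line route, a sink instance is registered by POSTing a JSON configuration to a Connect worker’s REST endpoint (`/connectors`, default port 8083). The sketch below does this with Java’s built-in HTTP client; the fully qualified connector class name and the `kinetica.*` property keys are assumptions based on typical connector layouts, so check the connector’s README for the exact keys your version supports.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DeploySinkConnector {

    public static void main(String[] args) throws Exception {
        // Connector submission payload for the Kafka Connect REST API.
        // The connector class and the Kinetica-specific property names are
        // illustrative placeholders; verify them against the connector docs.
        String payload = """
                {
                  "name": "kinetica-sink",
                  "config": {
                    "connector.class": "com.kinetica.kafka.KineticaSinkConnector",
                    "tasks.max": "1",
                    "topics": "sensor-events-clean",
                    "kinetica.url": "http://kinetica-host:9191",
                    "kinetica.collection_name": "kafka"
                  }
                }
                """;

        // POST the configuration to a Connect worker (default REST port 8083).
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```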

You can now seamlessly add Kinetica to your scalable and secure stream data pipelines. Kinetica’s GPU-accelerated, distributed, in-memory analytics database provides truly real-time response to queries on large, complex, and streaming data sets.

Combined with the Confluent enterprise-grade streaming data platform, this powerful solution will help you capitalize on streaming data to power real-time decision making and drive business results.
