Learn how AO.com is enabling real-time, event-driven applications to improve customer experience using Confluent Platform. The introduction of Apache Kafka and Confluent Platform is supporting AO.com in modernizing its technical approach to delighting customers. A key part of this enablement is an event-streaming ecosystem that powers event-driven applications and architecture. Throughout this session, we'll look at the challenges AO.com faced when adopting Kafka, its use of Confluent Platform including Kafka Connect and KSQL, and its adoption of Confluent Cloud. We'll look at the first steps, where the team is now and what the future looks like.Watch Now
Organizations today are looking to stream IoT data to Apache Kafka. However, connecting tens of thousands or even millions of devices over unreliable networks can create some architecture challenges. In this session, we will identify and demo some best practices for implementing a large scale IoT system that can stream MQTT messages to Apache Kafka.Watch Now
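The buffering-and-retry pattern the session describes can be sketched in plain Python. This is a toy simulation, not the actual MQTT or Kafka client APIs: a gateway buffers device messages locally and retries delivery to a broker reached over an unreliable link, which is the core reason an MQTT-to-Kafka bridge needs local persistence.

```python
# Toy sketch (not the real MQTT/Kafka APIs): a gateway that buffers
# device messages and forwards them to a broker whose link sometimes
# drops, illustrating why large-scale IoT ingest needs buffering + retry.

class FlakyBroker:
    """Stands in for a Kafka cluster reached over an unreliable network."""
    def __init__(self, fail_every=3):
        self.log = []
        self._attempts = 0
        self._fail_every = fail_every

    def send(self, msg):
        self._attempts += 1
        if self._attempts % self._fail_every == 0:
            raise ConnectionError("link dropped")
        self.log.append(msg)

class Gateway:
    """Buffers MQTT-style messages and retries until the broker accepts them."""
    def __init__(self, broker):
        self.broker = broker
        self.buffer = []

    def publish(self, msg):
        self.buffer.append(msg)
        self.flush()

    def flush(self):
        while self.buffer:
            try:
                self.broker.send(self.buffer[0])
            except ConnectionError:
                return  # keep the message buffered; retry on the next flush
            self.buffer.pop(0)

broker = FlakyBroker()
gw = Gateway(broker)
for i in range(10):
    gw.publish(f"device-{i % 3}: reading {i}")
gw.flush()  # drain anything still buffered after transient failures
print(len(broker.log))
```

Despite roughly every third send attempt failing, all ten messages eventually land in the broker's log, in order, because nothing is dropped at the gateway.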
In this technical deep dive, we’ll discuss the proposition of Incremental Cooperative Rebalancing as a way to alleviate stop-the-world and optimize rebalancing in Kafka APIs.Watch Now
Integrating Apache Kafka with other systems in a reliable and scalable way is a key part of an event streaming platform. This session will show you how to get streams of data into and out of Kafka with Kafka Connect and REST Proxy, maintain data formats and ensure compatibility with Schema Registry and Avro, and build real-time stream processing applications with Confluent KSQL and Kafka Streams.
This session is part 4 of 4 in our Fundamentals for Apache Kafka series.Watch Now
Pick up best practices for developing applications that use Apache Kafka, beginning with a high level code overview for a basic producer and consumer. From there we’ll cover strategies for building powerful stream processing applications, including high availability through replication, data retention policies, producer design and producer guarantees, and more.
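The producer/consumer relationship the session's code overview starts from can be reduced to a minimal in-memory model. This is an illustrative sketch, not the real Kafka client API: a producer appends records to an append-only partition log, and a consumer tracks its own read offset, which is also the mechanism behind data retention and replay.

```python
# Minimal in-memory sketch (not the real Kafka client API) of the
# producer/consumer pattern: producers append to an append-only log,
# consumers keep their own offset into it.

class TopicPartition:
    def __init__(self):
        self.log = []              # append-only record log

    def append(self, value):
        self.log.append(value)
        return len(self.log) - 1   # offset assigned to the new record

class Consumer:
    def __init__(self, partition):
        self.partition = partition
        self.offset = 0            # next record this consumer will read

    def poll(self, max_records=10):
        records = self.partition.log[self.offset:self.offset + max_records]
        self.offset += len(records)
        return records

p = TopicPartition()
for v in ["a", "b", "c"]:
    p.append(v)

c = Consumer(p)
print(c.poll())   # ['a', 'b', 'c']
print(c.poll())   # [] -- the consumer has caught up with the log
```

Because the log is retained independently of any consumer's position, a second consumer created later could replay the same records from offset 0.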
This session is part 3 of 4 in our Fundamentals for Apache Kafka series.Watch Now
In this webinar, we give a live introduction to the new security components of Confluent Platform (Role-Based Access Control, RBAC, and Secret Protection) and cover best practices in this area.Watch Now
This session explains Apache Kafka’s internal design and architecture. Companies like LinkedIn are now sending more than 1 trillion messages per day to Apache Kafka. Learn about the underlying design in Kafka that leads to such high throughput.
This session is part 2 of 4 in our Fundamentals for Apache Kafka series.Watch Now
This talk explains how companies are using event-driven architecture to transform their business and how Apache Kafka serves as the foundation for streaming data applications. Learn how major players in the market are using Kafka in a wide range of use cases.
This session is part 1 of 4 in our Fundamentals for Apache Kafka series.Watch Now
In this Spanish-language webinar, we describe why Apache Kafka exists and what it is for. We'll look at some of its most common uses and analyze the three basic components of the platform: the immutable storage log, the real-time processing engine (Kafka Streams) and the framework that connects it to the outside world (Kafka Connect).Watch Now
Learn how Centene improved their ability to interact and engage with healthcare providers in real time with MongoDB and Confluent Platform.Watch Now
Leaders in organizations who are responsible for global supply chain planning need to work with and integrate data from disparate sources around the world. Many of these data sources output information in real-time, which assists planners in operationalizing plans and interacting with manufacturing output. This talk showcases different use cases in automation and Industrial IoT (IIoT) where an event streaming platform adds business value.Watch Now
This online talk dives into the new Verified Integrations Program and the integration requirements, the Connect API and sources and sinks that use Kafka Connect. We cover the verification steps and provide code samples created by popular application and database companies. We will discuss the resources available to support you through the connector development process.
Part 2 of 2 in Building Kafka Connectors - The Why and HowWatch Now
This online talk focuses on the key business drivers behind connecting to Kafka and introduces the new Confluent Verified Integrations Program. We’ll discuss what it takes to participate, the process and benefits of the program.
Part 1 of 2 in Building Kafka Connectors - The Why and HowWatch Now
This session shows how various sub-systems in Apache Kafka can be used to aggregate, integrate and attribute these signals into signatures of interest.Watch Now
Find yourself a shady spot and prick up your ears: Confluent Platform 5.3 is GA and we're excited about its many new features. Among other things, we'll talk about Confluent Operator (Kafka on Kubernetes), Ansible playbooks and the new UI of Confluent Control Center. Speaker: Kai Waehner, Technology Evangelist, Confluent.Watch Now
This talk provides a deep dive into the details of the rebalance protocol, starting from its original design in version 0.9 up to the latest improvements and future work. We discuss internal technical details, pros and cons of the existing approaches, and explain how you configure your client correctly for your use case. Additionally, we discuss configuration tradeoffs for stateless, stateful, on-prem, and containerized deployments.Watch Now
This talk discusses the key design concepts within Apache Kafka Connect and the pros and cons of standalone vs distributed deployment modes. We'll do a live demo of building pipelines with Apache Kafka Connect for streaming data in from databases, and out to targets including Elasticsearch. The talk will finish off by discussing more advanced topics including Single Message Transforms, and deployment of Kafka Connect in containers.Watch Now
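Kafka Connect pipelines like the ones in the demo are defined declaratively. The fragment below is an illustrative JDBC source connector configuration for streaming rows from a database into Kafka; the connector name, connection URL and column name are invented for the example, and real deployments would also configure credentials and converters.

```json
{
  "name": "jdbc-source-demo",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://db.example.com:3306/demo",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "mysql-"
  }
}
```

In distributed mode this JSON would be POSTed to the Connect REST API; in standalone mode the same keys would live in a properties file.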
This interactive whiteboard presentation discusses use cases leveraging the Apache Kafka® open source ecosystem as an event streaming platform to process IoT data. The session shows architectural alternatives of how devices like cars, machines or mobile devices connect to Apache Kafka via IoT standards like MQTT or OPC-UA.Watch Now
Apache Kafka® is used by thousands of companies across the world but, how difficult is it to operate? Which parameters do you need to set? What can go wrong? This online talk is based on real-world experience of Kafka deployments and explores a collection of common mistakes that are made when running Kafka in production and some best practices to avoid them.Watch Now
This online talk explores how Apache Druid and Apache Kafka® can turn a microservices ecosystem into a distributed real-time application with instant analytics. Apache Kafka and Druid form the backbone of an architecture that meets the demands imposed on the next-generation applications you are building right now.Watch Now
Real-time data has value. But how do you quantify that value in order to create a business case for becoming data, or event driven? This talk explores why valuing Kafka is important - but covers some of the problems in quantifying the value of a data infrastructure platform.Watch Now
This talk explores the benefits around cloud-native platforms and running Apache Kafka on Kubernetes, what kinds of workloads are best suited for this combination, and best practices to determine the path forward for legacy monoliths in your application portfolio.Watch Now
In this session, we will cover the easiest ways to start developing event-driven applications with Apache Kafka using Confluent Platform. We will also demo a contextual event-driven application built using our ecosystem of connectors, REST proxy, and a variety of native clients.Watch Now
In this online talk, our Kafka expert shows how easy it is to run Apache Kafka and Confluent Platform on Kubernetes.Watch Now
How Audi used Kafka and Confluent to implement a fast-data IoT platform that is revolutionizing the "Connected Car" domain.Watch Now
Confluent KSQL is the streaming SQL engine that enables real-time data processing against Apache Kafka®. It provides an easy-to-use, yet powerful interactive SQL interface for stream processing on Kafka, without the need to write code in a programming language such as Java or Python.Watch Now
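A minimal KSQL example gives a feel for the interface the abstract describes. The stream, topic and column names below are invented for illustration: first a stream is declared over an existing Kafka topic, then a continuous query filters it with plain SQL.

```sql
-- Illustrative KSQL sketch (stream and column names are hypothetical):
-- declare a stream over a Kafka topic, then filter it continuously.
CREATE STREAM pageviews (user_id VARCHAR, page VARCHAR)
  WITH (KAFKA_TOPIC='pageviews', VALUE_FORMAT='JSON');

SELECT user_id, page
  FROM pageviews
  WHERE page LIKE '/checkout%';
```

Unlike a query against a database table, this SELECT never terminates: it emits a result row for every matching event as it arrives on the topic.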
In this session, we'll compare the two approaches to data integration and show how Dataflow allows you to join, transform and deliver data streams among on-prem and cloud Apache Kafka clusters, Cloud Pub/Sub topics and a variety of databases.Watch Now
In this online talk, you will learn why, when facing Open Banking regulation and rapidly increasing transaction volumes, Nationwide decided to take load off their back-end systems through real-time streaming of data changes into Apache Kafka®.Watch Now
In this all too fabulous talk, we will address the wonders of KSQL vs. KStreams and how Ticketmaster uses both in production to reduce development friction in machine learning products.Watch Now
This talk looks at one of the most common integration requirements – connecting databases to Apache Kafka. We’ll consider the concept that all data is a stream of events, including that residing within a database. We’ll look at why we’d want to stream data from a database, the different methods for connecting databases to Apache Kafka, and the pros and cons of each.Watch Now
Express Scripts is reimagining its data architecture to bring best-in-class user experience and provide the foundation of next-generation applications. This online talk will showcase how Apache Kafka® plays a key role within Express Scripts’ transformation from mainframe to a microservices-based ecosystem, ensuring data integrity between two worlds.Watch Now
In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to GCP. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.Watch Now
This online talk includes in depth practical demonstrations of how Confluent and Panopticon together support several key financial services and IoT applications, including transaction cost analysis and risk monitoring.Watch Now
Detecting fraudulent activity in real time can save a business significant amounts of money, but has traditionally been an area requiring a lot of complex programming and frameworks, particularly at scale. Using KSQL, it's possible to use just SQL to build scalable real-time applications.Watch Now
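The kind of "just SQL" fraud detection the abstract mentions typically hinges on windowed aggregation. The sketch below is a hypothetical example, not code from the talk; the stream and column names are invented, and the threshold of 3 transactions per window is arbitrary.

```sql
-- Hypothetical KSQL fraud-detection sketch: flag cards with an unusually
-- high number of transactions inside a 5-minute tumbling window.
CREATE TABLE possible_fraud AS
  SELECT card_number, COUNT(*) AS txn_count
  FROM transactions
  WINDOW TUMBLING (SIZE 5 MINUTES)
  GROUP BY card_number
  HAVING COUNT(*) > 3;
```

The resulting table updates continuously as events arrive, so downstream systems can react to suspicious activity while it is happening rather than in a nightly batch.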
Join experts from Confluent, Arcadia Data and RCG for a discussion and demo on how companies are integrating streaming data technologies to transform their business. This talk will cover how to integrate real-time analytics and visualizations to drive business processes and how KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time.Watch Now
Insurance companies face the same challenges as other disrupted market segments: changing customer expectations, and with them the need to differentiate themselves anew as a brand in a challenging market environment. Learn how Generali Switzerland set up an event-driven architecture to support its digital transformation project.Watch Now
Curious about Apache Kafka®? Find out why you would want to use the de facto standard for real-time streaming, the easiest way to get started and how to leverage the extensive Apache Kafka ecosystem. In this chat, we'll demo how to implement real-time clickstream analytics on Confluent Cloud, fully managed Apache Kafka as a service.Watch Now
In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to AWS. By implementing a central-pipeline architecture using Apache Kafka to sync on-prem and cloud deployments, companies can accelerate migration times and reduce costs.Watch Now
Join us for a demo to learn how easy it is to integrate your Apache Kafka® streams in Apache Druid (incubating) to provide real-time insights into the data. In this online talk, you’ll hear about ingesting your Kafka streams into Imply’s scalable analytic engine and gaining real-time insights via a modern user interface.Watch Now
If you are familiar with banking processes, you will understand that this is not simple. Many banking processes are implemented as batch jobs on not-so-commodity hardware, meaning that any migration effort is immense.
Organizations are quickly adopting microservice architectures to achieve better customer service and improve user experience while limiting downtime and data loss. Learn from field experts as they discuss how to convert the data locked in traditional databases into event streams using HVR and Apache Kafka®.Watch Now
As streaming platforms become central to data strategies, companies both small and large are re-thinking their enterprise architecture with real-time context at the forefront. Monoliths are evolving into microservices. Datacenters are moving to the cloud. What was once a ‘batch’ mindset is quickly being replaced with stream processing as the demands of the business impose real-time requirements on technology leaders.Watch Now
With the evolution of data-driven strategies, event-based business models are influential in innovative organizations. As businesses look to expand traditional revenues, sourcing events from enterprise applications, mobile apps, IoT devices and social media in real time becomes essential to staying ahead of the competition.Watch Now
Modern streaming data technologies like Apache Kafka® and Confluent KSQL, the streaming SQL engine for Apache Kafka, can help companies catch and detect fraud in real time instead of after the fact. Combine this with Arcadia Data visualizations designed for modern data types, and you have a powerful foundation for combating fraud.Watch Now
There’s a prevailing enterprise perception that compliance with data protection regulations and standards is a burden: limiting the leverage of data. However, the core requirement of compliance—better control of data—has multiple downstream benefits. When compliance objectives are aligned with existing business objectives, the business can experience net gain.Watch Now
Confluent Co-founder Jun Rao discusses how Apache Kafka® became the predominant publish/subscribe messaging system that it is today, Kafka's most recent additions to its enterprise-level set of features and how to evolve your Kafka implementation into a complete real-time streaming data platform.Watch Now
See how Kinetica enables businesses to leverage the streaming data delivered with Confluent Platform to gain actionable insights, how to leverage the Kafka Connect API to integrate data sources and destinations without writing cumbersome code and a KSQL demo showcasing an end-to-end flow of the complete data pipeline from a live source, to KSQL and finally into Kinetica.Watch Now
Attend this webinar featuring Dave Menninger of Ventana Research to learn from the firm’s benchmark research about what streaming data is and why it is important. Guest speaker Joanna Schloss will join to discuss how event streaming platforms deliver real time actionability on data as it arrives into the business.Watch Now
In this online talk, Technology Evangelist Kai Waehner will discuss and demo how you can leverage technologies such as TensorFlow with your Kafka deployments to build a scalable, mission-critical machine learning infrastructure for ingesting, preprocessing, training, deploying and monitoring analytic models.Watch Now
Watch Lyndon Hedderly's keynote from Big Data Analytics London 2018.Watch Now
In this talk we will look at what event-driven systems are, how they provide a unique contract for services to communicate and share data, and how stream processing tools can be used to simplify the interaction between different services, be they closely coupled or largely disconnected.Watch Now
The ‘current state of stream processing’ walks through the origins of stream processing, applicable use cases and then dives into the challenges currently facing the world of stream processing as it drives the next data revolution.Watch Now
We’ll discuss how to leverage some of the more advanced transformation capabilities available in both KSQL and Kafka Connect, including how to chain them together into powerful combinations for handling tasks such as data-masking, restructuring and aggregations.
This is part 3 of 3 in Streaming ETL - The New Data Integration online talk series.Watch Now
In this talk, we'll build a streaming data pipeline using nothing but our bare hands, the Kafka Connect API and KSQL. We'll stream data in from MySQL, transform it with KSQL and stream it out to Elasticsearch. Options for integrating databases with Kafka using CDC and Kafka Connect will be covered as well.
This is part 2 of 3 in Streaming ETL - The New Data Integration series.Watch Now
Join experts from VoltDB and Confluent to see why and how enterprises are using Apache Kafka as the central nervous system in combination with VoltDB. We’ll walk through an application that leverages real-time data for machine learning in a scalable way.Watch Now
Capital One supports interactions with real-time streaming transactional data using Apache Kafka®. Kafka helps deliver information to internal operation teams and bank tellers to assist with assessing risk and protect customers in a myriad of ways. Join us for this online talk on lessons learned, best practices and technical patterns of Capital One’s deployment of Apache Kafka.Watch Now
In this online talk, Joe Beda, CTO of Heptio and co-creator of Kubernetes, and Gwen Shapira, principal data architect at Confluent and Kafka PMC member, will help you navigate through the hype, address frequently asked questions and deliver critical information to help you decide if running Kafka on Kubernetes is the right approach for your organization.Watch Now
Join Gwen Shapira, Apache Kafka® committer and co-author of "Kafka: The Definitive Guide," as she presents core patterns of modern data engineering and explains how you can use microservices, event streams and a streaming platform like Apache Kafka to build scalable and reliable data pipelines designed to evolve over time.
This is part 1 of 3 in Streaming ETL - The New Data Integration series.Watch Now
Join our two-part online talk series, Empowering Streams through KSQL, for a deep dive into this tool. Our experts will explain the architecture of the KSQL engine and show how to design and deploy interactive, continuous queries.Watch Now
Join our two-part online talk series, Empowering Streams through KSQL, for a deep dive into this tool. Our experts will explain the architecture of the KSQL engine and show how to design and deploy interactive, continuous queries.Watch Now
Join The New York Times' Director of Engineering Boerge Svingen to learn how the innovative news giant of America transformed the way it sources content while still maintaining searchability, accuracy and accessibility through a variety of applications and services—all through the power of a real-time streaming platform.Watch Now
This online talk gives a brief introduction to Apache Kafka and its use as a data streaming platform. We explain how Kafka serves as the foundation both for data pipelines and for applications that consume and process real-time data streams. We also show how Confluent Enterprise, Confluent's distribution of Apache Kafka, enables companies to centralize and simplify the operation, management and monitoring of a Kafka streaming platform.Watch Now
In this interactive discussion, the KSQL team will answer 10 of the toughest, most frequently asked questions about KSQL. These range from technical examples of managing streaming topics to practical applications and common use cases, such as market basket pattern identification and network monitoring patterns.Watch Now
In this webinar presented by Confluent and Attunity, discover how data streaming can be accelerated to enable real-time analytics on live data from many sources.Watch Now
In this joint online talk from Confluent and Attunity, we show how data delivery can be accelerated to enable real-time analytics on live data from many sources.Watch Now
Apache Kafka® lets companies make the most of real-time data. In this webinar you'll learn more about data streaming and how it can reduce development costs.Watch Now
In this talk, members of the Pinterest team offer lessons learned from their Confluent Go client migration and discuss their use cases for adopting Kafka Streams.Watch Now
In this session, Nick Dearden covers the planning and operation of your KSQL deployment, including under-the-hood architectural details. You will learn about the various deployment models, how to track and monitor your KSQL applications, how to scale in and out and how to think about capacity planning.
This is part 3 out of 3 in the Empowering Streams through KSQL series.Watch Now
Join us as we build a complete streaming application with KSQL. There will be plenty of hands-on action, plus a description of our thought process and design choices along the way. Look out for advice on best practices and handy tips and tricks as we go.
This is part 2 out of 3 in the Empowering Streams through KSQL series.Watch Now
This session covers the patterns and techniques of using KSQL. Tim Berglund discusses the various building blocks that you can use in your own applications, starting with the language syntax itself and covering how and when to use its powerful capabilities like a pro.
This is part 1 out of 3 in the Empowering Streams through KSQL series.Watch Now
In this presentation, we discuss best practices of monitoring Apache Kafka®. We look at which metrics are critical to alert on, which are useful in troubleshooting and what may actually be misleading. We review a few “worst practices” - common mistakes that you should avoid. We then look at what metrics don’t tell you - and how to cover those essential gaps.
This is part 5 out of 5 in the Best Practices for Apache Kafka in Production Confluent Online Talk Series.Watch Now
In this session, we discuss disaster scenarios that can take down entire Apache Kafka® clusters and share advice on how to plan, prepare and handle these events. This is a technical session full of best practices - we want to make sure you are ready to handle the worst mayhem that nature and auditors can cause.
This is part 4 out of 5 in the Best Practices for Apache Kafka in Production Confluent Online Talk Series.Watch Now
In this short session, we discuss the basic patterns of multi-datacenter Apache Kafka® architectures, explore some of the use cases enabled by each architecture and show how Confluent Enterprise products make these patterns easy to implement.
This is part 3 out of 5 in the Best Practices for Apache Kafka in Production Confluent Online Talk Series.Watch Now
In this session, we go over everything that happens to a message – from producer to consumer, and pinpoint all the places where data can be lost. You will learn how developers and operation teams can work together to build a bulletproof data pipeline with Apache Kafka.
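Several of the loss points covered in the session are closed by producer and topic configuration alone. The fragment below is an illustrative durability-first setup, not a recommendation from the talk; the values shown are common choices, and the right settings depend on your latency and throughput requirements.

```properties
# Illustrative producer settings for a durability-first pipeline:
acks=all                      # wait for all in-sync replicas to acknowledge
retries=2147483647            # retry transient send failures indefinitely
enable.idempotence=true       # prevent duplicates introduced by retries

# Matching broker/topic-side settings (set on the cluster or topic):
# replication.factor=3
# min.insync.replicas=2
# unclean.leader.election.enable=false
```

The producer and topic settings work as a pair: `acks=all` only guarantees durability when `min.insync.replicas` forces enough replicas to actually be in sync.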
This is part 2 out of 5 in the Best Practices for Apache Kafka in Production Confluent Online Talk Series.Watch Now
In this talk, Gwen Shapira describes the reference architecture of Confluent Enterprise, which is the most complete platform to build enterprise-scale streaming pipelines using Apache Kafka®. This talk is intended for data architects and system administrators planning to deploy Apache Kafka in production.
This is part 1 out of 5 in the Best Practices for Apache Kafka in Production Confluent Online Talk Series.Watch Now
Learn about the KSQL architecture and how to design and deploy interactive, continuous queries for streaming ETL and real-time analytics.Watch Now
Join the discussion on the relationship between microservices and stream processing with Data-Intensive Apps author Martin Kleppmann, Confluent engineers Damian Guy and Ben Stopford, chaired by Jay Kreps, co-founder and CEO, Confluent.Watch Now
Microservices guru Sam Newman, Buoyant CTO Oliver Gould and Apache Kafka® engineer Ben Stopford are joined by Jay Kreps, co-founder and CEO, Confluent for a Q&A session where they discuss and debate all things Microservices.Watch Now
Learn about the recent additions to Apache Kafka® to achieve exactly-once semantics (EoS) including support for idempotence and transactions in the Kafka clients. The main focus will be the specific semantics that Kafka distributed transactions enable and the underlying mechanics which allow them to scale efficiently.Watch Now
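The idempotence half of exactly-once semantics can be illustrated with a toy model. This is a simulation of the idea, not the actual Kafka implementation: the broker remembers the highest sequence number seen per producer and silently discards retried duplicates, so a retry after a lost acknowledgment cannot double-write.

```python
# Toy illustration (not the actual Kafka internals) of the idempotent
# producer mechanic: the broker tracks the last sequence number per
# producer and rejects duplicates caused by retries.

class Broker:
    def __init__(self):
        self.log = []
        self.last_seq = {}   # producer_id -> highest sequence appended

    def append(self, producer_id, seq, value):
        if self.last_seq.get(producer_id, -1) >= seq:
            return False     # duplicate retry; record is already in the log
        self.last_seq[producer_id] = seq
        self.log.append(value)
        return True

broker = Broker()
# First attempt succeeds but the ack is "lost", so the producer retries:
broker.append("p1", 0, "debit $10")
broker.append("p1", 0, "debit $10")   # duplicate retry, rejected
broker.append("p1", 1, "credit $10")
print(broker.log)   # ['debit $10', 'credit $10']
```

Transactions build on this same per-producer sequencing to make writes across multiple partitions atomic, which is what the session covers in depth.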
Neha Narkhede talks about the experience at LinkedIn moving from batch-oriented ETL to real-time streams using Apache Kafka and how the design and implementation of Kafka was driven by this goal of acting as a real-time platform for event data. She covers some of the challenges of scaling Kafka to hundreds of billions of events per day at Linkedin, supporting thousands of engineers, etc.Watch Now
Join us as we walk through an overview of this exciting new service from the experts in Kafka. Learn how to build robust, portable and lock-in free streaming applications using Confluent Cloud.Watch Now
How small can a microservice be? This talk will look at how Stateful Stream Processing is used to build truly autonomous, often minuscule services. With the distributed guarantees of Exactly Once Processing, Event-Driven Services supported by Apache Kafka® become reliable, fast and nimble, blurring the line between business system and big data pipeline.
This is part 3 of 3 in the Apache Kafka for Microservices: A Confluent Online Talk Series.Watch Now
Should you use REST to sew services together? Is it better to use a richer, brokered protocol? This practical talk will dig into how we piece services together in event-driven systems, how we use a distributed log to create a central, persistent narrative and what benefits we reap from doing so.
This is part 2 of 3 in the Apache Kafka® for Microservices: A Confluent Online Talk Series.Watch Now
Services come with a problem: they’re not well suited to sharing data. This talk will examine the underlying dichotomy we all face as we piece such systems together--one that is not well served today. The solution lies in blending the old with the new and Apache Kafka® plays a central role.
This is part 1 of 3 in the Apache Kafka for Microservices: A Confluent Online Talk Series.Watch Now
This talk focuses on how to integrate all the components of the Apache Kafka® ecosystem into an enterprise environment and what you need to consider as you move into production.
This is part 6 of 6 in the Apache Kafka: Online Talk Series.Watch Now
In this talk, we survey the stream processing landscape, the dimensions along which to evaluate stream processing technologies, and how they integrate with Apache Kafka®.
This is part 5 of 6 in the Apache Kafka: Online Talk Series.Watch Now
Learn how to map practical data problems to stream processing and write applications that process streams of data at scale using Kafka Streams.
This is part 4 of 6 in the Apache Kafka: Online Talk Series.Watch Now
Learn typical use cases for Apache Kafka®, how you can get real-time data streaming from Oracle databases to move transactional data to Kafka and enable continuous movement of your data to provide access to real-time analytics.Watch Now
Learn different options for integrating systems and applications with Apache Kafka® and best practices for building large-scale data pipelines using Apache Kafka.
This is part 3 of 6 in the Apache Kafka: Online Talk Series.Watch Now
In this talk by Jun Rao, co-creator of Apache Kafka®, get a deep dive on some of the key internals that makes Apache Kafka popular, including how it delivers reliability and compaction.
This is part 2 of 6 in the Apache Kafka: Online Talk Series.Watch Now
Get an introduction to Apache Kafka® and how it serves as a foundation for streaming data pipelines and applications that consume/process real-time data streams.
This is part 1 of 6 in the Apache Kafka: Online Talk Series.Watch Now
Get answers to: How would you use Apache Kafka® in a microservice application? How do you build services over a distributed log and leverage the fault tolerance and scalability that come with it? When should you use these tools, and when should you not? And where does stream processing fit in?Watch Now
Experts from Confluent and Attunity share how you can: realize the value of streaming data ingest with Apache Kafka®, turn databases into live feeds for streaming ingest and processing, accelerate data delivery to enable real-time analytics and reduce skill and training requirements for data ingest.Watch Now
Explore the use cases and architecture for Apache Kafka®, and how it integrates with MongoDB to build sophisticated data-driven applications that exploit new sources of data.Watch Now