
Confluent will be at QCon NYC next week

Some of us from Confluent will be speaking at QCon NYC next week about Apache Kafka and Confluent’s stream data platform. Here are some things to look forward to from us.

Tutorial: Capturing and processing streaming data with Apache Kafka

Tuesday June 9th 9am-12pm

Kafka provides high-throughput, low-latency pub/sub messaging, and many large companies are quickly adopting it to handle their real-time and streaming data at large scale. But what can you use it for, and how do you get started? Come to Confluent’s tutorial, conducted by our first engineer, Ewen Cheslack-Postava, on June 9th at 9am to find out.

We’ll start with an overview of Kafka, beginning from the basics. You’ll learn about Kafka’s unifying abstraction: a partitioned and replicated low-latency commit log. Then we’ll discuss concrete applications of Kafka across multiple domains so you can see how Kafka can work for your company.
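To make the commit log abstraction concrete, here is a minimal sketch of creating a partitioned, replicated topic with Kafka’s Java AdminClient. The topic name, partition count, replication factor, and broker address are illustrative assumptions rather than details from the tutorial, and the cluster needs at least as many brokers as the replication factor.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateActivityTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed local broker
        try (AdminClient admin = AdminClient.create(props)) {
            // 6 partitions spread the log for parallelism; replication factor 3
            // keeps a copy of each partition on 3 brokers for fault tolerance.
            NewTopic topic = new NewTopic("user-activity", 6, (short) 3);
            admin.createTopics(Collections.singletonList(topic)).all().get();
        }
    }
}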

With a solid understanding of Kafka fundamentals, you’ll develop an end-to-end application that performs anomaly detection on streaming data to see how quickly you can get up and running with Kafka. The implementation will be broken into two parts. First, you’ll take an existing front-end application and instrument it with a Kafka producer to store user activity events in Kafka. Second, you’ll build a distributed, fault-tolerant service that detects and reports anomalies in the activity data.
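As a rough sketch of those two parts, here is what the producer instrumentation and a bare-bones anomaly-detection consumer might look like in Java. The topic name (user-activity), the JSON string payload, the broker address, and the toy threshold rule are hypothetical stand-ins for whatever the tutorial actually builds; a real detector would use a proper serializer and a windowed model rather than a lifetime counter.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ActivityEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed local broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by user ID keeps each user's events in one partition, so they stay in order per user.
            String userId = "user-42";
            String event = "{\"user\":\"user-42\",\"action\":\"page_view\"}";
            producer.send(new ProducerRecord<>("user-activity", userId, event));
        }
    }
}

import java.time.Duration;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AnomalyDetector {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "anomaly-detector"); // consumer group: partitions are shared across instances
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        Map<String, Integer> counts = new HashMap<>();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("user-activity"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Toy rule: flag any user who has produced more than 100 events since startup.
                    int n = counts.merge(record.key(), 1, Integer::sum);
                    if (n > 100) {
                        System.out.println("Possible anomaly for " + record.key() + ": " + n + " events");
                    }
                }
            }
        }
    }
}

Because the consumer joins a consumer group, running several copies of the service spreads the topic’s partitions across them, which is what gives the detection service its distribution and fault tolerance.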

By the end of the session, you’ll understand and be able to apply all the core functionality of Kafka.

And the fun doesn’t stop there, because you can also attend…

The Many Faces of Apache Kafka: Leveraging real-time data at scale

Thursday June 11th 1:40pm-2:30pm

If you are curious about how Kafka is adopted at large scale in production, or if you are looking to learn how to adopt Kafka in practice, attend my talk at 1:40pm on June 11th.

Since we open-sourced Kafka more than 4 years ago, it has been adopted very widely, from web companies like Uber, Netflix, and LinkedIn to more traditional enterprises like Cerner, Goldman Sachs, and Cisco. These companies use Kafka in a variety of ways: as the infrastructure for ingesting high-volume log data into Hadoop, to collect operational metrics for monitoring and alerting applications, for low-latency messaging use cases, and to power near-real-time stream processing.

In this talk, you will learn how Kafka’s unique architecture allows it to be used both for real-time processing and as a bus for feeding batch systems like Hadoop. You will also learn how Kafka is fundamentally changing the way data flows through an organization, creating new opportunities for processing data in real time that were not possible before. I will discuss how Kafka impacts the way data is integrated across a variety of data sources and systems.

Lastly, you can expect to learn how to go about adopting Kafka in your own company to leverage real-time data at scale.

If you can’t make it to the tutorial or talk, feel free to ping me or Ewen if you’d like to talk about Apache Kafka or Confluent.
