
The Top Sessions from This Year’s Kafka Summit Are…

This past April, Confluent hosted the inaugural Kafka Summit in San Francisco, bringing together the Kafka community to share use cases and lessons learned, and to participate in the hackathon. The summit produced a wealth of valuable content from Kafka users sharing their experiences with the technology. If you weren't able to attend, or are short on time, we've curated a short list of the most popular sessions below.

Based on a combination of live attendance and online views after the event, the top sessions from Kafka Summit were:

  • Introducing Kafka Streams – Large-scale Stream processing with Kafka, Neha Narkhede, Confluent
    The concept of stream processing has been around for a while. Yet the idea of directly applying stream processing in infrastructure systems is just coming into its own after a few decades on the periphery. At its core, stream processing is simple: read data in, process it, and maybe emit some data out. So why are there so many stream processing frameworks that all define their own terminology? This talk discusses the fundamentals around stream processing.
  • The Rise of Real-time, Jay Kreps, Confluent
    In this keynote talk, co-creator of Kafka Jay Kreps describes the big change happening in enterprises from a batch-oriented method of data engineering to a truly digital business, based on stream data. Delving into the impacts of creating distributed systems that span an entire company, Jay elevates multitenancy, connectivity, and stream processing as key themes for Kafka users in the coming year.
  • 101 Ways to Configure Kafka – Badly, Henning Spjelkavik and Audun Fauchald Strand, FINN.no
    Kafka was introduced as part of a proof of concept for collecting 20 million click events a day at Norway’s biggest site, FINN.no. Other teams started using Kafka for different purposes, but the configuration remained that of a proof of concept. This led to scaling problems, stability problems, and lost messages. We’ll tell you what we did, and how we solved it.
  • Kafka + Uber – The World’s Real-time Transit Infrastructure, Aaron Schildkrout, Uber
    Aaron describes how Uber uses Kafka to drive its real-time business, walking through the internal architecture of Uber’s systems and how Kafka powers solutions for their real-time operations.
  • Building an Event-oriented Data Platform with Kafka, Eric Sammer, Rocana
    While we frequently talk about how to build interesting products on top of machine and event data, the reality is that collecting, organizing, providing access to, and managing this data is where most people get stuck. Many organizations understand the use cases around their data – fraud detection, quality of service, technical operations, and user behavior analysis, for example – but are not necessarily data infrastructure experts. In this session, we’ll follow the flow of data through an end-to-end system built to handle tens of terabytes an hour of event-oriented data, providing real-time streaming, in-memory, SQL, and batch access to this data.
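The conceptual core Neha's talk describes – read data in, process it, and maybe emit some data out – can be sketched in a few lines of plain Python. This is an illustrative sketch only, not the Kafka Streams API; the event data and function names here are hypothetical:

```python
def process_stream(events, transform, predicate=None):
    """Conceptual stream-processing loop: read each event in,
    apply a transformation, and (maybe) emit a result downstream."""
    for event in events:                          # read data in
        result = transform(event)                 # process it
        if predicate is None or predicate(result):
            yield result                          # maybe emit data out

# Example: annotate click-event messages with a word count,
# emitting only non-empty messages downstream.
clicks = ["page viewed", "", "item added to cart"]
word_counts = list(process_stream(
    clicks,
    transform=lambda msg: (msg, len(msg.split())),
    predicate=lambda result: result[1] > 0,
))
# word_counts == [("page viewed", 2), ("item added to cart", 4)]
```

What the various stream processing frameworks add on top of this simple loop – and where their terminology diverges – is partitioning, state management, windowing, and fault tolerance, which is exactly the ground the talk covers.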

Soon, we’ll be releasing new information about the plans for Kafka Summit 2017. We’re so excited about next year’s event, and expect that the whole community will be delighted at the new opportunities to meet and exchange experiences and ideas. Subscribe to our blog to get notified when we release the details.

