
Why I Can’t Wait for Kafka Summit San Francisco

The Kafka Summit Program Committee recently published the schedule for the San Francisco event, and there’s quite a bit to look forward to.

For starters, it is a two-day event, which means we get to attend 14 talks, miss out on 42 talks (that we’ll later watch on video), and spend two days hanging out with our favorite community friends.


While the keynotes have not been announced yet (they will be soon!), there are quite a few exciting talks that are not to be missed. Of course, this is entirely personal—depending on your role, interests, and the different ways you use Apache Kafka®, you’ll find different talks exciting—and this is basically the reason there are four tracks.

Interests evolve over time too. I remember two to three years back, I spent all my time listening to talks about various ETL architectures in the Pipelines track. Last year, I attended mostly sessions about event-driven microservices, and this year, I’m especially interested in talks about running Kafka at scale and internals—good thing there are many of those!

Here is just a sample of the talks I’m especially looking forward to:

  • Kafka Cluster Federation at Uber: From the abstract, it sounds like Uber is scaling their Kafka deployment by running many Kafka clusters and somehow making them look like a single cluster to the client. I can’t wait to hear why they decided to go in that direction, what the benefits are, and how they are hiding these details from the clients.
  • Tackling Kafka, with a Small Team: Since I’m familiar with the work that Jaren Glover and his team are doing, I know the answer is: “with lots and lots of automation.” It’s always fascinating to hear how top-notch teams are running Kafka in production.
  • What’s the Time?…and Why? Even though I don’t work much with Kafka Streams these days, I really can’t miss a talk with such a delicious title. It sounds like a great mix of theory and practice that is bound to be enjoyable and educational.
  • Kafka Needs No Keeper: If there is one talk you really shouldn’t miss, this is the one. Two prolific Kafka committers share their ideas for removing the ZooKeeper dependency in Kafka and outline a plan on how the community can collaborate on this massive and exciting project.
  • Achieving a 50% Reduction in Cross-AZ Network Costs from Kafka: This may seem like a niche discussion but…just imagine coming back to the office and presenting to your director detailed plans to cut network costs by half. The conference just paid for itself and more!

I could have gone on and on—there are so many good options, but lists can be overwhelming. I’ll be speaking at the event too and would love to see you there. You can register for Kafka Summit San Francisco using the code Gwen30 to get 30% off and take a look at the full agenda. Don’t forget to share your picks in the comments and on social media using #kafkasummit.

Gwen Shapira is a software engineer on the Core Kafka Team at Confluent. She has 15 years of experience working with code and customers to build scalable data architectures, integrating relational and big data technologies. Gwen is the author of “Kafka—The Definitive Guide” and “Hadoop Application Architectures,” and a frequent presenter at industry conferences. Gwen is a PMC member on the Apache Kafka project and a committer on Apache Sqoop. When Gwen isn’t building data pipelines or thinking up new features, you can find her pedaling on her bike exploring the roads and trails of California, and beyond.

