8 Years of Event Streaming with Apache Kafka
Since I first started using Apache Kafka® eight years ago, I have gone from being a student who had just heard about event streaming to contributing to the transformational, company-wide event streaming...
We launched a transformation initiative three years ago that transitioned SEI Investments from a monolithic database-oriented architecture to a containerized services platform with an event-driven architecture based on Confluent Platform.
Every day, about 5.7 million rail passengers rely on Deutsche Bahn (DB) to get to their destination. Virtually every one of these passengers needs access to vital trip information, including...
As the head of global customer marketing at Confluent, I tell people I have the best job. As we provide a complete event streaming platform that is radically changing how...
This past April, Confluent hosted the inaugural Kafka Summit in San Francisco, bringing together the entire Kafka community to share use cases and learnings and to participate in the hackathon...
I’m really excited to announce a major new feature in Apache Kafka v0.10: Kafka’s Streams API. The Streams API is available as a Java library that is part of the official Kafka project...
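To make the announcement above a little more concrete, here is a minimal sketch of a Streams application. It uses class names from later releases than the 0.10 version announced here (for example StreamsBuilder), and the broker address, application id, and topic names (text-input, word-counts) are placeholders for illustration, not anything from the original post.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Arrays;
import java.util.Properties;

public class WordCountSketch {
    public static void main(String[] args) {
        // Basic configuration; the application id and broker address are placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read a stream of text lines, split each line into words, and count occurrences per word.
        KStream<String, String> lines = builder.stream("text-input");
        KTable<String, Long> counts = lines
                .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
                .groupBy((key, word) -> word)
                .count();

        // Write the running counts to an output topic.
        counts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Close the streams instance cleanly on JVM shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Even this small topology shows the core idea behind the announcement: stream processing expressed as an ordinary Java application that embeds the library, with no separate processing cluster to operate.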
For a long time, a substantial portion of data processing that companies did ran as big batch jobs: CSV files dumped out of databases, log files collected at the end of the day, and so on...
We are very excited to announce the general availability of Confluent Platform 2.0. For organizations that want to build a streaming data pipeline around Apache Kafka, Confluent Platform is the...
I am very excited that LinkedIn’s deployment of Apache Kafka has surpassed 1.1 trillion (yes, trillion with a “t”, and 4 commas) messages per day. This is the largest deployment of Apache Kafka...
Apache Kafka is widely used to enable a number of data-intensive operations, from collecting log data for analysis to acting as a storage layer for large-scale real-time stream processing...
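As an illustration of the log-collection use mentioned above, the sketch below shows a producer appending application log lines to a Kafka topic. The broker address, the topic name app-logs, the host-1 key, and the sample log line are all made up for the example; they are not taken from any of the posts.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class LogProducerSketch {
    public static void main(String[] args) {
        // Producer configuration; the broker address is a placeholder.
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Wait for acknowledgement from all in-sync replicas before treating a write as durable.
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Each log line becomes a record keyed by the host it came from,
            // so all lines from one host land in the same partition, in order.
            producer.send(new ProducerRecord<>("app-logs", "host-1",
                    "2015-09-01T12:00:00Z INFO request served in 4ms"));
            producer.flush();
        }
    }
}
```

Consumers can then read the same topic independently, whether for offline analysis or for downstream stream processing, which is what lets a single written log serve several uses at once.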