More companies are adopting distributed event streaming platforms that enable not only publish-and-subscribe messaging, but also the storage and processing of data. Companies across all industries now rely on real-time data to personalize customer experiences and detect fraudulent behavior.
On July 24, 2019, Confluent, provider of an event streaming platform based on Apache Kafka®, and ATIX, the Linux and open source specialist, invite you to a three-hour workshop in Munich demonstrating use cases for streaming technology in large and midsize companies, including examples from Deutsche Bahn, Airbnb, and more.
Please RSVP to reserve your seat; spaces are limited!
15:30 – 16:00: Registration and snacks
16:00 – 17:00: Real-time processing of large amounts of data: What is a streaming platform? How do companies use Apache Kafka and Confluent? – Confluent
17:00 – 17:45: How to use Apache Kafka & Confluent (and how not to) – Bernhard Hopfenmüller, ATIX
17:45 – 18:30: Your streaming journey, more use cases and open discussions – all
18:30 – 21:00: Get-together, dinner, and Q&A
- Real-time processing of large amounts of data: an event streaming platform as the central nervous system in the enterprise
- Challenges and solutions for integrating, processing and storing large amounts of data in real time
- Why both the Silicon Valley tech giants and established, successful businesses rely (and must rely) on event streaming platforms
- Use cases from the automotive and financial sectors, retail, logistics, and more
- Apache Kafka as the de facto standard for future transformation projects, both in Global 2000 companies and in SMEs
Sessions will be held in English upon request.
Parkring 15, 3rd floor
85748 Garching bei München