The growth of Kafka inside an organization often mirrors the development of the broader Kafka ecosystem over its lifetime. The initial use case may be something conceptually simple, like mainframe offload or point-to-point integration, evoking the simple Large Pipe architectures of Kafka's infancy. Those newly populated streams of events then present themselves as fertile ground for real-time analytics, as stream processing applications grow up around them to perform analysis event by event, leaving behind legacy ETL processes and their long batch times. Finally, a rich set of event streams gradually comes to describe more and more of the evolving state of the business, forming the substrate on which an ecosystem of event-driven microservices can thrive.

This growth in the architectural sophistication of an organization's Kafka usage mirrors the development of those same concepts in the Kafka community over the past decade. In many cases, the process can be played forward at an accelerated rate as leaders draw on lessons learned and concepts developed by the community. This talk traces that development, ending with a comprehensive vision of an event-driven architecture suitable for the next generation of information technology deployments. You'll leave knowing where you need to go and how this new architectural paradigm will help you get there.