Building a scalable real-time data architecture is not an easy task. Moving from a monolithic architecture to event-driven microservices can seem daunting, or even impossible. A single application backed by a single database no longer suffices; there are thousands of technology choices, each of which may solve a particular use case better than another. How do you connect all of these technologies? How do you leverage them to their full potential?
Apache Kafka is a data streaming platform that acts as the central nervous system of your architecture. By streaming events into Kafka as they happen, you make them available across the enterprise for systems to react to in real time. Relational databases, NoSQL stores, search replicas, caches -- they all have a place in the architecture, yet they all need one thing: data. Kafka gives us the ability to scale data integration and supports an organization's migration to an event-driven architecture.
This talk will:
Describe why events are important & why you need a streaming platform
Discuss the role of Kafka & its ecosystem in an organization’s architecture
Detail how to integrate your databases & external systems with Kafka Connect
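As a taste of the last point: with Kafka Connect, a source connector is just a JSON configuration posted to the Connect REST API. The sketch below assumes Confluent's JDBC source connector is installed; the connection URL, credentials, table, and topic prefix are placeholders, not values from the talk:

```json
{
  "name": "orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db.example.com:5432/shop",
    "connection.user": "connect",
    "connection.password": "secret",
    "table.whitelist": "orders",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "shop-"
  }
}
```

Submitted via `POST /connectors` to a Connect worker, this continuously copies new rows from the `orders` table into the `shop-orders` topic, with no application code to write or deploy.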
Distributed systems engineer and Apache Kafka enthusiast. At work, I'm currently architecting and implementing a data streaming platform. Outside of work, I like to write code, listen to music, and share cool tech with people.