Put on your platform architect hat! You’re a member of the development team at Piper Data Concepts (PDC), and your client is looking to modernize its workflow. An existing benchmarked development environment, made up of Kafka, Python, and Postgres, is at your disposal. Now it’s time to start conceptualizing the new and improved workflow. You’ll review and understand the business requirements, use Kafka to create an event-driven data pipeline, package the project with Python Poetry, write Python code that uses the Faust library to communicate with Kafka, and store the consumed data in a PostgreSQL database.
This liveProject is for programmers interested in learning the concepts and skills used in event-driven development. To begin this liveProject, you’ll need to be familiar with its prerequisite tools.
In this liveProject, you’ll create a more resilient pipeline by decoupling systems within a workflow.