Robert Koch

Robert Koch is a technical lead at S&P Global and one of the community leaders of DeafintheCloud.com. He helps drive cloud-native architecture, blogs about migrating to the cloud and using Lambdas, and has a passion for data- and event-driven systems.

Having earned five AWS certifications (Cloud Practitioner, Big Data Specialty, DevOps Engineer Associate, SysOps Administrator Associate, and Solutions Architect Associate), Robert is actively involved in the development community in Denver, often speaking at Denver Dev Day and the AWS Denver Meetup. Robert’s goal is to help the community understand the advantages of migrating to the cloud, being cloud-native, and having “serverless” applications and databases.

Projects by Robert Koch

Event-Driven Data Pipeline with Python and Kafka

4 weeks · 4-6 hours per week average · BEGINNER

Welcome to the Piper Data Concepts (PDC) team! You’re a member of its development team, and a Fortune 1000 client has asked you to modernize its workflow process, a challenge that just happens to be PDC’s specialty. In this liveProject series, you’ll review the client’s 15-year-old batch-based system, identify issues and bottlenecks, and determine what’s needed to transform its workflow into a more reactive, extensible, and dynamic system. To create observability, you’ll build an event-driven data pipeline with Kafka, use Python Poetry to package the project, write Python code using the Faust library to communicate with Kafka, and store the consumed data in a PostgreSQL database.

Your final goal will be to enable the client’s staff to gather workflow process information in real-time. You’ll write Python code that consumes messages from Kafka and prepares them for storage in the database, create Postgres queries to get the aggregated data, and build reports in CSV files to be read by visualization tools. When you’re done with these projects, your client’s workflow will be more resilient, responsive, and plugin-ready—and you’ll have a solid understanding of event-driven architecture.
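
To give a flavor of the consumer side of that pipeline, here’s a minimal sketch of a Faust agent that reads workflow events from Kafka and stores them in Postgres. The topic name (`workflow-events`), the event fields, the table schema, and the connection string are illustrative assumptions, not the liveProject’s actual code.

```python
import faust
import psycopg2

app = faust.App("pdc-pipeline", broker="kafka://localhost:9092")

# Hypothetical event shape -- field names are placeholders.
class WorkflowEvent(faust.Record, serializer="json"):
    order_id: str
    stage: str
    timestamp: float

events_topic = app.topic("workflow-events", value_type=WorkflowEvent)

# Assumes a table: workflow_events(order_id text, stage text, event_time timestamptz).
# A synchronous psycopg2 connection keeps the sketch simple.
conn = psycopg2.connect("dbname=pdc user=pdc password=pdc host=localhost")

@app.agent(events_topic)
async def store_events(events):
    """Consume workflow events from Kafka and persist them in Postgres."""
    async for event in events:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO workflow_events (order_id, stage, event_time) "
                "VALUES (%s, %s, to_timestamp(%s))",
                (event.order_id, event.stage, event.timestamp),
            )
        conn.commit()

if __name__ == "__main__":
    app.main()  # run with: python pipeline.py worker -l info
```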

Automate Reports

1 week · 4-6 hours per week · BEGINNER

As a member of the development team at Piper Data Concepts, you’ll carry out the final steps of a workflow-improvement project: enabling your client’s staff to gather workflow process information in real-time. Several prototypes have been built, and the client’s workflow is more resilient than ever. You’ll write Python code that consumes messages from Kafka and prepares them for storage in the database, create Postgres queries to access the aggregated data, and build reports in CSV files to be read by visualization tools—and ultimately, your client’s staff. When you’re done, your client’s modern system will provide a feedback loop, enable external API access to status updates, and be ready for more specialized services to be plugged in later, with no code changes.
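
As a rough illustration of the reporting step, the sketch below runs an aggregate Postgres query and writes the result to a CSV file that a visualization tool could pick up. The `workflow_events` table, its columns, and the connection string are assumptions carried over from the pipeline sketch above.

```python
import csv
import psycopg2

# Hypothetical aggregate query; the real schema comes from the earlier projects.
REPORT_QUERY = """
    SELECT stage, COUNT(*) AS event_count, MAX(event_time) AS latest_event
    FROM workflow_events
    GROUP BY stage
    ORDER BY stage
"""

def export_stage_report(dsn: str, path: str) -> None:
    """Run an aggregate query against Postgres and write the rows to a CSV report."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(REPORT_QUERY)
        rows = cur.fetchall()

    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["stage", "event_count", "latest_event"])
        writer.writerows(rows)

if __name__ == "__main__":
    export_stage_report("dbname=pdc user=pdc password=pdc host=localhost", "stage_report.csv")
```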

Observability

1 week · 4-6 hours per week · BEGINNER

Put on your platform architect hat! You’re a member of the development team at Piper Data Concepts (PDC), and your client is looking to modernize its workflow. An existing benchmarked development environment, made up of Kafka, Python, and Postgres, is at your disposal. Now it’s time to start conceptualizing the new and improved workflow. You’ll use Kafka to create an event-driven data pipeline, review and understand business requirements, use Python Poetry to package the project, write Python code using the Faust library to communicate with Kafka, and store the consumed data in a PostgreSQL database.
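
For a sense of what the Kafka-facing Python code might look like once the project is packaged with Poetry (for example, `poetry add faust`), here’s a small producer-side sketch that defines an event model, declares a topic, and periodically publishes a sample event with Faust. The app name, topic, and event fields are placeholders.

```python
import time
import faust

app = faust.App("pdc-observability", broker="kafka://localhost:9092")

# Hypothetical event model -- fields are illustrative only.
class WorkflowEvent(faust.Record, serializer="json"):
    order_id: str
    stage: str
    timestamp: float

events_topic = app.topic("workflow-events", value_type=WorkflowEvent)

@app.timer(interval=5.0)
async def emit_sample_event():
    """Publish a sample event every few seconds so downstream consumers have data to observe."""
    await events_topic.send(
        value=WorkflowEvent(order_id="demo-001", stage="received", timestamp=time.time())
    )

if __name__ == "__main__":
    app.main()  # run with: python producer.py worker -l info
```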

Python and Kafka

1 week · 4-6 hours per week · BEGINNER

Step into the role of a developer at Piper Data Concepts (PDC), a company that specializes in helping Fortune 1000 companies improve their workflows. Your task is to review the 15-year-old workflow architecture of one of your clients, Trade Data Systems. You’ll identify issues and bottlenecks, then determine what’s needed to transform its workflow into a more modern, responsive architecture. To accomplish this, you’ll set up a development environment with Docker using Kafka, Python, and Postgres. As you go, you’ll deploy a Kafka cluster and write Python code using the Faust library to seamlessly process pre-defined business events.
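
As a hint of where that environment leads, the sketch below assumes the Docker-hosted Kafka broker is reachable on localhost:9092 and shows a Faust agent that routes pre-defined business events by type. The topic name, the event model, and the event types are hypothetical stand-ins for the client’s real workflow events.

```python
import faust

app = faust.App("pdc-business-events", broker="kafka://localhost:9092")

# Hypothetical pre-defined business event; real event names come from the client's workflow.
class OrderEvent(faust.Record, serializer="json"):
    order_id: str
    event_type: str  # e.g. "created", "shipped", "invoiced"
    description: str

orders_topic = app.topic("order-events", value_type=OrderEvent)

@app.agent(orders_topic)
async def route_events(events):
    """Handle each pre-defined business event according to its type."""
    async for event in events:
        if event.event_type == "created":
            print(f"New order received: {event.order_id}")
        elif event.event_type == "shipped":
            print(f"Order shipped: {event.order_id}")
        else:
            print(f"Unhandled event type {event.event_type!r} for {event.order_id}")

if __name__ == "__main__":
    app.main()  # run with: python business_events.py worker -l info
```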