Kafka Pub/Sub
Prerequisites
- Prior experience or knowledge of Kafka
- Prior experience or knowledge of streaming applications
Challenge
You’ve been hired by Storm Rebels Corp. Storm Rebels is a team of highly trained storm chasers driving around in a purpose-built vehicle called the Dominator. The Dominator has gadgets on it that record the temperature every second. You’ve been tasked with writing a streaming pipeline that takes the temperature readings and stores them in a Kafka topic.
Tasks
- Get Kafka and PostgreSQL running locally using Docker Compose. We’ve prepared a `docker-compose.yml` file for you that will spin up PostgreSQL and Kafka. It is available here. The YAML file includes instructions on how to access Kafka and PostgreSQL from inside/outside Docker (a hedged sketch of such a file appears after this list).
- After you do `docker-compose up`, you should have the Kafka and PostgreSQL containers up and running.
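The provided `docker-compose.yml` is the one to use; purely for orientation, a minimal sketch of the kind of infrastructure it spins up might look like the following. The image names, tags, ports, credentials, and environment variables here are assumptions, not the contents of the provided file:

```yaml
version: "3.8"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0   # assumed image/tag
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.3.0       # assumed image/tag
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"                          # access from outside docker
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Two listeners: one for containers on the compose network, one for the host.
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  postgres:
    image: postgres:14                       # assumed image/tag
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: storm                   # assumed credentials
      POSTGRES_PASSWORD: chaser
      POSTGRES_DB: storm_rebels
```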
Requirements
- Write a microservice in a language of your choice that publishes randomly generated temperature readings in Celsius to a Kafka topic called `celcius-readings` every 1 second. The data in the topic should be a `double` Celsius reading and a `long` epoch timestamp. Example: `44.56:1638841414` (a sketch of such a producer appears after this list).
- Write a microservice in a language of your choice that subscribes to the `celcius-readings` Kafka topic, converts the readings to Fahrenheit, and inserts them into a PostgreSQL table called `storm_chasers_data`. Be creative about what columns should be in the table (a sketch of such a consumer appears after this list).
- Write a SQL query that finds the two hottest and two coldest temperatures in a single query (one possible approach is sketched after this list).
- The entire solution (including the microservices) should be in a single `docker-compose.yml` so we can run the entire thing with one command: `docker-compose up` (see the note after this list on adding the microservices as compose services).
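For illustration, a minimal sketch of the producer, here in Python with the `kafka-python` library. The library choice, broker address, and reading range are assumptions; the challenge leaves the language up to you:

```python
import random
import time

from kafka import KafkaProducer  # pip install kafka-python

# Broker address is an assumption; inside docker-compose it would be the Kafka service name.
producer = KafkaProducer(bootstrap_servers="localhost:9092")

while True:
    # Randomly generated Celsius reading plus the current epoch timestamp,
    # encoded as "<double>:<long>", e.g. "44.56:1638841414".
    celsius = round(random.uniform(-40.0, 50.0), 2)
    epoch = int(time.time())
    producer.send("celcius-readings", value=f"{celsius}:{epoch}".encode("utf-8"))
    time.sleep(1)  # one reading per second
```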
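Likewise, a sketch of the consumer in Python, using `kafka-python` and `psycopg2`. The table columns and connection details are assumptions, since the schema is left up to you:

```python
import psycopg2  # pip install psycopg2-binary
from kafka import KafkaConsumer  # pip install kafka-python

# Connection details are assumptions; inside docker-compose they would point at the service names.
conn = psycopg2.connect(host="localhost", dbname="storm_rebels", user="storm", password="chaser")
conn.autocommit = True

# Example schema: one row per reading, keeping both scales plus the original timestamp.
with conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS storm_chasers_data (
            id            BIGSERIAL PRIMARY KEY,
            fahrenheit    DOUBLE PRECISION NOT NULL,
            celsius       DOUBLE PRECISION NOT NULL,
            reading_epoch BIGINT NOT NULL
        )
    """)

consumer = KafkaConsumer("celcius-readings", bootstrap_servers="localhost:9092")

for message in consumer:
    # Messages look like "44.56:1638841414" -> Celsius reading and epoch timestamp.
    celsius_str, epoch_str = message.value.decode("utf-8").split(":")
    celsius = float(celsius_str)
    fahrenheit = celsius * 9 / 5 + 32
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO storm_chasers_data (fahrenheit, celsius, reading_epoch) VALUES (%s, %s, %s)",
            (fahrenheit, celsius, int(epoch_str)),
        )
```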
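One way to read back the two hottest and two coldest readings in a single query, assuming the Fahrenheit column from the sketch above:

```sql
-- Two hottest and two coldest Fahrenheit readings in one statement.
(SELECT 'hottest' AS kind, fahrenheit, reading_epoch
   FROM storm_chasers_data
  ORDER BY fahrenheit DESC
  LIMIT 2)
UNION ALL
(SELECT 'coldest' AS kind, fahrenheit, reading_epoch
   FROM storm_chasers_data
  ORDER BY fahrenheit ASC
  LIMIT 2);
```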
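To meet the single-command requirement, the two microservices can be added as extra services in the same compose file alongside Kafka and PostgreSQL. The service names and build contexts below are assumptions:

```yaml
  # Added under the same `services:` key as Kafka and PostgreSQL.
  producer:
    build: ./producer      # assumed directory containing the producer code and its Dockerfile
    depends_on:
      - kafka

  consumer:
    build: ./consumer      # assumed directory containing the consumer code and its Dockerfile
    depends_on:
      - kafka
      - postgres
```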
Deliverables
- A GitHub repo with read permissions given to GitHub users `rafty8s`, `bsneider`, `omnipresent07`, and `barakstout` (how to invite collaborators)
- The repo should have a README.md with enough instructions to get the solution running
- A `docker-compose.yml` with which everything can be spun up
- Any additional commands that need to be executed