
Submissions: Streaming and Cloud (Modules 08-09)

This guide explains how to submit the exercises for Streaming with Kafka and Cloud with LocalStack.


Summary

These modules are optional, advanced content: complete them if you want to go beyond the basic Final Project.

| Module | Topic | Submission |
|--------|-------|------------|
| 08 | Kafka + Spark Streaming | `entregas/streaming_cloud/your_lastname_firstname/kafka/` |
| 09 | LocalStack + Terraform | `entregas/streaming_cloud/your_lastname_firstname/localstack/` |

Submission Structure

entregas/streaming_cloud/lastname_firstname/
├── PROMPTS.md              ← THE MOST IMPORTANT
├── kafka/                  ← Module 08 (if you completed it)
│   ├── docker-compose.yml  ← Kafka in KRaft mode
│   ├── productor.py        ← Your producer
│   ├── consumidor.py       ← Your consumer
│   └── capturas/
│       ├── kafka_ui.png
│       └── alertas.png
└── localstack/             ← Module 09 (if you completed it)
    ├── docker-compose.yml  ← LocalStack
    ├── main.tf             ← Your Terraform
    ├── lambdas/
    │   └── capturar.py
    └── capturas/
        ├── terraform_apply.png
        └── s3_bucket.png

The PROMPTS.md File

Just like the Final Project, the most important thing is to document your real prompts.

Template

# PROMPTS - Streaming and Cloud

**Student:** [Your name]
**Date:** [Submission date]
**Modules completed:** [08 / 09 / both]

---

## Part 1: My Prompts (EXACTLY as I wrote them)

### Challenge 1: Launch Kafka
[Paste your real prompt, with errors and all]

**AI Response:**
[Summary of what it answered]

**Result:**
- [ ] Worked on the first try
- [ ] Had to adjust (explain what)
- [ ] Did not work (explain the error)

### Challenge 2: Producer
[...]

### Challenge 3: Consumer
[...]

(continue with each challenge you completed)

---

## Part 2: Screenshots

Include screenshots of:
- Kafka UI showing messages (if you did 08)
- Alerts in console (if you did 08)
- Successful `terraform apply` (if you did 09)
- S3 bucket with data (if you did 09)

---

## Part 3: Reflection

### What I learned about streaming/cloud
[2-3 paragraphs]

### Difference between batch and streaming
[Your understanding]

### What I would do differently
[Self-criticism]

---

## Part 4: Blueprint (generated by AI)

Ask your AI:
> "Summarize in professional format the previous prompts,
> highlighting learning patterns and progression"

[Paste the response here]

What Is Evaluated

| Criterion | Weight | Description |
|-----------|--------|-------------|
| Prompt authenticity | 40% | Real prompts, not cleaned up |
| Challenges completed | 30% | How many challenges you finished |
| Screenshots | 15% | Visual evidence that it works |
| Reflection | 15% | Conceptual understanding |

Dashboard Bonus

If you created your own earthquake dashboard or ISS tracker:

| Criterion | Bonus |
|-----------|-------|
| Working dashboard | +10% |
| Live updates | +5% |
| Professional design | +5% |

How to Submit

Step 1: Create your folder

# From the root of your fork
mkdir -p entregas/streaming_cloud/your_lastname_firstname

Step 2: Copy your files

# If you did Kafka
mkdir -p entregas/streaming_cloud/your_lastname_firstname/kafka
cp docker-compose.yml productor.py consumidor.py entregas/.../kafka/

# If you did LocalStack
mkdir -p entregas/streaming_cloud/your_lastname_firstname/localstack
cp -r *.tf lambdas/ entregas/.../localstack/

Step 3: Create PROMPTS.md

Use the template above and document your entire process.

Step 4: Upload

git add .
git commit -m "Streaming/Cloud Submission - Lastname Firstname"
git push

Available Challenges

Module 08: Kafka

| Challenge | Difficulty | Description |
|-----------|------------|-------------|
| 1 | Basic | Launch Kafka with Docker |
| 2 | Basic | Create a Python producer |
| 3 | Basic | Create a Python consumer |
| 4 | Intermediate | Connect the USGS API |
| 5 | Intermediate | Alert system |
| 6 | Advanced | Spark Structured Streaming |
| Final | Advanced | Your own dashboard |

See details: Streaming with Kafka
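As a taste of challenges 4 and 5, the heart of the producer can be kept as two pure functions: one that flattens a GeoJSON feature from the USGS earthquake feed, and one that decides whether to raise an alert. This is only a sketch — `parse_quake`, `should_alert`, and the 5.0 magnitude threshold are illustrative choices; the real producer would add `requests` to poll the feed and kafka-python's `KafkaProducer` to publish:

```python
def parse_quake(feature):
    """Flatten one GeoJSON feature from the USGS earthquake feed."""
    props = feature["properties"]
    lon, lat, depth = feature["geometry"]["coordinates"]
    return {
        "id": feature["id"],
        "magnitude": props["mag"],
        "place": props["place"],
        "time_ms": props["time"],   # USGS reports epoch milliseconds
        "lat": lat,
        "lon": lon,
        "depth_km": depth,
    }

def should_alert(quake, threshold=5.0):
    """Flag quakes at or above the chosen magnitude (challenge 5)."""
    mag = quake["magnitude"]
    return mag is not None and mag >= threshold
```

Keeping parsing and alert logic separate from the Kafka calls also makes them easy to test without a running broker.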

Module 09: LocalStack

| Challenge | Difficulty | Description |
|-----------|------------|-------------|
| 1 | Basic | Launch LocalStack |
| 2 | Basic | S3 bucket with Terraform |
| 3 | Intermediate | Lambda Hello World |
| 4 | Intermediate | Lambda consumes an API |
| 5 | Intermediate | Save to S3 |
| 6 | Advanced | EventBridge scheduling |
| 7 | Advanced | DynamoDB metadata |
| Final | Advanced | Your own ISS Tracker |

See details: Cloud with LocalStack
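For a flavor of challenges 3-5, a Lambda handler stays easy to test if the S3 key construction is separated from the actual upload. A sketch under assumptions (`build_s3_key`, the `iss/` prefix, and the `iss-data` bucket are hypothetical names; the boto3 `put_object` call is left commented so the sketch is self-contained):

```python
import json
from datetime import datetime, timezone

def build_s3_key(fetched_at, prefix="iss"):
    # Partition objects by date so S3 listings stay manageable
    return f"{prefix}/{fetched_at:%Y/%m/%d}/{fetched_at:%H%M%S}.json"

def handler(event, context=None):
    # In the real Lambda, EventBridge triggers this and the position
    # comes from the ISS API; here it is read from the event so the
    # sketch runs without network access.
    fetched_at = datetime.now(timezone.utc)
    record = {
        "latitude": event["latitude"],
        "longitude": event["longitude"],
        "fetched_at": fetched_at.isoformat(),
    }
    key = build_s3_key(fetched_at)
    # boto3.client("s3", endpoint_url="http://localhost:4566").put_object(
    #     Bucket="iss-data", Key=key, Body=json.dumps(record))
    return {"key": key, "body": json.dumps(record)}
```

With the upload isolated on one line, swapping LocalStack for real AWS later only means dropping the `endpoint_url`.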


Reference Examples

Reference dashboards are available for inspiration.

Do not copy

These are the professor's examples. Your dashboard should have your own style. The system detects excessive similarity.


Frequently Asked Questions

Is it mandatory?

No. These modules are bonus for those who want to go deeper.

Can I do only one?

Yes. You can submit only Kafka (08) or only LocalStack (09).

How does it affect my grade?

Completing them correctly earns a bonus on top of your Final Project grade.

Do I need Spark for module 08?

The Spark Streaming challenge is advanced. You can do challenges 1-5 without Spark.

Is LocalStack the same as real AWS?

The code is identical. The difference is that LocalStack runs on your machine at no cost. If your code works on LocalStack, it works on AWS.
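The "only the endpoint changes" point can be made concrete: the boto3 client is configured identically for both targets except for `endpoint_url`. A sketch (the helper name and the dummy `test` credentials are illustrative; 4566 is LocalStack's default edge port):

```python
def s3_client_kwargs(use_localstack):
    """Build the kwargs for boto3.client('s3', **kwargs)."""
    kwargs = {"region_name": "us-east-1"}
    if use_localstack:
        # Point the client at LocalStack's edge port instead of AWS.
        # LocalStack accepts any credentials; "test" is conventional.
        kwargs["endpoint_url"] = "http://localhost:4566"
        kwargs["aws_access_key_id"] = "test"
        kwargs["aws_secret_access_key"] = "test"
    return kwargs
```

Every `put_object`, `get_object`, or bucket call you write against LocalStack then carries over to AWS unchanged.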

---

**Course:** Big Data with Python - From Zero to Production
**Professor:** Juan Marcelo Gutierrez Miranda | @TodoEconometria
**Hash ID:** 4e8d9b1a5f6e7c3d2b1a0f9e8d7c6b5a4f3e2d1c0b9a8f7e6d5c4b3a2f1e0d9c
**Methodology:** Progressive exercises with real data and professional tools

References:
- Kreps, J., Narkhede, N., & Rao, J. (2011). Kafka: A distributed messaging system for log processing.
- Brikman, Y. (2019). *Terraform: Up & Running* (2nd ed.). O'Reilly Media.
- LocalStack Team (2024). *LocalStack Documentation*. https://docs.localstack.cloud/