What You'll Bring
✅ 5+ years of experience in designing and building scalable data lakes in cloud environments.
✅ Advanced programming skills in Python or Java.
✅ Proven experience in ETL/ELT pipeline development, ideally with Airflow.
✅ Expertise in DBT and data warehouse modeling for metrics development.
✅ Solid experience working with real-time data streaming tools like Kafka or Google Pub/Sub.
✅ Hands-on experience with big data processing tools, such as Hadoop, Spark, or their GCP equivalents (e.g., Dataproc, Dataflow).
✅ Advanced SQL skills – you know how to write and optimize complex queries across large datasets.