We are searching for a Senior Data Engineer with strong cloud experience and a passion for delivering impactful data products. You'll help shape the foundation for analytics, reporting, and machine learning by working on the entire data lifecycle—from ingestion to insight—using cutting-edge tools on Google Cloud Platform.
Key Responsibilities
Define and maintain data models and warehouse schemas optimized for analytics performance and reporting needs.
Use Cloud Composer and scripting tools to orchestrate automated workflows, ensuring smooth and efficient operations.
Participate in the definition and implementation of business KPIs, supporting BI teams with the right data structures.
Apply DevOps practices to data engineering workflows, including CI/CD, infrastructure-as-code (e.g., Terraform), and test automation.
Troubleshoot and optimize pipelines, solving issues related to transformation logic, availability, and system latency.
Construct scalable ETL/ELT processes for real-time and batch data ingestion, transformation, and delivery.
Build pipelines using GCP services such as BigQuery, Pub/Sub, Cloud Functions, Cloud Run, and Dataflow.
Collaborate with analysts and data scientists to design data solutions aligned with business questions.
Implement governance controls, validation layers, and tracking to ensure data quality and compliance.
Monitor cloud resource usage and optimize architectures for cost-efficiency without compromising performance.
Skills & Experience Required
Hands-on experience managing structured, semi-structured, and unstructured data across various storage and processing systems.
Strong skills in SQL and Python, with proven ability to build efficient data pipelines and automation scripts.
At least 3 years of solid experience with Google Cloud Platform, especially services like BigQuery, Pub/Sub, Dataflow, and Cloud Functions.
7+ years working in data engineering or architecture, with a proven ability to design data systems at scale.
Knowledge of dimensional modeling, data lake/lakehouse approaches, and modern data mesh patterns.
Experience with BI platforms such as Looker, Power BI, or Tableau.
Familiarity with cloud-native software development practices, including Agile methodology and automated deployment workflows.
Solid understanding of security frameworks, data governance, and regulatory compliance requirements.
Background in infrastructure-as-code using tools like Terraform is a plus.
Bonus points if you hold a GCP Professional Certification (Data Engineer or Architect), or if you have experience integrating engineering solutions with frontend applications or platforms.
If you're ready to join this incredible team, apply now!
Remember that no recruiter may ask you for money in exchange for an interview or a job. Likewise, avoid making payments or sharing financial information with companies.