Be a part of Stefanini!
At Stefanini, we are more than 30,000 geniuses connected across more than 40 countries, co-creating a better future.
Apply now as a Big Data Engineer!
Requirements:
- 2 years of experience as a Big Data Engineer
- Up-to-date expertise in Data Engineering and complex data pipeline development
- Experience with agile methodologies
- Experience designing, developing, implementing, and tuning large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built
- Experience with Java, Python, and/or Scala for writing data pipelines and data processing layers
- Experience with Airflow and GitHub
- English (intermediate)
Essential Duties and Responsibilities:
- Develops efficient MapReduce jobs to process and analyze large-scale data sets.
- Demonstrates expertise in writing complex, highly optimized queries using Hive and Spark.
- Applies in-depth knowledge of Big Data technologies including Spark, Hive, Kafka, BigQuery SQL, and HBase to deliver scalable solutions.
- Exhibits strong proficiency in writing and tuning advanced SQL queries.
- Leverages cloud platforms such as GCP, Azure, or AWS to build and maintain robust data pipelines.
- Understands relational models and works effectively with data stores such as Oracle, Cassandra, and Druid.
- Implements and maintains reliable data pipelines and analytical solutions for enterprise-scale operations.
- Performs performance tuning and optimization on systems handling large data volumes to ensure high efficiency.
- Consumes data from REST APIs and integrates it into existing data frameworks and services.
- Brings valuable experience in the retail sector, contributing domain-specific insights to data-driven strategies.
What's in it for you?
- Fully remote
- Training Path
- Life insurance
- Punctuality bonus
- Grocery vouchers
- Restaurant vouchers
- Legal benefits + Profit sharing (PTU)
- Learning and Mentoring platforms
- Discounts at language schools
- Gym discount