Posted 1 day ago
SR DATABRICKS ENGINEER
$120,000 - $130,000 Monthly
About the job
Details
Contract type: Permanent
Schedule: Full time
Workplace: Remote (from home)
Description
AT CL-DBS WE ARE LOOKING FOR:
SR DATABRICKS ENGINEER - BILINGUAL
As a Senior Databricks Engineer, you will play a pivotal role in designing, implementing, and optimizing data processing pipelines and analytics solutions on the Databricks platform. You will collaborate closely with cross-functional teams to understand business requirements, architect scalable solutions, and ensure the reliability and performance of our data infrastructure. This role requires deep expertise in Databricks, strong programming skills, and a passion for solving complex engineering challenges.
What you'll do:
Design and develop data processing pipelines and analytics solutions using Databricks.
Architect scalable and efficient data models and storage solutions on the Databricks platform.
Collaborate with architects and other teams to migrate the current solution to Databricks.
Optimize performance and reliability of Databricks clusters and jobs to meet SLAs and business requirements.
Use best practices for data governance, security, and compliance on the Databricks platform.
Mentor junior engineers and provide technical guidance.
Stay current with emerging technologies and trends in data engineering and analytics to drive continuous improvement.
You'll be expected to have:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
8+ years of overall experience and 3+ years of experience designing and implementing data solutions on the Databricks platform.
Proficiency in Python and SQL.
Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark.
Expertise in AWS services, cloud-native development, and the associated data services; Azure or GCP experience is a plus.
Proven track record of delivering scalable and reliable data solutions in a fast-paced environment.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills with the ability to work effectively in cross-functional teams.
Experience with containerization technologies such as Docker and Kubernetes is a plus.
Knowledge of DevOps practices for automated deployment and monitoring of data pipelines.
WE OFFER:
$130,000 GROSS WITH LEGAL BENEFITS. REMOTE SCHEDULE.
HOURS: MONDAY TO FRIDAY 9:00 AM - 6:00 PM
If you are interested and fit the profile, please send your CV via WhatsApp to 5 37 31 88 62 or apply through this channel.
ID: 20664253