Azure Databricks Consultant
Location: Remote work, Romania
Posted: 11 days ago
Job Ref: BH-51456
Salary: CHF 33 - CHF 39 per hour
Expiry date: 2/21/2025
We are seeking a highly skilled and certified Databricks Engineer to join our team. The ideal candidate will have a strong foundation in data engineering, data modelling, data integration, and cloud technologies. As a Databricks Engineer, you will be responsible for designing, implementing, and optimizing data workflows using Databricks tools and technologies.
Key Responsibilities:
Design and implement data ingestion, transformation, and storage solutions using Databricks.
Develop and maintain data models that support business requirements and analytics.
Integrate data from various sources, ensuring data quality and consistency.
Deploy and manage Databricks environments on cloud platforms such as Azure.
Write and optimize code in programming languages such as Python, Scala, and SQL.
Build and manage ETL (Extract, Transform, Load) pipelines using Databricks (see the sketch after this list).
Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders.
Troubleshoot and optimize data workflows to ensure efficiency and performance.
Stay up-to-date with the latest developments in Databricks and related technologies.
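To give a flavour of the ETL work described above, here is a minimal, illustrative PySpark sketch of a single ingest-transform-write step; the dataset, column names, and paths are hypothetical and not part of this role's actual stack.

```python
# Illustrative Databricks-style ETL step. The source file, columns, and
# output path below are assumptions made for this sketch only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a `spark` session is provided; this line makes the sketch
# self-contained when run elsewhere.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: load raw CSV data (hypothetical path).
raw = spark.read.option("header", True).csv("/mnt/raw/orders.csv")

# Transform: cast types, drop rows with no amount, derive a date column.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned output; Delta is the usual format on Databricks.
(orders.write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .save("/mnt/curated/orders"))
```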
Qualifications:
Databricks certification is required.
Experience in data modelling and data integration.
Experience with cloud platforms, particularly Azure.
Expertise in programming languages such as Python, Scala, and SQL.
Experience with ETL processes and tools.
Ability to work collaboratively with cross-functional teams.
Sector: IT Infrastructure