Senior Databricks Engineer CGEMJP00319492
| Date advertised: | 14 November 2025 |
|---|---|
| Hours: | Full Time |
| Closing date: | 21 November 2025 |
| Location: | Glasgow, Glasgow, G5 8DP |
| Remote working: | Hybrid - remote working up to 2 days per week |
| Company: | Experis |
| Job type: | Contract |
| Job reference: | BBBH427796_1763129781 |
Summary
Role Title: Senior Databricks Engineer
Duration: contract to run until 31/12/2026
Location: Glasgow, hybrid 2-3 days per week onsite
Rate: up to £414 per day (Umbrella), inside IR35
Role purpose / summary
We are currently migrating our data pipelines from AWS to Databricks, and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering role focused on designing, building, and optimizing scalable data solutions using the Databricks platform.
Key Skills / Requirements
- Lead the migration of existing AWS-based data pipelines to Databricks.
- Design and implement scalable data engineering solutions using Apache Spark on Databricks.
- Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines.
- Optimize performance and cost-efficiency of Databricks workloads.
- Develop and maintain CI/CD workflows for Databricks using GitLab or similar tools.
- Ensure data quality and reliability through robust unit testing and validation frameworks.
- Implement best practices for data governance, security, and access control within Databricks.
- Provide technical mentorship and guidance to junior engineers.
Must-Have Skills:
- Strong hands-on experience with Databricks and Apache Spark (preferably PySpark).
- Proven track record of building and optimizing data pipelines in cloud environments.
- Experience with AWS services such as S3, Glue, Lambda, Step Functions, Athena, IAM, and VPC.
- Proficiency in Python for data engineering tasks.
- Familiarity with GitLab for version control and CI/CD.
- Strong understanding of unit testing and data validation techniques.
Preferred Qualifications:
- Experience with Databricks Delta Lake, Unity Catalog, and MLflow.
- Knowledge of CloudFormation or other infrastructure-as-code tools.
- AWS or Databricks certifications.
- Experience in large-scale data migration projects.
- Background in Finance Industry.
All profiles will be reviewed against the required skills and experience. Due to the high number of applications, we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!