Senior Data Engineer (SC Clearance Required)
| Posting date | 02 September 2024 |
| --- | --- |
| Salary | £610 per day |
| Hours | Full time |
| Closing date | 02 October 2024 |
| Location | Newcastle upon Tyne, Tyne and Wear, NE98 1YX |
| Remote working | Hybrid - work remotely up to 5 days per week |
| Company | Experis |
| Job type | Contract |
| Job reference | BBBH234324_1725295263 |
Summary
Role Title: Senior Data Engineer (SC CLEARED)
Duration: 4 Months
Location: Remote / Newcastle once per month
Rate: £610/d - Umbrella only
NOTE: Applicants MUST hold Active Security Clearance to be considered for this role
Would you like to join a global leader in consulting, technology services and digital transformation?
Our client is at the forefront of innovation to address the entire breadth of opportunities in the evolving world of cloud, digital and platforms.
Role purpose / summary
The role involves leading the implementation of technical designs and blueprints to establish scalable data engineering infrastructure within the Retirement directorate.
This person will add detail to framework designs and define specific tasks to integrate the designs with existing components.
Proactive problem-solving is essential, as this person will lead the response to technical problems arising with data flows and keep the Product, Delivery, and Data Engineering leadership informed of progress.
Pragmatic patience is also essential: the role involves adopting and adapting to department-wide design patterns that are difficult to influence, and prioritisation may shift in response to changing requirements and needs from outside Retirement.
- Engineering background with experience working with analysts
- Experience working with version control (Git / GitLab) and structuring repositories for data pipelines
- Proficient with Python, PySpark, and SQL for data engineering uses (a brief illustrative sketch follows this list)
- Knowledge of current database solutions, particularly semi-structured document databases (MongoDB)
- Experience working with AWS, data flows within AWS, and Infrastructure as Code (IaC)
- Comfortable and confident following preexisting design blueprints
- Comfortable translating high-level requirements and designs into low-level cloud implementations
- Capable of leading the implementation of data engineering pipelines and delegating implementation work
- Comfortable faithfully reporting obstacles and problems, and feeding back when designs might have better alternatives
- Experience working and coordinating with a variety of other technical roles, especially Business Analysts, DevOps Engineers, Software Engineers, and Technical Architects
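As a purely illustrative sketch of the Python/PySpark/SQL and AWS data-flow skills listed above (not part of the client's specification): a minimal PySpark job that reads semi-structured JSON from S3, shapes it with SQL, and writes partitioned Parquet back to S3. Bucket names, paths, and column names are placeholders.

```python
from pyspark.sql import SparkSession

# Minimal illustrative pipeline; bucket names, prefixes, and columns are placeholders.
spark = SparkSession.builder.appName("retirement-data-pipeline").getOrCreate()

# Read raw, semi-structured JSON landed in S3
raw = spark.read.json("s3a://example-raw-bucket/retirement/events/")

# Expose the raw data to SQL for the transformation step
raw.createOrReplaceTempView("events")

curated = spark.sql("""
    SELECT policy_id,
           event_type,
           CAST(event_ts AS DATE) AS event_date,
           amount
    FROM events
    WHERE event_type IS NOT NULL
""")

# Write the curated output as Parquet, partitioned by date
(curated.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-curated-bucket/retirement/events/"))

spark.stop()
```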
Key Skills / Requirements
- AWS: S3, Lambda, EMR, SNS, SQS, and additional services related to data infrastructure (an illustrative sketch follows this list)
- Terraform
- Databricks
- Data Lake, Warehouse, Lakehouse architecture and design
- Python/PySpark
- Data platforms and notebooks: Jupyter, Databricks, Azure
- GitLab: repository management and CI/CD
- Java (Spring Boot) experience is a plus
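As an illustrative sketch only (assumed names, not the client's design): a Python Lambda handler wiring up the S3 -> Lambda -> SQS pattern implied by the AWS services listed above, forwarding object-created notifications to a queue for downstream processing. The queue URL and account details are placeholders.

```python
import json
import boto3

# Placeholder queue URL; in practice this would come from Terraform-managed configuration.
QUEUE_URL = "https://sqs.eu-west-2.amazonaws.com/123456789012/example-ingest-queue"

sqs = boto3.client("sqs")

def handler(event, context):
    """Forward S3 object-created notifications to SQS for downstream pipeline processing."""
    records = event.get("Records", [])
    for record in records:
        payload = {
            "bucket": record["s3"]["bucket"]["name"],
            "key": record["s3"]["object"]["key"],
        }
        sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(payload))
    return {"forwarded": len(records)}
```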
All profiles will be reviewed against the required skills and experience. Due to the high number of applications, we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!