Software Developer

Job details
Posting date: 17 April 2026
Salary: £43,760 per year
Hours: Full time
Closing date: 14 May 2026
Location: Glasgow, Scotland
Remote working: Hybrid - work remotely up to 3 days per week
Company: Government Recruitment
Job type: Permanent
Job reference: 455315

Apply for this job

Job summary

We’re seeking a Software Developer experienced in back-end development (with a focus on data pipelines) as well as a range of wider technologies.

You can code in Python and have experience of ETL systems such as Apache Spark. During the interview, it will be beneficial to share and walk the panel through code you have previously written.

You will work as part of a multidisciplinary team, supporting projects in areas such as Artificial Intelligence, Robotic Automation and Data Analysis.

The ideal candidate will be proactive and collaborative, with a passion for clean, maintainable code and modern development practices. They are familiar with Agile methodology, but are equally able to take ownership of key deliverables and deliver autonomously.

Experience of implementing AI solutions is advantageous but not essential.

Job description

You will join the Cabinet Office Digital, Data, Insights and AI team as a Software Developer in a multidisciplinary Agile team, delivering services for a large data warehouse and analytics platform (GRID).

As a software engineer, you will initially have a core focus on back-end development.

You will:
Lead development of data engineering requirements, from initial design through to live support.
Complete information assurance activities for data transfers containing sensitive data (e.g., Data Protection Impact Assessments, Privacy Notices).
Carry out performance and incident monitoring to ensure the smooth running of data services.
Support the wider team with broader priorities, including front-end design.
Coach and mentor junior developers, sharing good engineering practices.
The responsibilities of the role may evolve over time.

Your main focus will be:
Python and Extract, Transform and Load (ETL) processes.
You will require knowledge of:
PySpark (AWS Glue) to build scalable data pipelines.
AWS Lake Formation to implement Attribute-Based Access Control (ABAC).
Apache Iceberg to store data and optimise query performance.
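To illustrate the Extract, Transform and Load pattern named above, here is a minimal sketch in plain standard-library Python (not AWS Glue or PySpark, and the field names `id` and `amount` are invented for the example):

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV input into a list of row dictionaries.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    # Transform: normalise types and drop records missing an amount.
    cleaned = []
    for row in rows:
        if row.get("amount"):
            cleaned.append({"id": row["id"], "amount": float(row["amount"])})
    return cleaned

def load(rows: list[dict]) -> str:
    # Load: serialise to JSON Lines, standing in for a data-lake write.
    return "\n".join(json.dumps(r) for r in rows)

raw = "id,amount\n1,10.5\n2,\n3,4.0\n"
print(load(transform(extract(raw))))
```

In a production pipeline the same three stages would typically be expressed as PySpark jobs reading from and writing to Iceberg tables, but the shape of the work is the same.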
Key broader technologies used by the wider team:
Python as the primary programming language for general tasks.
TypeScript to design front-end services using React.
JavaScript/Node.js for back-end services.
YAML/JSON (CloudFormation) and Terraform (HCL) for infrastructure as code.
Terraform, CodePipeline, and GitHub Actions for infrastructure deployment.
Amazon Web Services (AWS) for hosting digital services.
Kubernetes and Docker for containerisation.
Apache Iceberg, PostgreSQL, SQL Server, and Redis as databases.
OpenSearch as a vector store for storing and querying high-dimensional embeddings.
Knowledge graphs, including the use of triplestores for storing and querying RDF data.
Amazon Bedrock for hosting LLMs.
GOV.UK Design System for interface design.