Warning
This job advert has expired and applications have closed.
Senior Data Engineer
Posting date: 14 April 2025
Salary: Not specified
Additional salary information: Competitive Salary Offered
Hours: Full time
Closing date: 18 April 2025
Location: Reading, Berkshire
Remote working: On-site only
Company: OM Soft Solutions Limited
Job type: Permanent
Job reference: FA_OSS_SDE_2025-2
Summary
Job Title: Senior Data Engineer
Company: OM Soft Solutions Limited
Location: Reading, Berkshire, RG2 6UB, United Kingdom
Hours: 37.5 Hours per Week, Monday to Friday
Salary: Competitive Salary Offered
No Agencies Please
________________________________________
Description:
Job Title: Senior Data Engineer
We are currently seeking an experienced Senior Data Engineer to join our Cloud and Enterprise Information Management Practice. This role involves developing architectural models for cloud-based data management solutions that leverage Microsoft Azure, AWS, GCP, and Snowflake technologies to support large-scale, high-performance environments.
________________________________________
Key Responsibilities:
• Design and develop conceptual, logical, and physical data models that meet current and future business needs.
• Translate functional design requirements into robust cloud-based data management solutions.
• Build and manage cloud and data environments, including deep-dive troubleshooting and root cause analysis.
• Plan and execute Data Warehouse (DWH) migrations to cloud platforms.
• Drive the data engineering roadmap, ensuring the platform is automated, scalable, and resilient.
• Develop data pipelines and data lakes, manage ETL processes, and perform complex data transformations.
• Work with both relational and big data models to support data access and visualization tools.
• Build and manage cloud-based data platforms using services like Azure Data Factory, Data Lake, Databricks, Cosmos DB, AWS Redshift, Lambda, RDS, S3, EC2, and Kinesis.
• Create application integrations using REST APIs.
• Write clean, efficient, and reusable Python code to support data workflows and pipelines.
• Ensure high performance, quality, and responsiveness across systems and applications.
• Apply data mining, advanced analytics, and statistical techniques to uncover insights from complex data sets.
• Collaborate with developers and stakeholders to ensure solutions align with business needs.
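The pipeline and transformation duties above can be illustrated with a short sketch. This is a hypothetical example only, not part of the role specification: the record shape, field names, and quality rules are all illustrative assumptions.

```python
# Minimal, hypothetical extract-transform-load sketch in pure Python.
# Record shape, field names, and cleaning rules are illustrative assumptions.

def extract(raw_rows):
    """Simulate extraction: here the source is just an in-memory list."""
    return list(raw_rows)

def transform(rows):
    """Drop incomplete records and derive a normalized numeric field."""
    cleaned = []
    for row in rows:
        if row.get("customer_id") is None or row.get("amount") is None:
            continue  # skip records that fail a basic quality check
        cleaned.append({
            "customer_id": row["customer_id"],
            "amount_gbp": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows):
    """Load into a dict keyed by customer_id, summing amounts."""
    store = {}
    for row in rows:
        store[row["customer_id"]] = store.get(row["customer_id"], 0) + row["amount_gbp"]
    return store

raw = [
    {"customer_id": "c1", "amount": "10.504"},
    {"customer_id": None, "amount": "3.00"},   # fails the quality check
    {"customer_id": "c1", "amount": "4.50"},
    {"customer_id": "c2", "amount": "7"},
]
totals = load(transform(extract(raw)))
```

In production this shape would typically be expressed in an orchestration or platform tool (e.g. Azure Data Factory or Databricks) rather than hand-rolled, but the extract/transform/load separation is the same.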
________________________________________
Required Skills & Experience:
• Proven experience in Data Architecture, Data Management, and modern Cloud Data Platforms.
• Strong background in ETL/ELT development, including design and optimization.
• Deep understanding of relational and dimensional data models.
• Expertise in Data Modeling, Profiling, Analysis, Quality, Governance, and Lineage.
• Hands-on experience with Snowflake for building data lakes/warehouses is highly desirable.
• Proficient in SQL and experienced with ETL tools such as Informatica, Talend, Pentaho, Databricks, Spark, and Alteryx.
• Proficiency with Snowflake features: SnowSQL, Snowpipe, Tasks, Streams, Stored Procedures, etc.
• Strong programming skills in Python, including experience with libraries for data processing and visualization.
• Experience with SQL and NoSQL databases: Oracle, MS SQL Server, Cassandra, MongoDB, HBase, CouchDB, DynamoDB, and others.
• Familiarity with data pipeline tooling such as Apache Airflow (orchestration) and Apache Kafka (streaming).
• Exposure to Big Data technologies: Hadoop, Spark, Kafka, etc., and IaaS/PaaS ecosystems (Azure, AWS, GCP).
• Hands-on experience with cloud-native services: Azure Data Factory, Data Lake, Databricks, Cosmos DB, Blob Storage, AWS Redshift, Lambda, RDS, S3, EC2, and Kinesis.
• Experience with BI tools such as Tableau and Power BI.
• Knowledge of CI/CD pipelines and DevOps practices.
• Experience using JIRA/Confluence and version control tools like Git/GitHub.
• Comfortable working in Windows and Linux/Unix environments.
• Experience in Agile methodologies (Scrum, Kanban).
• Excellent communication skills, including technical documentation, presentations, and collaboration with stakeholders.
• Professional certifications related to Data Engineering are a plus.
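As a toy illustration of the SQL-from-Python proficiency listed above, the sketch below runs an aggregate query against Python's standard-library SQLite module, standing in for the relational engines named in the requirements; the table and data are invented for the example.

```python
# Toy SQL-from-Python example using the stdlib sqlite3 module as a
# stand-in for the relational engines listed above. Schema and data
# are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("c1", 10.0), ("c1", 5.0), ("c2", 7.5)],
)
# Dimensional-style aggregate: total spend per customer.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
conn.close()
```

The same GROUP BY pattern carries over directly to Snowflake, MS SQL Server, or Redshift; only the connector changes.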
Education Required:
Applicants must hold a degree at S/NQF Level 6 or above, i.e. a minimum of a bachelor’s degree, in Computer Science, Mathematics/Science, Engineering, or Business Administration.