Data Architect/Data Engineer
| Date advertised: | 16 March 2026 |
|---|---|
| Salary: | £54,000 to £58,000 per year |
| Hours: | Full Time |
| Closing date: | 24 March 2026 |
| Location: | UK |
| Remote working: | Fully remote |
| Company: | DataProv Limited |
| Job type: | Permanent |
| Job reference: |
Summary
We are looking for a highly skilled Data Architect / Data Engineer to design and build data solutions as part of our product development function. This is a fully remote role responsible for architecting scalable data systems, developing reliable data pipelines, and enabling data-driven product capabilities across the organization.
Key Responsibilities:
• Design and develop internal data products (dashboards, analytics tools, ML-enabled apps)
• Build and maintain data pipelines from multiple sources (ETL/ELT), including batch and streaming
• Transform raw data into analytics-ready datasets using modern tools
• Develop data models and semantic layers for BI and analytics teams
• Implement data governance, quality, and lineage frameworks
• Integrate predictive analytics and machine learning models into data products
• Deploy and maintain data products using cloud platforms and APIs
• Optimize performance and scalability of data solutions
Required Skills & Tools:
• Data Engineering & Pipelines: Snowflake / BigQuery / Redshift / Azure, dbt, Airflow, Apache Kafka / Spark
• Programming & Analytics: Python, SQL
• BI / Visualization: Power BI, Tableau
• Machine Learning / AI (Optional but highly valued): TensorFlow, PyTorch, scikit-learn, XGBoost, MLOps tools, LangChain / OpenAI API for advanced analytics/AI integration
• Data Governance / Quality: Collibra
• Deployment / Productization: Streamlit, Dash, FastAPI, Flask, Docker / Kubernetes
Qualifications:
• Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field
• 3+ years in data engineering, analytics engineering, or data product development
• Experience building enterprise-level internal data products