Data Architect/Data Engineer
| Posting date: | 16 March 2026 |
|---|---|
| Salary: | £54,000 to £58,000 per year |
| Hours: | Full time |
| Closing date: | 24 March 2026 |
| Location: | UK |
| Remote working: | Fully remote |
| Company: | DataProv Limited |
| Job type: | Permanent |
| Job reference: | |
Summary
We are looking for a highly skilled Data Architect / Data Engineer to design and build data solutions as part of our product development work. This fully remote role is responsible for architecting scalable data systems, developing reliable data pipelines, and enabling data-driven product capabilities across the organization.
Key Responsibilities:
• Design and develop internal data products (dashboards, analytics tools, ML-enabled apps)
• Build and maintain data pipelines from multiple sources (ETL/ELT), including batch and streaming
• Transform raw data into analytics-ready datasets using modern tools
• Develop data models and semantic layers for BI and analytics teams
• Implement data governance, quality, and lineage frameworks
• Integrate predictive analytics and machine learning models into data products
• Deploy and maintain data products using cloud platforms and APIs
• Optimize performance and scalability of data solutions
Required Skills & Tools:
• Data Engineering & Pipelines: Snowflake / BigQuery / Redshift / Azure, dbt, Airflow, Apache Kafka/Spark
• Programming & Analytics: Python, SQL
• BI / Visualization: Power BI, Tableau
• Machine Learning / AI (Optional but highly valued): TensorFlow, PyTorch, scikit-learn, XGBoost, MLOps tools, LangChain / OpenAI API for advanced analytics/AI integration
• Data Governance / Quality: Collibra
• Deployment / Productization: Streamlit, Dash, FastAPI, Flask, Docker / Kubernetes
Qualifications:
• Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, or a related field
• 3+ years in data engineering, analytics engineering, or data product development
• Experience building enterprise-level internal data products