Who are we?
Qflow was founded in 2018 with a bold mission to empower the world to build responsibly. Our platform helps construction and development teams cut waste and reduce carbon emissions through real-time, data-driven insights.
We’ve built a cutting-edge platform that gives project teams real-time insights into materials, waste, cost, carbon, and quality right at the source. Using AI, machine learning, and smart integrations, our tech makes sense of chaotic data streams by extracting information from messy receipts and documents, auditing it against project requirements, flagging risks instantly, and feeding insights directly into reporting workflows. Our mobile app makes it effortless for workers on the ground to capture data in seconds, turning construction sites into real-time, data-rich ecosystems.
The impact? Less waste. Lower carbon. Smarter decisions. A construction industry that uses only what it needs, building a future that works for people and the planet.
We’re backed by leading climate-tech and construction-tech investors, including Systemiq Capital, Greensoil PropTech Ventures, and Suffolk Technologies, and have raised £11.2M in Series A funding to accelerate our international growth.
You should apply if…
You’re a Data Engineer who wants to use your skills for impact, helping make one of the world’s most polluting industries more efficient, sustainable, and data-driven.
You enjoy building robust, well-engineered data pipelines that turn messy, real-world data into reliable foundations for AI and analytics. You care about data quality, not just data movement.
You’re curious, collaborative, and want to work in a place where what you build directly shapes how construction companies improve quality, reduce waste, and make smarter decisions at scale.
Your team and your role
We’re looking for a Data Engineer to join our Data group, a cross-functional, high-impact team at the intersection of data engineering, machine learning, and data quality. The team designs, develops, and operates the scalable data infrastructure that powers Qflow’s platform and AI capabilities.
Reporting to our Senior Engineering Manager and working closely with ML Engineers and Data Quality experts, you’ll own the pipelines that get the right data, in the right shape, to the right place. Here’s what you’ll do day to day:
• Design, build, and maintain robust data pipelines ingesting data from multiple sources into our data infrastructure (currently 100M+ rows and growing).
• Work with Azure Cosmos DB, Microsoft Fabric, and relational databases to model, store, and serve data at scale.
• Build and manage data lake layers in Microsoft Fabric, including ingestion, transformation, and serving patterns that support both ML and analytical workloads.
• Collaborate with ML Engineers to ensure training data is clean, versioned, and correctly structured, including pipelines that feed generative AI features.
• Partner with Data Quality experts to implement validation, monitoring, and lineage tracking that give the team confidence in what flows through our systems.
• Optimise pipeline performance, reliability, and cost; debug failures quickly and build resilience in.
• Contribute to data governance practices, including schema management, access controls, and documentation.
• Maintain high standards in code quality, testing, and reproducibility, and share knowledge across the team.
• Make informed trade-off decisions to manage the cost of Fabric compute.
Our tech stack
The Data Experience Team works with Python and SQL for data processing, Azure and Terraform for cloud infrastructure, and modern ML/AI tools such as OpenAI and Gemini. Our data infrastructure centres on Azure Cosmos DB, Microsoft Fabric, and relational databases. We’re continuously raising our engineering standards through robust testing, CI/CD, and shared code quality practices. We value curiosity and innovation, always exploring new technologies to stay ahead of the curve.
Your Skills
We’re looking for a mid-to-senior engineer who is comfortable taking ownership of complex data infrastructure in a fast-moving startup. You bring engineering rigour to data problems, and you understand that the quality of what goes in determines the quality of what comes out. What matters most is your ability to build reliable, production-grade pipelines that real AI and analytics products depend on.
• 4+ years of experience in a data engineering role, ideally in a product or SaaS environment.
• Ability to think about data quality from an end-user perspective (i.e. the value of our data and customer trust in it) as well as an internal perspective (validity, uniqueness, etc.).
• Hands-on experience with Azure Cosmos DB, including data modelling for document-oriented workloads.
• Experience with PySpark.
• Strong working knowledge of Microsoft Fabric or Azure Data Lake Storage, including experience designing medallion or equivalent layered architectures.
• Solid SQL skills and experience with relational databases (PostgreSQL or similar).
• Proficiency in Python for pipeline development, transformation logic, and orchestration.
• Experience building data pipelines that feed ML or generative AI workflows, with an understanding of what ‘good’ training and inference data looks like.
• Familiarity with data quality practices: validation, monitoring, alerting, and lineage.
• Working knowledge of Azure cloud infrastructure and services; exposure to Terraform or infrastructure-as-code is a plus.
• Exposure to CI/CD practices and containerisation with Docker or similar.
• Experience using AI coding tools to accelerate development while retaining the ability to audit and correct LLM output for performance at scale.
• Excellent communication skills, able to work across engineering, product, and non-technical stakeholders.
• Comfortable with ambiguity and incremental delivery in a startup environment.
• Nice to have: experience with Retool.
Our offer
💸 Basic salary of £55,000 – £75,000, depending on experience.
🏡 Remote-first team
🇬🇧 Bi-weekly engineering team gatherings and quarterly company-wide gatherings at our London HQ
🚝 Work travel expenses covered by Qflow
🏝 25 days annual leave + 3 days company closure at Christmas + bank holidays
🤒 Paid sick leave
🩺 Private medical insurance
🏥 Critical illness and life insurance
💰 Pension contribution up to 7%
👥 Enhanced family policy
🙋 Paid volunteering days
✈️ We allow our employees to work abroad for up to 90 days
🌎 We’ll offset your annual carbon footprint on your behalf via Ecologi
📚 Learning & development and career progression opportunities
🤩 Company social events (online and in person!)
💻 Company laptop and tools
Our promise
Creating an environment where everyone feels valued, respected, and heard is at the forefront of everything we do. We are committed to providing equal employment opportunities regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or veteran status.
This commitment extends to all aspects of our operations, including step-free access, because we believe everyone should have equal opportunity to access our facilities, services, and digital platforms.
Important Notice: No Recruitment Agencies
We kindly request that recruitment agencies refrain from contacting us regarding this job posting. We are solely interested in direct applications from candidates. Any unsolicited communication or resumes received from agencies will not be considered or acknowledged. We encourage candidates to apply directly through the provided application process. Thank you for your understanding.