Data Engineer
Due to further expansion, we are seeking a Data Engineer to help us continue to dominate one of the most buoyant, fastest-growing industries, one that is changing the world and the speed at which we receive items.
What we do
LineTen is on a mission to crack the code on urban delivery. By leading with our values and putting carrier partners first, we are paving the way for better delivery experiences. LineTen is a new approach to delivery with a free carrier API and an operations management platform for shippers. We're building the first global data-driven last-mile network that has the potential to solve industry-wide problems.
Why work with us?
- We Are a Home-First Team: LineTen is committed to our home-first policy, which means that we honour remote working first and offer office space in London, England, and Porto, Portugal.
- We Believe in Having Fun: Our WellUs team organises monthly events like Pet Zoom Calls, 45-minute Yoga classes, and after-hours cocktail lessons.
- We Want You to Take a Break: We believe it is the quality of work that matters, not the hours spent “on the clock”. We offer flexible working hours and unlimited vacation.
- We Work in Teams: We work in teams rather than as individuals. Some teams, like our Product Teams, have agile ceremonies such as standups and sprint planning.
- We Pitch Our Ideas: We independently sketch ideas, then come to team discussions with prototypes instead of trying to reach consensus through extended discussion.
- We Challenge Norms: We don’t want to be typical. We believe in challenging the status quo. We encourage our teammates to make their jobs their own.
Job Summary
We are looking for a highly skilled and innovative Data Engineer to join our team and play a critical role in building and optimising our data infrastructure. As a Data Engineer, you will design and implement modern data solutions that enable the organisation to leverage data at scale. Your work will support advanced analytics, reporting, and AI/ML initiatives, driving key business insights and innovation.
This is a hands-on technical role requiring deep expertise in Azure data services, scalable data pipelines, and data governance. You will work with structured and unstructured data, enabling self-service reporting and ensuring that data is secure, accessible, and optimised for analytics.
The goal of this role is to provide internal and external data services that allow anyone to self-serve.
Key Responsibilities
Data Architecture & Pipeline Development
- Design, implement, and manage scalable and reliable data pipelines to integrate and process data from multiple structured and unstructured sources.
- Develop and maintain Azure-based data infrastructure, including Azure Data Lake, Synapse Analytics, Data Factory, and Azure Functions.
- Optimise data ingestion, transformation, and storage to support real-time and batch processing workloads.
- Ensure data quality, consistency, and integrity through robust data governance policies.
Data Warehousing & Business Intelligence Enablement
- Develop and execute a data lake and data warehousing strategy, ensuring efficient storage, accessibility, and management of large datasets.
- Enable self-serve analytics by integrating data with visualisation tools such as Power BI, supporting teams in creating insightful data models and reports.
- Design and implement efficient backend data models to support enterprise reporting and analytics solutions.
AI/ML & Advanced Analytics Support
- Collaborate with data scientists and analysts to prepare, clean, and transform large datasets for AI/ML model training and predictive analytics.
- Work on data preparation workflows that enhance machine learning pipelines and support data-driven innovation.
- Optimise data solutions to enable scalable machine learning operations (MLOps) and advanced analytics.
Data Governance, Security & Compliance
- Implement and enforce data governance policies, ensuring compliance with security, privacy, and regulatory frameworks (e.g., GDPR, SOC 2, ISO 27001).
- Develop strategies for data cataloguing, lineage tracking, and metadata management to improve data discoverability and compliance.
Performance Optimisation & Cloud Cost Management
- Monitor, troubleshoot, and optimise data pipelines to ensure high performance, scalability, and cost efficiency.
- Implement cloud cost optimisation strategies for data storage, processing, and analytics workloads.
DevOps & Automation for Data Engineering
- Contribute to CI/CD pipelines for data workflows, ensuring smooth deployments and operational efficiency.
- Automate infrastructure provisioning using Terraform, ARM templates, or Bicep to streamline cloud deployments.
Requirements
Core Technical Skills
- Proven experience as a Data Engineer, working on complex data integration, transformation, and storage solutions.
- Expertise in Azure data services, including Azure Data Factory, Data Lake, Synapse Analytics, and Azure Functions.
- Strong SQL proficiency for data modelling, transformation, and optimisation.
- Programming skills in C# for ETL, automation, and data pipeline development.
- Experience with self-serve reporting platforms (e.g., Power BI) and backend data model design for analytics.
- Deep understanding of data governance principles, including security, privacy, and compliance.
- Knowledge of modern data architectures, such as data mesh, data fabric, event-driven systems, and streaming architectures.
- Experience supporting AI/ML workflows by preparing, cleaning, and transforming large datasets.
Preferred Qualifications
- Experience with big data technologies such as Databricks, Apache Spark, or Hadoop.
- Familiarity with streaming platforms like Kafka, Azure Event Hubs, or Stream Analytics.
- Hands-on experience with cloud-native AI/ML frameworks (e.g., Azure Machine Learning, MLflow, TensorFlow).
- Infrastructure as Code (IaC) experience using Terraform, ARM templates, or Bicep.
- Experience implementing cost optimisation strategies for cloud-based data workflows.