
Data Engineer
Job Description
Posted on: December 11, 2025
Senior Data Engineer | Azure Databricks | up to £80k basic + bonus & benefits | Remote-first
I am recruiting for a global insurance client seeking a hands-on technologist to join their team as a Senior Data Engineer. This is a key role focused on the development and architecture of their global data warehouse platforms. The successful candidate will be comfortable with a variety of data warehousing technologies, architectures, and large-scale distributed systems.
This position is UK-based and remote-first, with occasional visits to the London office. It reports to the Lead Engineer within a small team that offers a high degree of autonomy.
What You Will Do
- Provide robust development and support for the company's variety of data platforms, including existing SQL/Synapse Data Warehouses and modern Databricks Lakehouse solutions.
- Work closely with the business and PowerBI developers to ensure the delivery of accurate and reliable reports.
- Deliver timely and accurate data for global reporting, enabling real-time business decisions across core departments such as Underwriting, Claims, Risk, and Analytics.
- Drive hands-on engineering activities, including leading design sessions, making architectural choices, and managing the full DevOps lifecycle (code versioning, code reviews, unit testing, build processes, and deployments).
- Design comprehensive data architecture models, covering both logical and physical database models.
- Follow the standard Agile Software Development Life Cycle (SDLC) for development and ensure the overall roadmap is understood and delivered.
Key Requirements
You must have extensive experience in building scalable data engineering solutions in the cloud, demonstrating proficiency in the following:
- Cloud Data Platforms: Deep, hands-on experience utilizing key components of the Azure ecosystem including Databricks (for Lakehouse solutions), Azure SQL DB, and Synapse (for data warehousing and analytics).
- Data Integration & Pipelines: Expertise in developing and managing complex ETL/ELT data pipelines using tools like Azure Data Factory.
- Coding & Scripting: High proficiency in core data programming languages, specifically Python and T-SQL.
- Data Modelling: Extensive practical experience with dimensional modelling techniques and their application in designing robust data warehouse architectures.
- DevOps & Version Control: Proven experience with best-in-class code management and documentation practices using Git and leveraging Azure DevOps for CI/CD pipelines.
- Consultative Skills: Strong analytical, consultative, and communication skills, with the ability to exercise good judgment and collaborate effectively with both technical and business personnel.
Hit Apply To Find Out More!
Apply now
RemoteJobsHub.app