Data Engineer
Searches @ Wenham Carter is looking for a Data Engineer
Job description
We are currently recruiting a Data Engineer for one of our clients. The role is outside IR35, pays £400-500 per day, is fully remote, and will initially run for 6 months.
Key Responsibilities
• Design, develop, and maintain batch and streaming data pipelines using Databricks (Apache Spark)
• Build and optimize ETL/ELT workflows for large-scale structured and unstructured data
• Implement Delta Lake architectures (Bronze/Silver/Gold layers; see the illustrative sketch after this list)
• Integrate data from multiple sources (databases, APIs, event streams, files)
• Optimize Spark jobs for performance, scalability, and cost
• Manage data quality, validation, and monitoring
• Collaborate with analytics and ML teams to support reporting and model development
• Implement CI/CD, version control, and automated testing for data pipelines
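For illustration only, here is a minimal sketch of the Bronze/Silver/Gold (medallion) pattern referenced in the responsibilities, written in PySpark as it might run on Databricks. The paths, table names, and columns (event_id, event_ts, event_type) are assumptions made for the example, not details of the client's environment.

# Minimal Bronze/Silver/Gold sketch in PySpark -- illustrative only.
# Paths, table names, and columns below are assumptions, not the client's actual setup.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: ingest raw events as-is, adding a load timestamp for lineage.
bronze = (
    spark.read.json("/mnt/raw/events/")          # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze_events")

# Silver: deduplicate and enforce basic typing/quality rules.
silver = (
    spark.table("bronze_events")
    .dropDuplicates(["event_id"])                # assumed business key
    .filter(F.col("event_ts").isNotNull())
    .withColumn("event_ts", F.to_timestamp("event_ts"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver_events")

# Gold: aggregate into a reporting-ready table.
gold = (
    spark.table("silver_events")
    .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
    .agg(F.count("*").alias("event_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold_daily_event_counts")

In a real pipeline each layer would typically run as a separate scheduled job with validation, monitoring, and CI/CD around it, as described in the responsibilities above.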
Required Qualifications
• 3+ years of experience as a Data Engineer
• Strong experience with Databricks and Apache Spark
• Proficiency in Python (required) and advanced SQL
• Hands-on experience with AWS or Azure cloud services:
  • AWS: S3, EMR, Glue, Redshift, Lambda, IAM
  • Azure: ADLS Gen2, Azure Databricks, Synapse, Data Factory, Key Vault
• Experience with Delta Lake, Parquet, and data modeling
Extra information
- Status: Open
- Education Level: Secondary School
- Location: United Kingdom
- Type of Contract: Full-time
- Published at: 28-01-2026
- Profession type: ICT
- Full UK/EU driving license preferred: No
- Car preferred: No
- Must be eligible to work in the EU: No
- Cover letter required: No
- Languages: English