Senior Data Engineer - #1677193

Peaple Talent


Date: 6 hours ago
City: Bristol
Contract type: Full time
Work schedule: Full day

Senior Data Engineer | Bristol | Permanent Full-Time | Hybrid Working


Peaple Talent are pleased to be working with a returning client in Bristol looking to recruit a Senior Data Engineer.


This is a full-time, permanent role and requires hybrid working in Bristol.


This Senior Data Engineer role will involve working closely with the team lead, helping to guide a team of engineers in building, optimizing and maintaining robust data pipelines and a cutting-edge analytics platform.


Key Responsibilities:



  • Design, develop, and maintain scalable data solutions that meet both technical guidelines and strategic business goals.

  • Take an active role in Agile practices, including sprint planning, peer reviews, and change control meetings.

  • Partner with the team lead to coach and support team members, offering hands-on mentorship throughout the engineering lifecycle.

  • Research and test emerging tools and technologies to drive future-focused innovation.

  • Promote a culture centred around automation and continuous process enhancement.

  • Assist in shaping and upholding high development standards, collaborating on best practices and code quality with the team lead.

  • Resolve advanced data-related challenges and fine-tune existing infrastructure for greater speed and dependability.

  • Translate complex engineering concepts into clear, accessible language for non-technical colleagues.

  • Establish and manage systems for data quality assurance, integrity checks, and monitoring in line with governance frameworks.

  • Work closely with various departments to gather requirements and ensure data solutions reflect real business needs.


Key Experience Required:



  • Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets (a brief illustrative sketch follows this list).

  • Experience mentoring peers and supporting team growth by sharing knowledge and improving collective engineering practices.

  • In-depth knowledge of data modelling best practices and common patterns used in pipeline architecture.

  • Confident in version control with Git and familiar with setting up CI/CD workflows using platforms like Azure DevOps or similar tools.

  • Hands-on experience with orchestration tools like Apache Airflow for managing complex data workflows (see the Airflow sketch after this list).

  • Practical familiarity with low-code or no-code platforms such as Talend and SnapLogic for streamlined pipeline development.

  • Strong experience working with cloud-based data services, especially within AWS environments, utilizing tools like Lambda, S3, Redshift, Glue, Athena and Secrets Manager.

  • Skilled in building modern data warehouses on platforms like Amazon Redshift, Snowflake, or Databricks.

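For illustration, a minimal PySpark sketch of the kind of pipeline described above: reading semi-structured JSON events alongside structured CSV reference data, joining them, and writing partitioned Parquet. The bucket paths, column names, and schema are hypothetical and purely for context.

```python
# Minimal, illustrative PySpark job: read semi-structured JSON events and a
# structured CSV reference table, join them, and write partitioned Parquet.
# Bucket paths, column names, and schema are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-pipeline").getOrCreate()

# Semi-structured input: raw JSON events.
events = spark.read.json("s3://example-bucket/raw/events/")

# Structured input: customer reference data as CSV with a header row.
customers = (
    spark.read.option("header", True)
    .csv("s3://example-bucket/raw/customers.csv")
)

daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .join(customers, on="customer_id", how="left")
    .groupBy("event_date", "customer_segment")
    .agg(F.count("*").alias("event_count"))
)

# Partitioned Parquet output, queryable downstream (e.g. via Athena or Redshift Spectrum).
(
    daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-bucket/curated/daily_event_counts/")
)
```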

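Similarly, a minimal Airflow sketch of the orchestration pattern mentioned above, assuming Airflow 2.4+ (which accepts the `schedule` argument); the DAG id, task ids, and Python callables are hypothetical placeholders rather than anything specific to this role.

```python
# Minimal, illustrative Airflow DAG: a daily extract -> transform -> load chain.
# Assumes Airflow 2.4+ (the `schedule` argument); DAG id, task ids, and the
# Python callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source data, e.g. land raw files in S3")


def transform():
    print("run PySpark transformations over the raw data")


def load():
    print("load curated outputs into the warehouse")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```
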
If you are interested in this position, please apply directly on LinkedIn with an updated copy of your CV.
