Job description
Our people work differently depending on their jobs and needs. From hybrid working to flexible hours, we have plenty of options that help our people to thrive.
This role is based in the United Kingdom and as such all normal working days must be carried out in the United Kingdom.
Join us as a Data Engineer
- You’ll be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making
- We’ll look to you to drive the build of effortless, digital-first customer experiences
- If you’re ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you’re looking for
What you'll do
As a Data Engineer, you’ll be looking to simplify our organisation by developing innovative data-driven solutions through data pipelines, modelling and ETL design, aspiring to be commercially successful while keeping our customers’ and the bank’s data safe and secure.
You’ll drive customer value by understanding complex business problems and requirements, applying the most appropriate and reusable tools to gather data and build solutions. You’ll support our strategic direction by engaging with the data engineering community to deliver opportunities, along with carrying out complex data engineering tasks to build a scalable data architecture.
Your responsibilities will also include:
- Building advanced automation of data engineering pipelines through removal of manual stages
- Embedding new data techniques into our business through role modelling, training, and experiment design oversight
- Delivering a clear understanding of data platform costs to meet your department’s cost-saving and income targets
- Sourcing new data using the most appropriate tooling for the situation
- Developing solutions for streaming data ingestion and transformations in line with our streaming strategy
The skills you'll need
To thrive in this role, you’ll need a strong understanding of data usage and dependencies and experience of extracting value and features from large scale data. You’ll also bring practical experience of programming languages alongside knowledge of data and software engineering fundamentals.
Additionally, you’ll need:
- Good knowledge of data architecture, streaming ecosystems and distributed processing
- Strong experience with Kafka, StreamSets, NiFi, Flume or Flink
- Experience of working in a fast-paced environment that has shifting priorities
- Experience of working in an Agile environment
- Experience of ETL technical design, data quality testing, cleansing and monitoring, data sourcing, and exploration and analysis
If you need any adjustments to support your application, such as information in alternative formats or special requirements to access our buildings, or if you’re eligible under the Disability Confident Scheme, please contact us and we’ll do everything we can to help.