Join us as a Data Engineer
- You'll be part of an evolving team and will play a key role in the transformation of our data ingestion processes for the Marketing and Analytics function
- You’ll be simplifying the bank by developing innovative, data-driven solutions, aspiring to be commercially successful through insight, and keeping our customers and the bank safe and secure
- Participating actively in the data engineering community, you’ll deliver opportunities to support our strategic direction while building your network across the bank
What you'll do
As a Data Engineer, you'll work in a cross-disciplinary team of data scientists and analysts. We’ll look to you to drive value for the customer through modelling, sourcing and data transformation. You’ll work closely with core technology and architecture teams to deliver strategic data solutions, while driving Agile and DevOps adoption in the delivery of data engineering.
Additionally, you'll work closely with our business stakeholders to extract, load and transform data from third parties, building and maintaining the marketing analytics that enable the bank to make better strategic marketing decisions. We'll also expect you to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences.
Day-to-day, you'll be:
- Automating data engineering pipelines by removing manual stages
- Developing comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development
- Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight
- Delivering data engineering strategies to build a scalable data architecture and feature-rich customer datasets for data scientists
- Developing solutions for streaming data ingestion and transformation in line with our streaming strategy
The skills you'll need
To be successful in this role, you’ll need to be an intermediate-level programmer and Data Engineer with a qualification in Computer Science or Software Engineering. You’ll also have a strong understanding of data usage and its dependencies across wider teams and the end customer, as well as a proven track record of extracting value and features from large-scale data.
It would be ideal if you have experience with Snowflake, Unix scripting, cloud platforms, Airflow, REST APIs, NoSQL and Kafka; however, we’ll provide the necessary training to support your development.
You’ll also demonstrate:
- Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling
- Extensive experience using RDBMS, ETL pipelines, Python and ANSI SQL
- A good understanding of modern code development practices, such as pairing, code reviews and source management
- Good critical thinking and proven problem-solving abilities