Streaming Platform Lead

  • Job Reference: R-00123465-OTHLOC-GBR-5FEDI034
  • Date Posted: 29 April 2021
  • Employer: RBS
  • Location: Edinburgh
  • Salary: On Application
  • Sector: Banking & Financial Services
  • Job Type: Full Time

Job Description

Join us as a Streaming Platform Lead

  • This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences
  • You’ll be simplifying the bank by developing innovative, data-driven solutions, aspiring to be commercially successful through insight, and keeping our customers’ and the bank’s data safe and secure
  • Participating actively in the data engineering community, you’ll deliver opportunities to support our strategic direction while building your network across the bank

What you'll do

We’ll look to you to lead and inspire a team of data engineers to drive value from data for the customer. You’ll be working closely with core technology and architecture teams to deliver strategic data solutions, while driving data engineering strategies to build a scalable data architecture and feature-rich customer datasets.

We’ll also expect you to be:

  • Driving the automation of data engineering pipelines through the removal of manual stages
  • Developing and sharing your knowledge of the bank’s data structures and metrics, advocating change where needed for product development
  • Developing a strategy for streaming data ingestion and transformations
  • Educating and embedding new data techniques into the business through role modelling, training and experiment design oversight
  • Driving Agile and DevOps adoption in the delivery of data engineering

The skills you'll need

To be successful in this role, you’ll need to be an expert-level programmer and data engineer with a qualification in Computer Science or Software Engineering. You’ll also need a strong understanding of how data is used by wider teams and the end customer, along with its dependencies, as well as extensive experience in extracting value and features from large-scale data.

You’ll also demonstrate:

  • Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling and data wrangling
  • Working knowledge of messaging, event or streaming technology such as Apache Kafka
  • Knowledge of programming languages such as Python, SQL, Java, and Scala
  • An understanding of change data capture processes and stream processing tools such as StreamSets and Kafka Streams
  • A background of implementing programming best practice, especially around scalability, availability and performance
  • Experience of managing engineering teams to deliver and support critical business services
  • A good understanding of modern code development practices