Hybrid Working Pattern - 3 days in Office & 2 WFH
About us
Cynergy Bank is the UK’s human digital bank serving the needs of ‘scale-up’ or medium-sized and fast-growing SMEs; professionals; high-net-worth and mass-affluent individuals: in essence, those market segments that still value human service enabled by great technology.
We recognise that professional and personal lives often overlap and our mission is to help empower our customers to achieve their ambitions by serving all their interdependent banking needs. We provide a comprehensive range of digitally enabled products and services to meet the property finance, business and commercial banking, private banking and personal savings needs of our customers.
Our human and digital model transforms banking for customers who still value a face-to-face relationship that is enabled by the latest digital technology.
We partner with firms such as Google Cloud, Cigniti and Slalom as we continue to innovate in the human digital space.
Cynergy Bank Limited is authorised by the Prudential Regulation Authority and regulated by the Financial Conduct Authority and the Prudential Regulation Authority. Eligible deposits with Cynergy Bank Limited are protected by the UK Financial Services Compensation Scheme.
For more information on Cynergy Bank visit www.cynergybank.co.uk
Company Benefits
- Competitive salary and Company Bonus
- 210 hours (30 days) holiday plus bank holidays
- Option to purchase an additional 10 days holiday
- Pension contribution and Life Assurance
- Income Protection Scheme and Season Ticket Loan
- Private Medical Insurance and Health Check (After Probation)
- Electric Car Scheme and Money Coach (After Probation)
The role
As part of Cynergy Bank’s digital transformation, the Data Engineer will perform a pivotal role in helping to build the company’s new data and analytics architecture.
Working closely with data architects, other Data Engineers and business stakeholders, the role offers a chance to shape the D&A landscape, from data modelling to tool selection. This role requires broad technical expertise and hands-on ability.
Responsibilities:
- Data Pipeline Development: Design, develop, and maintain data pipelines to efficiently extract, transform, and load data from disparate sources into our data warehouse or data lakes. Ensure data integrity, quality, and consistency throughout the process.
- ETL/ELT (Extract, Transform, Load / Extract, Load, Transform) Processes: Create and optimize ETL/ELT workflows to handle large volumes of data, including data cleansing, enrichment, and transformation, adhering to best practices and industry standards.
- Data Modelling: Collaborate with data analysts, data scientists, and other stakeholders to understand their data requirements. Design and implement appropriate data models to support efficient data querying and analysis.
- Database Management: Administer and manage databases, ensuring their performance, security, and availability. Implement backup and recovery strategies to safeguard critical data.
- Data Governance: Enforce data governance policies, including data security, privacy, and compliance, in accordance with regulatory standards and internal guidelines.
- Data Integration: Integrate data from internal and external sources, including third-party vendors and APIs, to enrich the existing datasets and expand the scope of insights.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and queries to reduce latency and improve overall data processing efficiency.
- Automation: Identify opportunities to automate data workflows, data validation, and monitoring processes to enhance operational efficiency and reduce manual efforts.
Essential Knowledge & Experience
- Bachelor’s degree in Computer Science or a related field.
- Experience with Google Cloud Platform and working in multi-cloud environments.
- Strong experience in Data Engineering or a related field, with strong programming skills in Python and SQL.
- Proven experience in designing and implementing complex data pipelines using tools such as Apache Beam, Google Dataflow or dbt.
- Proven experience of building and maintaining data platform elements (ingestion layers, data lakes, data warehouses, etc.).
- Hands-on knowledge of data modelling and related tools and techniques.
- Demonstrable skills in data optimisation (storage, processing and transfer).
- Demonstrated ability to identify and troubleshoot data quality issues.
- Excellent communication and collaboration skills with a strong attention to detail.
Desirable Knowledge & Experience:
- Data Architecture Experience
- Experience implementing data management frameworks, such as the Cloud Data Management Capabilities (CDMC) framework
- Experience of current data/analytics operating models (data mesh, data fabric, etc.)
- Knowledge of computational data governance solutions
- Experience of data governance tooling, especially Alation
- Knowledge of modern data analytics ways of working, technologies, and