Part of our parent company, Great-West Lifeco, Canada Life UK has operated in the United Kingdom since 1903. We have hundreds of respected and supported employees, all committed to doing the right thing for our customers and colleagues.
Canada Life UK is transforming to create a more customer-focused business by providing our customers with expertise on financial and tax planning, offering home finance and annuities propositions, and providing collective fund solutions to third-party customers.
Job Summary
The Data Engineer is a technical role responsible for designing, developing, and maintaining data pipelines within the IT department. The pipelines will be realised in a modern data lake environment, and the engineer will collaborate with cross-functional teams to gather requirements and develop the conceptual data models. This role plays a crucial part in driving data-driven decision-making across the organisation, ensuring data availability, quality, and accessibility for various business needs.
What you'll do
Data Pipeline Development
- Design, model, develop and maintain data pipelines to ingest, store, process and present data (Azure Data Factory, Azure Data Lake Storage, Databricks, Power BI).
- Ensure data quality, accuracy, and consistency.
- Collaborate with data architects to ensure data pipelines align with the overall data architecture strategy.
- Perform data transformation tasks, including data cleansing, enrichment, and aggregation, to prepare data for analytics and reporting.
- Integrate data from structured and unstructured sources, ensuring compatibility and alignment with data models and business requirements.
- Automate data transformation processes to improve efficiency.
- Implement and maintain data quality checks and validation processes to identify and resolve data anomalies and errors.
- Monitor data pipelines for data quality issues and implement data quality improvements.
- Collaborate with business stakeholders to define data quality requirements.
- Collaborate with data architects and data scientists to design and implement data models, schemas, and structures.
- Ensure that data models support business reporting and analytics needs while optimising query performance.
- Maintain data dictionaries and metadata to document data structures and relationships.
- Optimise data storage, retrieval, and query performance by implementing indexing, partitioning, and caching strategies.
- Monitor data processing performance and address bottlenecks as they arise.
- Stay updated with best practices in data processing performance tuning.
- Create and maintain documentation for data pipelines, data transformation processes, and data integration procedures.
- Foster a culture of knowledge sharing within the data engineering team and across the organisation.
- Collaborate effectively with cross-functional teams, data stakeholders, and business units to understand data requirements and deliver data solutions that meet business needs.
- Communicate technical concepts and data solutions to non-technical stakeholders in a clear and understandable manner.
What you'll need
- Extensive experience in data engineering, including designing and developing data pipelines for retrieval, ingestion, presentation and semantics in an Azure environment.
- Solid Python programming skills and experience with Azure, including Databricks and Purview.
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows.
- Experience with Agile delivery and the ability to take data security into account in practice.
- Knowledge and experience with Azure analytics and data science components such as PaaS SQL databases, Stream Analytics, IoT Hub and Cosmos DB, Azure Logic Apps, Event Hubs, Azure Functions, Synapse, Azure Data Factory.
- Experience preparing data for data science and machine learning, as well as AI experience.
- DevOps experience using tools such as the Azure CLI, YAML and Bicep.
- Understanding of Lakehouse / Delta Lake and Data Mesh concepts.
- Skills in data acquisition (landing, ingestion and metadata) across various data types and sources, including Salesforce, XML, JSON, Parquet, flat file systems and relational data.
- Skills in data manipulation using Python or R within a Spark environment, orchestrated by Power Automate or Databricks.
- Data presentation and visualisation skills, e.g. Power BI or Qlik.
- Experience with Spark clusters, both permanent (elastic) and transient.
- Effective communication and collaboration skills to work with cross-functional teams and gather data requirements.
- Ability to optimise data solutions for performance, scalability, and efficiency.
- Familiarity with data governance, data security, and compliance requirements.
We believe in recognising and rewarding our people, so we offer a competitive salary and benefits package that's regularly reviewed. As a Canada Life UK colleague, you'll receive a comprehensive reward package including a generous pension and bonus scheme, along with income protection, private medical insurance and life assurance. We also offer a fantastic number of other benefits and support services, as well as regular personal and professional development.
How we work at Canada Life
Our culture is unique and incredibly important to us. We care about doing the right thing for our people, customers and community and helping others to build better futures. Our blueprint behaviours shape and influence how we work, and are central to the relationships we have with others. Every day we are encouraged to be more curious, own the outcome, face into things together and find a way forward.
We want colleagues to have rewarding careers with us so we invest in the development of our people, technology and workplaces. That's why we offer a range of training, flexible working and opportunities to grow and develop.
Diversity and inclusion
Building an inclusive workplace wi