Our trusted partner, a global leader operating in over 35 countries, is currently recruiting a Data Technical Architect with recent experience designing data platforms and integrating Databricks within AWS. This key role focuses on strategic roadmaps for each data platform and plays a central part in governance, standards and best practice as our client enters a new phase of transformation. Our client is looking to pay £95,000 + 30% bonus + excellent benefits, working on a hybrid weekly basis in Luton.
Whilst this is a highly technical role, the individual will not be required to be hands-on. They will have strong capabilities in creating roadmaps and understanding business requirements, plus prior experience redesigning cost-effective, fit-for-purpose solutions within an AWS serverless ecosystem, with key objectives of improving GDPR compliance, security and privacy, and the CDP platform.
Core responsibilities:
- Understand business requirements and strategy to ensure our platform capabilities are fit for purpose.
- A vested interest in the day-to-day health of the platforms, ensuring they run optimally and remain cost-effective.
- Research, test and recommend emerging technologies in a fast-moving market that could be relevant to our client; assess them and promote their adoption.
- Provide technical expertise, assurance and guidance to development teams (both our client's and third parties') around data acquisition and curation onto the platform.
- Shape the future of the data domain and aid in landing new “data” capabilities into BAU, considering people, process and technology.
Key skills and experience:
- Strong knowledge of the core Databricks platform and its architecture within AWS (must have).
- Understanding of cloud-native services (PaaS, SaaS) and how they can complement the core data platform (e.g. on AWS: Airflow, Glue, SageMaker, DMS, etc.).
- Strong understanding of data management principles (security and data privacy) and how they can be applied to solutions, e.g. access management, data privacy and the handling of sensitive data (e.g. under GDPR).
- Understanding of data science tooling for machine learning (e.g. MLflow, SageMaker).
- Stakeholder management skills at all levels.
- An “automation first” mindset when building and deploying solutions, with consideration for self-healing and fault-tolerant methods to minimise manual intervention and support activities.
- Understanding of open-source big data processing frameworks such as Spark and Hadoop.