· Knowledge of relational database systems and technologies such as PostgreSQL, SQL Server, and Oracle, including data modelling.
· Domain knowledge of Data Engineering scheduling and orchestration tools such as Apache Airflow, Dagster, Apache NiFi, and Apache Oozie
· Knowledge of Big Data technologies such as Snowflake, Google BigQuery, Databricks, and Microsoft Fabric
· Extensive experience building Data Engineering pipelines and applications in Python
· Knowledge of document databases such as MongoDB and Azure Cosmos DB
· Knowledge of database distribution and scalability, including routing and sharding
· Experience setting up a Data Engineering function and running it with minimal business support
· Experience delivering production-grade Data Engineering pipelines with monitoring and service notifications
· Domain knowledge of modern data architecture frameworks such as Data Mesh and Data Fabric
· Proven experience modelling a PostgreSQL database schema, migrating data assets into that schema, and validating the data in the new database.
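By way of illustration, the last requirement (migrate, then validate) is often approached as a reconciliation check between source and target. The sketch below is a minimal, hypothetical example of such a check in pure Python; the function names and sample rows are invented for illustration, and in practice the row lists would come from cursors over the legacy store and the new PostgreSQL schema (e.g. via psycopg2), ordered by primary key:

```python
import hashlib

def row_checksum(row):
    # Hypothetical helper: deterministic checksum of a row's values,
    # joined column by column. Real pipelines would normalise types
    # (dates, decimals, NULLs) before hashing.
    joined = "|".join(str(value) for value in row)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def validate_migration(source_rows, target_rows):
    # Compare row counts, then per-row checksums, between the source
    # extract and the migrated target. Assumes both lists are sorted
    # by the same primary key.
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "count_match": len(source_rows) == len(target_rows),
        "mismatched_rows": [],
    }
    for index, (src, tgt) in enumerate(zip(source_rows, target_rows)):
        if row_checksum(src) != row_checksum(tgt):
            report["mismatched_rows"].append(index)
    report["checksums_match"] = not report["mismatched_rows"]
    return report

# Invented sample data standing in for query results.
source = [(1, "alice", "2021-01-01"), (2, "bob", "2021-02-01")]
target = [(1, "alice", "2021-01-01"), (2, "bob", "2021-02-01")]
print(validate_migration(source, target)["checksums_match"])  # True
```

A count check alone misses silently truncated or corrupted values, which is why the sketch also hashes each row; a production validation would typically add aggregate checks (sums, min/max per column) as well.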