The Role
This is a highly transactional environment within the business, where you will grow a Data Services Science practice from the ground up. You will have the opportunity to use modern technologies and to work with a cutting-edge data engineering platform.
What you will bring…
- Knowledge of leading cloud platforms; experience with GCP or AWS would be an excellent addition to your skill set.
- Experience with Data Warehousing, ETL technologies, and Data Quality.
- Strong expertise with Google Cloud, Azure, and related technologies.
- The ability to work across structured, semi-structured, and unstructured data, extracting information and identifying irregularities and linkages across disparate data sets.
- Meaningful experience in distributed processing (Spark, Hadoop, Hive/Impala, EMR, etc.).
- Deep understanding of Information Security principles to ensure compliant handling and management of client data.
- Experience working collaboratively in a close-knit team and in clearly communicating complex solutions.
- Golang, Python, T-SQL, MongoDB, or Scala development skills would be beneficial.
- Experience and interest in cloud infrastructure (Google Cloud Platform, AWS, Databricks, Data Lake) and containerization (Kubernetes, Docker, etc.).
- Experience working in an Agile team producing frequent deliverables.
- Advanced programming skills in at least one language (Go/Python/Scala/Java).
- Experience with cloud-based data warehouse services such as AWS Redshift, Google BigQuery, or equivalent.
It would also be great if you had experience with…
- Golang Programming
- Data Modelling
- Data pipelines
- GKE
- Airflow
- Pub/Sub
- Terraform
- Cloud Build
- DataProc
- Eventstore