Skills and accountabilities:
- Solid experience in large-scale data transformation, and in the design and delivery of high-quality data processing pipelines, with a focus on cloud technologies:
- Strong SQL skills
- Development experience in Scala or Python
- Open-source data tooling (Apache Kafka, Beam, Spark, etc.)
- Knowledge of, or experience with, traditional and MPP data warehouses (Exadata, Teradata, Amazon Redshift, MS SSAS, etc.) and associated ETL tooling (Informatica, etc.)
- OOP or functional design patterns
- Understanding of and experience with dimensional models (star, snowflake, etc.)
- CI/CD, GitHub
- Traditional RDBMSs (MS SQL Server, PostgreSQL, MySQL, etc.)
- Cloud: AWS (S3, EMR, EKS, EC2, Lambda, etc.)
- Skilled in designing and building maintainable, scalable end-to-end data pipelines
- Skilled in physical and logical data modelling
- Experience in a corporate environment
- Experience in the finance and data services domain
- Experience working on large, complex projects in an unpredictable environment with competing priorities
- Experience with reporting tools such as Power BI
- Ingestion and curation of source data in the data lake; building and testing of conformance and content zone layers
- Ability to understand user requirements and to translate them into functional requirements
- Automate processes and controls to foster trust in data.
- Work collaboratively with source teams, product owners, platform teams, and other consultant developers and QAs.
- Deliver improved productivity and efficiency.
Bachelor's degree in Information Technology, Computer Science, or a similar field