About the Role
Reporting to the Data Delivery & Governance Manager, you'll work closely with internal teams to deliver data engineering solutions that enable data-driven insights for customers and commercial outcomes. The role is also responsible for the smooth running of machine learning pipelines, including automated testing and monitoring, with a view to the continual improvement of deployed models.
Responsibilities
- Contribute regularly to technology innovation in the data science and data engineering domains, sharing expertise with the business and team members
- Provide technical expertise to Data Ops and Data Commercialisation teams
- Participate in deployment and implementation activities for data engineering solutions
- Execute and refine robust Machine Learning Operations (MLOps) practices
- Identify and streamline legacy processes through automation and re-engineering.
Mandatory experience
- Experience with cloud computing, and with GCP services in particular, is required
- Experience in data manipulation and visualisation using enterprise tools, including Tableau and ETL software
- Bachelor's degree in Engineering, Computer Science or related IT qualification
Desirable experience
- Experience in Python, scripting languages, and pipeline automation tools such as Airflow and Dataflow
- Excellent understanding of DevOps best practices
- Understanding of Agile methodology and an intermediate grasp of software development principles
For more information, please contact Ben Neal on 0380807217 and quote the title or #209045
Looking forward to hearing from you!