As the selected candidate, you will:
- Design, develop, and maintain ETL/ELT pipelines on GCP using BigQuery, Cloud Storage, Dataflow, Cloud Functions, and other related services.
- Build and optimise data models within BigQuery to support analytics, reporting, and operational needs.
- Implement data quality measures, governance standards, and metadata management best practices.
- Collaborate effectively with stakeholders, including data analysts and data scientists, to understand their data requirements and deliver reliable solutions.
- Automate data workflows and create reusable frameworks using Python, SQL, and cloud-native tools.
- Monitor and tune the performance of data pipelines to ensure their reliability, cost-efficiency, and scalability.
- Support the migration of legacy datasets and workloads into GCP environments.
- Contribute to establishing enterprise data architecture standards and maintain comprehensive documentation.
To succeed in this role, you will bring:
- Proven experience as a Data Engineer, with strong knowledge of GCP services, particularly BigQuery.
- Expertise in designing and implementing scalable data pipelines.
- Proficiency in SQL and Python for automation and data processing.
- Experience in data modelling, data quality, and governance principles.
- Strong collaboration skills to work effectively with cross-functional teams.
- A solid understanding of cloud architecture and enterprise data strategies.
- Familiarity with legacy data migration projects.
- Knowledge of data architecture standards and data management practices.
Peoplebank and Leaders IT are committed to creating a diverse and inclusive workplace where everyone belongs. We welcome applications from people of all backgrounds, identities, and experiences. If you need adjustments to the recruitment process due to your circumstances, please let us know; we're here to support you.