Acting as the Hadoop Developer of the group, your main responsibilities will include:
- Designing and developing Big Data applications on Cloudera clusters
- Implementing designs based on requirements and architecture decisions
- Administering and supporting Big Data clusters, including Hortonworks/Cloudera and open-source Big Data ecosystems
- Building robust, efficient, and reliable data pipelines that ingest and process data from diverse sources into the cloud platform
- Providing technical expertise on new and ongoing initiatives, working with project stakeholders to arrive at the most feasible technical solutions
You will need to have:
- Experience in the implementation, configuration, and support of Hadoop environments
- Experience in designing, developing, and implementing Big Data applications
- Experience in Java, Spring, Python, shell scripting, Ansible, Puppet, Kubernetes, Docker, and Terraform
- Experience in building and administering open-source Big Data platforms is advantageous
- Experience in implementing DevOps tools such as Git, Bamboo, Puppet, etc. is highly advantageous
- Knowledge of designing, developing, maintaining, troubleshooting, and debugging applications in production environments
- Hands-on experience with advanced SQL, Java, Python, and other scripting languages
- Good time management skills and the ability to juggle multiple projects simultaneously
- Good stakeholder management and communication skills to keep the team updated on deliverables and ensure key project stakeholders are aware of progress against milestones
- A degree in Information Technology or a similar field