Senior Developer or Data Engineer (Java, Big Data) for REST

Job Title: Senior Developer or Data Engineer (Java, Big Data) for REST
Contract Type: Contract
Location: Sydney, New South Wales
Salary: Negotiable
Reference: 243213_1644278772
Contact Name: Pragna Katta
Contact Email:
Job Published: February 08, 2022 11:06

Job Description

We are seeking a Senior Developer or Data Engineer (Java, Big Data) for REST in Canberra and Sydney. Apply now for a great opportunity!

About the Role
This role works closely with business analysts to build and maintain the "engine" that processes and standardises all data received, including transaction reports as well as other datasets leveraged in the data analysis process, in particular for the Reporting Entity System Transformation (REST) program. This is a highly technical role: the processing logic implemented must not only reflect the business requirements but also be extremely fast. As the Data Foundations team is responsible for both the legacy and the strategic data processing environments, this role provides vital services across both technology stacks. The Senior Developer/Data Engineer will work collaboratively with a cross-functional delivery team and will be responsible for the following services:

  • Deliver high-quality pipeline and API code aligned to approved designs and industry best practice
  • Technical release co-ordination, both within the team for weekly releases and with all clients for cutover releases
  • Proactive management of potential API outages to internal and external clients
  • Perform other tasks as required by the team to meet business objectives.

The Senior Developer/Data Engineer will have the following skills:

  • Degree in Computer Science or equivalent
  • Excellent development skills in Java and Scala, specifically building and performance-tuning applications that process big datasets (> 3TB) in parallel.
  • Commercial experience with Apache Spark, Kafka and the Elastic stack (Kibana, Logstash, Elasticsearch)
  • Ability to work with business analysts and testers to define feature files (which explain data processing logic).
  • Ability to write complex applications that use data from both SQL and NoSQL data stores (and optimise these applications for speed)
  • Experience in choosing and writing efficient processing algorithms.
  • Previous experience with containerisation and associated tools (including Rancher, Kubernetes)
  • An understanding of continuous delivery techniques and tools (including Jenkins).
  • Experience with automated testing frameworks.

For more information or to apply, please contact Pragna Katta on 61 2 6245 1706 quoting Job Reference: 18905