Data Engineer

Job Title: Data Engineer
Contract Type: Contract
Location: Sydney CBD, New South Wales
Salary: Negotiable
Reference: 195809_1561349601
Contact Name: Meg Geronimo
Contact Email:
Job Published: June 24, 2019 14:13

Job Description

Data Engineer

Permanent and contract positions available!

Immediate start


  • Design, build, maintain, and troubleshoot data processing pipelines, with particular emphasis on the security, reliability, fault tolerance, scalability, fidelity, and efficiency of such solutions.
  • Analyse data to gain insight into business outcomes, build statistical models to support decision-making, and create machine learning models to automate and simplify key business processes.

The ideal candidate will have:

  • Minimum 5 years of experience in data engineering, working with distributed architectures, ETL, EDW and Big Data technologies
  • Experience working in data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments.
  • Experience architecting and developing internet-scale, production-grade Big Data solutions in virtualised environments such as Amazon Web Services, Azure and Google Cloud Platform.
  • Extensive experience working with SQL across a variety of databases
  • Experience working with both structured and unstructured data sources using cloud analytics tools (Google Cloud BigQuery, Bigtable, TensorFlow, etc.)
  • Demonstrated ability in one or more of the following programming or scripting languages - Python, JavaScript, or Java.
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB or similar
  • Experience with Big Data tools such as Pig, Hive, Impala, Sqoop, Kafka, Flume, Jupyter, Data Studio
  • Experience with Google Cloud services such as streaming + batch processing, Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery & Bigtable, Beam, Airflow
  • Knowledge and demonstrated use of contemporary data mining, cloud computing and data management tools, including but not limited to Microsoft Azure, AWS, Google Cloud, Hadoop, HDFS, MapR and Spark.
  • Tertiary qualification or equivalent in an I.T. discipline
  • Certifications in Big Data and/or Cloud
  • Experience with Data Mapping and Modelling
  • Experience with Data Analytics tools

Should this role be of interest to you, please click the "APPLY" button now!

For a more detailed discussion about this role, please contact Meg Geronimo at 9409 4716.