Data Operations Engineer - Remote, from anywhere in Australia at Zip Co

ZipBiz, Full-Time, Remote (Australia)
Description

About us

Zip is a high-growth fintech, founded in Australia and expanding worldwide, with the purpose of giving people the freedom to own it. We use technology and data to bring customers and partners together through a valued and fair payments experience.

Our strategic focus is underpinned by an aligned high-performance team, a reliable, secure and scalable platform, constantly-improving product experiences and a trusted consumer brand.

Zip is growing rapidly and globally, and our mission is to enable our consumers to choose Zip as their first payment choice, everywhere and every day.

We’ve built an amazing culture at Zip, and our teams are proud to work hard to provide innovative solutions to our customers and partners. Our values are central to everything we do: Customer First, Own It, Stronger Together, and Raise the Bar.

Engineering at Zip

At Zip, we build cloud-native software applications that serve millions of customers and process billions of dollars in payments. We use microservices, domain-driven design and event-driven architectures to ensure scalability and performance as we grow, and all this presents some fascinating engineering problems for our squads to solve.

We strive to create high-quality software and deliver at a high pace. To that end, we use test-driven development, continuous integration/continuous deployment (CI/CD) tooling, agile planning, and iterative development, and we place real value on clean code, good design, and automated testing.

The Role - Work from anywhere in Australia

The purpose of the Data Operations Engineer role is to ensure the efficient performance of our data pipelines, to build and deploy effective ETL solutions, and to support the operation of our reporting tools.

Reporting to the Engineering Manager, you will iterate quickly, optimize, and pivot solutions to satisfy requirements. You will collaborate closely with the Zip Business Capital squad and business areas such as Customer Experience and Finance to build and maintain the ETL services, data pipelines, data platforms, and reporting solutions behind our Capital and Trade products.

What you'll do

  • Own operational support for our data platforms and pipelines
  • Automate reporting and ETL services
  • Contribute to and lead data solution architecture decisions
  • Provide operational support for data- and reporting-related requirements across Zip Business
  • Improve the codebase, and make suggestions on where we can improve further
  • Perform effective code reviews
  • Provide support when production issues occur with products your team owns
  • Explore new technologies and share your findings with the engineering and operations teams
  • Contribute to a culture of continuous improvement and innovation
  • Work in a team with a "customer first" and "team first" mentality
  • Promote best-practice standards for test automation, clean architecture, observability, and continuous delivery
Your Skills and Experience

  • 3+ years' experience in a Data Operations, ETL Development, Data Engineering, or similar role
  • Very strong SQL skills with deep experience in ETL/ELT
  • Experience configuring and managing databases, data lakes, and data warehouses (including Redshift, MySQL, PostgreSQL, S3, and Athena)
  • Analytics and reporting experience with Tableau and Power BI
  • Experience with data pipeline tools (e.g. Stitch and Talend)
  • Experience building containerized and serverless data management solutions using Docker, Kubernetes, and AWS Lambda
  • Exposure to modern monitoring tooling, including New Relic, the ELK stack, and CloudWatch
  • Polyglot software engineering and automation scripting experience (e.g. Python, Bash, JavaScript)
  • Experience architecting, implementing, running and automating ETL solutions
Desirable skills (not mandatory)

  • Some experience with NoSQL and document-based data stores
  • Experience with AWS services such as Lambda, DynamoDB, S3, CloudFront, SNS/SQS, ECS/EKS, and API Gateway
  • Infrastructure as code with Terraform or CloudFormation
  • Passionate about Continuous Delivery and rapid development
  • Stream processing with Kafka, as well as visualization and analytics tools such as Grafana and Kibana