
Data Engineer

  • Remote
    • Sydney, New South Wales, Australia
    • Melbourne, Victoria, Australia
    +1 more
  • Tech

Job description

About TRAILD 

TRAILD is a dynamic, fast-growing, remote-first SaaS company that streamlines, automates, and protects Accounts Payable (AP). 

Just as your bank provides always-on protection for your credit card, TRAILD delivers the same level of security to help businesses safeguard their B2B payments from fraud and errors. 

Our specialised accounts payable software integrates seamlessly with leading ERP systems, helping clients make their processes more streamlined, automated, and secure. 

We serve customers around the globe, and the market opportunity ahead of us is huge. We’re growing fast - it’s an exciting time to join! 

Here’s how one customer describes their experience with TRAILD: [video] 

  

Key Responsibilities:

  • Design, develop, and maintain a data lake/warehouse platform ensuring reliability, scalability, performance, and security.

  • Build and optimize data pipelines to ingest, process, and consolidate data from multiple sources, including both third-party systems and internal applications.

  • Implement and support both real-time streaming and batch data processing frameworks to keep the data platform up-to-date and analytics-ready at all times.

  • Develop complex, code-based ETL/ELT data pipelines with performance-optimised data modelling.

  • Collaborate with stakeholders across engineering, analytics, product, and business teams to define and execute data platform roadmaps, balancing business needs with technical sustainability.

  • Champion best practices in data governance, quality, lineage, privacy, and security for all workflows and datasets.

  • Proactively troubleshoot and resolve data issues, automate recurring processes, and optimize operational efficiency at scale.

  • Continually research and adopt new data engineering methods and technologies, especially within the GCP ecosystem, for real-time and batch processing.

Job requirements

Who You Are:

  • 5+ years' hands-on experience in data engineering, with a proven track record of architecting and operating enterprise-scale data platforms.

  • Experience building and deploying data lakes and data warehouses on GCP using services including BigQuery, Datastream, Dataflow, Data Fusion, Pub/Sub, Composer (Airflow), Cloud SQL, Cloud Functions, and GCS.

  • Strong working knowledge of PostgreSQL, Python, and SQL.

  • Demonstrated experience integrating, transforming, and loading data from multiple heterogeneous sources into a unified warehouse environment.

  • The ability to engage with senior-level stakeholders.

  • Strong collaborator, able to work cross-functionally to understand business objectives and translate requirements into robust data platform solutions.

  • Experience with data security, privacy, and compliance frameworks.

  

Perks of Working at TRAILD 

  • Equity options - share in the company’s success. 

  • Remote-first flexibility: choose remote, hybrid, or whatever setup works best for you. 

  • A team that genuinely loves working here: we scored 100% eNPS in 2023. 

  • Regular company events - from team dinners to Pilates sessions and bakery runs. 

 

Please note that final applicants for this role will be asked to consent in writing to a police check / criminal background check, to the extent permitted by law in their jurisdiction of employment. 
