Senior Data Engineer | AusNet

Victoria
Full time
AusNet
Mining, resources & energy
1,001-5,000 employees

When you choose AusNet, you’ll join genuine people working together, making Real. Progress.

At AusNet, we’re the link between renewable energy sources and local communities, playing a key role in the clean energy transition and providing essential energy to every Victorian family.

Help us lead the way, while enriching your career with real people who encourage you to bring your best and make real impact – for your career, our communities, and Australia’s cleaner energy future.

Purposeful work with genuinely good people. That’s refreshing.

Play a pivotal role as a Senior Data Engineer in implementing scalable data architectures that provide verified raw data, advanced datasets and insightful information to drive data-driven decision making. You will work closely with the Data & Digital Enablement team, Enterprise Architecture, the Data Governance Lead, data architects, analysts, and other engineers to ensure data pipelines are robust, efficient, and optimized for both operational and analytical needs.

This role is embedded within our Group Operations team, which is currently guiding AusNet to rapidly mature and provide outstanding strategic-level technical and operational capabilities, whilst ensuring cross-business coordination and collaboration.

With a hybrid split (Office/WFH), this role is based in Southbank, Melbourne and is part of the Data & Analytics team.

This role will deliver real outcomes, including but not limited to:

  • Collaborate with the Data & Digital Enablement team to ensure accurate, efficient data ingestion, transformation, and storage from multiple source systems.
  • Define business objects and success criteria for data validation and track performance using Power BI dashboards.
  • Work with Data Owners and Data Governance Lead to define data quality standards and establish validation models to identify and isolate bad data.
  • Assist in addressing data inconsistencies across systems and align with the master data framework.
  • Partner with Data Architects, Data Scientists, and Data Analysts to design and develop scalable ETL pipelines that efficiently process verified raw data, advanced datasets and insightful information.
  • Optimise data models to support BI and analytical use cases with high performance.
  • Liaise with the Data Governance Lead to enforce data quality, version control, and governance standards across all data pipelines.
  • Automate repetitive data tasks and reporting processes to improve efficiency and reduce manual effort, and continuously optimize data pipeline performance for scalability and reliability.
  • Serve as a technical lead to junior Data Engineers and strategic partner resources, providing guidance on AusNet best practices for data engineering, architecture design, and pipeline optimization.
  • Oversee Data Model Deployment and Performance.

You don’t need to check every box; however, we are looking for a good combination of:

  • Bachelor’s or master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • Proven understanding of the utilities sector, especially its decision-making processes and the insights required to support them.
  • Advanced knowledge of business decision impacts and the ability to translate business challenges into data-driven solutions.
  • Significant experience as a Data Engineer or in a similar role, with 3+ years in a senior role.
  • Track record of building and optimizing large-scale data pipelines in production environments.
  • Strong experience in identifying and resolving data quality and integrity issues.
  • Extensive experience with Azure and Databricks data capabilities, containerization (Docker, Kubernetes) and CI/CD pipelines.
  • Strong proficiency in programming languages such as SQL, Python, Java, or Scala.
  • Comprehensive experience with big data tools (e.g., Hadoop, Spark, Kafka, Flink) and data storage solutions (e.g., Redshift, Databricks, Azure Synapse, BigQuery, SQL/NoSQL databases).
  • Familiarity with cloud platforms (AWS, GCP, Azure) and their data services (e.g., S3, EC2, Lambda).
  • Proven record of data modeling, ETL processes, and performance optimization techniques.
  • Excellent communication skills with the ability to present complex data insights to both technical and non-technical audiences.

A real place to belong.

We celebrate unique voices, refreshing perspectives and diversity in our team. Engage and connect through our social club, family day, wide range of events or by joining one of our Employee Network Groups.


We believe in more than just competitive pay. Here’s what sets us apart:

  • Flexibility: whether this is hybrid work, flexible hours, or part-time arrangements, we’ll work with you to help balance work and life.
  • Leave: more than typical personal leave and a generous 14 weeks of paid parental leave, with no minimum service.
  • Community giving: a paid day to volunteer with our social impact partner, Foodbank, or for a cause that matters to you.
  • Perks: we offer all the other perks you’ve come to expect like purchased leave, income protection insurance, novated leasing, corporate discounts, private health cover discounts and more.

As an industry in transformation, we’re excited by the possibilities ahead.

So, if you’re passionate about our purpose and committed to making real progress, bring your energy and join AusNet. Together, we can shape a new way forward.