Senior Data and Analytics Consultant
Alimetry
Data Science
Auckland, New Zealand
Posted on Mar 24, 2026
- Drive outcomes, build trust, and grow your career with Fusion5.
- Design and implement modern data platforms and pipelines across Microsoft Azure, Microsoft Fabric, Databricks, and Snowflake.
- Join a leading transformation partner.
What You'll Do
- Design, build, and optimise end-to-end data pipelines and platforms across Microsoft and other modern data ecosystems (e.g., Microsoft Fabric, Azure Synapse, Databricks and Snowflake).
- Develop robust data transformation and processing solutions using SQL, Python, Spark/PySpark, Azure Data Factory, and dbt.
- Design and deliver centralised data models for analytics and reporting (e.g., dimensional modelling and analytical models), supporting self-service BI outcomes.
- Apply modern engineering and DataOps practices: Git-based source control, CI/CD concepts, environment promotion, testing discipline, and release management.
- Ensure solutions are operationally sound (monitoring/logging considerations, performance tuning, and cost-awareness).
- Collaborate with clients to gather requirements, facilitate technical discussions/workshops, translate needs into actionable solutions, and produce clear documentation and handovers.
- Contribute to project delivery quality and consistency by creating and improving reusable assets (templates, patterns, standards, and documentation).
- Support enterprise-wide data strategies, governance frameworks, and transformation initiatives.
You bring a strategic mindset, strong technical acumen, and a passion for solving complex problems. You're comfortable navigating ambiguity, learning on the fly, and making sound decisions with incomplete information.
- 3+ years (Intermediate) or 5+ years (Senior) experience in data engineering and delivery (consulting or equivalent product/platform delivery).
- Strong SQL skills and experience building data pipelines and transformations.
- Strong Python skills (and/or Spark/PySpark experience where relevant).
- Solid data modelling capability (dimensional modelling / star schemas, understanding of incremental patterns and data quality considerations).
- Experience delivering on at least two modern data platform ecosystems, drawn from:
  - Microsoft Azure data services (e.g., Synapse, Azure Data Factory, Azure SQL, ADLS) and/or Microsoft Fabric
  - Databricks
  - Snowflake (potentially with dbt transformation/modelling patterns)
- Familiarity with DataOps/engineering practices: Git, CI/CD concepts, environments, and delivery discipline.
- Solid understanding of data quality, governance, and ingestion strategies.
- Consulting capability: strong communication, stakeholder engagement, and documentation skills; comfortable working through ambiguity, managing complexity, and driving transformation.
- Familiarity with BI tools including Power BI (semantic modelling, reporting, performance considerations, deployment practices).
- Experience with AWS and/or Informatica.
- Azure DevOps/GitHub Actions experience, automated testing patterns, and release automation.
- Infrastructure as Code exposure (e.g., Terraform, Bicep) for repeatable deployments.
- Observability/operational maturity experience (monitoring, alerting, logging standards, performance/cost optimisation) — nice-to-have.
- Relevant certifications across Microsoft Fabric/Azure (e.g., DP-600/DP-700), Databricks, and Snowflake.
Feeling valued goes beyond salary. Discover the additional benefits of working at Fusion5:
- Individual Performance and Development Plans
- Professional Self Care
- Diversity and Inclusion
- Social Connection and Recognition
- Environment and Community