
RQ10371 - Software Developer - ETL

Toronto, ON

Job Title: Software Developer – ETL

Work Arrangement: Onsite
Hours: 7.25 hours/day (Monday–Friday, 8:00 AM – 5:00 PM, excluding lunch)


Role Overview

The Software Developer – ETL is responsible for designing, building, and supporting scalable data pipelines and data platforms that power enterprise analytics and reporting. This role focuses on ETL/ELT development, data modeling, performance optimization, data quality, and cloud-based data engineering using modern Azure technologies.


Key Responsibilities

  • Design and implement technical solutions for data ingestion and storage into a centralized data repository

  • Develop and maintain ETL/ELT pipelines, data transformations, and business logic

  • Perform database modeling and schema design to improve performance and scalability

  • Produce technical design documents and solution artifacts for long-term support

  • Investigate and resolve data-related incidents and pipeline failures

  • Execute scheduled and ad hoc data loads

  • Implement data quality checks and report on data integrity issues (see the illustrative sketch below)
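
For context, a minimal sketch of the kind of data quality check this role involves, written in PySpark. The table name, column names, and checks below are illustrative assumptions for this posting, not a prescribed implementation:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-check").getOrCreate()

    # Hypothetical silver-layer table; the name is an illustrative assumption.
    orders = spark.read.table("silver.orders")

    # Basic integrity checks: null keys, duplicate keys, out-of-range amounts.
    null_keys = orders.filter(F.col("order_id").isNull()).count()
    dup_keys = (orders.groupBy("order_id").count()
                      .filter(F.col("count") > 1).count())
    bad_amounts = orders.filter(F.col("amount") < 0).count()

    # Report issues so they can be triaged rather than silently loaded.
    for name, count in [("null order_id", null_keys),
                        ("duplicate order_id", dup_keys),
                        ("negative amount", bad_amounts)]:
        if count > 0:
            print(f"Data quality issue: {count} rows with {name}")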


Required Skills & Experience

Technical Skills

  • Strong experience designing and developing Medallion Data Lakehouse architectures (see the illustrative sketch after this list)

  • Hands-on expertise with Azure Databricks, Delta Lake, and distributed data processing

  • Experience building and optimizing ETL/ELT pipelines using:

    • Azure Data Factory (ADF)

    • Databricks (PySpark, SQL, Delta Live Tables)

    • dbt

  • Experience integrating structured and unstructured data into star and snowflake schemas

  • Strong knowledge of relational and analytical databases (Azure SQL, Synapse, PostgreSQL)

  • Advanced SQL skills including query optimization, indexing, and partitioning

  • Experience with Azure Data Lake Storage (ADLS), Event Hubs, Azure Functions

  • Strong understanding of cloud security, RBAC, and data governance

  • Proficiency in Python (PySpark), SQL, and PowerShell

  • Experience with CI/CD for data pipelines (Azure DevOps, GitHub Actions)

  • Experience implementing data quality, lineage, metadata, and cataloging frameworks

  • Familiarity with Unity Catalog for Databricks permissions management

  • Experience with Power BI, including data modeling and performance tuning
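
To make the Medallion and Delta Lake expectations above concrete, here is a minimal sketch of a bronze-to-silver promotion in PySpark on Databricks. All paths, table names, and columns are assumptions made for illustration, not details from this posting:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

    # Bronze: raw ingested events, stored as-is (path is an illustrative assumption).
    bronze = spark.read.format("delta").load("/mnt/lake/bronze/events")

    # Silver: cleaned, deduplicated, conformed records.
    silver = (bronze
              .filter(F.col("event_id").isNotNull())           # drop malformed rows
              .dropDuplicates(["event_id"])                    # enforce key uniqueness
              .withColumn("event_date", F.to_date("event_ts")) # conform types
              )

    # Write as a partitioned Delta table so downstream queries can prune partitions.
    (silver.write.format("delta")
           .mode("overwrite")
           .partitionBy("event_date")
           .save("/mnt/lake/silver/events"))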


Core Engineering Experience

  • Strong experience translating technical requirements into working, tested solutions

  • Experience designing, developing, and maintaining complex, mission-critical data systems

  • End-to-end SDLC experience across development, testing, deployment, and support

  • Strong background in technical documentation and solution design

  • Experience evaluating design options and recommending optimal technical solutions


General & Soft Skills

  • Strong analytical and problem-solving skills

  • Proven ability to troubleshoot complex data issues

  • Strong communication and documentation skills

  • Ability to collaborate effectively with cross-functional teams

  • Detail-oriented, organized, and able to meet tight deadlines

  • Consulting mindset with strong stakeholder engagement skills


Nice to Have

  • Experience working in public-sector or regulated environments

  • Familiarity with enterprise architecture and governance frameworks

  • Experience supporting large-scale, data-intensive systems
