Google Cloud Data Engineer

🇨🇦 Canada
$900 - $1K Annual
Posted 4 weeks ago
Expires June 9, 2026

NMI is seeking a skilled Mid-Level Data Engineer to join our Data Platform team. In this role, you will be responsible for building, maintaining, and improving the pipelines and data models that power analytics and business intelligence across the company. You will own specific areas of our BigQuery data warehouse end-to-end, delivering reliable data products within a framework set by senior and staff engineers. This position is ideal for someone who takes well-defined problems and delivers solutions with high craft and reliability. You will work closely with data analysts, analytics engineers, and product teams, and are expected to grow toward greater technical ownership over time.

Key responsibilities include building and maintaining production-grade ELT pipelines that ingest data from internal applications, third-party SaaS tools, and event streams into our BigQuery data warehouse. You will own specific data domains end-to-end, ensuring accuracy, test coverage, and comprehensive documentation. The role involves writing and maintaining dbt models, tests, macros, and documentation within our established dbt project conventions and code review process. Additionally, you will develop and manage Airflow DAGs on Cloud Composer or similar tools to orchestrate data workflows, implement data quality checks, optimize BigQuery queries for cost and performance, and collaborate with analysts and stakeholders to translate business data needs into well-scoped pipeline and modeling tasks. Participation in the on-call rotation and contribution to team documentation and runbooks are also expected.

The ideal candidate will have 3–5 years of experience in data engineering or a closely related data infrastructure role. Proven experience designing and implementing scalable data pipelines and warehouse architectures is essential. Strong expertise in Google Cloud Platform services such as BigQuery, Cloud Storage, Cloud Composer, Pub/Sub, and Dataflow is required. Hands-on experience with dbt (data build tool) at production scale, proficiency in SQL (including advanced BigQuery SQL), and proficiency in Python for data engineering tasks are necessary. Familiarity with data modeling concepts, experience with version control (Git), and understanding of data quality, lineage, and observability best practices are also important. A startup or growth-stage mindset and excellent communication skills are highly valued.

Preferred qualifications include experience with Terraform or similar infrastructure-as-code tools, familiarity with streaming technologies such as GCP Pub/Sub, Dataflow, or Apache Kafka, knowledge of BI tools like Looker or Tableau, and a Google Cloud Professional Data Engineer certification.

NMI offers a competitive compensation package, including an annual salary plus bonus, flexible PTO, health, dental, and vision insurance, 13 paid holidays, and company volunteer days. The company fosters a remote-first culture and values engineering craft and continuous learning.
