confiz
Posted 17h ago

Principal Data Engineer (Databricks / PySpark, DBT & Astronomer / Airflow)

Information Technology · Lead · Data Engineer

Overview

We are seeking a Principal Data Engineer with 5+ years of hands-on experience in the modern cloud data stack. This role is ideal for someone passionate about building robust, scalable data solutions and who thrives in agile, collaborative environments.

Technical Tools

The core technical stack centers around Databricks / PySpark for large-scale data processing, DBT for transformation, Astronomer / Airflow for orchestration (with Cosmos to run DBT DAGs natively in Airflow), and Snowflake as the cloud data warehouse. 
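As a rough illustration of what "running DBT models as native Airflow DAGs" means: each model becomes a task, and tasks execute in dependency order. The stdlib-Python sketch below (model names are invented for illustration) topologically sorts a tiny staging → intermediate → mart graph the way an orchestrator would; the real Cosmos integration derives this graph from the dbt project itself.

```python
from graphlib import TopologicalSorter

# Hypothetical dbt-style model graph: each key lists the models it depends on,
# mirroring the staging -> intermediate -> mart layering described above.
model_deps = {
    "stg_orders": [],
    "stg_customers": [],
    "int_customer_orders": ["stg_orders", "stg_customers"],
    "mart_revenue": ["int_customer_orders"],
}

# An orchestrator schedules each task only after all of its upstreams finish.
run_order = list(TopologicalSorter(model_deps).static_order())
print(run_order)
```

This is only the ordering idea, not the Cosmos API; in practice Cosmos builds the equivalent Airflow task graph automatically from the dbt manifest.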

Responsibilities

  • Build and maintain scalable data pipelines using DBT and Astronomer/Airflow for batch and incremental data processing. 
  • Develop modular DBT transformation layers (staging → intermediate → mart) with robust testing, documentation, and incremental model strategies. 
  • Orchestrate end-to-end workflows on Astronomer / Airflow, using Cosmos to trigger and manage DBT models as native Airflow DAGs. 
  • Design and optimize Snowflake schemas, queries, and cost controls for analytical workloads. 
  • Enforce data quality standards through DBT tests and pipeline observability tooling. 
  • Write clean, well-tested Python and SQL code and actively participate in code reviews. 
  • Lead and mentor a team of data engineers; contribute to sprint planning, architectural decisions, and technical standards. 
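The incremental-processing strategy mentioned above boils down to merging only new or changed rows into an existing table on a unique key. Here is a minimal pure-Python sketch of that upsert semantics (table contents and the `order_id` key are made up for illustration), analogous to what a dbt incremental model with a `unique_key` performs inside the warehouse:

```python
# Toy "incremental merge": upsert a new batch into existing rows keyed by
# order_id, keeping the latest version of each row.
def incremental_merge(existing_rows, new_rows, key="order_id"):
    merged = {row[key]: row for row in existing_rows}
    for row in new_rows:          # later batches overwrite earlier versions
        merged[row[key]] = row
    return sorted(merged.values(), key=lambda r: r[key])

existing = [
    {"order_id": 1, "status": "placed"},
    {"order_id": 2, "status": "placed"},
]
batch = [
    {"order_id": 2, "status": "shipped"},   # updated row
    {"order_id": 3, "status": "placed"},    # new row
]
print(incremental_merge(existing, batch))
```

In a real pipeline this merge is pushed down to the warehouse (Snowflake/Delta Lake) rather than done in Python; the sketch only shows the semantics the role works with.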

Requirements

  • Databricks & PySpark: Delta Lake, batch and streaming workloads, cluster and cost optimization. 
  • DBT (Core & Cloud): Advanced data modelling patterns, incremental strategies, custom tests, and CI/CD integration. 
  • Astronomer / Airflow: DAG authoring, Cosmos for DBT orchestration, and pipeline observability. 
  • Snowflake: Data modelling, query performance tuning, access control, and cost governance. 
  • Python and SQL: Strong programming skills with a focus on modular, maintainable, and testable code. 
  • Cloud platform: Hands-on experience with Azure or AWS. 
  • Agile & CI/CD: Comfortable working in agile teams with a CI/CD development mindset. 
  • Degree: Bachelor’s in Computer Science or a related discipline. 
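The "custom tests" requirement above refers to dbt's data tests, which are essentially assertions over a model's output, e.g. a column is never null, or a key is unique. A stdlib-Python sketch of those two checks, with invented column names:

```python
# dbt-style data checks: "not_null" and "unique" reduced to plain assertions.
def not_null(rows, column):
    """Return the rows that violate a not_null test on `column`."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return the values that violate a unique test on `column`."""
    seen, dupes = set(), []
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.append(value)
        seen.add(value)
    return dupes

rows = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 2, "email": "b@example.com"},
]
print(not_null(rows, "email"))       # the row with a missing email
print(unique(rows, "customer_id"))   # the duplicated key value
```

dbt runs the equivalent checks as SQL against the warehouse and fails the build when violations are returned; this sketch only mirrors the logic.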

We have an amazing team of 700+ individuals working on highly innovative enterprise projects & products. Our customer base includes Fortune 100 retail and CPG companies, leading store chains, fast-growth fintech, and multiple Silicon Valley startups.

What makes Confiz stand out is our focus on processes and culture. Confiz is ISO 9001:2015 (QMS), ISO 27001:2022 (ISMS), ISO 20000-1:2018 (ITSM), ISO 14001:2015 (EMS), and ISO 45001:2018 (OHSMS) certified. We have a vibrant culture of learning through collaboration and of making the workplace fun.

People who work with us use cutting-edge technologies while contributing to the company's success as well as their own.

To learn more about Confiz Limited, visit: https://www.linkedin.com/company/confiz-pakistan/

 

Location & Eligibility

Where is the job
On-site at the office
Who can apply
Same as job location

Listing Details

Posted
May 15, 2026
First seen
May 15, 2026
Last seen
May 15, 2026

Posting Health

Days active
0
Repost count
0
Trust Level
51%
Scored at
May 15, 2026

Signal breakdown

freshness · source trust · content trust · employer trust
