Data Modeler - Data Lakehouse

Philippines · Quezon City · Mid-level
Data Modeler · Data & AI

Technical Tools
bigquery · dbt · snowflake · sql · database-design · performance-optimization

Pointwest is looking for a visionary Data Modeler to serve as the technical lead for our modern data initiatives. This is a role for a high-impact practitioner who has successfully navigated the transition from traditional Data Warehousing to modern Data Lakehouse architectures.

Driven by our core values of Leadership, Excellence, and Innovation, you will design the blueprints that transform raw data into a strategic corporate asset. You will embody the Pointwest culture of Agility, Accountability, and Customer Centricity by ensuring the structural health of our data ecosystem and optimizing cloud performance to deliver world-class insights.

Key Responsibilities

Delighting Our Customers & Stakeholders

  • Architecting Excellence: Design and implement conceptual, logical, and physical data models that support both structured SQL analytics and advanced Data Science workloads.

  • Medallion Strategy: Lead the implementation of Bronze, Silver, and Gold data layers to provide stakeholders with curated, high-quality data assets (see the sketch after this list).

  • Reliability First: Act as the first line of defense against "Garbage In, Garbage Out" by designing automated validation scripts and performing root cause analysis on discrepancies.
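
As an illustration of the layered flow described in the Medallion Strategy bullet, here is a minimal sketch in BigQuery-flavored SQL; the dataset, table, and column names (bronze.raw_orders, silver.orders, gold.daily_revenue, order_id, order_ts, amount) are hypothetical placeholders, not an existing Pointwest schema.

    -- Assumes a Bronze table, bronze.raw_orders, already landed as-is by the ingestion pipeline.

    -- Silver: cleaned, typed, and validated version of the raw feed.
    CREATE OR REPLACE VIEW silver.orders AS
    SELECT
      CAST(order_id AS INT64)       AS order_id,
      CAST(order_ts AS TIMESTAMP)   AS order_ts,
      NULLIF(TRIM(customer_id), '') AS customer_id,
      CAST(amount AS NUMERIC)       AS amount
    FROM bronze.raw_orders
    WHERE order_id IS NOT NULL;     -- basic validation: reject rows that fail the key check

    -- Gold: curated, business-ready model that reports consume directly.
    CREATE OR REPLACE VIEW gold.daily_revenue AS
    SELECT
      DATE(order_ts) AS order_date,
      SUM(amount)    AS revenue
    FROM silver.orders
    GROUP BY order_date;

In practice each layer would typically be maintained as a versioned dbt model rather than hand-run DDL, so lineage and documentation stay attached to the transformation.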

Growing Our Business

  • Schema Evolution: Manage the lifecycle of data schemas to ensure the environment remains scalable and flexible as the business integrates new data sources.

  • Technical Leadership: Serve as the bridge between complex business logic and technical execution, ensuring all data movements are documented and transparent.

Improving the Way We Work

  • Performance Optimization: Collaborate with engineers to optimize partitioning, clustering, and indexing (e.g., Z-Ordering) to reduce cloud compute costs and improve speed (a small example follows this list).

  • Standardization: Enforce enterprise data standards and naming conventions to prevent the creation of "Data Swamps" and ensure a unified logical model.
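
As a hedged example of the physical-design levers mentioned in the Performance Optimization bullet, the BigQuery-style DDL below partitions a fact table by date and clusters it on a frequently filtered key; the names (gold.fact_sales, order_date, customer_id) are illustrative only. On Databricks Delta tables, the analogous step would be OPTIMIZE ... ZORDER BY.

    -- Partition by day and cluster by the most common filter column so queries
    -- scan only the relevant partitions and blocks instead of the whole table.
    CREATE TABLE gold.fact_sales
    (
      order_id    INT64,
      order_date  DATE,
      customer_id STRING,
      amount      NUMERIC
    )
    PARTITION BY order_date
    CLUSTER BY customer_id;

    -- A typical query now prunes to ~31 daily partitions rather than a full scan.
    SELECT customer_id, SUM(amount) AS total_amount
    FROM gold.fact_sales
    WHERE order_date BETWEEN DATE '2026-01-01' AND DATE '2026-01-31'
    GROUP BY customer_id;

Scan pruning of this kind is the main lever behind the compute-cost reduction targeted under Compute Efficiency below.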

Developing Myself and Others

  • Multi-Model Expertise: Apply and share knowledge of various modeling techniques, including Dimensional/Star Schema, 3NF, and Data Vault, to solve diverse business challenges.
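
For illustration, a compact sketch of the dimensional (star schema) pattern named above, using hypothetical table and column names and BigQuery-style types; a 3NF or Data Vault model would organize the same data differently (normalized entities, or hubs/links/satellites).

    -- Dimension: descriptive attributes keyed by a surrogate key.
    CREATE TABLE dim_customer (
      customer_key INT64,    -- surrogate key
      customer_id  STRING,   -- natural/business key
      segment      STRING,
      region       STRING
    );

    -- Fact: measures plus foreign keys to the dimensions (grain: one row per order).
    CREATE TABLE fact_orders (
      order_date   DATE,
      customer_key INT64,    -- references dim_customer.customer_key
      quantity     INT64,
      amount       NUMERIC
    );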

Success Metrics

Delighting Our Customers & Stakeholders

  • Data Trust Score: Maintain a high "Data Trust Score" (>95%) through successful automated validation and rapid resolution of discrepancy tickets.

  • Model Adoption: Target 80% or more of production reports being sourced from curated "Gold Layer" models rather than ad-hoc tables.

Growing Our Business

  • Architectural Integrity: Measurable reduction in "Schema Debt" by minimizing deprecated or redundant tables over time.

  • Mapping Accuracy: Achieve a "Logic Re-work" rate of less than 10% during the development phase.

Improving the Way We Work

  • Compute Efficiency: Deliver a 15-20% reduction in cloud compute costs (Snowflake/Databricks credits) through smart physical design and refactoring.

  • Documentation Coverage: Maintain 100% documentation coverage for Gold-layer assets, including full Source-to-Target Mappings (STTM).

Developing Myself and Others

  • Deployment Velocity: Ensure Data Engineers can move from model design to production with minimal clarification sessions, reflecting the clarity of your technical guidance.

Requirements

  • Experience: 5-7 years of dedicated Data Modeling experience within Data Warehouse or Lakehouse environments.

  • Proven Track Record: Successfully delivered at least 5 distinct data projects from initial requirements through to production.

  • Technical Mastery: Expert-level proficiency in Kimball/Star Schema and 3NF; hands-on experience with Medallion Architecture.

  • Cloud Platforms: Technical proficiency in Snowflake, Databricks, or Google BigQuery.

  • Tooling: Experience with industry-standard tools such as erwin, SQLDBM, SAP PowerDesigner, or dbt.

  • Education: Bachelor’s degree in Computer Science, Information Systems, or a related quantitative field.

Location & Eligibility

Where is the job: Quezon City, Philippines, on-site at the office
Who can apply: Philippines (PH)

Listing Details

First seen: May 6, 2026
Last seen: May 8, 2026

