Pattern

Senior Data Engineer
Lehi, United States · Full-time (Salary) · Senior

Overview

Are you obsessed with data, partner success, taking action, and changing the game? If you have a whole lot of hustle and a touch of nerd, come work with Pattern! We want you to use your skills to push one of the fastest-growing companies headquartered in the US to the top of the list.

Technical Tools
Airflow, AWS, BigQuery, DBT, Kafka, Python, Segment, Snowflake, Spark, SQL, Terraform, database design, ecommerce, ETL, forecasting, machine learning, mentoring, networking, performance optimization
Pattern accelerates brands on global ecommerce marketplaces, leveraging proprietary technology and AI. Drawing on more than 66 trillion data points and sophisticated machine learning and AI models, Pattern optimizes and automates every lever of ecommerce growth for global brands, including advertising, content management, logistics and fulfillment, pricing, forecasting, and customer service. Hundreds of global brands depend on Pattern’s ecommerce acceleration platform every day to drive profitable revenue growth across 60+ global marketplaces, including Amazon, Walmart.com, Target.com, eBay, Tmall, TikTok Shop, JD, and Mercado Libre. To learn more, visit pattern.com or email press@pattern.com.
 
Pattern has been named one of the fastest-growing tech companies headquartered in North America by Deloitte and one of the best-led companies by Inc. We place employee experience at the center of our business model and have been recognized as one of Newsweek’s Global Most Loved Workplaces®.

As a Senior Data Engineer, you will be a high-impact "Game Changer" responsible for architecting and building the very foundation of Pattern's data-driven future. You will tackle massive, petabyte-scale challenges, transforming raw data into high-octane fuel for our AI models and global marketplace strategies. This is your chance to lead high-stakes technical initiatives that directly accelerate growth for hundreds of global brands in a fast-paced, elite engineering environment.
Responsibilities

  • Designing and implementing robust ETL/ELT pipelines using Airflow, DBT, and cloud-native architectures.

  • Writing sophisticated, production-grade Python code to automate data orchestration and processing.

  • Building and optimizing complex SQL queries and dimensional models for OLAP- and OLTP-based systems.

  • Collaborating with cross-functional teams to ingest and harmonize data from dozens of global marketplaces.

  • Building and maintaining infrastructure-as-code and containerized workflows to ensure platform reliability.

  • Leveraging AI thoughtfully to optimize processes and workflows.
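As a rough illustration of the ETL/ELT pipeline shape described in the responsibilities above, here is a minimal, dependency-free sketch. In practice this work would live in an Airflow DAG with DBT models and a real warehouse; the function names, fields, and data below are illustrative assumptions, not Pattern's actual pipeline.

```python
# Minimal extract -> transform -> load sketch (illustrative only).

def extract():
    # Stand-in for pulling raw marketplace records from an API or bucket.
    return [
        {"sku": "A-1", "units": "3", "price": "19.25"},
        {"sku": "B-2", "units": "5", "price": "7.50"},
    ]

def transform(records):
    # Cast string fields to numbers and derive revenue, as a DBT model
    # or Spark job might do at scale.
    return [
        {"sku": r["sku"], "revenue": int(r["units"]) * float(r["price"])}
        for r in records
    ]

def load(rows, warehouse):
    # Stand-in for a MERGE into Snowflake or BigQuery.
    for row in rows:
        warehouse[row["sku"]] = row["revenue"]
    return warehouse

warehouse = load(transform(extract()), {})
print(warehouse)  # -> {'A-1': 57.75, 'B-2': 37.5}
```

The same three-stage shape holds whether the orchestrator is Airflow, the transform layer is DBT, or the sink is a cloud warehouse.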

Minimum Qualifications

  • Bachelor’s degree in Computer Science, Data Science, or a related technical field (or equivalent experience).

  • 7+ years of professional data engineering experience with a heavy focus on ETL/ELT and data modeling.

  • 5+ years of expert-level SQL, including window functions, CTEs, and deep performance tuning.

  • 4+ years of professional Python development specifically tailored for data pipelines and tooling.

  • 3+ years of hands-on experience building/optimizing large-scale data warehouses like Snowflake, BigQuery, or Redshift.

  • Proficiency with open-source frameworks such as Apache Spark, Trino, Kafka, and Debezium.

  • A "Data Fanatic" mindset with experience handling petabyte-scale diverse datasets.
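The SQL skills named in the qualifications above (CTEs and window functions) can be demonstrated with a small self-contained example. This sketch uses Python's built-in sqlite3 with made-up toy data; the table and marketplace names are hypothetical, and production queries would run against a warehouse like Snowflake or BigQuery.

```python
import sqlite3

# Toy data: daily order totals for two (hypothetical) marketplaces.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (marketplace TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("amazon", 1, 100.0), ("amazon", 2, 150.0),
     ("walmart", 1, 80.0), ("walmart", 2, 120.0)],
)

# A CTE scopes the input, then a window function computes a running
# total per marketplace -- the kind of query this role calls for.
query = """
WITH recent AS (
    SELECT marketplace, day, amount FROM orders WHERE day >= 1
)
SELECT marketplace, day,
       SUM(amount) OVER (PARTITION BY marketplace ORDER BY day) AS running_total
FROM recent
ORDER BY marketplace, day
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The `PARTITION BY` clause restarts the running sum for each marketplace, so each partition accumulates independently.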

What Success Looks Like

  • Successfully executing the migration or optimization of massive data streams with zero downtime.

  • Consistently delivering clean, well-documented, and high-quality code that sets the standard for the engineering team.

  • Acting as a 'Doer' by taking the initiative to resolve platform bottlenecks before they impact partners.

  • Elevating the technical bar of the team through mentorship and the introduction of innovative engineering practices.

Growth Opportunities

  • Opportunity to lead major architectural shifts within a rapidly expanding global tech company.

  • Regular networking and collaboration with high-level technical leadership and AI experts.

  • Upward mobility toward Staff Data Engineer or specialized technical leadership roles.

  • Continuous learning opportunities with cutting-edge technologies like Apache Iceberg and real-time streaming architectures.

First 90 Days

  • 30 Days: Complete onboarding, gain a deep understanding of current data architectures, and begin contributing to existing projects.

  • 60 Days: Identify and implement at least one major performance optimization within the data environment and lead a small-scale pipeline project.

  • 90 Days: Take responsibility for a significant segment of data processes, collaborating with other engineers and contributing to the long-term roadmap for lakehouse integration.

Team & Reporting

  • This role reports directly to the Director of Data Engineering.

  • You will be joining a growing team of data professionals that spans multiple geographies.

  • In this role, you will collaborate closely with Data Scientists, Software Engineers, AI Engineers, and Product Managers as well as other departments including Marketing and Sales.

Pattern’s Core Values

  • Game Changers: A game changer is someone who looks at problems with an open mind and shares new ideas with team members, regularly reassesses existing plans and attaches realistic timelines to goals, makes profitable, productive, and innovative contributions, and actively pursues improvements to Pattern’s processes and outcomes.

  • Data Fanatics: A data fanatic is someone who recognizes problems and seeks to understand them through data, draws unbiased conclusions based on data that lead to actionable solutions, and continues to track the effects of those solutions using data.

  • Partner Obsessed: An individual who is partner obsessed clearly explains the status of projects to partners and relies on constructive feedback, actively listens to partners’ expectations and delivers results that exceed them, prioritizes partners’ needs, and takes the time to create a personable experience for those interacting with Pattern.

  • Team of Doers: Someone who is part of a team of doers uplifts team members and recognizes their specific contributions, takes initiative to help in any circumstance, actively contributes to supporting improvements, and holds themselves accountable to the team as well as to partners.

Interview Process

  • Phone Interview with Talent Acquisition

  • Video Interview

  • Onsite Interview

  • Executive Review

  • Offer

  • Strong Nice-to-Haves: Expertise in AWS services (Terraform, EKS, Lambda), experience with Apache Iceberg or Delta Lake, and a background in real-time streaming (Kafka/Kinesis).

  • Interview Tips: Be prepared to discuss your experience managing large-scale data outages or complex optimizations; highlight any 'Partner Obsessed' moments where your data work solved a critical business problem; and demonstrate your 'Data Fanatic' nature through a deep dive into a past side project or complex pipeline you built.

Location & Eligibility

    Where is the job
    Lehi, United States
    Hybrid — some on-site time required
    Who can apply
    US

    Listing Details

    Posted
    May 6, 2026
    First seen
    May 7, 2026
    Last seen
    May 8, 2026

    Posting Health

    Days active
    0
    Repost count
    0
    Trust Level
    62%
    Scored at
    May 7, 2026

    Signal breakdown

    Freshness · Source trust · Content trust · Employer trust
    Pattern
    Source: Lever

    Pattern is a leading ecommerce acceleration platform that provides a comprehensive range of services to help brands grow in the digital marketplace.

