Quick Summary
Yext (NYSE: YEXT) is the leading brand visibility platform, built for a world where discovery and engagement happen everywhere — across AI search, traditional search, social media, websites, and direct communications. Powered by over 2 billion trusted data points and a suite of integrated products, Yext provides brands the clarity, control, and confidence to perform across digital channels. From real-time insights to AI-driven recommendations and execution at scale, Yext turns a brand's digital presence into a competitive advantage, which is only possible through our team of innovators and enthusiastic collaborators. Join us and experience firsthand why we are consistently recognized as a ‘Best Place to Work’ globally by industry leaders such as Built In, Fortune, and Great Place To Work®!
We are looking for a Data Engineer to join our data management and integration team. This role will focus on building and optimizing scalable data pipelines, improving data flow, and supporting cross-functional analytics initiatives.
The ideal candidate is analytical, curious, and experienced in working with large datasets, with the ability to derive insights using exploratory analysis and machine learning techniques. You will partner with data analysts and business teams to ensure reliable, efficient, and scalable data delivery across projects.
This is a hands-on role requiring a self-driven individual who can support multiple systems, teams, and data needs in a fast-paced environment.
Responsibilities
- Own and contribute to the end-to-end data engineering lifecycle, including requirements gathering, data modeling, pipeline design, development, testing, deployment, and maintenance.
- Design, build, and optimize scalable ETL/ELT pipelines using Matillion Data Productivity Cloud (DPC), dbt, and cloud-native platforms such as Snowflake and AWS.
- Work with AWS cloud infrastructure, leveraging services such as S3, Lambda, and Glue where applicable.
- Develop and manage transformations using dbt, following modular, testable, and version-controlled practices.
- Leverage Snowflake Cortex capabilities for advanced analytics, AI-driven insights, and data enrichment use cases.
- Collaborate with business stakeholders, analysts, and cross-functional teams to understand data requirements and resolve data quality issues.
- Develop and maintain robust SQL transformations and Python-based data processing for complex workflows and automation.
- Ensure data quality, integrity, and reliability through validation, monitoring, and troubleshooting.
- Create and maintain clear technical documentation to support maintainability and knowledge sharing across the team.
- Identify and implement improvements to data processes, pipeline performance, and operational efficiency.
- Stay current with industry trends and incorporate best practices from the data engineering and modern data stack communities.
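One of the responsibilities above is ensuring data quality through validation before data moves downstream. As a minimal illustrative sketch only (the field names `id` and `email` are invented for the example, not drawn from any real schema), a row-level validation pass might look like:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a pre-load data-quality gate: rows with missing
# or duplicate primary keys are diverted to a reject pile for review
# instead of being loaded downstream.

@dataclass
class ValidationResult:
    valid: list = field(default_factory=list)
    rejected: list = field(default_factory=list)

def validate_rows(rows):
    """Split rows into valid and rejected based on simple quality rules."""
    result = ValidationResult()
    seen_ids = set()
    for row in rows:
        # Reject rows with a missing key or a duplicate primary key.
        if row.get("id") is None or row["id"] in seen_ids:
            result.rejected.append(row)
            continue
        seen_ids.add(row["id"])
        result.valid.append(row)
    return result

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": "dup@example.com"},   # duplicate key -> rejected
    {"id": None, "email": "b@example.com"},  # missing key -> rejected
    {"id": 2, "email": "c@example.com"},
]
result = validate_rows(rows)
print(len(result.valid), len(result.rejected))  # -> 2 2
```

In practice these checks would typically live in dbt tests or pipeline monitoring rather than ad-hoc scripts, but the shape of the logic is the same.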
Requirements
- 3–5 years of experience in Data Engineering, ETL/ELT development, or Data Warehousing.
- Strong hands-on experience with Matillion DPC or similar ETL tools (Informatica, Talend, Pentaho, etc.).
- Experience working with AWS data services, including S3, Glue, Lambda.
- Strong experience with Snowflake, including performance optimization and data modeling.
- Experience with dbt (data build tool) for transformation, testing, and deployment workflows.
- Proficiency in SQL with experience writing complex, optimized queries.
- Working knowledge of Python for data processing, scripting, and automation.
- Familiarity with Snowflake Cortex or similar AI/ML-powered data capabilities is a plus.
- Solid understanding of data warehousing concepts, including:
  - Change Data Capture (CDC)
  - Slowly Changing Dimensions (SCD)
  - Data modeling (star/snowflake schemas)
- Experience with version control (Git) and ticketing systems such as JIRA or Zendesk.
- Bachelor’s degree in Computer Science, Engineering, or a related technical field (or equivalent practical experience).
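Of the warehousing concepts listed above, Slowly Changing Dimensions Type 2 is the one most often probed in interviews: when a tracked attribute changes, the current dimension row is closed out and a new row is opened, preserving history. A minimal sketch, with invented column names (`key`, `attr`, `start_date`, `end_date`) rather than any real schema:

```python
from datetime import date

# Illustrative SCD Type 2 update over in-memory dicts. In a warehouse this
# would be a MERGE or a dbt snapshot; the row-versioning logic is the same.

HIGH_DATE = date(9999, 12, 31)  # sentinel "end of time" for current rows

def apply_scd2(dimension, incoming, as_of):
    """Return the dimension table after applying one incoming snapshot."""
    updated = []
    incoming_by_key = {row["key"]: row for row in incoming}
    known_keys = set()
    for row in dimension:
        known_keys.add(row["key"])
        new = incoming_by_key.get(row["key"])
        if row["end_date"] == HIGH_DATE and new and new["attr"] != row["attr"]:
            # Attribute changed: expire the current row...
            updated.append({**row, "end_date": as_of})
            # ...and open a new current row carrying the new value.
            updated.append({"key": row["key"], "attr": new["attr"],
                            "start_date": as_of, "end_date": HIGH_DATE})
        else:
            updated.append(row)
    # Brand-new keys get an initial current row.
    for key, new in incoming_by_key.items():
        if key not in known_keys:
            updated.append({"key": key, "attr": new["attr"],
                            "start_date": as_of, "end_date": HIGH_DATE})
    return updated

dim = [{"key": "C1", "attr": "NYC", "start_date": date(2024, 1, 1),
        "end_date": HIGH_DATE}]
dim = apply_scd2(dim, [{"key": "C1", "attr": "Austin"}], date(2025, 6, 1))
current = [r for r in dim if r["end_date"] == HIGH_DATE]
print(len(dim), current[0]["attr"])  # -> 2 Austin
```

The expired row keeps its original `start_date` and gains an `end_date`, so point-in-time queries (`start_date <= d < end_date`) can reconstruct history at any date.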
Nice to Have
- Matillion DPC certification (e.g., Building a Data Warehouse with Matillion).
- Experience with orchestration tools (e.g., Airflow).
- Exposure to CI/CD pipelines for data workflows.
- Experience working with modern data stack tools and frameworks.
- Familiarity with infrastructure-as-code (Terraform/CloudFormation) in AWS environments.
Recruiting Fraud Notice
All legitimate Yext communications come from @yext.com email addresses. Messages from other domains (for example, @yext.team) are not authorized and are likely fraudulent. If you receive a message that seems suspicious, do not share personal information, click on links, or provide payment. Instead, please report the communication to security@yext.com.
Listing Details
- Posted: April 13, 2026