Intersnack

Data Engineer - AI Pipelines & DataOps

Germany · Düsseldorf Headquarter · Mid-level

Quick Summary

Overview

We want you to grow with us. High-quality, reliable data is the foundation on which every AI use case is built, and this role is responsible for making that foundation unshakeable.


Technical Tools
AWS · Azure · GraphQL · API design · CI/CD · ETL · machine learning · streaming data

High-quality, reliable data is the foundation on which every AI use case is built, and this role is responsible for making that foundation unshakeable. As our Data Engineer for AI Pipelines & DataOps, you will design and deliver the ingestion pipelines, streaming architectures, and APIs that feed Intersnack's AI and analytics systems with the data they need to perform at scale. You will report into the AI Programme and work in close collaboration with AI engineers, data architects, and business teams, with a particular focus on manufacturing environments, where edge device data presents unique ingestion challenges. At Intersnack, we build on a solid digital foundation, and your engineering work will be central to extending that foundation into the AI era.

This role offers the opportunity to work across a genuinely diverse data landscape, from manufacturing edge devices and IoT sensors to structured enterprise data systems across procurement and sales, giving you a breadth of technical challenge that few data engineering roles can match. You will have direct influence over how AI and analytics use cases are enabled across a €4.5 billion business, with the autonomy to define pipeline patterns, API standards, and DataOps practices that others will build on. Collaboration is at the heart of how we work, and you will be embedded in a programme team that spans data science, AI architecture, and business enablement. Düsseldorf is the home base, with flexibility for remote working.

You will design and build the data infrastructure that powers Intersnack's AI programme, from scalable ingestion pipelines that handle both structured enterprise data and unstructured signals from manufacturing environments, to the APIs that expose data, models, and AI services for consumption across the organisation. Your work will span architecture, implementation, and operations, with a strong focus on quality, observability, and continuous improvement.

Responsibilities

  • Design and implement scalable data ingestion pipelines for structured and unstructured data sources, including manufacturing systems, edge devices, and enterprise data platforms, ensuring consistent data quality from source to consumption

  • Build and maintain both batch and streaming data pipelines for analytics and AI use cases, leveraging cloud-native tooling on Microsoft Azure and/or AWS

  • Design and expose REST or GraphQL APIs for data assets, machine learning model endpoints, and AI services, enabling reliable, governed consumption by internal applications and analytical systems

  • Implement CI/CD practices and DataOps principles across pipeline development and deployment, supporting automated testing, versioning, and release management for data infrastructure

  • Ensure data quality, lineage, and observability across all pipelines, implementing monitoring and alerting that surfaces data issues before they affect AI or analytics outputs

  • Support the integration of manufacturing and edge device data, including IoT and OT systems, into the central data platform, addressing the specific latency, format, and volume challenges of operational technology environments

  • Collaborate with data scientists and AI engineers to design and optimise data flows that support model training, inference, and knowledge retrieval pipelines

  • Apply security-by-design practices to all pipeline and API design, including access control, encryption, and protections against data leakage, in line with Intersnack's sovereignty and compliance standards

  • Contribute to the AI literacy and enablement programme by supporting colleagues in understanding data pipeline health, data quality standards, and the role of reliable data in AI outcomes

Location & Eligibility

Where is the job
Düsseldorf Headquarter, Germany
On-site at the office
Who can apply
DE

Listing Details

First seen
May 6, 2026
Last seen
May 8, 2026

Posting Health

Days active
0
Repost count
0
Trust Level
51%
Scored at
May 6, 2026

Signal breakdown

freshness · source trust · content trust · employer trust
