Quick Summary
Overview
DataOps Engineer (WY)…
Technical Tools
Airflow, Bash, C++, Docker, Java, Kafka, Kubernetes, MySQL, Prometheus, Python, SQL, Agile, CI/CD, ETL, Machine Learning, Statistical Modeling
Requirements
- Bachelor's degree in Computer Science, Computer Engineering, or a related technical field; four years of related experience; or an equivalent combination of education and experience
- 2+ years of experience with data streaming technologies, such as Kafka
- 2+ years of experience applying ETL (Extract, Transform, Load) concepts
- Experience querying and designing databases using one or more of the following: MySQL, MS SQL, Oracle SQL, or another professional database system
- Ability to work in teams and collaborate with others to clarify requirements, quickly identify problems, and find creative solutions together
- Ability to assist in documenting requirements and to resolve conflicts or ambiguities
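The ETL requirement above can be illustrated with a minimal sketch using only the Python standard library (sqlite3 standing in for a real warehouse); the table and column names are invented for the example, not taken from the posting:

```python
# Minimal ETL sketch: extract raw rows, transform (normalize + cast),
# load into a clean reporting table. Uses an in-memory SQLite database.
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: raw events with inconsistent casing and whitespace.
    cur.execute("CREATE TABLE raw_events (user TEXT, amount TEXT)")
    cur.executemany("INSERT INTO raw_events VALUES (?, ?)",
                    [("  Alice ", "10.5"), ("BOB", "3"), ("alice", "2.5")])
    # Transform: normalize user names, cast string amounts to floats.
    rows = [(u.strip().lower(), float(a))
            for u, a in cur.execute("SELECT user, amount FROM raw_events")]
    # Load: aggregate into a clean per-user totals table.
    cur.execute("CREATE TABLE user_totals (user TEXT PRIMARY KEY, total REAL)")
    totals = {}
    for u, a in rows:
        totals[u] = totals.get(u, 0.0) + a
    cur.executemany("INSERT INTO user_totals VALUES (?, ?)", totals.items())
    conn.commit()
    return cur.execute("SELECT COUNT(*) FROM user_totals").fetchone()[0]

conn = sqlite3.connect(":memory:")
print(run_etl(conn))  # distinct users after normalization
```

In a production pipeline each phase would typically be a separate, schedulable task (for example an Airflow operator per phase) rather than one function.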
Nice to Have
- 4 or more years of experience programming in one or more of the following: Java, C++, Perl, Python, or advanced shell scripting
- 3 or more years of experience implementing data-driven solutions using tools such as Hadoop, Impala, Hive, NiFi, Athena, Redshift, BigTable, or Airflow
- 2 or more years of experience in machine learning and statistical modeling
- Experience with cloud-native tools, such as Kubernetes and Docker
- Experience with the R statistical computing language
- Experience with Agile at Scale, SAFe, and Lean Systems Engineering
Responsibilities
- Develop high-volume, low-latency, data-driven solutions using current and next-generation technologies to meet evolving business needs
- Acquire big-data input from numerous partners. Key technologies may include Python, Airflow, Prometheus, and Kafka.
- Normalize complicated data sources to convert potentially unusable data into a format that can be used efficiently by software and/or employees. Key technologies may include Spark, Kinesis, and Lambda.
- Build a CI/CD pipeline for our data software to keep quality high and time to market low. Key technologies may include GitLab.
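The normalization responsibility above can be sketched in plain Python: mapping varying partner field names onto one canonical schema. The partner payloads and field aliases below are invented for illustration only:

```python
# Sketch of normalizing heterogeneous partner feeds into one schema.
# Field names and alias lists are hypothetical examples.
import json

def normalize(record: dict) -> dict:
    """Map varying partner field names onto a canonical schema."""
    aliases = {
        "user_id": ("user_id", "uid", "userId"),
        "event": ("event", "event_type", "action"),
        "ts": ("ts", "timestamp", "event_time"),
    }
    # For each canonical field, take the first alias present in the record.
    return {field: next((record[n] for n in names if n in record), None)
            for field, names in aliases.items()}

# Two partners sending the same logical event under different field names:
feed_a = json.loads('{"uid": 1, "action": "login", "event_time": "2026-05-06"}')
feed_b = json.loads('{"userId": 2, "event_type": "click", "timestamp": "2026-05-07"}')
print(normalize(feed_a))
print(normalize(feed_b))
```

In a streaming deployment the same function would run per message inside a consumer loop (for example against a Kafka topic), with unresolved fields routed to a dead-letter queue for review.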
Department: Preferred Vendors
This is a contract position
Location & Eligibility
- Location terms not specified
Listing Details
- First seen: May 6, 2026
- Last seen: May 8, 2026
Posting Health
- Days active: 0
- Repost count: 0
- Trust Level: 49%
- Scored at: May 6, 2026
Signal breakdown
- Freshness, source trust, content trust, employer trust