Quick Summary
Overview
Data Engineer - Hybrid (Newark, NJ)…
Technical Tools
aws · jenkins · postgresql · power-bi · python · spark · splunk · sql · agile · linux
Job Description
Looking for data engineers with 9+ years of experience.
1. Experience developing and deploying application code using SQL, Python, and Spark.
2. 3 to 5 years' experience developing and deploying data pipelines in the cloud.
3. 3 to 5 years' experience in AWS using Glue, Athena, Lambda, Secrets Manager, Redshift, Redshift Spectrum, PostgreSQL, CloudFormation, Step Functions, S3, EC2, and boto3.
4. Securing data using IAM and Active Directory.
5. Experience developing Linux/UNIX shell scripts.
6. Developer experience in a big data or warehouse (Hadoop, DB2 LUW) environment.
7. Experience with monitoring and logging techniques used in conjunction with CloudWatch and Splunk.
8. Working knowledge of version control tools and branching techniques using Artifactory, Jenkins, and Bitbucket.
9. 3 to 5 years' experience with Power BI.
Additional information on background:
10. Agile experience expected.
11. Leadership/soft skills: collaborative team player, self-starter, curious about technology.
12. Total level of experience: seeking developers with Python, Spark, PySpark, and SQL developer experience in AWS.
13. AWS Cloud Practitioner certification, or other AWS certification, is a plus.
14. Industries: financial services tech experience is a plus.
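To illustrate the SQL-plus-Python pipeline work the requirements above describe, here is a minimal sketch of a single transform step. It uses Python's standard-library sqlite3 module as a stand-in for PostgreSQL/Redshift, and the table and column names (`trades`, `symbol`, `qty`) are illustrative assumptions, not from the posting.

```python
import sqlite3

# Toy pipeline step: load raw rows, aggregate with SQL, fetch the result.
# sqlite3 stands in for PostgreSQL/Redshift; in a real AWS pipeline the
# load and query would run against Redshift via Glue or boto3 instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("AAPL", 100), ("AAPL", 50), ("MSFT", 75)],
)
rows = conn.execute(
    "SELECT symbol, SUM(qty) FROM trades GROUP BY symbol ORDER BY symbol"
).fetchall()
print(rows)  # [('AAPL', 150), ('MSFT', 75)]
conn.close()
```

The same extract-aggregate-load shape applies whether the warehouse is SQLite, PostgreSQL, or Redshift; only the connection layer changes.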
Location & Eligibility
Where is the job: Location terms not specified
Listing Details
- First seen: May 6, 2026
- Last seen: May 8, 2026
Posting Health
- Days active: 0
- Repost count: 0
- Trust Level: 49%
- Scored at: May 6, 2026
Signal breakdown
freshness · source trust · content trust · employer trust