(Contract) Data Platform Engineer (AWS & Data Pipelines)
Quick Summary
Type: Contract, per-project.
We are looking for a highly motivated Data / Platform Engineer to join our team and help design, build and operate scalable data pipelines and cloud-based solutions.
In this role, you will work closely with engineering and product teams to build both streaming and batch data pipelines, contribute to system design, and help drive automation and monitoring across our data platform.
Responsibilities
- Design, build and operate scalable streaming and batch data pipelines, with a strong focus on maintenance, monitoring, troubleshooting and continuous improvement of existing pipelines.
- Work with AWS services, including Redshift, EMR and ECS, to support data processing and analytics workloads.
- Develop and maintain data workflows using Python and SQL.
- Orchestrate and monitor pipelines using Apache Airflow.
- Build and deploy containerized applications using Docker and Kubernetes.
- Break down high-level system designs into well-defined, deliverable tasks with realistic estimates.
- Collaborate with cross-functional teams in a fast-paced, distributed environment across the US and Europe.
- Drive automation, observability and monitoring to improve reliability, performance and operational efficiency.
- Support knowledge transfer and ownership handover as part of the planned transition to the consuming team.
Requirements
- Strong professional experience with Python and SQL.
- Hands-on experience with AWS, specifically Redshift, EMR and ECS. AWS experience is mandatory; experience with other cloud providers is not considered equivalent for this role.
- Proven experience building and operating both streaming and batch data pipelines.
- Professional experience with Apache Airflow, Docker and Kubernetes.
- Ability to translate high-level system designs into actionable technical tasks and realistic estimates.
- Comfortable working in dynamic, fast-paced environments and in distributed teams.
- Strong interest in automation and monitoring.
- Strong hands-on experience with Apache Spark.
- Senior-level profile with strong autonomy, strong communication skills and the ability to work effectively in distributed teams.
- Proven ability to transfer knowledge and support ownership handovers.
- Fluent or professional working proficiency in English (written and spoken).
Nice to Have
- Previous experience in the telecom industry.
- Experience with machine learning systems and/or event-driven architectures.
- Experience with Apache Iceberg.
(*) SOUTHWORKS only hires individuals from countries that are not blocked or sanctioned by the United States, including countries identified by the United States Office of Foreign Assets Control (OFAC).
Listing Details
- Posted
- April 30, 2026
- First seen
- April 30, 2026
- Last seen
- May 4, 2026
SOUTHWORKS is a software development firm specializing in flexible, transparent and scalable development solutions tailored to client needs.