luxehouze · posted ~1 day ago
Freelance Data Engineer
Technical Tools
BigQuery, GCP, MongoDB, MySQL, PostgreSQL, Python, PyTorch, scikit-learn, SQL, TensorFlow, database design, distributed systems, ETL, Linux, machine learning, REST APIs, security best practices
Responsibilities:
● Design, develop, and maintain scalable ETL/ELT pipelines to process structured and unstructured data.
● Manage and optimize databases, data warehouses, and data lakes (SQL/NoSQL, BigQuery, etc.).
● Ensure data quality, governance, and reliability through validation, monitoring, and automation.
● Optimize pipelines and queries for large datasets and high-volume transactions.
● Explore and analyze datasets using statistical methods to identify trends and insights.
● Build, validate, and fine-tune predictive and prescriptive machine learning models relevant to the business.
● Deploy models into production by integrating them with data pipelines and business applications.
● Communicate findings and recommendations through dashboards, visualizations, and reports.
● Set up, maintain, and monitor Google Cloud Platform services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer).
● Manage ETL workflows on GCP, ensuring reliability, scalability, and cost efficiency.
● Implement security, access control, and compliance in cloud-based data systems.
● Collaborate with cross-functional teams (business, sales, operations) to ensure data is actionable and accessible.
Requirements & Qualifications:
● Bachelor’s degree in Computer Science, Information Technology, or related field (or equivalent experience).
● 2+ years of experience in Data Engineering, especially in data modeling, warehousing, and distributed systems.
● Strong proficiency in Python (for data pipelines, ML).
● Strong knowledge of SQL and NoSQL databases (MySQL, PostgreSQL, MongoDB); experience with ClickHouse is a plus.
● Familiarity with RESTful APIs (design, development, integration).
● Familiarity with machine learning frameworks (TensorFlow, scikit-learn, PyTorch).
● Hands-on expertise with Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Composer, Cloud Storage); experience in GCP infra setup is a strong plus.
● Proficient in Linux-based servers and deployment workflows.
● Skilled in Git/Bitbucket for version control and collaborative development.
● Experience implementing security best practices in both web applications and data infrastructure.
● Strong problem-solving, analytical, and collaboration skills.
● Experience with Zoho Deluge scripting is a plus.
Location & Eligibility
Where: Jakarta, Indonesia (on-site at the office)
Who can apply: Indonesia (ID)
Listing Details
- First seen: May 6, 2026
- Last seen: May 8, 2026
Posting Health
- Days active: 0
- Repost count: 0
- Trust Level: 51%
- Scored at: May 6, 2026
External application · ~5 min on luxehouze's site
Please let luxehouze know you found this job on Jobera.