Quick Summary
Overview
Technical Tools
aws · azure · gcp · kafka · python · pytorch · scala · spark · sql · tensorflow · ci-cd · database-design · etl · machine-learning · project-management
Why Koantek? Koantek stands at the forefront of Data and GenAI solutions, specializing in healthcare, life sciences, manufacturing, and financial services. As a global provider of technology services and solutions with a focus on Artificial Intelligence and Machine Learning, we deliver tailored solutions that enable businesses to leverage data for growth and innovation. Our team of experts combines deep industry knowledge with cutting-edge technologies, tools, and methodologies to drive impactful results. By partnering with clients across a diverse range of industries, from emerging startups to established enterprises, we help them uncover new opportunities and achieve a competitive advantage in the digital age.

Data Engineer

Description: As a Data Engineer at Koantek, you will leverage advanced data engineering techniques and analytics to support business decisions for our clients. Your role will involve designing and building robust data pipelines, integrating structured and unstructured data from various sources, and developing tools for data processing and analysis. You will play a pivotal role in managing data infrastructure, optimizing data workflows, and guiding data-driven strategies while working closely with data scientists and other stakeholders.

The impact you will have:
- Guide Big Data Transformations: Lead the implementation of comprehensive big data projects, including the development and deployment of innovative big data and AI applications.
- Ensure Best Practices: Ensure that Databricks best practices are applied throughout all projects to maintain high-quality service and successful implementation.
- Support Project Management: Assist the Professional Services leader and project managers with estimating effort and managing risk in customer proposals and statements of work.
- Architect Complex Solutions: Design, develop, deploy, and document complex customer engagements, either independently or as part of a technical team, serving as the technical lead and authority.
- Enable Knowledge Transfer: Facilitate knowledge transfer and provide training to team members, customers, and partners, including the creation of reusable project documentation.
- Contribute to Consulting Excellence: Share expertise with the consulting team and offer best practices for client engagement, enhancing the effectiveness and efficiency of other teams.

Requirements

Minimum qualifications:
- Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Experience: 3+ years of experience as a Data Engineer, with proficiency in at least two major cloud platforms (AWS, Azure, GCP). Proven experience in designing, developing, and implementing comprehensive data engineering solutions using Databricks, specifically for large-scale data processing and integration projects. Ability to develop scalable streaming and batch solutions using cloud-native components, and to perform data transformation tasks, including cleansing, aggregation, enrichment, and normalization, using Databricks and related technologies. Experience applying DataOps principles and implementing CI/CD and DevOps practices within data environments to optimize development and deployment workflows.
- Technical Skills: Expert-level proficiency in Spark, Scala, Python, and PySpark. In-depth knowledge of data architecture, including Spark Streaming, Spark Core, Spark SQL, and data modeling. Hands-on experience with data management technologies and tools such as Kafka, StreamSets, and MapReduce. Proficient in using advanced analytics and machine learning frameworks, including Apache Spark MLlib, TensorFlow, and PyTorch, to drive data insights and solutions.
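As a rough illustration of the transformation tasks the requirements mention (cleansing, aggregation, and normalization), the sketch below walks through the same steps in plain Python so the logic is visible without a Spark cluster; in practice these would be DataFrame operations (filter, groupBy/agg, withColumn) running on Databricks. The column names and rules are hypothetical, not taken from this posting.

```python
# Hypothetical cleansing -> aggregation -> normalization pipeline in plain
# Python, mirroring the steps a Spark/Databricks job would perform.
from collections import defaultdict

raw = [
    {"region": "south", "amount": "120.5"},
    {"region": "SOUTH", "amount": "79.5"},
    {"region": "north", "amount": None},      # dropped during cleansing
    {"region": "north", "amount": "300.0"},
]

# Cleansing: drop null amounts, standardize casing, cast strings to floats.
clean = [
    {"region": r["region"].lower(), "amount": float(r["amount"])}
    for r in raw
    if r["amount"] is not None
]

# Aggregation: total amount per region (the groupBy/agg step in Spark).
totals = defaultdict(float)
for row in clean:
    totals[row["region"]] += row["amount"]

# Normalization: scale each regional total to a share of the grand total.
grand_total = sum(totals.values())
shares = {region: total / grand_total for region, total in totals.items()}

print(shares)  # prints {'south': 0.4, 'north': 0.6}
```

Enrichment would follow the same pattern, typically as a join against a reference dataset before the aggregation step.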
Databricks-Specific Skills:
- Extensive experience in data migration from on-premises to cloud environments and in implementing data solutions on Databricks across cloud platforms (AWS, Azure, GCP).
- Skilled in designing and executing end-to-end data engineering solutions using Databricks, focusing on large-scale data processing and integration.
- Proven hands-on experience with Databricks administration and operations, including notebooks, clusters, jobs, and data pipelines.
- Experience integrating Databricks with other data tools and platforms to enhance overall data management and analytics capabilities.

Good-to-have certifications:
- Databricks Certified Data Engineer Professional
- Microsoft Certified: Azure Data Engineer Associate
- Google Cloud Professional certification
- AWS Certified Solutions Architect - Professional
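For a sense of the "jobs, clusters, and data pipelines" operations work listed above, a minimal job definition in the style of the Databricks Jobs API might look like the sketch below. This is an illustrative fragment only: the job name, notebook path, node type, and runtime version are placeholder assumptions, not values from this posting or any real workspace.

```json
{
  "name": "nightly-etl",
  "tasks": [
    {
      "task_key": "transform",
      "notebook_task": {
        "notebook_path": "/Repos/data-eng/etl/transform"
      },
      "new_cluster": {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2
      }
    }
  ]
}
```

In day-to-day work a definition like this would typically live in version control and be deployed through the CI/CD practices the requirements call out, rather than created by hand in the workspace UI.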
Location & Eligibility
Where is the job?
Hyderabad, India
On-site at the office
Listing Details
- Posted: November 2, 2025
- First seen: May 6, 2026
- Last seen: May 8, 2026
Posting Health
- Days active: 0
- Repost count: 0
- Trust Level: 14%
- Scored at: May 6, 2026
Signal breakdown
freshness · source trust · content trust · employer trust
External application · ~5 min on Koantek's site
Please let Koantek know you found this job on Jobera.
4 other jobs at Koantek
View all → Explore open roles at Koantek.
Similar Data Engineer jobs
- Analytics Engineer · USD 85,000–95,000 · Temporary/Intern, Remote
- Sr Principal Data Engineer
- Senior Data Engineer I · Remote
- Data Engineer, Azure - Remote, Latin America · Full-Time, Remote