codvo-team

Senior Data Engineer – Databricks & SaaS (Software-as-a-Service) Integrations (Pune)


Technical Tools
aws · azure · java · kafka · python · scala · spark · etl · rest-apis · saas · streaming-data
At Codvo, we are committed to building scalable, future-ready data platforms that power business impact. We believe in a culture of innovation, collaboration, and growth, where engineers can experiment, learn, and thrive. Join us to be part of a team that solves complex data challenges with creativity and cutting-edge technology.



We are seeking a Senior Data Engineer with strong Databricks and backend integration experience to design, build, and operate secure, scalable data integrations between Databricks data pipelines and Software-as-a-Service (SaaS) application platforms. This role focuses on batch and streaming data flows, API-driven integrations, event-based triggers, and enterprise-grade data reliability.

Responsibilities


Design and implement integrations between Databricks workflows and SaaS application backends

Enable batch and streaming data flows, event triggers, and API-based communication

Build backend services such as REST APIs, webhooks, and data interfaces

Implement data quality, validation, monitoring, and observability mechanisms

Design secure and auditable pipelines, including authentication (authN), authorization (authZ), and role-based access control (RBAC)

Troubleshoot pipeline failures, performance bottlenecks, and data inconsistencies

Collaborate with application, platform, and data engineering teams
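As a rough illustration of the API-based Databricks–SaaS communication described above (a sketch, not part of the role description): a SaaS backend can trigger a Databricks job through the Jobs 2.1 REST API's `run-now` endpoint. The workspace URL, token, and job ID below are placeholders.

```python
# Hypothetical sketch: a SaaS backend triggering a Databricks job run
# via the Jobs 2.1 REST API ("run-now"). The host, token, and job_id
# values are placeholders, not details from this posting.

def build_run_now_request(host, token, job_id, notebook_params=None):
    """Build the HTTP request for POST {host}/api/2.1/jobs/run-now."""
    body = {"job_id": job_id}
    if notebook_params:
        body["notebook_params"] = notebook_params
    return {
        "url": f"{host}/api/2.1/jobs/run-now",
        "headers": {"Authorization": f"Bearer {token}"},
        "json": body,
    }

req = build_run_now_request(
    "https://example.cloud.databricks.com",  # placeholder workspace URL
    "dapi-EXAMPLE-TOKEN",                    # placeholder access token
    12345,                                   # placeholder job ID
    {"run_date": "2026-05-06"},
)
# The SaaS service would then send this with any HTTP client, e.g.
# requests.post(req["url"], headers=req["headers"], json=req["json"])
```

Event-based triggers (Kafka, SNS/SQS, webhooks) typically invoke the same call from a consumer or handler rather than on a schedule.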

Requirements


Strong experience with Databricks, Apache Spark, and data pipelines

Backend development experience (Python, Java, or Scala)

Experience integrating data platforms with SaaS application backends

Strong knowledge of REST APIs and event-driven architectures

Experience working with AWS or Azure cloud platforms

Understanding of enterprise security (authN, authZ, RBAC)

Experience with spec-driven development and coding agents

Nice to Have


Event-driven systems using Kafka, SNS, SQS, or similar

Experience with streaming and real-time data processing

Exposure to data observability and monitoring tools

What Success Looks Like

Reliable, secure, and scalable Databricks–SaaS integrations

High data quality and observability across pipelines

Well-designed APIs and workflows supporting business growth


Location & Eligibility

Where is the job
Pune, India (on-site at the office)
Who can apply
India

Listing Details

First seen
May 6, 2026
Last seen
May 8, 2026

Posting Health

Days active
0
Repost count
0
Trust Level
51%
Scored at
May 6, 2026

Signal breakdown

freshness · source trust · content trust · employer trust
